USASOC Injury Prevention/Performance Optimization Musculoskeletal Screening Initiative
2011-11-01
Tactical Human Optimization, Rapid Rehabilitation, and Reconditioning (THOR3) program to identify the... Special Operations Command (USASOC) to support development of USASOC’s Tactical Human Optimization, Rapid Rehabilitation, and Reconditioning (THOR3)... biomechanical, musculoskeletal, physiological, tactical, and injury data and refine its current human performance program to address the...
NASA Technical Reports Server (NTRS)
Eckstein, Miguel P.; Abbey, Craig K.; Pham, Binh T.; Shimozaki, Steven S.
2004-01-01
Human performance in visual detection, discrimination, identification, and search tasks typically improves with practice. Psychophysical studies suggest that perceptual learning is mediated by an enhancement in the coding of the signal, and physiological studies suggest that it might be related to the plasticity in the weighting or selection of sensory units coding task relevant information (learning through attention optimization). We propose an experimental paradigm (optimal perceptual learning paradigm) to systematically study the dynamics of perceptual learning in humans by allowing comparisons to that of an optimal Bayesian algorithm and a number of suboptimal learning models. We measured improvement in human localization (eight-alternative forced-choice with feedback) performance of a target randomly sampled from four elongated Gaussian targets with different orientations and polarities and kept as a target for a block of four trials. The results suggest that the human perceptual learning can occur within a lapse of four trials (<1 min) but that human learning is slower and incomplete with respect to the optimal algorithm (23.3% reduction in human efficiency from the 1st-to-4th learning trials). The greatest improvement in human performance, occurring from the 1st-to-2nd learning trial, was also present in the optimal observer, and, thus reflects a property inherent to the visual task and not a property particular to the human perceptual learning mechanism. One notable source of human inefficiency is that, unlike the ideal observer, human learning relies more heavily on previous decisions than on the provided feedback, resulting in no human learning on trials following a previous incorrect localization decision. Finally, the proposed theory and paradigm provide a flexible framework for future studies to evaluate the optimality of human learning of other visual cues and/or sensory modalities.
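As a rough illustration of the optimal perceptual learning paradigm described above, the sketch below implements an ideal learner that maintains a posterior over which of four candidate targets is in play for a 4-trial block and localizes with a posterior-weighted template; the template shapes, noise level, and block structure are illustrative assumptions, not the study's actual stimuli.

```python
# Minimal sketch (not the authors' code): an ideal learner keeps a posterior
# over which of four candidate targets is in play for the current 4-trial
# block and localizes with a posterior-weighted template.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_loc, n_trials = 64, 8, 4

# four candidate targets (stand-ins for the oriented/polarity-flipped profiles)
targets = rng.standard_normal((4, n_pix))
targets /= np.linalg.norm(targets, axis=1, keepdims=True)

true_t = rng.integers(4)               # target identity, fixed for the block
sigma = 2.0                            # pixel noise standard deviation
log_post = np.log(np.full(4, 0.25))    # log posterior over target identity

for trial in range(n_trials):
    true_loc = rng.integers(n_loc)
    stim = sigma * rng.standard_normal((n_loc, n_pix))
    stim[true_loc] += targets[true_t]

    # localization: weight each candidate template by its current posterior
    post = np.exp(log_post - log_post.max()); post /= post.sum()
    response = np.argmax(stim @ (post @ targets))

    # feedback reveals the true location; update the identity posterior with
    # the likelihood of the data at that location under each candidate target
    for k in range(4):
        log_post[k] += -np.sum((stim[true_loc] - targets[k]) ** 2) / (2 * sigma**2)

    post = np.exp(log_post - log_post.max()); post /= post.sum()
    print(f"trial {trial + 1}: correct={response == true_loc}, "
          f"identity posterior={np.round(post, 2)}")
```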
Human Performance on Hard Non-Euclidean Graph Problems: Vertex Cover
ERIC Educational Resources Information Center
Carruthers, Sarah; Masson, Michael E. J.; Stege, Ulrike
2012-01-01
Recent studies on a computationally hard visual optimization problem, the Traveling Salesperson Problem (TSP), indicate that humans are capable of finding close to optimal solutions in near-linear time. The current study is a preliminary step in investigating human performance on another hard problem, the Minimum Vertex Cover Problem, in which…
Humans make efficient use of natural image statistics when performing spatial interpolation.
D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S
2013-12-16
Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
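The comparison described above can be illustrated with a toy version of the task: predict a missing center pixel from its neighbors using the local-mean heuristic versus a linear estimator whose weights are fit to image patch statistics. The synthetic 1/f images below stand in for the natural-image database used in the study.

```python
# Toy comparison (synthetic data, not the study's natural-image database):
# estimate a missing center pixel from its 8 neighbors with (a) the local-mean
# heuristic and (b) a linear estimator fit to patch statistics.
import numpy as np

rng = np.random.default_rng(1)

def synth_image(n=64):
    # crude 1/f-spectrum image as a stand-in for a natural image
    f = np.fft.fftfreq(n)
    amp = 1.0 / np.maximum(np.hypot(*np.meshgrid(f, f)), 1.0 / n)
    phase = np.exp(2j * np.pi * rng.random((n, n)))
    return np.real(np.fft.ifft2(amp * phase))

def patches(img, size=3):
    m = img.shape[0] - size + 1
    return np.array([img[i:i + size, j:j + size].ravel()
                     for i in range(m) for j in range(m)])

data = np.vstack([patches(synth_image()) for _ in range(20)])
center = data[:, 4]                    # pixel to be estimated
ring = np.delete(data, 4, axis=1)      # its 8 neighbors

err_mean = np.mean((center - ring.mean(axis=1)) ** 2)      # (a) local mean
w, *_ = np.linalg.lstsq(ring, center, rcond=None)          # (b) learned weights
err_stat = np.mean((center - ring @ w) ** 2)

print(f"local-mean MSE: {err_mean:.4f}   statistics-based MSE: {err_stat:.4f}")
```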
NASA Human Health and Performance Information Architecture Panel
NASA Technical Reports Server (NTRS)
Johnson-Throop, Kathy; Kadwa, Binafer; VanBaalen, Mary
2014-01-01
The Human Health and Performance (HH&P) Directorate at NASA's Johnson Space Center has a mission to enable optimization of human health and performance throughout all phases of spaceflight. All HH&P functions are ultimately aimed at achieving this mission. Our activities enable mission success, optimizing human health and productivity in space before, during, and after the actual spaceflight experience of our crews, and include support for ground-based functions. Many of our spaceflight innovations also provide solutions for terrestrial challenges, thereby enhancing life on Earth.
Human-Machine Collaborative Optimization via Apprenticeship Scheduling
2016-09-09
...Apprenticeship Scheduling (COVAS), which performs machine learning using human expert demonstration, in conjunction with optimization, to automatically and efficiently produce optimal solutions to challenging real-world scheduling problems. COVAS first learns a policy from human scheduling demonstration via apprenticeship learning, then uses this initial solution to provide a tight bound on the value of the optimal solution, thereby substantially...
Optimized Assistive Human-Robot Interaction Using Reinforcement Learning.
Modares, Hamidreza; Ranatunga, Isura; Lewis, Frank L; Popa, Dan O
2016-03-01
An intelligent human-robot interaction (HRI) system with adjustable robot behavior is presented. The proposed HRI system assists the human operator to perform a given task with minimum workload demands and optimizes the overall human-robot system performance. Motivated by human factor studies, the presented control structure consists of two control loops. First, a robot-specific neuro-adaptive controller is designed in the inner loop to make the unknown nonlinear robot behave like a prescribed robot impedance model as perceived by a human operator. In contrast to existing neural network and adaptive impedance-based control methods, no information of the task performance or the prescribed robot impedance model parameters is required in the inner loop. Then, a task-specific outer-loop controller is designed to find the optimal parameters of the prescribed robot impedance model to adjust the robot's dynamics to the operator skills and minimize the tracking error. The outer loop includes the human operator, the robot, and the task performance details. The problem of finding the optimal parameters of the prescribed robot impedance model is transformed into a linear quadratic regulator (LQR) problem which minimizes the human effort and optimizes the closed-loop behavior of the HRI system for a given task. To obviate the requirement of the knowledge of the human model, integral reinforcement learning is used to solve the given LQR problem. Simulation results on an x - y table and a robot arm, and experimental implementation results on a PR2 robot confirm the suitability of the proposed method.
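The outer loop described above reduces to an LQR problem for the prescribed impedance parameters. The sketch below solves such an LQR directly from an assumed toy linear model via the continuous algebraic Riccati equation; the paper instead uses integral reinforcement learning precisely so that this model, including the human dynamics, never has to be identified.

```python
# Sketch only: the outer-loop impedance adaptation posed as an LQR problem.
# The A, B, Q, R values are assumptions standing in for the combined
# human-robot task dynamics and cost weights.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -1.0]])      # assumed task/impedance dynamics
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])          # penalize tracking error and its rate
R = np.array([[0.1]])             # penalize effort

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state-feedback gain, u = -K x
print("LQR gain K =", K)
```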
Kell, Alexander J E; Yamins, Daniel L K; Shook, Erica N; Norman-Haignere, Sam V; McDermott, Josh H
2018-05-02
A core goal of auditory neuroscience is to build quantitative models that predict cortical responses to natural sounds. Reasoning that a complete model of auditory cortex must solve ecologically relevant tasks, we optimized hierarchical neural networks for speech and music recognition. The best-performing network contained separate music and speech pathways following early shared processing, potentially replicating human cortical organization. The network performed both tasks as well as humans and exhibited human-like errors despite not being optimized to do so, suggesting common constraints on network and human performance. The network predicted fMRI voxel responses substantially better than traditional spectrotemporal filter models throughout auditory cortex. It also provided a quantitative signature of cortical representational hierarchy-primary and non-primary responses were best predicted by intermediate and late network layers, respectively. The results suggest that task optimization provides a powerful set of tools for modeling sensory systems. Copyright © 2018 Elsevier Inc. All rights reserved.
Human Performance on the Traveling Salesman and Related Problems: A Review
ERIC Educational Resources Information Center
MacGregor, James N.; Chu, Yun
2011-01-01
The article provides a review of recent research on human performance on the traveling salesman problem (TSP) and related combinatorial optimization problems. We discuss what combinatorial optimization problems are, why they are important, and why they may be of interest to cognitive scientists. We next describe the main characteristics of human…
Solving the optimal attention allocation problem in manual control
NASA Technical Reports Server (NTRS)
Kleinman, D. L.
1976-01-01
Within the context of the optimal control model of human response, analytic expressions for the gradients of closed-loop performance metrics with respect to human operator attention allocation are derived. These derivatives serve as the basis for a gradient algorithm that determines the optimal attention that a human should allocate among several display indicators in a steady-state manual control task. Application of the human modeling techniques are made to study the hover control task for a CH-46 VTOL flight tested by NASA.
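A generic sketch of the gradient idea is given below: attention fractions across display indicators are nudged along a numerically estimated performance gradient and renormalized to remain a valid allocation. The cost function is an invented placeholder for the closed-loop metric computed from the optimal control model.

```python
# Generic sketch: gradient-based attention allocation over four display
# indicators; the cost function is an illustrative assumption.
import numpy as np

def cost(f):
    # pretend closed-loop error score: each indicator contributes less error
    # the more attention it receives (illustrative assumption only)
    noise = np.array([1.0, 2.0, 0.5, 1.5])
    return np.sum(noise / (f + 1e-3))

def renormalize(f):
    f = np.clip(f, 1e-6, None)
    return f / f.sum()             # keep fractions non-negative, summing to 1

f = np.full(4, 0.25)               # start from uniform attention
for _ in range(500):
    grad = np.array([(cost(f + 1e-5 * e) - cost(f)) / 1e-5 for e in np.eye(4)])
    f = renormalize(f - 1e-3 * grad)

print("optimized attention allocation:", np.round(f, 3))
```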
Simple summation rule for optimal fixation selection in visual search.
Najemnik, Jiri; Geisler, Wilson S
2009-06-01
When searching for a known target in a natural texture, practiced humans achieve near-optimal performance compared to a Bayesian ideal searcher constrained with the human map of target detectability across the visual field [Najemnik, J., & Geisler, W. S. (2005). Optimal eye movement strategies in visual search. Nature, 434, 387-391]. To do so, humans must be good at choosing where to fixate during the search [Najemnik, J., & Geisler, W.S. (2008). Eye movement statistics in humans are consistent with an optimal strategy. Journal of Vision, 8(3), 1-14. 4]; however, it seems unlikely that a biological nervous system would implement the computations for the Bayesian ideal fixation selection because of their complexity. Here we derive and test a simple heuristic for optimal fixation selection that appears to be a much better candidate for implementation within a biological nervous system. Specifically, we show that the near-optimal fixation location is the maximum of the current posterior probability distribution for target location after the distribution is filtered by (convolved with) the square of the retinotopic target detectability map. We term the model that uses this strategy the entropy limit minimization (ELM) searcher. We show that when constrained with a human-like retinotopic map of target detectability and human search error rates, the ELM searcher performs as well as the Bayesian ideal searcher, and produces fixation statistics similar to those of humans.
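The ELM rule lends itself to a very short implementation. In the hedged sketch below, the next fixation is the peak of the current posterior over target location after it has been blurred by the squared retinotopic detectability map; both maps are illustrative Gaussians rather than the measured human maps from the paper.

```python
# Hedged sketch of the ELM fixation rule: fixate the peak of the posterior
# over target location after blurring it with the squared detectability map.
import numpy as np
from scipy.signal import fftconvolve

yy, xx = np.mgrid[-50:51, -50:51]

# current posterior over target location (two candidate regions, made up)
posterior = np.exp(-((xx - 15) ** 2 + yy ** 2) / 200.0) \
          + 0.7 * np.exp(-((xx + 20) ** 2 + (yy - 10) ** 2) / 200.0)
posterior /= posterior.sum()

# retinotopic detectability d'(r), highest at the current fixation (map center)
dprime = 3.0 * np.exp(-(xx ** 2 + yy ** 2) / (2 * 20.0 ** 2))

gain = fftconvolve(posterior, dprime ** 2, mode="same")   # ELM criterion
iy, ix = np.unravel_index(np.argmax(gain), gain.shape)
print(f"next fixation at (x, y) = ({xx[iy, ix]}, {yy[iy, ix]})")
```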
Box, Simon
2014-01-01
Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signallized junctions are simulated. A computer game interface is used to enable a human ‘player’ to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers, can be used to capture the human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable. PMID:26064570
Box, Simon
2014-12-01
Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signallized junctions are simulated. A computer game interface is used to enable a human 'player' to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers, can be used to capture the human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable.
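A rough sketch of the HuTMaC idea, under simplified assumptions, is shown below: log the junction state (here, queue lengths on two approaches) together with the phase the human player chose, then train a small neural-network classifier to reproduce those choices. The simulator, features, and data are invented stand-ins for the paper's game logs.

```python
# Simplified HuTMaC-style sketch with invented data: learn the human player's
# signal-switching choices from logged (state, action) pairs.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)

# synthetic "game log": the pretend player serves the longer queue,
# with occasional lapses
queues = rng.integers(0, 20, size=(500, 2)).astype(float)
human_phase = (queues[:, 1] > queues[:, 0]).astype(int)
lapses = rng.random(500) < 0.05
human_phase[lapses] = 1 - human_phase[lapses]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(queues, human_phase)

# the learned policy can now drive the signals in closed loop
state = np.array([[12.0, 3.0]])   # queue lengths on the two approaches
print("phase chosen by the learned controller:", clf.predict(state)[0])
```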
A Grey Wolf Optimizer for Modular Granular Neural Networks for Human Recognition
Sánchez, Daniela; Melin, Patricia
2017-01-01
A grey wolf optimizer for a modular neural network (MNN) with a granular approach is proposed. The proposed method performs optimal granulation of data and design of modular neural network architectures to perform human recognition; to prove its effectiveness, benchmark databases of ear, iris, and face biometric measures are used for tests and comparisons against other works. The design of a modular granular neural network (MGNN) consists of finding optimal parameters of its architecture; these parameters are the number of subgranules, percentage of data for the training phase, learning algorithm, goal error, number of hidden layers, and their number of neurons. Nowadays, there is a great variety of approaches and new techniques within the evolutionary computing area; these approaches and techniques have emerged to help find optimal solutions to problems or models, and bioinspired algorithms are part of this area. In this work a grey wolf optimizer is proposed for the design of modular granular neural networks, and the results are compared against a genetic algorithm and a firefly algorithm in order to determine which of these techniques provides better results when applied to human recognition. PMID:28894461
A Grey Wolf Optimizer for Modular Granular Neural Networks for Human Recognition.
Sánchez, Daniela; Melin, Patricia; Castillo, Oscar
2017-01-01
A grey wolf optimizer for a modular neural network (MNN) with a granular approach is proposed. The proposed method performs optimal granulation of data and design of modular neural network architectures to perform human recognition; to prove its effectiveness, benchmark databases of ear, iris, and face biometric measures are used for tests and comparisons against other works. The design of a modular granular neural network (MGNN) consists of finding optimal parameters of its architecture; these parameters are the number of subgranules, percentage of data for the training phase, learning algorithm, goal error, number of hidden layers, and their number of neurons. Nowadays, there is a great variety of approaches and new techniques within the evolutionary computing area; these approaches and techniques have emerged to help find optimal solutions to problems or models, and bioinspired algorithms are part of this area. In this work a grey wolf optimizer is proposed for the design of modular granular neural networks, and the results are compared against a genetic algorithm and a firefly algorithm in order to determine which of these techniques provides better results when applied to human recognition.
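For reference, a compact grey wolf optimizer on a toy objective is sketched below. In the paper the decision vector encodes MGNN architecture parameters (number of subgranules, training split, hidden layers, neurons per layer) and the fitness is recognition error; here a simple sphere function stands in for that fitness.

```python
# Compact grey wolf optimizer on a toy objective; a sphere function stands in
# for the MGNN recognition-error fitness used in the paper.
import numpy as np

rng = np.random.default_rng(3)

def fitness(x):                       # stand-in objective (minimize)
    return np.sum(x ** 2)

dim, n_wolves, n_iter = 5, 20, 200
lb, ub = -10.0, 10.0
wolves = rng.uniform(lb, ub, size=(n_wolves, dim))

for t in range(n_iter):
    scores = np.array([fitness(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(scores)[:3]]   # three best wolves
    a = 2.0 * (1.0 - t / n_iter)                          # decreases from 2 to 0

    for i in range(n_wolves):
        candidates = []
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2
            D = np.abs(C * leader - wolves[i])
            candidates.append(leader - A * D)
        wolves[i] = np.clip(np.mean(candidates, axis=0), lb, ub)

scores = np.array([fitness(w) for w in wolves])
best = wolves[np.argmin(scores)]
print("best solution:", np.round(best, 3), " fitness:", round(float(fitness(best)), 5))
```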
Optimization of a reversible hood for protecting a pedestrian's head during car collisions.
Huang, Sunan; Yang, Jikuang
2010-07-01
This study evaluated and optimized the performance of a reversible hood (RH) for the prevention of head injuries to an adult pedestrian in car collisions. The FE model of a production car front was introduced and validated. The baseline RH was developed from the original hood in the validated car front model. In order to evaluate the protective performance of the baseline RH, the FE models of an adult headform and a 50th percentile human head were used in parallel to impact the baseline RH. Based on the evaluation, the response surface method was applied to optimize the RH in terms of the material stiffness, lifting speed, and lifted height. Finally, the headform model and the human head model were again used to evaluate the protective performance of the optimized RH. It was found that the lifted baseline RH markedly reduces the impact responses of the headform model and the human head model compared with the retracted and lifting states of the baseline RH. When the optimized RH was lifted, the HIC values of the headform model and the human head model were further reduced to well below 1000. Pedestrian head injury risk can thus be reduced as required by EEVC WG17. Copyright 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Seo, Jongho; Kim, Jin-Su; Jeong, Un-Chang; Kim, Yong-Dae; Kim, Young-Cheol; Lee, Hanmin; Oh, Jae-Eung
2016-02-01
In this study, we derived an equation of motion for an electromechanical system in view of the components and working mechanism of an electromagnetic-type energy harvester (ETEH). An electromechanical transduction factor (ETF) was calculated using a finite-element analysis (FEA) based on Maxwell's theory. The experimental ETF of the ETEH measured by means of sine wave excitation was compared with the FEA data. Design parameters for the stationary part of the energy harvester were optimized in terms of the power performance by using a response surface method (RSM). With optimized design parameters, the ETEH showed an improvement in performance. We experimented with the optimized ETEH (OETEH) with respect to changes in the external excitation frequency and the load resistance by taking human body vibration into account. The OETEH achieved a performance improvement of about 30% compared to the initial model.
NASA Technical Reports Server (NTRS)
Baron, S.; Muralidharan, R.; Kleinman, D. L.
1978-01-01
The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.
Ong, Carmichael F; Hicks, Jennifer L; Delp, Scott L
2016-05-01
Technologies that augment human performance are the focus of intensive research and development, driven by advances in wearable robotic systems. Success has been limited by the challenge of understanding human-robot interaction. To address this challenge, we developed an optimization framework to synthesize a realistic human standing long jump and used the framework to explore how simulated wearable robotic devices might enhance jump performance. A planar, five-segment, seven-degree-of-freedom model with physiological torque actuators, which have variable torque capacity depending on joint position and velocity, was used to represent human musculoskeletal dynamics. An active augmentation device was modeled as a torque actuator that could apply a single pulse of up to 100 Nm of extension torque. A passive design was modeled as rotational springs about each lower limb joint. Dynamic optimization searched for physiological and device actuation patterns to maximize jump distance. Optimization of the nominal case yielded a 2.27 m jump that captured salient kinematic and kinetic features of human jumps. When the active device was added to the ankle, knee, or hip, jump distance increased to between 2.49 and 2.52 m. Active augmentation of all three joints increased the jump distance to 3.10 m. The passive design increased jump distance to 3.32 m by adding torques of 135, 365, and 297 Nm to the ankle, knee, and hip, respectively. Dynamic optimization can be used to simulate a standing long jump and investigate human-robot interaction. Simulation can aid in the design of performance-enhancing technologies.
ERIC Educational Resources Information Center
Brusco, Michael J.
2007-01-01
The study of human performance on discrete optimization problems has a considerable history that spans various disciplines. The two most widely studied problems are the Euclidean traveling salesperson problem and the quadratic assignment problem. The purpose of this paper is to outline a program of study for the measurement of human performance on…
The lawful imprecision of human surface tilt estimation in natural scenes
2018-01-01
Estimating local surface orientation (slant and tilt) is fundamental to recovering the three-dimensional structure of the environment. It is unknown how well humans perform this task in natural scenes. Here, with a database of natural stereo-images having groundtruth surface orientation at each pixel, we find dramatic differences in human tilt estimation with natural and artificial stimuli. Estimates are precise and unbiased with artificial stimuli and imprecise and strongly biased with natural stimuli. An image-computable Bayes optimal model grounded in natural scene statistics predicts human bias, precision, and trial-by-trial errors without fitting parameters to the human data. The similarities between human and model performance suggest that the complex human performance patterns with natural stimuli are lawful, and that human visual systems have internalized local image and scene statistics to optimally infer the three-dimensional structure of the environment. These results generalize our understanding of vision from the lab to the real world. PMID:29384477
The lawful imprecision of human surface tilt estimation in natural scenes.
Kim, Seha; Burge, Johannes
2018-01-31
Estimating local surface orientation (slant and tilt) is fundamental to recovering the three-dimensional structure of the environment. It is unknown how well humans perform this task in natural scenes. Here, with a database of natural stereo-images having groundtruth surface orientation at each pixel, we find dramatic differences in human tilt estimation with natural and artificial stimuli. Estimates are precise and unbiased with artificial stimuli and imprecise and strongly biased with natural stimuli. An image-computable Bayes optimal model grounded in natural scene statistics predicts human bias, precision, and trial-by-trial errors without fitting parameters to the human data. The similarities between human and model performance suggest that the complex human performance patterns with natural stimuli are lawful, and that human visual systems have internalized local image and scene statistics to optimally infer the three-dimensional structure of the environment. These results generalize our understanding of vision from the lab to the real world. © 2018, Kim et al.
NASA Astrophysics Data System (ADS)
Zeng, Rongping; Badano, Aldo; Myers, Kyle J.
2017-04-01
We showed in our earlier work that the choice of reconstruction methods does not affect the optimization of DBT acquisition parameters (angular span and number of views) using simulated breast phantom images in detecting lesions with a channelized Hotelling observer (CHO). In this work we investigate whether the model-observer based conclusion is valid when using humans to interpret images. We used previously generated DBT breast phantom images and recruited human readers to find the optimal geometry settings associated with two reconstruction algorithms, filtered back projection (FBP) and simultaneous algebraic reconstruction technique (SART). The human reader results show that image quality trends as a function of the acquisition parameters are consistent between FBP and SART reconstructions. The consistent trends confirm that the optimization of DBT system geometry is insensitive to the choice of reconstruction algorithm. The results also show that humans perform better in SART reconstructed images than in FBP reconstructed images. In addition, we applied CHOs with three commonly used channel models, Laguerre-Gauss (LG) channels, square (SQR) channels and sparse difference-of-Gaussian (sDOG) channels. We found that LG channels predict human performance trends better than SQR and sDOG channel models for the task of detecting lesions in tomosynthesis backgrounds. Overall, this work confirms that the choice of reconstruction algorithm is not critical for optimizing DBT system acquisition parameters.
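The CHO used in studies like this one can be summarized in a few lines. The sketch below builds a channelized Hotelling observer on synthetic white-noise images with a known Gaussian lesion; the concentric Gaussian channel profiles are a stand-in for the Laguerre-Gauss, square, and sparse-DOG channel sets compared in the study.

```python
# Channelized Hotelling observer on synthetic data; concentric Gaussian
# channels are a stand-in for the LG, SQR, and sDOG channel sets in the study.
import numpy as np

rng = np.random.default_rng(4)
n = 32
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
signal = np.exp(-(xx ** 2 + yy ** 2) / (2 * 3.0 ** 2)).ravel()   # lesion profile

channels = np.stack([np.exp(-(xx ** 2 + yy ** 2) / (2 * s ** 2)).ravel()
                     for s in (2.0, 4.0, 8.0, 16.0)], axis=1)    # (n*n, 4)

def make_images(n_img, with_signal):
    imgs = rng.standard_normal((n_img, n * n))       # white-noise backgrounds
    if with_signal:
        imgs += 0.3 * signal                          # faint lesion
    return imgs

v_sp = make_images(200, True) @ channels              # channel outputs
v_sa = make_images(200, False) @ channels
S = 0.5 * (np.cov(v_sp.T) + np.cov(v_sa.T))           # pooled channel covariance
w = np.linalg.solve(S, v_sp.mean(0) - v_sa.mean(0))   # Hotelling template

t_sp = make_images(200, True) @ channels @ w          # test decision variables
t_sa = make_images(200, False) @ channels @ w
auc = np.mean(t_sp[:, None] > t_sa[None, :])          # simple AUC estimate
print(f"CHO AUC on synthetic images: {auc:.3f}")
```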
Modeling human decision making behavior in supervisory control
NASA Technical Reports Server (NTRS)
Tulga, M. K.; Sheridan, T. B.
1977-01-01
An optimal decision control model was developed, which is based primarily on a dynamic programming algorithm which looks at all the available task possibilities, charts an optimal trajectory, and commits itself to do the first step (i.e., follow the optimal trajectory during the next time period), and then iterates the calculation. A Bayesian estimator was included which estimates the tasks which might occur in the immediate future and provides this information to the dynamic programming routine. Preliminary trials comparing the human subject's performance to that of the optimal model show a great similarity, but indicate that the human skips certain movements which require quick change in strategy.
It looks easy! Heuristics for combinatorial optimization problems.
Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair
2006-04-01
Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
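In the spirit of the boundary-following account, the sketch below builds a tour from the convex hull of the points and then inserts interior points where they lengthen the tour least. This is a generic convex-hull/cheapest-insertion heuristic, not the authors' exact model of human solvers.

```python
# Generic convex-hull/cheapest-insertion tour construction (a sketch, not the
# authors' model): start from the boundary and insert interior points where
# they add the least length.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(5)
pts = rng.random((30, 2))

tour = list(ConvexHull(pts).vertices)                 # hull points, in order
remaining = [i for i in range(len(pts)) if i not in tour]

def added_length(a, b, c):
    return (np.linalg.norm(pts[a] - pts[c]) + np.linalg.norm(pts[c] - pts[b])
            - np.linalg.norm(pts[a] - pts[b]))

while remaining:
    best = None
    for c in remaining:
        for k in range(len(tour)):
            cost = added_length(tour[k], tour[(k + 1) % len(tour)], c)
            if best is None or cost < best[0]:
                best = (cost, k, c)
    _, k, c = best
    tour.insert(k + 1, c)
    remaining.remove(c)

length = sum(np.linalg.norm(pts[tour[i]] - pts[tour[(i + 1) % len(tour)]])
             for i in range(len(tour)))
print(f"tour length over {len(pts)} points: {length:.3f}")
```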
NASA Astrophysics Data System (ADS)
Platisa, Ljiljana; Vansteenkiste, Ewout; Goossens, Bart; Marchessoux, Cédric; Kimpe, Tom; Philips, Wilfried
2009-02-01
Medical-imaging systems are designed to aid medical specialists in a specific task. Therefore, the physical parameters of a system need to optimize the task performance of a human observer. This requires measurements of human performance in a given task during the system optimization. Typically, psychophysical studies are conducted for this purpose. Numerical observer models have been successfully used to predict human performance in several detection tasks. In particular, the task of signal detection using a channelized Hotelling observer (CHO) in simulated images has been widely explored. However, few studies have been done for clinically acquired images that also contain anatomic noise. In this paper, we investigate the performance of a CHO in the task of detecting lung nodules in real radiographic images of the chest. To evaluate variability introduced by the limited available data, we employ a commonly used multi-reader multi-case (MRMC) study design, which accounts for both case and reader variability. Finally, we use the "one-shot" methods to estimate the MRMC variance of the area under the ROC curve (AUC). The obtained AUC compares well with those reported for a human observer study on a similar data set. Furthermore, the "one-shot" analysis implies a fairly consistent performance of the CHO, with the variance of the AUC below 0.002. This indicates promising potential for numerical observers in the optimization of medical imaging displays and encourages further investigation on the subject.
Acquisition of decision making criteria: reward rate ultimately beats accuracy.
Balci, Fuat; Simen, Patrick; Niyogi, Ritwik; Saxe, Andrew; Hughes, Jessica A; Holmes, Philip; Cohen, Jonathan D
2011-02-01
Speed-accuracy trade-offs strongly influence the rate of reward that can be earned in many decision-making tasks. Previous reports suggest that human participants often adopt suboptimal speed-accuracy trade-offs in single session, two-alternative forced-choice tasks. We investigated whether humans acquired optimal speed-accuracy trade-offs when extensively trained with multiple signal qualities. When performance was characterized in terms of decision time and accuracy, our participants eventually performed nearly optimally in the case of higher signal qualities. Rather than adopting decision criteria that were individually optimal for each signal quality, participants adopted a single threshold that was nearly optimal for most signal qualities. However, setting a single threshold for different coherence conditions resulted in only negligible decrements in the maximum possible reward rate. Finally, we tested two hypotheses regarding the possible sources of suboptimal performance: (1) favoring accuracy over reward rate and (2) misestimating the reward rate due to timing uncertainty. Our findings provide support for both hypotheses, but also for the hypothesis that participants can learn to approach optimality. We find specifically that an accuracy bias dominates early performance, but diminishes greatly with practice. The residual discrepancy between optimal and observed performance can be explained by an adaptive response to uncertainty in time estimation.
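The speed-accuracy/reward-rate trade-off at issue here can be made concrete with textbook drift-diffusion expressions for error rate and mean decision time as a function of the decision threshold; the sketch below sweeps the threshold and reports the reward-rate-optimal setting (drift, noise, and timing values are arbitrary assumptions).

```python
# Standard drift-diffusion expressions for error rate and mean decision time
# versus threshold; all parameter values are arbitrary assumptions.
import numpy as np

drift, noise = 0.2, 1.0           # signal quality
t0, iti = 0.3, 1.5                # non-decision time and inter-trial interval (s)

z = np.linspace(0.01, 5.0, 500)   # candidate decision thresholds
er = 1.0 / (1.0 + np.exp(2 * drift * z / noise ** 2))     # error rate
dt = (z / drift) * np.tanh(drift * z / noise ** 2)        # mean decision time
rr = (1.0 - er) / (dt + t0 + iti)                         # rewards per second

best = np.argmax(rr)
print(f"reward-rate-optimal threshold: {z[best]:.2f} "
      f"(error rate {er[best]:.1%}, decision time {dt[best]:.2f} s)")
```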
ERIC Educational Resources Information Center
Walwyn, Amy L.; Navarro, Daniel J.
2010-01-01
An experiment is reported comparing human performance on two kinds of visually presented traveling salesperson problems (TSPs), those reliant on Euclidean geometry and those reliant on city block geometry. Across multiple array sizes, human performance was near-optimal in both geometries, but was slightly better in the Euclidean format. Even so,…
NASA Technical Reports Server (NTRS)
Connolly, Janis H.; Arch, M.; Elfezouaty, Eileen Schultz; Novak, Jennifer Blume; Bond, Robert L. (Technical Monitor)
1999-01-01
Design and Human Engineering (HE) processes strive to ensure that the human-machine interface is designed for optimal performance throughout the system life cycle. Each component can be tested and assessed independently to assure optimal performance, but it is not until full integration that the system and the inherent interactions between the system components can be assessed as a whole. HE processes (which define and apply requirements for human interaction with missions/systems) are included in space flight activities, but also need to be included in ground activities and, specifically, in ground facility testbeds such as Bio-Plex. A unique aspect of the Bio-Plex Facility is the integral issue of Habitability, which includes qualities of the environment that allow humans to work and live. HE is a process by which Habitability and system performance can be assessed.
NASA Astrophysics Data System (ADS)
Handford, Matthew L.; Srinivasan, Manoj
2016-02-01
Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user’s walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost - even lower than assuming that the non-amputee’s ankle torques are cost-free.
Optimizing the Distribution of Leg Muscles for Vertical Jumping
Wong, Jeremy D.; Bobbert, Maarten F.; van Soest, Arthur J.; Gribble, Paul L.; Kistemaker, Dinant A.
2016-01-01
A goal of biomechanics and motor control is to understand the design of the human musculoskeletal system. Here we investigated human functional morphology by making predictions about the muscle volume distribution that is optimal for a specific motor task. We examined a well-studied and relatively simple human movement, vertical jumping. We investigated how high a human could jump if muscle volume were optimized for jumping, and determined how the optimal parameters improve performance. We used a four-link inverted pendulum model of human vertical jumping actuated by Hill-type muscles, that well-approximates skilled human performance. We optimized muscle volume by allowing the cross-sectional area and muscle fiber optimum length to be changed for each muscle, while maintaining constant total muscle volume. We observed, perhaps surprisingly, that the reference model, based on human anthropometric data, is relatively good for vertical jumping; it achieves 90% of the jump height predicted by a model with muscles designed specifically for jumping. Alteration of cross-sectional areas—which determine the maximum force deliverable by the muscles—constitutes the majority of improvement to jump height. The optimal distribution results in large vastus, gastrocnemius and hamstrings muscles that deliver more work, while producing a kinematic pattern essentially identical to the reference model. Work output is increased by removing muscle from rectus femoris, which cannot do work on the skeleton given its moment arm at the hip and the joint excursions during push-off. The gluteus composes a disproportionate amount of muscle volume and jump height is improved by moving it to other muscles. This approach represents a way to test hypotheses about optimal human functional morphology. Future studies may extend this approach to address other morphological questions in ethological tasks such as locomotion, and feature other sets of parameters such as properties of the skeletal segments. PMID:26919645
Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; OʼConnor, Francis G; Deuster, Patricia A
2015-11-01
Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point, the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts, and leverages science, resources, programs, and policies to optimize the performance capacities of all Service members.
Fusing face-verification algorithms and humans.
O'Toole, Alice J; Abdi, Hervé; Jiang, Fang; Phillips, P Jonathon
2007-10-01
It has been demonstrated recently that state-of-the-art face-recognition algorithms can surpass human accuracy at matching faces over changes in illumination. The ranking of algorithms and humans by accuracy, however, does not provide information about whether algorithms and humans perform the task comparably or whether algorithms and humans can be fused to improve performance. In this paper, we fused humans and algorithms using partial least square regression (PLSR). In the first experiment, we applied PLSR to face-pair similarity scores generated by seven algorithms participating in the Face Recognition Grand Challenge. The PLSR produced an optimal weighting of the similarity scores, which we tested for generality with a jackknife procedure. Fusing the algorithms' similarity scores using the optimal weights produced a twofold reduction of error rate over the most accurate algorithm. Next, human-subject-generated similarity scores were added to the PLSR analysis. Fusing humans and algorithms increased the performance to near-perfect classification accuracy. These results are discussed in terms of maximizing face-verification accuracy with hybrid systems consisting of multiple algorithms and humans.
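The fusion step can be sketched with synthetic similarity scores: PLS regression maps the stacked algorithm (and human) scores for each face pair onto the match/non-match label. The data below are simulated; the study used scores from seven Face Recognition Grand Challenge algorithms plus human ratings.

```python
# Fusion sketch with simulated similarity scores; the study fused scores from
# seven FRGC algorithms plus human ratings.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)
n_pairs, n_sources = 400, 8              # e.g., 7 algorithms + 1 human rater
labels = rng.integers(0, 2, n_pairs)     # 1 = same person, 0 = different

# each source gives a noisy similarity score that is higher for matched pairs
quality = rng.uniform(0.5, 2.0, n_sources)
scores = labels[:, None] * quality + rng.standard_normal((n_pairs, n_sources))

pls = PLSRegression(n_components=3)
pls.fit(scores[:200], labels[:200].astype(float))        # learn fusion weights
fused = pls.predict(scores[200:]).ravel()                # fused held-out scores

test_labels = labels[200:]
auc = np.mean(fused[test_labels == 1][:, None] > fused[test_labels == 0][None, :])
print(f"fused verification AUC on held-out pairs: {auc:.3f}")
```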
Park, Gloria H; Messina, Lauren A; Deuster, Patricia A
Within the Department of Defense over the past decade, a focus on enhancing Warfighter resilience and readiness has increased. For Special Operation Forces (SOF), who bear unique burdens for training and deployment, programs like the Preservation of the Force and Family have been created to help support SOF and their family members in sustaining capabilities and enhancing resilience in the face of prolonged warfare. In this review, we describe the shift in focus from resilience to human performance optimization (HPO) and the benefits of human performance initiatives that include holistic fitness. We then describe strategies for advancing the application of HPO for future initiatives through tailoring and cultural adaptation, as well as advancing methods for measurement. By striving toward specificity and precision performance, SOF human performance programs can impact individual and team capabilities to a greater extent than in the past, as well as maintaining the well-being of SOF and their families across their careers and beyond.
2016-12-01
Instructors Course. First aid and combat lifesaver training, as well as combatives or martial arts training, were also common, although given the... [Survey checklist items: Training; Leader courses (e.g., Ranger, CLC, ARC, RSLC); Resilience and Human Performance; Martial Arts / Combatives]
Measuring human performance on NASA's microgravity aircraft
NASA Technical Reports Server (NTRS)
Morris, Randy B.; Whitmore, Mihriban
1993-01-01
Measuring human performance in a microgravity environment will aid in identifying the design requirements, human capabilities, safety, and productivity of future astronauts. The preliminary understanding of the microgravity effects on human performance can be achieved through evaluations conducted onboard NASA's KC-135 aircraft. These evaluations can be performed in relation to hardware performance, human-hardware interface, and hardware integration. Measuring human performance in the KC-135 simulated environment will contribute to the efforts of optimizing the human-machine interfaces for future and existing space vehicles. However, there are limitations, such as limited number of qualified subjects, unexpected hardware problems, and miscellaneous plane movements which must be taken into consideration. Examples for these evaluations, the results, and their implications are discussed in the paper.
Ong, Carmichael F.; Hicks, Jennifer L.; Delp, Scott L.
2017-01-01
Goal: Technologies that augment human performance are the focus of intensive research and development, driven by advances in wearable robotic systems. Success has been limited by the challenge of understanding human–robot interaction. To address this challenge, we developed an optimization framework to synthesize a realistic human standing long jump and used the framework to explore how simulated wearable robotic devices might enhance jump performance. Methods: A planar, five-segment, seven-degree-of-freedom model with physiological torque actuators, which have variable torque capacity depending on joint position and velocity, was used to represent human musculoskeletal dynamics. An active augmentation device was modeled as a torque actuator that could apply a single pulse of up to 100 Nm of extension torque. A passive design was modeled as rotational springs about each lower limb joint. Dynamic optimization searched for physiological and device actuation patterns to maximize jump distance. Results: Optimization of the nominal case yielded a 2.27 m jump that captured salient kinematic and kinetic features of human jumps. When the active device was added to the ankle, knee, or hip, jump distance increased to between 2.49 and 2.52 m. Active augmentation of all three joints increased the jump distance to 3.10 m. The passive design increased jump distance to 3.32 m by adding torques of 135 Nm, 365 Nm, and 297 Nm to the ankle, knee, and hip, respectively. Conclusion: Dynamic optimization can be used to simulate a standing long jump and investigate human-robot interaction. Significance: Simulation can aid in the design of performance-enhancing technologies. PMID:26258930
Airline Maintenance Manpower Optimization from the De Novo Perspective
NASA Astrophysics Data System (ADS)
Liou, James J. H.; Tzeng, Gwo-Hshiung
Human resource management (HRM) is an important issue in today’s competitive airline market. In this paper, we discuss a multi-objective model designed from the De Novo perspective to help airlines optimize their maintenance manpower portfolio. The effectiveness of the model and solution algorithm is demonstrated in an empirical study of the optimization of the human resources needed for airline line maintenance. Both De Novo and traditional multiple objective programming (MOP) methods are analyzed. A comparison of the results with those of traditional MOP indicates that the proposed model and solution algorithm does provide better performance and an improved human resource portfolio.
2D and 3D Traveling Salesman Problem
ERIC Educational Resources Information Center
Haxhimusa, Yll; Carpenter, Edward; Catrambone, Joseph; Foldes, David; Stefanov, Emil; Arns, Laura; Pizlo, Zygmunt
2011-01-01
When a two-dimensional (2D) traveling salesman problem (TSP) is presented on a computer screen, human subjects can produce near-optimal tours in linear time. In this study we tested human performance on a real and virtual floor, as well as in a three-dimensional (3D) virtual space. Human performance on the real floor is as good as that on a…
A holistic approach to movement education in sport and fitness: a systems based model.
Polsgrove, Myles Jay
2012-01-01
The typical model used by movement professionals to enhance performance relies on the notion that a linear increase in load results in steady and progressive gains, whereby, the greater the effort, the greater the gains in performance. Traditional approaches to movement progression typically rely on the proper sequencing of extrinsically based activities to facilitate the individual in reaching performance objectives. However, physical rehabilitation or physical performance rarely progresses in such a linear fashion; instead they tend to evolve non-linearly and rather unpredictably. A dynamic system can be described as an entity that self-organizes into increasingly complex forms. Applying this view to the human body, practitioners could facilitate non-linear performance gains through a systems based programming approach. Utilizing a dynamic systems view, the Holistic Approach to Movement Education (HADME) is a model designed to optimize performance by accounting for non-linear and self-organizing traits associated with human movement. In this model, gains in performance occur through advancing individual perspectives and through optimizing sub-system performance. This inward shift of the focus of performance creates a sharper self-awareness and may lead to more optimal movements. Copyright © 2011 Elsevier Ltd. All rights reserved.
Optimality Principles for Model-Based Prediction of Human Gait
Ackermann, Marko; van den Bogert, Antonie J.
2010-01-01
Although humans have a large repertoire of potential movements, gait patterns tend to be stereotypical and appear to be selected according to optimality principles such as minimal energy. When applied to dynamic musculoskeletal models such optimality principles might be used to predict how a patient’s gait adapts to mechanical interventions such as prosthetic devices or surgery. In this paper we study the effects of different performance criteria on predicted gait patterns using a 2D musculoskeletal model. The associated optimal control problem for a family of different cost functions was solved utilizing the direct collocation method. It was found that fatigue-like cost functions produced realistic gait, with stance phase knee flexion, as opposed to energy-related cost functions which avoided knee flexion during the stance phase. We conclude that fatigue minimization may be one of the primary optimality principles governing human gait. PMID:20074736
The MIT Summit Speech Recognition System: A Progress Report
1989-01-01
understanding of the human communication process. Despite recent development of some speech recognition systems with high accuracy, the performance of such...over the past four decades on human communication, in the hope that such systems will one day have a performance approaching that of humans. We are...optimize its use. Third, the system must have a stochastic component to deal with the present state of ignorance in our understanding of the human...
Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software and to simplify the process of performing sensitivity analysis, and it was ultimately found to outperform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
An Ideal Observer Analysis of Visual Working Memory
ERIC Educational Resources Information Center
Sims, Chris R.; Jacobs, Robert A.; Knill, David C.
2012-01-01
Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around…
Human-in-the-loop Bayesian optimization of wearable device parameters
Malcolm, Philippe; Speeckaert, Jozefien; Siviy, Christoper J.; Walsh, Conor J.; Kuindersma, Scott
2017-01-01
The increasing capabilities of exoskeletons and powered prosthetics for walking assistance have paved the way for more sophisticated and individualized control strategies. In response to this opportunity, recent work on human-in-the-loop optimization has considered the problem of automatically tuning control parameters based on realtime physiological measurements. However, the common use of metabolic cost as a performance metric creates significant experimental challenges due to its long measurement times and low signal-to-noise ratio. We evaluate the use of Bayesian optimization—a family of sample-efficient, noise-tolerant, and global optimization methods—for quickly identifying near-optimal control parameters. To manage experimental complexity and provide comparisons against related work, we consider the task of minimizing metabolic cost by optimizing walking step frequencies in unaided human subjects. Compared to an existing approach based on gradient descent, Bayesian optimization identified a near-optimal step frequency with a faster time to convergence (12 minutes, p < 0.01), smaller inter-subject variability in convergence time (± 2 minutes, p < 0.01), and lower overall energy expenditure (p < 0.01). PMID:28926613
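A simplified human-in-the-loop Bayesian optimization loop in the spirit of this work is sketched below: a Gaussian-process model of metabolic cost versus step frequency is updated after each noisy 'measurement', and the next step frequency to test is chosen by expected improvement. The cost function is a synthetic stand-in for real respirometry data, and the kernel and noise settings are assumptions.

```python
# Human-in-the-loop Bayesian optimization sketch: GP surrogate of metabolic
# cost versus step frequency, next test point chosen by expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(7)

def measure_cost(freq):              # pretend noisy metabolic cost measurement
    return 3.0 + 0.002 * (freq - 108.0) ** 2 + 0.05 * rng.standard_normal()

grid = np.linspace(80, 140, 200)[:, None]      # step frequencies (steps/min)
X = list(rng.uniform(80, 140, 3)[:, None])     # a few initial measurements
y = [measure_cost(x[0]) for x in X]

gp = GaussianProcessRegressor(kernel=Matern(length_scale=10.0, nu=2.5),
                              alpha=0.05 ** 2, normalize_y=True)

for _ in range(10):                            # ten further "measurements"
    gp.fit(np.vstack(X), np.array(y))
    mu, sd = gp.predict(grid, return_std=True)
    best = np.min(y)
    zscore = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(zscore) + sd * norm.pdf(zscore)  # expected improvement
    x_next = grid[np.argmax(ei)]
    X.append(x_next)
    y.append(measure_cost(x_next[0]))

print(f"estimated optimal step frequency: {X[int(np.argmin(y))][0]:.1f} steps/min")
```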
ERIC Educational Resources Information Center
Rimland, Jeffrey C.
2013-01-01
In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…
The surprisingly high human efficiency at learning to recognize faces
Peterson, Matthew F.; Abbey, Craig K.; Eckstein, Miguel P.
2009-01-01
We investigated the ability of humans to optimize face recognition performance through rapid learning of individual relevant features. We created artificial faces with discriminating visual information heavily concentrated in single features (nose, eyes, chin or mouth). In each of 2500 learning blocks a feature was randomly selected and retained over the course of four trials, during which observers identified randomly sampled, noisy face images. Observers learned the discriminating feature through indirect feedback, leading to large performance gains. Performance was compared to a learning Bayesian ideal observer, resulting in unexpectedly high learning compared to previous studies with simpler stimuli. We explore various explanations and conclude that the higher learning measured with faces cannot be driven by adaptive eye movement strategies but can be mostly accounted for by suboptimalities in human face discrimination when observers are uncertain about the discriminating feature. We show that an initial bias of humans to use specific features to perform the task even though they are informed that each of four features is equally likely to be the discriminatory feature would lead to seemingly supra-optimal learning. We also examine the possibility of inefficient human integration of visual information across the spatially distributed facial features. Together, the results suggest that humans can show large performance improvement effects in discriminating faces as they learn to identify the feature containing the discriminatory information. PMID:19000918
Human performance on visually presented Traveling Salesman problems.
Vickers, D; Butavicius, M; Lee, M; Medvedev, A
2001-01-01
Little research has been carried out on human performance in optimization problems, such as the Traveling Salesman problem (TSP). Studies by Polivanova (1974, Voprosy Psikhologii, 4, 41-51) and by MacGregor and Ormerod (1996, Perception & Psychophysics, 58, 527-539) suggest that: (1) the complexity of solutions to visually presented TSPs depends on the number of points on the convex hull; and (2) the perception of optimal structure is an innate tendency of the visual system, not subject to individual differences. Results are reported from two experiments. In the first, measures of the total length and completion speed of pathways, and a measure of path uncertainty were compared with optimal solutions produced by an elastic net algorithm and by several heuristic methods. Performance was also compared under instructions to draw the shortest or the most attractive pathway. In the second, various measures of performance were compared with scores on Raven's advanced progressive matrices (APM). The number of points on the convex hull did not determine the relative optimality of solutions, although both this factor and the total number of points influenced solution speed and path uncertainty. Subjects' solutions showed appreciable individual differences, which had a strong correlation with APM scores. The relation between perceptual organization and the process of solving visually presented TSPs is briefly discussed, as is the potential of optimization for providing a conceptual framework for the study of intelligence.
de Koning, Jos J; van der Zweep, Cees-Jan; Cornelissen, Jesper; Kuiper, Bouke
2013-03-01
Optimal pacing strategy was determined for breaking the world speed record on a human-powered vehicle (HPV) using an energy-flow model in which the rider's physical capacities, the vehicle's properties, and the environmental conditions were included. Power data from world-record attempts were compared with data from the model, and race protocols were adjusted to the results from the model. HPV performance can be improved by using an energy-flow model for optimizing race strategy. A biphased in-run followed by a sprint gave best results.
Magic in the machine: a computational magician's assistant.
Williams, Howard; McOwan, Peter W
2014-01-01
A human magician blends science, psychology, and performance to create a magical effect. In this paper we explore what can be achieved when that human intelligence is replaced or assisted by machine intelligence. Magical effects are all in some form based on hidden mathematical, scientific, or psychological principles; often the parameters controlling these underpinning techniques are hard for a magician to blend to maximize the magical effect required. The complexity is often caused by interacting and often conflicting physical and psychological constraints that need to be optimally balanced. Normally this tuning is done by trial and error, combined with human intuitions. Here we focus on applying Artificial Intelligence methods to the creation and optimization of magic tricks exploiting mathematical principles. We use experimentally derived data about particular perceptual and cognitive features, combined with a model of the underlying mathematical process to provide a psychologically valid metric to allow optimization of magical impact. In the paper we introduce our optimization methodology and describe how it can be flexibly applied to a range of different types of mathematics based tricks. We also provide two case studies as exemplars of the methodology at work: a magical jigsaw, and a mind reading card trick effect. We evaluate each trick created through testing in laboratory and public performances, and further demonstrate the real world efficacy of our approach for professional performers through sales of the tricks in a reputable magic shop in London.
A control-theory model for human decision-making
NASA Technical Reports Server (NTRS)
Levison, W. H.; Tanner, R. B.
1971-01-01
A model for human decision making is an adaptation of an optimal control model for pilot/vehicle systems. The models for decision and control both contain concepts of time delay, observation noise, optimal prediction, and optimal estimation. The decision making model was intended for situations in which the human bases his decision on his estimate of the state of a linear plant. Experiments are described for the following task situations: (a) single decision tasks, (b) two-decision tasks, and (c) simultaneous manual control and decision making. Using fixed values for model parameters, single-task and two-task decision performance can be predicted to within an accuracy of 10 percent. Agreement is less good for the simultaneous decision and control situation.
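The record mentions time delay, observation noise, and optimal estimation as shared ingredients of the decision and control models. The toy 1-D sketch below, with all constants assumed, filters a delayed, noisy observation of a linear plant with a Kalman filter and issues a decision when the state estimate crosses a threshold; the delay is handled naively (the stale measurement is simply filtered), whereas a full treatment would also predict forward over the delay.

    import numpy as np

    rng = np.random.default_rng(0)
    a, q, r = 0.98, 0.05, 0.5        # plant dynamics, process noise var, observation noise var
    delay = 3                        # observation time delay, in steps
    threshold = 2.0                  # decide "abnormal" when the estimate exceeds this

    x, xhat, P = 0.0, 0.0, 1.0
    buffer = [0.0] * delay           # delayed measurement channel
    for t in range(200):
        x = a * x + rng.normal(0.0, np.sqrt(q)) + 0.05        # slowly drifting plant state
        buffer.append(x + rng.normal(0.0, np.sqrt(r)))
        y = buffer.pop(0)                                     # the human sees a delayed, noisy sample
        xhat, P = a * xhat, a * a * P + q                     # Kalman predict
        K = P / (P + r)
        xhat, P = xhat + K * (y - xhat), (1.0 - K) * P        # Kalman update
        if xhat > threshold:
            print(f"decision at step {t}: estimated state {xhat:.2f} exceeds {threshold}")
            break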
NASA Technical Reports Server (NTRS)
Johannsen, G.; Govindaraj, T.
1980-01-01
The influence of different types of predictor displays in a longitudinal vertical takeoff and landing (VTOL) hover task is analyzed in a theoretical study. Several cases with differing amounts of predictive and rate information are compared. The optimal control model of the human operator is used to estimate human and system performance in terms of root-mean-square (rms) values and to compute optimized attention allocation. The only part of the model which is varied to predict these data is the observation matrix. Typical cases are selected for a subsequent experimental validation. The rms values as well as eye-movement data are recorded. The results agree favorably with those of the theoretical study in terms of relative differences. Better matching is achieved by revised model input data.
NASA Technical Reports Server (NTRS)
Chappell, Steven P.; Norcross, Jason R.; Gernhardt, Michael L.
2009-01-01
NASA's Constellation Program has plans to return to the Moon within the next 10 years. Although reaching the Moon during the Apollo Program was a remarkable human engineering achievement, fewer than 20 extravehicular activities (EVAs) were performed. Current projections indicate that the next lunar exploration program will require thousands of EVAs, which will require spacesuits that are better optimized for human performance. Limited mobility and dexterity, and the position of the center of gravity (CG) are a few of many features of the Apollo suit that required significant crew compensation to accomplish the objectives. Development of a new EVA suit system will ideally result in performance close to or better than that in shirtsleeves at 1 G, i.e., in "a suit that is a pleasure to work in, one that you would want to go out and explore in on your day off." Unlike the Shuttle program, in which only a fraction of the crew perform EVA, the Constellation program will require that all crewmembers be able to perform EVA. As a result, suits must be built to accommodate and optimize performance for a larger range of crew anthropometry, strength, and endurance. To address these concerns, NASA has begun a series of tests to better understand the factors affecting human performance and how to utilize various lunar gravity simulation environments available for testing.
Application of high-performance computing to numerical simulation of human movement
NASA Technical Reports Server (NTRS)
Anderson, F. C.; Ziegler, J. M.; Pandy, M. G.; Whalen, R. T.
1995-01-01
We have examined the feasibility of using massively-parallel and vector-processing supercomputers to solve large-scale optimization problems for human movement. Specifically, we compared the computational expense of determining the optimal controls for the single support phase of gait using a conventional serial machine (SGI Iris 4D25), a MIMD parallel machine (Intel iPSC/860), and a parallel-vector-processing machine (Cray Y-MP 8/864). With the human body modeled as a 14 degree-of-freedom linkage actuated by 46 musculotendinous units, computation of the optimal controls for gait could take up to 3 months of CPU time on the Iris. Both the Cray and the Intel are able to reduce this time to practical levels. The optimal solution for gait can be found with about 77 hours of CPU on the Cray and with about 88 hours of CPU on the Intel. Although the overall speeds of the Cray and the Intel were found to be similar, the unique capabilities of each machine are better suited to different portions of the computational algorithm used. The Intel was best suited to computing the derivatives of the performance criterion and the constraints whereas the Cray was best suited to parameter optimization of the controls. These results suggest that the ideal computer architecture for solving very large-scale optimal control problems is a hybrid system in which a vector-processing machine is integrated into the communication network of a MIMD parallel machine.
Optimal Configuration of Human Motion Tracking Systems: A Systems Engineering Approach
NASA Technical Reports Server (NTRS)
Henderson, Steve
2005-01-01
Human motion tracking systems represent a crucial technology in the area of modeling and simulation. These systems, which allow engineers to capture human motion for study or replication in virtual environments, have broad applications in several research disciplines including human engineering, robotics, and psychology. These systems are based on several sensing paradigms, including electro-magnetic, infrared, and visual recognition. Each of these paradigms requires specialized environments and hardware configurations to optimize performance of the human motion tracking system. Ideally, these systems are used in a laboratory or other facility that was designed to accommodate the particular sensing technology. For example, electromagnetic systems are highly vulnerable to interference from metallic objects, and should be used in a specialized lab free of metal components.
Dynamic motion planning of 3D human locomotion using gradient-based optimization.
Kim, Hyung Joo; Wang, Qian; Rahmatalla, Salam; Swan, Colby C; Arora, Jasbir S; Abdel-Malek, Karim; Assouline, Jose G
2008-06-01
Since humans can walk with an infinite variety of postures and limb movements, there is no unique solution to the modeling problem to predict human gait motions. Accordingly, we test herein the hypothesis that the redundancy of human walking mechanisms makes solving for human joint profiles and force time histories an indeterminate problem best solved by inverse dynamics and optimization methods. A new optimization-based human-modeling framework is thus described for predicting three-dimensional human gait motions on level and inclined planes. The basic unknowns in the framework are the joint motion time histories of a 25-degree-of-freedom human model and its six global degrees of freedom. The joint motion histories are calculated by minimizing an objective function such as deviation of the trunk from upright posture that relates to the human model's performance. A variety of important constraints are imposed on the optimization problem, including (1) satisfaction of dynamic equilibrium equations by requiring the model's zero moment point (ZMP) to lie within the instantaneous geometrical base of support, (2) foot collision avoidance, (3) limits on ground-foot friction, and (4) vanishing yawing moment. Analytical forms of objective and constraint functions are presented and discussed for the proposed human-modeling framework in which the resulting optimization problems are solved using gradient-based mathematical programming techniques. When the framework is applied to the modeling of bipedal locomotion on level and inclined planes, acyclic human walking motions that are smooth and realistic as opposed to less natural robotic motions are obtained. The aspects of the modeling framework requiring further investigation and refinement, as well as potential applications of the framework in biomechanics, are discussed.
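To show the structure of such a formulation (a performance objective minimized subject to a ZMP-style inequality) without reproducing the paper's 25-degree-of-freedom model, the sketch below optimizes a single trunk-angle trajectory with a gradient-based method (SLSQP). The dynamics surrogate, dimensions, and base-of-support width are invented for illustration.

    import numpy as np
    from scipy.optimize import minimize

    N, dt, h = 11, 0.1, 1.0                      # time nodes, step, "trunk" height (m)

    def objective(theta):
        return np.sum(theta**2)                  # deviation from upright posture

    def zmp_margin(theta):
        # finite-difference angular acceleration -> crude horizontal ZMP surrogate
        acc = np.gradient(np.gradient(theta, dt), dt)
        zmp = h * np.sin(theta) - (h / 9.81) * acc
        return 0.25 - np.abs(zmp)                # >= 0 at every node when feasible

    cons = [{"type": "ineq", "fun": zmp_margin},
            {"type": "eq", "fun": lambda th: th[0] - 0.3}]   # imposed initial lean
    theta0 = np.linspace(0.3, 0.0, N)
    res = minimize(objective, theta0, method="SLSQP", constraints=cons)
    print(res.success, np.round(res.x, 3))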
2012-01-01
Scenario-based training exemplifies the learning-by-doing approach to human performance improvement. In this paper, we enumerate ... through a narrative, mission, quest, or scenario. In this paper we argue for a combinatorial optimization search approach to selecting and ordering ... the role of an expert for the purposes of practicing skills and knowledge in realistic situations in a learning-by-doing approach to performance
Magic in the machine: a computational magician's assistant
Williams, Howard; McOwan, Peter W.
2014-01-01
A human magician blends science, psychology, and performance to create a magical effect. In this paper we explore what can be achieved when that human intelligence is replaced or assisted by machine intelligence. Magical effects are all in some form based on hidden mathematical, scientific, or psychological principles; often the parameters controlling these underpinning techniques are hard for a magician to blend to maximize the magical effect required. The complexity is often caused by interacting and often conflicting physical and psychological constraints that need to be optimally balanced. Normally this tuning is done by trial and error, combined with human intuitions. Here we focus on applying Artificial Intelligence methods to the creation and optimization of magic tricks exploiting mathematical principles. We use experimentally derived data about particular perceptual and cognitive features, combined with a model of the underlying mathematical process to provide a psychologically valid metric to allow optimization of magical impact. In the paper we introduce our optimization methodology and describe how it can be flexibly applied to a range of different types of mathematics based tricks. We also provide two case studies as exemplars of the methodology at work: a magical jigsaw, and a mind reading card trick effect. We evaluate each trick created through testing in laboratory and public performances, and further demonstrate the real world efficacy of our approach for professional performers through sales of the tricks in a reputable magic shop in London. PMID:25452736
A Neuroscience Approach to Optimizing Brain Resources for Human Performance in Extreme Environments
Paulus, Martin P.; Potterat, Eric G.; Taylor, Marcus K.; Van Orden, Karl F.; Bauman, James; Momen, Nausheen; Padilla, Genieleah A.; Swain, Judith L.
2009-01-01
Extreme environments requiring optimal cognitive and behavioral performance occur in a wide variety of situations ranging from complex combat operations to elite athletic competitions. Although a large literature characterizes psychological and other aspects of individual differences in performances in extreme environments, virtually nothing is known about the underlying neural basis for these differences. This review summarizes the cognitive, emotional, and behavioral consequences of exposure to extreme environments, discusses predictors of performance, and builds a case for the use of neuroscience approaches to quantify and understand optimal cognitive and behavioral performance. Extreme environments are defined as an external context that exposes individuals to demanding psychological and/or physical conditions, and which may have profound effects on cognitive and behavioral performance. Examples of these types of environments include combat situations, Olympic-level competition, and expeditions in extreme cold, at high altitudes, or in space. Optimal performance is defined as the degree to which individuals achieve a desired outcome when completing goal-oriented tasks. It is hypothesized that individual variability with respect to optimal performance in extreme environments depends on a well “contextualized” internal body state that is associated with an appropriate potential to act. This hypothesis can be translated into an experimental approach that may be useful for quantifying the degree to which individuals are particularly suited to performing optimally in demanding environments. PMID:19447132
Memory monitoring by animals and humans
NASA Technical Reports Server (NTRS)
Smith, J. D.; Shields, W. E.; Allendoerfer, K. R.; Washburn, D. A.; Rumbaugh, D. M. (Principal Investigator)
1998-01-01
The authors asked whether animals and humans would use similarly an uncertain response to escape indeterminate memories. Monkeys and humans performed serial probe recognition tasks that produced differential memory difficulty across serial positions (e.g., primacy and recency effects). Participants were given an escape option that let them avoid any trials they wished and receive a hint to the trial's answer. Across species, across tasks, and even across conspecifics with sharper or duller memories, monkeys and humans used the escape option selectively when more indeterminate memory traces were probed. Their pattern of escaping always mirrored the pattern of their primary memory performance across serial positions. Signal-detection analyses confirm the similarity of the animals' and humans' performances. Optimality analyses assess their efficiency. Several aspects of monkeys' performance suggest the cognitive sophistication of their decisions to escape.
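The signal-detection analyses mentioned here reduce to computing sensitivity (d') and bias from response counts. A minimal version, with invented counts standing in for a monkey's and a human's serial probe recognition data, is sketched below.

    from statistics import NormalDist

    def dprime(hits, misses, false_alarms, correct_rejections):
        """Equal-variance signal-detection indices from response counts."""
        z = NormalDist().inv_cdf
        # log-linear correction keeps rates of 0 or 1 finite
        hr = (hits + 0.5) / (hits + misses + 1.0)
        far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        d = z(hr) - z(far)
        c = -0.5 * (z(hr) + z(far))
        return d, c

    # Invented counts for two observers on the same recognition task.
    print(dprime(78, 22, 14, 86))   # e.g., a monkey
    print(dprime(84, 16, 10, 90))   # e.g., a human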
An improved real time image detection system for elephant intrusion along the forest border areas.
Sugumar, S J; Jayaparvathy, R
2014-01-01
Human-elephant conflict is a major problem leading to crop damage, human death and injuries caused by elephants, and elephants being killed by humans. In this paper, we propose an automated unsupervised elephant image detection system (EIDS) as a solution to human-elephant conflict in the context of elephant conservation. The elephant's image is captured in the forest border areas and is sent to a base station via an RF network. The received image is decomposed using the Haar wavelet to obtain multilevel wavelet coefficients, with which we perform image feature extraction and similarity matching between the elephant query image and the database image using image vision algorithms. A GSM message is sent to the forest officials indicating that an elephant has been detected at the forest border and is approaching human habitat. We propose an optimized distance metric to improve the image retrieval time from the database. We compare the optimized distance metric with the popular Euclidean and Manhattan distance methods. The proposed optimized distance metric retrieves more images with less retrieval time than the other distance metrics, which makes the optimized distance method more efficient and reliable.
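The abstract does not give the optimized metric itself; as a sketch of the comparison it describes, the code below retrieves nearest database entries under Euclidean, Manhattan, and a generic variance-weighted distance (a placeholder for the paper's optimized metric) over synthetic wavelet-coefficient feature vectors.

    import numpy as np

    rng = np.random.default_rng(1)
    db = rng.normal(size=(500, 64))          # synthetic wavelet-coefficient features
    query = db[42] + 0.1 * rng.normal(size=64)
    w = 1.0 / (np.var(db, axis=0) + 1e-9)    # placeholder weighting; NOT the paper's metric

    def retrieve(metric):
        d = metric(db, query)
        return np.argsort(d)[:5]             # indices of the 5 closest database images

    euclidean = lambda X, q: np.linalg.norm(X - q, axis=1)
    manhattan = lambda X, q: np.abs(X - q).sum(axis=1)
    weighted  = lambda X, q: np.sqrt(((X - q) ** 2 * w).sum(axis=1))

    for name, m in [("euclidean", euclidean), ("manhattan", manhattan), ("weighted", weighted)]:
        print(name, retrieve(m))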
Support Vector Machine Based on Adaptive Acceleration Particle Swarm Optimization
Abdulameer, Mohammed Hasan; Othman, Zulaiha Ali
2014-01-01
Existing face recognition methods utilize particle swarm optimizer (PSO) and opposition based particle swarm optimizer (OPSO) to optimize the parameters of SVM. However, the utilization of random values in the velocity calculation decreases the performance of these techniques; that is, during the velocity computation, we normally use random values for the acceleration coefficients and this creates randomness in the solution. To address this problem, an adaptive acceleration particle swarm optimization (AAPSO) technique is proposed. To evaluate our proposed method, we employ both face and iris recognition based on AAPSO with SVM (AAPSO-SVM). In the face and iris recognition systems, performance is evaluated using two human face databases, YALE and CASIA, and the UBiris dataset. In this method, we initially perform feature extraction and then recognition on the extracted features. In the recognition process, the extracted features are used for SVM training and testing. During the training and testing, the SVM parameters are optimized with the AAPSO technique, and in AAPSO, the acceleration coefficients are computed using the particle fitness values. The parameters in SVM, which are optimized by AAPSO, perform efficiently for both face and iris recognition. A comparative analysis between our proposed AAPSO-SVM and the PSO-SVM technique is presented. PMID:24790584
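The exact AAPSO update rule is not reproduced in the abstract. The sketch below shows the general idea of replacing random acceleration coefficients with coefficients derived from particle fitness, on a toy quadratic objective standing in for SVM cross-validation error; the coefficient rule and all constants are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda X: np.sum((X - 1.5) ** 2, axis=1)          # stand-in for SVM parameter-tuning error

    n, dim, iters, w_inertia = 30, 2, 100, 0.7
    X = rng.uniform(-5, 5, (n, dim)); V = np.zeros_like(X)
    pbest, pbest_f = X.copy(), f(X)
    gbest = pbest[np.argmin(pbest_f)]

    for _ in range(iters):
        fit = f(X)
        better = fit < pbest_f
        pbest[better], pbest_f[better] = X[better], fit[better]
        gbest = pbest[np.argmin(pbest_f)]
        # deterministic coefficients that vary with each particle's relative fitness
        # (an illustrative rule, not the published AAPSO formula)
        rank = (fit - fit.min()) / (fit.max() - fit.min() + 1e-12)
        c1 = (1.0 + rank)[:, None]
        c2 = (2.0 - rank)[:, None]
        V = w_inertia * V + c1 * (pbest - X) + c2 * (gbest - X)
        X = X + V

    print("best parameters:", np.round(gbest, 3), "objective:", f(gbest[None])[0])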
Radiation -- A Cosmic Hazard to Human Habitation in Space
NASA Technical Reports Server (NTRS)
Lewis, Ruthan; Pellish, Jonathan
2017-01-01
Radiation exposure is one of the greatest environmental threats to the performance and success of human and robotic space missions. Radiation permeates all space and aeronautical systems, challenges optimal and reliable performance, and tests survival and survivability. We will discuss the broad scope of research, technological, and operational considerations to forecast and mitigate the effects of the radiation environment for deep space and planetary exploration.
Human Mars Ascent Vehicle Performance Sensitivities
NASA Technical Reports Server (NTRS)
Polsgrove, Tara P.; Thomas, Herbert D.
2016-01-01
Human Mars mission architecture studies have shown that the ascent vehicle mass drives performance requirements for the descent and in-space transportation elements. Understanding the sensitivity of Mars ascent vehicle (MAV) mass to various mission and vehicle design choices enables overall transportation system optimization. This paper presents the results of a variety of sensitivity trades affecting MAV performance including: landing site latitude, target orbit, initial thrust to weight ratio, staging options, specific impulse, propellant type and engine design.
NASA Astrophysics Data System (ADS)
Petrov, Dimitar; Michielsen, Koen; Cockmartin, Lesley; Zhang, Gouzhi; Young, Kenneth; Marshall, Nicholas; Bosmans, Hilde
2016-03-01
Digital breast tomosynthesis (DBT) is a 3D mammography technique that promises better visualization of low contrast lesions than conventional 2D mammography. A wide range of parameters influence the diagnostic information in DBT images and a systematic means of DBT system optimization is needed. The gold standard for image quality assessment is to perform a human observer experiment with experienced readers. Using human observers for optimization is time consuming and not feasible for the large parameter space of DBT. Our goal was to develop a model observer (MO) that can predict human reading performance for standard detection tasks of target objects within a structured phantom and subsequently apply it in a first comparative study. The phantom consists of an acrylic semi-cylindrical container with acrylic spheres of different sizes and the remaining space filled with water. Three types of lesions were included: 3D printed spiculated and non-spiculated mass lesions along with calcification groups. The images of the two mass lesion types were reconstructed with 3 different reconstruction methods (FBP, FBP with SRSAR, MLTRpr) and read by human readers. A Channelized Hotelling model observer was created for the non-spiculated lesion detection task using five Laguerre-Gauss channels, tuned for better performance. For the non-spiculated mass lesions a linear relation between the MO and human observer results was found, with correlation coefficients of 0.956 for standard FBP, 0.998 for FBP with SRSAR and 0.940 for MLTRpr. Both the MO and human observer percentage correct results for the spiculated masses were close to 100%, and showed no difference from each other for every reconstruction algorithm.
NASA Human Health and Performance Strategy
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.
2012-01-01
In May 2007, what was then the Space Life Sciences Directorate issued the 2007 Space Life Sciences Strategy for Human Space Exploration. In January 2012, leadership and key directorate personnel were once again brought together to assess the current and expected future environment against its 2007 Strategy and the Agency and Johnson Space Center goals and strategies. The result was a refined vision and mission, and revised goals, objectives, and strategies. One of the first changes implemented was to rename the directorate from Space Life Sciences to Human Health and Performance to better reflect our vision and mission. The most significant change in the directorate from 2007 to the present is the integration of the Human Research Program and Crew Health and Safety activities. Subsequently, the Human Health and Performance Directorate underwent a reorganization to achieve enhanced integration of research and development with operations to better support human spaceflight and International Space Station utilization. These changes also enable a more effective and efficient approach to human system risk mitigation. Since 2007, we have also made significant advances in external collaboration and implementation of new business models within the directorate and the Agency, and through two newly established virtual centers, the NASA Human Health and Performance Center and the Center of Excellence for Collaborative Innovation. Our 2012 Strategy builds upon these successes to address the Agency's increased emphasis on societal relevance and being a leader in research and development and innovative business and communications practices. The 2012 Human Health and Performance Vision is to lead the world in human health and performance innovations for life in space and on Earth. Our mission is to enable optimization of human health and performance throughout all phases of spaceflight. All HHPD functions are ultimately aimed at achieving this mission. Our activities enable mission success, optimizing human health and productivity in space before, during, and after the actual spaceflight experience of our crews, and include support for ground-based functions. Many of our spaceflight innovations also provide solutions for terrestrial challenges, thereby enhancing life on Earth. Our strategic goals are aimed at leading human exploration and ISS utilization, leading human health and performance internationally, excelling in management and advancement of innovations in health and human system integration, and expanding relevance to life on Earth and creating enduring support and enthusiasm for space exploration.
Lin, Yi-Chung; Pandy, Marcus G
2017-07-05
The aim of this study was to perform full-body three-dimensional (3D) dynamic optimization simulations of human locomotion by driving a neuromusculoskeletal model toward in vivo measurements of body-segmental kinematics and ground reaction forces. Gait data were recorded from 5 healthy participants who walked at their preferred speeds and ran at 2m/s. Participant-specific data-tracking dynamic optimization solutions were generated for one stride cycle using direct collocation in tandem with an OpenSim-MATLAB interface. The body was represented as a 12-segment, 21-degree-of-freedom skeleton actuated by 66 muscle-tendon units. Foot-ground interaction was simulated using six contact spheres under each foot. The dynamic optimization problem was to find the set of muscle excitations needed to reproduce 3D measurements of body-segmental motions and ground reaction forces while minimizing the time integral of muscle activations squared. Direct collocation took on average 2.7±1.0h and 2.2±1.6h of CPU time, respectively, to solve the optimization problems for walking and running. Model-computed kinematics and foot-ground forces were in good agreement with corresponding experimental data while the calculated muscle excitation patterns were consistent with measured EMG activity. The results demonstrate the feasibility of implementing direct collocation on a detailed neuromusculoskeletal model with foot-ground contact to accurately and efficiently generate 3D data-tracking dynamic optimization simulations of human locomotion. The proposed method offers a viable tool for creating feasible initial guesses needed to perform predictive simulations of movement using dynamic optimization theory. The source code for implementing the model and computational algorithm may be downloaded at http://simtk.org/home/datatracking. Copyright © 2017 Elsevier Ltd. All rights reserved.
Saini, Sanjay; Zakaria, Nordin; Rambli, Dayang Rohaya Awang; Sulaiman, Suziah
2015-01-01
The high-dimensional search space involved in markerless full-body articulated human motion tracking from multiple-view video sequences has led to a number of solutions based on metaheuristics, the most recent form of which is Particle Swarm Optimization (PSO). However, the classical PSO suffers from premature convergence and is easily trapped in local optima, significantly affecting the tracking accuracy. To overcome these drawbacks, we have developed a method for the problem based on Hierarchical Multi-Swarm Cooperative Particle Swarm Optimization (H-MCPSO). The tracking problem is formulated as a non-linear 34-dimensional function optimization problem where the fitness function quantifies the difference between the observed image and a projection of the model configuration. Both the silhouette and edge likelihoods are used in the fitness function. Experiments using the Brown and HumanEva-II datasets demonstrated that H-MCPSO performance is better than two leading alternative approaches: the Annealed Particle Filter (APF) and Hierarchical Particle Swarm Optimization (HPSO). Further, the proposed tracking method is capable of automatic initialization and self-recovery from temporary tracking failures. Comprehensive experimental results are presented to support the claims.
Space Human Factors: Research to Application
NASA Technical Reports Server (NTRS)
Woolford, Barbara
2008-01-01
Human Factors has been instrumental in preventing potential on-orbit hazards and increasing overall crew safety. Poor performance and operational learning curves on orbit are mitigated. Human-centered design is applied to optimize design and minimize potentially hazardous conditions, especially with larger crew sizes and habitat constraints. Lunar and Mars requirements and design developments are enhanced, based on ISS Lessons Learned.
Muramyl Peptide-Enhanced Sleep: Pharmacological Optimization of Performance
1990-06-01
Human problem solving performance in a fault diagnosis task
NASA Technical Reports Server (NTRS)
Rouse, W. B.
1978-01-01
It is proposed that humans in automated systems will be asked to assume the role of troubleshooter or problem solver and that the problems which they will be asked to solve in such systems will not be amenable to rote solution. The design of visual displays for problem solving in such situations is considered, and the results of two experimental investigations of human problem solving performance in the diagnosis of faults in graphically displayed network problems are discussed. The effects of problem size, forced-pacing, computer aiding, and training are considered. Results indicate that human performance deviates from optimality as problem size increases. Forced-pacing appears to cause the human to adopt fairly brute force strategies, as compared to those adopted in self-paced situations. Computer aiding substantially lessens the number of mistaken diagnoses by performing the bookkeeping portions of the task.
HURON (HUman and Robotic Optimization Network) Multi-Agent Temporal Activity Planner/Scheduler
NASA Technical Reports Server (NTRS)
Hua, Hook; Mrozinski, Joseph J.; Elfes, Alberto; Adumitroaie, Virgil; Shelton, Kacie E.; Smith, Jeffrey H.; Lincoln, William P.; Weisbin, Charles R.
2012-01-01
HURON solves the problem of how to optimize a plan and schedule for assigning multiple agents to a temporal sequence of actions (e.g., science tasks). Developed as a generic planning and scheduling tool, HURON has been used to optimize space mission surface operations. The tool has also been used to analyze lunar architectures for a variety of surface operational scenarios in order to maximize return on investment and productivity. These scenarios include numerous science activities performed by a diverse set of agents: humans, teleoperated rovers, and autonomous rovers. Once given a set of agents, activities, resources, resource constraints, temporal constraints, and dependencies, HURON computes an optimal schedule that meets a specified goal (e.g., maximum productivity or minimum time), subject to the constraints. HURON performs planning and scheduling optimization as a graph search in state-space with forward progression. Each node in the graph contains a state instance. Starting with the initial node, a graph is automatically constructed with new successive nodes of each new state to explore. The optimization uses a set of pre-conditions and post-conditions to create the children states. The Python language was adopted to not only enable more agile development, but to also allow the domain experts to easily define their optimization models. A graphical user interface was also developed to facilitate real-time search information feedback and interaction by the operator in the search optimization process. The HURON package has many potential uses in the fields of Operations Research and Management Science where this technology applies to many commercial domains requiring optimization to reduce costs. For example, optimizing a fleet of transportation truck routes, aircraft flight scheduling, and other route-planning scenarios involving multiple agent task optimization would all benefit by using HURON.
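The record notes that HURON itself is written in Python and searches the state space forward under preconditions and postconditions. The sketch below is a much-reduced version of that idea: a uniform-cost forward search that assigns three invented activities to two agents, enforces one precondition, and minimizes makespan. Activities, durations, and agents are illustrative assumptions, not HURON data.

    import heapq

    durations = {("survey", "human"): 2, ("survey", "rover"): 5,
                 ("drill",  "human"): 4, ("drill",  "rover"): 3,
                 ("sample", "human"): 1, ("sample", "rover"): 2}
    activities = ("survey", "drill", "sample")
    agents = ("human", "rover")

    def successors(state):
        finishes, free = dict(state[0]), state[1]
        for act in activities:
            if act in finishes:
                continue
            if act == "drill" and "survey" not in finishes:
                continue                              # precondition: survey completed first
            for i, ag in enumerate(agents):
                start = free[i] if act != "drill" else max(free[i], finishes["survey"])
                end = start + durations[(act, ag)]
                new_free = list(free); new_free[i] = end
                new_fin = dict(finishes); new_fin[act] = end
                yield (tuple(sorted(new_fin.items())), tuple(new_free))

    init = ((), (0, 0))
    seen, heap, tie = {}, [(0, 0, init)], 1
    while heap:
        cost, _, state = heapq.heappop(heap)
        if len(state[0]) == len(activities):
            print("optimal makespan:", cost, "schedule:", state[0]); break
        for nxt in successors(state):
            c = max(nxt[1])                           # makespan so far
            if c < seen.get(nxt, float("inf")):
                seen[nxt] = c
                heapq.heappush(heap, (c, tie, nxt)); tie += 1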
Perceptual precision of passive body tilt is consistent with statistically optimal cue integration
Karmali, Faisal; Nicoucar, Keyvan; Merfeld, Daniel M.
2017-01-01
When making perceptual decisions, humans have been shown to optimally integrate independent noisy multisensory information, matching maximum-likelihood (ML) limits. Such ML estimators provide a theoretic limit to perceptual precision (i.e., minimal thresholds). However, how the brain combines two interacting (i.e., not independent) sensory cues remains an open question. To study the precision achieved when combining interacting sensory signals, we measured perceptual roll tilt and roll rotation thresholds between 0 and 5 Hz in six normal human subjects. Primary results show that roll tilt thresholds between 0.2 and 0.5 Hz were significantly lower than predicted by a ML estimator that includes only vestibular contributions that do not interact. In this paper, we show how other cues (e.g., somatosensation) and an internal representation of sensory and body dynamics might independently contribute to the observed performance enhancement. In short, a Kalman filter was combined with an ML estimator to match human performance, whereas the potential contribution of nonvestibular cues was assessed using published bilateral loss patient data. Our results show that a Kalman filter model including previously proven canal-otolith interactions alone (without nonvestibular cues) can explain the observed performance enhancements as can a model that includes nonvestibular contributions. NEW & NOTEWORTHY We found that human whole body self-motion direction-recognition thresholds measured during dynamic roll tilts were significantly lower than those predicted by a conventional maximum-likelihood weighting of the roll angular velocity and quasistatic roll tilt cues. Here, we show that two models can each match this “apparent” better-than-optimal performance: 1) inclusion of a somatosensory contribution and 2) inclusion of a dynamic sensory interaction between canal and otolith cues via a Kalman filter model. PMID:28179477
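The maximum-likelihood benchmark used here is the standard rule for two independent cues, 1/sigma_c^2 = 1/sigma_1^2 + 1/sigma_2^2, with thresholds proportional to sigma. The sketch below computes that prediction from two single-cue thresholds; the numbers are invented, and a measured threshold below the prediction is what the paper calls "apparently better than optimal" performance.

    import math

    def ml_combined_threshold(t1, t2):
        """ML prediction for two independent cues (thresholds proportional to sigma)."""
        return 1.0 / math.sqrt(1.0 / t1**2 + 1.0 / t2**2)

    # Invented single-cue thresholds at one frequency.
    rotation_only, quasistatic_tilt = 1.2, 0.9
    pred = ml_combined_threshold(rotation_only, quasistatic_tilt)
    measured = 0.55        # hypothetical measured roll-tilt threshold
    print(f"ML prediction {pred:.2f}; a measured value of {measured} below it points to "
          "extra cues (e.g., somatosensation) or canal-otolith interaction via an "
          "internal model such as a Kalman filter.")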
Ancient DNA in human bone remains from Pompeii archaeological site.
Cipollaro, M; Di Bernardo, G; Galano, G; Galderisi, U; Guarino, F; Angelini, F; Cascino, A
1998-06-29
aDNA extraction and amplification procedures have been optimized for Pompeian human bone remains whose diagenesis has been determined by histological analysis. Amplification of single-copy genes (the X and Y amelogenin loci and Y-specific alphoid repeat sequences) has been performed and the results compared with anthropometric data on sexing.
ERIC Educational Resources Information Center
Simen, Patrick; Contreras, David; Buck, Cara; Hu, Peter; Holmes, Philip; Cohen, Jonathan D.
2009-01-01
The drift-diffusion model (DDM) implements an optimal decision procedure for stationary, 2-alternative forced-choice tasks. The height of a decision threshold applied to accumulating information on each trial determines a speed-accuracy tradeoff (SAT) for the DDM, thereby accounting for a ubiquitous feature of human performance in speeded response…
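A minimal DDM simulation makes the threshold-controlled speed-accuracy tradeoff concrete: raising the decision threshold increases accuracy at the cost of longer decision times. Parameter values below are assumptions chosen only for illustration.

    import numpy as np

    def simulate_ddm(drift=0.25, threshold=1.0, noise=1.0, dt=0.001, n=1000, seed=0):
        """Accuracy and mean decision time of a symmetric two-boundary drift-diffusion model."""
        rng = np.random.default_rng(seed)
        correct, rts = 0, []
        for _ in range(n):
            x, t = 0.0, 0.0
            while abs(x) < threshold:
                x += drift * dt + noise * np.sqrt(dt) * rng.normal()
                t += dt
            correct += x >= threshold
            rts.append(t)
        return correct / n, float(np.mean(rts))

    for a in (0.5, 1.0, 1.5):            # higher threshold: slower but more accurate
        acc, rt = simulate_ddm(threshold=a)
        print(f"threshold {a}: accuracy {acc:.2f}, mean decision time {rt:.2f} s")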
NASA Astrophysics Data System (ADS)
Kaewkasi, Pitchaya; Widjaja, Joewono; Uozumi, Jun
2007-03-01
Effects of threshold value on detection performance of the modified amplitude-modulated joint transform correlator are quantitatively studied using computer simulation. Fingerprint and human face images are used as test scenes in the presence of noise and a contrast difference. Simulation results demonstrate that this correlator improves detection performance for both types of image used, but more so for human face images. Optimal detection of low-contrast human face images obscured by strong noise can be obtained by selecting an appropriate threshold value.
The Exploration-Exploitation Dilemma: A Multidisciplinary Framework
Berger-Tal, Oded; Meron, Ehud; Saltz, David
2014-01-01
The trade-off between the need to obtain new knowledge and the need to use that knowledge to improve performance is one of the most basic trade-offs in nature, and optimal performance usually requires some balance between exploratory and exploitative behaviors. Researchers in many disciplines have been searching for the optimal solution to this dilemma. Here we present a novel model in which the exploration strategy itself is dynamic and varies with time in order to optimize a definite goal, such as the acquisition of energy, money, or prestige. Our model produced four very distinct phases: Knowledge establishment, Knowledge accumulation, Knowledge maintenance, and Knowledge exploitation, giving rise to a multidisciplinary framework that applies equally to humans, animals, and organizations. The framework can be used to explain a multitude of phenomena in various disciplines, such as the movement of animals in novel landscapes, the most efficient resource allocation for a start-up company, or the effects of old age on knowledge acquisition in humans. PMID:24756026
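As a loose computational analogy (not the paper's model), the exploration-exploitation balance with a time-varying exploration strategy can be illustrated with a multi-armed bandit whose exploration rate decays over time: early steps mostly acquire knowledge, later steps mostly exploit it. All values below are assumptions.

    import numpy as np

    rng = np.random.default_rng(3)
    true_means = rng.normal(0.0, 1.0, size=10)
    estimates, counts, total = np.zeros(10), np.zeros(10), 0.0

    for t in range(1, 2001):
        eps = max(0.05, 1.0 / np.sqrt(t))          # dynamic exploration schedule
        if rng.random() < eps:
            arm = int(rng.integers(10))            # explore: acquire new knowledge
        else:
            arm = int(np.argmax(estimates))        # exploit: use existing knowledge
        reward = rng.normal(true_means[arm], 1.0)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward

    print(f"collected {total:.0f}; best possible mean per step {true_means.max():.2f}")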
Simulation of the human-telerobot interface on the Space Station
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Smith, Randy L.
1993-01-01
Many issues remain unresolved concerning the components of the human-telerobot interface presented in this work. It is critical that these components be optimally designed and arranged to ensure not only that the overall system's goals are met, but also that the intended end-user has been optimally accommodated. With sufficient testing and evaluation throughout the development cycle, the selection of the components to use in the final telerobotic system can promote efficient, error-free performance. It is recommended that whole-system simulation with full-scale mockups be used to help design the human-telerobot interface. It is contended that the use of simulation can facilitate this design and evaluation process.
Sakuma, Kaname; Tanaka, Akira; Mataga, Izumi
2016-12-01
The collagen gel droplet-embedded culture drug sensitivity test (CD-DST) is an anticancer drug sensitivity test that uses a method of three-dimensional culture of extremely small samples, and it is suited to primary cultures of human cancer cells. It is a useful method for oral squamous cell carcinoma (OSCC), in which the cancer tissues available for testing are limited. However, since the optimal contact concentrations of anticancer drugs have yet to be established in OSCC, CD-DST for detecting drug sensitivities of OSCC is currently performed by applying the optimal contact concentrations for stomach cancer. In the present study, squamous carcinoma cell lines from human oral cancer were used to investigate the optimal contact concentrations of cisplatin (CDDP) and fluorouracil (5-FU) during CD-DST for OSCC. CD-DST was performed in 7 squamous cell carcinoma cell lines derived from human oral cancers (Ca9-22, HSC-3, HSC-4, HO-1-N-1, KON, OSC-19 and SAS) using CDDP (0.15, 0.3, 1.25, 2.5, 5.0 and 10.0 µg/ml) and 5-FU (0.4, 0.9, 1.8, 3.8, 7.5, 15.0 and 30.0 µg/ml), and the optimal contact concentrations were calculated from the clinical response rate of OSCC to single-drug treatment and the in vitro efficacy rate curve. The optimal concentrations were 0.5 µg/ml for CDDP and 0.7 µg/ml for 5-FU. The antitumor efficacy of CDDP at this optimal contact concentration in CD-DST was compared to the antitumor efficacy in the nude mouse method. The T/C values, which were calculated as the ratio of the colony volume of the treatment group and the colony volume of the control group, at the optimal contact concentration of CDDP and of the nude mouse method were almost in agreement (P<0.05) and predicted clinical efficacy, indicating that the calculated optimal contact concentration is valid. Therefore, chemotherapy for OSCC based on anticancer drug sensitivity tests offers patients a greater freedom of choice and is likely to assume a greater importance in the selection of treatment from the perspectives of function preservation and quality of life, as well as representing a treatment option for unresectable, intractable or recurrent cases.
Visualization tool for human-machine interface designers
NASA Astrophysics Data System (ADS)
Prevost, Michael P.; Banda, Carolyn P.
1991-06-01
As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
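The spring, attractor, and repeller metaphor maps naturally onto a force-directed layout. The sketch below relaxes a handful of display elements under spring forces proportional to an invented relatedness matrix plus a short-range repulsion that keeps items apart; it illustrates the idea only and is not the ISLE algorithm.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6
    related = rng.random((n, n)); related = (related + related.T) / 2
    np.fill_diagonal(related, 0)
    pos = rng.uniform(0, 10, (n, 2))

    for _ in range(500):
        force = np.zeros_like(pos)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = pos[j] - pos[i]
                dist = np.linalg.norm(d) + 1e-6
                force[i] += 0.05 * related[i, j] * d            # spring toward related items
                force[i] -= 2.0 * d / dist**3                   # repeller keeps items apart
        pos += 0.1 * force                                      # small relaxation step

    print(np.round(pos, 2))     # candidate panel locations for the designer to inspect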
Prediction of pilot-aircraft stability boundaries and performance contours
NASA Technical Reports Server (NTRS)
Stengel, R. F.; Broussard, J. R.
1977-01-01
Control-theoretic pilot models can provide important new insights regarding the stability and performance characteristics of the pilot-aircraft system. Optimal-control pilot models can be formed for a wide range of flight conditions, suggesting that the human pilot can maintain stability if he adapts his control strategy to the aircraft's changing dynamics. Of particular concern is the effect of sub-optimal pilot adaptation as an aircraft transitions from low to high angle-of-attack during rapid maneuvering, as the changes in aircraft stability and control response can be extreme. This paper examines the effects of optimal and sub-optimal effort during a typical 'high-g' maneuver, and it introduces the concept of minimum-control effort (MCE) adaptation. Limited experimental results tend to support the MCE adaptation concept.
Evaluation of an Integrated Multi-Task Machine Learning System with Humans in the Loop
2007-01-01
machine learning components, natural language processing, and optimization ... was examined with a test explicitly developed to measure the impact of integrated machine learning when used by a human user in a real world setting ... study revealed that integrated machine learning does produce a positive impact on overall performance. This paper also discusses how specific machine learning components contributed to human-system ...
An optimized proportional-derivative controller for the human upper extremity with gravity.
Jagodnik, Kathleen M; Blana, Dimitra; van den Bogert, Antonie J; Kirsch, Robert F
2015-10-15
When Functional Electrical Stimulation (FES) is used to restore movement in subjects with spinal cord injury (SCI), muscle stimulation patterns should be selected to generate accurate and efficient movements. Ideally, the controller for such a neuroprosthesis will have the simplest architecture possible, to facilitate translation into a clinical setting. In this study, we used the simulated annealing algorithm to optimize two proportional-derivative (PD) feedback controller gain sets for a 3-dimensional arm model that includes musculoskeletal dynamics and has 5 degrees of freedom and 22 muscles, performing goal-oriented reaching movements. Controller gains were optimized by minimizing a weighted sum of position errors, orientation errors, and muscle activations. After optimization, gain performance was evaluated on the basis of accuracy and efficiency of reaching movements, along with three other benchmark gain sets not optimized for our system, on a large set of dynamic reaching movements for which the controllers had not been optimized, to test ability to generalize. Robustness in the presence of weakened muscles was also tested. The two optimized gain sets were found to have very similar performance to each other on all metrics, and to exhibit significantly better accuracy, compared with the three standard gain sets. All gain sets investigated used physiologically acceptable amounts of muscular activation. It was concluded that optimization can yield significant improvements in controller performance while still maintaining muscular efficiency, and that optimization should be considered as a strategy for future neuroprosthesis controller design. Published by Elsevier Ltd.
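The sketch below shows the shape of this approach on a deliberately tiny problem: simulated annealing over two PD gains for a 1-DOF mass-damper "arm" tracking a step target, with a cost combining tracking error and control effort. The plant, cost weights, and annealing schedule are assumptions; the paper's model has 5 degrees of freedom and 22 muscles.

    import math, random

    def cost(kp, kd, m=1.0, b=0.5, dt=0.01, T=3.0, target=1.0):
        x = v = 0.0
        err_sum = effort = 0.0
        for _ in range(int(T / dt)):
            u = kp * (target - x) - kd * v
            a = (u - b * v) / m
            v += a * dt
            x += v * dt
            err_sum += (target - x) ** 2 * dt
            effort += u ** 2 * dt
        return err_sum + 1e-3 * effort           # tracking error plus effort penalty

    random.seed(0)
    gains = [10.0, 1.0]
    best, best_cost = gains[:], cost(*gains)
    temp = 1.0
    for step in range(2000):
        cand = [max(0.0, g + random.gauss(0, 2.0)) for g in gains]
        dc = cost(*cand) - cost(*gains)
        if dc < 0 or random.random() < math.exp(-dc / temp):
            gains = cand
            if cost(*gains) < best_cost:
                best, best_cost = gains[:], cost(*gains)
        temp *= 0.995                            # geometric cooling schedule
    print("best gains kp, kd:", [round(g, 2) for g in best], "cost:", round(best_cost, 4))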
ERIC Educational Resources Information Center
Imangulova, Tatiyana; Makogonov, Aleksandr; Kulakhmetova, Gulbaram; Sardarov, Osman
2016-01-01
The development of desert areas in the industrial and tourist and educational purposes related to the implementation of physical activity in extreme conditions. A complex set of hot climate causes the body deep adaptive adjustment, impact on health, human physical performance. Optimization of physical activity in hot climates is of particular…
ERIC Educational Resources Information Center
Burns, Nicholas R.; Lee, Michael D.; Vickers, Douglas
2006-01-01
Studies of human problem solving have traditionally used deterministic tasks that require the execution of a systematic series of steps to reach a rational and optimal solution. Most real-world problems, however, are characterized by uncertainty, the need to consider an enormous number of variables and possible courses of action at each stage in…
Optimized Periocular Template Selection for Human Recognition
Sa, Pankaj K.; Majhi, Banshidhar
2013-01-01
A novel approach is proposed for selecting a rectangular template around the periocular region that is optimally suited for human recognition. A periocular template larger than the optimal one can be slightly more potent for recognition, but it heavily slows down the biometric system by making feature extraction computationally intensive and increasing the database size. A smaller template, on the contrary, cannot yield desirable recognition, although it performs faster because less computation is needed for feature extraction. These two contradictory objectives, namely (a) minimizing the size of the periocular template and (b) maximizing recognition through the template, are jointly optimized in the proposed research. This paper proposes four different approaches for dynamic optimal template selection from the periocular region. The proposed methods are tested on the publicly available unconstrained UBIRISv2 and FERET databases and satisfactory results have been achieved. The template thus obtained can be used for recognition of individuals in an organization and can be generalized to recognize every citizen of a nation. PMID:23984370
Review of Findings for Human Performance Contribution to Risk in Operating Events
2002-03-01
... and loss of DC power. Key to this event was failure to control setpoints on safety-related equipment and failure to maintain the load tap changer ... Therefore, "to optimize task execution at the job site, it is important to align organizational processes and values." Effective team skills are an ... reactor was blocked and the water level rapidly dropped to the automatic low-level scram setpoint. Human Performance Issues: Control rods were fully ...
An opinion formation based binary optimization approach for feature selection
NASA Astrophysics Data System (ADS)
Hamedmoghadam, Homayoun; Jalili, Mahdi; Yu, Xinghuo
2018-02-01
This paper proposes a novel optimization method based on opinion formation in complex network systems. The proposed optimization technique mimics the human-human interaction mechanism using a mathematical model derived from the social sciences. Our method encodes a subset of selected features as the opinion of an artificial agent and simulates the opinion formation process among a population of agents to solve the feature selection problem. The agents interact over an underlying interaction network structure and reach consensus in their opinions while finding better solutions to the problem. A number of mechanisms are employed to avoid getting trapped in local minima. We compare the performance of the proposed method with a number of classical population-based optimization methods and a state-of-the-art opinion formation based method. Our experiments on a number of high-dimensional datasets show that the proposed algorithm outperforms the others.
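A heavily simplified sketch of this style of search is given below: agents hold binary feature-subset "opinions" on a ring-shaped interaction network, partially adopt the opinion of a better-performing neighbour, and occasionally flip bits to escape local minima. The fitness here is a synthetic proxy (overlap with a hidden "useful" set minus a size penalty), not a trained classifier, and the update rule is an assumption rather than the paper's model.

    import numpy as np

    rng = np.random.default_rng(7)
    n_feat, n_agents, iters = 30, 20, 200
    useful = rng.random(n_feat) < 0.3                    # hidden ground truth (synthetic)

    def fitness(mask):
        return np.sum(mask & useful) - 0.2 * np.sum(mask & ~useful)

    opinions = rng.random((n_agents, n_feat)) < 0.5
    ring = [((i - 1) % n_agents, (i + 1) % n_agents) for i in range(n_agents)]

    for _ in range(iters):
        scores = np.array([fitness(o) for o in opinions])
        new = opinions.copy()
        for i, (a, b) in enumerate(ring):
            j = a if scores[a] >= scores[b] else b
            if scores[j] > scores[i]:
                copy_bits = rng.random(n_feat) < 0.3     # partially adopt a better neighbour's opinion
                new[i, copy_bits] = opinions[j, copy_bits]
            flip = rng.random(n_feat) < 0.01             # small random flips avoid local minima
            new[i, flip] = ~new[i, flip]
        opinions = new

    best = opinions[np.argmax([fitness(o) for o in opinions])]
    print("selected features:", np.flatnonzero(best), "fitness:", fitness(best))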
Behavior and neural basis of near-optimal visual search
Ma, Wei Ji; Navalpakkam, Vidhya; Beck, Jeffrey M; van den Berg, Ronald; Pouget, Alexandre
2013-01-01
The ability to search efficiently for a target in a cluttered environment is one of the most remarkable functions of the nervous system. This task is difficult under natural circumstances, as the reliability of sensory information can vary greatly across space and time and is typically a priori unknown to the observer. In contrast, visual-search experiments commonly use stimuli of equal and known reliability. In a target detection task, we randomly assigned high or low reliability to each item on a trial-by-trial basis. An optimal observer would weight the observations by their trial-to-trial reliability and combine them using a specific nonlinear integration rule. We found that humans were near-optimal, regardless of whether distractors were homogeneous or heterogeneous and whether reliability was manipulated through contrast or shape. We present a neural-network implementation of near-optimal visual search based on probabilistic population coding. The network matched human performance. PMID:21552276
Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception
Bejjanki, Vikranth Rao; Clayards, Meghan; Knill, David C.; Aslin, Richard N.
2011-01-01
Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues, during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one and participants' performance is consistent with an optimal model in which environmental, within category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue-combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks. PMID:21637344
Human breath metabolomics using an optimized noninvasive exhaled breath condensate sampler
Zamuruyev, Konstantin O.; Aksenov, Alexander A.; Pasamontes, Alberto; Brown, Joshua F.; Pettit, Dayna R.; Foutouhi, Soraya; Weimer, Bart C.; Schivo, Michael; Kenyon, Nicholas J.; Delplanque, Jean-Pierre; Davis, Cristina E.
2017-01-01
Exhaled breath condensate (EBC) analysis is a developing field with tremendous promise to advance personalized, non-invasive health diagnostics as new analytical instrumentation platforms and detection methods are developed. Multiple commercially-available and researcher-built experimental samplers are reported in the literature. However, there is very limited information available to determine an effective breath sampling approach, especially regarding the dependence of breath sample metabolomic content on the collection device design and sampling methodology. This lack of an optimal standard procedure results in a range of reported results that are sometimes contradictory. Here, we present a design of a portable human EBC sampler optimized for collection and preservation of the rich metabolomic content of breath. The performance of the engineered device is compared to two commercially available breath collection devices: the RTube™ and TurboDECCS. A number of design and performance parameters are considered, including: condenser temperature stability during sampling, collection efficiency, condenser material choice, and saliva contamination in the collected breath samples. The significance of the biological content of breath samples, collected with each device, is evaluated with a set of mass spectrometry methods and was the primary factor for evaluating device performance. The design includes an adjustable mass-size threshold for aerodynamic filtering of saliva droplets from the breath flow. Engineering an inexpensive device that allows efficient collection of metabolomic-rich breath samples is intended to aid further advancement in the field of breath analysis for non-invasive health diagnostics. EBC sampling from human volunteers was performed under UC Davis IRB protocol 63701-3 (09/30/2014-07/07/2017). PMID:28004639
Thompson-Bean, E; Das, R; McDaid, A
2016-10-31
We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied for other fluidic soft robotic devices.
Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.
Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V
2016-01-01
Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
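To make the parameter-fitting idea above concrete, the following minimal Python sketch treats the activation-decay parameter of a toy instance-based learning model as a one-dimensional optimization problem; the task model, the logistic retrieval rule, and the target accuracy of 0.65 are illustrative assumptions, not the ACT-R Sugar Factory model analyzed in the article.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def ibl_activation(ages, d):
        # ACT-R style base-level activation for instances observed at the given ages
        return np.log(np.sum(ages ** (-d)))

    def expected_accuracy(d, n_trials=40):
        # Toy stand-in for a dynamic decision task: retrieval success follows a
        # logistic function of activation, so accuracy grows with experience.
        p = []
        for t in range(1, n_trials + 1):
            ages = np.arange(1, t + 1, dtype=float)   # time since each past observation
            a = ibl_activation(ages, d)
            p.append(1.0 / (1.0 + np.exp(-a)))
        return float(np.mean(p))

    def loss(d, observed_accuracy=0.65):
        # squared deviation between simulated and observed mean accuracy
        return (expected_accuracy(d) - observed_accuracy) ** 2

    best = minimize_scalar(loss, bounds=(0.1, 2.0), method="bounded")
    print("estimated decay parameter d:", round(best.x, 3))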
NASA Astrophysics Data System (ADS)
Xue, Lixia; Dai, Yun; Rao, Xuejun; Wang, Cheng; Hu, Yiyun; Liu, Qian; Jiang, Wenhan
2008-01-01
Correction of higher-order aberrations can improve the visual performance of the human eye to some extent. To evaluate how much visual benefit can be obtained from higher-order aberration correction, we developed an adaptive optics vision simulator (AOVS). Dynamic, real-time optimized modal compensation was used to implement various customized higher-order ocular aberration correction strategies. The experimental results indicate that higher-order aberration correction can improve the visual performance of the human eye compared with lower-order aberration correction alone, but the degree of improvement and the most effective correction strategy differ between individuals. Some subjects gained substantial visual benefit when higher-order aberrations were corrected, whereas others gained little benefit even when all higher-order aberrations were corrected. Therefore, relative to a general lower-order aberration correction strategy, a customized higher-order aberration correction strategy is needed to obtain the optimal visual improvement for each individual. The AOVS provides an effective tool for measuring higher-order ocular aberrations for customized aberration correction.
Display/control requirements for VTOL aircraft
NASA Technical Reports Server (NTRS)
Hoffman, W. C.; Curry, R. E.; Kleinman, D. L.; Hollister, W. M.; Young, L. R.
1975-01-01
Quantitative metrics were determined for system control performance, workload for control, monitoring performance, and workload for monitoring. An optimal control model of the human operator was used to determine display elements and design and to allocate pilot tasks for the navigation and guidance of automated commercial V/STOL aircraft in all-weather conditions.
A new method to evaluate human-robot system performance
NASA Technical Reports Server (NTRS)
Rodriguez, G.; Weisbin, C. R.
2003-01-01
One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.
Optimal control in a model of malaria with differential susceptibility
NASA Astrophysics Data System (ADS)
Hincapié, Doracelly; Ospina, Juan
2014-06-01
A malaria model with differential susceptibility is analyzed using the optimal control technique. In the model, the human population is classified as susceptible, infected, and recovered. Susceptibility is assumed to depend on genetic, physiological, or social characteristics that vary between individuals. The model is described by a system of differential equations that relate the human and vector populations, so that the infection is transmitted to humans by vectors and to vectors by humans. The model is analyzed with the optimal control method, where the controls are the use of insecticide-treated nets and educational campaigns, and the optimality criterion is to minimize the number of infected humans while keeping the cost as low as possible. The first goal is to determine the effects of differential susceptibility on the proposed control mechanism; the second goal is to determine the algebraic form of the basic reproductive number of the model. All computations are performed using computer algebra, specifically Maple. It is argued that the analytical results obtained are important for the design and implementation of control measures for malaria. Suggested future work includes applying the method to other vector-borne diseases such as dengue or yellow fever, and using free computer algebra software such as Maxima.
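As a rough illustration of the kind of model described above, the sketch below integrates a simplified two-susceptibility-class human population coupled to a vector population, with a constant bed-net control u between 0 and 1; all parameter values and the functional form are hypothetical simplifications, not the paper's model or its Maple analysis.

    import numpy as np
    from scipy.integrate import solve_ivp

    beta1, beta2 = 0.3, 0.6      # transmission rates for low/high susceptibility humans
    beta_v = 0.4                 # human-to-vector transmission rate
    gamma, mu_v = 0.1, 0.05      # human recovery rate, vector turnover rate

    def rhs(t, y, u):
        S1, S2, I, R, Sv, Iv = y
        f = 1.0 - u              # bed nets reduce effective biting
        dS1 = -f * beta1 * S1 * Iv
        dS2 = -f * beta2 * S2 * Iv
        dI  = f * (beta1 * S1 + beta2 * S2) * Iv - gamma * I
        dR  = gamma * I
        dSv = mu_v - f * beta_v * Sv * I - mu_v * Sv
        dIv = f * beta_v * Sv * I - mu_v * Iv
        return [dS1, dS2, dI, dR, dSv, dIv]

    y0 = [0.5, 0.45, 0.05, 0.0, 0.95, 0.05]
    for u in (0.0, 0.5, 0.8):
        sol = solve_ivp(rhs, (0.0, 200.0), y0, args=(u,))
        print(f"control u={u:.1f}: infected humans at t=200 -> {sol.y[2, -1]:.4f}")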
HRP Chief Scientist's Office: Conducting Research to Enable Deep Space Exploration
NASA Technical Reports Server (NTRS)
Charles, J. B.; Fogarty, J.; Vega, L.; Cromwell, R. L.; Haven, C. P.; McFather, J. C.; Savelev, I.
2017-01-01
The HRP Chief Scientist's Office sets the scientific agenda for the Human Research Program. As NASA plans for deep space exploration, HRP is conducting research to ensure the health of astronauts, and optimize human performance during extended duration missions. To accomplish this research, HRP solicits proposals within the U.S., collaborates with agencies both domestically and abroad, and makes optimal use of ISS resources in support of human research. This session will expand on these topics and provide an opportunity for questions and discussion with the HRP Chief Scientist. Presentations in this session will include: NRA solicitations - process improvements and focus for future solicitations, Multilateral Human Research Panel for Exploration - future directions (MHRPE 2.0), Extramural liaisons - National Science Foundation (NSF) and Department of Defense (DOD), Standardized Measures for spaceflight, Ground-based Analogs - international collaborations, and International data sharing.
Testing the limits of optimality: the effect of base rates in the Monty Hall dilemma.
Herbranson, Walter T; Wang, Shanglun
2014-03-01
The Monty Hall dilemma is a probability puzzle in which a player tries to guess which of three doors conceals a desirable prize. After an initial selection, one of the nonchosen doors is opened, revealing that it is not a winner, and the player is given the choice of staying with the initial selection or switching to the other remaining door. Pigeons and humans were tested on two variants of the Monty Hall dilemma, in which one of the three doors had either a higher or a lower chance of containing the prize than did the other two options. The optimal strategy in both cases was to initially choose the lowest-probability door available and then switch away from it. Whereas pigeons learned to approximate the optimal strategy, humans failed to do so on both accounts: They did not show a preference for low-probability options, and they did not consistently switch. An analysis of performance over the course of training indicated that pigeons learned to perform a sequence of responses on each trial, and that sequence was one that yielded the highest possible rate of reinforcement. Humans, in contrast, continued to vary their responses throughout the experiment, possibly in search of a more complex strategy that would exceed the maximum possible win rate.
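The strategy comparison described above is easy to reproduce in simulation. The sketch below plays a biased three-door Monty Hall game under hypothetical door priors (0.6/0.2/0.2) and shows that first picking a low-probability door and then switching yields the highest win rate; the priors and trial count are illustrative, not the published experiment's parameters.

    import random

    PRIORS = [0.6, 0.2, 0.2]          # door 0 hides the prize more often than doors 1 and 2

    def play(first_pick, switch, rng):
        prize = rng.choices([0, 1, 2], weights=PRIORS)[0]
        # the host opens a door that was not chosen and does not hide the prize
        openable = [d for d in range(3) if d != first_pick and d != prize]
        opened = rng.choice(openable)
        final = first_pick if not switch else next(
            d for d in range(3) if d not in (first_pick, opened))
        return final == prize

    rng = random.Random(1)
    n = 50_000
    for first_pick in range(3):
        for switch in (False, True):
            wins = sum(play(first_pick, switch, rng) for _ in range(n))
            label = "switch" if switch else "stay  "
            print(f"pick door {first_pick}, {label}: win rate {wins / n:.3f}")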
Impedance learning for robotic contact tasks using natural actor-critic algorithm.
Kim, Byungchan; Park, Jooyoung; Park, Shinsuk; Kang, Sungchul
2010-04-01
Compared with their robotic counterparts, humans excel at various tasks by using their ability to adaptively modulate arm impedance parameters. This ability allows us to successfully perform contact tasks even in uncertain environments. This paper considers a learning strategy of motor skill for robotic contact tasks based on a human motor control theory and machine learning schemes. Our robot learning method employs impedance control based on the equilibrium point control theory and reinforcement learning to determine the impedance parameters for contact tasks. A recursive least-square filter-based episodic natural actor-critic algorithm is used to find the optimal impedance parameters. The effectiveness of the proposed method was tested through dynamic simulations of various contact tasks. The simulation results demonstrated that the proposed method optimizes the performance of the contact tasks in uncertain conditions of the environment.
Eckstein, Miguel P; Mack, Stephen C; Liston, Dorion B; Bogush, Lisa; Menzel, Randolf; Krauzlis, Richard J
2013-06-07
Visual attention is commonly studied by using visuo-spatial cues indicating probable locations of a target and assessing the effect of the validity of the cue on perceptual performance and its neural correlates. Here, we adapt a cueing task to measure spatial cueing effects on the decisions of honeybees and compare their behavior to that of humans and monkeys in a similarly structured two-alternative forced-choice perceptual task. Unlike the typical cueing paradigm in which the stimulus strength remains unchanged within a block of trials, for the monkey and human studies we randomized the contrast of the signal to simulate more real world conditions in which the organism is uncertain about the strength of the signal. A Bayesian ideal observer that weights sensory evidence from cued and uncued locations based on the cue validity to maximize overall performance is used as a benchmark of comparison against the three animals and other suboptimal models: probability matching, ignore the cue, always follow the cue, and an additive bias/single decision threshold model. We find that the cueing effect is pervasive across all three species but is smaller in size than that shown by the Bayesian ideal observer. Humans show a larger cueing effect than monkeys, and bees show the smallest effect. The cueing effect and overall performance of the honeybees allow rejection of the models in which the bees ignore the cue, follow the cue while disregarding the stimuli to be discriminated, or adopt a probability matching strategy. Stimulus strength uncertainty also reduces the variation of the cueing effect with stimulus strength that is theoretically predicted for an optimal Bayesian observer, and it diminishes the size of the cueing effect when stimulus strength is low. A more biologically plausible model that includes an additive bias to the sensory response from the cued location, although not mathematically equivalent to the optimal observer for the case of stimulus strength uncertainty, can approximate the benefits of the more computationally complex optimal Bayesian model. We discuss the implications of our findings for the field's common conceptualization of covert visual attention in the cueing task and what aspects, if any, might be unique to humans.
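For readers unfamiliar with the benchmark used above, the following sketch implements a two-location version of a Bayes-weighted cueing decision: noisy internal responses from the cued and uncued locations are combined with the cue validity acting as a prior, which reproduces a cueing effect (higher accuracy on valid-cue trials). The d-prime value, cue validity, and Gaussian-noise assumptions are illustrative and simpler than the paper's contrast-uncertainty conditions.

    import numpy as np

    rng = np.random.default_rng(0)
    validity, d_prime, n_trials = 0.8, 1.5, 100_000

    target_at_cued = rng.random(n_trials) < validity
    # internal responses: the signal adds d_prime at the target location
    r_cued = rng.normal(0.0, 1.0, n_trials) + d_prime * target_at_cued
    r_uncued = rng.normal(0.0, 1.0, n_trials) + d_prime * (~target_at_cued)

    # log posterior odds that the target is at the cued location
    log_likelihood_ratio = d_prime * (r_cued - r_uncued)
    log_prior_odds = np.log(validity / (1.0 - validity))
    choose_cued = (log_likelihood_ratio + log_prior_odds) > 0

    correct = np.where(target_at_cued, choose_cued, ~choose_cued)
    print("valid-cue accuracy  :", round(float(correct[target_at_cued].mean()), 3))
    print("invalid-cue accuracy:", round(float(correct[~target_at_cued].mean()), 3))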
NASA Astrophysics Data System (ADS)
Telban, Robert J.
While the performance of flight simulator motion system hardware has advanced substantially, the development of the motion cueing algorithm, the software that transforms simulated aircraft dynamics into realizable motion commands, has not kept pace. To address this, new human-centered motion cueing algorithms were developed. A revised "optimal algorithm" uses time-invariant filters developed by optimal control, incorporating human vestibular system models. The "nonlinear algorithm" is a novel approach that is also formulated by optimal control, but can also be updated in real time. It incorporates a new integrated visual-vestibular perception model that includes both visual and vestibular sensation and the interaction between the stimuli. A time-varying control law requires the matrix Riccati equation to be solved in real time by a neurocomputing approach. Preliminary pilot testing resulted in the optimal algorithm incorporating a new otolith model, producing improved motion cues. The nonlinear algorithm vertical mode produced a motion cue with a time-varying washout, sustaining small cues for longer durations and washing out large cues more quickly compared to the optimal algorithm. The inclusion of the integrated perception model improved the responses to longitudinal and lateral cues. False cues observed with the NASA adaptive algorithm were absent. Because the resulting motion sensation was judged unsatisfactory, an augmented turbulence cue was added to the vertical mode for both the optimal and nonlinear algorithms. The relative effectiveness of the algorithms in simulating aircraft maneuvers was assessed with an eleven-subject piloted performance test conducted on the NASA Langley Visual Motion Simulator (VMS). Two methods, the quasi-objective NASA Task Load Index (TLX) and power spectral density analysis of pilot control, were used to assess pilot workload. TLX analysis reveals, in most cases, less workload and less variation among pilots with the nonlinear algorithm. Control input analysis shows that pilot-induced oscillations on a straight-in approach are less prevalent compared to the optimal algorithm. The augmented turbulence cues increased workload on an offset approach that the pilots deemed more realistic compared to the NASA adaptive algorithm. The takeoff with engine failure showed the least roll activity for the nonlinear algorithm, with the least rudder pedal activity for the optimal algorithm.
NASA Technical Reports Server (NTRS)
Clipson, Colin
1994-01-01
This paper will review and summarize research initiatives conducted between 1987 and 1992 at NASA Ames Research Center by a research team from the University of Michigan Architecture Research Laboratory. These research initiatives, funded by NASA grant NAG2-635, examined the viability of establishing collaborative, reconfigurable research environments for the Human Performance Research Laboratory at NASA Ames in California. Collaborative Research Environments are envisioned as a way of enhancing the work of NASA research teams, optimizing the use of shared resources, and providing superior environments for housing research activities. The Integrated Simulation Project at the NASA Ames Human Performance Research Laboratory is one of the current realizations of this initiative.
Long, Yi; Du, Zhi-Jiang; Wang, Wei-Dong; Dong, Wei
2016-01-01
A lower limb assistive exoskeleton is designed to help operators walk or carry payloads. The exoskeleton is required to shadow human motion intent accurately and compliantly to prevent incoordination. If the user's intention is estimated accurately, a precise position control strategy will improve collaboration between the user and the exoskeleton. In this paper, a hybrid position control scheme, combining sliding mode control (SMC) with a cerebellar model articulation controller (CMAC) neural network, is proposed to control the exoskeleton to react appropriately to human motion intent. A genetic algorithm (GA) is utilized to determine the optimal sliding surface and the sliding control law to improve the performance of SMC. The proposed control strategy (SMC_GA_CMAC) is compared with three other approaches, that is, conventional SMC without optimization, optimal SMC with GA (SMC_GA), and SMC with CMAC compensation (SMC_CMAC), all of which are employed to track the desired joint angular position deduced from Clinical Gait Analysis (CGA) data. Position tracking performance is investigated through cosimulation with ADAMS and MATLAB/SIMULINK in two cases: the first without disturbances and the second with a bounded disturbance. The cosimulation results show the effectiveness of the proposed control strategy, which can be employed in similar exoskeleton systems.
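As background for the control scheme compared above, the sketch below implements plain boundary-layer sliding mode control of a single toy joint tracking a sinusoidal reference under a bounded disturbance; the plant parameters and gains are hypothetical, and neither the GA tuning nor the CMAC compensation from the paper is included.

    import numpy as np

    dt, T = 0.001, 3.0
    lam, K, phi = 5.0, 10.0, 0.1       # sliding-surface slope, switching gain, boundary layer
    J, b = 1.0, 0.5                    # toy joint inertia and damping

    q, dq = 0.0, 0.0
    abs_err = []
    for k in range(int(T / dt)):
        t = k * dt
        qd = 0.4 * np.sin(2 * np.pi * 0.5 * t)                 # desired joint angle (CGA-like)
        dqd = 0.4 * 2 * np.pi * 0.5 * np.cos(2 * np.pi * 0.5 * t)
        e, de = q - qd, dq - dqd
        s = de + lam * e                                       # sliding surface
        u = -K * np.clip(s / phi, -1.0, 1.0)                   # saturated switching control
        disturbance = 0.3 * np.sin(2 * np.pi * 1.0 * t)        # bounded disturbance (second case)
        ddq = (u + disturbance - b * dq) / J                   # toy joint dynamics
        dq += ddq * dt
        q += dq * dt
        abs_err.append(abs(e))

    print("mean absolute tracking error (rad):", round(float(np.mean(abs_err)), 4))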
Learning and inference using complex generative models in a spatial localization task.
Bejjanki, Vikranth R; Knill, David C; Aslin, Richard N
2016-01-01
A large body of research has established that, under relatively simple task conditions, human observers integrate uncertain sensory information with learned prior knowledge in an approximately Bayes-optimal manner. However, in many natural tasks, observers must perform this sensory-plus-prior integration when the underlying generative model of the environment consists of multiple causes. Here we ask if the Bayes-optimal integration seen with simple tasks also applies to such natural tasks when the generative model is more complex, or whether observers rely instead on a less efficient set of heuristics that approximate ideal performance. Participants localized a "hidden" target whose position on a touch screen was sampled from a location-contingent bimodal generative model with different variances around each mode. Over repeated exposure to this task, participants learned the a priori locations of the target (i.e., the bimodal generative model), and integrated this learned knowledge with uncertain sensory information on a trial-by-trial basis in a manner consistent with the predictions of Bayes-optimal behavior. In particular, participants rapidly learned the locations of the two modes of the generative model, but the relative variances of the modes were learned much more slowly. Taken together, our results suggest that human performance in a more complex localization task, which requires the integration of sensory information with learned knowledge of a bimodal generative model, is consistent with the predictions of Bayes-optimal behavior, but involves a much longer time-course than in simpler tasks.
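A minimal sketch of the Bayes-optimal computation referenced above: a location-contingent bimodal prior is combined with a Gaussian likelihood for a noisy cue, and the posterior mean serves as the estimate. The mode locations, variances, and sensory noise level are illustrative values, not those of the experiment.

    import numpy as np

    x = np.linspace(-10.0, 10.0, 2001)
    dx = x[1] - x[0]

    def gauss(v, mu, sd):
        return np.exp(-0.5 * ((v - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

    # location-contingent bimodal prior: two modes with different variances
    prior = 0.5 * gauss(x, -4.0, 0.8) + 0.5 * gauss(x, 4.0, 2.0)

    sensory_sd = 2.5
    for cue in (-2.0, 0.0, 3.0):                   # noisy sensory measurement of the target
        likelihood = gauss(cue, x, sensory_sd)
        posterior = prior * likelihood
        posterior /= posterior.sum() * dx          # normalize numerically
        post_mean = (x * posterior).sum() * dx
        print(f"cue at {cue:+.1f} -> posterior mean estimate {post_mean:+.2f}")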
Display/control requirements for automated VTOL aircraft
NASA Technical Reports Server (NTRS)
Hoffman, W. C.; Kleinman, D. L.; Young, L. R.
1976-01-01
A systematic design methodology for pilot displays in advanced commercial VTOL aircraft was developed and refined. The analyst is provided with a step-by-step procedure for conducting conceptual display/control configuration evaluations for simultaneous monitoring and control pilot tasks. The approach consists of three phases: formulation of information requirements, configuration evaluation, and system selection. Both the monitoring and control performance models are based upon the optimal control model of the human operator. Extensions to the conventional optimal control model required in the display design methodology include explicit optimization of control/monitoring attention; simultaneous monitoring and control performance predictions; and indifference threshold effects. The methodology was applied to NASA's experimental CH-47 helicopter in support of the VALT program. The CH-47 application examined system performance for six flight conditions. Four candidate configurations are suggested for evaluation in pilot-in-the-loop simulations and eventual flight tests.
2013-01-01
Report documentation page (Standard Form 298) residue; recoverable metadata only: subject terms, Human Attention; author, Eran Zaidel; performing organization, Pacific Development and Technology LLC; sponsoring agency, U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211; distribution statement, approved for public release, distribution unlimited.
Circadian rhythms, sleep, and performance in space.
Mallis, M M; DeRoshia, C W
2005-06-01
Maintaining optimal alertness and neurobehavioral functioning during space operations is critical to enable the National Aeronautics and Space Administration's (NASA's) vision "to extend humanity's reach to the Moon, Mars and beyond" to become a reality. Field data have demonstrated that sleep times and performance of crewmembers can be compromised by extended duty days, irregular work schedules, high workload, and varying environmental factors. This paper documents evidence of significant sleep loss and disruption of circadian rhythms in astronauts and associated performance decrements during several space missions, which demonstrates the need to develop effective countermeasures. Both sleep and circadian disruptions have been identified in the Behavioral Health and Performance (BH&P) area and the Advanced Human Support Technology (AHST) area of NASA's Bioastronautics Critical Path Roadmap. Such disruptions could have serious consequences on the effectiveness, health, and safety of astronaut crews, thus reducing the safety margin and increasing the chances of an accident or incident. These decrements oftentimes can be difficult to detect and counter effectively in restrictive operational environments. NASA is focusing research on the development of optimal sleep/wake schedules and countermeasure timing and application to help mitigate the cumulative effects of sleep and circadian disruption and enhance operational performance. Investing research in humans is one of NASA's building blocks that will allow for both short- and long-duration space missions and help NASA in developing approaches to manage and overcome the human limitations of space travel. In addition to reviewing the current state of knowledge concerning sleep and circadian disruptions during space operations, this paper provides an overview of NASA's broad research goals. Also, NASA-funded research, designed to evaluate the relationships between sleep quality, circadian rhythm stability, and performance proficiency in both ground-based simulations and space mission studies, as described in the 2003 NASA Task Book, will be reviewed.
Pei, Yan
2015-01-01
We present and discuss the philosophy and methodology of chaotic evolution, which is theoretically supported by chaos theory. We introduce four chaotic systems, that is, the logistic map, tent map, Gaussian map, and Hénon map, into a well-designed chaotic evolution algorithm framework to implement several chaotic evolution (CE) algorithms. By comparing our previously proposed CE algorithm with the logistic map against two canonical differential evolution (DE) algorithms, we analyse and discuss the optimization performance of the CE algorithm. An investigation of the relationship between the optimization capability of the CE algorithm and the distribution characteristics of the chaotic system is conducted and analysed. From the evaluation results, we find that the distribution of the chaotic system is an essential factor influencing the optimization performance of the CE algorithm. We propose a new interactive EC (IEC) algorithm, interactive chaotic evolution (ICE), which replaces the fitness function with a real human in the CE algorithm framework. There is a paired-comparison-based mechanism behind the CE search scheme in nature. A simulation experiment is conducted with a pseudo-IEC user to evaluate our proposed ICE algorithm. The evaluation results indicate that the ICE algorithm can obtain significantly better performance than, or the same performance as, interactive DE. Some open topics on CE, ICE, the fusion of these optimization techniques, algorithmic notation, and others are presented and discussed.
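The sketch below gives a minimal chaotic-evolution-style search step in which a logistic map, rather than a pseudo-random generator, drives the perturbations, with greedy paired comparison between parent and offspring; it is a simplified illustration on a toy sphere function, not the authors' exact CE algorithm or its ICE variant.

    import numpy as np

    def sphere(x):                                     # toy objective to minimize
        return float(np.sum(x ** 2))

    dim, pop_size, iters = 5, 20, 200
    rng = np.random.default_rng(3)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    chaos = rng.uniform(0.01, 0.99, (pop_size, dim))   # logistic-map states in (0, 1)

    for _ in range(iters):
        chaos = 4.0 * chaos * (1.0 - chaos)            # logistic map with r = 4 (chaotic regime)
        step = (chaos - 0.5) * 2.0                     # rescale to (-1, 1)
        trial = pop + 0.5 * step * np.abs(pop)         # chaotic perturbation around each parent
        # greedy paired comparison between parent and chaotic offspring
        better = np.array([sphere(t) < sphere(p) for t, p in zip(trial, pop)])
        pop[better] = trial[better]

    best = min(pop, key=sphere)
    print("best fitness after chaotic evolution:", round(sphere(best), 6))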
A single-layer platform for Boolean logic and arithmetic through DNA excision in mammalian cells
Weinberg, Benjamin H.; Hang Pham, N. T.; Caraballo, Leidy D.; Lozanoski, Thomas; Engel, Adrien; Bhatia, Swapnil; Wong, Wilson W.
2017-01-01
Genetic circuits engineered for mammalian cells often require extensive fine-tuning to perform their intended functions. To overcome this problem, we present a generalizable biocomputing platform that can engineer genetic circuits which function in human cells with minimal optimization. We used our Boolean Logic and Arithmetic through DNA Excision (BLADE) platform to build more than 100 multi-input-multi-output circuits. We devised a quantitative metric to evaluate the performance of the circuits in human embryonic kidney and Jurkat T cells. Of 113 circuits analysed, 109 functioned (96.5%) with the correct specified behavior without any optimization. We used our platform to build a three-input, two-output Full Adder and six-input, one-output Boolean Logic Look Up Table. We also used BLADE to design circuits with temporal small molecule-mediated inducible control and circuits that incorporate CRISPR/Cas9 to regulate endogenous mammalian genes. PMID:28346402
Autologous islet transplantation: challenges and lessons.
Dunn, Ty B; Wilhelm, Joshua J; Bellin, Melena D; Pruett, Timothy L
2017-08-01
Human islet isolation and autotransplantation [autologous islet transplant (AUTX)] is performed to prevent or ameliorate brittle diabetes after total pancreatectomy performed for benign disease. The success or failure of the transplant can be associated with a profound impact on the individual's quality of life and even survival. AUTX offers unique insights into the effects of pancreas quality, islet number, isolation technique and alternate site engraftment on transplant efficacy. Herein, we review islet isolation with a focus on potential pathways to further optimize the endocrine outcome of AUTX, and compare and contrast differences in islet processing for AUTX and allotransplantation (allogeneic islet transplant). New knowledge of human islet biology and issues surrounding the engraftment process offer opportunities for innovative approaches toward optimizing islet cell transplantation. Improving the rate and durability of insulin independence in the often-times marginal dose model of AUTX may provide new insight toward improving the efficiency and durability of single donor islet (allogeneic islet transplant).
A novel framework for virtual prototyping of rehabilitation exoskeletons.
Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D
2013-06-01
Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control, and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows the design and control algorithm of an exoskeleton to be iteratively optimized in simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for the optimization study. Furthermore, the framework allows one to carry out virtual experiments that test specific "what-if" scenarios to quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study in which the design and analysis of an index-finger exoskeleton are carried out using the proposed framework.
Optimal digital filtering for tremor suppression.
Gonzalez, J G; Heredia, E A; Rahman, T; Barner, K E; Arce, G R
2000-05-01
Remote manually operated tasks such as those found in teleoperation, virtual reality, or joystick-based computer access, require the generation of an intermediate electrical signal which is transmitted to the controlled subsystem (robot arm, virtual environment, or a cursor in a computer screen). When human movements are distorted, for instance, by tremor, performance can be improved by digitally filtering the intermediate signal before it reaches the controlled device. This paper introduces a novel tremor filtering framework in which digital equalizers are optimally designed through pursuit tracking task experiments. Due to inherent properties of the man-machine system, the design of tremor suppression equalizers presents two serious problems: 1) performance criteria leading to optimizations that minimize mean-squared error are not efficient for tremor elimination and 2) movement signals show ill-conditioned autocorrelation matrices, which often result in useless or unstable solutions. To address these problems, a new performance indicator in the context of tremor is introduced, and the optimal equalizer according to this new criterion is developed. Ill-conditioning of the autocorrelation matrix is overcome using a novel method which we call pulled-optimization. Experiments performed with artificially induced vibrations and a subject with Parkinson's disease show significant improvement in performance. Additional results, along with MATLAB source code of the algorithms, and a customizable demo for PC joysticks, are available on the Internet at http://tremor-suppression.com.
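As a generic illustration of why filtering the intermediate signal helps (this is not the paper's pulled-optimization equalizer), the sketch below removes a synthetic 5 Hz tremor component from a slow voluntary joystick movement with a low-pass filter and compares the tracking error before and after; offline zero-phase filtering is used for clarity, whereas a real-time equalizer must be causal.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 100.0                                        # sample rate in Hz
    t = np.arange(0.0, 10.0, 1.0 / fs)
    intended = 0.5 * np.sin(2 * np.pi * 0.3 * t)      # slow voluntary movement
    tremor = 0.2 * np.sin(2 * np.pi * 5.0 * t)        # tremor-band component
    measured = intended + tremor

    b, a = butter(4, 2.0 / (fs / 2.0), btype="low")   # 4th-order low-pass, 2 Hz cut-off
    filtered = filtfilt(b, a, measured)               # zero-phase (offline) filtering

    def rms(x):
        return float(np.sqrt(np.mean(x ** 2)))

    print("tracking error RMS, unfiltered:", round(rms(measured - intended), 3))
    print("tracking error RMS, filtered  :", round(rms(filtered - intended), 3))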
NASA Technical Reports Server (NTRS)
Axdahl, Erik L.
2015-01-01
Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.
Li, Guangye; Zhang, Dingguo
2016-01-01
An all-chain-wireless brain-to-brain system (BTBS), which enabled motion control of a cyborg cockroach via the human brain, was developed in this work. A steady-state visual evoked potential (SSVEP) based brain-computer interface (BCI) was used in this system for recognizing human motion intention, and an optimization algorithm was proposed in SSVEP to improve the online performance of the BCI. The cyborg cockroach was developed by surgically integrating a portable microstimulator that could generate invasive electrical nerve stimulation. Through Bluetooth communication, specific electrical pulse trains could be triggered from the microstimulator by BCI commands and were sent through the antenna nerve to stimulate the brain of the cockroach. Serial experiments were designed and conducted to test the overall performance of the BTBS with six human subjects and three cockroaches. The experimental results showed that the online classification accuracy of the three-mode BCI increased from 72.86% to 78.56% (by 5.70%) using the optimization algorithm, and the mean response accuracy of the cyborgs using this system reached 89.5%. Moreover, the results also showed that the cyborg could be navigated by the human brain to complete walking along an S-shaped track with a success rate of about 20%, suggesting that the proposed BTBS established a feasible functional information transfer pathway from the human brain to the cockroach brain.
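The frequency-recognition step of an SSVEP BCI like the one above can be sketched as picking the stimulation frequency with the largest spectral power in an EEG epoch. The code below does this on a synthetic signal; the sampling rate, epoch length, candidate frequencies, and noise level are illustrative, and the paper's SSVEP optimization algorithm is not reproduced.

    import numpy as np

    fs, seconds = 250, 3
    t = np.arange(0, seconds, 1.0 / fs)
    stim_freqs = [8.0, 10.0, 12.0]                    # one frequency per BCI command

    rng = np.random.default_rng(7)
    true_freq = 10.0                                  # frequency the subject attends to
    eeg = np.sin(2 * np.pi * true_freq * t) + 2.0 * rng.standard_normal(t.size)

    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    scores = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    decoded = stim_freqs[int(np.argmax(scores))]
    print("decoded command frequency:", decoded, "Hz")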
Model for Predicting the Performance of Planetary Suit Hip Bearing Designs
NASA Technical Reports Server (NTRS)
Cowley, Matthew S.; Margerum, Sarah; Harvill, Lauren; Rajulu, Sudhakar
2012-01-01
Designing a space suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. During the development period of the suit, numerous design iterations need to occur before the hardware meets human performance requirements. Using computer models early in the design phase of hardware development is advantageous, by allowing virtual prototyping to take place. A virtual design environment allows designers to think creatively, exhaust design possibilities, and study design impacts on suit and human performance. A model of the rigid components of the Mark III Technology Demonstrator Suit (planetary-type space suit) and a human manikin were created and tested in a virtual environment. The Mark III hip bearing model was first developed and evaluated virtually by comparing the differences in mobility performance between the nominal bearing configurations and modified bearing configurations. Suited human performance was then simulated with the model and compared to actual suited human performance data using the same bearing configurations. The Mark III hip bearing model was able to visually represent complex bearing rotations and the theoretical volumetric ranges of motion in three dimensions. The model was also able to predict suited human hip flexion and abduction maximums to within 10% of the actual suited human subject data, except for one modified bearing condition in hip flexion, which was off by 24%. Differences between the model predictions and the human subject performance data were attributed to the lack of joint moment limits in the model, human subject fitting issues, and the limited suit experience of some of the subjects. The results demonstrate that modeling space suit rigid segments is a feasible design tool for evaluating and optimizing suited human performance. Keywords: space suit, design, modeling, performance
Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors
Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal
2014-01-01
Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and the scanning windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and a divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide a different search area. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed show better performance than those of other related methods. PMID:25393782
Human Performance Optimization: An Evolving Charge to the Department of Defense
2007-06-01
The Uniformed Services University of the Health Sciences hosted a conference in June 2006 entitled "Human Performance Optimization in the Department of Defense: Charting a Course..." The remainder of the record is OCR-garbled author affiliation text (Naval Medical Research Center, Silver Spring, MD; U.S. Air Force Office of the Surgeon General, Bolling Air Force Base, DC; School of Nursing, Uniformed Services University of the Health Sciences).
Automated Sensitivity Analysis of Interplanetary Trajectories
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software and to simplify the process of performing sensitivity analysis, and it was ultimately found to outperform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
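The kind of automated trade-study loop PEATSA performs can be sketched as sweeping mission options, launching one optimization per case, and collecting a figure of merit. In the sketch below, run_emtg_case() and its inputs are hypothetical stand-ins for the real EMTG/PEATSA interface, and the returned "delivered mass" is a made-up surrogate.

    import itertools
    import json
    from concurrent.futures import ProcessPoolExecutor

    def run_emtg_case(launch_year, flyby_body):
        # Placeholder for launching one trajectory optimization; returns a
        # made-up figure of merit standing in for delivered mass (kg).
        return 1000.0 - 5.0 * abs(launch_year - 2031) + (50.0 if flyby_body == "Venus" else 0.0)

    cases = list(itertools.product(range(2028, 2035), ["Venus", "Earth", "none"]))

    if __name__ == "__main__":
        years, bodies = zip(*cases)
        with ProcessPoolExecutor() as pool:              # run cases in parallel
            results = list(pool.map(run_emtg_case, years, bodies))
        best_case, best_value = max(zip(cases, results), key=lambda cr: cr[1])
        print(json.dumps({"best_case": list(best_case), "delivered_mass_kg": best_value}))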
ERIC Educational Resources Information Center
Avey, James B.; Reichard, Rebecca J.; Luthans, Fred; Mhatre, Ketan H.
2011-01-01
The positive core construct of psychological capital (or simply PsyCap), consisting of the psychological resources of hope, efficacy, resilience, and optimism, has recently been demonstrated to be open to human resource development (HRD) and performance management. The research stream on PsyCap has now grown to the point that a quantitative…
[Regional ecological construction and mission of landscape ecology].
Xiao, Duning; Xie, Fuju; Wei, Jianbing
2004-10-01
Ecological construction at the regional and landscape scale applies to specific landscapes and interlinked ecosystems within a given region, and includes the scientific management of ecosystems and the optimization of their environmental functions. Recently, the government has put a series of major projects into action, such as the national forest protection program, partial forest restoration, and water regulation projects. Strengthening regional ecological construction and safeguarding national ecological security have become strategic requirements. Different forms of ecological construction should be applied in different regions, for example, implementing ecological safeguard measures in ecologically sensitive zones and limiting the ecological load in ecologically fragile zones, so that human activities can be controlled and sustainable development achieved. Facing the opportunities and challenges in the development of landscape ecology, the key topics include: landscape patterns for ecological security, land use and ecological processes, landscape change under the stress of human activities, quantitative evaluation of the influence of human activities, assessment of regional ecological security and early warning of ecological risk, and the planning and optimization of models for landscape ecological construction.
Example-based human motion denoising.
Lou, Hui; Chai, Jinxiang
2010-01-01
With the proliferation of motion capture data, interest in removing noise and outliers from motion capture data has increased. In this paper, we introduce an efficient human motion denoising technique for the simultaneous removal of noise and outliers from input human motion data. The key idea of our approach is to learn a series of filter bases from precaptured motion data and use them along with robust statistics techniques to filter noisy motion data. Mathematically, we formulate the motion denoising process in a nonlinear optimization framework. The objective function measures the distance between the noisy input and the filtered motion in addition to how well the filtered motion preserves spatial-temporal patterns embedded in captured human motion data. Optimizing the objective function produces an optimal filtered motion that keeps spatial-temporal patterns in captured motion data. We also extend the algorithm to fill in the missing values in input motion data. We demonstrate the effectiveness of our system by experimenting with both real and simulated motion data. We also show the superior performance of our algorithm by comparing it with three baseline algorithms and to those in state-of-art motion capture data processing software such as Vicon Blade.
Field-Based Optimal Placement of Antennas for Body-Worn Wireless Sensors
Januszkiewicz, Łukasz; Di Barba, Paolo; Hausman, Sławomir
2016-01-01
We investigate a case of automated energy-budget-aware optimization of the physical position of nodes (sensors) in a Wireless Body Area Network (WBAN). This problem has not been presented in the literature yet, as opposed to antenna and routing optimization, which are relatively well-addressed. In our research, which was inspired by a safety-critical application for firefighters, the sensor network consists of three nodes located on the human body. The nodes communicate over a radio link operating in the 2.4 GHz or 5.8 GHz ISM frequency band. Two sensors have a fixed location: one on the head (earlobe pulse oximetry) and one on the arm (with accelerometers, temperature and humidity sensors, and a GPS receiver), while the position of the third sensor can be adjusted within a predefined region on the wearer’s chest. The path loss between each node pair strongly depends on the location of the nodes and is difficult to predict without performing a full-wave electromagnetic simulation. Our optimization scheme employs evolutionary computing. The novelty of our approach lies not only in the formulation of the problem but also in linking a fully automated optimization procedure with an electromagnetic simulator and a simplified human body model. This combination turns out to be a computationally effective solution, which, depending on the initial placement, has a potential to improve performance of our example sensor network setup by up to about 20 dB with respect to the path loss between selected nodes. PMID:27196911
Optimal control of a hybrid rhythmic-discrete task: the bouncing ball revisited.
Ronsse, Renaud; Wei, Kunlin; Sternad, Dagmar
2010-05-01
Rhythmically bouncing a ball with a racket is a hybrid task that combines continuous rhythmic actuation of the racket with the control of discrete impact events between racket and ball. This study presents experimental data and a two-layered modeling framework that explicitly addresses the hybrid nature of control: a first discrete layer calculates the state to reach at impact and the second continuous layer smoothly drives the racket to this desired state, based on optimality principles. The testbed for this hybrid model is task performance at a range of increasingly slower tempos. When slowing the rhythm of the bouncing actions, the continuous cycles become separated into a sequence of discrete movements interspersed by dwell times and directed to achieve the desired impact. Analyses of human performance show increasing variability of performance measures with slower tempi, associated with a change in racket trajectories from approximately sinusoidal to less symmetrical velocity profiles. Matching results of model simulations give support to a hybrid control model based on optimality, and therefore suggest that optimality principles are applicable to the sensorimotor control of complex movements such as ball bouncing.
The economics of motion perception and invariants of visual sensitivity.
Gepshtein, Sergei; Tyukin, Ivan; Kubovy, Michael
2007-06-21
Neural systems face the challenge of optimizing their performance with limited resources, just as economic systems do. Here, we use tools of neoclassical economic theory to explore how a frugal visual system should use a limited number of neurons to optimize perception of motion. The theory prescribes that vision should allocate its resources to different conditions of stimulation according to the degree of balance between measurement uncertainties and stimulus uncertainties. We find that human vision approximately follows the optimal prescription. The equilibrium theory explains why human visual sensitivity is distributed the way it is and why qualitatively different regimes of apparent motion are observed at different speeds. The theory offers a new normative framework for understanding the mechanisms of visual sensitivity at the threshold of visibility and above the threshold and predicts large-scale changes in visual sensitivity in response to changes in the statistics of stimulation and system goals.
Probabilistic models in human sensorimotor control
Wolpert, Daniel M.
2009-01-01
Sensory and motor uncertainty form a fundamental constraint on human sensorimotor control. Bayesian decision theory (BDT) has emerged as a unifying framework to understand how the central nervous system performs optimal estimation and control in the face of such uncertainty. BDT has two components: Bayesian statistics and decision theory. Here we review Bayesian statistics and show how it applies to estimating the state of the world and our own body. Recent results suggest that when learning novel tasks we are able to learn the statistical properties of both the world and our own sensory apparatus so as to perform estimation using Bayesian statistics. We review studies which suggest that humans can combine multiple sources of information to form maximum likelihood estimates, can incorporate prior beliefs about possible states of the world so as to generate maximum a posteriori estimates and can use Kalman filter-based processes to estimate time-varying states. Finally, we review Bayesian decision theory in motor control and how the central nervous system processes errors to determine loss functions and optimal actions. We review results that suggest we plan movements based on statistics of our actions that result from signal-dependent noise on our motor outputs. Taken together these studies provide a statistical framework for how the motor system performs in the presence of uncertainty. PMID:17628731
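A small sketch of the maximum-likelihood cue combination reviewed above: two noisy estimates of the same state are averaged with inverse-variance weights, and the combined estimate has lower variance than either cue alone. The noise levels and the "vision versus proprioception" framing are illustrative.

    import numpy as np

    rng = np.random.default_rng(42)
    true_position = 10.0
    sd_vision, sd_proprio = 0.5, 1.5
    n = 100_000

    vision = true_position + sd_vision * rng.standard_normal(n)
    proprio = true_position + sd_proprio * rng.standard_normal(n)

    # inverse-variance (reliability) weighting, the maximum-likelihood combination
    w_vision = (1.0 / sd_vision**2) / (1.0 / sd_vision**2 + 1.0 / sd_proprio**2)
    combined = w_vision * vision + (1.0 - w_vision) * proprio

    for name, est in (("vision", vision), ("proprioception", proprio), ("combined", combined)):
        print(f"{name:15s} estimate SD: {est.std():.3f}")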
An ideal observer analysis of visual working memory.
Sims, Chris R; Jacobs, Robert A; Knill, David C
2012-10-01
Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around rate-distortion theory, a branch of information theory that provides optimal bounds on the accuracy of information transmission subject to a fixed information capacity. The result of the ideal observer analysis is a theoretical framework that provides a task-independent and quantitative definition of visual memory capacity and yields novel predictions regarding human performance. These predictions are subsequently evaluated and confirmed in 2 empirical studies. Further, the framework is general enough to allow the specification and testing of alternative models of visual memory (e.g., how capacity is distributed across multiple items). We demonstrate that a simple model developed on the basis of the ideal observer analysis-one that allows variability in the number of stored memory representations but does not assume the presence of a fixed item limit-provides an excellent account of the empirical data and further offers a principled reinterpretation of existing models of VWM.
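The rate-distortion bound underlying this ideal observer analysis has a simple closed form for a Gaussian source: with R bits per item, the smallest achievable mean squared error is sigma^2 * 2^(-2R). The sketch below splits a fixed total capacity across set sizes to show how expected recall error grows with the number of items; the capacity and stimulus variance are illustrative, not fitted values.

    sigma2 = 1.0            # stimulus (feature) variance, illustrative
    capacity_bits = 6.0     # total memory capacity in bits, illustrative

    for n_items in (1, 2, 4, 8):
        bits_per_item = capacity_bits / n_items
        # Gaussian rate-distortion function: D(R) = sigma^2 * 2^(-2R)
        distortion = sigma2 * 2.0 ** (-2.0 * bits_per_item)
        print(f"set size {n_items}: expected per-item error variance {distortion:.4f}")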
An Ideal Observer Analysis of Visual Working Memory
Sims, Chris R.; Jacobs, Robert A.; Knill, David C.
2013-01-01
Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this paper we develop an ideal observer analysis of human visual working memory, by deriving the expected behavior of an optimally performing, but limited-capacity memory system. This analysis is framed around rate–distortion theory, a branch of information theory that provides optimal bounds on the accuracy of information transmission subject to a fixed information capacity. The result of the ideal observer analysis is a theoretical framework that provides a task-independent and quantitative definition of visual memory capacity and yields novel predictions regarding human performance. These predictions are subsequently evaluated and confirmed in two empirical studies. Further, the framework is general enough to allow the specification and testing of alternative models of visual memory (for example, how capacity is distributed across multiple items). We demonstrate that a simple model developed on the basis of the ideal observer analysis—one which allows variability in the number of stored memory representations, but does not assume the presence of a fixed item limit—provides an excellent account of the empirical data, and further offers a principled re-interpretation of existing models of visual working memory. PMID:22946744
A learning-based autonomous driver: emulate human driver's intelligence in low-speed car following
NASA Astrophysics Data System (ADS)
Wei, Junqing; Dolan, John M.; Litkouhi, Bakhtiar
2010-04-01
In this paper, an offline learning mechanism based on the genetic algorithm is proposed for autonomous vehicles to emulate human driver behaviors. The autonomous driving ability is implemented based on a Prediction- and Cost function-Based algorithm (PCB). PCB is designed to emulate a human driver's decision process, which is modeled as traffic scenario prediction and evaluation. This paper focuses on using a learning algorithm to optimize PCB with very limited training data, so that PCB can predict and evaluate traffic scenarios similarly to human drivers. Eighty seconds of human driving data were collected in low-speed (<30 miles/h) car-following scenarios. In the low-speed car-following tests, PCB was able to perform more human-like car-following after learning. A more general 120 kilometer-long simulation showed that PCB performs robustly even in scenarios that are not part of the training set.
Human-Robot Interaction in High Vulnerability Domains
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2016-01-01
Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. The complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots, ranging from direct manual control, to shared human-robot control, to no active human control (i.e., human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of the technologies on system performance and on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.
Saturation pulse design for quantitative myocardial T1 mapping.
Chow, Kelvin; Kellman, Peter; Spottiswoode, Bruce S; Nielles-Vallespin, Sonia; Arai, Andrew E; Salerno, Michael; Thompson, Richard B
2015-10-01
Quantitative saturation-recovery based T1 mapping sequences are less sensitive to systematic errors than the Modified Look-Locker Inversion recovery (MOLLI) technique but require high performance saturation pulses. We propose to optimize adiabatic and pulse train saturation pulses for quantitative T1 mapping to have <1% absolute residual longitudinal magnetization (|MZ/M0|) over ranges of B0 and B1+ (B1 scale factor) inhomogeneity found at 1.5 T and 3 T. Design parameters for an adiabatic BIR4-90 pulse were optimized for improved performance within 1.5 T B0 (±120 Hz) and B1+ (0.7-1.0) ranges. Flip angles in hard pulse trains of 3-6 pulses were optimized for 1.5 T and 3 T, with consideration of T1 values, field inhomogeneities (B0 = ±240 Hz and B1+ = 0.4-1.2 at 3 T), and maximum achievable B1 field strength. Residual MZ/M0 was simulated and measured experimentally for current standard and optimized saturation pulses in phantoms and in-vivo human studies. T1 maps were acquired at 3 T in human subjects and a swine using a SAturation recovery single-SHot Acquisition (SASHA) technique with a standard 90°-90°-90° and an optimized 6-pulse train. Measured residual MZ/M0 in phantoms had excellent agreement with simulations over a wide range of B0 and B1+. The optimized BIR4-90 reduced the maximum residual |MZ/M0| to <1%, a 5.8× reduction compared to a reference BIR4-90. An optimized 3-pulse train achieved a maximum residual |MZ/M0| <1% for the 1.5 T optimization range compared to 11.3% for a standard 90°-90°-90° pulse train, while a 6-pulse train met this target for the wider 3 T ranges of B0 and B1+. The 6-pulse train demonstrated more uniform saturation across both the myocardium and the entire field of view than other saturation pulses in human studies. T1 maps were more spatially homogeneous with 6-pulse train SASHA than the reference 90°-90°-90° SASHA in both human and animal studies. Adiabatic and pulse train saturation pulses optimized for the different constraints found at 1.5 T and 3 T achieved <1% residual |MZ/M0| in phantom experiments, enabling greater accuracy in quantitative saturation recovery T1 imaging.
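A toy version of the residual-magnetization calculation behind the pulse-train optimization above: under a spoiled hard-pulse approximation (transverse magnetization crushed between pulses, relaxation and off-resonance ignored), the residual Mz/M0 is simply the product of cos(b1·theta_i) over the train. The flip angles below are illustrative only, not the paper's optimized values.

```python
import numpy as np

def residual_mz(flip_deg, b1):
    """Residual Mz/M0 after a spoiled hard-pulse train at B1 scale factor b1."""
    theta = np.deg2rad(np.asarray(flip_deg)) * b1
    return np.prod(np.cos(theta))

candidate_trains = {
    "90-90-90 (standard)": [90, 90, 90],
    "illustrative 6-pulse": [115, 90, 75, 95, 135, 80],   # made-up angles
}
b1_range = np.linspace(0.4, 1.2, 81)   # B1+ range considered at 3 T
for name, train in candidate_trains.items():
    worst = max(abs(residual_mz(train, b1)) for b1 in b1_range)
    print(f"{name}: worst-case |Mz/M0| = {100 * worst:.1f}%")
```

Minimizing that worst-case value over the flip angles is the essence of the optimization; the published design additionally sweeps B0 and accounts for T1 recovery.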
Optimizing Filter-Probe Diffusion Weighting in the Rat Spinal Cord for Human Translation
Budde, Matthew D.; Skinner, Nathan P.; Muftuler, L. Tugan; Schmit, Brian D.; Kurpad, Shekar N.
2017-01-01
Diffusion tensor imaging (DTI) is a promising biomarker of spinal cord injury (SCI). In the acute aftermath, DTI in SCI animal models consistently demonstrates high sensitivity and prognostic performance, yet translation of DTI to acute human SCI has been limited. In addition to technical challenges, interpretation of the resulting metrics is ambiguous, with contributions in the acute setting from both axonal injury and edema. Novel diffusion MRI acquisition strategies such as double diffusion encoding (DDE) have recently enabled detection of features not available with DTI or similar methods. In this work, we perform a systematic optimization of DDE using simulations and an in vivo rat model of SCI and subsequently implement the protocol in the healthy human spinal cord. First, two complementary DDE approaches were evaluated using an orientationally invariant or a filter-probe diffusion encoding approach. While the two methods were similar in their ability to detect acute SCI, the filter-probe DDE approach had greater predictive power for functional outcomes. Next, the filter-probe DDE was compared to an analogous single diffusion encoding (SDE) approach, with the results indicating that in the spinal cord, SDE provides similar contrast with improved signal-to-noise ratio. In the SCI rat model, the filter-probe SDE scheme was coupled with a reduced field of view (rFOV) excitation, and the results demonstrate high quality maps of the spinal cord without contamination from edema and cerebrospinal fluid, thereby providing high sensitivity to injury severity. The optimized protocol was demonstrated in the healthy human spinal cord using a commercially available diffusion MRI sequence with modifications only to the diffusion encoding directions. Maps of axial diffusivity devoid of CSF partial volume effects were obtained in a clinically feasible imaging time with a straightforward analysis and variability comparable to axial diffusivity derived from DTI. Overall, the results and optimizations describe a protocol that mitigates several difficulties with DTI of the spinal cord. Detection of acute axonal damage in the injured or diseased spinal cord will benefit from the optimized filter-probe diffusion MRI protocol outlined here. PMID:29311786
Optimal External Wrench Distribution During a Multi-Contact Sit-to-Stand Task.
Bonnet, Vincent; Azevedo-Coste, Christine; Robert, Thomas; Fraisse, Philippe; Venture, Gentiane
2017-07-01
This paper aims at developing and evaluating a new practical method for the real-time estimation of joint torques and external wrenches during a multi-contact sit-to-stand (STS) task using kinematics data only. The proposed method also identifies the subject-specific body segment inertial parameters that are required to perform inverse dynamics. The identification phase is performed using simple and repeatable motions. Thanks to an accurately identified model, the estimated total external wrench can be used as an input to solve an under-determined multi-contact problem. This problem is solved using a constrained quadratic optimization process that minimizes a hybrid human-like energetic criterion. The weights of this hybrid cost function are adjusted, and a sensitivity analysis is performed in order to robustly reproduce the human external wrench distribution. The results showed that the proposed method could successfully estimate the external wrenches under the buttocks, feet, and hands during STS tasks (RMS error lower than 20 N and 6 N·m). The simplicity and generalization abilities of the proposed method pave the way for future diagnosis solutions and rehabilitation applications, including in-home use.
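To make the under-determined multi-contact step concrete, here is an illustrative constrained quadratic program (not the authors' exact formulation): a known total vertical force is split across buttocks, feet, and hands by minimizing a weighted quadratic "effort" cost subject to the contact forces summing to the total. The total force and weights are assumed values.

```python
import numpy as np
from scipy.optimize import minimize

total_force = 700.0                      # N, from inverse dynamics (assumed)
weights = np.array([1.0, 0.5, 4.0])      # hypothetical penalties: buttocks, feet, hands

cost = lambda f: np.sum(weights * f ** 2)                        # quadratic criterion
cons = [{"type": "eq", "fun": lambda f: np.sum(f) - total_force}]  # wrench balance
bounds = [(0.0, None)] * 3                                       # contacts only push

res = minimize(cost, x0=np.full(3, total_force / 3), bounds=bounds,
               constraints=cons, method="SLSQP")
print("buttocks/feet/hands forces (N):", np.round(res.x, 1))
```

With this cost, force gravitates toward the lightly penalized contacts (here the feet), which is the mechanism the weight-tuning and sensitivity analysis exploit to mimic measured human distributions.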
Benchmarking image fusion system design parameters
NASA Astrophysics Data System (ADS)
Howell, Christopher L.
2013-06-01
A clear and absolute method for discriminating between image fusion algorithm performances is presented. This method can effectively be used to assist in the design and modeling of image fusion systems. Specifically, it is postulated that quantifying human task performance using image fusion should be benchmarked against whether the fusion algorithm, at a minimum, retained the performance benefit achievable by each independent spectral band being fused. The established benchmark would then clearly represent the threshold that a fusion system should surpass to be considered beneficial to a particular task. A genetic algorithm is employed to characterize the fused system parameters using a Matlab® implementation of NVThermIP as the objective function. By setting the problem up as a mixed-integer constraint optimization problem, one can effectively look backwards through the image acquisition process, optimizing fused system parameters by minimizing the difference between the modeled task difficulty measure and the benchmark task difficulty measure. The results of an identification perception experiment, in which human observers were asked to identify a standard set of military targets, are presented and used to demonstrate the effectiveness of the benchmarking process.
Stochastic Resonance in Signal Detection and Human Perception
2006-07-05
learning scheme performing a stochastic gradient ascent on the SNR to determine the optimal noise level based on the samples from the process. Rather than...produce some SR effect in threshold neurons and a new statistically robust learning law was proposed to find the optimal noise level. [McDonnell...Ultimately, we know that it is the brain that responds to a visual stimulus causing neurons to fire. Conceivably if we understood the effect of the noise PDF
ERIC Educational Resources Information Center
MacGregor, James N.; Chronicle, Edward P.; Ormerod, Thomas C.
2006-01-01
We compared the performance of three heuristics with that of subjects on variants of a well-known combinatorial optimization task, the Traveling Salesperson Problem (TSP). The present task consisted of finding the shortest path through an array of points from one side of the array to the other. Like the standard TSP, the task is computationally…
Validation of in vitro assays in three-dimensional human dermal constructs.
Idrees, Ayesha; Chiono, Valeria; Ciardelli, Gianluca; Shah, Siegfried; Viebahn, Richard; Zhang, Xiang; Salber, Jochen
2018-05-01
Three-dimensional cell culture systems are urgently needed for cytocompatibility testing of biomaterials. This work aimed at the development of three-dimensional in vitro dermal skin models and their optimization for cytocompatibility evaluation. Initially a "murine in vitro dermal construct" based on L929 cells was generated, leading to the development of a "human in vitro dermal construct" consisting of normal human dermal fibroblasts in rat tail tendon collagen type I. To assess the viability of the cells, different assays (CellTiter-Blue®, RealTime-Glo™ MT, and CellTiter-Glo®, Promega) were evaluated to identify the assay best suited to the respective cell type and three-dimensional system. Z-stack imaging (Live/Dead and Phalloidin/DAPI, Promokine) was performed to visualize normal human dermal fibroblasts inside the matrix, revealing filopodia-like morphology and a uniform distribution of normal human dermal fibroblasts in the matrix. CellTiter-Glo was found to be the optimal cell viability assay among those analyzed. The CellTiter-Blue reagent affected the cell morphology of normal human dermal fibroblasts (unlike L929), suggesting an interference with cell biological activity and resulting in less reliable viability data. On the other hand, RealTime-Glo provided a linear signal only with a very low cell density, which made this assay unsuitable for this system. CellTiter-Glo was adapted to the three-dimensional dermal construct by optimizing the shaking time to enhance reagent penetration and maximum adenosine triphosphate release, yielding a 2.4 times higher viability value with shaking for 60 min than for 5 min. In addition, viability results showed that cells were viable inside the matrix. This model would be further advanced with more layers of skin to make a full-thickness model.
Geometry and gravity influences on strength capability
NASA Technical Reports Server (NTRS)
Poliner, Jeffrey; Wilmington, Robert P.; Klute, Glenn K.
1994-01-01
Strength, defined as the capability of an individual to produce an external force, is one of the most important determining characteristics of human performance. Knowledge of the strength capabilities of a group of individuals can be applied to designing equipment and workplaces, planning procedures and tasks, and training individuals. In the manned space program, with the high risk and cost associated with spaceflight, information pertaining to human performance is important to ensuring mission success and safety. Knowledge of individuals' strength capabilities in weightlessness is of interest within many areas of NASA, including workplace design, tool development, and mission planning. The weightless environment of space places the human body in a completely different context. Astronauts perform a variety of manual tasks while in orbit. Their ability to perform these tasks is partly determined by the strength capability demanded by each particular task. Thus, an important step in task planning, development, and evaluation is to determine the ability of the humans performing it. This can be accomplished by utilizing quantitative techniques to develop a database of human strength capabilities in weightlessness. Furthermore, if strength characteristics are known, equipment and tools can be built to optimize the operators' performance. This study examined strength in performing a simple task, specifically, using a tool to apply a torque to a fixture.
Dynamic Decision Making in Complex Task Environments: Principles and Neural Mechanisms
2013-03-01
Dynamical models of cognition. Mathematical models of mental processes. Human performance optimization. ...we have continued to develop a neurodynamic theory of decision making, using a combination of computational and experimental approaches, to address...a long history in the field of human cognitive psychology. The theoretical foundations of this research can be traced back to signal detection
Optimization measurement of muscle oxygen saturation under isometric studies using FNIRS
NASA Astrophysics Data System (ADS)
Halim, A. A. A.; Laili, M. H.; Salikin, M. S.; Rusop, M.
2018-05-01
Developments in functional near infrared spectroscopy (fNIRS) technology have advanced signal quantification, using multiple wavelengths and detectors to investigate the hemodynamic response in human muscle. These non-invasive technologies have been widely used to model the propagation of light inside tissue, including the absorption and scattering coefficients, and to quantify the oxygenation level of haemoglobin and myoglobin in human muscle. The goal of this paper is to optimize the measurement of muscle oxygen saturation during isometric exercise using fNIRS. The experiment was carried out on 15 sedentary healthy male volunteers. All volunteers were required to perform an isometric exercise at three assessed levels of muscular fatigue of the flexor digitorum superficialis (FDS) muscle in the human forearm while monitored with fNIRS. The slopes of the signals were used to evaluate the muscle oxygen saturation associated with regional muscle fatigue. The oxygen saturation slope at the 10% exercise level was steeper than at the 30%-50% fatigue levels. The hemodynamic response differed significantly (p=0.04, below the p<0.05 threshold) across the three assessed fatigue levels as measured by fNIRS. Thus, this parameter could be used to estimate human fatigue levels and could open other possibilities for studying muscle performance diagnosis.
A physiologically based model for temporal envelope encoding in human primary auditory cortex.
Dugué, Pierre; Le Bouquin-Jeannès, Régine; Edeline, Jean-Marc; Faucon, Gérard
2010-09-01
Communication sounds exhibit temporal envelope fluctuations in the low frequency range (<70 Hz), and human speech has prominent 2-16 Hz modulations with a maximum at 3-4 Hz. Here, we propose a new phenomenological model of the human auditory pathway (from cochlea to primary auditory cortex) to simulate responses to amplitude-modulated white noise. To validate the model, performance was estimated by quantifying temporal modulation transfer functions (TMTFs). Previous models considered either the lower stages of the auditory system (up to the inferior colliculus) or only the thalamocortical loop. The present model, divided into two stages, is based on anatomical and physiological findings and includes the entire auditory pathway. The first stage, from the outer ear to the colliculus, incorporates inhibitory interneurons in the cochlear nucleus to increase performance at high stimulus levels. The second stage takes into account the anatomical connections of the thalamocortical system and includes the fast and slow excitatory and inhibitory currents. After optimizing the parameters of the model to reproduce the diversity of TMTFs obtained from human subjects, a patient-specific model was derived and the parameters were optimized to effectively reproduce both spontaneous activity and the oscillatory part of the evoked response. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Combined Economic and Hydrologic Modeling to Support Collaborative Decision Making Processes
NASA Astrophysics Data System (ADS)
Sheer, D. P.
2008-12-01
For more than a decade, the core concept of the author's efforts in support of collaborative decision making has been a combination of hydrologic simulation and multi-objective optimization. The modeling has generally been used to support collaborative decision making processes. The OASIS model developed by HydroLogics Inc. solves a multi-objective optimization at each time step using a mixed integer linear program (MILP). The MILP can be configured to include any user-defined objective, including but not limited to economic objectives. For example, estimated marginal values of water for crops and for M&I use were included in the objective function to drive trades in a model of the lower Rio Grande. The formulation of the MILP, constraints and objectives, in any time step is conditional: it changes based on the value of state variables and dynamic external forcing functions, such as rainfall, hydrology, market prices, arrival of migratory fish, water temperature, etc. It therefore acts as a dynamic short term multi-objective economic optimization for each time step. MILP is capable of solving a general problem that includes a very realistic representation of the physical system characteristics in addition to the normal multi-objective optimization objectives and constraints included in economic models. In all of these models, the short term objective function is a surrogate for achieving long term multi-objective results. The long term performance for any alternative (especially including operating strategies) is evaluated by simulation. An operating rule is the combination of conditions, parameters, constraints and objectives used to determine the formulation of the short term optimization in each time step. Heuristic wrappers for the simulation program have been developed to improve the parameters of an operating rule, and research is underway on a wrapper that will allow a genetic algorithm to improve the form of the rule (conditions, constraints, and short term objectives) as well. In the models, operating rules represent different models of human behavior, and the objective of the modeling is to find rules for human behavior that perform well in terms of long term human objectives. The conceptual model used to represent human behavior incorporates economic multi-objective optimization for surrogate objectives, and rules that set those objectives based on current conditions and accounting for uncertainty, at least implicitly. The author asserts that real world operating rules follow this form and have evolved because they have been perceived as successful in the past. Thus, the modeling efforts focus on human behavior in much the same way that economic models focus on human behavior. This paper illustrates the above concepts with real world examples.
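A toy single-time-step allocation in the spirit of the per-step LP/MILP described above (numbers and structure are purely illustrative, and the integer/conditional machinery of the real OASIS formulation is omitted): allocate a limited release among crop use, M&I use, and an instream flow target by maximizing a weighted objective under a supply constraint.

```python
import numpy as np
from scipy.optimize import linprog

supply = 120.0                              # water available this time step (assumed)
value = np.array([1.0, 3.0, 2.0])           # marginal values: crops, M&I, instream (assumed)
demand_cap = [80.0, 50.0, 40.0]             # per-use upper bounds

res = linprog(c=-value,                                  # maximize => minimize the negative
              A_ub=[[1.0, 1.0, 1.0]], b_ub=[supply],     # cannot allocate more than supply
              bounds=[(0, d) for d in demand_cap],
              method="highs")
print("allocation (crops, M&I, instream):", np.round(res.x, 1))
```

In the full framework, an operating rule would change the values, bounds, and constraints conditionally at each step, and long-term performance of the rule would then be judged by simulation.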
Acuña, Daniel E; Parada, Víctor
2010-07-29
Humans need to solve computationally intractable problems such as visual search, categorization, and simultaneous learning and acting, yet an increasing body of evidence suggests that their solutions to instantiations of these problems are near optimal. Computational complexity advances an explanation to this apparent paradox: (1) only a small portion of instances of such problems are actually hard, and (2) successful heuristics exploit structural properties of the typical instance to selectively improve parts that are likely to be sub-optimal. We hypothesize that these two ideas largely account for the good performance of humans on computationally hard problems. We tested part of this hypothesis by studying the solutions of 28 participants to 28 instances of the Euclidean Traveling Salesman Problem (TSP). Participants were provided feedback on the cost of their solutions and were allowed unlimited solution attempts (trials). We found a significant improvement between the first and last trials and that solutions are significantly different from random tours that follow the convex hull and do not have self-crossings. More importantly, we found that participants modified their current better solutions in such a way that edges belonging to the optimal solution ("good" edges) were significantly more likely to stay than other edges ("bad" edges), a hallmark of structural exploitation. We found, however, that more trials harmed the participants' ability to tell good from bad edges, suggesting that after too many trials the participants "ran out of ideas." In sum, we provide the first demonstration of significant performance improvement on the TSP under repetition and feedback and evidence that human problem-solving may exploit the structure of hard problems paralleling behavior of state-of-the-art heuristics.
Acuña, Daniel E.; Parada, Víctor
2010-01-01
Humans need to solve computationally intractable problems such as visual search, categorization, and simultaneous learning and acting, yet an increasing body of evidence suggests that their solutions to instantiations of these problems are near optimal. Computational complexity advances an explanation to this apparent paradox: (1) only a small portion of instances of such problems are actually hard, and (2) successful heuristics exploit structural properties of the typical instance to selectively improve parts that are likely to be sub-optimal. We hypothesize that these two ideas largely account for the good performance of humans on computationally hard problems. We tested part of this hypothesis by studying the solutions of 28 participants to 28 instances of the Euclidean Traveling Salesman Problem (TSP). Participants were provided feedback on the cost of their solutions and were allowed unlimited solution attempts (trials). We found a significant improvement between the first and last trials and that solutions are significantly different from random tours that follow the convex hull and do not have self-crossings. More importantly, we found that participants modified their current better solutions in such a way that edges belonging to the optimal solution (“good” edges) were significantly more likely to stay than other edges (“bad” edges), a hallmark of structural exploitation. We found, however, that more trials harmed the participants' ability to tell good from bad edges, suggesting that after too many trials the participants “ran out of ideas.” In sum, we provide the first demonstration of significant performance improvement on the TSP under repetition and feedback and evidence that human problem-solving may exploit the structure of hard problems paralleling behavior of state-of-the-art heuristics. PMID:20686597
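For readers unfamiliar with the kind of local-improvement heuristics alluded to above, the following sketch shows a simple 2-opt edge-swap search on a random 28-city instance: each "trial" keeps the current tour and reverses one segment whenever that shortens it, loosely analogous to participants retaining good edges while editing bad ones. This is only an illustration of structural exploitation by a standard heuristic, not a model of the participants' strategy.

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.random((28, 2))                 # 28 cities, mirroring the instance size

def tour_len(tour):
    d = pts[np.r_[tour, tour[0]]]         # close the loop
    return np.sum(np.linalg.norm(np.diff(d, axis=0), axis=1))

def two_opt_once(tour):
    """Reverse one segment if that shortens the tour; otherwise return it unchanged."""
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 2, n):
            new = np.concatenate([tour[:i + 1], tour[i + 1:j + 1][::-1], tour[j + 1:]])
            if tour_len(new) < tour_len(tour):
                return new
    return tour

tour = rng.permutation(28)
for trial in range(10):                   # repeated attempts with feedback on cost
    tour = two_opt_once(tour)
    print(f"trial {trial + 1}: tour length {tour_len(tour):.3f}")
```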
Effects of Shift Work on Air Force Security Police Personnel.
1986-01-01
Naitoh, Paul. "Chronobiologic Approach for Optimizing Human Performance." In Rhythmic Aspects of Behavior, eds. Frederick M. Brown and R. Curtis...shift work has a significant effect on those who perform it, this study examines the perceived effects of shift work on a population of security
Lattice Boltzmann Simulation Optimization on Leading Multicore Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Samuel; Carter, Jonathan; Oliker, Leonid
2008-02-01
We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Clovertown, AMD Opteron X2, Sun Niagara2, STI Cell, as well as the single core Intel Itanium2. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 14x improvement compared with the original code. Additionally, we present detailed analysis of each optimization, which reveals surprising hardware bottlenecks and software challenges for future multicore systems and applications.
Lattice Boltzmann simulation optimization on leading multicore platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, S.; Carter, J.; Oliker, L.
2008-01-01
We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Clovertown, AMD Opteron X2, Sun Niagara2, STI Cell, as well as the single core Intel Itanium2. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 14x improvement compared with the original code. Additionally, we present detailed analysis of each optimization, which reveals surprising hardware bottlenecks and software challenges for future multicore systems and applications.
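The essence of search-based auto-tuning can be illustrated with a much smaller example than the LBMHD code generator (this is my own toy, not the authors' tool): generate several parameterized variants of the same kernel, time each on the target machine, and keep the fastest configuration.

```python
import time
import numpy as np

A = np.random.rand(2048, 2048)

def blocked_rowsum(block):
    """Row-sum kernel parameterized by a block size (the tuning knob)."""
    out = np.empty(A.shape[0])
    for i in range(0, A.shape[0], block):
        out[i:i + block] = A[i:i + block].sum(axis=1)
    return out

def benchmark(fn, reps=5):
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - t0)
    return best

timings = {b: benchmark(lambda b=b: blocked_rowsum(b)) for b in (64, 128, 256, 512, 1024)}
best_block = min(timings, key=timings.get)
print(f"fastest block size: {best_block} ({timings[best_block] * 1e3:.1f} ms)")
```

The real tuner searches a far larger space (loop orderings, vectorization, padding, threading) per platform, which is why machine-generated search beats hand tuning across diverse architectures.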
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Hong; Wang, Shaobu; Fan, Rui
This report summarizes the work performed under the LDRD project on the preliminary study of knowledge automation, with specific focus on investigating the impact of uncertainties in human decision making on the optimization of process operation. First, the statistics of signals from the brain-computer interface (BCI) are analyzed so as to characterize the uncertainties of human operators during the decision making phase using electroencephalogram (EEG) signals. This is followed by a discussion of an architecture that reveals the equivalence between optimization and closed loop feedback control design, where it has been shown that all optimization problems can be transformed into control design problems for closed loop systems. This has led to a "closed loop" framework, where the structure of the decision making is shown to be subjected to both process disturbances and controller uncertainties. The latter can well represent the uncertainties or randomness that occur during the human decision making phase. As a result, a stochastic optimization problem has been formulated and a novel solution has been proposed using probability density function (PDF) shaping for both the cost function and the constraints, using the stochastic distribution control concept. A sufficient condition has been derived that guarantees the convergence of the optimal solution, and discussions have been made for both the total probabilistic solution and chance-constrained optimization, which have been well studied in the optimal power flow (OPF) area. A simple case study has been carried out for the economic dispatch of power for a grid system with distributed energy resources (DERs), and encouraging results have been obtained showing that significant savings on the generation cost can be expected.
Metaheuristic Algorithms for Convolution Neural Network
Fanany, Mohamad Ivan; Arymurthy, Aniati Murni
2016-01-01
A typical modern optimization technique is usually either heuristic or metaheuristic. These techniques have managed to solve some optimization problems in the research areas of science, engineering, and industry. However, implementation strategies of metaheuristics for accuracy improvement of convolution neural networks (CNN), a famous deep learning method, are still rarely investigated. Deep learning relates to a type of machine learning technique whose aim is to move closer to the goal of artificial intelligence: creating a machine that could successfully perform any intellectual task that can be carried out by a human. In this paper, we propose implementation strategies for three popular metaheuristic approaches, namely simulated annealing, differential evolution, and harmony search, to optimize CNN. The performances of these metaheuristic methods in optimizing CNN on classifying the MNIST and CIFAR datasets were evaluated and compared. Furthermore, the proposed methods are also compared with the original CNN. Although the proposed methods show an increase in computation time, their accuracy has also been improved (up to 7.14 percent). PMID:27375738
Metaheuristic Algorithms for Convolution Neural Network.
Rere, L M Rasdi; Fanany, Mohamad Ivan; Arymurthy, Aniati Murni
2016-01-01
A typical modern optimization technique is usually either heuristic or metaheuristic. These techniques have managed to solve some optimization problems in the research areas of science, engineering, and industry. However, implementation strategies of metaheuristics for accuracy improvement of convolution neural networks (CNN), a famous deep learning method, are still rarely investigated. Deep learning relates to a type of machine learning technique whose aim is to move closer to the goal of artificial intelligence: creating a machine that could successfully perform any intellectual task that can be carried out by a human. In this paper, we propose implementation strategies for three popular metaheuristic approaches, namely simulated annealing, differential evolution, and harmony search, to optimize CNN. The performances of these metaheuristic methods in optimizing CNN on classifying the MNIST and CIFAR datasets were evaluated and compared. Furthermore, the proposed methods are also compared with the original CNN. Although the proposed methods show an increase in computation time, their accuracy has also been improved (up to 7.14 percent).
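As a minimal sketch of one of the metaheuristics named above, here is a generic simulated-annealing loop; the objective is a toy stand-in, whereas applying it to a CNN would mean evaluating validation error for each candidate parameter vector (a far more expensive call).

```python
import math
import random

def objective(x):                      # hypothetical loss surface, not a CNN
    return (x[0] - 0.3) ** 2 + (x[1] + 1.2) ** 2

def simulated_annealing(x0, t0=1.0, cooling=0.95, steps=500):
    x, fx, temp = list(x0), objective(x0), t0
    for _ in range(steps):
        cand = [xi + random.gauss(0, 0.1) for xi in x]   # random neighbor
        fc = objective(cand)
        # accept better moves always, worse moves with Boltzmann probability
        if fc < fx or random.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
        temp *= cooling                                   # geometric cooling schedule
    return x, fx

best, loss = simulated_annealing([2.0, 2.0])
print("best parameters:", [round(v, 3) for v in best], "loss:", round(loss, 4))
```

Differential evolution and harmony search follow the same pattern of propose-evaluate-accept, differing only in how candidate solutions are generated.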
Simen, Patrick; Contreras, David; Buck, Cara; Hu, Peter; Holmes, Philip; Cohen, Jonathan D
2009-12-01
The drift-diffusion model (DDM) implements an optimal decision procedure for stationary, 2-alternative forced-choice tasks. The height of a decision threshold applied to accumulating information on each trial determines a speed-accuracy tradeoff (SAT) for the DDM, thereby accounting for a ubiquitous feature of human performance in speeded response tasks. However, little is known about how participants settle on particular tradeoffs. One possibility is that they select SATs that maximize a subjective rate of reward earned for performance. For the DDM, there exist unique, reward-rate-maximizing values for its threshold and starting point parameters in free-response tasks that reward correct responses (R. Bogacz, E. Brown, J. Moehlis, P. Holmes, & J. D. Cohen, 2006). These optimal values vary as a function of response-stimulus interval, prior stimulus probability, and relative reward magnitude for correct responses. We tested the resulting quantitative predictions regarding response time, accuracy, and response bias under these task manipulations and found that grouped data conformed well to the predictions of an optimally parameterized DDM.
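The reward-rate-maximizing threshold referred to above can be computed directly from the standard closed-form DDM expressions for error rate and mean decision time (as in Bogacz et al., 2006). The sketch below does this by a simple grid search; the drift, noise, and timing parameters are illustrative values, not fits to the reported data.

```python
import numpy as np

a, c = 0.15, 0.33          # drift rate and diffusion noise (assumed)
t_nd, rsi = 0.3, 1.0       # non-decision time and response-stimulus interval (s)

def reward_rate(z):
    er = 1.0 / (1.0 + np.exp(2.0 * a * z / c ** 2))      # error rate at threshold z
    dt = (z / a) * np.tanh(a * z / c ** 2)               # mean decision time
    return (1.0 - er) / (dt + t_nd + rsi)                # correct responses per second

thresholds = np.linspace(0.01, 2.0, 2000)
best_z = thresholds[np.argmax([reward_rate(z) for z in thresholds])]
print(f"reward-rate-maximizing threshold: z = {best_z:.3f}")
```

Repeating the search for different response-stimulus intervals or reward magnitudes yields the parameter-dependent optimal speed-accuracy tradeoffs that the study tested against human data.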
Encoder-Decoder Optimization for Brain-Computer Interfaces
Merel, Josh; Pianto, Donald M.; Cunningham, John P.; Paninski, Liam
2015-01-01
Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages. PMID:26029919
Encoder-decoder optimization for brain-computer interfaces.
Merel, Josh; Pianto, Donald M; Cunningham, John P; Paninski, Liam
2015-06-01
Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup which provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages.
Huang, Daizheng; Wu, Zhihui
2017-01-01
Accurately predicting the trend of outpatient visits by mathematical modeling can help policy makers manage hospitals effectively, reasonably organize schedules for human resources and finances, and appropriately distribute hospital material resources. In this study, a hybrid method based on empirical mode decomposition and back-propagation artificial neural networks optimized by particle swarm optimization is developed to forecast outpatient visits on the basis of monthly numbers. First, the outpatient visit data from January 2005 to December 2013 are retrieved and used as the original time series. Second, the original time series is decomposed into a finite and often small number of intrinsic mode functions by the empirical mode decomposition technique. Third, a three-layer back-propagation artificial neural network is constructed to forecast each intrinsic mode function. To improve network performance and avoid falling into a local minimum, particle swarm optimization is employed to optimize the weights and thresholds of the back-propagation artificial neural networks. Finally, the superposition of the forecasting results of the intrinsic mode functions is regarded as the ultimate forecasting value. Simulation indicates that the proposed method attains a better performance index than the other four methods. PMID:28222194
Huang, Daizheng; Wu, Zhihui
2017-01-01
Accurately predicting the trend of outpatient visits by mathematical modeling can help policy makers manage hospitals effectively, reasonably organize schedules for human resources and finances, and appropriately distribute hospital material resources. In this study, a hybrid method based on empirical mode decomposition and back-propagation artificial neural networks optimized by particle swarm optimization is developed to forecast outpatient visits on the basis of monthly numbers. First, the outpatient visit data from January 2005 to December 2013 are retrieved and used as the original time series. Second, the original time series is decomposed into a finite and often small number of intrinsic mode functions by the empirical mode decomposition technique. Third, a three-layer back-propagation artificial neural network is constructed to forecast each intrinsic mode function. To improve network performance and avoid falling into a local minimum, particle swarm optimization is employed to optimize the weights and thresholds of the back-propagation artificial neural networks. Finally, the superposition of the forecasting results of the intrinsic mode functions is regarded as the ultimate forecasting value. Simulation indicates that the proposed method attains a better performance index than the other four methods.
A Bayesian Framework for Human Body Pose Tracking from Depth Image Sequences
Zhu, Youding; Fujimura, Kikuo
2010-01-01
This paper addresses the problem of accurate and robust tracking of 3D human body pose from depth image sequences. Recovering the large number of degrees of freedom in human body movements from a depth image sequence is challenging due to the need to resolve the depth ambiguity caused by self-occlusions and the difficulty of recovering from tracking failure. Human body poses could be estimated through model fitting using dense correspondences between depth data and an articulated human model (local optimization method). Although it usually achieves high accuracy due to dense correspondences, it may fail to recover from tracking failure. Alternatively, human pose may be reconstructed by detecting and tracking human body anatomical landmarks (key-points) based on low-level depth image analysis. While this method (key-point based method) is robust and recovers from tracking failure, its pose estimation accuracy depends solely on the image-based localization accuracy of the key-points. To address these limitations, we present a flexible Bayesian framework for integrating pose estimation results obtained by methods based on key-points and local optimization. Experimental results are shown and a performance comparison is presented to demonstrate the effectiveness of the proposed approach. PMID:22399933
Hamedi, Raheleh; Hadjmohammadi, Mohammad Reza
2016-12-01
A sensitive and rapid method based on alcohol-assisted dispersive liquid-liquid microextraction followed by high-performance liquid chromatography for the determination of fluoxetine in human plasma and urine samples was developed. The effects of six parameters on the extraction recovery were investigated and optimized utilizing a Plackett-Burman design and a Box-Behnken design, respectively. According to the Plackett-Burman design results, the volume of disperser solvent, extraction time, and stirring speed had no effect on the recovery of fluoxetine. The optimized conditions comprised a mixture of 172 μL of 1-octanol as extraction solvent and 400 μL of methanol as disperser solvent, pH of 11.3, and 0% w/v of salt in the sample solution. Replicating the experiment five times under the optimized conditions gave an average extraction recovery of 90.15%. The detection limit of fluoxetine in human plasma was 3 ng/mL, and the linearity was in the range of 10-1200 ng/mL. The corresponding values for human urine were 4.2 ng/mL with a linearity range from 10 to 2000 ng/mL. Relative standard deviations for intra- and inter-day extraction of fluoxetine were less than 7% in five measurements. The developed method was successfully applied to the determination of fluoxetine in human plasma and urine samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Meena, Yogesh Kumar; Cecotti, Hubert; Wong-Lin, Kongfatt; Dutta, Ashish; Prasad, Girijesh
2018-04-01
Virtual keyboard applications and alternative communication devices provide new means of communication to assist disabled people. To date, virtual keyboard optimization schemes based on script-specific information, along with multimodal input access facilities, have been limited. In this paper, we propose a novel method for optimizing the positions of the displayed items in gaze-controlled, tree-based menu selection systems by considering a combination of letter frequency and command selection time. The optimized graphical user interface layout has been designed for a Hindi-language virtual keyboard based on a menu in which 10 commands provide access to 88 different characters, along with additional text editing commands. The system can be controlled in two different modes: eye-tracking alone and eye-tracking with an access soft-switch. Five different keyboard layouts have been presented and evaluated with ten healthy participants. Furthermore, the two best-performing keyboard layouts have been evaluated with eye-tracking alone on ten stroke patients. The overall performance analysis demonstrated significantly superior typing performance, high usability (87% SUS score), and low workload (NASA-TLX score of 17) for the letter-frequency and time-based organization with script-specific arrangement design. This paper presents the first optimized gaze-controlled Hindi virtual keyboard, which can be extended to other languages.
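The core frequency/selection-time idea above can be sketched with toy data (Latin letters and made-up slot times stand in for the 88 Hindi characters and measured gaze-selection times): put the most frequent symbols in the menu positions that are fastest to reach.

```python
# Toy letter frequencies (%) and per-slot selection times (s); both assumed.
letter_freq = {"e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "i": 7.0,
               "n": 6.7, "s": 6.3, "h": 6.1, "r": 6.0, "d": 4.3}
slot_time = {"slot1": 0.8, "slot2": 0.9, "slot3": 1.1, "slot4": 1.3, "slot5": 1.6,
             "slot6": 1.8, "slot7": 2.0, "slot8": 2.3, "slot9": 2.6, "slot10": 3.0}

letters = sorted(letter_freq, key=letter_freq.get, reverse=True)   # most frequent first
slots = sorted(slot_time, key=slot_time.get)                        # fastest slots first
layout = dict(zip(slots, letters))                                  # greedy assignment

expected_cost = sum(letter_freq[layout[s]] * slot_time[s] for s in layout)
print(layout)
print("expected frequency-weighted selection time:", round(expected_cost, 2))
```

Because both sequences are sorted, this greedy pairing minimizes the frequency-weighted selection time for a single-level menu; the published design additionally respects script-specific grouping and the tree structure of the command menu.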
Human Visual Search Does Not Maximize the Post-Saccadic Probability of Identifying Targets
Morvan, Camille; Maloney, Laurence T.
2012-01-01
Researchers have conjectured that eye movements during visual search are selected to minimize the number of saccades. The optimal Bayesian eye movement strategy minimizing saccades does not simply direct the eye to whichever location is judged most likely to contain the target but makes use of the entire retina as an information gathering device during each fixation. Here we show that human observers do not minimize the expected number of saccades in planning saccades in a simple visual search task composed of three tokens. In this task, the optimal eye movement strategy varied, depending on the spacing between tokens (in the first experiment) or the size of tokens (in the second experiment), and changed abruptly once the separation or size surpassed a critical value. None of our observers changed strategy as a function of separation or size. Human performance fell far short of ideal, both qualitatively and quantitatively. PMID:22319428
A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration
NASA Technical Reports Server (NTRS)
Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce
2008-01-01
Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
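A toy Monte Carlo version of the probabilistic analysis described above (all numbers are invented for illustration): a deterministic plan assumes every one of ten launches succeeds, while the stochastic simulation draws launch outcomes and applies a simple contingency rule (a limited number of re-flights) to see how often the planned scenario is actually achieved.

```python
import numpy as np

rng = np.random.default_rng(42)
n_missions, p_success, spare_launches = 10, 0.95, 2   # assumed campaign parameters

def simulate_campaign():
    spares, completed = spare_launches, 0
    for _ in range(n_missions):
        if rng.random() < p_success:
            completed += 1
        elif spares > 0 and rng.random() < p_success:   # contingency re-flight
            spares -= 1
            completed += 1
    return completed

runs = np.array([simulate_campaign() for _ in range(20000)])
print("deterministic plan: 10/10 missions")
print(f"P(all 10 missions completed) = {np.mean(runs == n_missions):.3f}")
print(f"mean missions completed      = {runs.mean():.2f}")
```

Comparing the distribution of simulated outcomes against the deterministic schedule is what exposes how robust (or fragile) a scenario and its contingency rules really are.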
NASA Astrophysics Data System (ADS)
He, Xin
2017-03-01
The ideal observer is widely used in imaging system optimization. One practical question remains open: do the ideal and human observers have the same preference in system optimization and evaluation? Based on the ideal observer's mathematical properties proposed by Barrett et al. and the empirical properties of human observers investigated by Myers et al., I attempt to pursue general rules regarding the applicability of the ideal observer in system optimization. Particularly, in software optimization, the ideal observer pursues data conservation while humans pursue data presentation or perception. In hardware optimization, the ideal observer pursues a system with the maximum total information, while humans pursue a system with the maximum selected (e.g., certain frequency bands) information. These different objectives may result in different system optimizations between human and ideal observers. Thus, an ideal observer optimized system is not necessarily optimal for humans. I cite empirical evidence in search and detection tasks, in hardware and software evaluation, in X-ray CT, pinhole imaging, as well as emission computed tomography to corroborate the claims. (Disclaimer: the views expressed in this work do not necessarily represent those of the FDA)
Optimizing the NASA Technical Report Server
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maa, Ming-Hokng
1996-01-01
The NASA Technical Report Server (NTRS), a World Wide Web distribution service for NASA technical publications, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
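The parallel-query idea can be sketched as follows (this is only an illustration; search_one and the database names are stand-ins, not NTRS code): issue all per-database searches concurrently so total latency tracks the slowest backend rather than the sum of all of them.

```python
import concurrent.futures
import time

DATABASES = ["langley", "ames", "lewis", "dryden", "goddard"]   # hypothetical backends

def search_one(db, query):
    time.sleep(0.4)                      # placeholder for network + search time
    return f"{db}: 3 hits for '{query}'"

def search_all(query):
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(DATABASES)) as pool:
        futures = {pool.submit(search_one, db, query): db for db in DATABASES}
        return [f.result() for f in concurrent.futures.as_completed(futures)]

t0 = time.time()
for line in search_all("wind tunnel"):
    print(line)
print(f"elapsed: {time.time() - t0:.2f} s (vs ~{0.4 * len(DATABASES):.1f} s sequential)")
```

Fanning the query out in parallel is the mechanism behind the reported average 2.3x reduction in user access time.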
A Concept for Optimizing Behavioural Effectiveness & Efficiency
NASA Astrophysics Data System (ADS)
Barca, Jan Carlo; Rumantir, Grace; Li, Raymond
Both humans and machines exhibit strengths and weaknesses that can be enhanced by merging the two entities. This research aims to provide a broader understanding of how closer interactions between these two entities can facilitate more optimal goal-directed performance through the use of artificial extensions of the human body. Such extensions may assist us in adapting to and manipulating our environments in a more effective way than any system known today. To demonstrate this concept, we have developed a simulation where a semi-interactive virtual spider can be navigated through an environment consisting of several obstacles and a virtual predator capable of killing the spider. The virtual spider can be navigated through the use of three different control systems that can be used to assist in optimising overall goal-directed performance. The first two control systems use an onscreen button interface and a touch sensor, respectively, to facilitate human navigation of the spider. The third control system is an autonomous navigation system that uses machine intelligence embedded in the spider. This system enables the spider to navigate and react to changes in its local environment. The results of this study indicate that machines should be allowed to override human control in order to maximise the benefits of collaboration between man and machine. This research further indicates that the development of strong machine intelligence, sensor systems that engage all human senses, extra sensory input systems, physical remote manipulators, multiple intelligent extensions of the human body, as well as a tighter symbiosis between man and machine, can support an upgrade of the human form.
Vestibular models for design and evaluation of flight simulator motion
NASA Technical Reports Server (NTRS)
Bussolari, S. R.; Sullivan, R. B.; Young, L. R.
1986-01-01
The use of spatial orientation models in the design and evaluation of control systems for motion-base flight simulators is investigated experimentally. The development of a high-fidelity motion drive controller using an optimal control approach based on human vestibular models is described. The formulation and implementation of the optimal washout system are discussed. The effectiveness of the motion washout system was evaluated by studying the response of six motion washout systems to the NASA/AMES Vertical Motion Simulator for a single dash-quick-stop maneuver. The effects of the motion washout system on pilot performance and simulator acceptability are examined. The data reveal that human spatial orientation models are useful for the design and evaluation of flight simulator motion fidelity.
Protocol for vital dye staining of corneal endothelial cells.
Park, Sunju; Fong, Alan G; Cho, Hyung; Zhang, Cheng; Gritz, David C; Mian, Gibran; Herzlich, Alexandra A; Gore, Patrick; Morganti, Ashley; Chuck, Roy S
2012-12-01
To describe a step-by-step methodology to establish a reproducible staining protocol for the evaluation of human corneal endothelial cells. Four procedures were performed to determine the best protocol. (1) To determine the optimal trypan blue staining method, goat corneas were stained with 4 dilutions of trypan blue (0.4%, 0.2%, 0.1%, and 0.05%) and 1% alizarin red. (2) To determine the optimal alizarin red staining method, goat corneas were stained with 2 dilutions of alizarin red (1% and 0.5%) and 0.2% trypan blue. (3) To ensure that trypan blue truly stains damaged cells, goat corneas were exposed to either 3% hydrogen peroxide or to balanced salt solution, and then stained with 0.2% trypan blue and 0.5% alizarin red. (4) Finally, fresh human corneal buttons were examined; 1 group was stained with 0.2% trypan blue and another group with 0.4% trypan blue. For the 4 procedures performed, the results are as follows: (1) trypan blue staining was not observed in any of the normal corneal samples; (2) 0.5% alizarin red demonstrated sharper cell borders than 1% alizarin red; (3) positive trypan blue staining was observed in the hydrogen peroxide exposed tissue in damaged areas; (4) 0.4% trypan blue showed more distinct positive staining than 0.2% trypan blue. We were able to determine the optimal vital dye staining conditions for human corneal endothelial cells using 0.4% trypan blue and 0.5% alizarin red.
Sauer, Charles W; Boutin, Mallory A; Kim, Jae H
2017-05-01
Very-low-birth-weight infants continue to face significant difficulties with postnatal growth. Human milk is the optimal form of nutrition for infants but may exhibit variation in nutrient content. This study aimed to perform macronutrient analysis on expressed human milk from mothers whose babies are hospitalized in the neonatal intensive care unit. Up to five human milk samples per participant were analyzed for protein, carbohydrate, and fat content using reference chemical analyses (Kjeldahl for protein, high pressure liquid chromatography for carbohydrates, and Mojonnier for fat). Calorie content was calculated. A total of 64 samples from 24 participants was analyzed. Wide variability was found in calorie, protein, carbohydrate, and fat composition. The authors found an average of 17.9 kcal/ounce, with only 34% of samples falling within 10% of the expected caloric density. The assumption that human milk contains 20 kcal/ounce is no longer supported based on this study. This supports promoting an individualized nutrition strategy as a crucial aspect to optimal nutrition.
Naval Special Warfare Injury Prevention and Human Performance Initiative
2012-06-30
that these two macronutrients fall below recommended amounts, it may impair physical performance. Dietary supplement use was reported in 73% the...a suboptimal macronutrient distribution to fuel and recover from daily hard PT. To optimize the adaptations from PT, it is recommended to increase...the injuries with 65% attributed to running and weight lifting • Higher than desirable body fat • Insufficient and inappropriate macronutrient
Numerical integration and optimization of motions for multibody dynamic systems
NASA Astrophysics Data System (ADS)
Aguilar Mayans, Joan
This thesis considers the optimization and simulation of motions involving rigid body systems. It does so in three distinct parts, with the following topics: optimization and analysis of human high-diving motions, efficient numerical integration of rigid body dynamics with contacts, and motion optimization of a two-link robot arm using Finite-Time Lyapunov Analysis. The first part introduces the concept of eigenpostures, which we use to simulate and analyze human high-diving motions. Eigenpostures are used in two different ways: first, to reduce the complexity of the optimal control problem that we solve to obtain such motions, and second, to generate an eigenposture space to which we map existing real world motions to better analyze them. The benefits of using eigenpostures are showcased through different examples. The second part reviews an extensive list of integration algorithms used for the integration of rigid body dynamics. We analyze the accuracy and stability of the different integrators in the three-dimensional space and the rotation space SO(3). Integrators with an accuracy higher than first order perform more efficiently than integrators with first order accuracy, even in the presence of contacts. The third part uses Finite-time Lyapunov Analysis to optimize motions for a two-link robot arm. Finite-Time Lyapunov Analysis diagnoses the presence of time-scale separation in the dynamics of the optimized motion and provides the information and methodology for obtaining an accurate approximation to the optimal solution, avoiding the complications that timescale separation causes for alternative solution methods.
Yang, Jie; Zhang, Pengcheng; Zhang, Liyuan; Shu, Huazhong; Li, Baosheng; Gui, Zhiguo
2017-01-01
In inverse treatment planning of intensity-modulated radiation therapy (IMRT), the objective function is typically the sum of weighted sub-scores, where the weights indicate the importance of the sub-scores. To obtain a high-quality treatment plan, the planner manually adjusts the objective weights using a trial-and-error procedure until an acceptable plan is reached. In this work, a new particle swarm optimization (PSO) method which can adjust the weighting factors automatically was investigated to overcome the requirement of manual adjustment, thereby reducing the workload of the human planner and contributing to the development of a fully automated planning process. The proposed optimization method consists of three steps. (i) First, a swarm of weighting factors (i.e., particles) is initialized randomly in the search space, where each particle corresponds to a global objective function. (ii) Then, a plan optimization solver is employed to obtain the optimal solution for each particle, and the values of the evaluation functions used to determine the particle's location and the population global location for the PSO are calculated based on these results. (iii) Next, the weighting factors are updated based on the particle's location and the population global location. Step (ii) is performed alternately with step (iii) until the termination condition is reached. In this method, the evaluation function is a combination of several key points on the dose volume histograms. Furthermore, a perturbation strategy - the crossover and mutation operator hybrid approach - is employed to enhance the population diversity, and two arguments are applied to the evaluation function to improve the flexibility of the algorithm. In this study, the proposed method was used to develop IMRT treatment plans involving five unequally spaced 6 MV photon beams for 10 prostate cancer cases. The proposed optimization algorithm yielded high-quality plans for all of the cases, without human planner intervention. A comparison with the optimized solution obtained using a similar optimization model but with human planner intervention revealed that the proposed algorithm produced plans superior to those developed manually. The proposed algorithm can generate admissible solutions within reasonable computational times and can be used to develop fully automated IMRT treatment planning methods, thus reducing human planners' workloads during iterative processes. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
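A generic particle-swarm loop of the kind outlined in steps (i)-(iii) above, as a sketch only: evaluate_plan stands in for "run the plan optimizer with these objective weights and score the resulting dose-volume points", which here is replaced by a toy function with a known optimum.

```python
import numpy as np

rng = np.random.default_rng(3)
n_particles, n_dims, n_iter = 20, 4, 100
w_inertia, c1, c2 = 0.7, 1.5, 1.5          # standard PSO coefficients (assumed)

def evaluate_plan(weights):                 # toy surrogate; lower is better
    return np.sum((weights - np.array([0.2, 1.0, 0.5, 2.0])) ** 2)

x = rng.uniform(0, 3, (n_particles, n_dims))       # particle positions = objective weights
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([evaluate_plan(p) for p in x])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_dims))
    v = w_inertia * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0, 3)
    vals = np.array([evaluate_plan(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best objective weights:", np.round(gbest, 3))
```

In the clinical setting each evaluate_plan call is itself a full inverse-planning solve, so the swarm size and iteration count trade plan quality against total planning time.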
Performance considerations for high-definition head-mounted displays
NASA Technical Reports Server (NTRS)
Edwards, Oliver J.; Larimer, James; Gille, Jennifer
1992-01-01
Image optimization in the design of helmet-mounted displays (HMDs) for military systems is discussed within the framework of a systems-engineering approach that encompasses (1) a description of natural targets in the field; (2) the characteristics of human visual perception; and (3) device specifications that directly relate to these ecological and human-factors parameters. Attention is given to target size and contrast and to the relationship of the modulation transfer function to image resolution.
Porsa, Sina; Lin, Yi-Chung; Pandy, Marcus G
2016-08-01
The aim of this study was to compare the computational performances of two direct methods for solving large-scale, nonlinear, optimal control problems in human movement. Direct shooting and direct collocation were implemented on an 8-segment, 48-muscle model of the body (24 muscles on each side) to compute the optimal control solution for maximum-height jumping. Both algorithms were executed on a freely-available musculoskeletal modeling platform called OpenSim. Direct collocation converged to essentially the same optimal solution up to 249 times faster than direct shooting when the same initial guess was assumed (3.4 h of CPU time for direct collocation vs. 35.3 days for direct shooting). The model predictions were in good agreement with the time histories of joint angles, ground reaction forces and muscle activation patterns measured for subjects jumping to their maximum achievable heights. Both methods converged to essentially the same solution when started from the same initial guess, but computation time was sensitive to the initial guess assumed. Direct collocation demonstrates exceptional computational performance and is well suited to performing predictive simulations of movement using large-scale musculoskeletal models.
A PERFECT MATCH CONDITION FOR POINT-SET MATCHING PROBLEMS USING THE OPTIMAL MASS TRANSPORT APPROACH
CHEN, PENGWEN; LIN, CHING-LONG; CHERN, I-LIANG
2013-01-01
We study the performance of optimal mass transport-based methods applied to point-set matching problems. The present study, which is based on the L2 mass transport cost, states that perfect matches always occur when the product of the point-set cardinality and the norm of the curl of the non-rigid deformation field does not exceed some constant. This analytic result is justified by a numerical study of matching two sets of pulmonary vascular tree branch points whose displacement is caused by the lung volume changes in the same human subject. The nearly perfect match performance verifies the effectiveness of this mass transport-based approach. PMID:23687536
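As a toy illustration of this matching setup: with equal-cardinality point sets and uniform point masses, the discrete L2 transport problem reduces to an optimal assignment, which the sketch below solves with the Hungarian algorithm on synthetic points under a smooth deformation. This is an illustration only, not the authors' implementation or data.

```python
# Toy illustration: matching two equal-size point sets under a squared-L2 transport
# cost. With uniform point masses this is an optimal assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
source = rng.uniform(0, 1, size=(50, 3))          # e.g., branch points at volume 1
smooth_shift = 0.05 * np.sin(2 * np.pi * source)  # a smooth, low-curl deformation
target = source + smooth_shift                    # branch points at volume 2

cost = cdist(source, target, metric="sqeuclidean")
rows, cols = linear_sum_assignment(cost)

# A "perfect match" here means point i is assigned to its own deformed image.
perfect = np.mean(cols == rows)
print(f"fraction of correctly matched points: {perfect:.2f}")
print(f"total transport cost: {cost[rows, cols].sum():.4f}")
```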
Human Albumin Fragments Nanoparticles as PTX Carrier for Improved Anti-cancer Efficacy
Ge, Liang; You, Xinru; Huang, Jun; Chen, Yuejian; Chen, Li; Zhu, Ying; Zhang, Yuan; Liu, Xiqiang; Wu, Jun; Hai, Qian
2018-01-01
To enhance anti-cancer performance, human serum albumin fragment (HSAF) nanoparticles (NPs) were developed as a paclitaxel (PTX) carrier in this paper. Human albumin was broken into fragments via degradation and crosslinked by genipin to form HSAF NPs with better biocompatibility, improved PTX drug loading and sustained drug release. Compared with crosslinked human serum albumin NPs, the HSAF-NPs showed a relatively smaller particle size, higher drug loading, and improved sustained release. Cellular and animal results both indicated that the PTX-encapsulated HSAF-NPs show good anti-cancer performance, and the anti-cancer results confirmed that NPs with fast cellular internalization showed better tumor inhibition. These findings not only provide a safe and robust drug delivery NP platform for cancer therapy, but also offer fundamental information for the optimal design of albumin-based NPs. PMID:29946256
NASA Astrophysics Data System (ADS)
Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan
2015-10-01
Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
NASA Astrophysics Data System (ADS)
Hardy, Jason; Campbell, Mark; Miller, Isaac; Schimpf, Brian
2008-10-01
The local path planner implemented on Cornell's 2007 DARPA Urban Challenge entry vehicle Skynet utilizes a novel mixture of discrete and continuous path planning steps to facilitate a safe, smooth, and human-like driving behavior. The planner first solves for a feasible path through the local obstacle map using a grid based search algorithm. The resulting path is then refined using a cost-based nonlinear optimization routine with both hard and soft constraints. The behavior of this optimization is influenced by tunable weighting parameters which govern the relative cost contributions assigned to different path characteristics. This paper studies the sensitivity of the vehicle's performance to these path planner weighting parameters using a data driven simulation based on logged data from the National Qualifying Event. The performance of the path planner in both the National Qualifying Event and in the Urban Challenge is also presented and analyzed.
NASA Astrophysics Data System (ADS)
Lily; Laila, L.; Prasetyo, B. E.
2018-03-01
A selective, reproducible, effective, sensitive, simple and fast high-performance liquid chromatography (HPLC) method was developed, optimized and validated to analyze 25-desacetyl rifampicin (25-DR) in human urine from tuberculosis patients. The separation was performed on an Agilent Technologies HPLC system with an Agilent Eclipse XDB-C18 column and a mobile phase of 65:35 v/v methanol : 0.01 M sodium phosphate buffer pH 5.2, with detection at 254 nm and a flow rate of 0.8 mL/min. The mean retention time was 3.016 minutes. The method was linear from 2-10 μg/mL 25-DR with a correlation coefficient of 0.9978. The standard deviation, relative standard deviation and coefficient of variation for 2, 6 and 10 μg/mL 25-DR were 0-0.0829, 0-3.1752 and 0-0.0317%, respectively. The recovery of 5, 7 and 9 μg/mL 25-DR was 80.8661, 91.3480 and 111.1457%, respectively. The limits of detection (LoD) and quantification (LoQ) were 0.51 and 1.7 μg/mL, respectively. The method fulfilled the International Conference on Harmonization (ICH) bioanalytical method validation guidelines, including the parameters of specificity, linearity, precision, accuracy, LoD and LoQ. The developed method is suitable for pharmacokinetic analysis of various concentrations of 25-DR in human urine.
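The linearity, LoD and LoQ figures reported above are typically derived from a calibration line; a common ICH-style calculation (LoD = 3.3σ/S, LoQ = 10σ/S, with σ the residual standard deviation and S the slope) is sketched below on invented data, not the study's measurements.

```python
# Sketch of a typical ICH-style calibration workup: fit the calibration line,
# then estimate LoD = 3.3*sigma/S and LoQ = 10*sigma/S from the residual standard
# deviation (sigma) and the slope (S). Data below are made up for illustration.
import numpy as np

conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])            # ug/mL
area = np.array([105.0, 212.0, 318.0, 421.0, 530.0])   # detector response (a.u.)

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, area)[0, 1]

lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"r^2 = {r**2:.4f}, LoD = {lod:.2f} ug/mL, LoQ = {loq:.2f} ug/mL")
```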
Neuromodulation research and application in the U.S. Department of Defense.
Nelson, Jeremy T; Tepe, Victoria
2015-01-01
Modern neuromodulatory techniques for military applications have been explored for the past decade, with an intent to optimize operator performance and, ultimately, to improve overall military effectiveness. In light of potential military applications, some researchers have voiced concern about national security agency involvement in this area of research, and possible exploitation of research findings to support military objectives. The aim of this article is to examine the U.S. Department of Defense's interest in and application of neuromodulation. We explored articles, cases, and historical context to identify critical considerations of debate concerning dual use (i.e., national security and civilian) technologies, specifically focusing on non-invasive brain stimulation (NIBS). We review the background and recent examples of DoD-sponsored neuromodulation research, framed in the more general context of research that aims to optimize and/or rehabilitate human performance. We propose that concerns about military exploitation of neuromodulatory science and technology are not unique, but rather are part of a larger philosophic debate pertaining to military application of human performance science and technology. We consider unique aspects of the Department of Defense research enterprise--which includes programs crucial to the advancement of military medicine--and why it is well-situated to fund and perform such research. We conclude that debate concerning DoD investment in human performance research must recognize the significant potential for dual use (civilian, medical) benefit as well as the need for civilian scientific insight and influence. Military interests in the health and performance of service members provide research funding and impetus to dual use applications that will benefit the civilian community. Copyright © 2015 Elsevier Inc. All rights reserved.
Visual-search model observer for assessing mass detection in CT
NASA Astrophysics Data System (ADS)
Karbaschi, Zohreh; Gifford, Howard C.
2017-03-01
Our aim is to devise model observers (MOs) to evaluate acquisition protocols in medical imaging. To optimize protocols for human observers, an MO must reliably interpret images containing quantum and anatomical noise under aliasing conditions. In this study of sampling parameters for simulated lung CT, the lesion-detection performance of human observers was compared with that of visual-search (VS) observers, a channelized nonprewhitening (CNPW) observer, and a channelized Hotelling (CH) observer. Scans of a mathematical torso phantom modeled single-slice parallel-hole CT with varying numbers of detector pixels and angular projections. Circular lung lesions had a fixed radius. Two-dimensional FBP reconstructions were performed. A localization ROC study was conducted with the VS, CNPW and human observers, while the CH observer was applied in a location-known ROC study. Changing the sampling parameters had a negligible effect on the CNPW and CH observers, whereas several VS observers demonstrated a sensitivity to sampling artifacts that was in agreement with how the humans performed.
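For context, a location-known channelized Hotelling observer of the kind mentioned above forms a template from channelized image statistics. The sketch below is a generic CHO on synthetic white-noise images with simple Gaussian channels; the CT simulation, lesion model, and channel set used in the study are not reproduced.

```python
# Minimal sketch of a location-known channelized Hotelling observer (CHO).
# Channels are simple radially symmetric Gaussian bands; images are white noise.
import numpy as np

rng = np.random.default_rng(2)
n, n_imgs = 32, 200
yy, xx = np.mgrid[:n, :n]
r = np.hypot(xx - n / 2, yy - n / 2)

# Channel matrix U: each column is one flattened channel.
widths = [2.0, 4.0, 8.0, 16.0]
U = np.stack([np.exp(-r**2 / (2 * w**2)).ravel() for w in widths], axis=1)

signal = 0.8 * np.exp(-r**2 / (2 * 3.0**2))           # known lesion profile
absent = rng.normal(0, 1, size=(n_imgs, n * n))       # noise-only images
present = absent + signal.ravel()                     # signal-known-exactly images

v0, v1 = absent @ U, present @ U                      # channel outputs
S = 0.5 * (np.cov(v0, rowvar=False) + np.cov(v1, rowvar=False))
w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))       # Hotelling template

t0, t1 = v0 @ w, v1 @ w                               # decision variables
snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
print(f"detectability SNR = {snr:.2f}")
```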
Abouelatta, Samar M; Aboelwafa, Ahmed A; Khalil, Rawia M; ElGazayerly, Omaima N
2015-01-01
The challenge in developing oral drug delivery systems of poorly soluble basic drugs is primarily due to their pH-dependent solubility. Cinnarizine (CNZ), a model for a poorly soluble basic drug, has pH-dependent solubility; it dissolves readily at low pH in the stomach and exhibits a very low solubility at pH values greater than 4. It is also characterized by a short half-life of 3-6 h, which requires frequent daily administration, resulting in poor patient compliance. In an attempt to solve these problems, extended-release floating lipid beads were formulated. A 2^4 full factorial design was utilized for optimization of the effects of various independent variables (lipid:drug ratio, % Pluronic F-127, % Sterotex, and Gelucire 43/01:Gelucire 50/13 ratio) on the loading efficiency and release of CNZ from the lipid beads. An in-vivo pharmacokinetic study of the optimized CNZ-lipid beads compared to Stugeron® (reference standard) was performed in healthy human volunteers. A promising approach for enhancing the bioavailability of the poorly soluble basic drug CNZ, utilizing novel and simple floating lipid beads, was successfully developed. A zero-order release profile of CNZ was achieved for 12 h. The mean AUC0-24 and AUC0-∞ of the optimized CNZ-loaded lipid beads were 4.23 and 6.04 times those of Stugeron® tablets, respectively. Copyright © 2014 Elsevier B.V. All rights reserved.
Fuster-Parra, P; García-Mas, A; Ponseti, F J; Leo, F M
2015-04-01
The purpose of this paper was to discover the relationships among 22 relevant psychological features in semi-professional football players in order to study team performance and collective efficacy via a Bayesian network (BN). The paper includes optimization of team performance and collective efficacy using an intercausal reasoning pattern, which is a very common pattern in human reasoning. The BN is used to make inferences regarding our problem, from which several conclusions are drawn; among them: maximizing team performance causes a decrease in collective efficacy, and when team performance achieves its minimum value it causes an increase in moderate/high values of collective efficacy. Similarly, we may reason by optimizing team collective efficacy instead. The BN also allows us to determine which features have the strongest influence on performance and which on collective efficacy. From the BN, two different coaching styles were differentiated taking into account the local Markov property: training leadership and autocratic leadership. Copyright © 2014 Elsevier B.V. All rights reserved.
Herbranson, Walter T.; Schroeder, Julia
2011-01-01
The “Monty Hall Dilemma” (MHD) is a well known probability puzzle in which a player tries to guess which of three doors conceals a desirable prize. After an initial choice is made, one of the remaining doors is opened, revealing no prize. The player is then given the option of staying with their initial guess or switching to the other unopened door. Most people opt to stay with their initial guess, despite the fact that switching doubles the probability of winning. A series of experiments investigated whether pigeons (Columba livia), like most humans, would fail to maximize their expected winnings in a version of the MHD. Birds completed multiple trials of a standard MHD, with the three response keys in an operant chamber serving as the three doors and access to mixed grain as the prize. Across experiments, the probability of gaining reinforcement for switching and staying was manipulated, and birds adjusted their probability of switching and staying to approximate the optimal strategy. Replication of the procedure with human participants showed that humans failed to adopt optimal strategies, even with extensive training. PMID:20175592
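The "optimal strategy" referred to above is easy to verify by simulation; the sketch below plays the standard three-door game many times and confirms that switching wins about two thirds of the time while staying wins about one third.

```python
# Quick simulation of the standard Monty Hall game: switching wins ~2/3 of the
# time, staying wins ~1/3, which is the optimal strategy referred to above.
import random

def play(switch, n_trials=100_000):
    wins = 0
    for _ in range(n_trials):
        prize = random.randrange(3)
        choice = random.randrange(3)
        # Host opens a door that is neither the choice nor the prize.
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / n_trials

print("P(win | stay)   =", round(play(switch=False), 3))
print("P(win | switch) =", round(play(switch=True), 3))
```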
Joshi, Varun; Srinivasan, Manoj
2015-02-08
Understanding how humans walk on a surface that can move might provide insights into, for instance, whether walking humans prioritize energy use or stability. Here, motivated by the famous human-driven oscillations observed in the London Millennium Bridge, we introduce a minimal mathematical model of a biped, walking on a platform (bridge or treadmill) capable of lateral movement. This biped model consists of a point-mass upper body with legs that can exert force and perform mechanical work on the upper body. Using numerical optimization, we obtain energy-optimal walking motions for this biped, deriving the periodic body and platform motions that minimize a simple metabolic energy cost. When the platform has an externally imposed sinusoidal displacement of appropriate frequency and amplitude, we predict that body motion entrained to platform motion consumes less energy than walking on a fixed surface. When the platform has finite inertia, a mass-spring-damper with similar parameters to the Millennium Bridge, we show that the optimal biped walking motion sustains a large lateral platform oscillation when sufficiently many people walk on the bridge. Here, the biped model reduces walking metabolic cost by storing and recovering energy from the platform, demonstrating energy benefits for two features observed for walking on the Millennium Bridge: crowd synchrony and large lateral oscillations.
Strahl, Stefan; Mertins, Alfred
2008-07-18
Evidence that neurosensory systems use sparse signal representations, as well as the improved performance of signal processing algorithms using sparse signal models, has raised interest in sparse signal coding in recent years. For natural audio signals like speech and environmental sounds, gammatone atoms have been derived as expansion functions that generate a nearly optimal sparse signal model (Smith, E., Lewicki, M., 2006. Efficient auditory coding. Nature 439, 978-982). Furthermore, gammatone functions are established models for the human auditory filters. Thus far, a practical application of a sparse gammatone signal model has been prevented by the fact that deriving the sparsest representation is, in general, computationally intractable. In this paper, we applied an accelerated version of the matching pursuit algorithm for gammatone dictionaries, allowing real-time and large data set applications. We show that a sparse signal model in general has advantages in audio coding and that a sparse gammatone signal model encodes speech more efficiently, in terms of sparseness, than a sparse modified discrete cosine transform (MDCT) signal model. We also show that the optimal gammatone parameters derived for English speech do not match the human auditory filters, suggesting that signal processing applications should derive the parameters individually for each applied signal class instead of using psychometrically derived parameters. For brain research, it means that care should be taken when directly transferring findings of optimality from technical to biological systems.
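The decomposition idea can be sketched with a generic (unaccelerated) matching pursuit over an arbitrary unit-norm dictionary; the gammatone dictionary and the accelerated algorithm of the paper are not reproduced here, and the dictionary and signal below are random stand-ins.

```python
# Generic matching pursuit over a unit-norm dictionary, as a sketch of the
# sparse decomposition idea.
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=20):
    """dictionary: (n_samples, n_atoms_total) with unit-norm columns."""
    residual = signal.copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = np.argmax(np.abs(correlations))
        coeffs[k] += correlations[k]
        residual -= correlations[k] * dictionary[:, k]
    return coeffs, residual

rng = np.random.default_rng(3)
D = rng.normal(size=(256, 512))
D /= np.linalg.norm(D, axis=0)                          # unit-norm atoms
x = D[:, [10, 42, 300]] @ np.array([1.5, -0.8, 0.6])    # sparse synthetic signal

c, res = matching_pursuit(x, D, n_atoms=10)
print("selected atoms:", np.nonzero(c)[0])
print("residual energy fraction:", np.linalg.norm(res)**2 / np.linalg.norm(x)**2)
```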
The Monty Hall dilemma in pigeons: effect of investment in initial choice.
Stagner, Jessica P; Rayburn-Reeves, Rebecca; Zentall, Thomas R
2013-10-01
In the Monty Hall dilemma, humans are initially given a choice among three alternatives, one of which has a hidden prize. After choosing, but before it is revealed whether they have won the prize, they are shown that one of the remaining alternatives does not have the prize. They are then asked whether they want to stay with their original choice or switch to the remaining alternative. Although switching results in obtaining the prize two thirds of the time, humans consistently fail to adopt the optimal strategy of switching even after considerable training. Interestingly, there is evidence that pigeons show more optimal switching performance with this task than humans. Because humans often view even random choices already made as being more valuable than choices not made, we reasoned that if pigeons made a greater investment, it might produce an endowment or ownership effect resulting in more human-like suboptimal performance. On the other hand, the greater investment in the initial choice by the pigeons might facilitate switching behavior by helping them to better discriminate their staying versus switching behavior. In Experiment 1, we examined the effect of requiring pigeons to make a greater investment in their initial choice (20 pecks rather than the usual 1 peck). We found that the increased response requirement facilitated acquisition of the switching response. In Experiment 2, we showed that facilitation of switching due to the increased response requirement did not result from extinction of responding to the initially chosen location.
Spiral: Automated Computing for Linear Transforms
NASA Astrophysics Data System (ADS)
Püschel, Markus
2010-09-01
Writing fast software has become extraordinarily difficult. For optimal performance, programs and their underlying algorithms have to be adapted to take full advantage of the platform's parallelism, memory hierarchy, and available instruction set. To make things worse, the best implementations are often platform-dependent and platforms are constantly evolving, which quickly renders libraries obsolete. We present Spiral, a domain-specific program generation system for important functionality used in signal processing and communication including linear transforms, filters, and other functions. Spiral completely replaces the human programmer. For a desired function, Spiral generates alternative algorithms, optimizes them, compiles them into programs, and intelligently searches for the best match to the computing platform. The main idea behind Spiral is a mathematical, declarative, domain-specific framework to represent algorithms and the use of rewriting systems to generate and optimize algorithms at a high level of abstraction. Experimental results show that the code generated by Spiral competes with, and sometimes outperforms, the best available human-written code.
Zeric Stosic, Marina Z; Jaksic, Sandra M; Stojanov, Igor M; Apic, Jelena B; Ratajac, Radomir D
2016-11-01
A high-performance liquid chromatography (HPLC) method with diode array detection (DAD) was optimized and validated for the separation and determination of tetramethrin in an antiparasitic human shampoo. In order to optimize the separation conditions, two different columns, different column oven temperatures, and different mobile phase compositions and ratios were tested. The best separation was achieved on the Supelcosil LC-18-DB column (4.6 x 250 mm, particle size 5 μm) with a mobile phase of methanol : water (78 : 22, v/v) at a flow rate of 0.8 mL/min and a temperature of 30 °C. The detection wavelength was set at 220 nm. Under the optimum chromatographic conditions, the standard calibration curve showed good linearity (r² = 0.9997). The accuracy of the method, defined as the mean recovery of tetramethrin from the shampoo matrix, was 100.09%. The advantage of this method is that it can easily be used for the routine analysis of tetramethrin in pharmaceutical formulations and in all pharmaceutical research involving tetramethrin.
van der Zwaard, Stephan; van der Laarse, Willem J; Weide, Guido; Bloemers, Frank W; Hofmijster, Mathijs J; Levels, Koen; Noordhof, Dionne A; de Koning, Jos J; de Ruiter, Cornelis J; Jaspers, Richard T
2018-04-01
Optimizing physical performance is a major goal in current physiology. However, basic understanding of combining high sprint and endurance performance is currently lacking. This study identifies critical determinants of combined sprint and endurance performance using multiple regression analyses of physiologic determinants at different biologic levels. Cyclists, including 6 international sprint, 8 team pursuit, and 14 road cyclists, completed a Wingate test and 15-km time trial to obtain sprint and endurance performance results, respectively. Performance was normalized to lean body mass^(2/3) to eliminate the influence of body size. Performance determinants were obtained from whole-body oxygen consumption, blood sampling, knee-extensor maximal force, muscle oxygenation, whole-muscle morphology, and muscle fiber histochemistry of musculus vastus lateralis. Normalized sprint performance was explained by percentage of fast-type fibers and muscle volume (R² = 0.65; P < 0.001) and normalized endurance performance by performance oxygen consumption (V̇O2), mean corpuscular hemoglobin concentration, and muscle oxygenation (R² = 0.92; P < 0.001). Combined sprint and endurance performance was explained by gross efficiency, performance V̇O2, and likely by muscle volume and fascicle length (P = 0.056; P = 0.059). High performance V̇O2 related to a high oxidative capacity, high capillarization × myoglobin, and small physiologic cross-sectional area (R² = 0.67; P < 0.001). Results suggest that fascicle length and capillarization are important targets for training to optimize sprint and endurance performance simultaneously. Van der Zwaard, S., van der Laarse, W. J., Weide, G., Bloemers, F. W., Hofmijster, M. J., Levels, K., Noordhof, D. A., de Koning, J. J., de Ruiter, C. J., Jaspers, R. T. Critical determinants of combined sprint and endurance performance: an integrative analysis from muscle fiber to the human body.
Biorecognition Element Design and Characterization for Human Performance Biomarkers Sensing
2015-07-16
immobilize aptamers and peptides on the AuNP surface. The parameters optimized in this work included reaction times, ligand ratio (PEG-OH vs PEG-COOH... instructions for performing peptide and aptamer surface immobilization were provided to collaborators in order to create nanoprobes that were integrated... with sequences made of less than 20 amino acids) and DNA aptamers (via on-off structural switching properties) are appealing BREs for new sensors
Parker, Maximilian G; Tyson, Sarah F; Weightman, Andrew P; Abbott, Bruce; Emsley, Richard; Mansell, Warren
2017-11-01
Computational models that simulate individuals' movements in pursuit-tracking tasks have been used to elucidate mechanisms of human motor control. Whilst there is evidence that individuals demonstrate idiosyncratic control-tracking strategies, it remains unclear whether models can be sensitive to these idiosyncrasies. Perceptual control theory (PCT) provides a unique model architecture with an internally set reference value parameter, and can be optimized to fit an individual's tracking behavior. The current study investigated whether PCT models could show temporal stability and individual specificity over time. Twenty adults completed three blocks of fifteen 1-min pursuit-tracking trials. Two blocks (training and post-training) were completed in one session and the third was completed after 1 week (follow-up). The target moved in a one-dimensional, pseudorandom pattern. PCT models were optimized to the training data using a least-mean-squares algorithm, and validated with data from post-training and follow-up. We found significant inter-individual variability (partial η²: .464-.697) and intra-individual consistency (Cronbach's α: .880-.976) in parameter estimates. Polynomial regression revealed that all model parameters, including the reference value parameter, contribute to simulation accuracy. Participants' tracking performances were significantly more accurately simulated by models developed from their own tracking data than by models developed from other participants' data. We conclude that PCT models can be optimized to simulate the performance of an individual and that the test-retest reliability of individual models is a necessary criterion for evaluating computational models of human performance.
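A minimal sketch of the model-fitting idea is shown below: a one-level PCT-style controller with gain, damping, and an internally set reference offset (all hypothetical stand-ins for the paper's parameters) is simulated and fit to a synthetic cursor trace by least squares, rather than to real participant data.

```python
# Minimal sketch of fitting a one-level PCT-style tracking controller to a
# (synthetic) participant cursor trace by least squares.
import numpy as np
from scipy.optimize import least_squares

dt, T = 0.01, 60.0
t = np.arange(0, T, dt)
target = np.cumsum(np.random.default_rng(4).normal(0, 0.02, t.size))  # pseudorandom target

def simulate(params, target):
    gain, damping, ref_offset = params
    cursor = np.zeros_like(target)
    for i in range(1, target.size):
        error = (target[i - 1] + ref_offset) - cursor[i - 1]   # reference vs. perception
        cursor[i] = cursor[i - 1] + dt * (gain * error - damping * cursor[i - 1])
    return cursor

# Synthetic "participant" data generated from known parameters plus motor noise.
true_params = (8.0, 0.5, 0.1)
data = simulate(true_params, target) + np.random.default_rng(5).normal(0, 0.01, t.size)

fit = least_squares(lambda p: simulate(p, target) - data, x0=[1.0, 0.1, 0.0])
print("recovered parameters:", np.round(fit.x, 3))
```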
A research model--forecasting incident rates from optimized safety program intervention strategies.
Iyer, P S; Haight, J M; Del Castillo, E; Tink, B W; Hawkins, P W
2005-01-01
Property damage incidents, workplace injuries, and the safety programs designed to prevent them are expensive aspects of doing business in contemporary industry. The National Safety Council (2002) estimated that workplace injuries cost $146.6 billion per year. Because companies are resource limited, optimizing intervention strategies to decrease incidents with less costly programs can contribute to improved productivity. Systematic data collection methods were employed and the forecasting ability of a time-lag relationship between interventions and incident rates was studied using various statistical methods (an intervention is expected to have neither an immediate nor an infinitely lasting effect on the incident rate). As a follow-up to the initial work, researchers developed two models designed to forecast incident rates: one based on past incident rate performance and the other on the configuration and level of effort applied to the safety and health program. Researchers compared actual incident performance to the prediction capability of each model over 18 months in the forestry operations of an electricity distribution company and found the models to allow accurate prediction of incident rates. These models potentially have powerful implications as a business-planning tool for human resource allocation and for designing an optimized safety and health intervention program to minimize incidents. Depending on the mathematical relationship, one can determine which interventions to apply, where and how much to apply them, and when to increase or reduce human resource input as determined by the forecasted performance.
Optimality of the basic colour categories for classification
Griffin, Lewis D
2005-01-01
Categorization of colour has been widely studied as a window into human language and cognition, and quite separately has been used pragmatically in image-database retrieval systems. This suggests the hypothesis that the best category system for pragmatic purposes coincides with human categories (i.e. the basic colours). We have tested this hypothesis by assessing the performance of different category systems in a machine-vision task. The task was the identification of the odd-one-out from triples of images obtained using a web-based image-search service. In each triple, two of the images had been retrieved using the same search term, the other a different term. The terms were simple concrete nouns. The results were as follows: (i) the odd-one-out task can be performed better than chance using colour alone; (ii) basic colour categorization performs better than random systems of categories; (iii) a category system that performs better than the basic colours could not be found; and (iv) it is not just the general layout of the basic colours that is important, but also the detail. We conclude that (i) the results support the plausibility of an explanation for the basic colours as a result of a pressure-to-optimality and (ii) the basic colours are good categories for machine vision image-retrieval systems. PMID:16849219
Space Life-Support Engineering Program
NASA Technical Reports Server (NTRS)
Seagrave, Richard C. (Principal Investigator)
1995-01-01
This report covers seventeen months of work performed under an extended one-year NASA University Grant awarded to Iowa State University to perform research on topics relating to the development of closed-loop, long-term life support systems, with the initial principal focus on space water management. In the first phase of the program, investigators from chemistry and chemical engineering with demonstrated expertise in systems analysis, thermodynamics, analytical chemistry and instrumentation performed research and development in two major related areas: the development of low-cost, accurate, and durable sensors for trace chemical and biological species, and the development of unsteady-state simulation packages for use in the development and optimization of control systems for life support systems. In the second year of the program, emphasis was redirected towards the development of dynamic simulation techniques and software and towards a thermodynamic systems analysis, centered on availability or energy analysis, in an effort to begin optimizing the systems needed for water purification. The third year of the program, the subject of this report, was devoted to the analysis of the water balance for the interaction between humans and the life support system during space flight and exercise, to analysis of the cardiopulmonary systems of humans during space flight, and to analysis of entropy production during operation of the air recovery system during space flight.
Asghari, Alireza; Fazl-Karimi, Hamidreza; Barfi, Behruz; Rajabi, Maryam; Daneshfar, Ali
2014-08-01
Aminophenol isomers (2-, 3-, and 4-aminophenol) are typically classified as industrial pollutants with genotoxic and mutagenic effects due to their easy penetration through the skin and membranes of humans, animals, and plants. In the present study, a simple and efficient ultrasound-assisted emulsification microextraction procedure coupled with high-performance liquid chromatography with ultraviolet detection was developed for the preconcentration and determination of these compounds in human fluid and environmental water samples. Effective parameters (such as the type and volume of extraction solvent, pH and ionic strength of the sample, and ultrasonication and centrifuging times) were investigated and optimized. Under optimum conditions (sample volume: 5 mL; extraction solvent: chloroform, 80 µL; pH: 6.5; no salt addition; ultrasonication: 3.5 min; centrifuging: 3 min at 5000 rpm), the enrichment factors and limits of detection ranged from 42 to 51 and from 0.028 to 0.112 µg/mL, respectively. Once optimized, the analytical performance of the method was studied in terms of linearity (0.085-157 µg/mL, r² > 0.998), accuracy (recovery = 88.6-101.7%), and precision (repeatability: intraday precision < 3.98%, interday precision < 5.12%). Finally, the applicability of the method was evaluated by the extraction and determination of these compounds in human urine, hair dye, and real water samples. © The Author(s) 2014.
Limits in decision making arise from limits in memory retrieval.
Giguère, Gyslain; Love, Bradley C
2013-05-07
Some decisions, such as predicting the winner of a baseball game, are challenging in part because outcomes are probabilistic. When making such decisions, one view is that humans stochastically and selectively retrieve a small set of relevant memories that provides evidence for competing options. We show that optimal performance at test is impossible when retrieving information in this fashion, no matter how extensive training is, because limited retrieval introduces noise into the decision process that cannot be overcome. One implication is that people should be more accurate in predicting future events when trained on idealized rather than on the actual distributions of items. In other words, we predict the best way to convey information to people is to present it in a distorted, idealized form. Idealization of training distributions is predicted to reduce the harmful noise induced by immutable bottlenecks in people's memory retrieval processes. In contrast, machine learning systems that selectively weight (i.e., retrieve) all training examples at test should not benefit from idealization. These conjectures are strongly supported by several studies and supporting analyses. Unlike machine systems, people's test performance on a target distribution is higher when they are trained on an idealized version of the distribution rather than on the actual target distribution. Optimal machine classifiers modified to selectively and stochastically sample from memory match the pattern of human performance. These results suggest firm limits on human rationality and have broad implications for how to train humans tasked with important classification decisions, such as radiologists, baggage screeners, intelligence analysts, and gamblers.
Singh, Bhupinder; Garg, Babita; Chaturvedi, Subhash Chand; Arora, Sharry; Mandsaurwale, Rachana; Kapil, Rishi; Singh, Baljinder
2012-05-01
The current studies entail successful formulation of optimized gastroretentive tablets of lamivudine using the floating-bioadhesive potential of carbomers and cellulosic polymers, and their subsequent in-vitro and in-vivo evaluation in animals and humans. Effervescent floating-bioadhesive hydrophilic matrices were prepared and evaluated for in-vitro drug release, floatation and ex-vivo bioadhesive strength. The optimal composition of polymer blends was systematically chosen using central composite design and overlay plots. Pharmacokinetic studies were carried out in rabbits, and various levels of in-vitro/in-vivo correlation (IVIVC) were established. In-vivo gamma scintigraphic studies were performed in human volunteers using 99mTc to evaluate formulation retention in the gastric milieu. The optimized formulation exhibited excellent bioadhesive and floatational characteristics besides possessing adequate drug-release control and pharmacokinetic extension of plasma levels. The successful establishment of various levels of IVIVC substantiated the judicious choice of in-vitro dissolution media for simulating the in-vivo conditions. In-vivo gamma scintigraphic studies ratified the gastroretentive characteristics of the optimized formulation with a retention time of 5 h or more. Besides unravelling the polymer synergism, the study helped in developing an optimal once-a-day gastroretentive drug delivery system with improved bioavailability potential exhibiting excellent swelling, floating and bioadhesive characteristics. © 2012 The Authors. JPP © 2012 Royal Pharmaceutical Society.
Intraoperative Functional Ultrasound Imaging of Human Brain Activity.
Imbault, Marion; Chauvet, Dorian; Gennisson, Jean-Luc; Capelle, Laurent; Tanter, Mickael
2017-08-04
The functional mapping of brain activity is essential to perform optimal glioma surgery and to minimize the risk of postoperative deficits. We introduce a new, portable neuroimaging modality of the human brain based on functional ultrasound (fUS) for deep functional cortical mapping. Using plane-wave transmissions at an ultrafast frame rate (1 kHz), fUS is performed during surgery to measure transient changes in cerebral blood volume with a high spatiotemporal resolution (250 µm, 1 ms). fUS identifies, maps and differentiates regions of brain activation during task-evoked cortical responses within the depth of a sulcus in both awake and anaesthetized patients.
Research on the position estimation of human movement based on camera projection
NASA Astrophysics Data System (ADS)
Yi, Zhang; Yuan, Luo; Hu, Huosheng
2005-06-01
During the rehabilitation of post-stroke patients, their movements need to be localized and learned so that incorrect movements can be instantly modified or tuned. Tracking these movements is therefore vital and necessary for the rehabilitative course, and within human movement tracking the position estimation of human movement is very important. In this paper, the character of the human movement system is first analyzed. Next, a camera and an inertial sensor are used to measure the position of human movement, and a Kalman filter algorithm is proposed to fuse the two measurements to obtain an optimized estimate of the position. Finally, the performance of the method is analyzed.
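A minimal one-dimensional sketch of the proposed fusion is given below, assuming the inertial (acceleration) signal drives the Kalman prediction step and the camera position drives the update step; the motion model and noise levels are illustrative only, not those of the paper.

```python
# Minimal 1-D camera/inertial fusion with a Kalman filter: acceleration feeds the
# prediction step, the camera position fixes the update step.
import numpy as np

rng = np.random.default_rng(6)
dt, n = 0.02, 500
true_acc = 0.5 * np.sin(0.5 * np.arange(n) * dt)

# Simulate true motion plus the two sensors.
true_pos, true_vel = np.zeros(n), np.zeros(n)
for k in range(1, n):
    true_vel[k] = true_vel[k - 1] + true_acc[k - 1] * dt
    true_pos[k] = true_pos[k - 1] + true_vel[k - 1] * dt
acc_meas = true_acc + rng.normal(0, 0.05, n)      # inertial sensor
cam_meas = true_pos + rng.normal(0, 0.03, n)      # camera position estimate

F = np.array([[1, dt], [0, 1]])                   # state transition (pos, vel)
B = np.array([0.5 * dt**2, dt])                   # control (acceleration) input
H = np.array([[1.0, 0.0]])                        # camera observes position
Q = 1e-4 * np.eye(2)
R = np.array([[0.03**2]])

x, P = np.zeros(2), np.eye(2)
est = np.zeros(n)
for k in range(n):
    x = F @ x + B * acc_meas[k]                   # predict with inertial data
    P = F @ P @ F.T + Q
    y = cam_meas[k] - H @ x                       # update with camera data
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y)
    P = (np.eye(2) - K @ H) @ P
    est[k] = x[0]

print("RMS error (fused):", np.sqrt(np.mean((est - true_pos) ** 2)))
print("RMS error (camera alone):", np.sqrt(np.mean((cam_meas - true_pos) ** 2)))
```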
Application of Human-Autonomy Teaming (HAT) Patterns to Reduce Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri
2011-01-01
Unmanned aerial systems, advanced cockpits, and air traffic management are all seeing dramatic increases in automation. However, while automation may take on some tasks previously performed by humans, humans will still be required to remain in the system for the foreseeable future. The collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology to understand HAT is by identifying recurring patterns of HAT that have similar characteristics and solutions. This paper applies a methodology for identifying HAT patterns to an advanced cockpit project.
Meyer, Georg F.; Wong, Li Ting; Timson, Emma; Perfect, Philip; White, Mark D.
2012-01-01
We argue that objective fidelity evaluation of virtual environments, such as flight simulation, should be human-performance-centred and task-specific rather than measure the match between simulation and physical reality. We show how principled experimental paradigms and behavioural models to quantify human performance in simulated environments that have emerged from research in multisensory perception provide a framework for the objective evaluation of the contribution of individual cues to human performance measures of fidelity. We present three examples in a flight simulation environment as a case study: Experiment 1: Detection and categorisation of auditory and kinematic motion cues; Experiment 2: Performance evaluation in a target-tracking task; Experiment 3: Transferrable learning of auditory motion cues. We show how the contribution of individual cues to human performance can be robustly evaluated for each task and that the contribution is highly task dependent. The same auditory cues that can be discriminated and are optimally integrated in experiment 1, do not contribute to target-tracking performance in an in-flight refuelling simulation without training, experiment 2. In experiment 3, however, we demonstrate that the auditory cue leads to significant, transferrable, performance improvements with training. We conclude that objective fidelity evaluation requires a task-specific analysis of the contribution of individual cues. PMID:22957068
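For reference, the "optimal integration" benchmark used in this literature (as in Experiment 1 above) is usually maximum-likelihood cue combination, in which each cue is weighted by its inverse variance. The sketch below illustrates that benchmark with made-up noise levels, not the flight-simulation data.

```python
# Standard maximum-likelihood cue-combination benchmark: weight each cue by its
# inverse variance; the combined estimate has lower variance than either cue alone.
import numpy as np

rng = np.random.default_rng(7)
true_heading = 5.0                       # degrees, say
sigma_audio, sigma_kinematic = 4.0, 2.0  # single-cue noise levels (toy values)

n = 10_000
audio = rng.normal(true_heading, sigma_audio, n)
kinematic = rng.normal(true_heading, sigma_kinematic, n)

w_a = (1 / sigma_audio**2) / (1 / sigma_audio**2 + 1 / sigma_kinematic**2)
combined = w_a * audio + (1 - w_a) * kinematic

pred_sigma = np.sqrt(1 / (1 / sigma_audio**2 + 1 / sigma_kinematic**2))
print(f"weight on audio cue: {w_a:.2f}")
print(f"empirical combined SD: {combined.std():.2f}, predicted: {pred_sigma:.2f}")
```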
Human factors in incident reporting
NASA Technical Reports Server (NTRS)
Jones, S. G.
1993-01-01
The paper proposes that a cooperative research effort be undertaken by academic institutions and industry organizations toward the compilation of a human factors data base in conjunction with technical information. Team members in any discipline can benefit and learn from observing positive examples of decision making and performance by crews under stressful or less than optimal circumstances. The opportunity to note trends in interpersonal and interactive behaviors and to categorize them in terms of more or less desirable outcomes should not be missed.
2016-10-01
(0.3-0.9). Evidence from this evaluation suggests that ARSOF Soldiers may benefit from participation in THOR3 compared to other human performance... factor that was not modifiable and one of the strongest risk factors for injury. Evidence from this evaluation suggests that Soldiers may benefit from... [Fragment of an injury-activity tally: Motorized Vehicle 7, 2; Walking or Hiking 7, 2; Rough Housing or Fighting 3, 1; Stepping/Climbing 1, <1; Repairing or maintaining equipment 1]
Solving Large Problems with a Small Working Memory
ERIC Educational Resources Information Center
Pizlo, Zygmunt; Stefanov, Emil
2013-01-01
We describe an important elaboration of our multiscale/multiresolution model for solving the Traveling Salesman Problem (TSP). Our previous model emulated the non-uniform distribution of receptors on the human retina and the shifts of visual attention. This model produced near-optimal solutions of TSP in linear time by performing hierarchical…
Corpus-Based Optimization of Language Models Derived from Unification Grammars
NASA Technical Reports Server (NTRS)
Rayner, Manny; Hockey, Beth Ann; James, Frankie; Bratt, Harry; Bratt, Elizabeth O.; Gawron, Mark; Goldwater, Sharon; Dowding, John; Bhagat, Amrita
2000-01-01
We describe a technique which makes it feasible to improve the performance of a language model derived from a manually constructed unification grammar, using low-quality untranscribed speech data and a minimum of human annotation. The method is evaluated on a medium-vocabulary spoken language command and control task.
Individual Differences in Optimization Problem Solving: Reconciling Conflicting Results
ERIC Educational Resources Information Center
Chronicle, Edward P.; MacGregor, James N.; Lee, Michael; Ormerod, Thomas C.; Hughes, Peter
2008-01-01
Results on human performance on the Traveling Salesman Problem (TSP) from different laboratories show high consistency. However, one exception is in the area of individual differences. While one research group has consistently failed to find systematic individual differences across instances of TSPs (Chronicle, MacGregor and Ormerod), another…
Adaptive Memory: Ancestral Priorities and the Mnemonic Value of Survival Processing
ERIC Educational Resources Information Center
Nairne, James S.; Pandeirada, Josefa N. S.
2010-01-01
Evolutionary psychologists often propose that humans carry around "stone-age" brains, along with a toolkit of cognitive adaptations designed originally to solve hunter-gatherer problems. This perspective predicts that optimal cognitive performance might sometimes be induced by ancestrally-based problems, those present in ancestral environments,…
Simulated parallel annealing within a neighborhood for optimization of biomechanical systems.
Higginson, J S; Neptune, R R; Anderson, F C
2005-09-01
Optimization problems for biomechanical systems have become extremely complex. Simulated annealing (SA) algorithms have performed well in a variety of test problems and biomechanical applications; however, despite advances in computer speed, convergence to optimal solutions for systems of even moderate complexity has remained prohibitive. The objective of this study was to develop a portable parallel version of a SA algorithm for solving optimization problems in biomechanics. The algorithm for simulated parallel annealing within a neighborhood (SPAN) was designed to minimize interprocessor communication time and closely retain the heuristics of the serial SA algorithm. The computational speed of the SPAN algorithm scaled linearly with the number of processors on different computer platforms for a simple quadratic test problem and for a more complex forward dynamic simulation of human pedaling.
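For readers unfamiliar with the underlying heuristic, the serial simulated annealing core that SPAN distributes across processors looks roughly like the sketch below; the objective and cooling schedule are toy choices standing in for a biomechanical cost function, and the parallel neighborhood evaluation is not reproduced.

```python
# Serial core of a simulated annealing search with a Metropolis acceptance rule.
import math
import random

def objective(x):
    # Toy multimodal function standing in for a biomechanical cost.
    return sum(xi**2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

random.seed(0)
x = [random.uniform(-5, 5) for _ in range(4)]
fx = objective(x)
T = 10.0

for step in range(20_000):
    cand = [xi + random.gauss(0, 0.5) for xi in x]
    fc = objective(cand)
    # Always accept improvements; sometimes accept worse moves, depending on T.
    if fc < fx or random.random() < math.exp(-(fc - fx) / T):
        x, fx = cand, fc
    T *= 0.9997                                     # geometric cooling

print("best cost found:", round(fx, 3))
```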
Motor planning under temporal uncertainty is suboptimal when the gain function is asymmetric
Ota, Keiji; Shinya, Masahiro; Kudo, Kazutoshi
2015-01-01
For optimal action planning, the gain/loss associated with actions and the variability in motor output should both be considered. A number of studies make conflicting claims about the optimality of human action planning but cannot be reconciled due to their use of different movements and gain/loss functions. The disagreement possibly arises from differences in experimental design and in the energetic cost of participants' motor effort. We used a coincident timing task, which requires decision making with constant energetic cost, to test the optimality of participants' timing strategies under four configurations of the gain function. We compared participants' strategies to an optimal timing strategy calculated from a Bayesian model that maximizes the expected gain. We found suboptimal timing strategies under two configurations of the gain function characterized by asymmetry, in which higher gain is associated with a higher risk of zero gain. Participants showed a risk-seeking strategy by responding closer than optimal to the time of onset/offset of zero gain. Meanwhile, there was good agreement between the model and actual performance under two configurations of the gain function characterized by symmetry. Our findings show that the human ability to make decisions that must reflect uncertainty in one's own motor output has limits that depend on the configuration of the gain function. PMID:26236227
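The expected-gain computation behind such an optimal timing strategy can be sketched as follows: convolve the gain function with the participant's Gaussian timing variability and pick the aim point that maximizes the result. The asymmetric gain function below is a made-up example, not the one used in the study.

```python
# Sketch of the expected-gain calculation for an aimed response time: expected
# gain = gain function averaged under Gaussian timing variability; the optimum
# backs away from the point where the gain drops to zero.
import numpy as np

sigma = 0.05                                    # timing SD in seconds
t = np.linspace(0.0, 1.0, 2001)

def gain(times):
    # Asymmetric example: gain ramps up toward t = 0.6 s, then drops to zero.
    return np.where((times > 0.3) & (times <= 0.6), (times - 0.3) / 0.3 * 100, 0.0)

aims = np.linspace(0.3, 0.7, 401)
expected = []
for a in aims:
    p = np.exp(-0.5 * ((t - a) / sigma) ** 2)
    p /= p.sum()                                # discretized Gaussian over times
    expected.append(np.sum(p * gain(t)))

best = aims[int(np.argmax(expected))]
print(f"optimal aim point: {best:.3f} s (backs off from the 0.6 s cliff)")
```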
Group interaction and flight crew performance
NASA Technical Reports Server (NTRS)
Foushee, H. Clayton; Helmreich, Robert L.
1988-01-01
The application of human-factors analysis to the performance of aircraft-operation tasks by the crew as a group is discussed in an introductory review and illustrated with anecdotal material. Topics addressed include the function of a group in the operational environment, the classification of group performance factors (input, process, and output parameters), input variables and the flight crew process, and the effect of process variables on performance. Consideration is given to aviation safety issues, techniques for altering group norms, ways of increasing crew effort and coordination, and the optimization of group composition.
Optimization of a Lattice Boltzmann Computation on State-of-the-Art Multicore Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Samuel; Carter, Jonathan; Oliker, Leonid
2009-04-10
We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Xeon E5345 (Clovertown), AMD Opteron 2214 (Santa Rosa), AMD Opteron 2356 (Barcelona), Sun T5140 T2+ (Victoria Falls), as well as a QS20 IBM Cell Blade. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 15x improvement compared with the original code at a given concurrency. Additionally, we present detailed analysis of each optimization, which reveals surprising hardware bottlenecks and software challenges for future multicore systems and applications.
NASA Astrophysics Data System (ADS)
Zhang, Jingjing; Guo, Weihong; Xie, Bin; Yu, Xingjian; Luo, Xiaobing; Zhang, Tao; Yu, Zhihua; Wang, Hong; Jin, Xing
2017-09-01
The blue-light hazard of white light-emitting diodes (LEDs) is a hidden risk for human photobiological safety. Recent spectral optimization methods focus on maximizing the luminous efficacy and improving the color performance of LEDs, but few of them take the blue-light hazard into account. Therefore, for healthy lighting, it is urgent to propose a spectral optimization method for white LED sources that exhibit low blue-light hazard, high luminous efficacy of radiation (LER) and high color performance. In this study, a genetic algorithm with penalty functions was proposed for realizing white spectra with low blue-light hazard, maximal LER and high color rendering index (CRI) values. In simulations, white spectra from LEDs with low blue-light hazard, high LER (≥297 lm/W) and high CRI (≥90) were achieved at correlated color temperatures (CCTs) from 2013 K to 7845 K. Thus, the spectral optimization method can be used to guide the fabrication of LED sources in line with photobiological safety. It is also found that the maximum permissible exposure duration of the optimized spectra increases by 14.9% compared with that of bichromatic phosphor-converted LEDs with equal CCT.
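A bare-bones sketch of a genetic algorithm with penalty functions is shown below on a toy surrogate objective; computing the real LER, CRI and blue-light-hazard terms requires full spectral and colorimetric models that are not reproduced here, so the fitness, constraints and channel weights are placeholders.

```python
# Skeleton of a GA with penalty terms: maximize a toy objective subject to two
# constraints whose violations are penalized (stand-ins for LER, CRI and hazard).
import numpy as np

rng = np.random.default_rng(9)
n_pop, n_genes, n_gen = 40, 4, 200

def fitness(w):
    f = -np.sum((w - np.array([0.1, 0.3, 0.4, 0.2])) ** 2)   # toy "LER" surrogate
    g1 = max(0.0, 0.95 - w.sum())                            # constraint: sum(w) >= 0.95
    g2 = max(0.0, w[0] - 0.15)                               # constraint: "blue" weight <= 0.15
    return f - 50.0 * (g1 + g2)                              # penalty terms

pop = rng.uniform(0, 1, size=(n_pop, n_genes))
for gen in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][: n_pop // 2]]    # truncation selection
    # Crossover: blend random parent pairs; mutation: small Gaussian perturbation.
    idx = rng.integers(0, len(parents), size=(n_pop, 2))
    alpha = rng.random((n_pop, 1))
    pop = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]
    pop = np.clip(pop + rng.normal(0, 0.02, pop.shape), 0, 1)
    pop[0] = parents[0]                                      # elitism

best = max(pop, key=fitness)
print("best channel weights:", np.round(best, 3), "fitness:", round(fitness(best), 4))
```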
Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka
2018-01-01
Background: Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases. These markers play important roles in redox regulation throughout the body. Despite the recognition of these markers in various pathophysiologic conditions, the measurement of human mercaptalbumin and non-mercaptalbumin has not become widespread because of the technical complexity and long measurement time of conventional methods. Methods: Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated using performance tests as well as measurements of various patients' serum samples. Results: We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions: A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool enabling the expansion of clinical usefulness and ensuring the elucidation of the roles of albumin in redox reactions throughout the human body.
Real-time PCR probe optimization using design of experiments approach.
Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F
2016-03-01
Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.
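A hedged sketch of the design-of-experiments idea follows: a two-level full-factorial design over the three input factors named in the abstract, with main effects estimated from the coded design matrix. The coded levels and the simulated "efficiency" response are placeholders, not the study's measured assay data.

```python
# Sketch of a 2-level full-factorial DOE analysis for three assay input
# factors. Factor names mirror the abstract; the coded levels and the
# simulated response are illustrative placeholders.
import itertools
import numpy as np

factors = ["primer_probe_distance", "MP_target_dimer_stability", "mediator_UR_dimer_stability"]

# Coded design matrix: all 2^3 combinations of low (-1) and high (+1) levels.
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

def run_assay(row):
    """Placeholder response; in practice this would be a measured PCR efficiency."""
    rng = np.random.default_rng(int(sum((row + 1) * [1, 2, 4])))
    return 80 + 1.5 * row[0] + 0.5 * row[1] + 5.0 * row[2] + rng.normal(0, 0.5)

response = np.array([run_assay(row) for row in design])

# Main effect of each factor: mean response at +1 minus mean response at -1.
for j, name in enumerate(factors):
    effect = response[design[:, j] > 0].mean() - response[design[:, j] < 0].mean()
    print(f"{name:30s} main effect = {effect:+.2f} % efficiency")
```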
Active learning: learning a motor skill without a coach.
Huang, Vincent S; Shadmehr, Reza; Diedrichsen, Jörn
2008-08-01
When we learn a new skill (e.g., golf) without a coach, we are "active learners": we have to choose the specific components of the task on which to train (e.g., iron, driver, putter, etc.). What guides our selection of the training sequence? How do choices that people make compare with choices made by machine learning algorithms that attempt to optimize performance? We asked subjects to learn the novel dynamics of a robotic tool while moving it in four directions. They were instructed to choose their practice directions to maximize their performance in subsequent tests. We found that their choices were strongly influenced by motor errors: subjects tended to immediately repeat an action if that action had produced a large error. This strategy was correlated with better performance on test trials. However, even when participants performed perfectly on a movement, they did not avoid repeating that movement. The probability of repeating an action did not drop below chance even when no errors were observed. This behavior led to suboptimal performance. It also violated a strong prediction of current machine learning algorithms, which solve the active learning problem by choosing a training sequence that will maximally reduce the learner's uncertainty about the task. While we show that these algorithms do not provide an adequate description of human behavior, our results suggest ways to improve human motor learning by helping people choose an optimal training sequence.
Blessy, S A Praylin Selva; Sulochana, C Helen
2015-01-01
Segmentation of brain tumors from Magnetic Resonance Imaging (MRI) becomes very complicated due to the structural complexity of the human brain and the presence of intensity inhomogeneities. The aim of this work is to propose a method that effectively segments brain tumors from MR images and to evaluate the performance of the unsupervised optimal fuzzy clustering (UOFC) algorithm for this task. Segmentation is done by preprocessing the MR image to standardize intensity inhomogeneities, followed by feature extraction, feature fusion and clustering. Different validation measures are used to evaluate the performance of the proposed method with different clustering algorithms. The proposed method using the UOFC algorithm produces high sensitivity (96%) and low specificity (4%) compared to other clustering methods. Validation results clearly show that the proposed method with the UOFC algorithm effectively segments brain tumors from MR images.
Evaluation and Design of Genome-Wide CRISPR/SpCas9 Knockout Screens
Hart, Traver; Tong, Amy Hin Yan; Chan, Katie; Van Leeuwen, Jolanda; Seetharaman, Ashwin; Aregger, Michael; Chandrashekhar, Megha; Hustedt, Nicole; Seth, Sahil; Noonan, Avery; Habsid, Andrea; Sizova, Olga; Nedyalkova, Lyudmila; Climie, Ryan; Tworzyanski, Leanne; Lawson, Keith; Sartori, Maria Augusta; Alibeh, Sabriyeh; Tieu, David; Masud, Sanna; Mero, Patricia; Weiss, Alexander; Brown, Kevin R.; Usaj, Matej; Billmann, Maximilian; Rahman, Mahfuzur; Costanzo, Michael; Myers, Chad L.; Andrews, Brenda J.; Boone, Charles; Durocher, Daniel; Moffat, Jason
2017-01-01
The adaptation of CRISPR/SpCas9 technology to mammalian cell lines is transforming the study of human functional genomics. Pooled libraries of CRISPR guide RNAs (gRNAs) targeting human protein-coding genes and encoded in viral vectors have been used to systematically create gene knockouts in a variety of human cancer and immortalized cell lines, in an effort to identify whether these knockouts cause cellular fitness defects. Previous work has shown that CRISPR screens are more sensitive and specific than pooled-library shRNA screens in similar assays, but currently there exists significant variability across CRISPR library designs and experimental protocols. In this study, we reanalyze 17 genome-scale knockout screens in human cell lines from three research groups, using three different genome-scale gRNA libraries. Using the Bayesian Analysis of Gene Essentiality algorithm to identify essential genes, we refine and expand our previously defined set of human core essential genes from 360 to 684 genes. We use this expanded set of reference core essential genes, CEG2, plus empirical data from six CRISPR knockout screens to guide the design of a sequence-optimized gRNA library, the Toronto KnockOut version 3.0 (TKOv3) library. We then demonstrate the high effectiveness of the library relative to reference sets of essential and nonessential genes, as well as other screens using similar approaches. The optimized TKOv3 library, combined with the CEG2 reference set, provide an efficient, highly optimized platform for performing and assessing gene knockout screens in human cell lines. PMID:28655737
Optimizing point-of-care testing in clinical systems management.
Kost, G J
1998-01-01
The goal of improving medical and economic outcomes calls for leadership based on fundamental principles. The manager of clinical systems works collaboratively within the acute care center to optimize point-of-care testing through systematic approaches such as integrative strategies, algorithms, and performance maps. These approaches are effective and efficacious for critically ill patients. Optimizing point-of-care testing throughout the entire health-care system is inherently more difficult. There is potential to achieve high-quality testing, integrated disease management, and equitable health-care delivery. Despite rapid change and economic uncertainty, a macro-strategic, information-integrated, feedback-systems, outcomes-oriented approach is timely, challenging, effective, and uplifting to the creative human spirit.
Particle swarm optimization based space debris surveillance network scheduling
NASA Astrophysics Data System (ADS)
Jiang, Hai; Liu, Jing; Cheng, Hao-Wen; Zhang, Yao
2017-02-01
The increasing number of space debris has created an orbital debris environment that poses increasing impact risks to existing space systems and human space flights. For the safety of in-orbit spacecrafts, we should optimally schedule surveillance tasks for the existing facilities to allocate resources in a manner that most significantly improves the ability to predict and detect events involving affected spacecrafts. This paper analyzes two criteria that mainly affect the performance of a scheduling scheme and introduces an artificial intelligence algorithm into the scheduling of tasks of the space debris surveillance network. A new scheduling algorithm based on the particle swarm optimization algorithm is proposed, which can be implemented in two different ways: individual optimization and joint optimization. Numerical experiments with multiple facilities and objects are conducted based on the proposed algorithm, and simulation results have demonstrated the effectiveness of the proposed algorithm.
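The particle swarm optimization step can be illustrated with a compact, generic implementation. The "scheduling cost" below is a toy surrogate; the paper's actual objective scores surveillance-task assignments against its two scheduling criteria and is not reproduced here.

```python
# Minimal particle swarm optimization sketch with a toy "scheduling cost".
import numpy as np

rng = np.random.default_rng(1)

def scheduling_cost(x):
    # Placeholder: squared distance from an (arbitrary) ideal allocation vector.
    ideal = np.linspace(0.2, 0.8, x.size)
    return np.sum((x - ideal) ** 2)

def pso(dim=8, n_particles=30, n_iter=300, w=0.72, c1=1.49, c2=1.49):
    pos = rng.random((n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([scheduling_cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([scheduling_cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best, cost = pso()
print("best allocation:", np.round(best, 3), "cost:", round(cost, 6))
```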
NASA Technical Reports Server (NTRS)
Holden, Kritina L.; Thompson, Shelby G.; Sandor, Aniko; McCann, Robert S.; Kaiser, Mary K.; Adelstein, Barnard D.; Begault, Durand R.; Beutter, Brent R.; Stone, Leland S.; Godfroy, Martine
2009-01-01
The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. In addition to addressing display design issues associated with information formatting, style, layout, and interaction, the Information Presentation DRP is also working toward understanding the effects of extreme environments encountered in space travel on information processing. Work is also in progress to refine human factors-based design tools, such as human performance modeling, that will supplement traditional design techniques and help ensure that optimal information design is accomplished in the most cost-efficient manner. The major areas of work, or subtasks, within the Information Presentation DRP for FY10 are: 1) Displays, 2) Controls, 3) Procedures and Fault Management, and 4) Human Performance Modeling. The poster will highlight completed and planned work for each subtask.
Selecting Tasks for Evaluating Human Performance as a Function of Gravity
NASA Technical Reports Server (NTRS)
Norcross, Jason R.; Gernhardt, Michael L.
2011-01-01
A challenge in understanding human performance as a function of gravity is determining which tasks to research. Initial studies began with treadmill walking, which was easy to quantify and control. However, with the development of pressurized rovers, it is less important to optimize human performance for ambulation as pressurized rovers will likely perform gross translation for them. Future crews are likely to spend much of their extravehicular activity (EVA) performing geology, construction, and maintenance type tasks. With these types of tasks, people have different performance strategies, and it is often difficult to quantify the task and measure steady-state metabolic rates or perform biomechanical analysis. For many of these types of tasks, subjective feedback may be the only data that can be collected. However, subjective data may not fully support a rigorous scientific comparison of human performance across different gravity levels and suit factors. NASA would benefit from having a wide variety of quantifiable tasks that allow human performance comparison across different conditions. In order to determine which tasks will effectively support scientific studies, many different tasks and data analysis techniques will need to be employed. Many of these tasks and techniques will not be effective, but some will produce quantifiable results that are sensitive enough to show performance differences. One of the primary concerns related to EVA performance is metabolic rate. The higher the metabolic rate, the faster the astronaut will exhaust consumables. The focus of this poster will be on how different tasks affect metabolic rate across different gravity levels.
Strategic Adaptation to Task Characteristics, Incentives, and Individual Differences in Dual-Tasking
Janssen, Christian P.; Brumby, Duncan P.
2015-01-01
We investigate how good people are at multitasking by comparing behavior to a prediction of the optimal strategy for dividing attention between two concurrent tasks. In our experiment, 24 participants had to interleave entering digits on a keyboard with controlling a randomly moving cursor with a joystick. The difficulty of the tracking task was systematically varied as a within-subjects factor. Participants were also exposed to different explicit reward functions that varied the relative importance of the tracking task relative to the typing task (between-subjects). Results demonstrate that these changes in task characteristics and monetary incentives, together with individual differences in typing ability, influenced how participants choose to interleave tasks. This change in strategy then affected their performance on each task. A computational cognitive model was used to predict performance for a wide set of alternative strategies for how participants might have possibly interleaved tasks. This allowed for predictions of optimal performance to be derived, given the constraints placed on performance by the task and cognition. A comparison of human behavior with the predicted optimal strategy shows that participants behaved near optimally. Our findings have implications for the design and evaluation of technology for multitasking situations, as consideration should be given to the characteristics of the task, but also to how different users might use technology depending on their individual characteristics and their priorities. PMID:26161851
Frequency modulation entrains slow neural oscillations and optimizes human listening behavior
Henry, Molly J.; Obleser, Jonas
2012-01-01
The human ability to continuously track dynamic environmental stimuli, in particular speech, is proposed to profit from “entrainment” of endogenous neural oscillations, which involves phase reorganization such that “optimal” phase comes into line with temporally expected critical events, resulting in improved processing. The current experiment goes beyond previous work in this domain by addressing two thus far unanswered questions. First, how general is neural entrainment to environmental rhythms: Can neural oscillations be entrained by temporal dynamics of ongoing rhythmic stimuli without abrupt onsets? Second, does neural entrainment optimize performance of the perceptual system: Does human auditory perception benefit from neural phase reorganization? In a human electroencephalography study, listeners detected short gaps distributed uniformly with respect to the phase angle of a 3-Hz frequency-modulated stimulus. Listeners’ ability to detect gaps in the frequency-modulated sound was not uniformly distributed in time, but clustered in certain preferred phases of the modulation. Moreover, the optimal stimulus phase was individually determined by the neural delta oscillation entrained by the stimulus. Finally, delta phase predicted behavior better than stimulus phase or the event-related potential after the gap. This study demonstrates behavioral benefits of phase realignment in response to frequency-modulated auditory stimuli, overall suggesting that frequency fluctuations in natural environmental input provide a pacing signal for endogenous neural oscillations, thereby influencing perceptual processing. PMID:23151506
Optimization of HPV DNA detection in urine by improving collection, storage, and extraction.
Vorsters, A; Van den Bergh, J; Micalessi, I; Biesmans, S; Bogers, J; Hens, A; De Coster, I; Ieven, M; Van Damme, P
2014-11-01
The benefits of using urine for the detection of human papillomavirus (HPV) DNA have been evaluated in disease surveillance, epidemiological studies, and screening for cervical cancers in specific subgroups. HPV DNA testing in urine is being considered for important purposes, notably the monitoring of HPV vaccination in adolescent girls and young women who do not wish to have a vaginal examination. The need to optimize and standardize sampling, storage, and processing has been reported. In this paper, we examined the impact of a DNA-conservation buffer, the extraction method, and urine sampling on the detection of HPV DNA and human DNA in urine provided by 44 women with a cytologically normal but HPV DNA-positive cervical sample. Ten women provided first-void and midstream urine samples. DNA analysis was performed using real-time PCR to allow quantification of HPV and human DNA. The results showed that an optimized method for HPV DNA detection in urine should (a) prevent DNA degradation during extraction and storage, (b) recover cell-free HPV DNA in addition to cell-associated DNA, (c) process a sufficient volume of urine, and (d) use a first-void sample. In addition, we found that detectable human DNA in urine may not be a good internal control for sample validity. HPV prevalence data that are based on urine samples collected, stored, and/or processed under suboptimal conditions may underestimate infection rates.
Yang, Anxiong; Stingl, Michael; Berry, David A.; Lohscheller, Jörg; Voigt, Daniel; Eysholdt, Ulrich; Döllinger, Michael
2011-01-01
With the use of an endoscopic, high-speed camera, vocal fold dynamics may be observed clinically during phonation. However, observation and subjective judgment alone may be insufficient for clinical diagnosis and documentation of improved vocal function, especially when the laryngeal disease lacks any clear morphological presentation. In this study, biomechanical parameters of the vocal folds are computed by adjusting the corresponding parameters of a three-dimensional model until the dynamics of both systems are similar. First, a mathematical optimization method is presented. Next, model parameters (such as pressure, tension and masses) are adjusted to reproduce vocal fold dynamics, and the deduced parameters are physiologically interpreted. Various combinations of global and local optimization techniques are attempted. Evaluation of the optimization procedure is performed using 50 synthetically generated data sets. The results show sufficient reliability, including 0.07 normalized error, 96% correlation, and 91% accuracy. The technique is also demonstrated on data from human hemilarynx experiments, in which a low normalized error (0.16) and high correlation (84%) values were achieved. In the future, this technique may be applied to clinical high-speed images, yielding objective measures with which to document improved vocal function of patients with voice disorders. PMID:21877808
Optimal Modality Selection for Cooperative Human-Robot Task Completion.
Jacob, Mithun George; Wachs, Juan P
2016-12-01
Human-robot cooperation in complex environments must be fast, accurate, and resilient. This requires efficient communication channels where robots need to assimilate information using a plethora of verbal and nonverbal modalities such as hand gestures, speech, and gaze. However, even though hybrid human-robot communication frameworks and multimodal communication have been studied, a systematic methodology for designing multimodal interfaces does not exist. This paper addresses the gap by proposing a novel methodology to generate multimodal lexicons which maximizes multiple performance metrics over a wide range of communication modalities (i.e., lexicons). The metrics are obtained through a mixture of simulation and real-world experiments. The methodology is tested in a surgical setting where a robot cooperates with a surgeon to complete a mock abdominal incision and closure task by delivering surgical instruments. Experimental results show that predicted optimal lexicons significantly outperform predicted suboptimal lexicons (p < 0.05) in all metrics, validating the predictability of the methodology. The methodology is validated in two scenarios (with and without modeling the risk of a human-robot collision) and the differences in the lexicons are analyzed.
Cui, Xinchun; Niu, Yuying; Zheng, Xiangwei; Han, Yingshuai
2018-01-01
In this paper, a new color watermarking algorithm based on differential evolution is proposed. A color host image is first converted from RGB space to YIQ space, which is more suitable for the human visual system. Then, a three-level discrete wavelet transformation is applied to the luminance component Y to generate four different frequency sub-bands. After that, singular value decomposition is performed on these sub-bands. In the watermark embedding process, discrete wavelet transformation is applied to the watermark image after scrambling encryption. The new algorithm uses a differential evolution algorithm with adaptive optimization to choose the right scaling factors. Experimental results show that the proposed algorithm has a better performance in terms of invisibility and robustness.
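A hedged sketch of the scaling-factor selection step follows, using SciPy's differential_evolution on a toy objective that trades an imperceptibility proxy against a robustness proxy. The DWT/SVD embedding itself and the paper's actual fitness metrics are not reproduced; the singular values and objective terms below are invented for illustration.

```python
# Sketch of choosing a watermark scaling factor with differential evolution.
# The singular values and both objective terms are illustrative stand-ins for
# the paper's DWT/SVD embedding and its quality/robustness measures.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(2)
singular_values = np.sort(rng.random(64))[::-1] * 100.0   # stand-in for a sub-band's SVs

def objective(alpha):
    alpha = alpha[0]
    distortion = alpha ** 2 * np.sum(singular_values ** 2) * 1e-6   # grows with embedding strength
    robustness = 1.0 - np.exp(-5.0 * alpha)                         # saturating benefit
    return distortion - robustness                                   # minimize

result = differential_evolution(objective, bounds=[(0.0, 1.0)], seed=3, tol=1e-8)
print("chosen scaling factor:", round(result.x[0], 4))
```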
Investigation of the transmission of fore and aft vibration through the human body.
Demić, Miroslav; Lukić, Jovanka
2009-07-01
Understanding the behavior of the human body under the influence of vibration is of great importance for optimal motor vehicle system design. Therefore, great efforts are being made to gather as much information as possible about the influence of vibration on the human body. The literature shows that most scientific attention has so far been paid to vertical vibration, although intensive research has lately been performed on other sorts of excitation. In this paper, the results of an investigation of the behavior of the human body, in a seated position, under the influence of random fore and aft vibration are shown. The investigation was performed using an electro-hydraulic simulator on a group of 30 healthy male occupants. Experiments were performed to provide results for improving human body modeling in driving conditions. Excitation amplitudes (1.75 and 2.25 m/s(2) rms) and seat backrest conditions (with and without inclination) were varied. The data were analyzed using partial coherence and transfer functions, and the results are given in detail. The results show that the human body under the influence of random excitations behaves as a non-linear system and that its response depends on spatial position. The results provide the data necessary to define the structure and parameters of a human biodynamic model with respect to different excitations and seat backrest positions.
Toward a Model-Based Predictive Controller Design in Brain–Computer Interfaces
Kamrunnahar, M.; Dias, N. S.; Schiff, S. J.
2013-01-01
A first step in designing a robust and optimal model-based predictive controller (MPC) for brain–computer interface (BCI) applications is presented in this article. An MPC has the potential to achieve improved BCI performance compared to the performance achieved by current ad hoc, nonmodel-based filter applications. The parameters in designing the controller were extracted as model-based features from motor imagery task-related human scalp electroencephalography. Although the parameters can be generated from any model, linear or non-linear, we here adopted a simple autoregressive model that has well-established applications in BCI task discriminations. It was shown that the parameters generated for the controller design can as well be used for motor imagery task discriminations with performance (with 8–23% task discrimination errors) comparable to the discrimination performance of the commonly used features such as frequency specific band powers and the AR model parameters directly used. An optimal MPC has significant implications for high performance BCI applications. PMID:21267657
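As an illustration of the feature-extraction step described above, the sketch below estimates autoregressive (AR) coefficients from a single synthetic EEG channel via the Yule-Walker equations. The model order and the synthetic signal are assumptions for demonstration, not the study's recordings or modeling choices.

```python
# Sketch of extracting autoregressive (AR) coefficients from one EEG channel
# via the Yule-Walker equations, for use as classifier/controller features.
# The synthetic signal and the model order (p = 6) are illustrative.
import numpy as np
from scipy.linalg import toeplitz

def yule_walker(x, order):
    x = x - x.mean()
    n = len(x)
    # biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = toeplitz(r[:order])                       # order x order autocorrelation matrix
    return np.linalg.solve(R, r[1:order + 1])     # AR coefficients used as features

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # synthetic mu-rhythm-like signal: 10 Hz oscillation plus noise, 256 Hz sampling
    t = np.arange(0, 2.0, 1.0 / 256)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
    print("AR feature vector:", np.round(yule_walker(eeg, order=6), 3))
```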
PERI - Auto-tuning Memory Intensive Kernels for Multicore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H; Williams, Samuel; Datta, Kaushik
2008-06-24
We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to Sparse Matrix Vector Multiplication (SpMV), the explicit heat equation PDE on a regular grid (Stencil), and a lattice Boltzmann application (LBMHD). We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Xeon Clovertown, AMD Opteron Barcelona, Sun Victoria Falls, and the Sony-Toshiba-IBM (STI) Cell. Rather than hand-tuning each kernel for each system, we develop a code generator for each kernel that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned kernel applications often achieve a better than 4X improvement compared with the original code. Additionally, we analyze a Roofline performance model for each platform to reveal hardware bottlenecks and software challenges for future multicore systems and applications.
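The search-based auto-tuning idea can be illustrated with a toy example: time a small set of candidate kernel variants and keep the fastest. The blocked-transpose kernel and its parameter space below are illustrative stand-ins; the paper's code generators emit and benchmark C variants of the SpMV, stencil, and LBMHD kernels instead.

```python
# Toy illustration of search-based auto-tuning: benchmark a blocked transpose
# over a grid of candidate block sizes and keep the fastest configuration.
import time
import numpy as np

def blocked_transpose(a, block):
    n = a.shape[0]
    out = np.empty_like(a)
    for i in range(0, n, block):
        for j in range(0, n, block):
            out[j:j + block, i:i + block] = a[i:i + block, j:j + block].T
    return out

def autotune(n=2048, candidate_blocks=(16, 32, 64, 128, 256), repeats=3):
    a = np.random.default_rng(5).random((n, n))
    best_block, best_time = None, float("inf")
    for block in candidate_blocks:
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            blocked_transpose(a, block)
            times.append(time.perf_counter() - start)
        t = min(times)
        print(f"block={block:4d}: {t * 1e3:7.1f} ms")
        if t < best_time:
            best_block, best_time = block, t
    return best_block

if __name__ == "__main__":
    print("fastest block size:", autotune())
```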
Optimizing Cognitive Performance: The Relationship of Self-Theory to the Human Dimension Concept
2015-06-12
Newman, he states that “emotional intelligence is positively correlated with performance for high emotional labor jobs.”62 In a summary study of most...individual commanders, leaders, administrators, teachers, and others to include self-theory within their individual curriculums, teaching styles, and...society, where grandparents, parents, and children are students together. In a time of drastic change it is the learners who inherit the future. The
Kim, Da-Hye; Oh, Jeong-Eun
2017-05-01
Human hair has many advantages as a non-invasive sample; however, analytical methods for detecting perfluoroalkyl substances (PFASs) in human hair are still in the development stage. Therefore, the aim of this study was to develop and validate a method for monitoring 11 PFASs in human hair. Solid-phase extraction (SPE), ion-pairing extraction (IPE), a combined method (SPE+IPE) and solvent extraction with ENVI-carb clean-up were compared to develop an optimal extraction method using two types of hair sample (powder and piece forms). Analysis of PFASs was performed using liquid chromatography and tandem mass spectrometry. Among the four different extraction procedures, the SPE method using powdered hair showed the best extraction efficiency and recoveries ranged from 85.8 to 102%. The method detection limits for the SPE method were 0.114-0.796 ng/g and good precision (below 10%) and accuracy (66.4-110%) were obtained. In light of these results, SPE is considered the optimal method for PFAS extraction from hair. It was also successfully used to detect PFASs in human hair samples.
Boston-Fleischhauer, Carol
2008-01-01
The design and implementation of efficient, effective, and safe processes are never-ending challenges in healthcare. Less than optimal performance levels and rising concerns about patient safety suggest that traditional process design methods are insufficient to meet design requirements. In this 2-part series, the author presents human factors engineering and reliability science as important knowledge to enhance existing operational and clinical process design methods in healthcare. An examination of these theories, application approaches, and examples are presented.
Nindl, Bradley C; Williams, Thomas J; Deuster, Patricia A; Butler, Nikki L; Jones, Bruce H
2013-01-01
With downsizing of the military services and significant budget cuts, it will be more important than ever to optimize the health and performance of individual service members. Musculoskeletal injuries (MSIs) represent a major threat to the health and fitness of Soldiers and other service members that degrade our nation's ability to project military power. This affects both financial (such as the economic burden from medical, healthcare, and disability costs) and human manpower resources (Soldiers medically unable to optimally perform their duties and to deploy). For example, in 2012, MSIs represented the leading cause of medical care visits across the military services resulting in almost 2,200,000 medical encounters. They also result in more disability discharges than any other health condition. Nonbattle injuries (NBIs) have caused more medical evacuations (34%) from recent theaters of operation than any other cause including combat injuries. Physical training and sports are the main cause of these NBIs. The majority (56%) of these injuries are the direct result of physical training. Higher levels of physical fitness protect against such injuries; however, more physical training to improve fitness also causes higher injury rates. Thus, military physical training programs must balance the need for fitness with the risks of injuries. The Army has launched several initiatives that may potentially improve military physical readiness and reduce injuries. These include the US Army Training and Doctrine Command's Baseline Soldier Physical Readiness Requirements and Gender Neutral Physical Performance Standards studies, as well as the reimplementation of the Master Fitness Trainer program and the Army Medical Command's Soldier Medical Readiness and Performance Triad Campaigns. It is imperative for military leaders to understand that military physical readiness can be enhanced at the same time that MSIs are prevented. A strategic paradigm shift in the military's approach to physical readiness policies is needed to avoid further degradation of warfighting capability in an era of austerity. We believe this can be best accomplished through leveraging scientific, evidence-based best practices by Army senior leadership which supports, prioritizes, and implements innovative, synchronized, and integrated human performance optimization/injury prevention policy changes.
A stochastic visco-hyperelastic model of human placenta tissue for finite element crash simulations.
Hu, Jingwen; Klinich, Kathleen D; Miller, Carl S; Rupp, Jonathan D; Nazmi, Giseli; Pearlman, Mark D; Schneider, Lawrence W
2011-03-01
Placental abruption is the most common cause of fetal deaths in motor-vehicle crashes, but studies on the mechanical properties of human placenta are rare. This study presents a new method of developing a stochastic visco-hyperelastic material model of human placenta tissue using a combination of uniaxial tensile testing, specimen-specific finite element (FE) modeling, and stochastic optimization techniques. In our previous study, uniaxial tensile tests of 21 placenta specimens have been performed using a strain rate of 12/s. In this study, additional uniaxial tensile tests were performed using strain rates of 1/s and 0.1/s on 25 placenta specimens. Response corridors for the three loading rates were developed based on the normalized data achieved by test reconstructions of each specimen using specimen-specific FE models. Material parameters of a visco-hyperelastic model and their associated standard deviations were tuned to match both the means and standard deviations of all three response corridors using a stochastic optimization method. The results show a very good agreement between the tested and simulated response corridors, indicating that stochastic analysis can improve estimation of variability in material model parameters. The proposed method can be applied to develop stochastic material models of other biological soft tissues.
Schwarz, Patric; Pannes, Klaus Dieter; Nathan, Michel; Reimer, Hans Jorg; Kleespies, Axel; Kuhn, Nicole; Rupp, Anne; Zügel, Nikolaus Peter
2011-10-01
The decision to optimize the processes in the operating tract was based on two factors: competition among clinics and a desire to optimize the use of available resources. The aim of the project was to improve operating room (OR) capacity utilization by reduction of change and throughput time per patient. The study was conducted at Centre Hospitalier Emil Mayrisch Clinic for specialized care (n = 618 beds) Luxembourg (South). A prospective analysis was performed before and after the implementation of optimized processes. Value stream analysis and design (value stream mapping, VSM) were used as tools. VSM depicts patient throughput and the corresponding information flows. Furthermore it is used to identify process waste (e.g. time, human resources, materials, etc.). For this purpose, change times per patient (extubation of patient 1 until intubation of patient 2) and throughput times (inward transfer until outward transfer) were measured. VSM, change and throughput times for 48 patient flows (VSM A(1), actual state = initial situation) served as the starting point. Interdisciplinary development of an optimized VSM (VSM-O) was evaluated. Prospective analysis of 42 patients (VSM-A(2)) without and 75 patients (VSM-O) with an optimized process in place were conducted. The prospective analysis resulted in a mean change time of (mean ± SEM) VSM-A(2) 1,507 ± 100 s versus VSM-O 933 ± 66 s (p < 0.001). The mean throughput time VSM-A(2) (mean ± SEM) was 151 min (±8) versus VSM-O 120 min (±10) (p < 0.05). This corresponds to a 23% decrease in waiting time per patient in total. Efficient OR capacity utilization and the optimized use of human resources allowed an additional 1820 interventions to be carried out per year without any increase in human resources. In addition, perioperative patient monitoring was increased up to 100%.
Techniques for designing rotorcraft control systems
NASA Technical Reports Server (NTRS)
Levine, William S.; Barlow, Jewel
1993-01-01
This report summarizes the work that was done on the project from 1 Apr. 1992 to 31 Mar. 1993. The main goal of this research is to develop a practical tool for rotorcraft control system design based on interactive optimization tools (CONSOL-OPTCAD) and classical rotorcraft design considerations (ADOCS). This approach enables the designer to combine engineering intuition and experience with parametric optimization. The combination should make it possible to produce a better design faster than would be possible using either pure optimization or pure intuition and experience. We emphasize that the goal of this project is not to develop an algorithm. It is to develop a tool. We want to keep the human designer in the design process to take advantage of his or her experience and creativity. The role of the computer is to perform the calculation necessary to improve and to display the performance of the nominal design. Briefly, during the first year we have connected CONSOL-OPTCAD, an existing software package for optimizing parameters with respect to multiple performance criteria, to a simplified nonlinear simulation of the UH-60 rotorcraft. We have also created mathematical approximations to the Mil-specs for rotorcraft handling qualities and input them into CONSOL-OPTCAD. Finally, we have developed the additional software necessary to use CONSOL-OPTCAD for the design of rotorcraft controllers.
NASA Technical Reports Server (NTRS)
Hess, R. A.
1976-01-01
Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.
NASA Astrophysics Data System (ADS)
Xing, Y. F.; Wang, Y. S.; Shi, L.; Guo, H.; Chen, H.
2016-01-01
Based on human perceptual characteristics, a method combining the optimal wavelet-packet transform and an artificial neural network, the so-called OWPT-ANN model, is presented for psychoacoustical recognition. Comparisons of time-frequency analysis methods are performed, and an OWPT with 21 critical bands is designed for feature extraction of a sound, as is a three-layer back-propagation ANN for sound quality (SQ) recognition. Focusing on loudness and sharpness, the OWPT-ANN model is applied to vehicle noises under different working conditions. Experimental verification shows that the OWPT can effectively transform a sound into a time-varying energy pattern similar to that in the human auditory system. The errors in loudness and sharpness of vehicle noise from the OWPT-ANN are all less than 5%, which suggests good accuracy of the OWPT-ANN model in SQ recognition. The proposed methodology may be regarded as a promising technique for signal processing in human-hearing-related fields in engineering.
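A hedged sketch of wavelet-packet band-energy feature extraction followed by a small back-propagation network is given below. The wavelet choice, decomposition level, and synthetic "loudness" target are assumptions; the paper's 21-critical-band packet design is not reproduced.

```python
# Sketch of wavelet-packet band-energy features feeding a small MLP, in the
# spirit of an OWPT-ANN model. Wavelet ('db4'), level, and the synthetic
# loudness target are illustrative assumptions.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

def band_energies(signal, wavelet="db4", level=5):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")          # frequency-ordered terminal nodes
    e = np.array([np.sum(np.square(n.data)) for n in nodes])
    return e / (e.sum() + 1e-12)                        # normalized band-energy pattern

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    fs = 8000
    t = np.arange(0, 1.0, 1 / fs)
    X, y = [], []
    for _ in range(60):
        # synthetic "vehicle noise": tonal component plus broadband noise
        f0, gain = rng.uniform(80, 400), rng.uniform(0.2, 1.0)
        sig = gain * np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)
        X.append(band_energies(sig))
        y.append(gain)                                  # stand-in for a loudness rating
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(np.array(X[:50]), y[:50])
    print("test predictions:", np.round(model.predict(np.array(X[50:])), 2))
```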
Application of hanging drop technique to optimize human IgG formulations.
Li, Guohua; Kasha, Purna C; Late, Sameer; Banga, Ajay K
2010-01-01
The purpose of this work is to assess the hanging drop technique in screening excipients to develop optimal formulations for human immunoglobulin G (IgG). A microdrop of human IgG and test solution hanging from a cover slide and undergoing vapour diffusion was monitored by a stereomicroscope. Aqueous solutions of IgG in the presence of different pH, salt concentrations and excipients were prepared and characterized. Low concentration of either sodium/potassium phosphate or McIlvaine buffer favoured the solubility of IgG. Addition of sucrose favoured the stability of this antibody while addition of NaCl caused more aggregation. Antimicrobial preservatives were also screened and a complex effect at different buffer conditions was observed. Dynamic light scattering, differential scanning calorimetry and size exclusion chromatography studies were performed to further validate the results. In conclusion, hanging drop is a very easy and effective approach to screen protein formulations in the early stage of formulation development.
Simplified human thermoregulatory model for designing wearable thermoelectric devices
NASA Astrophysics Data System (ADS)
Wijethunge, Dimuthu; Kim, Donggyu; Kim, Woochul
2018-02-01
Research on wearable and implantable devices has become popular, driven by strong market demand. A precise understanding of the thermal properties of human skin, which are not constant values but vary depending on the ambient condition, is required for the development of such devices. In this paper, we present a simplified human thermoregulatory model for accurately estimating the thermal properties of the skin without applying rigorous calculations. The proposed model considers a variable blood flow rate through the skin, evaporation functions, and variable convection heat transfer from the skin surface. In addition, wearable thermoelectric generation (TEG) and refrigeration devices were simulated. We found that deviations of 10-60% can result when estimating TEG performance without considering a human thermoregulatory model, owing to the fact that the thermal resistance of human skin adapts to the ambient condition. The simplicity of the modeling procedure presented in this work could be beneficial for optimizing and predicting the performance of any application that is directly coupled with skin thermal properties.
Berret, Bastien; Darlot, Christian; Jean, Frédéric; Pozzo, Thierry; Papaxanthis, Charalambos; Gauthier, Jean Paul
2008-01-01
An important question in the literature focusing on motor control is to determine which laws drive biological limb movements. This question has prompted numerous investigations analyzing arm movements in both humans and monkeys. Many theories assume that among all possible movements the one actually performed satisfies an optimality criterion. In the framework of optimal control theory, a first approach is to choose a cost function and test whether the proposed model fits with experimental data. A second approach (generally considered as the more difficult) is to infer the cost function from behavioral data. The cost proposed here includes a term called the absolute work of forces, reflecting the mechanical energy expenditure. Contrary to most investigations studying optimality principles of arm movements, this model has the particularity of using a cost function that is not smooth. First, a mathematical theory related to both direct and inverse optimal control approaches is presented. The first theoretical result is the Inactivation Principle, according to which minimizing a term similar to the absolute work implies simultaneous inactivation of agonistic and antagonistic muscles acting on a single joint, near the time of peak velocity. The second theoretical result is that, conversely, the presence of non-smoothness in the cost function is a necessary condition for the existence of such inactivation. Second, during an experimental study, participants were asked to perform fast vertical arm movements with one, two, and three degrees of freedom. Observed trajectories, velocity profiles, and final postures were accurately simulated by the model. In accordance, electromyographic signals showed brief simultaneous inactivation of opposing muscles during movements. Thus, assuming that human movements are optimal with respect to a certain integral cost, the minimization of an absolute-work-like cost is supported by experimental observations. Such types of optimality criteria may be applied to a large range of biological movements. PMID:18949023
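One plausible way to write the cost described above, with the non-smooth "absolute work" term that drives the Inactivation Principle, is sketched below. The regularization term g and its weight alpha are assumptions about the general form, not the authors' exact criterion.

```latex
% Hedged sketch of an "absolute work"-type cost for an n-joint arm movement;
% the comfort/regularization term g and its weight \alpha are assumptions.
C(u) = \int_{0}^{T} \Big( \underbrace{\sum_{i=1}^{n} \big| \tau_i(t)\, \dot{q}_i(t) \big|}_{\text{absolute work of joint torques}}
       \;+\; \alpha\, g\big(u(t)\big) \Big)\, dt
```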
Iwanowicz, Edwin J; Kimball, S David; Lin, James; Lau, Wan; Han, W-C; Wang, Tammy C; Roberts, Daniel G M; Schumacher, W A; Ogletree, Martin L; Seiler, Steven M
2002-11-04
A series of retro-binding inhibitors of human alpha-thrombin was prepared to elucidate structure-activity relationships (SAR) and optimize in vivo performance. Compounds 9 and 11, orally active inhibitors of thrombin catalytic activity, were identified to be efficacious in a thrombin-induced lethality model in mice.
NASA Technical Reports Server (NTRS)
Baron, S.; Lancraft, R.; Zacharias, G.
1980-01-01
The optimal control model (OCM) of the human operator is used to predict the effect of simulator characteristics on pilot performance and workload. The piloting task studied is helicopter hover. Among the simulator characteristics considered were (computer generated) visual display resolution, field of view and time delay.
The High-Performance Mind: Mastering Brainwaves for Insight, Healing, and Creativity.
ERIC Educational Resources Information Center
Wise, Anna
This book aims to enlighten its readers on how to achieve optimal human efficiency, well-being, and balance through "brainwave training." According to the book, the 4 kinds of brainwaves--beta, alpha, theta, and delta--communicate with each other to pass information between conscious and unconscious mind. Mastering these brainwaves…
Fernández, Purificación; González, Cristina; Pena, M Teresa; Carro, Antonia M; Lorenzo, Rosa A
2013-03-12
A simple and efficient ultrasound-assisted dispersive liquid-liquid microextraction (UA-DLLME) method has been developed for the determination of seven benzodiazepines (alprazolam, bromazepam, clonazepam, diazepam, lorazepam, lormetazepam and tetrazepam) in human plasma samples. Chloroform and methanol were used as extractant and disperser solvents, respectively. The influence of several variables (e.g., type and volume of dispersant and extraction solvents, pH, ultrasonic time and ionic strength) was carefully evaluated and optimized using an asymmetric screening design 3(2)4(2)//16. Analysis of extracts was performed by ultra-performance liquid chromatography coupled with photodiode array detection (UPLC-PDA). Under the optimum conditions, two reversed-phase columns, Shield RP18 and C18, were successfully tested, obtaining good linearity in the range 0.01-5 μg mL(-1), with correlation coefficients r > 0.996. Quantification limits of 4.3-13.2 ng mL(-1) and 4.0-14.8 ng mL(-1) were obtained for the C18 and Shield RP18 columns, respectively. The optimized method exhibited good precision, with relative standard deviation values lower than 8%. The recoveries, studied at two spiked levels, ranged from 71 to 102% for all considered compounds. The proposed method was successfully applied to the analysis of seven benzodiazepines in real human plasma samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, A; Little, K; Chung, J
Purpose: To validate the use of a Channelized Hotelling Observer (CHO) model for guiding image processing parameter selection and enable improved nodule detection in digital chest radiography. Methods: In a previous study, an anthropomorphic chest phantom was imaged with and without PMMA simulated nodules using a GE Discovery XR656 digital radiography system. The impact of image processing parameters was then explored using a CHO with 10 Laguerre-Gauss channels. In this work, we validate the CHO’s trend in nodule detectability as a function of two processing parameters by conducting a signal-known-exactly, multi-reader-multi-case (MRMC) ROC observer study. Five naive readers scored confidence of nodule visualization in 384 images with 50% nodule prevalence. The image backgrounds were regions-of-interest extracted from 6 normal patient scans, and the digitally inserted simulated nodules were obtained from phantom data in previous work. Each patient image was processed with both a near-optimal and a worst-case parameter combination, as determined by the CHO for nodule detection. The same 192 ROIs were used for each image processing method, with 32 randomly selected lung ROIs per patient image. Finally, the MRMC data was analyzed using the freely available iMRMC software of Gallas et al. Results: The image processing parameters which were optimized for the CHO led to a statistically significant improvement (p=0.049) in human observer AUC from 0.78 to 0.86, relative to the image processing implementation which produced the lowest CHO performance. Conclusion: Differences in user-selectable image processing methods on a commercially available digital radiography system were shown to have a marked impact on performance of human observers in the task of lung nodule detection. Further, the effect of processing on humans was similar to the effect on CHO performance. Future work will expand this study to include a wider range of detection/classification tasks and more observers, including experienced chest radiologists.
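The channelized Hotelling observer used to guide parameter selection can be sketched as follows: build Laguerre-Gauss channels, compute channel outputs for signal-present and signal-absent samples, and form the Hotelling template from the pooled channel covariance. The channel width, image statistics, and nodule model below are illustrative assumptions; only the general CHO recipe follows the standard formulation.

```python
# Sketch of a channelized Hotelling observer (CHO) with Laguerre-Gauss (LG)
# channels on synthetic signal-present/absent images. LG width, noise model,
# and the Gaussian "nodule" are illustrative.
import numpy as np
from scipy.special import eval_laguerre

def lg_channels(npix=64, n_channels=10, a=15.0):
    y, x = np.mgrid[:npix, :npix] - (npix - 1) / 2.0
    r2 = x ** 2 + y ** 2
    chans = []
    for j in range(n_channels):
        u = np.exp(-np.pi * r2 / a ** 2) * eval_laguerre(j, 2 * np.pi * r2 / a ** 2)
        chans.append((u / np.linalg.norm(u)).ravel())
    return np.stack(chans, axis=1)                      # (npix*npix, n_channels)

def cho_detectability(signal_imgs, noise_imgs, U):
    vs = signal_imgs.reshape(len(signal_imgs), -1) @ U  # channel outputs, signal present
    vn = noise_imgs.reshape(len(noise_imgs), -1) @ U    # channel outputs, signal absent
    S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    dv = vs.mean(axis=0) - vn.mean(axis=0)
    w = np.linalg.solve(S, dv)                          # Hotelling template in channel space
    return float(np.sqrt(dv @ w))                       # detectability index d_a

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    npix, n = 64, 200
    y, x = np.mgrid[:npix, :npix] - (npix - 1) / 2.0
    nodule = 0.4 * np.exp(-(x ** 2 + y ** 2) / (2 * 6.0 ** 2))
    noise = rng.normal(0, 1, (2 * n, npix, npix))
    U = lg_channels(npix)
    print("CHO detectability d_a =", round(cho_detectability(noise[:n] + nodule, noise[n:], U), 3))
```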
Optimal Control of Malaria Transmission using Insecticide Treated Nets and Spraying
NASA Astrophysics Data System (ADS)
Athina, D.; Bakhtiar, T.; Jaharuddin
2017-03-01
In this paper, we consider a model of malaria transmission developed by Silva and Torres, equipped with two control variables, namely the use of insecticide-treated nets (ITN) to reduce the number of infected humans and spraying to reduce the number of mosquitoes. The Pontryagin maximum principle was applied to derive the differential equation system expressing the optimality conditions that must be satisfied by the optimal control variables. The Mangasarian sufficiency theorem shows that, for this optimization problem, the Pontryagin maximum principle provides conditions that are both necessary and sufficient. The 4th-order Runge-Kutta method was then used to solve the differential equation system. The numerical results show that applying both controls at once can reduce the number of infected individuals as well as the number of mosquitoes, thereby reducing the impact of malaria transmission.
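A minimal sketch of the numerical step is given below: a generic fixed-step 4th-order Runge-Kutta integrator applied to a simplified host-vector model with two constant controls. The equations and parameter values are illustrative stand-ins for the Silva-Torres model, not the system actually solved in the paper.

```python
# Fixed-step RK4 integrator applied to a simplified host-vector malaria model
# with constant controls u1 (ITN coverage) and u2 (spraying). Equations and
# parameters are illustrative stand-ins, not the Silva-Torres system.
import numpy as np

def rk4(f, y0, t0, t1, n_steps):
    h = (t1 - t0) / n_steps
    t, y = t0, np.asarray(y0, dtype=float)
    for _ in range(n_steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

def malaria_rhs(u1, u2, beta=0.3, b=0.2, gamma=0.1, mu_v=0.05):
    def f(t, y):
        ih, iv = y                                  # infected human and mosquito fractions
        dih = beta * (1 - u1) * iv * (1 - ih) - gamma * ih
        div = b * (1 - u1) * ih * (1 - iv) - (mu_v + u2) * iv
        return np.array([dih, div])
    return f

if __name__ == "__main__":
    for u1, u2 in [(0.0, 0.0), (0.6, 0.3)]:
        ih, iv = rk4(malaria_rhs(u1, u2), y0=[0.2, 0.1], t0=0.0, t1=100.0, n_steps=2000)
        print(f"u1={u1}, u2={u2}: infected humans = {ih:.3f}, infected mosquitoes = {iv:.3f}")
```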
Application of Human-Autonomy Teaming (HAT) Patterns to Reduce Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri
2016-01-01
Unmanned aerial systems, robotics, advanced cockpits, and air traffic management are all examples of domains that are seeing dramatic increases in automation. While automation may take on some tasks previously performed by humans, humans will still be required, for the foreseeable future, to remain in the system. Collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One way to understand HAT is to identify recurring patterns of HAT that have similar characteristics and solutions. This paper applies a methodology for identifying HAT patterns to an advanced cockpit project.
Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.
Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun
2017-05-01
The aim of this study was to determine how monetary motivations influence decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of monetary reward for hacker's successful attack was 10 times the reward for analyst's successful defense; and rewarding analyst, where the magnitude of monetary reward for analyst's successful defense was 10 times the reward for hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, then this causes analysts to deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Applications of our results are to networks where the influence of monetary rewards may cause information theft and system damage.
Electromagnetic perception and individual features of human beings.
Lebedeva, N N; Kotrovskaya, T I
2001-01-01
An investigation was made of the individual reactions of human subjects exposed to electromagnetic fields. We performed the study on 86 volunteers separated into two groups. The first group was exposed to the electromagnetic field of infralow frequencies, whereas the second group was exposed to the electromagnetic field of extremely high frequencies. We found that the electromagnetic perception of human beings correlated with their individual features, such as EEG parameters, the critical frequency of flash merging, and the electric current sensitivity. Human subjects who had a high-quality perception of electromagnetic waves showed an optimal balance of cerebral processes, an excellent functional state of the central nervous system, and a good decision criterion.
NASA Astrophysics Data System (ADS)
Nikitin, Alexander P.; Bulsara, Adi R.; Stocks, Nigel G.
2017-03-01
Inspired by recent results on self-tunability in the outer hair cells of the mammalian cochlea, we describe an array of magnetic sensors where each individual sensor can self-tune to an optimal operating regime. The self-tuning gives the array its "biomimetic" features. We show that the overall performance of the array can, as expected, be improved by increasing the number of sensors but, however, coupling between sensors reduces the overall performance even though the individual sensors in the system could see an improvement. We quantify the similarity of this phenomenon to the Ringelmann effect that was formulated 103 years ago to account for productivity losses in human and animal groups. We propose a global feedback scheme that can be used to greatly mitigate the performance degradation that would, normally, stem from the Ringelmann effect.
Visual Perceptual Learning and Models.
Dosher, Barbara; Lu, Zhong-Lin
2017-09-15
Visual perceptual learning through practice or training can significantly improve performance on visual tasks. Originally seen as a manifestation of plasticity in the primary visual cortex, perceptual learning is more readily understood as improvements in the function of brain networks that integrate processes, including sensory representations, decision, attention, and reward, and balance plasticity with system stability. This review considers the primary phenomena of perceptual learning, theories of perceptual learning, and perceptual learning's effect on signal and noise in visual processing and decision. Models, especially computational models, play a key role in behavioral and physiological investigations of the mechanisms of perceptual learning and for understanding, predicting, and optimizing human perceptual processes, learning, and performance. Performance improvements resulting from reweighting or readout of sensory inputs to decision provide a strong theoretical framework for interpreting perceptual learning and transfer that may prove useful in optimizing learning in real-world applications.
Ethorobotics: A New Approach to Human-Robot Relationship
Miklósi, Ádám; Korondi, Péter; Matellán, Vicente; Gácsi, Márta
2017-01-01
Here we aim to lay the theoretical foundations of human-robot relationship drawing upon insights from disciplines that govern relevant human behaviors: ecology and ethology. We show how the paradox of the so called “uncanny valley hypothesis” can be solved by applying the “niche” concept to social robots, and relying on the natural behavior of humans. Instead of striving to build human-like social robots, engineers should construct robots that are able to maximize their performance in their niche (being optimal for some specific functions), and if they are endowed with an appropriate form of social competence then humans will eventually interact with them independently of their embodiment. This new discipline, which we call ethorobotics, could change social robotics, giving a boost to new technical approaches and applications. PMID:28649213
Motor unit recruitment by size does not provide functional advantages for motor performance
Dideriksen, Jakob L; Farina, Dario
2013-01-01
It is commonly assumed that the orderly recruitment of motor units by size provides a functional advantage for the performance of movements compared with a random recruitment order. On the other hand, the excitability of a motor neuron depends on its size and this is intrinsically linked to its innervation number. A range of innervation numbers among motor neurons corresponds to a range of sizes and thus to a range of excitabilities ordered by size. Therefore, if the excitation drive is similar among motor neurons, the recruitment by size is inevitably due to the intrinsic properties of motor neurons and may not have arisen to meet functional demands. In this view, we tested the assumption that orderly recruitment is necessarily beneficial by determining if this type of recruitment produces optimal motor output. Using evolutionary algorithms and without any a priori assumptions, the parameters of neuromuscular models were optimized with respect to several criteria for motor performance. Interestingly, the optimized model parameters matched well known neuromuscular properties, but none of the optimization criteria determined a consistent recruitment order by size unless this was imposed by an association between motor neuron size and excitability. Further, when the association between size and excitability was imposed, the resultant model of recruitment did not improve the motor performance with respect to the absence of orderly recruitment. A consistent observation was that optimal solutions for a variety of criteria of motor performance always required a broad range of innervation numbers in the population of motor neurons, skewed towards the small values. These results indicate that orderly recruitment of motor units in itself does not provide substantial functional advantages for motor control. Rather, the reason for its near-universal presence in human movements is that motor functions are optimized by a broad range of innervation numbers. PMID:24144879
Leveraging Human Insights by Combining Multi-Objective Optimization with Interactive Evolution
2015-03-26
…application, a program that used human selections to guide the evolution of insect-like images. He was able to demonstrate that humans provide key insights… Thesis by Joshua R. Christman, Second Lieutenant, USAF, presented to the Faculty of the Department of Electrical and Computer Engineering.
Automation and Optimization of Multipulse Laser Zona Drilling of Mouse Embryos During Embryo Biopsy.
Wong, Christopher Yee; Mills, James K
2017-03-01
Laser zona drilling (LZD) is a required step in many embryonic surgical procedures, for example, assisted hatching and preimplantation genetic diagnosis. LZD involves the ablation of the zona pellucida (ZP) using a laser while minimizing potentially harmful thermal effects on critical internal cell structures. The objective was to develop a method for the automation and optimization of multipulse LZD, applied to cleavage-stage embryos. A two-stage optimization is used. The first stage uses computer vision algorithms to identify embryonic structures and determines the optimal ablation zone farthest away from critical structures such as blastomeres. The second stage combines a genetic algorithm with a previously reported thermal analysis of LZD to optimize the combination of laser pulse locations and pulse durations. The goal is to minimize the peak temperature experienced by the blastomeres while creating the desired opening in the ZP. A proof of concept of the proposed LZD automation and optimization method is demonstrated through experiments on mouse embryos with positive results, as adequately sized openings are created. Automation of LZD is feasible and is a viable step toward the automation of embryo biopsy procedures. LZD is a common but delicate procedure performed by human operators using subjective methods to gauge proper LZD technique. Automation of LZD removes human error and increases the success rate of LZD. Although the proposed methods are developed for cleavage-stage embryos, the same methods may be applied to most types of LZD procedures, embryos at different developmental stages, or nonembryonic cells.
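The geometric part of the first optimization stage can be sketched as follows, assuming the vision stage has already returned the zona pellucida as a fitted circle and the blastomere centroids as points; the coordinates below are invented, and the thermal/genetic-algorithm stage is not reproduced here.

```python
# Stage-1 sketch: pick the ablation site on the ZP circle farthest from all blastomeres.
# Geometry and units are hypothetical stand-ins for the vision-stage output.
import numpy as np

zp_center, zp_radius = np.array([0.0, 0.0]), 50.0                    # fitted ZP circle (µm, assumed)
blastomeres = np.array([[10.0, 12.0], [-8.0, 15.0], [5.0, -14.0]])   # hypothetical centroids

angles = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
candidates = zp_center + zp_radius * np.c_[np.cos(angles), np.sin(angles)]

# For each candidate site, take its distance to the nearest blastomere, then choose the
# site that maximizes that minimum distance (farthest from critical structures).
dists = np.linalg.norm(candidates[:, None, :] - blastomeres[None, :, :], axis=2)
best = candidates[np.argmax(dists.min(axis=1))]
print("suggested ablation site (x, y):", np.round(best, 1))
```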
Hunt, Megan M; Meng, Guoliang; Rancourt, Derrick E; Gates, Ian D; Kallos, Michael S
2014-01-01
Traditional optimization of culture parameters for the large-scale culture of human embryonic stem cells (ESCs) as aggregates is carried out in a stepwise manner whereby the effect of varying each culture parameter is investigated individually. However, as evidenced by the wide range of published protocols and culture performance indicators (growth rates, pluripotency marker expression, etc.), there is a lack of systematic investigation into the true effect of varying culture parameters especially with respect to potential interactions between culture variables. Here we describe the design and execution of a two-parameter, three-level (3²) factorial experiment resulting in nine conditions that were run in duplicate 125-mL stirred suspension bioreactors. The two parameters investigated here were inoculation density and agitation rate, which are easily controlled, but currently, poorly characterized. Cell readouts analyzed included fold expansion, maximum density, and exponential growth rate. Our results reveal that the choice of best case culture parameters was dependent on which cell property was chosen as the primary output variable. Subsequent statistical analyses via two-way analysis of variance indicated significant interaction effects between inoculation density and agitation rate specifically in the case of exponential growth rates. Results indicate that stepwise optimization has the potential to miss out on the true optimal case. In addition, choosing an optimum condition for a culture output of interest from the factorial design yielded similar results when repeated with the same cell line indicating reproducibility. We finally validated that human ESCs remain pluripotent in suspension culture as aggregates under our optimal conditions and maintain their differentiation capabilities as well as a stable karyotype and strong expression levels of specific human ESC markers over several passages in suspension bioreactors.
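A minimal sketch of this kind of 3² factorial analysis, on invented growth-rate data, might look like the following; it uses statsmodels' standard two-way ANOVA to test for a density-agitation interaction, but the factor levels and response values are hypothetical.

```python
# 3x3 factorial design with duplicate reactors and a two-way ANOVA (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
densities = [2e5, 5e5, 1e6]      # inoculation density, cells/mL (hypothetical levels)
agitations = [80, 100, 120]      # agitation rate, rpm (hypothetical levels)

rows = []
for d in densities:
    for a in agitations:
        for rep in range(2):     # duplicate bioreactors per condition
            # invented growth-rate response with a mild density x agitation interaction
            growth = 0.030 + 1e-8 * d + 2e-4 * a - 5e-11 * d * a + rng.normal(0, 0.0005)
            rows.append({"density": d, "agitation": a, "growth_rate": growth})
df = pd.DataFrame(rows)

model = smf.ols("growth_rate ~ C(density) * C(agitation)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction term
```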
EVA Suit R and D for Performance Optimization
NASA Technical Reports Server (NTRS)
Cowley, Matthew S.; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar
2014-01-01
Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for R&D are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques which focus on human-centric designs by creating virtual prototype simulations and fully adjustable physical prototypes of suit hardware. During the R&D design phase, these easily modifiable representations of an EVA suit's hard components will allow designers to think creatively and exhaust design possibilities before they build and test working prototypes with human subjects. It allows scientists to comprehensively benchmark current suit capabilities and limitations for existing suit sizes and sizes that do not exist. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process, enables the use of human performance as design criteria, and enables designs to target specific populations.
Wang, Huili; Gao, Ming; Wang, Mei; Zhang, Rongbo; Wang, Wenwei; Dahlgren, Randy A; Wang, Xuedong
2015-03-15
Herein, we developed a novel integrated device to perform phase separation based on ultrasound-assisted salt-induced liquid-liquid microextraction for determination of five fluoroquinolones (FQs) in human body fluids. The integrated device consisted of three simple HDPE components used to separate the extraction solvent from the aqueous phase prior to retrieving the extractant. A series of extraction parameters were optimized using the response surface method based on central composite design. Optimal conditions consisted of 945 μL acetone extraction solvent, pH 2.1, 4.1 min stir time, 5.9 g Na2SO4, and 4.0 min centrifugation. Under optimized conditions, the limits of detection (at S/N=3) were 0.12-0.66 μg L⁻¹, the linear range was 0.5-500 μg L⁻¹, and recoveries were 92.6-110.9% for the five FQs extracted from plasma and urine. The proposed method has several advantages, such as easy construction from inexpensive materials, high extraction efficiency, short extraction time, and compatibility with HPLC analysis. Thus, this method shows excellent prospects for sample pretreatment and analysis of FQs in human body fluids. Copyright © 2015 Elsevier B.V. All rights reserved.
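The response-surface step can be illustrated with a small sketch: a central composite design in coded units for two hypothetical factors, a quadratic fit by least squares, and a numerical search for the stationary point. The response function and noise level are invented and do not correspond to the paper's design or data.

```python
# Response-surface sketch: central composite design, quadratic fit, numerical optimum.
import numpy as np
from scipy.optimize import minimize

# CCD in coded units for two factors: factorial, axial, and center points.
alpha = np.sqrt(2.0)
design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                   [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha],
                   [0, 0], [0, 0], [0, 0]], dtype=float)

rng = np.random.default_rng(2)
def true_recovery(x):               # hypothetical recovery surface with an interior optimum
    v, s = x
    return 95.0 - 6.0 * (v - 0.3) ** 2 - 4.0 * (s + 0.2) ** 2 + 1.5 * v * s

y = np.array([true_recovery(x) for x in design]) + rng.normal(0, 0.5, len(design))

# Full quadratic model: 1, v, s, v^2, s^2, v*s
X = np.c_[np.ones(len(design)), design, design ** 2, design[:, 0] * design[:, 1]]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def predicted(x):
    v, s = x
    return beta @ np.array([1.0, v, s, v * v, s * s, v * s])

opt = minimize(lambda x: -predicted(x), x0=[0.0, 0.0])
print("optimum (coded units):", np.round(opt.x, 2),
      "predicted recovery:", round(float(predicted(opt.x)), 1))
```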
Pan-Zhou, Xin-Ru; Mayes, Benjamin A; Rashidzadeh, Hassan; Gasparac, Rahela; Smith, Steven; Bhadresa, Sanjeev; Gupta, Kusum; Cohen, Marita Larsson; Bu, Charlie; Good, Steven S; Moussa, Adel; Rush, Roger
2016-10-01
IDX184 is a phosphoramidate prodrug of 2'-methylguanosine-5'-monophosphate, developed to treat patients infected with hepatitis C virus. A mass balance study of radiolabeled IDX184 and pharmacokinetic studies of IDX184 in portal vein-cannulated monkeys revealed relatively low IDX184 absorption but higher exposure of IDX184 in the portal vein than in the systemic circulation, indicating >90 % of the absorbed dose was subject to hepatic extraction. Systemic exposures to the main metabolite, 2'-methylguanosine (2'-MeG), were used as a surrogate for liver levels of the pharmacologically active entity 2'-MeG triphosphate, and accordingly, systemic levels of 2'-MeG in the monkey were used to optimize formulations for further clinical development of IDX184. Capsule formulations of IDX184 delivered acceptable levels of 2'-MeG in humans; however, the encapsulation process introduced low levels of the genotoxic impurity ethylene sulfide (ES), which necessitated formulation optimization. Animal pharmacokinetic data guided the development of a tablet with trace levels of ES and pharmacokinetic performance equal to that of the clinical capsule in the monkey. Under fed conditions in humans, the new tablet formulation showed similar exposure to the capsule used in prior clinical trials.
Wilkinson, D S; Dilts, T J
1999-01-01
We believe the team approach to laboratory management achieves the best outcomes. Laboratory management requires the integration of medical, technical, and administrative expertise to achieve optimal service, quality, and cost performance. Usually, a management team of two or more individuals must be assembled to achieve all of these critical leadership functions. The individual members of the management team must possess the requisite expertise in clinical medicine, laboratory science, technology management, and administration. They also must work together in a unified and collaborative manner, regardless of where individual team members appear on the organizational chart. The management team members share in executing the entire human resource management life cycle, creating the proper environment to maximize human performance. Above all, the management team provides visionary and credible leadership.
SoMIR framework for designing high-NDBP photonic crystal waveguides.
Mirjalili, Seyed Mohammad
2014-06-20
This work proposes a modularized framework for designing the structure of photonic crystal waveguides (PCWs) and reducing human involvement during the design process. The proposed framework consists of three main modules: parameters module, constraints module, and optimizer module. The first module is responsible for defining the structural parameters of a given PCW. The second module defines various limitations in order to achieve desirable optimum designs. The third module is the optimizer, in which a numerical optimization method is employed to perform optimization. As case studies, two new structures called Ellipse PCW (EPCW) and Hypoellipse PCW (HPCW) with different shape of holes in each row are proposed and optimized by the framework. The calculation results show that the proposed framework is able to successfully optimize the structures of the new EPCW and HPCW. In addition, the results demonstrate the applicability of the proposed framework for optimizing different PCWs. The results of the comparative study show that the optimized EPCW and HPCW provide 18% and 9% significant improvements in normalized delay-bandwidth product (NDBP), respectively, compared to the ring-shape-hole PCW, which has the highest NDBP in the literature. Finally, the simulations of pulse propagation confirm the manufacturing feasibility of both optimized structures.
Optimal design and control of an electromechanical transfemoral prosthesis with energy regeneration.
Rohani, Farbod; Richter, Hanz; van den Bogert, Antonie J
2017-01-01
In this paper, we present the design of an electromechanical above-knee active prosthesis with energy storage and regeneration. The system consists of geared knee and ankle motors, parallel springs for each motor, an ultracapacitor, and controllable four-quadrant power converters. The goal is to maximize the performance of the system by finding optimal controls and design parameters. A model of the system dynamics was developed, and used to solve a combined trajectory and design optimization problem. The objectives of the optimization were to minimize tracking error relative to human joint motions, as well as energy use. The optimization problem was solved by the method of direct collocation, based on joint torque and joint angle data from ten subjects walking at three speeds. After optimization of controls and design parameters, the simulated system could operate at zero energy cost while still closely emulating able-bodied gait. This was achieved by controlled energy transfer between knee and ankle, and by controlled storage and release of energy throughout the gait cycle. Optimal gear ratios and spring parameters were similar across subjects and walking speeds.
Quality of clinical brain tumor MR spectra judged by humans and machine learning tools.
Kyathanahally, Sreenath P; Mocioiu, Victor; Pedrosa de Barros, Nuno; Slotboom, Johannes; Wright, Alan J; Julià-Sapé, Margarida; Arús, Carles; Kreis, Roland
2018-05-01
To investigate and compare human judgment and machine learning tools for quality assessment of clinical MR spectra of brain tumors. A very large set of 2574 single voxel spectra with short and long echo time from the eTUMOUR and INTERPRET databases were used for this analysis. Original human quality ratings from these studies as well as new human guidelines were used to train different machine learning algorithms for automatic quality control (AQC) based on various feature extraction methods and classification tools. The performance was compared with variance in human judgment. AQC built using the RUSBoost classifier that combats imbalanced training data performed best. When furnished with a large range of spectral and derived features where the most crucial ones had been selected by the TreeBagger algorithm it showed better specificity (98%) in judging spectra from an independent test-set than previously published methods. Optimal performance was reached with a virtual three-class ranking system. Our results suggest that feature space should be relatively large for the case of MR tumor spectra and that three-class labels may be beneficial for AQC. The best AQC algorithm showed a performance in rejecting spectra that was comparable to that of a panel of human expert spectroscopists. Magn Reson Med 79:2500-2510, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
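A sketch of the core classification step, boosting with random under-sampling on imbalanced quality labels, is given below using the RUSBoostClassifier from the imbalanced-learn package; the synthetic features and class ratio are stand-ins for the real spectral features and expert ratings.

```python
# Imbalanced "spectrum quality" classification with RUSBoost on synthetic features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix
from imblearn.ensemble import RUSBoostClassifier

# ~10% "bad quality" spectra, 40 derived features (both numbers are assumptions).
X, y = make_classification(n_samples=2000, n_features=40, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = RUSBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"specificity = {tn / (tn + fp):.2f}, sensitivity = {tp / (tp + fn):.2f}")
```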
National Space Biomedical Research Institute (NSBRI) JSC Summer Projects
NASA Technical Reports Server (NTRS)
Dowdy, Forrest Ryan
2014-01-01
This project optimized the calorie content in a breakfast meal replacement bar for the Advanced Food Technology group. Use of multivariable optimization yielded the highest weight savings possible while simultaneously matching NASA Human Standards nutritional guidelines. The scope of this research included the study of shelf-life indicators such as water activity, moisture content, and texture analysis. Key metrics indicate higher protein content, higher caloric density, and greater mass savings as a result of the reformulation process. The optimization performed for this study demonstrated wide application to other food bars in the Advanced Food Technology portfolio. Recommendations for future work include shelf life studies on bar hardening and overall acceptability data over increased time frames and temperature fluctuation scenarios.
Modifications to Optimize the AH-1Z Human Machine Interface
2013-04-18
…accomplish this, a complete workload study of tasks performed by aircrew in the AH-1Z must be completed in the near future in order to understand design flaws and guide future design and integration of increased capability. Additionally, employment of material solutions to provide aircrew with the…
A Report on Applying EEGnet to Discriminate Human State Effects on Task Performance
2018-01-01
…whether we could identify what task the participant was performing from differences in the recorded brain time series. We modeled the relationship between input data (brain time series) and output labels (task A and task B) as an unknown function, and we found an optimal approximation of that…
Proprioceptive isokinetic exercise test
NASA Technical Reports Server (NTRS)
Dempster, P. T.; Bernauer, E. M.; Bond, M.; Greenleaf, J. E.
1993-01-01
Proprioception, the reception of stimuli within the body that indicates position, is an important mechanism for optimal human performance. People exposed to prolonged bed rest, microgravity, or other deconditioning situations usually experience reduced proprioceptor and kinesthetic stimuli that compromise body balance, posture, and equilibrium. A new proprioceptive test is described that utilizes the computer-driven LIDO isokinetic ergometer. An overview of the computer logic, software, and testing procedure for this proprioceptive test, which can be performed with the arms or legs, is described.
Janssen, Christian P; Brumby, Duncan P; Dowell, John; Chater, Nick; Howes, Andrew
2011-01-01
We report the results of a dual-task study in which participants performed a tracking and typing task under various experimental conditions. An objective payoff function was used to provide explicit feedback on how participants should trade off performance between the tasks. Results show that participants' dual-task interleaving strategy was sensitive to changes in the difficulty of the tracking task and resulted in differences in overall task performance. To test the hypothesis that people select strategies that maximize payoff, a Cognitively Bounded Rational Analysis model was developed. This analysis evaluated a variety of dual-task interleaving strategies to identify the optimal strategy for maximizing payoff in each condition. The model predicts that the region of optimum performance is different between experimental conditions. The correspondence between human data and the prediction of the optimal strategy is found to be remarkably high across a number of performance measures. This suggests that participants were honing their behavior to maximize payoff. Limitations are discussed. Copyright © 2011 Cognitive Science Society, Inc.
Peh, Gary S L; Toh, Kah-Peng; Ang, Heng-Pei; Seah, Xin-Yi; George, Benjamin L; Mehta, Jodhbir S
2013-05-03
Global shortage of donor corneas greatly restricts the numbers of corneal transplantations performed yearly. Limited ex vivo expansion of primary human corneal endothelial cells is possible, and a considerable clinical interest exists for development of tissue-engineered constructs using cultivated corneal endothelial cells. The objective of this study was to investigate the density-dependent growth of human corneal endothelial cells isolated from paired donor corneas and to elucidate an optimal seeding density for their extended expansion in vitro whilst maintaining their unique cellular morphology. Established primary human corneal endothelial cells were propagated to the second passage (P2) before they were utilized for this study. Confluent P2 cells were dissociated and seeded at four seeding densities: 2,500 cells per cm2 ('LOW'); 5,000 cells per cm2 ('MID'); 10,000 cells per cm2 ('HIGH'); and 20,000 cells per cm2 ('HIGH(×2)'), and subsequently analyzed for their propensity to proliferate. They were also subjected to morphometric analyses comparing cell sizes, coefficient of variance, as well as cell circularity when each culture became confluent. At the two lower densities, proliferation rates were higher than cells seeded at higher densities, though not statistically significant. However, corneal endothelial cells seeded at lower densities were significantly larger in size, heterogeneous in shape and less circular (fibroblastic-like), and remained hypertrophic after one month in culture. Comparatively, cells seeded at higher densities were significantly homogeneous, compact and circular at confluence. Potentially, at an optimal seeding density of 10,000 cells per cm2, it is possible to obtain between 10 million to 25 million cells at the third passage. More importantly, these expanded human corneal endothelial cells retained their unique cellular morphology. Our results demonstrated a density dependency in the culture of primary human corneal endothelial cells. Sub-optimal seeding density results in a decrease in cell saturation density, as well as a loss in their proliferative potential. As such, we propose a seeding density of not less than 10,000 cells per cm2 for regular passage of primary human corneal endothelial cells.
Narang, Yashraj S; Murthy Arelekatti, V N; Winter, Amos G
2016-12-01
Our research aims to design low-cost, high-performance, passive prosthetic knees for developing countries. In this study, we determine optimal stiffness, damping, and engagement parameters for a low-cost, passive prosthetic knee that consists of simple mechanical elements and may enable users to walk with the normative kinematics of able-bodied humans. Knee joint power was analyzed to divide gait into energy-based phases and select mechanical components for each phase. The behavior of each component was described with a polynomial function, and the coefficients and polynomial order of each function were optimized to reproduce the knee moments required for normative kinematics of able-bodied humans. Sensitivity of coefficients to prosthesis mass was also investigated. The knee moments required for prosthesis users to walk with able-bodied normative kinematics were accurately reproduced with a mechanical system consisting of a linear spring, two constant-friction dampers, and three clutches (R² = 0.90 for a typical prosthetic leg). Alterations in upper leg, lower leg, and foot mass had a large influence on optimal coefficients, changing damping coefficients by up to 180%. Critical results are reported through parametric illustrations that can be used by designers of prostheses to select optimal components for a prosthetic knee based on the inertial properties of the amputee and his or her prosthetic leg.
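A greatly simplified version of the fitting idea, regressing a linear spring and a viscous damper against a reference knee moment over one gait cycle, is sketched below; the sinusoidal "normative" data are synthetic, and the published work used constant-friction dampers, clutches, and phase-specific optimization rather than this single linear fit.

```python
# Fit spring stiffness and damping so that M ≈ -k*(theta - theta0) - b*omega (toy data).
import numpy as np

t = np.linspace(0.0, 1.0, 101)                                      # one normalized gait cycle
theta = np.deg2rad(30.0) * (1.0 - np.cos(2.0 * np.pi * t)) / 2.0    # knee flexion angle (rad)
omega = np.gradient(theta, t)                                       # angular velocity
moment_ref = (-40.0 * (theta - np.deg2rad(10.0)) - 1.5 * omega
              + np.random.default_rng(3).normal(0, 0.5, t.size))    # synthetic reference moment

# Linear model: M = c0 + c1*theta + c2*omega  ->  stiffness k = -c1, damping b = -c2.
A = np.c_[np.ones_like(t), theta, omega]
c, *_ = np.linalg.lstsq(A, moment_ref, rcond=None)
k, b = -c[1], -c[2]
r2 = 1.0 - np.sum((A @ c - moment_ref) ** 2) / np.sum((moment_ref - moment_ref.mean()) ** 2)
print(f"fitted stiffness ≈ {k:.1f} N·m/rad, damping ≈ {b:.2f} N·m·s/rad, R² ≈ {r2:.2f}")
```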
Using advanced computer vision algorithms on small mobile robots
NASA Astrophysics Data System (ADS)
Kogut, G.; Birchmore, F.; Biagtan Pacis, E.; Everett, H. R.
2006-05-01
The Technology Transfer project employs a spiral development process to enhance the functionality and autonomy of mobile robot systems in the Joint Robotics Program (JRP) Robotic Systems Pool by converging existing component technologies onto a transition platform for optimization. An example of this approach is the implementation of advanced computer vision algorithms on small mobile robots. We demonstrate the implementation and testing of the following two algorithms useful on mobile robots: 1) object classification using a boosted Cascade of classifiers trained with the Adaboost training algorithm, and 2) human presence detection from a moving platform. Object classification is performed with an Adaboost training system developed at the University of California, San Diego (UCSD) Computer Vision Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working towards a solution to increase the robustness of this system to perform generic object recognition, this paper demonstrates an extension to this application by detecting soda cans in a cluttered indoor environment. The human presence detection from a moving platform system uses a data fusion algorithm which combines results from a scanning laser and a thermal imager. The system is able to detect the presence of humans while both the humans and the robot are moving simultaneously. In both systems, the two aforementioned algorithms were implemented on embedded hardware and optimized for use in real-time. Test results are shown for a variety of environments.
Physiologically based Pharmacokinetic Modeling of 1,4-Dioxane in Rats, Mice, and Humans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweeney, Lisa M.; Thrall, Karla D.; Poet, Torka S.
2008-01-01
1,4-Dioxane (CAS No. 123-91-1) is used primarily as a solvent or as a solvent stabilizer. It can cause lung, liver and kidney damage at sufficiently high exposure levels. Two physiologically based pharmacokinetic (PBPK) models of 1,4-dioxane and its major metabolite, hydroxyethoxyacetic acid (HEAA), were published in 1990. These models have uncertainties and deficiencies that could be addressed and the model strengthened for use in a contemporary cancer risk assessment for 1,4-dioxane. Studies were performed to fill data gaps and reduce uncertainties pertaining to the pharmacokinetics of 1,4-dioxane and HEAA in rats, mice, and humans. Three types of studies were performed: partition coefficient measurements, blood time course in mice, and in vitro pharmacokinetics using rat, mouse, and human hepatocytes. Updated PBPK models were developed based on these new data and previously available data. The optimized rate of metabolism for the mouse was significantly higher than the value previously estimated. The optimized rat kinetic parameters were similar to those in the 1990 models. Only two human studies were identified. Model predictions were consistent with one study, but did not fit the second as well. In addition, a rat nasal exposure study was completed. The results confirmed that water directly contacts rat nasal tissues during drinking-water bioassays. Consistent with previous PBPK models, nasal tissues were not specifically included in the model. Use of these models will reduce the uncertainty in future 1,4-dioxane risk assessments.
Li, Zhenjiang; Wang, Bin; Ge, Shufang; Yan, Lailai; Liu, Yingying; Li, Zhiwen; Ren, Aiguo
2016-12-01
Polycyclic aromatic hydrocarbons (PAHs), nicotine, cotinine, and metals in human hair have been used as important environmental exposure markers. We aimed to develop a simple method to simultaneously analyze these pollutants using a small quantity of hair. The digestion performances of tetramethylammonium hydroxide (TMAH) and sodium hydroxide (NaOH) for human hair were compared. Various solvents or their mixtures, including n-hexane (HEX), dichloromethane (DCM) and trichloromethane (TCM), HEX:DCM32 (3/2) and HEX:TCM73 (7/3), were adopted to extract organics. The recoveries of metals were determined under an optimal operation of digestion and extraction. Our results showed that TMAH performed well in dissolving human hair, even better than NaOH. Overall, the recoveries with the five solutions were acceptable for PAHs and nicotine, in the range of 80%-110%. Except for HEX, the other four extraction solutions had acceptable extraction efficiency for cotinine, ranging from HEX:TCM73 (88 ± 4.1%) to HEX:DCM32 (100 ± 2.8%). HEX:DCM32 was chosen as the optimal solvent in consideration of its extraction efficiency and lower density than water. The recoveries of 12 typical major or trace metals were mainly in the range of 90%-110%, and some of them were close to 100%. In conclusion, the simultaneous analysis of PAHs, nicotine, cotinine, and metals was feasible. Our study provided a simple and low-cost technique for environmental epidemiological studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cankorur-Cetinkaya, Ayca; Dias, Joao M L; Kludas, Jana; Slater, Nigel K H; Rousu, Juho; Oliver, Stephen G; Dikicioglu, Duygu
2017-06-01
Multiple interacting factors affect the performance of engineered biological systems in synthetic biology projects. The complexity of these biological systems means that experimental design should often be treated as a multiparametric optimization problem. However, the available methodologies are either impractical, due to a combinatorial explosion in the number of experiments to be performed, or are inaccessible to most experimentalists due to the lack of publicly available, user-friendly software. Although evolutionary algorithms may be employed as alternative approaches to optimize experimental design, the lack of simple-to-use software again restricts their use to specialist practitioners. In addition, the lack of subsidiary approaches to further investigate critical factors and their interactions prevents the full analysis and exploitation of the biotechnological system. We have addressed these problems and, here, provide a simple-to-use and freely available graphical user interface to empower a broad range of experimental biologists to employ complex evolutionary algorithms to optimize their experimental designs. Our approach exploits a Genetic Algorithm to discover the subspace containing the optimal combination of parameters, and Symbolic Regression to construct a model to evaluate the sensitivity of the experiment to each parameter under investigation. We demonstrate the utility of this method using an example in which the culture conditions for the microbial production of a bioactive human protein are optimized. CamOptimus is available through: (https://doi.org/10.17863/CAM.10257).
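A compact sketch in the spirit of such a tool is shown below: a small genetic algorithm searches two hypothetical culture parameters for maximum yield, and a one-at-a-time finite-difference check stands in for the symbolic-regression sensitivity analysis; the yield surface and parameter ranges are invented.

```python
# Genetic-algorithm search over two hypothetical culture parameters, then a crude
# sensitivity check around the optimum (stand-in for symbolic regression).
import numpy as np

rng = np.random.default_rng(4)
BOUNDS = np.array([[25.0, 37.0],     # temperature, °C (assumed range)
                   [0.0, 1.0]])      # inducer concentration, mM (assumed range)

def yield_model(x):                  # hypothetical protein yield (mg/L)
    temp, ind = x
    return 120.0 - 0.8 * (temp - 30.0) ** 2 - 150.0 * (ind - 0.4) ** 2 + 2.0 * temp * ind

def clip(pop):
    return np.clip(pop, BOUNDS[:, 0], BOUNDS[:, 1])

pop = clip(rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(30, 2)))
for gen in range(60):
    fit = np.array([yield_model(x) for x in pop])
    parents = pop[np.argsort(fit)[-10:]]                          # truncation selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, [0.3, 0.02], (30, 2))
    pop = clip(children)                                          # mutation + bounds

best = pop[np.argmax([yield_model(x) for x in pop])]
for name, i, step in [("temperature", 0, 0.5), ("inducer", 1, 0.05)]:
    d = np.zeros(2); d[i] = step
    sens = (yield_model(best + d) - yield_model(best - d)) / (2 * step)
    print(f"{name}: optimum ≈ {best[i]:.2f}, local sensitivity ≈ {sens:.2f} mg/L per unit")
print(f"predicted yield ≈ {yield_model(best):.1f} mg/L")
```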
Optimization of cryoprotectant loading into murine and human oocytes.
Karlsson, Jens O M; Szurek, Edyta A; Higgins, Adam Z; Lee, Sang R; Eroglu, Ali
2014-02-01
Loading of cryoprotectants into oocytes is an important step of the cryopreservation process, in which the cells are exposed to potentially damaging osmotic stresses and chemical toxicity. Thus, we investigated the use of physics-based mathematical optimization to guide design of cryoprotectant loading methods for mouse and human oocytes. We first examined loading of 1.5 M dimethyl sulfoxide (Me2SO) into mouse oocytes at 23°C. Conventional one-step loading resulted in rates of fertilization (34%) and embryonic development (60%) that were significantly lower than those of untreated controls (95% and 94%, respectively). In contrast, the mathematically optimized two-step method yielded much higher rates of fertilization (85%) and development (87%). To examine the causes for oocyte damage, we performed experiments to separate the effects of cell shrinkage and Me2SO exposure time, revealing that neither shrinkage nor Me2SO exposure single-handedly impairs the fertilization and development rates. Thus, damage during one-step Me2SO addition appears to result from interactions between the effects of Me2SO toxicity and osmotic stress. We also investigated Me2SO loading into mouse oocytes at 30°C. At this temperature, fertilization rates were again lower after one-step loading (8%) in comparison to mathematically optimized two-step loading (86%) and untreated controls (96%). Furthermore, our computer algorithm generated an effective strategy for reducing Me2SO exposure time, using hypotonic diluents for cryoprotectant solutions. With this technique, 1.5 M Me2SO was successfully loaded in only 2.5 min, with 92% fertilizability. Based on these promising results, we propose new methods to load cryoprotectants into human oocytes, designed using our mathematical optimization approach. Copyright © 2013 Elsevier Inc. All rights reserved.
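A toy membrane-transport sketch of the kind of model behind this optimization is given below; it compares the osmotic excursion under one-step and two-step loading, with lumped permeability constants, an assumed inactive volume fraction, and an illustrative two-step schedule that are not taken from the paper.

```python
# Toy two-parameter transport model: water leaves down the osmotic gradient while the
# permeating CPA slowly enters; staged loading reduces the minimum cell volume reached.
# All constants and the step schedule are invented for illustration.
import numpy as np

C_ISO = 0.29         # isotonic osmolality, osmol/L
VB = 0.2             # osmotically inactive volume fraction (assumed)
LP, PS = 1.0, 0.8    # lumped water/CPA permeability constants (arbitrary units)
DT, T_END = 0.001, 10.0

def simulate(schedule):
    """schedule: list of (duration, external CPA osmolality). Returns min normalized volume."""
    vw, ns = 1.0 - VB, 0.0              # water volume fraction and intracellular CPA (per V0)
    nn = C_ISO * vw                     # non-permeating intracellular osmoles (fixed)
    v_min, t, seg, seg_t = 1.0, 0.0, 0, 0.0
    while t < T_END and seg < len(schedule):
        dur, cpa_ext = schedule[seg]
        m_ext, m_int = C_ISO + cpa_ext, (nn + ns) / vw
        vw += -LP * (m_ext - m_int) * DT            # water leaves when outside is hypertonic
        ns += PS * (cpa_ext - ns / vw) * DT         # CPA permeates down its gradient
        v_min = min(v_min, VB + vw + 0.07 * ns)     # total volume; 0.07 ~ assumed CPA partial volume
        t += DT; seg_t += DT
        if seg_t >= dur:
            seg, seg_t = seg + 1, 0.0
    return v_min

one_step = [(10.0, 1.5)]                  # straight to 1.5 osmol/L CPA
two_step = [(3.0, 0.75), (7.0, 1.5)]      # staged loading (illustrative split, not the paper's)
print("min normalized volume, one-step :", round(simulate(one_step), 3))
print("min normalized volume, two-step :", round(simulate(two_step), 3))
```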
Detection of fatigue cracks by nondestructive testing methods
NASA Technical Reports Server (NTRS)
Anderson, R. T.; Delacy, T. J.; Stewart, R. C.
1973-01-01
The effectiveness of various NDT methods in detecting small, tight cracks was assessed by randomly introducing fatigue cracks into aluminum sheets. The study included optimizing the NDT methods, calibrating NDT equipment with fatigue-cracked standards, and evaluating a number of cracked specimens by the optimized NDT methods. The evaluations were conducted by highly trained personnel, provided with detailed procedures, in order to minimize the effects of human variability. These personnel performed the NDT on the test specimens without knowledge of the flaw locations and reported on the flaws detected. The performance of these tests was measured by comparing the flaws detected against the flaws present. The principal NDT methods utilized were radiographic, ultrasonic, penetrant, and eddy current. Holographic interferometry, acoustic emission monitoring, and replication methods were also applied on a reduced number of specimens. Generally, the best performance was shown by eddy current, ultrasonic, penetrant and holographic tests. Etching provided no measurable improvement, while proof loading improved flaw detectability. Data are shown that quantify the performances of the NDT methods applied.
A global evolutionary and metabolic analysis of human obesity gene risk variants.
Castillo, Joseph J; Hazlett, Zachary S; Orlando, Robert A; Garver, William S
2017-09-05
It is generally accepted that the selection of gene variants during human evolution optimized energy metabolism that now interacts with our obesogenic environment to increase the prevalence of obesity. The purpose of this study was to perform a global evolutionary and metabolic analysis of human obesity gene risk variants (110 human obesity genes with 127 nearest gene risk variants) identified using genome-wide association studies (GWAS) to enhance our knowledge of early and late genotypes. As a result of determining the mean frequency of these obesity gene risk variants in 13 available populations from around the world, our results provide evidence for the early selection of ancestral risk variants (defined as selection before migration from Africa) and late selection of derived risk variants (defined as selection after migration from Africa). Our results also provide novel information for association of these obesity genes or encoded proteins with diverse metabolic pathways and other human diseases. The overall results indicate a significant differential evolutionary pattern for the selection of obesity gene ancestral and derived risk variants proposed to optimize energy metabolism in varying global environments and complex association with metabolic pathways and other human diseases. These results are consistent with obesity genes that encode proteins possessing a fundamental role in maintaining energy metabolism and survival during the course of human evolution. Copyright © 2017. Published by Elsevier B.V.
Stochastic resonance in attention control
NASA Astrophysics Data System (ADS)
Kitajo, K.; Yamanaka, K.; Ward, L. M.; Yamamoto, Y.
2006-12-01
We investigated the beneficial role of noise in a human higher brain function, namely visual attention control. We asked subjects to detect a weak gray-level target inside a marker box either in the left or the right visual field. Signal detection performance was optimized by presenting a low level of randomly flickering gray-level noise between and outside the two possible target locations. Further, we found that an increase in eye movement (saccade) rate helped to compensate for the usual deterioration in detection performance at higher noise levels. To our knowledge, this is the first experimental evidence that noise can optimize a higher brain function which involves distinct brain regions above the level of primary sensory systems -- switching behavior between multi-stable attention states -- via the mechanism of stochastic resonance.
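The basic stochastic-resonance effect can be reproduced with a textbook toy model, shown below: a subthreshold signal crosses a hard detection threshold only with the help of added noise, so performance first rises and then falls as noise increases; all numbers are arbitrary and the sketch is not a model of the reported experiment.

```python
# Stochastic-resonance toy: detection of a subthreshold signal improves at intermediate noise.
import numpy as np

rng = np.random.default_rng(5)
THRESHOLD, SIGNAL = 1.0, 0.8          # detector threshold and subthreshold signal amplitude
N_TRIALS = 20000

for sigma in [0.0, 0.05, 0.15, 0.3, 0.6, 1.2]:
    noise = rng.normal(0.0, sigma, N_TRIALS)
    hits = np.mean(SIGNAL + noise > THRESHOLD)          # signal-present trials detected
    false_alarms = np.mean(noise > THRESHOLD)           # noise-only trials "detected"
    print(f"noise sd={sigma:4.2f}  hit rate={hits:.2f}  false alarms={false_alarms:.2f}  "
          f"discriminability={hits - false_alarms:+.2f}")
```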
The Astronaut-Athlete: Optimizing Human Performance in Space.
Hackney, Kyle J; Scott, Jessica M; Hanson, Andrea M; English, Kirk L; Downs, Meghan E; Ploutz-Snyder, Lori L
2015-12-01
It is well known that long-duration spaceflight results in deconditioning of neuromuscular and cardiovascular systems, leading to a decline in physical fitness. On reloading in gravitational environments, reduced fitness (e.g., aerobic capacity, muscular strength, and endurance) could impair human performance, mission success, and crew safety. The level of fitness necessary for the performance of routine and off-nominal terrestrial mission tasks remains an unanswered and pressing question for scientists and flight physicians. To mitigate fitness loss during spaceflight, resistance and aerobic exercise are the most effective countermeasures available to astronauts. Currently, 2.5 h·d⁻¹, 6-7 d·wk⁻¹, is allotted in crew schedules for exercise to be performed on highly specialized hardware on the International Space Station (ISS). Exercise hardware provides up to 273 kg of loading capability for resistance exercise, treadmill speeds between 0.44 and 5.5 m·s⁻¹, and cycle workloads from 0 to 350 W. Compared to ISS missions, future missions beyond low earth orbit will likely be accomplished with less vehicle volume and power allocated for exercise hardware. Concomitant factors, such as diet and age, will also affect the physiologic responses to exercise training (e.g., anabolic resistance) in the space environment. Research into the potential optimization of exercise countermeasures through use of dietary supplementation and pharmaceuticals may assist in reducing physiological deconditioning during long-duration spaceflight and has the potential to enhance performance of occupationally related astronaut tasks (e.g., extravehicular activity, habitat construction, equipment repairs, planetary exploration, and emergency response).
Automated Speech Rate Measurement in Dysarthria.
Martens, Heidi; Dekens, Tomas; Van Nuffelen, Gwen; Latacz, Lukas; Verhelst, Werner; De Bodt, Marc
2015-06-01
In this study, a new algorithm for automated determination of speech rate (SR) in dysarthric speech is evaluated. We investigated how reliably the algorithm calculates the SR of dysarthric speech samples when compared with calculation performed by speech-language pathologists. The new algorithm was trained and tested using Dutch speech samples of 36 speakers with no history of speech impairment and 40 speakers with mild to moderate dysarthria. We tested the algorithm under various conditions: according to speech task type (sentence reading, passage reading, and storytelling) and algorithm optimization method (speaker group optimization and individual speaker optimization). Correlations between automated and human SR determination were calculated for each condition. High correlations between automated and human SR determination were found in the various testing conditions. The new algorithm measures SR in a sufficiently reliable manner. It is currently being integrated in a clinical software tool for assessing and managing prosody in dysarthric speech. Further research is needed to fine-tune the algorithm to severely dysarthric speech, to make the algorithm less sensitive to background noise, and to evaluate how the algorithm deals with syllabic consonants.
Abort Options for Human Missions to Earth-Moon Halo Orbits
NASA Technical Reports Server (NTRS)
Jesick, Mark C.
2013-01-01
Abort trajectories are optimized for human halo orbit missions about the translunar libration point (L2), with an emphasis on the use of free return trajectories. Optimal transfers from outbound free returns to L2 halo orbits are numerically optimized in the four-body ephemeris model. Circumlunar free returns are used for direct transfers, and cislunar free returns are used in combination with lunar gravity assists to reduce propulsive requirements. Trends in orbit insertion cost and flight time are documented across the southern L2 halo family as a function of halo orbit position and free return flight time. It is determined that the maximum amplitude southern halo incurs the lowest orbit insertion cost for direct transfers but the maximum cost for lunar gravity assist transfers. The minimum amplitude halo is the most expensive destination for direct transfers but the least expensive for lunar gravity assist transfers. The on-orbit abort costs for three halos are computed as a function of abort time and return time. Finally, an architecture analysis is performed to determine launch and on-orbit vehicle requirements for halo orbit missions.
Voss, Joel L; Warren, David E; Gonsalves, Brian D; Federmeier, Kara D; Tranel, Dan; Cohen, Neal J
2011-08-02
Effective exploratory behaviors involve continuous updating of sensory sampling to optimize the efficacy of information gathering. Despite some work on this issue in animals, little information exists regarding the cognitive or neural mechanisms for this sort of behavioral optimization in humans. Here we examined a visual exploration phenomenon that occurred when human subjects studying an array of objects spontaneously looked "backward" in their scanning paths to view recently seen objects again. This "spontaneous revisitation" of recently viewed objects was associated with enhanced hippocampal activity and superior subsequent memory performance in healthy participants, but occurred only rarely in amnesic patients with severe damage to the hippocampus. These findings demonstrate the necessity of the hippocampus not just in the aspects of long-term memory with which it has been associated previously, but also in the short-term adaptive control of behavior. Functional neuroimaging showed hippocampal engagement occurring in conjunction with frontocerebellar circuits, thereby revealing some of the larger brain circuitry essential for the strategic deployment of information-seeking behaviors that optimize learning.
NASA Technical Reports Server (NTRS)
Baldwin, Richard S.
2009-01-01
As NASA embarks on a renewed human presence in space, safe, human-rated, electrical energy storage and power generation technologies, which will be capable of demonstrating reliable performance in a variety of unique mission environments, will be required. To address the future performance and safety requirements for the energy storage technologies that will enhance and enable future NASA Constellation Program elements and other future aerospace missions, advanced rechargeable, lithium-ion battery technology development is being pursued with an emphasis on addressing performance technology gaps between state-of-the-art capabilities and critical future mission requirements. The material attributes and related performance of a lithium-ion cell's internal separator component are critical for achieving overall optimal performance, safety and reliability. This review provides an overview of the general types, material properties and the performance and safety characteristics of current separator materials employed in lithium-ion batteries, such as those materials that are being assessed and developed for future aerospace missions.
Analysis of short-chain fatty acids in human feces: A scoping review.
Primec, Maša; Mičetić-Turk, Dušanka; Langerholc, Tomaž
2017-06-01
Short-chain fatty acids (SCFAs) play a crucial role in maintaining homeostasis in humans, therefore the importance of a good and reliable SCFAs analytical detection has raised a lot in the past few years. The aim of this scoping review is to show the trends in the development of different methods of SCFAs analysis in feces, based on the literature published in the last eleven years in all major indexing databases. The search criteria included analytical quantification techniques of SCFAs in different human clinical and in vivo studies. SCFAs analysis is still predominantly performed using gas chromatography (GC), followed by high performance liquid chromatography (HPLC), nuclear magnetic resonance (NMR) and capillary electrophoresis (CE). Performances, drawbacks and advantages of these methods are discussed, especially in the light of choosing a proper pretreatment, as feces is a complex biological material. Further optimization to develop a simple, cost effective and robust method for routine use is needed. Copyright © 2017 Elsevier Inc. All rights reserved.
A HUMAN AUTOMATION INTERACTION CONCEPT FOR A SMALL MODULAR REACTOR CONTROL ROOM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Blanc, Katya; Spielman, Zach; Hill, Rachael
Many advanced nuclear power plant (NPP) designs incorporate higher degrees of automation than the existing fleet of NPPs. Automation is being introduced or proposed in NPPs through a wide variety of systems and technologies, such as advanced displays, computer-based procedures, advanced alarm systems, and computerized operator support systems. Additionally, many new reactor concepts, both full scale and small modular reactors, are proposing increased automation and reduced staffing as part of their concept of operations. However, research consistently finds that there is a fundamental tradeoff between system performance with increased automation and reduced human performance. There is a need to address the question of how to achieve high performance and efficiency of high levels of automation without degrading human performance. One example of a new NPP concept that will utilize greater degrees of automation is the SMR concept from NuScale Power. The NuScale Power design requires 12 modular units to be operated in one single control room, which leads to a need for higher degrees of automation in the control room. Idaho National Laboratory (INL) researchers and NuScale Power human factors and operations staff are working on a collaborative project to address the human performance challenges of increased automation and to determine the principles that lead to optimal performance in highly automated systems. This paper will describe this concept in detail and will describe an experimental test of the concept. The benefits and challenges of the approach will be discussed.
A Simple Artificial Life Model Explains Irrational Behavior in Human Decision-Making
Feher da Silva, Carolina; Baldo, Marcus Vinícius Chrysóstomo
2012-01-01
Although praised for their rationality, humans often make poor decisions, even in simple situations. In the repeated binary choice experiment, an individual has to choose repeatedly between the same two alternatives, where a reward is assigned to one of them with fixed probability. The optimal strategy is to perseverate with choosing the alternative with the best expected return. Whereas many species perseverate, humans tend to match the frequencies of their choices to the frequencies of the alternatives, a sub-optimal strategy known as probability matching. Our goal was to find the primary cognitive constraints under which a set of simple evolutionary rules can lead to such contrasting behaviors. We simulated the evolution of artificial populations, wherein the fitness of each animat (artificial animal) depended on its ability to predict the next element of a sequence made up of a repeating binary string of varying size. When the string was short relative to the animats’ neural capacity, they could learn it and correctly predict the next element of the sequence. When it was long, they could not learn it, turning to the next best option: to perseverate. Animats from the last generation then performed the task of predicting the next element of a non-periodical binary sequence. We found that, whereas animats with smaller neural capacity kept perseverating with the best alternative as before, animats with larger neural capacity, which had previously been able to learn the pattern of repeating strings, adopted probability matching, being outperformed by the perseverating animats. Our results demonstrate how the ability to make predictions in an environment endowed with regular patterns may lead to probability matching under less structured conditions. They point to probability matching as a likely by-product of adaptive cognitive strategies that were crucial in human evolution, but may lead to sub-optimal performances in other environments. PMID:22563454
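The cost of probability matching relative to perseveration can be made concrete with a short calculation: if the better alternative is rewarded with probability p, perseveration earns p per trial on average, while matching earns p² + (1-p)², which is strictly lower whenever p ≠ 0.5.

```python
# Expected per-trial reward: perseveration vs. probability matching, for several p values.
for p in (0.6, 0.7, 0.8, 0.9):
    perseverate = p
    match = p * p + (1 - p) * (1 - p)
    print(f"p={p:.1f}  perseverate={perseverate:.2f}  probability-match={match:.2f}  "
          f"cost of matching={perseverate - match:.2f}")
```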
Automated annotation of functional imaging experiments via multi-label classification
Turner, Matthew D.; Chakrabarti, Chayan; Jones, Thomas B.; Xu, Jiawei F.; Fox, Peter T.; Luger, George F.; Laird, Angela R.; Turner, Jessica A.
2013-01-01
Identifying the experimental methods in human neuroimaging papers is important for grouping meaningfully similar experiments for meta-analyses. Currently, this can only be done by human readers. We present the performance of common machine learning (text mining) methods applied to the problem of automatically classifying or labeling this literature. Labeling terms are from the Cognitive Paradigm Ontology (CogPO), the text corpora are abstracts of published functional neuroimaging papers, and the methods use the performance of a human expert as training data. We aim to replicate the expert's annotation of multiple labels per abstract identifying the experimental stimuli, cognitive paradigms, response types, and other relevant dimensions of the experiments. We use several standard machine learning methods: naive Bayes (NB), k-nearest neighbor, and support vector machines (specifically SMO or sequential minimal optimization). Exact match performance ranged from only 15% in the worst cases to 78% in the best cases. NB methods combined with binary relevance transformations performed strongly and were robust to overfitting. This collection of results demonstrates what can be achieved with off-the-shelf software components and little to no pre-processing of raw text. PMID:24409112
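A minimal sketch of the binary-relevance transformation with naive Bayes is given below; the three toy abstracts and CogPO-like labels are invented, and a real replication would use the published corpora and expert annotations.

```python
# Multi-label annotation via binary relevance (one naive Bayes classifier per label).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.preprocessing import MultiLabelBinarizer

abstracts = [
    "subjects viewed flickering checkerboards during fmri scanning",
    "participants pressed a button in response to auditory tones",
    "a visual n-back working memory task with button press responses",
]
labels = [["visual stimulus"], ["auditory stimulus", "button press"],
          ["visual stimulus", "button press"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)                         # one binary column per label
X = TfidfVectorizer().fit_transform(abstracts)

clf = OneVsRestClassifier(MultinomialNB()).fit(X, Y)  # binary relevance: one NB per label
pred = clf.predict(X)
print(dict(zip(mlb.classes_, pred[2])))               # labels predicted for the third abstract
```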
Natural Biology vs. Cultural Structures: Art and Child Development in Education
ERIC Educational Resources Information Center
Burrill, Rebecca R.
2005-01-01
Art in Education is generally considered a sideline, a secondary specialty, or something to be integrated into the primary curriculum, i.e. literacy and arts integration. But art is not secondary; it is primary in learning and in human development. Art is, in fact, necessary and optimal for learning to write, read, and perform arithmetic. This…
A Hybrid Approach for CpG Island Detection in the Human Genome.
Yang, Cheng-Hong; Lin, Yu-Da; Chiang, Yi-Cheng; Chuang, Li-Yeh
2016-01-01
CpG islands have been demonstrated to influence local chromatin structures and simplify the regulation of gene activity. However, the accurate and rapid determination of CpG islands for whole DNA sequences remains experimentally and computationally challenging. A novel procedure is proposed to detect CpG islands by combining clustering technology with a sliding-window, PSO-based method. Clustering technology is used to detect the locations of all possible CpG islands and process the data, thus effectively obviating the need for the extensive and unnecessary processing of DNA fragments and improving the efficiency of the sliding-window-based particle swarm optimization (PSO) search. This proposed approach, named ClusterPSO, provides versatile and highly sensitive detection of CpG islands in the human genome. In addition, the detection efficiency of ClusterPSO is compared with that of eight CpG island detection methods in the human genome. In this comparison, covering sensitivity, specificity, accuracy, performance coefficient (PC), and correlation coefficient (CC), ClusterPSO showed superior detection ability among all of the tested methods. Moreover, the combination of clustering technology and the PSO method can successfully overcome their respective drawbacks while maintaining their advantages. Thus, clustering technology could be hybridized with an optimization algorithm to optimize CpG island detection. The prediction accuracy of ClusterPSO was quite high, indicating that the combination of CpGcluster and PSO has several advantages over CpGcluster and PSO alone. In addition, ClusterPSO significantly reduced implementation time.
Stucki, Gerold; Grimby, Gunnar
2007-05-01
There is a need to organize rehabilitation and related research into distinct scientific fields in order to overcome the current limitations of rehabilitation research. Based on the general distinction in basic, applied and professional sciences applicable to research in general, and the rehabilitation relevant distinction between the comprehensive perspective based on WHO's integrative model of human functioning (ICF) and the partial perspective focusing on the biomedical aspects of functioning, it is possible to identify 5 distinct scientific fields of human functioning and rehabilitation research. These are the emerging human functioning sciences and integrative rehabilitation sciences from the comprehensive perspective, the established biosciences and biomedical rehabilitation sciences and engineering from the partial perspective, and the professional rehabilitation sciences at the cutting edge of research and practice. The human functioning sciences aim to understand human functioning and to identify targets for comprehensive interventions, with the goal of contributing to the minimization of the experience of disability in the population. The biosciences in rehabilitation aim to explain body injury and repair and to identify targets for biomedical interventions. The integrative rehabilitation sciences design and study comprehensive assessments and interventions that integrate biomedical, personal factor and environmental approaches suited to optimize people's performance. The biomedical rehabilitation sciences and engineering study diagnostic measures and interventions suitable to minimize impairment, including symptom control, and to optimize people's capacity. The professional rehabilitation sciences study how to provide best care with the goal of enabling people with health conditions experiencing or likely to experience disability to achieve and maintain optimal functioning in interaction with the environment. The organization of human functioning and rehabilitation research into the 5 distinct scientific fields facilitates the development of academic training programs and career building as well as the development of research structures dedicated to human functioning and rehabilitation research.
NASA Astrophysics Data System (ADS)
Charfi, Imen; Miteran, Johel; Dubois, Julien; Atri, Mohamed; Tourki, Rached
2013-10-01
We propose a supervised approach to detect falls in a home environment using an optimized descriptor adapted to real-time tasks. We introduce a realistic dataset of 222 videos, a new metric allowing evaluation of fall detection performance in a video stream, and an automatically optimized set of spatio-temporal descriptors which feeds a supervised classifier. We build the initial spatio-temporal descriptor, named STHF, using several combinations of transformations of geometrical features (height and width of the human body bounding box, the user's trajectory with her/his orientation, projection histograms, and moments of orders 0, 1, and 2). We study combinations of the usual transformations of the features (Fourier transform, wavelet transform, first and second derivatives), and we show experimentally that it is possible to achieve high performance using support vector machine and AdaBoost classifiers. Automatic feature selection shows that the best tradeoff between classification performance and processing time is obtained by combining the original low-level features with their first derivative. We then evaluate the robustness of the fall detection to location changes, and propose a realistic and pragmatic protocol that enables performance to be improved by updating the training in the current location with records of normal activities.
Magnetic Resonance Super-resolution Imaging Measurement with Dictionary-optimized Sparse Learning
NASA Astrophysics Data System (ADS)
Li, Jun-Bao; Liu, Jing; Pan, Jeng-Shyang; Yao, Hongxun
2017-06-01
Magnetic Resonance Super-resolution Imaging Measurement (MRIM) is an effective way of measuring materials. MRIM has wide applications in physics, chemistry, biology, geology, medicine and materials science, especially in medical diagnosis. It is feasible to improve the resolution of MR imaging by increasing radiation intensity, but high radiation intensity and prolonged exposure to the magnetic field harm the human body. Thus, in practical applications, hardware-based imaging resolution reaches its limit. Software-based super-resolution technology is an effective way to improve image resolution. This work proposes a framework for dictionary-optimized sparse-learning-based MR super-resolution. The framework addresses the problem of sample selection for dictionary learning in sparse reconstruction. A textural-complexity-based image quality representation is proposed to choose the optimal samples for dictionary learning. Comprehensive experiments show that dictionary-optimized sparse learning improves the performance of sparse representation.
Manavalan, Balachandran; Shin, Tae Hwan; Lee, Gwang
2018-01-05
DNase I hypersensitive sites (DHSs) are genomic regions that provide important information regarding the presence of transcriptional regulatory elements and the state of chromatin. Therefore, identifying DHSs in uncharacterized DNA sequences is crucial for understanding their biological functions and mechanisms. Although many experimental methods have been proposed to identify DHSs, they have proven to be expensive for genome-wide application. Therefore, it is necessary to develop computational methods for DHS prediction. In this study, we proposed a support vector machine (SVM)-based method for predicting DHSs, called DHSpred (DNase I Hypersensitive Site predictor in human DNA sequences), which was trained with 174 optimal features. The optimal combination of features was identified from a large set that included nucleotide composition and di- and trinucleotide physicochemical properties, using a random forest algorithm. DHSpred achieved a Matthews correlation coefficient and accuracy of 0.660 and 0.871, respectively, which were 3% higher than those of control SVM predictors trained with non-optimized features, indicating the efficiency of the feature selection method. Furthermore, the performance of DHSpred was superior to that of state-of-the-art predictors. An online prediction server has been developed to assist the scientific community, and is freely available at: http://www.thegleelab.org/DHSpred.html.
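The two-stage scheme described (random-forest feature ranking followed by an SVM on the retained features) can be sketched as below. The 300 × 500 feature matrix and labels are synthetic stand-ins for the nucleotide-composition and physicochemical descriptors, not the DHSpred data; only the choice of keeping 174 features is taken from the abstract.

```python
# Hedged sketch of random-forest feature selection followed by an SVM,
# on synthetic data (expect chance-level accuracy here).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 500))        # 300 sequences x 500 candidate features
y = rng.integers(0, 2, size=300)       # DHS (1) vs. non-DHS (0), synthetic

rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:174]   # keep 174 features, as in the abstract

svm = SVC(kernel="rbf", C=1.0)
print("CV accuracy on selected features:",
      cross_val_score(svm, X[:, top], y, cv=5).mean())
```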
Manavalan, Balachandran; Shin, Tae Hwan; Lee, Gwang
2018-01-01
DNase I hypersensitive sites (DHSs) are genomic regions that provide important information regarding the presence of transcriptional regulatory elements and the state of chromatin. Therefore, identifying DHSs in uncharacterized DNA sequences is crucial for understanding their biological functions and mechanisms. Although many experimental methods have been proposed to identify DHSs, they have proven to be expensive for genome-wide application. Therefore, it is necessary to develop computational methods for DHS prediction. In this study, we proposed a support vector machine (SVM)-based method for predicting DHSs, called DHSpred (DNase I Hypersensitive Site predictor in human DNA sequences), which was trained with 174 optimal features. The optimal combination of features was identified from a large set that included nucleotide composition and di- and trinucleotide physicochemical properties, using a random forest algorithm. DHSpred achieved a Matthews correlation coefficient and accuracy of 0.660 and 0.871, respectively, which were 3% higher than those of control SVM predictors trained with non-optimized features, indicating the efficiency of the feature selection method. Furthermore, the performance of DHSpred was superior to that of state-of-the-art predictors. An online prediction server has been developed to assist the scientific community, and is freely available at: http://www.thegleelab.org/DHSpred.html PMID:29416743
Assessment of mass detection performance in contrast enhanced digital mammography
NASA Astrophysics Data System (ADS)
Carton, Ann-Katherine; de Carvalho, Pablo M.; Li, Zhijin; Dromain, Clarisse; Muller, Serge
2015-03-01
Using simulated CESM images, we address the detectability of contrast-agent-enhancing masses in contrast-enhanced spectral mammography (CESM), a dual-energy technique providing functional projection images of breast tissue perfusion and vascularity. First, the realism of simulated CESM images from anthropomorphic breast software phantoms generated with a software X-ray imaging platform was validated. Breast texture was characterized by power-law coefficients calculated in data sets of real clinical and simulated images. We also performed a two-alternative forced-choice (2-AFC) psychophysical experiment whereby simulated and real images were presented side-by-side to an experienced radiologist to test whether real images could be distinguished from the simulated images. It was found that texture in our simulated CESM images has a fairly realistic appearance. Next, the relative performance of human readers and previously developed mathematical observers was assessed for the detection of iodine-enhancing mass lesions containing different contrast agent concentrations. A four-alternative forced-choice (4-AFC) task was designed; the task for the model and human observers was to detect which one of the four simulated dual-energy recombined images contained an iodine-enhancing mass. Our results showed that the NPW and NPWE models largely outperform human performance. After introduction of an internal noise component, both model observers approached human performance. The CHO observer performs slightly worse than the average human observer. There is still work to be done in improving model observers as predictors of human-observer performance. Larger trials could also improve our test statistics. We hope that in the future, this framework of software breast phantoms, virtual image acquisition and processing, and mathematical observers can be beneficial for optimizing CESM imaging techniques.
Modified optimal control pilot model for computer-aided design and analysis
NASA Technical Reports Server (NTRS)
Davidson, John B.; Schmidt, David K.
1992-01-01
This paper presents the theoretical development of a modified optimal control pilot model based upon the optimal control model (OCM) of the human operator developed by Kleinman, Baron, and Levison. This model is input compatible with the OCM and retains other key aspects of the OCM, such as a linear quadratic solution for the pilot gains with inclusion of control rate in the cost function, a Kalman estimator, and the ability to account for attention allocation and perception threshold effects. An algorithm designed for implementation in current dynamic systems analysis and design software is presented. Example results based upon the analysis of a tracking task using three basic dynamic systems are compared with measured results and with similar analyses performed with the OCM and two previously proposed simplified optimal pilot models. The pilot frequency responses and error statistics obtained with this modified optimal control model are shown to compare more favorably to the measured experimental results than the other previously proposed simplified models evaluated.
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
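To make the direct collocation idea concrete, the sketch below transcribes a toy one-degree-of-freedom minimum-effort problem into a nonlinear program with trapezoidal defect constraints. It uses plain SciPy rather than the OpenSim API or IPOPT, so it illustrates only the transcription, not the authors' framework; the analytic optimum (u(0) = 6, u(T) = -6) serves as a check.

```python
# Direct collocation (trapezoidal) sketch for a unit point mass moving from
# rest at q=0 to rest at q=1 in 1 s while minimizing the integral of u^2.
import numpy as np
from scipy.optimize import minimize

N, T = 25, 1.0
h = T / (N - 1)

def unpack(z):
    return z[:N], z[N:2 * N], z[2 * N:]        # position, velocity, control

def objective(z):                               # effort: trapezoidal integral of u^2
    _, _, u = unpack(z)
    return h * np.sum((u[:-1] ** 2 + u[1:] ** 2) / 2)

def defects(z):                                 # dynamics qdd = u as equality constraints
    q, qd, u = unpack(z)
    dq = q[1:] - q[:-1] - h / 2 * (qd[1:] + qd[:-1])
    dqd = qd[1:] - qd[:-1] - h / 2 * (u[1:] + u[:-1])
    return np.concatenate([dq, dqd])

def boundary(z):                                # start/end at rest, end at q=1
    q, qd, _ = unpack(z)
    return np.array([q[0], qd[0], q[-1] - 1.0, qd[-1]])

sol = minimize(objective, np.zeros(3 * N), method="SLSQP",
               constraints=[{"type": "eq", "fun": defects},
                            {"type": "eq", "fun": boundary}],
               options={"maxiter": 500})
q, qd, u = unpack(sol.x)
print(sol.success, round(u[0], 2), round(u[-1], 2))   # expect ~6 and ~-6
```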
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB
Lee, Leng-Feng
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184
Yandell, Matthew B; Quinlivan, Brendan T; Popov, Dmitry; Walsh, Conor; Zelik, Karl E
2017-05-18
Wearable assistive devices have demonstrated the potential to improve mobility outcomes for individuals with disabilities, and to augment healthy human performance; however, these benefits depend on how effectively power is transmitted from the device to the human user. Quantifying and understanding this power transmission is challenging due to complex human-device interface dynamics that occur as biological tissues and physical interface materials deform and displace under load, absorbing and returning power. Here we introduce a new methodology for quickly estimating interface power dynamics during movement tasks using common motion capture and force measurements, and then apply this method to quantify how a soft robotic ankle exosuit interacts with and transfers power to the human body during walking. We partition exosuit end-effector power (i.e., power output from the device) into power that augments ankle plantarflexion (termed augmentation power) vs. power that goes into deformation and motion of interface materials and underlying soft tissues (termed interface power). We provide empirical evidence of how human-exosuit interfaces absorb and return energy, reshaping exosuit-to-human power flow and resulting in three key consequences: (i) During exosuit loading (as applied forces increased), about 55% of exosuit end-effector power was absorbed into the interfaces. (ii) However, during subsequent exosuit unloading (as applied forces decreased) most of the absorbed interface power was returned viscoelastically. Consequently, the majority (about 75%) of exosuit end-effector work over each stride contributed to augmenting ankle plantarflexion. (iii) Ankle augmentation power (and work) was delayed relative to exosuit end-effector power, due to these interface energy absorption and return dynamics. Our findings elucidate the complexities of human-exosuit interface dynamics during transmission of power from assistive devices to the human body, and provide insight into improving the design and control of wearable robots. We conclude that in order to optimize the performance of wearable assistive devices it is important, throughout design and evaluation phases, to account for human-device interface dynamics that affect power transmission and thus human augmentation benefits.
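The power-partitioning idea can be sketched numerically under simplifying assumptions: suit tension times end-effector velocity gives end-effector power, the suit moment about the ankle times ankle angular velocity gives augmentation power, and the difference is interface power. The signals and the constant moment arm below are invented, not the study's gait data.

```python
# Toy partition of exosuit power into augmentation and interface components.
import numpy as np

t = np.linspace(0.0, 1.0, 1001)                        # one stride, s
dt = t[1] - t[0]
F = 300.0 * np.clip(np.sin(2 * np.pi * t), 0, None)    # suit tension, N (synthetic)
v_end = 0.25 * np.sin(2 * np.pi * t + 0.2)             # end-effector velocity, m/s
moment_arm = 0.05                                      # assumed constant, m
omega_ankle = 4.0 * np.sin(2 * np.pi * t)              # ankle angular velocity, rad/s

p_end = F * v_end                                      # exosuit end-effector power, W
p_aug = (F * moment_arm) * omega_ankle                 # ankle augmentation power, W
p_int = p_end - p_aug                                  # interface (tissue/strap) power, W

def stride_work(p):
    return float(np.sum(p) * dt)                       # rectangle-rule integral, J

print(stride_work(p_end), stride_work(p_aug), stride_work(p_int))
```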
Amoroso, N; Errico, R; Bruno, S; Chincarini, A; Garuccio, E; Sensi, F; Tangaro, S; Tateo, A; Bellotti, R
2015-11-21
In this study we present a novel fully automated Hippocampal Unified Multi-Atlas-Networks (HUMAN) algorithm for the segmentation of the hippocampus in structural magnetic resonance imaging. In multi-atlas approaches atlas selection is of crucial importance for the accuracy of the segmentation. Here we present an optimized method based on the definition of a small peri-hippocampal region to target the atlas learning with linear and non-linear embedded manifolds. All atlases were co-registered to a data driven template resulting in a computationally efficient method that requires only one test registration. The optimal atlases identified were used to train dedicated artificial neural networks whose labels were then propagated and fused to obtain the final segmentation. To quantify data heterogeneity and protocol inherent effects, HUMAN was tested on two independent data sets provided by the Alzheimer's Disease Neuroimaging Initiative and the Open Access Series of Imaging Studies. HUMAN is accurate and achieves state-of-the-art performance (Dice = 0.929 ± 0.003 on ADNI and 0.869 ± 0.002 on OASIS). It is also a robust method that remains stable when applied to the whole hippocampus or to sub-regions (patches). HUMAN also compares favorably with a basic multi-atlas approach and a benchmark segmentation tool such as FreeSurfer.
NASA Astrophysics Data System (ADS)
Amoroso, N.; Errico, R.; Bruno, S.; Chincarini, A.; Garuccio, E.; Sensi, F.; Tangaro, S.; Tateo, A.; Bellotti, R.; the Alzheimer's Disease Neuroimaging Initiative
2015-11-01
In this study we present a novel fully automated Hippocampal Unified Multi-Atlas-Networks (HUMAN) algorithm for the segmentation of the hippocampus in structural magnetic resonance imaging. In multi-atlas approaches atlas selection is of crucial importance for the accuracy of the segmentation. Here we present an optimized method based on the definition of a small peri-hippocampal region to target the atlas learning with linear and non-linear embedded manifolds. All atlases were co-registered to a data driven template resulting in a computationally efficient method that requires only one test registration. The optimal atlases identified were used to train dedicated artificial neural networks whose labels were then propagated and fused to obtain the final segmentation. To quantify data heterogeneity and protocol inherent effects, HUMAN was tested on two independent data sets provided by the Alzheimer’s Disease Neuroimaging Initiative and the Open Access Series of Imaging Studies. HUMAN is accurate and achieves state-of-the-art performance (Dice_ADNI = 0.929 ± 0.003 and Dice_OASIS = 0.869 ± 0.002). It is also a robust method that remains stable when applied to the whole hippocampus or to sub-regions (patches). HUMAN also compares favorably with a basic multi-atlas approach and a benchmark segmentation tool such as FreeSurfer.
Efficient cost-sensitive human-machine collaboration for offline signature verification
NASA Astrophysics Data System (ADS)
Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert
2012-01-01
We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
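The cost-gradient selection step can be illustrated with a toy calculation: among candidate operating points in ROC space, choose the one that minimizes expected cost for the transaction value at hand. The prior, costs, and ROC points below are invented for illustration only.

```python
# Expected-cost selection of an ROC operating point for a given transaction value.
import numpy as np

fpr = np.array([0.01, 0.05, 0.10, 0.20])   # candidate ensembles/thresholds
tpr = np.array([0.60, 0.80, 0.90, 0.95])
prior_fraud = 0.02                         # assumed fraction of forgeries
c_fp = 10.0                                # cost of rejecting a genuine signature

def best_operating_point(transaction_value):
    c_fn = transaction_value               # accepting a forgery costs its value
    expected = c_fp * (1 - prior_fraud) * fpr + c_fn * prior_fraud * (1 - tpr)
    return int(np.argmin(expected))

for value in (10, 1_000, 100_000):
    print(value, "->", best_operating_point(value))
```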
Cognitive Functioning in Space Exploration Missions: A Human Requirement
NASA Technical Reports Server (NTRS)
Fiedler, Edan; Woolford, Barbara
2005-01-01
Solving cognitive issues in the exploration missions will require implementing results from both Human Behavior and Performance, and Space Human Factors Engineering. Operational and research cognitive requirements need to reflect a coordinated management approach with appropriate oversight and guidance from NASA headquarters. First, this paper will discuss one proposed management method that would combine the resources of Space Medicine and Space Human Factors Engineering at JSC, other NASA agencies, the National Space Biomedical Research Institute, Wyle Labs, and other academic or industrial partners. The proposed management is based on a Human Centered Design that advocates full acceptance of the human as a system equal to other systems. Like other systems, the human is a system with many subsystems, each of which has strengths and limitations. Second, this paper will suggest ways to inform exploration policy about what is needed for optimal cognitive functioning of the astronaut crew, as well as requirements to ensure necessary assessment and intervention strategies for the human system if human limitations are reached. Assessment strategies will include clinical evaluation and fitness-to-perform evaluations. Clinical intervention tools and procedures will be available to the astronaut and space flight physician. Cognitive performance will be supported through systematic function allocation, task design, training, and scheduling. Human factors requirements and guidelines will lead to well-designed information displays and retrieval systems that reduce crew time and errors. Means of capturing process, design, and operational requirements to ensure crew performance will be discussed. Third, this paper will describe the current plan of action, and future challenges to be resolved before a lunar or Mars expedition. The presentation will include a proposed management plan for research, involvement of various organizations, and a timetable of deliverables.
Brankov, Jovan G
2013-10-21
The channelized Hotelling observer (CHO) has become a widely used approach for evaluating medical image quality, acting as a surrogate for human observers in early-stage research on assessment and optimization of imaging devices and algorithms. The CHO is typically used to measure lesion detectability. Its popularity stems from experiments showing that the CHO's detection performance can correlate well with that of human observers. In some cases, CHO performance overestimates human performance; to counteract this effect, an internal-noise model is introduced, which allows the CHO to be tuned to match human-observer performance. Typically, this tuning is achieved using example data obtained from human observers. We argue that this internal-noise tuning step is essentially a model training exercise; therefore, just as in supervised learning, it is essential to test the CHO with an internal-noise model on a set of data that is distinct from that used to tune (train) the model. Furthermore, we argue that, if the CHO is to provide useful insights about new imaging algorithms or devices, the test data should reflect such potential differences from the training data; it is not sufficient simply to use new noise realizations of the same imaging method. Motivated by these considerations, the novelty of this paper is the use of new model selection criteria to evaluate ten established internal-noise models, utilizing four different channel models, in a train-test approach. Though not the focus of the paper, a new internal-noise model is also proposed that outperformed the ten established models in the cases tested. The results, using cardiac perfusion SPECT data, show that the proposed train-test approach is necessary, as judged by the newly proposed model selection criteria, to avoid spurious conclusions. The results also demonstrate that, in some models, the optimal internal-noise parameter is very sensitive to the choice of training data; therefore, these models are prone to overfitting, and will not likely generalize well to new data. In addition, we present an alternative interpretation of the CHO as a penalized linear regression wherein the penalization term is defined by the internal-noise model.
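For readers unfamiliar with the CHO, the sketch below builds a channelized Hotelling observer with a simple decision-variable internal-noise term. The difference-of-Gaussian channels, the internal-noise model, and the synthetic images are illustrative choices, not those evaluated in the paper.

```python
# Illustrative CHO with additive internal noise on the decision variable.
import numpy as np

rng = np.random.default_rng(0)
npix = 32

def dog_channel(sigma, size=npix):
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    r2 = x ** 2 + y ** 2
    return (np.exp(-r2 / (2 * sigma ** 2))
            - np.exp(-r2 / (2 * (1.67 * sigma) ** 2))).ravel()

U = np.stack([dog_channel(s) for s in (1, 2, 4, 8)], axis=1)   # channel matrix
signal = 0.5 * dog_channel(2)                                  # known signal profile

def images(n, with_signal):
    g = rng.normal(0.0, 1.0, size=(n, npix * npix))
    return g + signal if with_signal else g

vs, vn = images(200, True) @ U, images(200, False) @ U         # training channel outputs
S = 0.5 * (np.cov(vs.T) + np.cov(vn.T))
w = np.linalg.solve(S, vs.mean(0) - vn.mean(0))                # Hotelling template

ts, tn = images(200, True) @ U, images(200, False) @ U         # test channel outputs
internal_sigma = 0.5 * np.std(tn @ w)                          # internal-noise level (tunable)
lam_s = ts @ w + rng.normal(0, internal_sigma, 200)
lam_n = tn @ w + rng.normal(0, internal_sigma, 200)
dprime = (lam_s.mean() - lam_n.mean()) / np.sqrt(0.5 * (lam_s.var() + lam_n.var()))
print(round(dprime, 2))
```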
NASA Astrophysics Data System (ADS)
Radzicki, Vincent R.; Boutte, David; Taylor, Paul; Lee, Hua
2017-05-01
Radar-based detection of human targets behind walls or in dense urban environments is an important technical challenge with many practical applications in security, defense, and disaster recovery. Radar reflections from a human can be orders of magnitude weaker than those from objects encountered in urban settings such as walls, cars, or possibly rubble after a disaster. Furthermore, these objects can act as secondary reflectors and produce multipath returns from a person. To mitigate these issues, processing of radar return data needs to be optimized for recognizing human motion features such as walking, running, or breathing. This paper presents a theoretical analysis on the modulation effects human motion has on the radar waveform and how high levels of multipath can distort these motion effects. From this analysis, an algorithm is designed and optimized for tracking human motion in heavily cluttered environments. The tracking results will be used as the fundamental detection/classification tool to discriminate human targets from others by identifying human motion traits such as predictable walking patterns and periodicity in breathing rates. The theoretical formulations will be tested against simulation and measured data collected using a low power, portable see-through-the-wall radar system that could be practically deployed in real-world scenarios. Lastly, the performance of the algorithm is evaluated in a series of experiments where both a single person and multiple people are moving in an indoor, cluttered environment.
Some factors affecting performance of rats in the traveling salesman problem.
Bellizzi, C; Goldsteinholm, K; Blaser, R E
2015-11-01
The traveling salesman problem (TSP) is used to measure the efficiency of spatial route selection. Among researchers in cognitive psychology and neuroscience, it has been utilized to examine the mechanisms of decision making, planning, and spatial navigation. While both human and non-human animals produce good solutions to the TSP, the solution strategies engaged by non-human species are not well understood. We conducted two experiments on the TSP using Long-Evans laboratory rats as subjects. The first experiment examined the role of arena walls in route selection. Rats tend to display thigmotaxis in testing conditions comparable to the TSP, which could produce results similar to a convex hull type strategy suggested for humans. The second experiment examined the role of turn angle between targets along the optimal route, to determine whether rats exhibit a preferential turning bias. Our results indicated that both thigmotaxis and preferential turn angles do affect performance in the TSP, but neither is sufficient as a predictor of route choice in this task.
The role of voice input for human-machine communication.
Cohen, P R; Oviatt, S L
1995-01-01
Optimism is growing that the near future will witness rapid growth in human-computer interaction using voice. System prototypes have recently been built that demonstrate speaker-independent real-time speech recognition, and understanding of naturally spoken utterances with vocabularies of 1000 to 2000 words, and larger. Already, computer manufacturers are building speech recognition subsystems into their new product lines. However, before this technology can be broadly useful, a substantial knowledge base is needed about human spoken language and performance during computer-based spoken interaction. This paper reviews application areas in which spoken interaction can play a significant role, assesses potential benefits of spoken interaction with machines, and compares voice with other modalities of human-computer interaction. It also discusses information that will be needed to build a firm empirical foundation for the design of future spoken and multimodal interfaces. Finally, it argues for a more systematic and scientific approach to investigating spoken input and performance with future language technology. PMID:7479803
Validating models of target acquisition performance in the dismounted soldier context
NASA Astrophysics Data System (ADS)
Glaholt, Mackenzie G.; Wong, Rachel K.; Hollands, Justin G.
2018-04-01
The problem of predicting real-world operator performance with digital imaging devices is of great interest within the military and commercial domains. There are several approaches to this problem, including: field trials with imaging devices, laboratory experiments using imagery captured from these devices, and models that predict human performance based on imaging device parameters. The modeling approach is desirable, as both field trials and laboratory experiments are costly and time-consuming. However, the data from these experiments is required for model validation. Here we considered this problem in the context of dismounted soldiering, for which detection and identification of human targets are essential tasks. Human performance data were obtained for two-alternative detection and identification decisions in a laboratory experiment in which photographs of human targets were presented on a computer monitor and the images were digitally magnified to simulate range-to-target. We then compared the predictions of different performance models within the NV-IPM software package: Targeting Task Performance (TTP) metric model and the Johnson model. We also introduced a modification to the TTP metric computation that incorporates an additional correction for target angular size. We examined model predictions using NV-IPM default values for a critical model constant, V50, and we also considered predictions when this value was optimized to fit the behavioral data. When using default values, certain model versions produced a reasonably close fit to the human performance data in the detection task, while for the identification task all models substantially overestimated performance. When using fitted V50 values the models produced improved predictions, though the slopes of the performance functions were still shallow compared to the behavioral data. These findings are discussed in relation to the models' designs and parameters, and the characteristics of the behavioral paradigm.
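Fitting V50 to behavioral data can be sketched as below using the commonly cited target transfer probability function P(V) = (V/V50)^E / (1 + (V/V50)^E) with E = 1.51 + 0.24 (V/V50); whether NV-IPM uses exactly this internal form is an assumption here, and the data points are hypothetical.

```python
# Hedged sketch: fit V50 of a target transfer probability function to
# hypothetical probability-correct data.
import numpy as np
from scipy.optimize import curve_fit

def ttpf(V, V50):
    E = 1.51 + 0.24 * (V / V50)
    r = (V / V50) ** E
    return r / (1 + r)

V_data = np.array([2.0, 4.0, 6.0, 8.0, 12.0])      # TTP values at several ranges (hypothetical)
p_data = np.array([0.15, 0.40, 0.62, 0.75, 0.90])  # observed probability correct (hypothetical)

(V50_fit,), _ = curve_fit(ttpf, V_data, p_data, p0=[5.0])
print(f"fitted V50 = {V50_fit:.2f}")
```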
Efficient receiver tuning using differential evolution strategies
NASA Astrophysics Data System (ADS)
Wheeler, Caleb H.; Toland, Trevor G.
2016-08-01
Differential evolution (DE) is a powerful and computationally inexpensive optimization strategy that can be used to search an entire parameter space or to converge quickly on a solution. The Kilopixel Array Pathfinder Project (KAPPa) is a heterodyne receiver system delivering 5 GHz of instantaneous bandwidth in the tuning range of 645-695 GHz. The fully automated KAPPa receiver test system finds optimal receiver tuning using performance feedback and DE. We present an adaptation of DE for use in rapid receiver characterization. The KAPPa DE algorithm is written in Python 2.7 and is fully integrated with the KAPPa instrument control, data processing, and visualization code. KAPPa develops the technologies needed to realize heterodyne focal plane arrays containing 1000 pixels. Finding optimal receiver tuning by investigating large parameter spaces is one of many challenges facing the characterization phase of KAPPa, and it is a difficult task using by-hand techniques. Characterizing or tuning in an automated fashion, without the need for human intervention, is desirable for future large-scale arrays. While many optimization strategies exist, DE is ideal for our time and performance constraints because it can be set to converge to a solution rapidly with minimal computational overhead. We discuss how DE is utilized in the KAPPa system, evaluate its performance, and look toward the future of 1000-pixel array receivers, considering how the KAPPa DE system might be applied.
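A minimal version of DE-based tuning can be written with SciPy's differential_evolution, as sketched below; the two tuning parameters and the synthetic "measured" objective are placeholders for actual receiver bias and LO settings, not the KAPPa code.

```python
# Differential-evolution tuning sketch with a synthetic noise-figure surrogate.
import numpy as np
from scipy.optimize import differential_evolution

def measured_noise(params):
    bias_mv, lo_power = params
    # synthetic response with an optimum near (2.1 mV, 0.7 a.u.)
    return (bias_mv - 2.1) ** 2 + 3 * (lo_power - 0.7) ** 2 + 0.01 * np.sin(40 * bias_mv)

bounds = [(1.5, 3.0),   # bias voltage, mV
          (0.0, 1.0)]   # LO power, arbitrary units

result = differential_evolution(measured_noise, bounds,
                                maxiter=50, popsize=15, tol=1e-6, seed=0)
print(result.x, result.fun)
```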
Dynamic inverse models in human-cyber-physical systems
NASA Astrophysics Data System (ADS)
Robinson, Ryan M.; Scobee, Dexter R. R.; Burden, Samuel A.; Sastry, S. Shankar
2016-05-01
Human interaction with the physical world is increasingly mediated by automation. This interaction is characterized by dynamic coupling between robotic (i.e. cyber) and neuromechanical (i.e. human) decision-making agents. Guaranteeing performance of such human-cyber-physical systems will require predictive mathematical models of this dynamic coupling. Toward this end, we propose a rapprochement between robotics and neuromechanics premised on the existence of internal forward and inverse models in the human agent. We hypothesize that, in tele-robotic applications of interest, a human operator learns to invert automation dynamics, directly translating from desired task to required control input. By formulating the model inversion problem in the context of a tracking task for a nonlinear control system in control-affine form, we derive criteria for exponential tracking and show that the resulting dynamic inverse model generally renders a portion of the physical system state (i.e., the internal dynamics) unobservable from the human operator's perspective. Under stability conditions, we show that the human can achieve exponential tracking without formulating an estimate of the system's state so long as they possess an accurate model of the system's dynamics. These theoretical results are illustrated using a planar quadrotor example. We then demonstrate that the automation can intervene to improve performance of the tracking task by solving an optimal control problem. Performance is guaranteed to improve under the assumption that the human learns and inverts the dynamic model of the altered system. We conclude with a discussion of practical limitations that may hinder exact dynamic model inversion.
Pilot interaction with automated airborne decision making systems
NASA Technical Reports Server (NTRS)
Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.
1976-01-01
An investigation was made of interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of allocation of responsibility between human and computer, and discussed were various pilot performance parameters with varying degrees of automation. Optimal allocation of responsibility between human and computer was considered and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.
Lose-Shift Responding in Humans Is Promoted by Increased Cognitive Load
Ivan, Victorita E.; Banks, Parker J.; Goodfellow, Kris; Gruber, Aaron J.
2018-01-01
The propensity of animals to shift choices immediately after unexpectedly poor reinforcement outcomes is a pervasive strategy across species and tasks. We report here on the memory supporting such lose-shift responding in humans, assessed using a binary choice task in which random responding is the optimal strategy. Participants exhibited little lose-shift responding when fully attending to the task, but this increased by 30%–40% in participants that performed with additional cognitive load that is known to tax executive systems. Lose-shift responding in the cognitively loaded adults persisted throughout the testing session, despite being a sub-optimal strategy, but was less likely as the time increased between reinforcement and the subsequent choice. Furthermore, children (5–9 years old) without load performed similarly to the cognitively loaded adults. This effect disappeared in older children aged 11–13 years old. These data provide evidence supporting our hypothesis that lose-shift responding is a default and reflexive strategy in the mammalian brain, likely mediated by a decaying memory trace, and is normally suppressed by executive systems. Reducing the efficacy of executive control by cognitive load (adults) or underdevelopment (children) increases its prevalence. It may therefore be an important component to consider when interpreting choice data, and may serve as an objective behavioral assay of executive function in humans that is easy to measure. PMID:29568264
Analysis of aldehydes in human exhaled breath condensates by in-tube SPME-HPLC.
Wang, ShuLing; Hu, Sheng; Xu, Hui
2015-11-05
In this paper, a polypyrrole/graphene (PPy/G) composite coating was prepared by a facile electrochemical polymerization strategy on the inner surface of a stainless steel (SS) tube. Based on the coated tube, a novel online in-tube solid-phase microextraction–high-performance liquid chromatography (IT-SPME-HPLC) method was developed and applied for the extraction of aldehydes in human exhaled breath condensates (EBC). The hybrid PPy/G nanocomposite exhibits remarkable chemical and mechanical stability, high selectivity, and satisfactory extraction performance toward aldehyde compounds. Moreover, the proposed online IT-SPME-HPLC method possesses numerous advantages, such as time and cost savings, process simplicity, and high precision and sensitivity. Parameters related to extraction efficiency were optimized systematically. Under the optimal conditions, the recoveries of the aldehyde compounds at three spiked concentration levels varied in the range of 85%-117%. Good linearity was obtained, with correlation coefficients (R²) larger than 0.994. The relative standard deviations (n = 5) of the method ranged from 1.8% to 11.3%, and the limits of detection were between 2.3 and 3.3 nmol L⁻¹. The successful application of the proposed method to human EBC indicates that it is a promising approach for the determination of trace aldehyde metabolites in complex EBC samples. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Prathabrao, M.; Nawawi, Azli; Sidek, Noor Azizah
2017-04-01
Radio Frequency Identification (RFID) systems offer multiple benefits that can improve the operational efficiency of an organization. The advantages include the ability to record data systematically and quickly, reduce human and system errors, and update the database automatically and efficiently. Multiple readers are often needed when installing an RFID system, which makes the system more complex. As a result, an RFID network planning process is needed to ensure the RFID system works properly. This planning process is also an optimization and power-adjustment process, because the coordinates of each RFID reader must be determined. Therefore, nature-inspired algorithms are often used. In this study, the particle swarm optimization (PSO) algorithm is used because it has few parameters, fast simulation times, and is easy to use and very practical. However, PSO parameters must be adjusted correctly for robust and efficient use of PSO; failure to do so may degrade performance, and the results of PSO optimization of the system will be poorer. To ensure the efficiency of PSO, this study examines the effects of two parameters, swarm size and iteration number, on the performance of the PSO algorithm in RFID tag coverage optimization. In addition, the study recommends the most suitable settings for both parameters: 200 iterations and a swarm size of 800. These results will enable PSO to operate more efficiently when optimizing RFID network planning.
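A bare-bones PSO showing where the two studied parameters enter is sketched below; the coverage objective is a toy stand-in rather than an actual RFID network-planning model, and the recommended values (swarm size 800, 200 iterations) are passed as defaults.

```python
# Minimal particle swarm optimization with explicit swarm_size and n_iter parameters.
import numpy as np

def coverage_cost(xy):
    # toy objective: a reader at (x, y) should sit near one of two tag clusters
    clusters = np.array([[2.0, 3.0], [7.0, 8.0]])
    return float(np.min(np.linalg.norm(xy[None, :] - clusters, axis=1)))

def pso(cost, dim=2, swarm_size=800, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 10.0, size=(swarm_size, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, swarm_size, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, float(pbest_f.min())

best, best_cost = pso(coverage_cost)    # swarm_size=800, n_iter=200 as recommended
print(best, best_cost)
```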
Reddy, Uma M; Abuhamad, Alfred Z; Levine, Deborah; Saade, George R
2014-05-01
Given that practice variation exists in the frequency and performance of ultrasound and magnetic resonance imaging in pregnancy, the Eunice Kennedy Shriver National Institute of Child Health and Human Development hosted a workshop to address indications for ultrasound and magnetic resonance imaging in pregnancy, to discuss when and how often these studies should be performed, to consider recommendations for optimizing yield and cost-effectiveness and to identify research opportunities. This article is the executive summary of the workshop. Published by Mosby, Inc.
Reddy, Uma M; Abuhamad, Alfred Z; Levine, Deborah; Saade, George R
2014-05-01
Given that practice variation exists in the frequency and performance of ultrasound and magnetic resonance imaging (MRI) in pregnancy, the Eunice Kennedy Shriver National Institute of Child Health and Human Development hosted a workshop to address indications for ultrasound and MRI in pregnancy, to discuss when and how often these studies should be performed, to consider recommendations for optimizing yield and cost effectiveness, and to identify research opportunities. This article is the executive summary of the workshop.
Optimized energy of spectral CT for infarct imaging: Experimental validation with human validation.
Sandfort, Veit; Palanisamy, Srikanth; Symons, Rolf; Pourmorteza, Amir; Ahlman, Mark A; Rice, Kelly; Thomas, Tom; Davies-Venn, Cynthia; Krauss, Bernhard; Kwan, Alan; Pandey, Ankur; Zimmerman, Stefan L; Bluemke, David A
Late contrast enhancement visualizes myocardial infarction, but the contrast-to-noise ratio (CNR) is low using conventional CT. The aim of this study was to determine whether spectral CT can improve imaging of myocardial infarction. A canine model of myocardial infarction was produced in 8 animals (90-min occlusion, reperfusion). Later, imaging was performed after contrast injection using dual-energy CT at 90 kVp/150 kVp with tin (Sn) filtration. The following reconstructions were evaluated: single-energy 90 kVp, mixed, iodine map, and multiple conventional virtual monoenergetic and noise-optimized virtual monoenergetic (VMo) reconstructions. Regions of interest were measured in infarct and remote regions to calculate the contrast-to-noise ratio (CNR) and the Bhattacharyya distance (a metric of the separation between regions). Blinded assessment of image quality was performed. The same reconstruction methods were applied to CT scans of four patients with known infarcts. For the animal studies, the highest CNR for infarct vs. myocardium was achieved in the lowest-keV (40 keV) VMo images (CNR 4.42, IQR 3.64-5.53), which was superior to the 90 kVp, mixed, and iodine map reconstructions (p = 0.008, p = 0.002, p < 0.001, respectively). Compared to 90 kVp and the iodine map, the 40 keV VMo reconstructions showed significantly higher histogram separation (p = 0.042 and p < 0.0001, respectively). The VMo reconstructions showed the highest rate of excellent quality scores. A similar pattern was seen in the human studies, with CNR for infarct maximized in the lowest-keV noise-optimized reconstruction (CNR 4.44, IQR 2.86-5.94). Dual-energy CT in conjunction with noise-optimized monoenergetic post-processing improves the CNR of myocardial infarct delineation by approximately 20-25%. Published by Elsevier Inc.
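The two separability metrics named here can be computed from ROI pixel samples as sketched below, assuming Gaussian-approximated intensity distributions; the ROI values are synthetic.

```python
# Contrast-to-noise ratio and Bhattacharyya distance between two ROIs
# (Gaussian approximation); synthetic HU-like samples.
import numpy as np

rng = np.random.default_rng(0)
infarct = rng.normal(120.0, 18.0, 500)   # infarct ROI
remote = rng.normal(80.0, 15.0, 500)     # remote (normal) myocardium ROI

def cnr(a, b):
    return abs(a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

def bhattacharyya_gauss(a, b):
    m1, m2, v1, v2 = a.mean(), b.mean(), a.var(), b.var()
    return 0.25 * (m1 - m2) ** 2 / (v1 + v2) + 0.5 * np.log((v1 + v2) / (2 * np.sqrt(v1 * v2)))

print(round(cnr(infarct, remote), 2), round(bhattacharyya_gauss(infarct, remote), 3))
```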
NASA Astrophysics Data System (ADS)
Pancharoen, K.; Zhu, D.; Beeby, S. P.
2016-11-01
This paper presents a magnetically levitated electromagnetic vibration energy harvester based on magnet arrays. It has a nonlinear response that extends the operating bandwidth and enhances the power output of the harvesting device. The harvester is designed to be embedded in a hip prosthesis and to harvest energy from the low-frequency movements (< 5 Hz) associated with human motion. The design optimization is performed using COMSOL simulation, considering the constraints on the size of the harvester and the low operating frequency. The output voltage generated from hip movement across the optimal load of 3.5 kΩ is 0.137 V during walking and 0.38 V during running. The power output harvested from hip movement during walking and running is 5.35 μW and 41.36 μW, respectively.
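As a quick consistency check (assuming the reported voltages are RMS values across the stated 3.5 kΩ optimal load), P = V²/R reproduces the reported powers:

```latex
P_{\mathrm{walk}} = \frac{V^2}{R} = \frac{(0.137\,\mathrm{V})^2}{3.5\,\mathrm{k\Omega}} \approx 5.4\,\mu\mathrm{W},
\qquad
P_{\mathrm{run}} = \frac{(0.38\,\mathrm{V})^2}{3.5\,\mathrm{k\Omega}} \approx 41.3\,\mu\mathrm{W},
```

consistent with the reported 5.35 μW and 41.36 μW.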
Heuristic and optimal policy computations in the human brain during sequential decision-making.
Korn, Christoph W; Bach, Dominik R
2018-01-23
Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
2015-01-01
Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377
Wong, Kevin S K; Jian, Yifan; Cua, Michelle; Bonora, Stefano; Zawadzki, Robert J; Sarunic, Marinko V
2015-02-01
Wavefront sensorless adaptive optics optical coherence tomography (WSAO-OCT) is a novel imaging technique for in vivo high-resolution depth-resolved imaging that mitigates some of the challenges encountered with the use of sensor-based adaptive optics designs. This technique replaces the Hartmann Shack wavefront sensor used to measure aberrations with a depth-resolved image-driven optimization algorithm, with the metric based on the OCT volumes acquired in real-time. The custom-built ultrahigh-speed GPU processing platform and fast modal optimization algorithm presented in this paper was essential in enabling real-time, in vivo imaging of human retinas with wavefront sensorless AO correction. WSAO-OCT is especially advantageous for developing a clinical high-resolution retinal imaging system as it enables the use of a compact, low-cost and robust lens-based adaptive optics design. In this report, we describe our WSAO-OCT system for imaging the human photoreceptor mosaic in vivo. We validated our system performance by imaging the retina at several eccentricities, and demonstrated the improvement in photoreceptor visibility with WSAO compensation.
Short-term solar flare prediction using image-case-based reasoning
NASA Astrophysics Data System (ADS)
Liu, Jin-Fu; Li, Fei; Zhang, Huai-Peng; Yu, Da-Ren
2017-10-01
Solar flares strongly influence space weather and human activities, and their prediction is highly complex. Existing solutions, such as data-based and model-based approaches, share a common shortcoming: the lack of human engagement in the forecasting process. An image-case-based reasoning method is introduced to achieve this goal. The image case library is composed of SOHO/MDI longitudinal magnetograms, from which the maximum horizontal gradient, the length of the neutral line, and the number of singular points are extracted for retrieving similar image cases. Genetic optimization algorithms are employed to optimize the weight assignment for image features and the number of similar image cases retrieved. Similar image cases, together with prediction results derived by majority voting over these similar cases, are output and shown to the forecaster in order to integrate his/her experience into the final prediction results. Experimental results demonstrate that the case-based reasoning approach performs slightly better than other methods, and is more efficient, with forecasts improved by humans.
Regenerative Life Support Systems Test Bed performance - Lettuce crop characterization
NASA Technical Reports Server (NTRS)
Barta, Daniel J.; Edeen, Marybeth A.; Eckhardt, Bradley D.
1992-01-01
System performance in terms of human life support requirements was evaluated for two crops of lettuce (Lactuca sativa cv. Waldmann's Green) grown in the Regenerative Life Support Systems Test Bed. Each crop, grown in separate pots under identical environmental and cultural conditions, was irrigated with half-strength Hoagland's nutrient solution, with the frequency of irrigation being increased as the crop aged over the 30-day crop tests. Averaging over both crop tests, the test bed met the requirements of 2.1 person-days of oxygen production, 2.4 person-days of CO2 removal, and 129 person-days of potential potable water production. Gains in the mass of water and O2 produced and CO2 removed could be achieved by optimizing environmental conditions to increase plant growth rate and by optimizing cultural management methods.
An optimized adaptive optics experimental setup for in vivo retinal imaging
NASA Astrophysics Data System (ADS)
Balderas-Mata, S. E.; Valdivieso González, L. G.; Ramírez Zavaleta, G.; López Olazagasti, E.; Tepichin Rodriguez, E.
2012-10-01
The use of adaptive optics (AO) in ophthalmologic instruments to image human retinas has been proven to improve imaging lateral resolution by correcting both the static and dynamic aberrations inherent in human eyes. Typically, the configuration of the AO arm uses an infrared beam from a superluminescent diode (SLD), which is focused on the retina, acting as a point source. The back-reflected light emerges through the eye's optical system, bringing with it the aberrations of the cornea. The aberrated wavefront is measured with a Shack-Hartmann wavefront sensor (SHWFS). However, aberrations in the optical imaging system can reduce the performance of the wavefront correction. The aim of this work is to present an optimized first-stage AO experimental setup for in vivo retinal imaging. In our proposal, the imaging optical system has been designed to reduce the spherical aberrations due to the lenses. The ANSI standard is followed, assuring safe power levels. The performance of the system will be compared with a commercial aberrometer. This system will be used as the AO arm of a flood-illuminated fundus camera system for retinal imaging. We present preliminary experimental results showing the enhancement.
MKID digital readout tuning with deep learning
NASA Astrophysics Data System (ADS)
Dodkins, R.; Mahashabde, S.; O'Brien, K.; Thatte, N.; Fruitwala, N.; Walter, A. B.; Meeker, S. R.; Szypryt, P.; Mazin, B. A.
2018-04-01
Microwave Kinetic Inductance Detector (MKID) devices offer inherent spectral resolution, simultaneous read out of thousands of pixels, and photon-limited sensitivity at optical wavelengths. Before taking observations the readout power and frequency of each pixel must be individually tuned, and if the equilibrium state of the pixels change, then the readout must be retuned. This process has previously been performed through manual inspection, and typically takes one hour per 500 resonators (20 h for a ten-kilo-pixel array). We present an algorithm based on a deep convolution neural network (CNN) architecture to determine the optimal bias power for each resonator. The bias point classifications from this CNN model, and those from alternative automated methods, are compared to those from human decisions, and the accuracy of each method is assessed. On a test feed-line dataset, the CNN achieves an accuracy of 90% within 1 dB of the designated optimal value, which is equivalent accuracy to a randomly selected human operator, and superior to the highest scoring alternative automated method by 10%. On a full ten-kilopixel array, the CNN performs the characterization in a matter of minutes - paving the way for future mega-pixel MKID arrays.
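A small PyTorch stand-in for this kind of classifier is sketched below; the input format (I/Q sweeps of 256 samples), the eleven bias-power classes, and the architecture are assumptions for illustration, not the network described in the paper.

```python
# Toy 1-D CNN that maps resonator I/Q sweeps to a bias-power class.
import torch
import torch.nn as nn

class BiasPowerCNN(nn.Module):
    def __init__(self, n_classes=11):                 # e.g. candidate attenuation settings
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * 64, n_classes)

    def forward(self, x):                              # x: (batch, 2 = I/Q, 256 samples)
        return self.classifier(self.features(x).flatten(1))

model = BiasPowerCNN()
sweeps = torch.randn(8, 2, 256)                        # fake I/Q sweeps
print(model(sweeps).shape)                             # torch.Size([8, 11])
```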
Coverage and efficiency in current SNP chips
Ha, Ngoc-Thuy; Freytag, Saskia; Bickeboeller, Heike
2014-01-01
To answer the question as to which commercial high-density SNP chip covers most of the human genome given a fixed budget, we compared the performance of 12 chips of different sizes released by Affymetrix and Illumina for the European, Asian, and African populations. These include Affymetrix's relatively new population-optimized arrays, whose SNP sets are each tailored toward a specific ethnicity. Our evaluation of the chips included the use of two measures, efficiency and cost–benefit ratio, which we developed as supplements to genetic coverage. Unlike coverage, these measures factor in the price of a chip or its substitute size (number of SNPs on the chip), allowing comparisons to be drawn between differently priced chips. In this fashion, we identified the Affymetrix population-optimized arrays as offering the most cost-effective coverage for the Asian and African populations. For the European population, we established the Illumina Human Omni 2.5-8 as the preferred choice. Interestingly, the Affymetrix chip tailored toward an Eastern Asian subpopulation performed well for all three populations investigated. However, our coverage estimates calculated for all chips proved much lower than those advertised by the producers. All our analyses were based on the 1000 Genomes Project as the reference population. PMID:24448550
Di Ciano, Patricia; Manvich, Daniel F; Pushparaj, Abhiram; Gappasov, Andrew; Hess, Ellen J; Weinshenker, David; Le Foll, Bernard
2018-01-01
Gambling disorder is a growing societal concern, as recognized by its recent classification as an addictive disorder in the DSM-5. Case reports have shown that disulfiram reduces gambling-related behavior in humans. The purpose of the present study was to determine whether disulfiram affects performance on a rat gambling task, a rodent version of the Iowa gambling task in humans, and whether any changes were associated with alterations in dopamine and/or norepinephrine levels. Rats were administered disulfiram prior to testing on the rat gambling task or prior to analysis of dopamine or norepinephrine levels in brain homogenates. Rats in the behavioral task were divided into two subgroups (optimal vs suboptimal) based on their baseline levels of performance in the rat gambling task. Rats in the optimal group chose the advantageous strategy more, and rats in the suboptimal group (a parallel to problem gambling) chose the disadvantageous strategy more. Rats were not divided into optimal or suboptimal groups prior to neurochemical analysis. Disulfiram administered 2 h, but not 30 min, before the task dose-dependently improved choice behavior in the rats with an initial disadvantageous "gambling-like" strategy, while having no effect on the rats employing an advantageous strategy. The behavioral effects of disulfiram were associated with increased striatal dopamine and decreased striatal norepinephrine. These findings suggest that combined actions on dopamine and norepinephrine may be a useful treatment for gambling disorders.
Modeling Brain Dynamics in Brain Tumor Patients Using the Virtual Brain.
Aerts, Hannelore; Schirner, Michael; Jeurissen, Ben; Van Roost, Dirk; Achten, Eric; Ritter, Petra; Marinazzo, Daniele
2018-01-01
Presurgical planning for brain tumor resection aims at delineating eloquent tissue in the vicinity of the lesion to spare during surgery. To this end, noninvasive neuroimaging techniques such as functional MRI and diffusion-weighted imaging fiber tracking are currently employed. However, taking into account this information is often still insufficient, as the complex nonlinear dynamics of the brain impede straightforward prediction of functional outcome after surgical intervention. Large-scale brain network modeling carries the potential to bridge this gap by integrating neuroimaging data with biophysically based models to predict collective brain dynamics. As a first step in this direction, an appropriate computational model has to be selected, after which suitable model parameter values have to be determined. To this end, we simulated large-scale brain dynamics in 25 human brain tumor patients and 11 human control participants using The Virtual Brain, an open-source neuroinformatics platform. Local and global model parameters of the Reduced Wong-Wang model were individually optimized and compared between brain tumor patients and control subjects. In addition, the relationship between model parameters and structural network topology and cognitive performance was assessed. Results showed (1) significantly improved prediction accuracy of individual functional connectivity when using individually optimized model parameters; (2) local model parameters that can differentiate between regions directly affected by a tumor, regions distant from a tumor, and regions in a healthy brain; and (3) interesting associations between individually optimized model parameters and structural network topology and cognitive performance.
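The following toy sketch illustrates the general fitting strategy described, tuning a global coupling parameter so that simulated functional connectivity best matches an empirical matrix. It uses a simple linear stochastic network rather than The Virtual Brain or the Reduced Wong-Wang model, and all sizes and data are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20                                            # number of brain regions (toy)
SC = np.abs(rng.normal(size=(n, n)))              # placeholder structural connectome
SC = (SC + SC.T) / 2
np.fill_diagonal(SC, 0)
SC /= np.abs(np.linalg.eigvals(SC)).max()         # normalize by spectral radius
FC_emp = np.corrcoef(rng.normal(size=(n, 500)))   # stand-in for empirical FC

def simulate_fc(G, steps=2000, dt=0.05):
    """Simulate a linear stochastic network dx = (-x + G*SC@x)dt + noise, return FC."""
    x = np.zeros(n)
    traj = np.empty((steps, n))
    for t in range(steps):
        x += dt * (-x + G * SC @ x) + np.sqrt(dt) * 0.1 * rng.normal(size=n)
        traj[t] = x
    return np.corrcoef(traj.T)

# Grid search over the global coupling parameter G, scoring simulated-vs-empirical FC fit.
iu = np.triu_indices(n, 1)
best = max((np.corrcoef(simulate_fc(G)[iu], FC_emp[iu])[0, 1], G)
           for G in np.linspace(0.05, 0.95, 10))
print("best FC fit r=%.3f at G=%.2f" % best)
```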
SU-E-I-43: Pediatric CT Dose and Image Quality Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, G; Singh, R
2014-06-01
Purpose: To design an approach to optimize radiation dose and image quality for pediatric CT imaging, and to evaluate expected performance. Methods: A methodology was designed to quantify relative image quality as a function of CT image acquisition parameters. Image contrast and image noise were used to indicate expected conspicuity of objects, and a wide-cone system was used to minimize scan time for motion avoidance. A decision framework was designed to select acquisition parameters as a weighted combination of image quality and dose. Phantom tests were used to acquire images at multiple techniques to demonstrate expected contrast, noise, and dose. Anthropomorphic phantoms with contrast inserts were imaged on a 160 mm CT system with tube voltage capabilities as low as 70 kVp. Previously acquired clinical images were used in conjunction with simulation tools to emulate images at different tube voltages and currents to assess human observer preferences. Results: Examination of image contrast, noise, dose, and tube/generator capabilities indicates a clinical task- and object-size-dependent optimization. Phantom experiments confirm that system modeling can be used to achieve the desired image quality and noise performance. Observer studies indicate that clinical utilization of this optimization requires a modified approach to achieve the desired performance. Conclusion: This work indicates the potential to optimize radiation dose and image quality for pediatric CT imaging. In addition, the methodology can be used in an automated parameter selection feature that can suggest techniques given a limited number of user inputs. G Stevens and R Singh are employees of GE Healthcare.
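A minimal sketch of the kind of decision framework described, scoring candidate techniques as a weighted combination of an image-quality surrogate and dose, is given below; the contrast, noise, and dose models and the weighting are placeholders, not the authors' or the vendor's models.

```python
import numpy as np

# Hypothetical candidate technique settings and toy physics models (placeholders only):
# contrast falls with kVp, noise falls with dose, dose scales with mAs and kVp^2.
candidates = [(kvp, mas) for kvp in (70, 80, 100, 120) for mas in (20, 40, 80)]

def score(kvp, mas, w_dose=0.02, size_cm=15):
    contrast = 1.0 / kvp                                   # placeholder contrast model
    dose = mas * (kvp / 100.0) ** 2                        # placeholder dose surrogate
    noise = np.exp(size_cm / 30.0) / np.sqrt(dose)         # placeholder noise model
    cnr = contrast / noise                                 # conspicuity surrogate
    return cnr - w_dose * dose                             # weighted image-quality-vs-dose tradeoff

best = max(candidates, key=lambda c: score(*c))
print("selected technique (kVp, mAs):", best)
```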
Skendi, Adriana; Irakli, Maria N; Papageorgiou, Maria D
2016-04-01
A simple, sensitive and accurate analytical method was optimized and developed for the determination of deoxynivalenol and aflatoxins in cereals intended for human consumption using high-performance liquid chromatography with diode array and fluorescence detection and a photochemical reactor for enhanced detection. A response surface methodology, using a fractional central composite design, was carried out for optimization of the water percentage at the beginning of the run (X1, 80-90%), the level of acetonitrile at the end of the gradient system (X2, 10-20%) with the water percentage fixed at 60%, and the flow rate (X3, 0.8-1.2 mL/min). The studied responses were the chromatographic peak area, the resolution factor and the time of analysis. Optimal chromatographic conditions were: X1 = 80%, X2 = 10%, and X3 = 1 mL/min. Following a double sample extraction with water and a mixture of methanol/water, mycotoxins were rapidly purified by an optimized solid-phase extraction protocol. The optimized method was further validated with respect to linearity (R² > 0.9991), sensitivity, precision, and recovery (90-112%). The application to 23 commercial cereal samples from Greece showed contamination levels below the legally set limits, except for one maize sample. The main advantages of the developed method are the simplicity of operation and the low cost. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
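The response-surface step can be illustrated with a short sketch: fit a full quadratic model to design points by least squares and report the predicted optimum. The design points and response values below are synthetic placeholders, not the paper's data.

```python
import numpy as np
from itertools import combinations_with_replacement

# Synthetic design: three factors (initial water %, final acetonitrile %, flow rate).
rng = np.random.default_rng(2)
X = rng.uniform([80, 10, 0.8], [90, 20, 1.2], size=(15, 3))   # placeholder design points
y = -(X[:, 0] - 80)**2 - (X[:, 1] - 10)**2 - 50*(X[:, 2] - 1.0)**2 \
    + rng.normal(scale=1.0, size=15)                          # placeholder response (peak area)

def quad_features(X):
    """Full quadratic model terms: 1, x_i, x_i*x_j (i <= j)."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)   # fit the response surface

# Evaluate the fitted surface on a grid and report the predicted optimum settings.
grid = np.array([[x1, x2, x3] for x1 in np.linspace(80, 90, 11)
                 for x2 in np.linspace(10, 20, 11) for x3 in np.linspace(0.8, 1.2, 9)])
pred = quad_features(grid) @ beta
print("predicted optimum (water %, ACN %, flow):", grid[np.argmax(pred)])
```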
Cankorur-Cetinkaya, Ayca; Dias, Joao M. L.; Kludas, Jana; Slater, Nigel K. H.; Rousu, Juho; Dikicioglu, Duygu
2017-01-01
Multiple interacting factors affect the performance of engineered biological systems in synthetic biology projects. The complexity of these biological systems means that experimental design should often be treated as a multiparametric optimization problem. However, the available methodologies are either impractical, due to a combinatorial explosion in the number of experiments to be performed, or are inaccessible to most experimentalists due to the lack of publicly available, user-friendly software. Although evolutionary algorithms may be employed as alternative approaches to optimize experimental design, the lack of simple-to-use software again restricts their use to specialist practitioners. In addition, the lack of subsidiary approaches to further investigate critical factors and their interactions prevents the full analysis and exploitation of the biotechnological system. We have addressed these problems and, here, provide a simple‐to‐use and freely available graphical user interface to empower a broad range of experimental biologists to employ complex evolutionary algorithms to optimize their experimental designs. Our approach exploits a Genetic Algorithm to discover the subspace containing the optimal combination of parameters, and Symbolic Regression to construct a model to evaluate the sensitivity of the experiment to each parameter under investigation. We demonstrate the utility of this method using an example in which the culture conditions for the microbial production of a bioactive human protein are optimized. CamOptimus is available through: (https://doi.org/10.17863/CAM.10257). PMID:28635591
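The core optimization loop can be illustrated with a minimal genetic algorithm over continuous experimental factors; this is not CamOptimus itself, and the "yield" function, factor count, and GA settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_factors, pop_size, n_gen = 4, 30, 40
lo, hi = np.zeros(n_factors), np.ones(n_factors)        # normalized factor ranges

def yield_fn(x):
    """Placeholder 'protein yield' landscape; in practice this is a wet-lab measurement."""
    return -np.sum((x - 0.3) ** 2)

pop = rng.uniform(lo, hi, size=(pop_size, n_factors))
for gen in range(n_gen):
    fitness = np.array([yield_fn(ind) for ind in pop])
    # Tournament selection: keep the better of two randomly chosen individuals.
    idx = rng.integers(0, pop_size, size=(pop_size, 2))
    parents = pop[np.where(fitness[idx[:, 0]] > fitness[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Uniform crossover followed by Gaussian mutation, clipped to the factor ranges.
    mates = parents[rng.permutation(pop_size)]
    mask = rng.random((pop_size, n_factors)) < 0.5
    children = np.where(mask, parents, mates) + rng.normal(scale=0.05,
                                                           size=(pop_size, n_factors))
    pop = np.clip(children, lo, hi)

best = pop[np.argmax([yield_fn(ind) for ind in pop])]
print("best factor combination found:", np.round(best, 3))
```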
NASA Technical Reports Server (NTRS)
Cliff, Susan E.; Thomas, Scott D.
2005-01-01
Numerical optimization was employed on the Apollo Command Module to modify its external shape. The Apollo Command Module (CM) that was used on all NASA human space flights during the Apollo Space Program is stable and trimmed in an apex-forward (alpha of approximately 40 to 80 degrees) position. This poses a safety risk if the CM separates from the launch tower during abort. Optimization was employed on the Apollo CM to remedy the undesirable stability characteristics of the configuration. Geometric shape changes were limited to axisymmetric modifications that altered the radius of the apex (R(sub A)), base radius (R(sub O)), corner radius (R(sub C)), and the cone half angle (theta), while the maximum diameter of the CM was held constant. The results of multipoint optimization on the CM indicated that the cross-range performance can be improved while maintaining robust apex-aft stability with a single trim point. Navier-Stokes computations were performed on the baseline and optimized configurations and confirmed the Euler-based optimization results. Euler analysis of ten alternative CM vehicles with different values of the above four parameters is compared with the published experimental results of numerous wind tunnel tests from the late 1960s. These comparisons cover a wide Mach number range and a full 180-degree pitch range and show that the Euler methods are capable of fairly accurate force and moment computations and can separate the vehicle characteristics of these ten alternative configurations.
Eskinazi, Ilan; Fregly, Benjamin J
2018-04-01
Concurrent estimation of muscle activations, joint contact forces, and joint kinematics by means of gradient-based optimization of musculoskeletal models is hindered by computationally expensive and non-smooth joint contact and muscle wrapping algorithms. We present a framework that simultaneously speeds up computation and removes sources of non-smoothness from muscle force optimizations using a combination of parallelization and surrogate modeling, with special emphasis on a novel method for modeling joint contact as a surrogate model of a static analysis. The approach allows one to efficiently introduce elastic joint contact models within static and dynamic optimizations of human motion. We demonstrate the approach by performing two optimizations, one static and one dynamic, using a pelvis-leg musculoskeletal model undergoing a gait cycle. We observed convergence on the order of seconds for a static optimization time frame and on the order of minutes for an entire dynamic optimization. The presented framework may facilitate model-based efforts to predict how planned surgical or rehabilitation interventions will affect post-treatment joint and muscle function. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.
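The surrogate-modeling idea can be sketched as follows: sample an expensive (and possibly non-smooth) contact analysis offline, fit a smooth surrogate, and use the surrogate inside a gradient-based optimization. The contact model, polynomial form, and target value below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import minimize

def expensive_contact_force(penetration):
    """Stand-in for a costly, non-smooth elastic contact analysis (N per mm)."""
    return 1e3 * np.maximum(penetration, 0.0) ** 1.5

# 1) Sample the expensive model once, offline.
d_train = np.linspace(0.0, 2.0, 50)
f_train = expensive_contact_force(d_train)

# 2) Fit a smooth polynomial surrogate to the samples.
surrogate = np.poly1d(np.polyfit(d_train, f_train, deg=4))

# 3) Use the cheap, smooth surrogate inside a gradient-based optimization,
#    e.g. find the penetration giving a target joint contact force.
target = 800.0
res = minimize(lambda d: (surrogate(d[0]) - target) ** 2, x0=[1.0],
               bounds=[(0.0, 2.0)])
print("penetration giving ~%.0f N (surrogate): %.3f mm" % (target, res.x[0]))
```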
How HRP Research Results Contribute to Human Space Exploration Risk Mitigation
NASA Technical Reports Server (NTRS)
Lumpkins, S. B.; Mindock, J. A.
2014-01-01
In addition to the scientific value of publications derived from research, results from Human Research Program (HRP) research also support HRP’s goals of mitigating crew health and performance risks in space flight. Research results are used to build the evidence base characterizing crew health and performance risks, to support risk research plan development, to inform crew health and performance standards, and to provide technologies to programs for meeting those standards and optimizing crew health and performance in space. This talk will describe examples of how research results support these efforts. For example, HRP research results are used to revise or even create new standards for human space flight, which have been established to protect crew health and performance during flight, and prevent negative long-term health consequences due to space flight. These standards are based on the best available clinical and scientific evidence, as well as operational experience from previous space flight missions, and are reviewed as new evidence emerges. Research results are also used to update the HRP evidence base, which is comprised of a set of reports that provide a current record of the state of knowledge from research and operations for each of the defined human health and performance risks for future NASA exploration missions. A discussion of the role of evidence within the HRP architecture will also be presented. The scope of HRP research results extends well beyond publications, as they are used in several capacities to support HRP deliverables and, ultimately, the advancement of human space exploration beyond low-Earth orbit.
Evaluating the impact of the humanities in medical education.
Wershof Schwartz, Andrea; Abramson, Jeremy S; Wojnowich, Israel; Accordino, Robert; Ronan, Edward J; Rifkin, Mary R
2009-08-01
The inclusion of the humanities in medical education may offer significant potential benefits to individual future physicians and to the medical community as a whole. Debate remains, however, about the definition and precise role of the humanities in medical education, whether at the premedical, medical school, or postgraduate level. Recent trends have revealed an increasing presence of the humanities in medical training. This article reviews the literature on the impact of humanities education on the performance of medical students and residents and the challenges posed by the evaluation of the impact of humanities in medical education. Students who major in the humanities as college students perform just as well as, if not better than, their peers with science backgrounds during medical school and in residency on objective measures of achievement such as National Board of Medical Examiners scores and academic grades. Although many humanities electives and courses are offered in premedical and medical school curricula, measuring and quantifying their impact has proven challenging because the courses are diverse in content and goals. Many of the published studies involve self-selected groups of students and seek to assess subjective outcomes that are difficult to measure, such as increases in empathy, professionalism, and self-care. Further research is needed to define the optimal role for humanities education in medical training; in particular, more quantitative studies are needed to examine the impact that it may have on physician performance beyond medical school and residency. Medical educators must consider what potential benefits humanities education can contribute to medical education, how its impact can be measured, and what ultimate outcomes we hope to achieve.
Ryan, Denise S; Sia, Rose K; Stutzman, Richard D; Pasternak, Joseph F; Howard, Robin S; Howell, Christopher L; Maurer, Tana; Torres, Mark F; Bower, Kraig S
2017-01-01
To compare visual performance, marksmanship performance, and threshold target identification following wavefront-guided (WFG) versus wavefront-optimized (WFO) photorefractive keratectomy (PRK). In this prospective, randomized clinical trial, active duty U.S. military Soldiers, age 21 or over, electing to undergo PRK were randomized to undergo WFG (n = 27) or WFO (n = 27) PRK for myopia or myopic astigmatism. Binocular visual performance was assessed preoperatively and 1, 3, and 6 months postoperatively: Super Vision Test high contrast, Super Vision Test contrast sensitivity (CS), and 25% contrast acuity with night vision goggle filter. CS function was generated testing at five spatial frequencies. Marksmanship performance in low light conditions was evaluated in a firing tunnel. Target detection and identification performance was tested for probability of identification of varying target sets and probability of detection of humans in cluttered environments. Visual performance, CS function, marksmanship, and threshold target identification demonstrated no statistically significant differences over time between the two treatments. Exploratory regression analysis of firing range tasks at 6 months showed no significant differences or correlations between procedures. Regression analysis of vehicle and handheld probability of identification showed a significant association with pretreatment performance. Both WFG and WFO PRK results translate to excellent and comparable visual and military performance. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.
Mirifar, Arash; Beckmann, Jürgen; Ehrlenspiel, Felix
2017-04-01
Self-regulation plays an important role in enhancing human performance. Neurofeedback is a promising noninvasive approach for modifying human brain oscillation and can be utilized in developing skills for self-regulation of brain activity. So far, the effectiveness of neurofeedback has been evaluated with regard to not only its application in clinical populations but also the enhancement of performance in general. However, reviews of the application of neurofeedback training in the sports domain are absent, although this application goes back to 1991, when it was first applied in archery. Sport scientists have shown an increasing interest in this topic in recent years. This article provides an overview of empirical studies examining the effects of neurofeedback in sports and evaluates these studies against cardinal and methodological criteria. Furthermore, it includes guidelines and suggestions for future evaluations of neurofeedback training in sports. Copyright © 2017 Elsevier Ltd. All rights reserved.
Human capabilities in space. [man machine interaction
NASA Technical Reports Server (NTRS)
Nicogossian, A. E.
1984-01-01
Man's ability to live and perform useful work in space has been demonstrated throughout the history of manned space flight. Current planning envisions a multi-functional space station. Man's unique abilities to respond to the unforeseen and to operate at a level of complexity exceeding any reasonable amount of previous planning distinguish him from present-day machines. His limitations, however, include his inherent inability to survive without protection, his limited strength, and his propensity to make mistakes when performing repetitive and monotonous tasks. By contrast, an automated system does routine and delicate tasks, exerts force smoothly and precisely, stores and recalls large amounts of data, and performs deductive reasoning while maintaining a relative insensitivity to the environment. The establishment of a permanent presence of man in space demands that man and machines be appropriately combined in spaceborne systems. To achieve this optimal combination, research is needed in such diverse fields as artificial intelligence, robotics, behavioral psychology, economics, and human factors engineering.
Application of Human-Autonomy Teaming (HAT) Patterns to Reduced Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri
2016-01-01
As part of the Air Force - NASA Bi-Annual Research Council Meeting, slides will be presented on recent Reduced Crew Operations (RCO) work. Unmanned aerial systems, robotics, advanced cockpits, and air traffic management are all examples of domains that are seeing dramatic increases in automation. While automation may take on some tasks previously performed by humans, humans will still be required, for the foreseeable future, to remain in the system. Collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology for understanding HAT is to identify recurring patterns of HAT that have similar characteristics and solutions. The application of this methodology for identifying HAT patterns to an advanced cockpit project is discussed.
The traveling salesman problem: a hierarchical model.
Graham, S M; Joshi, A; Pizlo, Z
2000-10-01
Our review of prior literature on spatial information processing in perception, attention, and memory indicates that these cognitive functions involve similar mechanisms based on a hierarchical architecture. The present study extends the application of hierarchical models to the area of problem solving. First, we report results of an experiment in which human subjects were tested on a Euclidean traveling salesman problem (TSP) with 6 to 30 cities. The subjects' solutions were either optimal or near-optimal in length and were produced in a time that was, on average, a linear function of the number of cities. Next, the performance of the subjects is compared with that of five representative artificial intelligence and operations research algorithms that produce approximate solutions for Euclidean problems. None of these algorithms was found to be an adequate psychological model. Finally, we present a new algorithm for solving the TSP, which is based on a hierarchical pyramid architecture. The performance of this new algorithm is quite similar to the performance of the subjects.
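For readers unfamiliar with the approximation algorithms that human tours are compared against, the sketch below shows a standard nearest-neighbor construction followed by 2-opt improvement on a random Euclidean instance; it is not the hierarchical pyramid algorithm proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
cities = rng.random((15, 2))                      # random Euclidean TSP instance
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)

def tour_length(tour):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

# Nearest-neighbor construction: always visit the closest unvisited city next.
tour, unvisited = [0], set(range(1, len(cities)))
while unvisited:
    nxt = min(unvisited, key=lambda j: dist[tour[-1], j])
    tour.append(nxt)
    unvisited.remove(nxt)

# 2-opt improvement: reverse segments whenever doing so shortens the tour.
improved = True
while improved:
    improved = False
    for i in range(1, len(tour) - 1):
        for j in range(i + 1, len(tour)):
            new_tour = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(new_tour) < tour_length(tour):
                tour, improved = new_tour, True

print("approximate tour length:", round(tour_length(tour), 3))
```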
Zhang, Xiong; Zhao, Yacong; Zhang, Yu; Zhong, Xuefei; Fan, Zhaowen
2018-01-01
The novel human-computer interface (HCI) using bioelectrical signals as input is a valuable tool to improve the lives of people with disabilities. In this paper, surface electromyography (sEMG) signals induced by four classes of wrist movements were acquired from four sites on the lower arm with our designed system. Forty-two features were extracted from the time, frequency and time-frequency domains. Optimal channels were determined from the single-channel classification performance ranking. Optimal-feature selection was based on a modified entropy criterion (EC) and the Fisher discrimination (FD) criterion. The feature selection results were evaluated by four different classifiers, and compared with other conventional feature subsets. In online tests, the wearable system acquired real-time sEMG signals. The selected features and trained classifier model were used to control a telecar through four different paradigms in a designed environment with simple obstacles. Performance was evaluated based on travel time (TT) and recognition rate (RR). The results of the hardware evaluation verified the feasibility of our acquisition system and ensured signal quality. Single-channel analysis results indicated that the channel located on the extensor carpi ulnaris (ECU) performed best, with a mean classification accuracy of 97.45% across all movement pairs. Channels placed on the ECU and the extensor carpi radialis (ECR) were selected according to the accuracy ranking. Experimental results showed that the proposed FD method was better than other feature selection methods and single-type features. The combination of FD and random forest (RF) performed best in offline analysis, with 96.77% multi-class RR. Online results illustrated that the state-machine paradigm with a 125 ms window had the highest maneuverability and was closest to real-life control. Subjects could accomplish online sessions with the three sEMG-based paradigms in average times of 46.02, 49.06 and 48.08 s, respectively. These experiments validate the feasibility of the proposed real-time wearable HCI system and algorithms, providing a potential assistive device interface for persons with disabilities. PMID:29543737
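A generic sketch of the ranking-plus-classification pipeline described (a Fisher-style feature score followed by a random forest) is given below on synthetic data; the feature definitions, class structure, and parameter choices are assumptions, not the authors' code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(6)
n_trials, n_features, n_classes = 400, 42, 4          # 42 sEMG features, 4 wrist movements
y = rng.integers(0, n_classes, n_trials)
# Synthetic features: the first 10 carry class information, the rest are noise.
X = rng.normal(size=(n_trials, n_features)) + 0.8 * y[:, None] * (np.arange(n_features) < 10)

def fisher_score(X, y):
    """Ratio of between-class to within-class variance for each feature."""
    overall = X.mean(axis=0)
    num = sum((X[y == c].mean(axis=0) - overall) ** 2 * (y == c).sum() for c in np.unique(y))
    den = sum(X[y == c].var(axis=0) * (y == c).sum() for c in np.unique(y))
    return num / den

top = np.argsort(fisher_score(X, y))[::-1][:12]        # keep the 12 best-ranked features
X_tr, X_te, y_tr, y_te = train_test_split(X[:, top], y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("multi-class accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```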
Predictive genomics DNA profiling for athletic performance.
Kambouris, Marios; Ntalouka, Foteini; Ziogas, Georgios; Maffulli, Nicola
2012-12-01
Genes control biological processes such as muscle, cartilage and bone formation, muscle energy production and metabolism (mitochondriogenesis, lactic acid removal), and blood and tissue oxygenation (erythropoiesis, angiogenesis, vasodilatation), all essential in sport and athletic performance. DNA sequence variations in such genes confer genetic advantages that can be exploited, or genetic 'barriers' that could be overcome, to achieve optimal athletic performance. Predictive genomic DNA profiling for athletic performance reveals genetic variations that may be associated with better suitability for endurance, strength and speed sports, vulnerability to sports-related injuries, and individualized nutritional requirements. Knowledge of genetic 'suitability' with respect to endurance capacity or strength and speed would lead to appropriate sport and athletic activity selection. Knowledge of genetic advantages and barriers would 'direct' an individualized training program, nutritional plan and nutritional supplementation toward achieving optimal performance, overcoming 'barriers' that result from intense exercise and the pressure of competition with minimum waste of time and energy, and avoiding health risks (hypertension, cardiovascular disease, inflammation, and musculoskeletal injuries) related to exercise, training and competition. Predictive genomics DNA profiling for athletics and sports performance is developing into a tool for athletic activity and sport selection and for the formulation of individualized and personalized training and nutritional programs to optimize health and performance for the athlete. Human DNA sequences are patentable in some countries, while in others DNA testing methodologies (unless proprietary) are not patentable. On the other hand, gene and variant selection, genotype interpretation, and the risk- and suitability-assigning algorithms based on the specific genomic variants used are amenable to patent protection.
The optimal SAM surface functional group for producing a biomimetic HA coating on Ti.
Liu, D P; Majewski, P; O'Neill, B K; Ngothai, Y; Colby, C B
2006-06-15
Commercial interest is growing in biomimetic methods that employ self-assembled monolayers (SAMs) to produce biocompatible HA coatings on Ti-based orthopedic implants. Recently, separate studies have considered HA formation for various SAM surface functional groups. However, these have often neglected to verify the crystallinity of the HA coating, which is essential for optimal bioactivity. Furthermore, differing experimental and analytical methods make performance comparisons difficult. This article investigates and evaluates HA formation for four of the most promising surface functional groups: -OH, -SO3H, -PO4H2 and -COOH. All of them successfully formed an HA coating at Ca/P ratios between 1.49 and 1.62. However, only the -SO3H and -COOH end groups produced a predominantly crystalline HA. Furthermore, the -COOH end group yielded the thickest layer and possessed crystalline characteristics very similar to those of human bone. The -COOH end group appears to provide the optimal SAM surface interface for nucleation and growth of biomimetic crystalline HA. Intriguingly, this finding may lend support to explanations elsewhere of why human bone sialoprotein is such a potent nucleator of HA, a property attributed to the protein's glutamic acid-rich sequences.
Zhang, Juanjuan; Collins, Steven H.
2017-01-01
This study uses theory and experiments to investigate the relationship between the passive stiffness of series elastic actuators and torque tracking performance in lower-limb exoskeletons during human walking. Through theoretical analysis with our simplified system model, we found that the optimal passive stiffness matches the slope of the desired torque-angle relationship. We also conjectured that a bandwidth limit resulted in a maximum rate of change in torque error that can be commanded through the control input, which is fixed across desired and passive stiffness conditions. This led to hypotheses about the interactions among optimal control gains, passive stiffness and desired quasi-stiffness. Walking experiments were conducted with multiple angle-based desired torque curves. The lowest torque tracking errors observed for each combination of desired and passive stiffness were shown to be linearly proportional to the magnitude of the difference between the two stiffnesses. The proportional gains corresponding to the lowest observed errors were inversely proportional to the passive stiffness values and to the desired stiffness. These findings supported our hypotheses, and provide guidance for application-specific hardware customization as well as controller design for torque-controlled robotic legged locomotion. PMID:29326580
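The stiffness-matching result can be restated in generic series-elastic-actuator notation as follows; the symbols and sign conventions are assumptions rather than the paper's exact formulation.

```latex
% Restatement of the stiffness-matching idea in generic series-elastic notation
% (symbols and sign conventions are assumptions, not the paper's exact formulation).
% Torque transmitted through the series spring of passive stiffness k_p:
\tau = k_p\,(\theta_j - \theta_m)
% With the motor held still (\dot{\theta}_m = 0), the delivered torque changes with
% joint angle at the passive rate d\tau/d\theta_j = k_p, while the angle-based desired
% torque changes at the quasi-stiffness k_d = d\tau_d/d\theta_j. The torque error
% therefore grows at
\dot{e} = \dot{\tau}_d - \dot{\tau} = (k_d - k_p)\,\dot{\theta}_j ,
% which vanishes when k_p = k_d: the spring alone reproduces the desired
% torque--angle slope, minimizing the motor motion the controller must command.
```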
Yang, Du; Shuai, Yuan; Zhifei, Zhou; Lizheng, Wu; Lulu, Wang; Xing'an, Wu; Xiaojing, Wang
2017-04-01
To explore the effect of nicotine on the autophagy level of human periodontal ligament cells (hPDLCs), periodontal tissues collected from premolars extracted for orthodontic treatment were used to culture hPDLCs. Western blot analysis was performed to determine the optimal time and concentration of nicotine exposure for affecting the autophagy level of hPDLCs. Transmission electron microscopy and immunofluorescence observation were carried out to detect the formation of autophagosomes and the expression of the autophagy-related protein LC3 in hPDLCs under this optimal condition. Protein expression of LC3-II was upregulated after 12 h of nicotine stimulation. This upregulation was concentration dependent, and nicotine at a concentration of 1×10⁻⁵ mol·L⁻¹ was optimal. Transmission electron microscopy and immunofluorescence observations indicated that nicotine activated autophagy in hPDLCs, increasing the number of autophagosomes and upregulating the expression of the autophagy-related protein LC3. Nicotine could increase the autophagy level of hPDLCs, thus affecting the occurrence and development of smoking-related periodontitis.
Mathematical modeling of zika virus disease with nonlinear incidence and optimal control
NASA Astrophysics Data System (ADS)
Goswami, Naba Kumar; Srivastav, Akhil Kumar; Ghosh, Mini; Shanmukha, B.
2018-04-01
The Zika virus was first discovered in a rhesus monkey in the Zika Forest of Uganda in 1947, and it was isolated from humans in Nigeria in 1952. Zika virus disease is primarily a mosquito-borne disease, transmitted to humans primarily through the bite of an infected Aedes species mosquito. However, there is documented evidence of sexual transmission of this disease too. In this paper, a nonlinear mathematical model for the Zika virus with nonlinear incidence is formulated and analyzed. The equilibria and the basic reproduction number (R0) of the model are found. The stability of the different equilibria of the model is discussed in detail. When the basic reproduction number R0 < 1, the disease-free equilibrium is locally and globally stable, i.e., in this case the disease dies out. For R0 > 1, there is an endemic equilibrium, which is locally stable under some restrictions on the parameters. Further, this model is extended to an optimal control model and analyzed using Pontryagin's Maximum Principle. It has been observed that optimal control plays a significant role in reducing the number of Zika infectives. Finally, numerical simulation is performed to illustrate the analytical findings.
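As an illustration of the kind of model summarized above, a generic vector-host system with saturated (nonlinear) incidence and its next-generation reproduction number can be written as below; the compartments and parameters are assumptions, not the authors' equations.

```latex
% Illustrative vector--host formulation with saturated (nonlinear) incidence of the
% kind referred to in the abstract; the compartments and parameter names below are
% assumptions, not the authors' equations.
\begin{aligned}
\dot{S}_h &= \Lambda_h - \frac{\beta_h S_h I_v}{1 + \alpha I_v} - \mu_h S_h, &
\dot{I}_h &= \frac{\beta_h S_h I_v}{1 + \alpha I_v} - (\mu_h + \gamma) I_h, \\
\dot{S}_v &= \Lambda_v - \beta_v S_v I_h - \mu_v S_v, &
\dot{I}_v &= \beta_v S_v I_h - \mu_v I_v .
\end{aligned}
% A next-generation calculation for this toy system gives a threshold of the form
R_0 = \sqrt{\frac{\beta_h \beta_v \Lambda_h \Lambda_v}{\mu_h \mu_v^2 (\mu_h + \gamma)}},
% with the disease-free equilibrium stable for R_0 < 1 and an endemic equilibrium
% emerging for R_0 > 1, as in the analysis summarized above.
```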
Mitchell, Peter D; Ratcliffe, Elizabeth; Hourd, Paul; Williams, David J; Thomas, Robert J
2014-12-01
It is well documented that cryopreservation and resuscitation of human embryonic stem cells (hESCs) is complex and ill-defined, and often suffers poor cell recovery and increased levels of undesirable cell differentiation. In this study we have applied Quality-by-Design (QbD) concepts to the critical processes of slow-freeze cryopreservation and resuscitation of hESC colony cultures. Optimized subprocesses were linked together to deliver a controlled complete process. We have demonstrated a rapid, high-throughput, and stable system for measurement of cell adherence and viability as robust markers of in-process and postrecovery cell state. We observed that measurement of adherence and viability of adhered cells at 1 h postseeding was predictive of cell proliferative ability up to 96 h in this system. Application of factorial design defined the operating spaces for cryopreservation and resuscitation, critically linking the performance of these two processes. Optimization of both processes resulted in enhanced reattachment and post-thaw viability, resulting in substantially greater recovery of cryopreserved, pluripotent cell colonies. This study demonstrates the importance of QbD concepts and tools for rapid, robust, and low-risk process design that can inform manufacturing controls and logistics.
Evaluating Suit Fit Using Performance Degradation
NASA Technical Reports Server (NTRS)
Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar
2011-01-01
The Mark III suit has multiple sizes of suit components (arm, leg, and gloves) as well as sizing inserts to tailor the fit of the suit to an individual. This study sought to determine a way to identify the point at which an ideal suit fit transforms into a bad fit and how to quantify this breakdown using mobility-based physical performance data. The study examined changes in human physical performance via degradation of the elbow and wrist range of motion of the planetary suit prototype (Mark III) with respect to changes in sizing, as well as how to apply that knowledge to suit sizing options and improvements in suit fit. The methods implemented in this study focused on changes in elbow and wrist mobility due to incremental suit sizing modifications. This incremental sizing was within a range that included both optimum and poor fit. Suited range of motion data were collected using a motion analysis system for nine isolated and functional tasks encompassing the elbow and wrist joints. A total of four subjects were tested, with motions involving both arms simultaneously as well as the right arm only. The results were then compared across sizing configurations. The results of this study indicate that range of motion may be used as a viable parameter to quantify at what stage suit sizing causes a detriment in performance; however, the human performance decrement appeared to be based on the interaction of multiple joints along a limb, not a single joint angle. The study identified a preliminary method to quantify the impact of size on performance and developed a means to gauge tolerances around optimal size. More work is needed to improve the assessment of optimal fit and to compensate for multiple joint interactions.
Can we define an infant's need from the composition of human milk?
Stam, José; Sauer, Pieter Jj; Boehm, Günther
2013-08-01
Human milk is recommended as the optimal nutrient source for infants and is associated with several short- and long-term benefits for child health. When accepting that human milk is the optimal nutrition for healthy term infants, it should be possible to calculate the nutritional needs of these infants from the intake of human milk. These data can then be used to design the optimal composition of infant formulas. In this review we show that the composition of human milk is rather variable and is dependent on factors such as the beginning or end of a feeding, duration of lactation, diet and body composition of the mother, maternal genes, and possibly infant factors such as sex. In particular, the composition of fatty acids in human milk is quite variable. It therefore seems questionable to estimate the nutritional needs of an infant exclusively from the intake of human milk. The optimal intake for infants must be based, at least in part, on other information, e.g., balance or stable-isotope studies. The present recommendation that the composition of infant formulas should be based on the composition of human milk needs revision.
NASA Astrophysics Data System (ADS)
Ghaly, Michael; Links, Jonathan M.; Frey, Eric
2015-03-01
In this work, we used the ideal observer (IO) and IO with model mismatch (IO-MM) applied in the projection domain and an anthropomorphic Channelized Hotelling Observer (CHO) applied to reconstructed images to optimize the acquisition energy window width and evaluate various scatter compensation methods in the context of a myocardial perfusion SPECT defect detection task. The IO has perfect knowledge of the image formation process and thus reflects performance with perfect compensation for image-degrading factors. Thus, using the IO to optimize imaging systems could lead to suboptimal parameters compared to those optimized for humans interpreting SPECT images reconstructed with imperfect or no compensation. The IO-MM allows incorporating imperfect system models into the IO optimization process. We found that with near-perfect scatter compensation, the optimal energy window for the IO and CHO were similar; in its absence the IO-MM gave a better prediction of the optimal energy window for the CHO using different scatter compensation methods. These data suggest that the IO-MM may be useful for projection-domain optimization when model mismatch is significant, and that the IO is useful when followed by reconstruction with good models of the image formation process.
Task-based lens design with application to digital mammography
NASA Astrophysics Data System (ADS)
Chen, Liying; Barrett, Harrison H.
2005-01-01
Recent advances in model observers that predict human perceptual performance now make it possible to optimize medical imaging systems for human task performance. We illustrate the procedure by considering the design of a lens for use in an optically coupled digital mammography system. The channelized Hotelling observer is used to model human performance, and the channels chosen are differences of Gaussians. The task performed by the model observer is detection of a lesion at a random but known location in a clustered lumpy background mimicking breast tissue. The entire system is simulated with a Monte Carlo application according to physics principles, and the main system component under study is the imaging lens that couples a fluorescent screen to a CCD detector. The signal-to-noise ratio (SNR) of the channelized Hotelling observer is used to quantify the detectability of the simulated lesion (signal) on the simulated mammographic background. Plots of channelized Hotelling SNR versus signal location for various lens apertures, working distances, and focusing planes are presented. These plots thus illustrate the trade-off between coupling efficiency and blur in a task-based manner. In this way, the channelized Hotelling SNR is used as a merit function for lens design.
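A minimal sketch of computing a channelized Hotelling SNR with difference-of-Gaussian channels is shown below; the image size, channel widths, lesion profile, and white-noise backgrounds are assumptions for illustration, not the simulated mammography system described.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 64                                              # image size (toy)
yy, xx = np.mgrid[:N, :N] - N // 2
r2 = xx**2 + yy**2

def dog_channel(s1, s2):
    """Difference-of-Gaussians radial channel, normalized and flattened to a vector."""
    c = np.exp(-r2 / (2 * s1**2)) - np.exp(-r2 / (2 * s2**2))
    return (c / np.linalg.norm(c)).ravel()

U = np.column_stack([dog_channel(s, 1.67 * s) for s in (2, 4, 8, 16)])   # channel matrix

signal = 5.0 * np.exp(-r2 / (2 * 3.0**2)).ravel()   # toy Gaussian lesion profile

def sample(add_signal, n=200):
    """Draw n noisy images (white-noise backgrounds) and return their channel outputs."""
    imgs = rng.normal(size=(n, N * N)) + (signal if add_signal else 0.0)
    return imgs @ U                                  # shape (n, 4)

v1, v0 = sample(True), sample(False)
dv = v1.mean(0) - v0.mean(0)
S = 0.5 * (np.cov(v1.T) + np.cov(v0.T))              # average channel covariance
snr = np.sqrt(dv @ np.linalg.solve(S, dv))           # channelized Hotelling SNR
print("CHO SNR:", round(float(snr), 2))
```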
Task-based lens design, with application to digital mammography
NASA Astrophysics Data System (ADS)
Chen, Liying
Recent advances in model observers that predict human perceptual performance now make it possible to optimize medical imaging systems for human task performance. We illustrate the procedure by considering the design of a lens for use in an optically coupled digital mammography system. The channelized Hotelling observer is used to model human performance, and the channels chosen are differences of Gaussians (DOGs). The task performed by the model observer is detection of a lesion at a random but known location in a clustered lumpy background mimicking breast tissue. The entire system is simulated with a Monte Carlo application according to the physics principles, and the main system component under study is the imaging lens that couples a fluorescent screen to a CCD detector. The SNR of the channelized Hotelling observer is used to quantify the detectability of the simulated lesion (signal) upon the simulated mammographic background. In this work, plots of channelized Hotelling SNR vs. signal location for various lens apertures, various working distances, and various focusing places are shown. These plots thus illustrate the trade-off between coupling efficiency and blur in a task-based manner. In this way, the channelized Hotelling SNR is used as a merit function for lens design.
Prediction of muscle performance during dynamic repetitive movement
NASA Technical Reports Server (NTRS)
Byerly, D. L.; Byerly, K. A.; Sognier, M. A.; Squires, W. G.
2003-01-01
BACKGROUND: During long-duration spaceflight, astronauts experience progressive muscle atrophy and often perform strenuous extravehicular activities. Post-flight, there is a lengthy recovery period with an increased risk for injury. Currently, there is a critical need for an enabling tool to optimize muscle performance and to minimize the risk of injury to astronauts while on-orbit and during post-flight recovery. Consequently, these studies were performed to develop a method to address this need. METHODS: Eight test subjects performed a repetitive dynamic exercise to failure at 65% of their upper torso weight using a Lordex spinal machine. Surface electromyography (SEMG) data was collected from the erector spinae back muscle. The SEMG data was evaluated using a 5th order autoregressive (AR) model and linear regression analysis. RESULTS: The best predictor found was an AR parameter, the mean average magnitude of AR poles, with r = 0.75 and p = 0.03. This parameter can predict performance to failure as early as the second repetition of the exercise. CONCLUSION: A method for predicting human muscle performance early during dynamic repetitive exercise was developed. The capability to predict performance to failure has many potential applications to the space program including evaluating countermeasure effectiveness on-orbit, optimizing post-flight recovery, and potential future real-time monitoring capability during extravehicular activity.
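The AR-based predictor can be illustrated with a short sketch that fits a 5th-order autoregressive model by Yule-Walker and reports the mean magnitude of its poles, the kind of parameter reported above; the synthetic signal below merely stands in for SEMG data.

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(size=2048)
for _ in range(3):                          # crude smoothing to mimic a colored, SEMG-like signal
    x = np.convolve(x, np.ones(3) / 3, mode="same")

def ar_poles(x, p=5):
    """Fit an order-p AR model by the Yule-Walker equations and return its poles."""
    x = x - x.mean()
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(p + 1)]) / len(x)
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz matrix
    a = np.linalg.solve(R, r[1:])           # AR coefficients a_1..a_p
    # Poles = roots of 1 - a_1 z^-1 - ... - a_p z^-p.
    return np.roots(np.concatenate(([1.0], -a)))

poles = ar_poles(x, p=5)
print("mean AR pole magnitude:", round(float(np.mean(np.abs(poles))), 3))
```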
Abney, Drew H; Paxton, Alexandra; Dale, Rick; Kello, Christopher T
2015-11-01
Successful interaction requires complex coordination of body movements. Previous research has suggested a functional role for coordination and especially synchronization (i.e., time-locked movement across individuals) in different types of human interaction contexts. Although such coordination has been shown to be nearly ubiquitous in human interaction, less is known about its function. One proposal is that synchrony supports and facilitates communication (Topics Cogn Sci 1:305-319, 2009). However, questions still remain about what the properties of coordination for optimizing communication might look like. In the present study, dyads worked together to construct towers from uncooked spaghetti and marshmallows. Using cross-recurrence quantification analysis, we found that dyads with loosely coupled gross body movements performed better, supporting recent work suggesting that simple synchrony may not be the key to effective performance (Riley et al. 2011). We also found evidence that leader-follower dynamics-when sensitive to the specific role structure of the interaction-impact task performance. We discuss our results with respect to the functional role of coordination in human interaction.
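A minimal cross-recurrence sketch on two synthetic movement series is shown below to illustrate the type of analysis used; the normalization, recurrence radius, and lag range are assumptions, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(9)
t = np.linspace(0, 20, 500)
a = np.sin(t) + 0.2 * rng.normal(size=t.size)        # body movement of person A (toy)
b = np.sin(t - 0.5) + 0.2 * rng.normal(size=t.size)  # person B, loosely following A

# Cross-recurrence matrix: True where the two normalized series fall within a radius.
za, zb = (a - a.mean()) / a.std(), (b - b.mean()) / b.std()
CR = np.abs(za[:, None] - zb[None, :]) < 0.3          # recurrence radius (assumed)

rr = CR.mean()                                        # cross-recurrence rate
# Diagonal-wise recurrence profile: which lag between the two series recurs most.
lags = list(range(-50, 51))
profile = [np.mean(np.diagonal(CR, offset=k)) for k in lags]
print("recurrence rate: %.3f, peak lag: %d samples" % (rr, lags[int(np.argmax(profile))]))
```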
A survey of the dummy face and human face stimuli used in BCI paradigm.
Chen, Long; Jin, Jing; Zhang, Yu; Wang, Xingyu; Cichocki, Andrzej
2015-01-15
It has been shown that human face stimuli are superior to flash-only stimuli in BCI systems. However, human face stimuli may lead to copyright infringement problems and are hard to edit according to the requirements of a BCI study. Recently, it was reported that facial expression changes could be produced by changing a curve in a dummy face, which achieved good performance when applied to visual P300-based BCI systems. In this paper, four different paradigms were presented, called the dummy face pattern, human face pattern, inverted dummy face pattern and inverted human face pattern, to evaluate the performance of dummy face stimuli compared with human face stimuli. The key point determining the value of dummy faces in BCI systems is whether dummy face stimuli can achieve performance as good as that of human face stimuli. Online and offline results of the four paradigms were obtained and comparatively analyzed. They showed that there was no significant difference between dummy faces and human faces in ERPs, classification accuracy or information transfer rate when applied in BCI systems. Dummy face stimuli could evoke large ERPs and obtain classification accuracy and information transfer rates as high as those of human face stimuli. Since dummy faces are easy to edit and have no copyright infringement problems, they are a good choice for optimizing the stimuli of BCI systems. Copyright © 2014 Elsevier B.V. All rights reserved.
Isolation strategy of a two-strain avian influenza model using optimal control
NASA Astrophysics Data System (ADS)
Mardlijah, Ariani, Tika Desi; Asfihani, Tahiyatul
2017-08-01
Avian influenza has killed many victims, both birds and humans. Most cases of avian influenza infection in humans have resulted from transmission from poultry to humans. Avian influenza in humans can be prevented or minimized by pharmaceutical and non-pharmaceutical measures such as the use of masks, isolation, etc. We analyze a two-strain avian influenza model that focuses on treatment of symptoms with isolation, and investigate the stability of the equilibrium points using the Routh-Hurwitz criteria. We also use optimal control to reduce the number of infected humans, taking the isolation level as the control, and the resulting optimal control is then simulated. The optimal control problem in this study is solved using Pontryagin's Minimum Principle, and the simulations use a Runge-Kutta method. The results obtained show that applying two controls is more effective than applying only one control.
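To illustrate the kind of simulation described, the sketch below integrates a toy single-strain model with a constant isolation rate using classical fourth-order Runge-Kutta; it is not the paper's two-strain model or its Pontryagin-based optimal control solution.

```python
import numpy as np

def rhs(y, beta=0.4, gamma=0.1, u=0.3):
    """Toy single-strain model with isolation: S, I, Q (isolated), R compartments."""
    S, I, Q, R = y
    N = S + I + Q + R
    return np.array([-beta * S * I / N,
                     beta * S * I / N - gamma * I - u * I,
                     u * I - gamma * Q,
                     gamma * (I + Q)])

def rk4(y, dt, **kw):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(y, **kw)
    k2 = rhs(y + dt / 2 * k1, **kw)
    k3 = rhs(y + dt / 2 * k2, **kw)
    k4 = rhs(y + dt * k3, **kw)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

for u in (0.0, 0.3):                       # no isolation vs. constant isolation effort
    y, peak = np.array([990.0, 10.0, 0.0, 0.0]), 0.0
    for _ in range(2000):                  # 200 days at dt = 0.1
        y = rk4(y, 0.1, u=u)
        peak = max(peak, y[1])
    print(f"isolation u={u}: peak infectives {peak:.0f}, final recovered {y[3]:.0f}")
```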
Commonality between Reduced Gravity and Microgravity Habitats for Long Duration Missions
NASA Technical Reports Server (NTRS)
Howard, Robert
2014-01-01
Many conceptual studies for long duration missions beyond Earth orbit have assumed unique habitat designs for each destination and for transit habitation. This may not be the most effective approach. A variable gravity habitat, one designed for use in microgravity, lunar, Martian, and terrestrial environments may provide savings that offset the loss of environment-specific optimization. However, a brief analysis of selected flown spacecraft and Constellation-era conceptual habitat designs suggests that one cannot simply lift a habitat from one environment and place it in another that it was not designed for without incurring significant human performance compromises. By comparison, a conceptual habitat based on the Skylab II framework but designed specifically to accommodate variable gravity environments can be shown to yield significant advantages while incurring only minimal human performance compromises.
Ghorbani, Mahdi; Chamsaz, Mahmoud; Rounaghi, Gholam Hossein
2016-03-01
A simple, rapid, and sensitive method for the determination of naproxen and ibuprofen in complex biological and water matrices (cow milk, human urine, river, and well water samples) has been developed using ultrasound-assisted magnetic dispersive solid-phase microextraction. Magnetic ethylendiamine-functionalized graphene oxide nanocomposite was synthesized and used as a novel adsorbent for the microextraction process and showed great adsorptive ability toward these analytes. Different parameters affecting the microextraction were optimized with the aid of the experimental design approach. A Plackett-Burman screening design was used to study the main variables affecting the microextraction process, and the Box-Behnken optimization design was used to optimize the previously selected variables for extraction of naproxen and ibuprofen. The optimized technique provides good repeatability (relative standard deviations of the intraday precision 3.1 and 3.3, interday precision of 5.6 and 6.1%), linearity (0.1-500 and 0.3-650 ng/mL), low limits of detection (0.03 and 0.1 ng/mL), and a high enrichment factor (168 and 146) for naproxen and ibuprofen, respectively. The proposed method can be successfully applied in routine analysis for determination of naproxen and ibuprofen in cow milk, human urine, and real water samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Motion Cueing Algorithm Development: Human-Centered Linear and Nonlinear Approaches
NASA Technical Reports Server (NTRS)
Houck, Jacob A. (Technical Monitor); Telban, Robert J.; Cardullo, Frank M.
2005-01-01
While the performance of flight simulator motion system hardware has advanced substantially, the development of the motion cueing algorithm, the software that transforms simulated aircraft dynamics into realizable motion commands, has not kept pace. Prior research identified viable features from two algorithms: the nonlinear "adaptive algorithm", and the "optimal algorithm" that incorporates human vestibular models. A novel approach to motion cueing, the "nonlinear algorithm" is introduced that combines features from both approaches. This algorithm is formulated by optimal control, and incorporates a new integrated perception model that includes both visual and vestibular sensation and the interaction between the stimuli. Using a time-varying control law, the matrix Riccati equation is updated in real time by a neurocomputing approach. Preliminary pilot testing resulted in the optimal algorithm incorporating a new otolith model, producing improved motion cues. The nonlinear algorithm vertical mode produced a motion cue with a time-varying washout, sustaining small cues for longer durations and washing out large cues more quickly compared to the optimal algorithm. The inclusion of the integrated perception model improved the responses to longitudinal and lateral cues. False cues observed with the NASA adaptive algorithm were absent. The neurocomputing approach was crucial in that the number of presentations of an input vector could be reduced to meet the real time requirement without degrading the quality of the motion cues.
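For reference, the generic optimal-control machinery that such washout algorithms typically build on can be written as follows; the matrices are placeholders, and this is not the report's specific vestibular-model formulation.

```latex
% Generic optimal-control form underlying washout filter design (matrices are generic
% placeholders, not the report's specific vestibular-model realization).
% For plant \dot{x} = A x + B u and quadratic cost
J = \int_0^\infty \left( x^{\mathsf T} Q\, x + u^{\mathsf T} R\, u \right) dt ,
% the optimal feedback is u = -R^{-1} B^{\mathsf T} P x, with P solving the
% algebraic Riccati equation
A^{\mathsf T} P + P A - P B R^{-1} B^{\mathsf T} P + Q = 0 .
% In the time-varying formulation described above, the Riccati solution P is updated
% in real time rather than solved once offline.
```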
Three-dimensional displays and stereo vision
Westheimer, Gerald
2011-01-01
Procedures for three-dimensional image reconstruction that are based on the optical and neural apparatus of human stereoscopic vision have to be designed to work in conjunction with it. The principal methods of implementing stereo displays are described. Properties of the human visual system are outlined as they relate to depth discrimination capabilities and achieving optimal performance in stereo tasks. The concept of depth rendition is introduced to define the change in the parameters of three-dimensional configurations for cases in which the physical disposition of the stereo camera with respect to the viewed object differs from that of the observer's eyes. PMID:21490023
Aerodynamic design of the Cal Poly Da Vinci Human-Powered Helicopter
NASA Technical Reports Server (NTRS)
Larwood, Scott; Saiki, Neal
1990-01-01
This paper will discuss the methodology used in designing the rotor and drive propellers for the third generation Cal Poly Da Vinci Human-Powered Helicopter. The rotor was designed using a lifting surface, uniform inflow hover analysis code and the propeller was designed using a minimum induced-loss method. Construction, geometry, and operating considerations are discussed as they impact the designs. Optimization of the design performance is also explained. The propellers were tested in a wind tunnel and results are compared with theoretical data. Successful flight tests of the Da Vinci III are discussed.
Selecting Tasks for Evaluating Human Performance as a Function of Gravity
NASA Technical Reports Server (NTRS)
Norcross, J. R.; Gernhardt, M. L.
2010-01-01
A challenge in understanding human performance as a function of gravity is determining which tasks to research. Initial studies began with treadmill walking, which was easy to quantify and control. However, with the development of pressurized rovers, it is less important to optimize human performance for ambulation, as rovers will likely perform gross translation for the crew. Future crews are likely to spend much of their extravehicular activity (EVA) performing geology, construction and maintenance type tasks, for which it is difficult to measure steady-state workloads. To evaluate human performance in reduced gravity, we have collected metabolic, biomechanical and subjective data for different tasks at varied gravity levels. Methods: Ten subjects completed 5 different tasks including weight transfer, shoveling, treadmill walking, treadmill running and treadmill incline walking. All tasks were performed shirt-sleeved at 1-g, 3/8-g and 1/6-g. Off-loaded conditions were achieved via the Active Response Gravity Offload System. Treadmill tasks were performed for 3 minutes with reported oxygen consumption (VO2) averaged over the last 2 minutes. Shoveling was performed for 3 minutes with metabolic cost reported as ml O2 consumed per kg of material shoveled. For weight transfer, metabolic cost is reported as liters of O2 consumed to complete the task. Statistical analysis was performed via repeated measures ANOVA. Results: Statistically significant metabolic differences were noted between all 3 gravity levels for treadmill running and incline walking. For the other 3 tasks, there were significant differences between 1-g and each reduced gravity level, but not between 1/6-g and 3/8-g. For weight transfer, significant differences were seen between gravity levels in both trial-average VO2 and time-to-completion, with noted differences in strategy for task completion. Conclusion: This research indicates that, to determine whether gravity has a metabolic effect on human performance, tasks should be selected that require the subject to work vertically against the force of gravity.
2014-11-24
Developed devices and optimized electrokinetic pre-concentration conditions for key neurological biomarkers of interest, using nanoparticles and aptamers to enhance specificity. Additionally, pre-concentration was coupled to various detection paradigms to achieve high-sensitivity biomarker detection. Keywords: aptamers, biomarkers, nanofluidics, pre-concentration devices, sensing.
Optimization of a Paper-Based ELISA for a Human Performance Biomarker
2013-11-11
the response of the assay through minimizing the interaction of the antigen with the cellulose fibers. However, the chemiluminescent assay used ... to proteases, microbial contamination, or ...
Differences in Multitask Resource Reallocation After Change in Task Values.
Matton, Nadine; Paubel, Pierre; Cegarra, Julien; Raufaste, Eric
2016-12-01
The objective was to characterize multitask resource reallocation strategies when managing subtasks with various assigned values. When solving a resource conflict in multitasking, Salvucci and Taatgen predict that a globally rational strategy will be followed, one that favors the most urgent subtask and optimizes global performance. However, Katidioti and Taatgen identified a locally rational strategy that optimizes only a subcomponent of the whole task, with detrimental consequences for global performance. Moreover, the question remains open whether expertise affects the choice of strategy. We adopted a multitask environment used for pilot selection, with a change in emphasis on two of four subtasks while all subtasks had to be kept above a minimum performance level. A laboratory eye-tracking study contrasted 20 recently selected pilot students, considered experienced with this task, with 15 university students considered novices. When two subtasks were emphasized, novices focused their resources mainly on one high-value subtask and failed to prevent both low-value subtasks from falling below minimum performance. In contrast, experienced participants delayed the processing of one low-value subtask but managed to optimize global performance. In a multitasking environment where some subtasks are emphasized, novices follow a locally rational strategy whereas experienced participants follow a globally rational strategy. During complex training, trainees are only able to adjust their resource allocation strategy to subtask emphasis changes once they are familiar with the multitasking environment. © 2016, Human Factors and Ergonomics Society.
Utilization of the Space Vision System as an Augmented Reality System For Mission Operations
NASA Technical Reports Server (NTRS)
Maida, James C.; Bowen, Charles
2003-01-01
Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to flight hardware capable of utilizing this technology. This is the basis for this proposed Space Human Factors Engineering project, the determination of the display symbology within the performance limits of the Space Vision System that will objectively improve human performance. This utilization of existing flight hardware will greatly reduce the costs of implementation for flight. Besides being used onboard shuttle and space station and as a ground-based system for mission operational support, it also has great potential for science and medical training and diagnostics, remote learning, team learning, video/media conferencing, and educational outreach.
Singularity now: using the ventricular assist device as a model for future human-robotic physiology.
Martin, Archer K
2016-04-01
In our 21st century world, human-robotic interactions are far more complicated than Asimov predicted in 1942. The future of human-robotic interactions includes human-robotic machine hybrids with an integrated physiology, working together to achieve an enhanced level of baseline human physiological performance. This achievement can be described as a biological Singularity. I argue that this time of Singularity cannot be met by current biological technologies, and that human-robotic physiology must be integrated for the Singularity to occur. In order to conquer the challenges we face regarding human-robotic physiology, we first need to identify a working model in today's world. Once identified, this model can form the basis for the study, creation, expansion, and optimization of human-robotic hybrid physiology. In this paper, I present and defend the line of argument that currently this kind of model (proposed to be named "IshBot") can best be studied in ventricular assist devices (VADs).
An algorithm for automatic parameter adjustment for brain extraction in BrainSuite
NASA Astrophysics Data System (ADS)
Rajagopal, Gautham; Joshi, Anand A.; Leahy, Richard M.
2017-02-01
Brain Extraction (classification of brain and non-brain tissue) of MRI brain images is a crucial pre-processing step necessary for imaging-based anatomical studies of the human brain. Several automated methods and software tools are available for performing this task, but differences in MR image parameters (pulse sequence, resolution) and instrument- and subject-dependent noise and artefacts affect the performance of these automated methods. We describe and evaluate a method that automatically adapts the default parameters of the Brain Surface Extraction (BSE) algorithm to optimize a cost function chosen to reflect accurate brain extraction. BSE uses a combination of anisotropic filtering, Marr-Hildreth edge detection, and binary morphology for brain extraction. Our algorithm automatically adapts four parameters associated with these steps to maximize the brain surface area to volume ratio. We evaluate the method on a total of 109 brain volumes with ground truth brain masks generated by an expert user. A quantitative evaluation of the performance of the proposed algorithm showed an improvement in the mean (s.d.) Dice coefficient from 0.8969 (0.0376) for default parameters to 0.9509 (0.0504) for the optimized case. These results indicate that automatic parameter optimization can result in significant improvements in definition of the brain mask.
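The evaluation metric reported above, the Dice coefficient between an automatically extracted brain mask and a ground-truth mask, can be computed in a few lines of NumPy. The sketch below uses toy binary volumes in place of real MRI-derived masks.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 3D masks standing in for the extracted and ground-truth brain volumes.
auto_mask = np.zeros((64, 64, 64), dtype=bool)
auto_mask[10:50, 10:50, 10:50] = True
truth_mask = np.zeros((64, 64, 64), dtype=bool)
truth_mask[12:52, 10:50, 10:50] = True
print(f"Dice = {dice_coefficient(auto_mask, truth_mask):.4f}")
```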
Bai, Mingsian R; Hsieh, Ping-Ju; Hur, Kur-Nan
2009-02-01
The performance of the minimum mean-square error noise reduction (MMSE-NR) algorithm in conjunction with time-recursive averaging (TRA) for noise estimation is found to be very sensitive to the choice of two recursion parameters. To address this problem in a more systematic manner, this paper proposes an optimization method to efficiently search for the optimal parameters of the MMSE-TRA-NR algorithms. The objective function is based on a regression model, whereas the optimization process is carried out with the simulated annealing algorithm, which is well suited for problems with many local optima. Another NR algorithm proposed in the paper employs linear prediction coding as a preprocessor for extracting the correlated portion of human speech. Objective and subjective tests were undertaken to compare the optimized MMSE-TRA-NR algorithm with several conventional NR algorithms. The results of the subjective tests were processed using analysis of variance to assess statistical significance. A post hoc test, Tukey's Honestly Significant Difference, was conducted to further assess the pairwise differences between the NR algorithms.
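As an illustration of the optimization strategy described above, the sketch below uses SciPy's dual_annealing to search two recursion parameters over the unit interval. The objective here is a toy surrogate with several local optima; in the paper the objective is a regression model of noise-reduction quality, which is not reproduced.

```python
import numpy as np
from scipy.optimize import dual_annealing

def objective(params):
    """Placeholder for the paper's regression-based quality score.
    alpha, beta are the two recursion (smoothing) parameters in (0, 1)."""
    alpha, beta = params
    # Toy surrogate with multiple local optima; the real objective would score
    # the noise-reduced speech produced by the MMSE-TRA-NR algorithm.
    return (np.sin(8 * alpha) * np.cos(5 * beta)
            + (alpha - 0.7) ** 2 + (beta - 0.3) ** 2)

result = dual_annealing(objective, bounds=[(0.01, 0.99), (0.01, 0.99)], seed=1)
print("optimal (alpha, beta):", result.x, " objective:", result.fun)
```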
Image gathering and processing - Information and fidelity
NASA Technical Reports Server (NTRS)
Huck, F. O.; Fales, C. L.; Halyo, N.; Samms, R. W.; Stacy, K.
1985-01-01
In this paper we formulate and use information and fidelity criteria to assess image gathering and processing, combining optical design with image-forming and edge-detection algorithms. The optical design of the image-gathering system revolves around the relationship among sampling passband, spatial response, and signal-to-noise ratio (SNR). Our formulations of information, fidelity, and optimal (Wiener) restoration account for the insufficient sampling (i.e., aliasing) common in image gathering as well as for the blurring and noise that conventional formulations account for. Performance analyses and simulations for ordinary optical-design constraints and random scenes indicate that (1) different image-forming algorithms prefer different optical designs; (2) informationally optimized designs maximize the robustness of optimal image restorations and lead to the highest-spatial-frequency channel (relative to the sampling passband) for which edge detection is reliable (if the SNR is sufficiently high); and (3) combining the informationally optimized design with a 3 by 3 lateral-inhibitory image-plane-processing algorithm leads to a spatial-response shape that approximates the optimal edge-detection response of (Marr's model of) human vision and thus reduces the data preprocessing and transmission required for machine vision.
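The classical frequency-domain Wiener restoration referenced above can be sketched as follows, assuming a known point-spread function and a constant noise-to-signal power ratio; the paper's formulation additionally accounts for aliasing from insufficient sampling, which this simplified version omits.

```python
import numpy as np

def wiener_restore(blurred: np.ndarray, psf: np.ndarray, nsr: float) -> np.ndarray:
    """Frequency-domain Wiener restoration: apply H* / (|H|^2 + NSR) to the image.
    nsr is the noise-to-signal power ratio, assumed constant here."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))

# Toy example: blur a random scene with a small box PSF, add noise, then restore.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf, s=scene.shape)))
noisy = blurred + rng.normal(0, 0.01, scene.shape)
restored = wiener_restore(noisy, psf, nsr=0.01)
```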
Rate-distortion theory and human perception.
Sims, Chris R
2016-07-01
The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory. Copyright © 2016 The Author. Published by Elsevier B.V. All rights reserved.
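A standard way to compute rate-distortion-optimal encoders of the kind this framework relies on is the Blahut-Arimoto iteration. The sketch below is a generic illustration (it does not reproduce the cited R package), applied to a toy absolute-identification setup with squared-error distortion; the trade-off parameter beta is an assumption.

```python
import numpy as np

def blahut_arimoto(p_x, distortion, beta, n_iter=200):
    """Rate-distortion-optimal conditional q(y|x) for trade-off parameter beta
    (larger beta tolerates less distortion at the cost of a higher rate)."""
    n_x, n_y = distortion.shape
    q_y = np.full(n_y, 1.0 / n_y)                        # marginal over outputs
    for _ in range(n_iter):
        q_y_given_x = q_y * np.exp(-beta * distortion)   # unnormalized channel
        q_y_given_x /= q_y_given_x.sum(axis=1, keepdims=True)
        q_y = p_x @ q_y_given_x                          # update output marginal
    return q_y_given_x

# Toy absolute-identification setup: 8 stimuli, squared-error distortion.
x = np.arange(8)
p_x = np.full(8, 1 / 8)
D = (x[:, None] - x[None, :]) ** 2
q = blahut_arimoto(p_x, D, beta=0.5)
print(np.round(q, 3))
```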
The Evolution of Generosity in the Ultimatum Game.
Hintze, Arend; Hertwig, Ralph
2016-09-28
When humans fail to make optimal decisions in strategic games and economic gambles, researchers typically try to explain why that behaviour is biased. To this end, they search for mechanisms that cause human behaviour to deviate from what seems to be the rational optimum. But perhaps human behaviour is not biased; perhaps research assumptions about the optimality of strategies are incomplete. In the one-shot anonymous symmetric ultimatum game (UG), humans fail to play optimally as defined by the Nash equilibrium. However, the distinction between kin and non-kin (with kin detection being a key evolutionary adaptation) is often neglected when deriving the "optimal" strategy. We computationally evolved strategies in the UG that were equipped with an evolvable probability to discern kin from non-kin. When an opponent was not kin, agents evolved strategies that were similar to those used by humans. We therefore conclude that the strategy humans play is not irrational. The deviation between behaviour and the Nash equilibrium may rather be attributable to key evolutionary adaptations, such as kin detection. Our findings further suggest that social preference models are likely to capture mechanisms that permit people to play optimally in an evolutionary context. Once this context is taken into account, human behaviour no longer appears irrational.
Chaudhary, Amit; Yadav, Birendra Singh; Singh, Swati; Maurya, Pramod Kumar; Mishra, Alok; Srivastva, Shweta; Varadwaj, Pritish Kumar; Singh, Nand Kumar; Mani, Ashutosh
2017-10-01
Ficus religiosa L., generally known as Peepal, belongs to the family Moraceae. The tree is a source of many compounds of high medicinal value. In the gastrointestinal tract, histamine H2 receptors have a key role in histamine-stimulated gastric acid secretion. Their overstimulation causes excessive acid production, which is responsible for gastric ulcer. This study aims to screen the range of phytochemicals present in F. religiosa for binding with the human histamine H2 receptor and to identify therapeutics for gastric ulcer from the plant. In this work, a 3D structure of the human histamine H2 receptor was modeled using homology modeling and the predicted model was validated using PROCHECK. Docking studies were performed to assess binding affinities between the modeled receptor and 34 compounds. Molecular dynamics simulations were carried out to identify the most stable receptor-ligand complexes. Absorption, distribution, metabolism, and excretion (ADME) screening was done to evaluate the pharmacokinetic properties of the compounds. The results suggest that seven ligands, namely germacrene, bergaptol, lanosterol, ergost-5-en-3beta-ol, α-amyrin acetate, bergapten, and γ-cadinene, showed better binding affinities. Among the seven phytochemicals, lanosterol and α-amyrin acetate were found to have greater stability during the simulation studies; these two compounds may be suitable therapeutic agents against the histamine H2 receptor. This study was performed to screen antiulcer compounds from F. religiosa. Molecular modeling, molecular docking, and MD simulation studies were performed with selected phytochemicals from F. religiosa, and the analysis suggests that lanosterol and α-amyrin may be suitable therapeutic agents against the histamine H2 receptor. This study facilitates initiation of the herbal drug discovery process for antiulcer activity. Abbreviations used: ADMET: Absorption, distribution, metabolism, excretion and toxicity; DOPE: Discrete Optimized Potential Energy; OPLS: Optimized potential for liquid simulations; RMSD: Root-mean-square deviation; HOA: Human oral absorption; MW: Molecular weight; SP: Standard-precision; XP: Extra-precision; GPCRs: G protein-coupled receptors; SASA: Solvent accessible surface area; Rg: Radius of gyration; NHB: Number of hydrogen bonds.
Song, Yeonhwa; Kim, Jin-Sun; Kim, Se-Hyuk; Park, Yoon Kyung; Yu, Eunsil; Kim, Ki-Hun; Seo, Eul-Ju; Oh, Heung-Bum; Lee, Han Chu; Kim, Kang Mo; Seo, Haeng Ran
2018-05-25
Hepatocellular carcinoma (HCC) is one of the most common malignant tumors worldwide and has a poor prognosis. In particular, patients with HCC usually have poor tolerance of systemic chemotherapy, because HCCs develop from chronically damaged tissue that contains considerable inflammation, fibrosis, and cirrhosis. Since HCC exhibits highly heterogeneous molecular characteristics, a proper in vitro system is required for the study of HCC pathogenesis. To this end, we have established two new hepatitis B virus (HBV) DNA-secreting HCC cell lines from infected patients. Based on these two new HCC cell lines, we have developed chemosensitivity assays for patient-derived multicellular tumor spheroids (MCTSs) in order to select optimized anti-cancer drugs and provide more informative data for clinical drug application. To monitor the effect of the interaction of cancer cells and stromal cells in MCTSs, we used a 3D co-culture model with patient-derived HCC cells and stromal cells from human hepatic stellate cells, human fibroblasts, and human umbilical vein endothelial cells to facilitate screening for optimized cancer therapy. To validate our system, we compared the chemosensitivity of three culture systems, namely monolayer culture, tumor spheroids, and MCTSs of patient-derived cells, to sorafenib, 5-fluorouracil, and cisplatin, as these compounds are the typical standard therapy for advanced HCC in South Korea. In summary, these findings suggest that the MCTS culture system is the best methodology for screening for optimized treatment for each patient with HCC, because tumor spheroids not only mirror the 3D cellular context of the tumors but also exhibit the therapeutically relevant pathophysiological gradients and heterogeneity of in vivo tumors.
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1991-01-01
Advances in computer and control technology offer the opportunity for task-offload aiding in human-machine systems. A task-offload aid (e.g., an autopilot, an intelligent assistant) can be selectively engaged by the human operator to dynamically delegate tasks to an automated system. Successful design and performance prediction in such systems requires knowledge of the factors influencing the strategy the operator develops and uses for managing interaction with the task-offload aid. A model is presented that shows how such strategies can be predicted as a function of three task context properties (frequency and duration of secondary tasks and costs of delaying secondary tasks) and three aid design properties (aid engagement and disengagement times, and aid performance relative to human performance). Sensitivity analysis indicates how each of these contextual and design factors affects the optimal aid usage strategy and attainable system performance. The model is applied to understanding human-automation interaction in laboratory experiments on human supervisory control behavior. The laboratory task allowed subjects freedom to determine strategies for using an autopilot in a dynamic, multi-task environment. Modeling results suggested that many subjects may indeed have been acting appropriately by not using the autopilot in the way its designers intended. Although the autopilot function was technically sound, this aid was not designed with due regard to the overall task context in which it was placed. These results demonstrate the need for additional research on how people may strategically manage their own resources, as well as those provided by automation, in an effort to keep workload and performance at acceptable levels.
McKenna, Matthew T.; Wang, Shijun; Nguyen, Tan B.; Burns, Joseph E.; Petrick, Nicholas; Summers, Ronald M.
2012-01-01
Computer-aided detection (CAD) systems have been shown to improve the diagnostic performance of CT colonography (CTC) in the detection of premalignant colorectal polyps. Despite the improvement, the overall system is not optimal. CAD annotations on true lesions are incorrectly dismissed, and false positives are misinterpreted as true polyps. Here, we conduct an observer performance study utilizing distributed human intelligence in the form of anonymous knowledge workers (KWs) to investigate human performance in classifying polyp candidates under different presentation strategies. We evaluated 600 polyp candidates from 50 patients, each case having at least one polyp ≥6 mm, from a large database of CTC studies. Each polyp candidate was labeled independently as a true or false polyp by 20 KWs and an expert radiologist. We asked each labeler to determine whether the candidate was a true polyp after looking at a single 3D-rendered image of the candidate and after watching a video fly-around of the candidate. We found that distributed human intelligence improved significantly when presented with the additional information in the video fly-around. We noted that performance degraded with increasing interpretation time and increasing difficulty, but distributed human intelligence performed better than our CAD classifier for “easy” and “moderate” polyp candidates. Further, we observed numerous parallels between the expert radiologist and the KWs. Both showed similar improvement in classification moving from single-image to video interpretation. Additionally, difficulty estimates obtained from the KWs using an expectation maximization algorithm correlated well with the difficulty rating assigned by the expert radiologist. Our results suggest that distributed human intelligence is a powerful tool that will aid in the development of CAD for CTC. PMID:22705287
Hippocampal brain-network coordination during volitional exploratory behavior enhances learning.
Voss, Joel L; Gonsalves, Brian D; Federmeier, Kara D; Tranel, Daniel; Cohen, Neal J
2011-01-01
Exploratory behaviors during learning determine what is studied and when, helping to optimize subsequent memory performance. To elucidate the cognitive and neural determinants of exploratory behaviors, we manipulated the control that human subjects had over the position of a moving window through which they studied objects and their locations. Our behavioral, neuropsychological and neuroimaging data indicate that volitional control benefits memory performance and is linked to a brain network that is centered on the hippocampus. Increases in correlated activity between the hippocampus and other areas were associated with specific aspects of memory, which suggests that volitional control optimizes interactions among specialized neural systems through the hippocampus. Memory is therefore an active process that is intrinsically linked to behavior. Furthermore, brain structures that are typically seen as passive participants in memory encoding (for example, the hippocampus) are actually part of an active network that controls behavior dynamically as it unfolds.
Online adaptation and over-trial learning in macaque visuomotor control.
Braun, Daniel A; Aertsen, Ad; Paz, Rony; Vaadia, Eilon; Rotter, Stefan; Mehring, Carsten
2011-01-01
When faced with unpredictable environments, the human motor system has been shown to develop optimized adaptation strategies that allow for online adaptation during the control process. Such online adaptation is to be contrasted to slower over-trial learning that corresponds to a trial-by-trial update of the movement plan. Here we investigate the interplay of both processes, i.e., online adaptation and over-trial learning, in a visuomotor experiment performed by macaques. We show that simple non-adaptive control schemes fail to perform in this task, but that a previously suggested adaptive optimal feedback control model can explain the observed behavior. We also show that over-trial learning as seen in learning and aftereffect curves can be explained by learning in a radial basis function network. Our results suggest that both the process of over-trial learning and the process of online adaptation are crucial to understand visuomotor learning.
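A minimal sketch of the kind of radial basis function network that can capture trial-by-trial (over-trial) learning is given below. The tuning-curve centers, widths, learning rate, and the 30-degree visuomotor rotation are illustrative assumptions, not the parameters fitted in the study.

```python
import numpy as np

# RBF network mapping target direction (deg) to a corrective motor adjustment.
centers = np.linspace(0, 360, 24, endpoint=False)
width = 30.0
weights = np.zeros_like(centers)

def rbf(theta):
    # Circular distance to each basis-function center.
    d = np.minimum(np.abs(theta - centers), 360 - np.abs(theta - centers))
    return np.exp(-0.5 * (d / width) ** 2)

def predict(theta):
    return weights @ rbf(theta)

lr, rotation = 0.1, 30.0      # learning rate and imposed visuomotor rotation (deg)
rng = np.random.default_rng(0)
for trial in range(200):
    theta = rng.uniform(0, 360)
    error = rotation - predict(theta)      # residual directional error on this trial
    weights += lr * error * rbf(theta)     # trial-by-trial weight update

# Removing the rotation after learning would leave an aftereffect of opposite sign.
print("learned compensation near 90 deg:", round(predict(90.0), 1))
```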
Status of Low Thrust Work at JSC
NASA Technical Reports Server (NTRS)
Condon, Gerald L.
2004-01-01
High performance low thrust (solar electric, nuclear electric, variable specific impulse magnetoplasma rocket) propulsion offers a significant benefit to NASA missions beyond low Earth orbit. As NASA (e.g., the Prometheus Project) endeavors to develop these propulsion systems and associated power supplies, it becomes necessary to develop a refined trajectory design capability that will allow engineers to develop future robotic and human mission designs that take advantage of this new technology. This ongoing work addresses development of a trajectory design and optimization tool for assessing low thrust (and other types of) trajectories. This work aims to advance the state of the art, enable future NASA missions and science drivers, and enhance education. This presentation provides a summary of the low-thrust-related JSC activities under the ISP program and, specifically, provides a look at a new release of a multi-gravity, multi-spacecraft trajectory optimization tool (Copernicus) along with analysis performed using this tool over the past year.
Peters, Ryan M.; Staibano, Phillip
2015-01-01
The ability to resolve the orientation of edges is crucial to daily tactile and sensorimotor function, yet the means by which edge perception occurs is not well understood. Primate cortical area 3b neurons have diverse receptive field (RF) spatial structures that may participate in edge orientation perception. We evaluated five candidate RF models for macaque area 3b neurons, previously recorded while an oriented bar contacted the monkey's fingertip. We used a Bayesian classifier to assign each neuron a best-fit RF structure. We generated predictions for human performance by implementing an ideal observer that optimally decoded stimulus-evoked spike counts in the model neurons. The ideal observer predicted a saturating reduction in bar orientation discrimination threshold with increasing bar length. We tested 24 humans on an automated, precision-controlled bar orientation discrimination task and observed performance consistent with that predicted. We next queried the ideal observer to discover the RF structure and number of cortical neurons that best matched each participant's performance. Human perception was matched with a median of 24 model neurons firing throughout a 1-s period. The 10 lowest-performing participants were fit with RFs lacking inhibitory sidebands, whereas 12 of the 14 higher-performing participants were fit with RFs containing inhibitory sidebands. Participants whose discrimination improved as bar length increased to 10 mm were fit with longer RFs; those who performed well on the 2-mm bar, with narrower RFs. These results suggest plausible RF features and computational strategies underlying tactile spatial perception and may have implications for perceptual learning. PMID:26354318
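The ideal-observer decoding described above can be illustrated with a toy Bayesian classifier that picks the orientation maximizing the Poisson likelihood of simulated spike counts. The tuning functions, neuron count, and orientation pair below are assumptions for illustration, not the recorded area 3b responses.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
orientations = np.array([-10.0, 10.0])   # two bar orientations to discriminate (deg)

# Hypothetical tuning: each model neuron's mean spike count depends on orientation.
n_neurons = 24
pref = rng.uniform(-40, 40, n_neurons)
def mean_counts(theta):
    return 2.0 + 8.0 * np.exp(-0.5 * ((theta - pref) / 20.0) ** 2)

def ideal_observer(counts):
    """Pick the orientation with the highest Poisson log-likelihood of the counts."""
    loglik = [poisson.logpmf(counts, mean_counts(th)).sum() for th in orientations]
    return orientations[int(np.argmax(loglik))]

# Simulate trials at the true orientation +10 deg and measure decoding accuracy.
correct = 0
for _ in range(1000):
    counts = rng.poisson(mean_counts(10.0))
    correct += ideal_observer(counts) == 10.0
print("ideal-observer accuracy:", correct / 1000)
```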
Royo Sánchez, Ana Cristina; Aguilar Martín, Juan José; Santolaria Mazo, Jorge
2014-12-01
Motion capture systems are often used for checking and analyzing human motion in biomechanical applications. It is important, in this context, that the systems provide the best possible accuracy. Among existing capture systems, optical systems are those with the highest accuracy. In this paper, the development of a new calibration procedure for optical human motion capture systems is presented. The performance and effectiveness of this new calibration procedure are also checked by experimental validation. The new calibration procedure consists of two stages. In the first stage, initial estimates of the intrinsic and extrinsic parameters are sought. The camera calibration method used in this stage is the one proposed by Tsai. These parameters are determined from the camera characteristics, the spatial position of the camera, and the center of the capture volume. In the second stage, a simultaneous nonlinear optimization of all parameters is performed to identify the optimal values, which minimize the objective function. The objective function, in this case, minimizes two errors. The first error is the distance error between two markers placed on a wand. The second error is the error in position and orientation of the retroreflective markers of a static calibration object. The real co-ordinates of the two objects are calibrated on a co-ordinate measuring machine (CMM). The OrthoBio system is used to validate the new calibration procedure. The resulting errors are 90% lower than those from the previous calibration software and broadly comparable with results from a similarly configured Vicon system.
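As a deliberately simplified stand-in for the second-stage optimization, the sketch below refines a single global scale factor so that reconstructed wand marker pairs recover the known wand length, using SciPy's least_squares. The actual procedure jointly refines all intrinsic and extrinsic camera parameters and also uses the static calibration object, which this toy omits; the wand length and the synthetic reconstruction error are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

WAND_LENGTH = 500.0   # mm; assumed calibrated distance between the two wand markers
rng = np.random.default_rng(0)

# Synthetic "reconstructed" wand marker pairs with a 1% global scale error.
a = rng.uniform(0.0, 2000.0, (50, 3))
d = rng.normal(size=(50, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
measured = np.stack([a, a + WAND_LENGTH * d], axis=1) * 1.01

def residuals(scale):
    """Wand-length error in every frame after applying a global scale correction."""
    corrected = measured * scale[0]
    lengths = np.linalg.norm(corrected[:, 0] - corrected[:, 1], axis=1)
    return lengths - WAND_LENGTH

fit = least_squares(residuals, x0=[1.0])
print("estimated scale correction:", fit.x[0])   # approx. 1 / 1.01
```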
Sherman, Maxwell A; Lee, Shane; Law, Robert; Haegens, Saskia; Thorn, Catherine A; Hämäläinen, Matti S; Moore, Christopher I; Jones, Stephanie R
2016-08-16
Human neocortical 15-29-Hz beta oscillations are strong predictors of perceptual and motor performance. However, the mechanistic origin of beta in vivo is unknown, hindering understanding of its functional role. Combining human magnetoencephalography (MEG), computational modeling, and laminar recordings in animals, we present a new theory that accounts for the origin of spontaneous neocortical beta. In our MEG data, spontaneous beta activity from somatosensory and frontal cortex emerged as noncontinuous beta events typically lasting <150 ms with a stereotypical waveform. Computational modeling uniquely designed to infer the electrical currents underlying these signals showed that beta events could emerge from the integration of nearly synchronous bursts of excitatory synaptic drive targeting proximal and distal dendrites of pyramidal neurons, where the defining feature of a beta event was a strong distal drive that lasted one beta period (∼50 ms). This beta mechanism rigorously accounted for the beta event profiles; several other mechanisms did not. The spatial location of synaptic drive in the model to supragranular and infragranular layers was critical to the emergence of beta events and led to the prediction that beta events should be associated with a specific laminar current profile. Laminar recordings in somatosensory neocortex from anesthetized mice and awake monkeys supported these predictions, suggesting this beta mechanism is conserved across species and recording modalities. These findings make several predictions about optimal states for perceptual and motor performance and guide causal interventions to modulate beta for optimal function.
Sensory Optimization by Stochastic Tuning
Jurica, Peter; Gepshtein, Sergei; Tyukin, Ivan; van Leeuwen, Cees
2013-01-01
Individually, visual neurons are each selective for several aspects of stimulation, such as stimulus location, frequency content, and speed. Collectively, the neurons implement the visual system’s preferential sensitivity to some stimuli over others, manifested in behavioral sensitivity functions. We ask how the individual neurons are coordinated to optimize visual sensitivity. We model synaptic plasticity in a generic neural circuit, and find that stochastic changes in strengths of synaptic connections entail fluctuations in parameters of neural receptive fields. The fluctuations correlate with uncertainty of sensory measurement in individual neurons: the higher the uncertainty the larger the amplitude of fluctuation. We show that this simple relationship is sufficient for the stochastic fluctuations to steer sensitivities of neurons toward a characteristic distribution, from which follows a sensitivity function observed in human psychophysics, and which is predicted by a theory of optimal allocation of receptive fields. The optimal allocation arises in our simulations without supervision or feedback about system performance and independently of coupling between neurons, making the system highly adaptive and sensitive to prevailing stimulation. PMID:24219849
Deep biomarkers of human aging: Application of deep neural networks to biomarker development
Putin, Evgeny; Mamoshina, Polina; Aliper, Alexander; Korzinkin, Mikhail; Moskalev, Alexey; Kolosov, Alexey; Ostrovskiy, Alexander; Cantor, Charles; Vijg, Jan; Zhavoronkov, Alex
2016-01-01
One of the major impediments in human aging research is the absence of a comprehensive and actionable set of biomarkers that may be targeted and measured to track the effectiveness of therapeutic interventions. In this study, we designed a modular ensemble of 21 deep neural networks (DNNs) of varying depth, structure and optimization to predict human chronological age using a basic blood test. To train the DNNs, we used over 60,000 samples from common blood biochemistry and cell count tests from routine health exams performed by a single laboratory and linked to chronological age and sex. The best performing DNN in the ensemble demonstrated 81.5 % epsilon-accuracy r = 0.90 with R2 = 0.80 and MAE = 6.07 years in predicting chronological age within a 10 year frame, while the entire ensemble achieved 83.5% epsilon-accuracy r = 0.91 with R2 = 0.82 and MAE = 5.55 years. The ensemble also identified the 5 most important markers for predicting human chronological age: albumin, glucose, alkaline phosphatase, urea and erythrocytes. To allow for public testing and evaluate real-life performance of the predictor, we developed an online system available at http://www.aging.ai. The ensemble approach may facilitate integration of multi-modal data linked to chronological age and sex that may lead to simple, minimally invasive, and affordable methods of tracking integrated biomarkers of aging in humans and performing cross-species feature importance analysis. PMID:27191382
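A scaled-down illustration of the ensemble idea, not the authors' 21-network architecture, is sketched below using scikit-learn MLP regressors trained on synthetic data; the toy age relationship and sample size are assumptions made purely for the example (only the five marker names come from the abstract).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
# Synthetic stand-in for blood biochemistry / cell count features (not real data).
features = ["albumin", "glucose", "alk_phosphatase", "urea", "erythrocytes"]
n = 2000
X = rng.normal(size=(n, len(features)))
age = 50 + 10 * X[:, 0] - 8 * X[:, 1] + rng.normal(0, 5, n)   # toy relationship

X_tr, X_te, y_tr, y_te = train_test_split(X, age, random_state=0)

# Small ensemble of MLPs with varying depth/width, averaged at prediction time.
ensemble = [
    MLPRegressor(hidden_layer_sizes=h, max_iter=2000, random_state=i).fit(X_tr, y_tr)
    for i, h in enumerate([(32,), (64, 32), (64, 64, 32)])
]
pred = np.mean([m.predict(X_te) for m in ensemble], axis=0)
print("ensemble MAE (years):", round(mean_absolute_error(y_te, pred), 2))
```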
Optimization of wearable microwave antenna with simplified electromagnetic model of the human body
NASA Astrophysics Data System (ADS)
Januszkiewicz, Łukasz; Barba, Paolo Di; Hausman, Sławomir
2017-12-01
In this paper the problem of optimizing the design of a microwave wearable antenna is investigated. Reference is made to a specific design, a wideband Vee antenna whose geometry is characterized by 6 parameters. These parameters were automatically adjusted with EStra, an evolution-strategy-based algorithm, to obtain impedance matching of the antenna located in the proximity of the human body. The antenna was designed to operate in the ISM (industrial, scientific, medical) band, which covers the frequency range of 2.4 GHz to 2.5 GHz. The optimization procedure used a full-wave simulator based on the finite-difference time-domain method with a simplified human body model. The procedure accounted for small movements of the antenna toward or away from the human body that are likely to occur during real use; the stability of the antenna parameters irrespective of the movements of the user's body is an important factor in wearable antenna design. The optimization yielded good impedance matching over the given range of antenna distances from the human body.
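A minimal sketch of an evolution-strategy loop of the kind described is given below, with the full-wave FDTD simulation replaced by a placeholder surrogate function and a worst-case objective over a few assumed antenna-body distances; none of the numbers correspond to the actual EStra runs or the Vee antenna geometry.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate_s11(geometry, distance_mm):
    """Placeholder for the FDTD full-wave simulation: a fake reflection
    coefficient |S11| in dB for a 6-parameter geometry at a given
    antenna-body distance. Purely illustrative."""
    return -3.0 - 15.0 * np.exp(-np.sum((geometry - 1.0) ** 2)) + 0.2 * distance_mm

def objective(geometry):
    # Worst-case matching over an assumed range of antenna-body distances (mm).
    return max(surrogate_s11(geometry, d) for d in (2.0, 5.0, 10.0))

# Simple (1+1) evolution strategy over the six geometry parameters.
x = rng.uniform(0.5, 1.5, 6)
step = 0.1
best = objective(x)
for _ in range(300):
    candidate = x + rng.normal(0.0, step, 6)
    value = objective(candidate)
    if value < best:          # keep the candidate if worst-case matching improves
        x, best = candidate, value
        step *= 1.1           # expand the search step on success
    else:
        step *= 0.98          # contract it on failure
print("best worst-case |S11| (dB):", round(best, 2))
```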
Reconstruction of Tissue-Specific Metabolic Networks Using CORDA
Schultz, André; Qutub, Amina A.
2016-01-01
Human metabolism involves thousands of reactions and metabolites. To interpret this complexity, computational modeling becomes an essential experimental tool. One of the most popular techniques to study human metabolism as a whole is genome scale modeling. A key challenge to applying genome scale modeling is identifying critical metabolic reactions across diverse human tissues. Here we introduce a novel algorithm called Cost Optimization Reaction Dependency Assessment (CORDA) to build genome scale models in a tissue-specific manner. CORDA performs more efficiently computationally, shows better agreement to experimental data, and displays better model functionality and capacity when compared to previous algorithms. CORDA also returns reaction associations that can greatly assist in any manual curation to be performed following the automated reconstruction process. Using CORDA, we developed a library of 76 healthy and 20 cancer tissue-specific reconstructions. These reconstructions identified which metabolic pathways are shared across diverse human tissues. Moreover, we identified changes in reactions and pathways that are differentially included and present different capacity profiles in cancer compared to healthy tissues, including up-regulation of folate metabolism, the down-regulation of thiamine metabolism, and tight regulation of oxidative phosphorylation. PMID:26942765
NASA Technical Reports Server (NTRS)
Polsgrove, Tara P.; Thomas, Herbert D.; Dwyer Ciancio, Alicia; Collins, Tim; Samareh, Jamshid
2017-01-01
Landing humans on Mars is one of NASA's long term goals. NASA's Evolvable Mars Campaign (EMC) is focused on evaluating architectural trade options to define the capabilities and elements needed to sustain human presence on the surface of Mars. The EMC study teams have considered a variety of in-space propulsion options and surface mission options. Understanding how these choices affect the performance of the lander will allow a balanced optimization of this complex system of systems problem. This paper presents the effects of mission and vehicle design options on lander mass and performance. Beginning with Earth launch, options include fairing size assumptions, co-manifesting elements with the lander, and Earth-Moon vicinity operations. Capturing into Mars orbit using either aerocapture or propulsive capture is assessed. For entry, descent, and landing both storable as well as oxygen and methane propellant combinations are considered, engine thrust level is assessed, and sensitivity to landed payload mass is presented. This paper focuses on lander designs using the Hypersonic Inflatable Aerodynamic Decelerators, one of several entry system technologies currently considered for human missions.
The history behind successful uterine transplantation in humans
Castellón, Luis Arturo Ruvalcaba; Amador, Martha Isolina García; González, Roberto Enrique Díaz; Eduardo, Montoya Sarmiento Jorge; Díaz-García, César; Kvarnström, Niclas; Bränström, Mats
2017-01-01
This paper aimed to describe the basic aspects of uterine transplant (UTx) research in humans, including preliminary experiences in rodents and domestic species. Studies in rats, domestic species, and non-human primates validated and optimized the UTx procedure in terms of its surgical aspects, immunosuppression, rejection diagnosis, peculiarities of pregnancy in immunosuppressed patients, and patients with special uterine conditions. In animal species, the first live birth from UTx was achieved in a syngeneic mouse model in 2003. Twenty-five UTx procedures have been performed in humans. The first two cases were unsuccessful, but established the need for rigorous research to improve success rates. As a result of a controlled clinical study under a strictly designed research protocol, nine subsequent UTx procedures have resulted in six healthy live births, the first of them in 2014. Further failed UTx procedures have been performed in China, the Czech Republic, Brazil, Germany, and the United States, most of them using living donors. Albeit still an experimental procedure, UTx is the first potential alternative for the treatment of absolute uterine factor infertility (AUFI). PMID:28609280
Pickard, Dawn
2007-01-01
We have developed experiments and materials to model human genetics using rapid cycling Brassica rapa, also known as Fast Plants. Because of their self-incompatibility for pollination and the genetic diversity within strains, B. rapa can serve as a relevant model for human genetics in teaching laboratory experiments. The experiment presented here is a paternity exclusion project in which a child is born with a known mother but two possible alleged fathers. Students use DNA markers (microsatellites) to perform paternity exclusion on these subjects. Realistic DNA marker analysis can be challenging to implement within the limitations of an instructional lab, but we have optimized the experimental methods to work in a teaching lab environment and to maximize the “hands-on” experience for the students. The genetic individuality of each B. rapa plant, revealed by analysis of polymorphic microsatellite markers, means that each time students perform this project, they obtain unique results that foster independent thinking in the process of data interpretation. PMID:17548880
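The logic of microsatellite-based paternity exclusion used in this exercise can be sketched in a few lines of code; the locus names and allele sizes below are invented for illustration.

```python
# Toy paternity-exclusion check on microsatellite genotypes (allele sizes in bp).
child   = {"locus1": (152, 158), "locus2": (201, 209), "locus3": (94, 98)}
mother  = {"locus1": (152, 160), "locus2": (201, 205), "locus3": (94, 100)}
father1 = {"locus1": (150, 158), "locus2": (207, 209), "locus3": (96, 98)}
father2 = {"locus1": (150, 156), "locus2": (203, 207), "locus3": (96, 102)}

def excluded(child_gt, mother_gt, alleged_father_gt):
    """A man is excluded if, at any locus, he carries none of the alleles the
    child must have inherited paternally (the child alleles not explained by the mother)."""
    for locus, (c1, c2) in child_gt.items():
        paternal_candidates = {a for a in (c1, c2) if a not in mother_gt[locus]} or {c1, c2}
        if not paternal_candidates & set(alleged_father_gt[locus]):
            return True   # exclusion at this locus
    return False

for name, genotype in [("father1", father1), ("father2", father2)]:
    print(name, "excluded" if excluded(child, mother, genotype) else "not excluded")
```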
EMU Suit Performance Simulation
NASA Technical Reports Server (NTRS)
Cowley, Matthew S.; Benson, Elizabeth; Harvill, Lauren; Rajulu, Sudhakar
2014-01-01
Introduction: Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for research and development are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques that focus on a human-centric design paradigm. These new techniques make use of virtual prototype simulations and fully adjustable physical prototypes of suit hardware. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process. Objectives: The primary objective was to test modern simulation techniques for evaluating the human performance component of two EMU suit concepts, the pivoted and planar style hard upper torso (HUT). Methods: This project simulated variations in EVA suit shoulder joint design and subject anthropometry and then measured the differences in shoulder mobility caused by the modifications. These estimates were compared to human-in-the-loop test data gathered during past suited testing using four subjects (two large males, two small females). Results: Results demonstrated that EVA suit modeling and simulation are feasible design tools for evaluating and optimizing suit design based on simulated performance. The suit simulation model was found to be advantageous in its ability to visually represent complex motions and volumetric reach zones in three dimensions, giving designers a faster and deeper comprehension of suit component performance vs. human performance. Suit models were able to discern differing movement capabilities between EMU HUT configurations, generic suit fit concerns, and specific suit fit concerns for crewmembers based on individual anthropometry.
An Illumination Modeling System for Human Factors Analyses
NASA Technical Reports Server (NTRS)
Huynh, Thong; Maida, James C.; Bond, Robert L. (Technical Monitor)
2002-01-01
Seeing is critical to human performance. Lighting is critical for seeing. Therefore, lighting is critical to human performance. This is common sense, and here on earth, it is easily taken for granted. However, on orbit, because the sun will rise or set every 45 minutes on average, humans working in space must cope with extremely dynamic lighting conditions. Contrast conditions of harsh shadowing and glare are also severe. The prediction of lighting conditions for critical operations is essential. Crew training can factor lighting into the lesson plans when necessary. Mission planners can determine whether low-light video cameras are required or whether additional luminaires need to be flown. The optimization of the quantity and quality of light is needed because of the effects on crew safety, on electrical power and on equipment maintainability. To address all of these issues, an illumination modeling system has been developed by the Graphics Research and Analyses Facility (GRAF) and Lighting Environment Test Facility (LETF) in the Space Human Factors Laboratory at NASA Johnson Space Center. The system uses physically based ray tracing software (Radiance) developed at Lawrence Berkeley Laboratories, a human factors oriented geometric modeling system (PLAID) and an extensive database of humans and environments. Material reflectivity properties of major surfaces and critical surfaces are measured using a gonio-reflectometer. Luminaires (lights) are measured for beam spread distribution, color and intensity. Video camera performance is measured for color and light sensitivity. 3D geometric models of humans and the environment are combined with the material and light models to form a system capable of predicting lighting conditions and visibility conditions in space.
Automation effects in a multiloop manual control system
NASA Technical Reports Server (NTRS)
Hess, R. A.; Mcnally, B. D.
1986-01-01
An experimental and analytical study was undertaken to investigate human interaction with a simple multiloop manual control system in which the human's activity was systematically varied by changing the level of automation. The system simulated was the longitudinal dynamics of a hovering helicopter. The automation systems stabilized vehicle responses from attitude to velocity to position and also provided for display automation in the form of a flight director. The control-loop structure resulting from the task definition can be considered a simple stereotype of a hierarchical control system. The experimental study was complemented by an analytical modeling effort which utilized simple crossover models of the human operator. It was shown that such models can be extended to the description of multiloop tasks involving preview and precognitive human operator behavior. The existence of time-optimal manual control behavior was established for these tasks and the role which internal models may play in establishing human-machine performance was discussed.
Near-optimal integration of facial form and motion.
Dobs, Katharina; Ma, Wei Ji; Reddy, Leila
2017-09-08
Human perception consists of the continuous integration of sensory cues pertaining to the same object. While it has been fairly well shown that humans use an optimal strategy when integrating low-level cues proportional to their relative reliability, the integration processes underlying high-level perception are much less understood. Here we investigate cue integration in a complex high-level perceptual system, the human face processing system. We tested cue integration of facial form and motion in an identity categorization task and found that an optimal model could successfully predict subjects' identity choices. Our results suggest that optimal cue integration may be implemented across different levels of the visual processing hierarchy.
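The reliability-weighted (maximum-likelihood) cue combination rule against which such behavior is typically compared is easy to state in code; the example values for the form and motion cues below are assumptions.

```python
import numpy as np

def integrate_cues(estimates, variances):
    """Reliability-weighted (maximum-likelihood) cue combination.
    Each cue's weight is proportional to its inverse variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    combined = np.sum(weights * estimates)
    combined_var = 1.0 / np.sum(1.0 / variances)   # never larger than the best single cue
    return combined, combined_var

# Example: the form cue supports "identity A" with evidence 0.8 (variance 0.04),
# the motion cue says 0.6 but is noisier (variance 0.16).
est, var = integrate_cues([0.8, 0.6], [0.04, 0.16])
print(est, var)   # weighted toward the more reliable form cue
```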
Gamifying Video Object Segmentation.
Spampinato, Concetto; Palazzo, Simone; Giordano, Daniela
2017-10-01
Video object segmentation can be considered as one of the most challenging computer vision problems. Indeed, so far, no existing solution is able to effectively deal with the peculiarities of real-world videos, especially in cases of articulated motion and object occlusions; limitations that appear more evident when we compare the performance of automated methods with the human one. However, manually segmenting objects in videos is largely impractical as it requires a lot of time and concentration. To address this problem, in this paper we propose an interactive video object segmentation method, which exploits, on one hand, the capability of humans to identify correctly objects in visual scenes, and on the other hand, the collective human brainpower to solve challenging and large-scale tasks. In particular, our method relies on a game with a purpose to collect human inputs on object locations, followed by an accurate segmentation phase achieved by optimizing an energy function encoding spatial and temporal constraints between object regions as well as human-provided location priors. Performance analysis carried out on complex video benchmarks, and exploiting data provided by over 60 users, demonstrated that our method shows a better trade-off between annotation times and segmentation accuracy than interactive video annotation and automated video object segmentation approaches.
NASA Astrophysics Data System (ADS)
Lee, Kangwon
Intelligent vehicle systems, such as Adaptive Cruise Control (ACC) or Collision Warning/Collision Avoidance (CW/CA), are currently under development, and several companies have already offered ACC on selected models. Control or decision-making algorithms of these systems are commonly evaluated under extensive computer simulations and well-defined scenarios on test tracks. However, they have rarely been validated with large quantities of naturalistic human driving data. This dissertation utilized two University of Michigan Transportation Research Institute databases (Intelligent Cruise Control Field Operational Test and System for Assessment of Vehicle Motion Environment) in the development and evaluation of longitudinal driver models and CW/CA algorithms. First, to examine how drivers normally follow other vehicles, the vehicle motion data from the databases were processed using a Kalman smoother. The processed data was then used to fit and evaluate existing longitudinal driver models (e.g., the linear follow-the-leader model, the Newell's special model, the nonlinear follow-the-leader model, the linear optimal control model, the Gipps model and the optimal velocity model). A modified version of the Gipps model was proposed and found to be accurate in both microscopic (vehicle) and macroscopic (traffic) senses. Second, to examine emergency braking behavior and to evaluate CW/CA algorithms, the concepts of signal detection theory and a performance index suitable for unbalanced situations (few threatening data points vs. many safe data points) are introduced. Selected existing CW/CA algorithms were found to have a performance index (geometric mean of true-positive rate and precision) not exceeding 20%. To optimize the parameters of the CW/CA algorithms, a new numerical optimization scheme was developed to replace the original data points with their representative statistics. A new CW/CA algorithm was proposed, which was found to score higher than 55% in the performance index. This dissertation provides a model of how drivers follow lead-vehicles that is much more accurate than other models in the literature. Furthermore, the data-based approach was used to confirm that a CW/CA algorithm utilizing lead-vehicle braking was substantially more effective than existing algorithms, leading to collision warning systems that are much more likely to contribute to driver safety.
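For reference, a minimal sketch of a Gipps-style car-following speed update of the kind modified in this work is given below; the parameter values (desired speed, accelerations, reaction time, effective vehicle length) are illustrative, not those fitted to the naturalistic driving data.

```python
import math

def gipps_speed(v, v_lead, gap, a=1.7, b=3.0, b_hat=3.0, V=30.0, tau=0.7, s0=6.5):
    """One Gipps-model speed update (m/s) for the following vehicle.
    v, v_lead: current follower/leader speeds; gap: inter-vehicle spacing (m)."""
    # Free-flow (acceleration) branch, limited by the desired speed V.
    v_acc = v + 2.5 * a * tau * (1 - v / V) * math.sqrt(0.025 + v / V)
    # Safe-braking branch: never exceed the speed from which the follower can
    # still stop behind the leader, given the assumed leader braking rate b_hat.
    braking_term = (b * tau) ** 2 + b * (2.0 * (gap - s0) - v * tau + v_lead ** 2 / b_hat)
    v_dec = -b * tau + math.sqrt(max(0.0, braking_term))
    return max(0.0, min(v_acc, v_dec, V))

# Follow a lead vehicle that brakes from 25 m/s down to 15 m/s.
dt, v, x, v_lead, x_lead = 0.7, 25.0, 0.0, 25.0, 50.0
for _ in range(40):
    v_lead = max(15.0, v_lead - 1.0 * dt)
    x_lead += v_lead * dt
    v = gipps_speed(v, v_lead, gap=x_lead - x)
    x += v * dt
print("final following speed (m/s):", round(v, 1), " gap (m):", round(x_lead - x, 1))
```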
A case study on implementing lean ergonomic manufacturing systems (LEMS) in an automobile industry
NASA Astrophysics Data System (ADS)
Srinivasa Rao, P.; Niraj, Malay
2016-09-01
Lean manufacturing is a business strategy developed in Japan. In the present scenario, the global market demands new techniques for achieving higher production rates with good quality at low cost. In this context, human factors and working conditions must be given due importance. The study demonstrates the adoption of ergonomic conditions in lean manufacturing for the improvement of the organizational performance of the industry. The aim of ergonomics is to adapt new techniques to the work in efficient and safe ways in order to optimize human health and increase the production rate. A survey across various disciplines was conducted to show how the production rate and human ergonomic conditions are affected.
Fernandez-Torres, R; Consentino, M Olías; Lopez, M A Bello; Mochon, M Callejon
2010-05-15
A new, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) analytical method for the quantitative determination of 11 antibiotics (drugs) and the main metabolites of five of them present in human urine has been developed, optimized and validated. The analytes belong to four different groups of antibiotics (sulfonamides, tetracyclines, penicillins and amphenicols). The analyzed compounds were sulfadiazine (SDI) and its N(4)-acetylsulfadiazine (NDI) metabolite, sulfamethazine (SMZ) and its N(4)-acetylsulfamethazine (NMZ), sulfamerazine (SMR) and its N(4)-acetylsulfamerazine (NMR), sulfamethoxazole (SMX), trimethoprim (TMP), amoxicillin (AMX) and its main metabolite amoxicilloic acid (AMA), ampicillin (AMP) and its main metabolite ampicilloic acid (APA), chloramphenicol (CLF), thiamphenicol (TIF), oxytetracycline (OXT) and chlortetracycline (CLT). For HPLC analysis, diode array (DAD) and fluorescence (FLD) detectors were used. The separation of the analyzed compounds was conducted by means of a Phenomenex Gemini C(18) (150 mm x 4.6 mm I.D., 5 µm particle size) analytical column with a LiChroCART LiChrospher C(18) (4 mm x 4 mm, 5 µm particle size) guard column. Analyzed drugs were determined within 34 min using 0.1% formic acid in water and acetonitrile in gradient elution mode as the mobile phase. A linear response was observed for all compounds in the range of concentrations studied. Two procedures were optimized for sample preparation: a direct treatment with methanol and acetonitrile, and a solid-phase extraction procedure using Bond Elut Plexa columns. The method was applied to the determination of the analytes in human urine from volunteers under treatment with different pharmaceutical formulations. This method can be successfully applied to the routine determination of all these drugs in human urine samples.
Robotic Billiards: Understanding Humans in Order to Counter Them.
Nierhoff, Thomas; Leibrandt, Konrad; Lorenz, Tamara; Hirche, Sandra
2016-08-01
Ongoing technological advances in the areas of computation, sensing, and mechatronics enable robotic-based systems to interact with humans in the real world. To succeed against a human in a competitive scenario, a robot must anticipate the human behavior and include it in its own planning framework. Then it can predict the next human move and counter it accordingly, thus not only achieving overall better performance but also systematically exploiting the opponent's weak spots. Pool is used as a representative scenario to derive a model-based planning and control framework where not only the physics of the environment but also a model of the opponent is considered. By representing the game of pool as a Markov decision process and incorporating a model of the human decision-making based on studies, an optimized policy is derived. This enables the robot to include the opponent's typical game style into its tactical considerations when planning a stroke. The results are validated in simulations and real-life experiments with an anthropomorphic robot playing pool against a human.
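The framework above represents pool as a Markov decision process in which the opponent's behavior enters the transition model. The toy value-iteration sketch below illustrates that idea on a drastically simplified turn-based game; the state space, the two shot types, and all probabilities are invented for illustration and are not the paper's model.

```python
import itertools
import numpy as np

# Toy turn-based "pool" MDP: state = (robot balls left, opponent balls left),
# robot to shoot.  Each action trades pot probability against how easy a
# table it leaves; the opponent is folded into the transition via a simple
# behavioral model (its pot probability after a robot miss).
N = 4
ACTIONS = {"safe": (0.60, 0.35), "aggressive": (0.45, 0.20)}
#           name    (robot pot prob, opponent pot prob after the leave)

def value_iteration(tol=1e-9):
    V = {(r, o): 0.0 for r in range(N + 1) for o in range(N + 1)}
    policy = {}
    while True:
        delta = 0.0
        for r, o in itertools.product(range(N + 1), repeat=2):
            if r == 0:                      # robot cleared its balls first: win
                V[(r, o)] = 1.0
                continue
            if o == 0:                      # opponent cleared first: loss
                V[(r, o)] = 0.0
                continue
            best, best_a = -np.inf, None
            for a, (p_robot, p_opp) in ACTIONS.items():
                # robot pots and shoots again, or misses and the opponent
                # either pots one ball or misses (back to the robot)
                v = (p_robot * V[(r - 1, o)]
                     + (1 - p_robot) * (p_opp * V[(r, o - 1)]
                                        + (1 - p_opp) * V[(r, o)]))
                if v > best:
                    best, best_a = v, a
            delta = max(delta, abs(best - V[(r, o)]))
            V[(r, o)], policy[(r, o)] = best, best_a
        if delta < tol:
            return V, policy

V, policy = value_iteration()
print("opening action:", policy[(N, N)], " win probability:", round(V[(N, N)], 3))
```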
Design concepts for the Centrifuge Facility Life Sciences Glovebox
NASA Technical Reports Server (NTRS)
Sun, Sidney C.; Horkachuck, Michael J.; Mckeown, Kellie A.
1989-01-01
The Life Sciences Glovebox will provide the bioisolated environment to support on-orbit operations involving non-human live specimens and samples for human life sciences experiments. It will be part of the Centrifuge Facility, in which animal and plant specimens are housed in bioisolated Habitat modules and transported to the Glovebox as part of the experiment protocols supported by the crew. At the Glovebox, up to two crew members and two habitat modules must be accommodated to provide flexibility and support optimal operations. This paper will present several innovative design concepts that attempt to satisfy the basic Glovebox requirements. These concepts were evaluated for ergonomics and ease of operations using computer modeling and full-scale mockups. The more promising ideas were presented to scientists and astronauts for their evaluation. Their comments, and the results from other evaluations, are presented. Based on the evaluations, the authors recommend designs and features that will help optimize crew performance and facilitate science accommodations, and specify problem areas that require further study.
Wu, Tingzhu; Lin, Yue; Zheng, Lili; Guo, Ziquan; Xu, Jianxing; Liang, Shijie; Liu, Zhuguagn; Lu, Yijun; Shih, Tien-Mo; Chen, Zhong
2018-02-19
An optimal design of light-emitting diode (LED) lighting that benefits both the photosynthetic performance of plants and the visual health of human eyes has drawn considerable attention. In the present study, we have developed a multi-color driving algorithm that serves as a liaison between desired spectral power distributions and pulse-width-modulation duty cycles. With the aid of this algorithm, our multi-color plant-growth light sources can optimize correlated-color temperature (CCT) and color rendering index (CRI) such that photosynthetic luminous efficacy of radiation (PLER) is maximized regardless of the number of LEDs and the type of photosynthetic action spectrum (PAS). To illustrate the accuracy of the proposed algorithm and the practicality of our plant-growth light sources, we choose six color LEDs and the German PAS for experiments. Finally, our study can help provide a useful guide for improving light quality in plant factories, in which long-term co-inhabitance of plants and human beings is required.
Using the principles of circadian physiology enhances shift schedule design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connolly, J.J.; Moore-Ede, M.C.
1987-01-01
Nuclear power plants must operate 24 h a day, 7 days a week. For the most part, shift schedules currently in use at nuclear power plants have been designed to meet operational needs without considering the biological clocks of the human operators. The development of schedules that also take circadian principles into account is a positive step that can be taken to improve plant safety by optimizing operator alertness. These schedules reduce the probability of human errors, especially during backshifts. In addition, training programs that teach round-the-clock workers how to deal with the problems of shiftwork can help to optimize performance and alertness. These programs teach shiftworkers the underlying causes of the sleep problems associated with shiftwork and also provide coping strategies for improving sleep and dealing with the transition between shifts. When these training programs are coupled with an improved schedule, the problems associated with working round-the-clock can be significantly reduced.
Evaluating Suit Fit Using Performance Degradation
NASA Technical Reports Server (NTRS)
Margerum, Sarah E.; Cowley, Matthew; Harvill, Lauren; Benson, Elizabeth; Rajulu, Sudhakar
2012-01-01
The Mark III planetary technology demonstrator space suit can be tailored to an individual by swapping the modular components of the suit, such as the arms, legs, and gloves, as well as adding or removing sizing inserts in key areas. A method was sought to identify the transition from an ideal suit fit to a bad fit and how to quantify this breakdown using a metric of mobility-based human performance data. To this end, the degradation of the range of motion of the elbow and wrist of the suit as a function of suit sizing modifications was investigated to attempt to improve suit fit. The sizing range tested spanned optimal and poor fit and was adjusted incrementally in order to compare each joint angle across five different sizing configurations. Suited range of motion data were collected using a motion capture system for nine isolated and functional tasks utilizing the elbow and wrist joints. A total of four subjects were tested with motions involving both arms simultaneously as well as the right arm by itself. Findings indicated that no single joint drives the performance of the arm as a function of suit size; instead it is based on the interaction of multiple joints along a limb. To determine a size adjustment range where an individual can operate the suit at an acceptable level, a performance detriment limit was set. This user-selected limit reveals the task-dependent tolerance of the suit fit around optimal size. For example, the isolated joint motion indicated that the suit can deviate from optimal by as little as -0.6 in to -2.6 in before experiencing a 10% performance drop in the wrist or elbow joint. The study identified a preliminary method to quantify the impact of size on performance and developed a new way to gauge tolerances around optimal size.
Calvano, C D; Aresta, A; Iacovone, M; De Benedetto, G E; Zambonin, C G; Battaglia, M; Ditonno, P; Rutigliano, M; Bettocchi, C
2010-03-11
Protein analysis in biological fluids, such as urine, by means of mass spectrometry (MS) still suffers from insufficient standardization in the protocols for sample collection, storage and preparation. In this work, the influence of these variables on the protein profiling of human urine from healthy donors, performed by matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS), was studied. A screening of various urine sample pre-treatment procedures and different sample deposition approaches on the MALDI target was performed. The influence of urine sample storage time and temperature on spectral profiles was evaluated by means of principal component analysis (PCA). The whole optimized procedure was eventually applied to the MALDI-TOF-MS analysis of human urine samples taken from prostate cancer patients. The best results in terms of the number and abundance of detected ions in the MS spectra were obtained by using home-made microcolumns packed with hydrophilic-lipophilic balance (HLB) resin as the sample pre-treatment method; this procedure was also less expensive and suitable for high-throughput analyses. Afterwards, the spin-coating approach for sample deposition on the MALDI target plate was optimized, obtaining homogeneous and reproducible spots. PCA then indicated that low storage temperatures of acidified and centrifuged samples, together with short handling times, allowed reproducible profiles to be obtained without artifact contributions due to experimental conditions. Finally, interesting differences were found by comparing the MALDI-TOF-MS protein profiles of pooled urine samples from healthy donors and prostate cancer patients. The results showed that analytical and pre-analytical variables are crucial for the success of urine analysis and for obtaining meaningful and reproducible data, even if intra-patient variability is very difficult to avoid. It has been shown that pooled urine samples can be an effective way to simplify the comparison between healthy and pathological samples and to identify possible differences in protein expression between the two sets of samples. Copyright 2009 Elsevier B.V. All rights reserved.
Ma, Wei Ji; Shen, Shan; Dziugaite, Gintare; van den Berg, Ronald
2015-01-01
In tasks such as visual search and change detection, a key question is how observers integrate noisy measurements from multiple locations to make a decision. Decision rules proposed to model this process have fallen into two categories: Bayes-optimal (ideal observer) rules and ad-hoc rules. Among the latter, the maximum-of-outputs (max) rule has been most prominent. Reviewing recent work and performing new model comparisons across a range of paradigms, we find that in all cases except for one, the optimal rule describes human data as well as or better than every max rule either previously proposed or newly introduced here. This casts doubt on the utility of the max rule for understanding perceptual decision-making. PMID:25584425
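For concreteness, the sketch below simulates a multi-location yes/no detection task and compares a max rule with a Bayes-optimal integration rule (the average of local likelihood ratios under equal-variance Gaussian noise); the number of locations, the signal strength, and the criterion-free accuracy measure are illustrative assumptions rather than any specific experiment from the review.

```python
import numpy as np

rng = np.random.default_rng(1)
N_LOC, D_PRIME, TRIALS = 8, 1.5, 100_000

def detection_auc(rule):
    """Criterion-free accuracy (area under the ROC) for a yes/no detection
    task where a signal appears at one random location on half the trials."""
    present = rng.random(TRIALS) < 0.5
    x = rng.normal(0.0, 1.0, (TRIALS, N_LOC))
    idx = rng.integers(0, N_LOC, TRIALS)
    x[np.arange(TRIALS), idx] += D_PRIME * present
    if rule == "max":
        stat = x.max(axis=1)                     # ad-hoc maximum-of-outputs rule
    else:                                        # optimal: mean local likelihood ratio
        stat = np.mean(np.exp(D_PRIME * x - D_PRIME**2 / 2), axis=1)
    s, n = stat[present], stat[~present]
    return np.mean(rng.choice(s, 200_000) > rng.choice(n, 200_000))

for rule in ("max", "optimal"):
    print(rule, round(detection_auc(rule), 3))
```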
On the pilot's behavior of detecting a system parameter change
NASA Technical Reports Server (NTRS)
Morizumi, N.; Kimura, H.
1986-01-01
The reaction of a human pilot, engaged in compensatory control, to a sudden change in the controlled element's characteristics is described. Taking the case where the change manifests itself as a variance change of the monitored signal, it is shown that the detection time, defined to be the time elapsed until the pilot detects the change, is related to the monitored signal and its derivative. Then, the detection behavior is modeled by an optimal controller, an optimal estimator, and a variance-ratio test mechanism that is performed for the monitored signal and its derivative. Results of a digital simulation show that the pilot's detection behavior can be well represented by the model proposed here.
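The detection model above combines an optimal estimator with a variance-ratio test on the monitored signal. The sketch below shows a generic single-signal version of such a test (compare a sliding-window variance against a nominal baseline and flag the first exceedance); the window length, threshold, and synthetic signal are assumptions for illustration, and the paper's model additionally uses the signal's derivative.

```python
import numpy as np

def variance_ratio_detector(x, win=100, threshold=2.0):
    """Return the first sample index at which the variance over the most
    recent `win` samples exceeds `threshold` times the nominal (pre-change)
    variance estimated from the first `win` samples."""
    baseline = np.var(x[:win])
    for t in range(win, len(x)):
        if np.var(x[t - win:t]) / (baseline + 1e-12) > threshold:
            return t
    return None

rng = np.random.default_rng(2)
monitored = np.concatenate([rng.normal(0, 1.0, 500),    # nominal dynamics
                            rng.normal(0, 2.0, 500)])   # variance change at t = 500
print("change detected at sample", variance_ratio_detector(monitored))
```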
NASA Astrophysics Data System (ADS)
Donnelly, William J., III
2012-06-01
PURPOSE: To present a commercially available optical modeling software tool to assist the development of optical instrumentation and systems that utilize and/or integrate with the human eye. METHODS: A commercially available flexible eye modeling system is presented, the Advanced Human Eye Model (AHEM). AHEM is a module that the engineer can use to perform rapid development and test scenarios on systems that integrate with the eye. Methods include merging modeled systems initially developed outside of AHEM and performing a series of wizard-type operations that relieve the user from requiring an optometric or ophthalmic background to produce a complete eye inclusive system. Scenarios consist of retinal imaging of targets and sources through integrated systems. Uses include, but are not limited to, optimization, telescopes, microscopes, spectacles, contact and intraocular lenses, ocular aberrations, cataract simulation and scattering, and twin eye model (binocular) systems. RESULTS: Metrics, graphical data, and exportable CAD geometry are generated from the various modeling scenarios.
Immunodiagnostic Value of Echinococcus Granulosus Recombinant B8/1 Subunit of Antigen B.
Savardashtaki, Amir; Sarkari, Bahador; Arianfar, Farzane; Mostafavi-Pour, Zohreh
2017-06-01
Cystic echinococcosis (CE), as a chronic parasitic disease, is a major health problem in many countries. The performance of the currently available serodiagnostic tests for the diagnosis of CE is unsatisfactory. The current study aimed at sub-cloning a gene, encoding the B8/1 subunit of antigen B (AgB) from Echinococcus granulosus, using gene optimization for the immunodiagnosis of human CE. The coding sequence for AgB8/1 subunit of Echinococcus granulosus was selected from GenBank and was gene-optimized. The sequence was synthesized and inserted into pGEX-4T-1 vector. Purification was performed with GST tag affinity column. Diagnostic performance of the produced recombinant antigen, native antigen B and a commercial ELISA kit were further evaluated in an ELISA system, using a panel of sera from CE patients and controls. SDS-PAGE demonstrated that the protein of interest had a high expression level and purity after GST tag affinity purification. Western blotting verified the immunoreactivity of the produced recombinant antigen with the sera of CE patients. In an ELISA system, the sensitivity and specificity (for human CE diagnosis) of the recombinant antigen, native antigen B and commercial kit were respectively 93% and 92%, 87% and 90% and 97% and 95%. The produced recombinant antigen showed a high diagnostic value which can be recommended for serodiagnosis of CE in Iran and other CE-endemic areas. Utilizing the combination of other subunits of AgB8 would improve the performance value of the introduced ELISA system.
Abou-El-Enein, Mohamed; Römhild, Andy; Kaiser, Daniel; Beier, Carola; Bauer, Gerhard; Volk, Hans-Dieter; Reinke, Petra
2013-03-01
Advanced therapy medicinal products (ATMP) have gained considerable attention in academia due to their therapeutic potential. Good Manufacturing Practice (GMP) principles ensure the quality and sterility of manufacturing these products. We developed a model for estimating the manufacturing costs of cell therapy products and optimizing the performance of academic GMP-facilities. The "Clean-Room Technology Assessment Technique" (CTAT) was tested prospectively in the GMP facility of BCRT, Berlin, Germany, then retrospectively in the GMP facility of the University of California-Davis, California, USA. CTAT is a two-level model: level one identifies operational (core) processes and measures their fixed costs; level two identifies production (supporting) processes and measures their variable costs. The model comprises several tools to measure and optimize performance of these processes. Manufacturing costs were itemized using adjusted micro-costing system. CTAT identified GMP activities with strong correlation to the manufacturing process of cell-based products. Building best practice standards allowed for performance improvement and elimination of human errors. The model also demonstrated the unidirectional dependencies that may exist among the core GMP activities. When compared to traditional business models, the CTAT assessment resulted in a more accurate allocation of annual expenses. The estimated expenses were used to set a fee structure for both GMP facilities. A mathematical equation was also developed to provide the final product cost. CTAT can be a useful tool in estimating accurate costs for the ATMPs manufactured in an optimized GMP process. These estimates are useful when analyzing the cost-effectiveness of these novel interventions. Copyright © 2013 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Human body motion tracking based on quantum-inspired immune cloning algorithm
NASA Astrophysics Data System (ADS)
Han, Hong; Yue, Lichuan; Jiao, Licheng; Wu, Xing
2009-10-01
In a static monocular camera system, recovering an accurate 3D human body posture remains a great challenge for computer vision. This paper presents human posture recognition from video sequences using the Quantum-Inspired Immune Cloning Algorithm (QICA). The algorithm comprises three parts. First, prior knowledge of the human body is used to detect key joint points automatically from the human contours and from the skeletons thinned from those contours. Second, owing to the complexity of human movement, a forecasting mechanism for occluded joint points is introduced to obtain optimal 2D key joint points of the human body. Finally, the pose is estimated by optimizing the agreement between the 2D projection of the 3D key joint points and the detected 2D key joint points using QICA, which recovers the movement of the human body well because the algorithm can find not only local optima but also the global optimum.
Visual-search models for location-known detection tasks
NASA Astrophysics Data System (ADS)
Gifford, H. C.; Karbaschi, Z.; Banerjee, K.; Das, M.
2017-03-01
Lesion-detection studies that analyze a fixed target position are generally considered predictive of studies involving lesion search, but the extent of the correlation often goes untested. The purpose of this work was to develop a visual-search (VS) model observer for location-known tasks that, coupled with previous work on localization tasks, would allow efficient same-observer assessments of how search and other task variations can alter study outcomes. The model observer featured adjustable parameters to control the search radius around the fixed lesion location and the minimum separation between suspicious locations. Comparisons were made against human observers, a channelized Hotelling observer and a nonprewhitening observer with eye filter in a two-alternative forced-choice study with simulated lumpy background images containing stationary anatomical and quantum noise. These images modeled single-pinhole nuclear medicine scans with different pinhole sizes. When the VS observer's search radius was optimized with training images, close agreement was obtained with human-observer results. Some performance differences between the humans could be explained by varying the model observer's separation parameter. The range of optimal pinhole sizes identified by the VS observer was in agreement with the range determined with the channelized Hotelling observer.
Müllner, Marie; Schlattl, Helmut; Hoeschen, Christoph; Dietrich, Olaf
2015-12-01
To demonstrate the feasibility of gold-specific spectral CT imaging for the detection of liver lesions in humans at low concentrations of gold as targeted contrast agent. A Monte Carlo simulation study of spectral CT imaging with a photon-counting and energy-resolving detector (with 6 energy bins) was performed in a realistic phantom of the human abdomen. The detector energy thresholds were optimized for the detection of gold. The simulation results were reconstructed with the K-edge imaging algorithm; the reconstructed gold-specific images were filtered and evaluated with respect to signal-to-noise ratio and contrast-to-noise ratio (CNR). The simulations demonstrate the feasibility of spectral CT with CNRs of the specific gold signal between 2.7 and 4.8 after bilateral filtering. Using the optimized bin thresholds increases the CNRs of the lesions by up to 23% compared to bin thresholds described in former studies. Gold is a promising new CT contrast agent for spectral CT in humans; minimum tissue mass fractions of 0.2 wt% of gold are required for sufficient image contrast. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
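The image-quality figures above are contrast-to-noise ratios of the gold-specific signal. A minimal sketch of an ROI-based CNR computation on a synthetic slice is given below; the CNR definition (lesion-background mean difference over background standard deviation), the phantom, and the noise levels are assumptions, since the abstract does not give the exact formula used.

```python
import numpy as np

def roi_cnr(image, lesion_mask, background_mask):
    """Contrast-to-noise ratio of a lesion ROI against background, using a
    common definition (an assumption here; the paper's exact definition is
    not stated in the abstract)."""
    lesion = image[lesion_mask]
    bkg = image[background_mask]
    return (lesion.mean() - bkg.mean()) / bkg.std()

# Synthetic demonstration on a noisy slice with one "gold-enhanced" lesion.
rng = np.random.default_rng(3)
img = rng.normal(100.0, 5.0, (128, 128))
yy, xx = np.mgrid[:128, :128]
lesion = (yy - 64) ** 2 + (xx - 64) ** 2 < 8 ** 2
img[lesion] += 15.0
print(f"CNR = {roi_cnr(img, lesion, ~lesion):.1f}")
```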
Di Nardo, Giovanna; Gilardi, Gianfranco
2012-01-01
Drug metabolism in the human liver is a process involving many different enzymes. Among them, a number of cytochrome P450 isoforms catalyze the oxidation of most of the drugs commercially available. Each P450 isoform acts on more than one drug, and one drug may be oxidized by more than one enzyme. As a result, multiple products may be obtained from the same drug, and as the metabolites can be biologically active and may cause adverse drug reactions (ADRs), the metabolic profile of a new drug has to be known before it can be commercialized. Therefore, the metabolites of a given drug must be identified, synthesized and tested for toxicity, and they must be synthesized in sufficient quantities to be used for metabolic tests. This review focuses on progress made in optimizing a self-sufficient and efficient bacterial cytochrome P450, P450 BM3 from Bacillus megaterium, for the production of metabolites of human enzymes. The progress made in improving its catalytic performance towards drugs, substituting the costly NADPH cofactor, and immobilizing and scaling up the process for industrial application is reported. PMID:23443101
Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.
Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P
2015-10-01
Human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with spatially distributed elastic property. Simulation is performed on a 3D lung geometry reconstructed from four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data which collectively provide consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
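The fusion step above couples the image-registration (DIR) displacement with the physics-based (CFD) displacement via Tikhonov regularization. The sketch below shows one generic way such a coupling can be posed, as a linear least-squares problem along a single line of voxels that stays close to the DIR values while matching the spatial gradient of the CFD field; the operator choice, regularization weight, and 1D setting are assumptions, not the paper's exact formulation.

```python
import numpy as np

def tikhonov_fuse(u_dir, u_cfd, lam=1.0):
    """Fuse two displacement estimates along one line of voxels by solving
    min ||u - u_dir||^2 + lam * ||D u - D u_cfd||^2, where D is a
    first-difference operator (a generic Tikhonov formulation)."""
    n = len(u_dir)
    D = np.diff(np.eye(n), axis=0)              # (n-1, n) first-difference matrix
    A = np.eye(n) + lam * D.T @ D
    b = u_dir + lam * D.T @ (D @ u_cfd)
    return np.linalg.solve(A, b)

# Synthetic example: noisy DIR estimate fused with a smooth CFD estimate.
x = np.linspace(0, 1, 100)
u_dir = np.sin(2 * np.pi * x) + np.random.default_rng(4).normal(0, 0.1, 100)
u_cfd = np.sin(2 * np.pi * x)
u_opt = tikhonov_fuse(u_dir, u_cfd, lam=5.0)
print("mean abs error after fusion:",
      round(float(np.abs(u_opt - np.sin(2 * np.pi * x)).mean()), 4))
```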
"Illusion of control" in Time-Horizon Minority and Parrondo Games
NASA Astrophysics Data System (ADS)
Satinover, J. B.; Sornette, D.
2007-12-01
Human beings like to believe they are in control of their destiny. This ubiquitous trait seems to increase motivation and persistence, and is probably evolutionarily adaptive [J.D. Taylor, S.E. Brown, Psych. Bull. 103, 193 (1988); A. Bandura, Self-efficacy: the exercise of control (WH Freeman, New York, 1997)]. But how good really is our ability to control? How successful is our track record in these areas? There is little understanding of when and under what circumstances we may over-estimate [E. Langer, J. Pers. Soc. Psych. 7, 185 (1975)] or even lose our ability to control and optimize outcomes, especially when they are the result of aggregations of individual optimization processes. Here, we demonstrate analytically using the theory of Markov Chains and by numerical simulations in two classes of games, the Time-Horizon Minority Game [M.L. Hart, P. Jefferies, N.F. Johnson, Phys. A 311, 275 (2002)] and the Parrondo Game [J.M.R. Parrondo, G.P. Harmer, D. Abbott, Phys. Rev. Lett. 85, 5226 (2000); J.M.R. Parrondo, How to cheat a bad mathematician (ISI, Italy, 1996)], that agents who optimize their strategy based on past information may actually perform worse than non-optimizing agents. In other words, low-entropy (more informative) strategies under-perform high-entropy (or random) strategies. This provides a precise definition of the “illusion of control” in certain set-ups a priori defined to emphasize the importance of optimization.
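As a companion to the result above, the sketch below sets up Parrondo's two games and compares a random-switching agent with an agent that greedily picks the game with the better recent average payoff; the memory length, the epsilon value, and the greedy rule itself are illustrative assumptions and are far simpler than the Markov-chain analysis and the Time-Horizon Minority Game treated in the paper.

```python
import numpy as np

EPS = 0.005
rng = np.random.default_rng(5)

def play(game, capital):
    """Parrondo's games: A is a slightly losing coin; B depends on capital mod 3."""
    if game == "A":
        p = 0.5 - EPS
    else:
        p = (0.10 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return 1 if rng.random() < p else -1

def run(chooser, steps=100_000):
    """Average capital gain per step for a given game-selection strategy."""
    capital = 0
    recent = {"A": [0.0], "B": [0.0]}          # short payoff memory per game
    for _ in range(steps):
        g = chooser(recent)
        outcome = play(g, capital)
        capital += outcome
        recent[g] = (recent[g] + [outcome])[-50:]
    return capital / steps

random_switching = lambda recent: "A" if rng.random() < 0.5 else "B"
greedy_on_past = lambda recent: max(recent, key=lambda g: np.mean(recent[g]))

print("random switching         :", round(run(random_switching), 5))
print("optimizing on past payoff:", round(run(greedy_on_past), 5))
```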
Optimizing Processes to Minimize Risk
NASA Technical Reports Server (NTRS)
Loyd, David
2017-01-01
NASA, like other hazardous industries, has suffered catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When risk cannot be eliminated, the focus must be on mitigating the worst consequences and recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective Safety Culture bolsters skill-based performance that minimizes risk and encourages successful engagement.
Information Search in Judgment Tasks: The Effects of Unequal Cue Validity and Cost.
1984-05-01
bookbag before betting on the contents of the bag being sampled (Edwards, 1965). They proposed an alternative model for the regression (or continuous...ly displaced vertically for clarity.) The analogous relationship for the Bayesian model is developed by Edwards (1965). Snapper and Peterson (1971)...A regression model and some preliminary findings." Organizational Behavior and Human Performance, 1982, 30, 330-350. Edwards, W.: "Optimal
Development of Cross-Assembly Phage PCR-Based Methods ...
Technologies that can characterize human fecal pollution in environmental waters offer many advantages over traditional general indicator approaches. However, many human-associated methods cross-react with non-human animal sources and lack suitable sensitivity for fecal source identification applications. The genome of a newly discovered bacteriophage (~97 kbp), the Cross-Assembly phage or “crAssphage”, assembled from a human gut metagenome DNA sequence library is predicted to be both highly abundant and predominately occur in human feces suggesting that this double stranded DNA virus may be an ideal human fecal pollution indicator. We report the development of two human-associated crAssphage endpoint PCR methods (crAss056 and crAss064). A shotgun strategy was employed where 384 candidate primers were designed to cover ~41 kbp of the crAssphage genome deemed favorable for method development based on a series of bioinformatics analyses. Candidate primers were subjected to three rounds of testing to evaluate assay optimization, specificity, limit of detection (LOD95), geographic variability, and performance in environmental water samples. The top two performing candidate primer sets exhibited 100% specificity (n = 70 individual samples from 8 different animal species), >90% sensitivity (n = 10 raw sewage samples from different geographic locations), LOD95 of 0.01 ng/µL of total DNA per reaction, and successfully detected human fecal pollution in impaired envi
Electromagnetic Modeling of Human Body Using High Performance Computing
NASA Astrophysics Data System (ADS)
Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada
Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of powering implanted devices through wireless energy harvesting coupled from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high-fidelity accelerator simulation and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom that is characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom has been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.
NASA Technical Reports Server (NTRS)
Shields, W. E.; Smith, J. D.; Washburn, D. A.; Rumbaugh, D. M. (Principal Investigator)
1997-01-01
The authors asked whether animals, like humans, use an uncertain response adaptively to escape indeterminate stimulus relations. Humans and monkeys were placed in a same-different task, known to be challenging for animals. Its difficulty was increased further by reducing the size of the stimulus differences, thereby making many same and different trials difficult to tell apart. Monkeys do escape selectively from these threshold trials, even while coping with 7 absolute stimulus levels concurrently. Monkeys even adjust their response strategies on short time scales according to the local task conditions. Signal-detection and optimality analyses confirm the similarity of humans' and animals' performances. Whereas associative interpretations account poorly for these results, an intuitive uncertainty construct does so easily. The authors discuss the cognitive processes that allow uncertainty's adaptive use and recommend further comparative studies of metacognition.
NASA Technical Reports Server (NTRS)
Atwell, William; Rojdev, Kristina; Aghara, Sukesh; Sriprisan, Sirikul
2013-01-01
In this paper we present a novel space radiation shielding approach using various material lay-ups, called "Graded-Z" shielding, which could optimize cost, weight, and safety while mitigating the radiation exposures from the trapped radiation and solar proton environments, as well as the galactic cosmic radiation (GCR) environment, to humans and electronics. In addition, a validation and verification (V&V) was performed using two different high energy particle transport/dose codes (MCNPX & HZETRN). Inherently, we know that materials having high-hydrogen content are very good space radiation shielding materials. Graded-Z material lay-ups are very good trapped electron mitigators for medium earth orbit (MEO) and geostationary earth orbit (GEO). In addition, secondary particles, namely neutrons, are produced as the primary particles penetrate a spacecraft, which can have deleterious effects to both humans and electronics. The use of "dopants," such as beryllium, boron, and lithium, impregnated in other shielding materials provides a means of absorbing the secondary neutrons. Several examples of optimized Graded-Z shielding lay-ups that include the use of composite materials are presented and discussed in detail. This parametric shielding study is an extension of some earlier pioneering work we (William Atwell and Kristina Rojdev) performed in 2004 and 2009.
Impact of human resource management practices on nursing home performance.
Rondeau, K V; Wagar, T H
2001-08-01
Management scholars and practitioners alike have become increasingly interested in learning more about the ability of certain 'progressive' or 'high-performance' human resource management (HRM) practices to enhance organizational effectiveness. There is growing evidence to suggest that the contribution of various HRM practices to firm performance may be synergistic in effect yet contingent on a number of contextual factors, including workplace climate. A contingency theory perspective suggests that in order to be effective, HRM policies and practices must be consistent with other aspects of the organization, including its environment. This paper reports on empirical findings from research that examines the relationship between HRM practices, workplace climate and perceptions of organizational performance, in a large sample of Canadian nursing homes. Data from 283 nursing homes were collected by means of a mail survey that included questions on HRM practices, programmes, and policies, on human resource aspects of workplace climate, as well as a variety of indicators that include employee, customer/resident and facility measures of organizational performance. Results derived from ordered probit analysis suggest that nursing homes in our sample which had implemented more 'progressive' HRM practices and which reported a workplace climate that strongly values employee participation, empowerment and accountability tended to be perceived to generally perform better on a number of valued organizational outcomes. Nursing homes in our sample that performed best overall were found to be more likely not only to have implemented more of these HRM practices, but also to report having a workplace climate that reflects the seminal value that they place on their human resources. This finding is consistent with the conclusion that simply introducing HRM practices or programmes, in the absence of an appropriately supportive workplace climate, will be insufficient to attain optimal organizational performance.
Dynamic Task Performance, Cohesion, and Communications in Human Groups.
Giraldo, Luis Felipe; Passino, Kevin M
2016-10-01
In the study of the behavior of human groups, it has been observed that there is a strong interaction between the cohesiveness of the group, its performance when the group has to solve a task, and the patterns of communication between the members of the group. Developing mathematical and computational tools for the analysis and design of task-solving groups that are not only cohesive but also perform well is of importance in social sciences, organizational management, and engineering. In this paper, we model a human group as a dynamical system whose behavior is driven by a task optimization process and the interaction between subsystems that represent the members of the group interconnected according to a given communication network. These interactions are described as attractions and repulsions among members. We show that the dynamics characterized by the proposed mathematical model are qualitatively consistent with those observed in real-human groups, where the key aspect is that the attraction patterns in the group and the commitment to solve the task are not static but change over time. Through a theoretical analysis of the system we provide conditions on the parameters that allow the group to have cohesive behaviors, and Monte Carlo simulations are used to study group dynamics for different sets of parameters, communication topologies, and tasks to solve.
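The model described above treats group members as interacting dynamical subsystems with attraction and repulsion terms plus a task-optimization drive. The sketch below is a generic swarm-style toy of that structure (a quadratic task with a known minimum, Gaussian short-range repulsion, linear attraction on a fully connected network); all weights and the specific functional forms are assumptions rather than the paper's equations.

```python
import numpy as np

rng = np.random.default_rng(6)
N, STEPS, DT = 8, 400, 0.05
A, R, TASK_W = 1.0, 0.6, 0.8            # attraction, repulsion, task-commitment weights
adjacency = np.ones((N, N)) - np.eye(N) # fully connected communication network
goal = np.array([3.0, -2.0])            # minimum of a simple quadratic "task"

x = rng.normal(0, 2, (N, 2))            # initial member positions (opinions/states)
for _ in range(STEPS):
    diff = x[None, :, :] - x[:, None, :]            # pairwise displacement vectors
    dist = np.linalg.norm(diff, axis=-1, keepdims=True) + 1e-9
    # attraction grows with distance, repulsion decays with distance (Gaussian)
    social = adjacency[..., None] * (A - R * np.exp(-dist**2)) * diff
    task = TASK_W * (goal - x)                      # gradient step on the task
    x += DT * (social.sum(axis=1) + task)

print("group spread:", round(float(np.linalg.norm(x - x.mean(0), axis=1).mean()), 3),
      " distance to goal:", round(float(np.linalg.norm(x.mean(0) - goal)), 3))
```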
Zhang, Jianhua; Yin, Zhong; Wang, Rubin
2017-01-01
This paper developed a cognitive task-load (CTL) classification algorithm and allocation strategy to sustain the optimal operator CTL levels over time in safety-critical human-machine integrated systems. An adaptive human-machine system is designed based on a non-linear dynamic CTL classifier, which maps a set of electroencephalogram (EEG) and electrocardiogram (ECG) related features to a few CTL classes. The least-squares support vector machine (LSSVM) is used as dynamic pattern classifier. A series of electrophysiological and performance data acquisition experiments were performed on seven volunteer participants under a simulated process control task environment. The participant-specific dynamic LSSVM model is constructed to classify the instantaneous CTL into five classes at each time instant. The initial feature set, comprising 56 EEG and ECG related features, is reduced to a set of 12 salient features (including 11 EEG-related features) by using the locality preserving projection (LPP) technique. An overall correct classification rate of about 80% is achieved for the 5-class CTL classification problem. Then the predicted CTL is used to adaptively allocate the number of process control tasks between operator and computer-based controller. Simulation results showed that the overall performance of the human-machine system can be improved by using the adaptive automation strategy proposed.
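As an illustration of the processing chain above (feature standardization, dimensionality reduction from 56 to 12 features, and multi-class classification), here is a minimal scikit-learn pipeline on synthetic data; PCA stands in for the locality preserving projection step and an RBF-kernel SVC stands in for the LSSVM, since neither LPP nor LSSVM ships with scikit-learn, and the data are randomly generated rather than EEG/ECG recordings.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: 56 EEG/ECG-derived features, 5 task-load classes.
rng = np.random.default_rng(7)
X = rng.normal(size=(700, 56))
y = rng.integers(0, 5, 700)
X += y[:, None] * 0.3                      # inject weak class structure

clf = make_pipeline(StandardScaler(),
                    PCA(n_components=12),  # stand-in for the paper's LPP step
                    SVC(kernel="rbf"))     # stand-in for the LSSVM classifier
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))
```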
NASA Astrophysics Data System (ADS)
Johnson, Tony; Metcalfe, Jason; Brewster, Benjamin; Manteuffel, Christopher; Jaswa, Matthew; Tierney, Terrance
2010-04-01
The proliferation of intelligent systems in today's military demands increased focus on the optimization of human-robot interactions. Traditional studies in this domain involve large-scale field tests that require humans to operate semiautomated systems under varying conditions within military-relevant scenarios. However, provided that adequate constraints are employed, modeling and simulation can be a cost-effective alternative and supplement. The current presentation discusses a simulation effort that was executed in parallel with a field test with Soldiers operating military vehicles in an environment that represented key elements of the true operational context. In this study, "constructive" human operators were designed to represent average Soldiers executing supervisory control over an intelligent ground system. The constructive Soldiers were simulated performing the same tasks as those performed by real Soldiers during a directly analogous field test. Exercising the models in a high-fidelity virtual environment provided predictive results that represented actual performance in certain aspects, such as situational awareness, but diverged in others. These findings largely reflected the quality of modeling assumptions used to design behaviors and the quality of information available on which to articulate principles of operation. Ultimately, predictive analyses partially supported expectations, with deficiencies explicable via Soldier surveys, experimenter observations, and previously-identified knowledge gaps.
NASA Astrophysics Data System (ADS)
Salehi, Hassan S.; Li, Hai; Kumavor, Patrick D.; Merkulov, Aleksey; Sanders, Melinda; Brewer, Molly; Zhu, Quing
2015-03-01
In this paper, wavelength selection for multispectral photoacoustic/ultrasound tomography was optimized to obtain accurate images of hemoglobin oxygen saturation (sO2) in vivo. Although wavelengths can be selected by theoretical methods, in practice the accuracy of reconstructed images will be affected by wavelength-specific and system-specific factors such as laser source power and ultrasound transducer sensitivity. By performing photoacoustic spectroscopy of mouse tumor models using 14 different wavelengths between 710 and 840 nm, we were able to identify a wavelength set which most accurately reproduced the results obtained using all 14 wavelengths via selection criteria. In clinical studies, the optimal wavelength set was successfully used to image human ovaries in vivo and noninvasively. Although these results are specific to our co-registered photoacoustic/ultrasound imaging system, the approach we developed can be applied to other functional photoacoustic and optical imaging systems.
Gonzalo-Lumbreras, R; Izquierdo-Hornillos, R
2000-05-26
An HPLC separation of a complex mixture containing 13 urinary anabolics and corticoids, and boldenone and bolasterone (synthetic anabolics), has been carried out. The applied optimization method involved the use of binary, ternary and quaternary mobile phases containing acetonitrile, methanol or tetrahydrofuran as organic modifiers. The effect of different reversed-phase packings and temperature on the separation was studied. The optimum separation was achieved by using a water-acetonitrile (60:40, v/v) mobile phase in reversed-phase HPLC at 30 °C, allowing the separation of all the analytes in about 24 min. Calibration graphs were obtained using bolasterone or methyltestosterone as internal standards. Detection limits were in the range 0.012-0.107 µg ml⁻¹. The optimized separation was applied to the analysis, after liquid-liquid extraction, of human urine samples spiked with steroids.
Wu, Haining; Dong, Jianfei; Qi, Gaojin; Zhang, Guoqi
2015-07-01
Enhancing the colorfulness of illuminated objects is a promising application of LED lighting for commercial, exhibiting, and scientific purposes. This paper proposes a method to enhance the color of illuminated objects for a given polychromatic lamp. Meanwhile, the light color is restricted to white. We further relax the white light constraints by introducing soft margins. Based on the spectral and electrical characteristics of LEDs and object surface properties, we determine the optimal mixing of the LED light spectrum by solving a numerical optimization problem, which is a quadratic fractional programming problem by formulation. Simulation studies show that the trade-off between the white light constraint and the level of the color enhancement can be adjusted by tuning an upper limit value of the soft margin. Furthermore, visual evaluation experiments are performed to evaluate human perception of the color enhancement. The experiments have verified the effectiveness of the proposed method.
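The optimization above is formulated as a quadratic fractional program over LED drive levels. The sketch below illustrates Dinkelbach's parametric approach to such a ratio objective on randomly generated quadratic forms; the matrices, the simplex-style constraint, and the use of a local SLSQP solver for each subproblem are assumptions for illustration and do not reproduce the paper's spectral model or its white-light soft margins.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 4                                          # number of LED channels (toy scale)
M = rng.normal(size=(n, n)); A = M @ M.T + np.eye(n)   # numerator quadratic (PD)
M = rng.normal(size=(n, n)); B = M @ M.T + np.eye(n)   # denominator quadratic (PD)
a, b = rng.random(n), rng.random(n)

num = lambda x: x @ A @ x + a @ x              # e.g. a color-enhancement score
den = lambda x: x @ B @ x + b @ x + 1.0        # e.g. a power/whiteness penalty (> 0)

cons = [{"type": "eq", "fun": lambda x: x.sum() - 1.0}]   # normalized drive levels
bounds = [(0.0, 1.0)] * n
x = np.full(n, 1.0 / n)

# Dinkelbach's parametric algorithm for max num(x)/den(x): each subproblem
# maximizes num - lam*den (solved here only locally, so this is a sketch).
for _ in range(30):
    lam = num(x) / den(x)
    res = minimize(lambda z: -(num(z) - lam * den(z)), x,
                   bounds=bounds, constraints=cons, method="SLSQP")
    x = res.x
    if abs(num(x) - lam * den(x)) < 1e-9:
        break

print("drive levels:", np.round(x, 3), " ratio:", round(num(x) / den(x), 4))
```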
NASA Technical Reports Server (NTRS)
Englander, Jacob
2016-01-01
Preliminary design of interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys, the bodies at which those flybys are performed, and in some cases the final destination. In addition, a time-history of control variables must be chosen that defines the trajectory. There are often many thousands, if not millions, of possible trajectories to be evaluated. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the mission design problem as a hybrid optimal control problem. The method is demonstrated on notional high-thrust chemical and low-thrust electric propulsion missions. In the low-thrust case, the hybrid optimal control problem is augmented to include systems design optimization.
Fractal analysis of mandibular trabecular bone: optimal tile sizes for the tile counting method.
Huh, Kyung-Hoe; Baik, Jee-Seon; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Lee, Sun-Bok; Lee, Seung-Pyo
2011-06-01
This study was performed to determine the optimal tile size for the fractal dimension of the mandibular trabecular bone using a tile counting method. Digital intraoral radiographic images were obtained at the mandibular angle, molar, premolar, and incisor regions of 29 human dry mandibles. After preprocessing, the parameters representing morphometric characteristics of the trabecular bone were calculated. The fractal dimensions of the processed images were analyzed in various tile sizes by the tile counting method. The optimal range of tile size was 0.132 mm to 0.396 mm for the fractal dimension using the tile counting method. The sizes were closely related to the morphometric parameters. The fractal dimension of mandibular trabecular bone, as calculated with the tile counting method, can be best characterized with a range of tile sizes from 0.132 to 0.396 mm.
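For reference, the tile (box) counting estimate of a fractal dimension can be sketched as below on a synthetic binary image; the random test pattern and the pixel-size comment are illustrative assumptions (the abstract reports optimal tile sizes in millimetres but not the underlying pixel spacing).

```python
import numpy as np

def tile_count_fd(binary, tile_sizes):
    """Fractal dimension by the tile (box) counting method: count the tiles of
    side s that contain any foreground (trabecular) pixel, then fit the slope
    of log N(s) against log(1/s)."""
    counts = []
    for s in tile_sizes:
        h, w = (binary.shape[0] // s) * s, (binary.shape[1] // s) * s
        tiles = binary[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(tiles.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(tile_sizes)), np.log(counts), 1)
    return slope

# Synthetic binary "trabecular" pattern; if pixels were about 0.044 mm,
# tiles of 3-9 pixels would roughly span the 0.132-0.396 mm range above.
rng = np.random.default_rng(9)
img = rng.random((256, 256)) > 0.6
print("fractal dimension ≈", round(tile_count_fd(img, [3, 4, 5, 6, 7, 8, 9]), 2))
```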
Discrete-time pilot model. [human dynamics and digital simulation
NASA Technical Reports Server (NTRS)
Cavalli, D.
1978-01-01
Pilot behavior is considered as a discrete-time process where the decision making has a sequential nature. This model differs from both the quasilinear model which follows from classical control theory and from the optimal control model which considers the human operator as a Kalman estimator-predictor. An additional factor considered is that the pilot's objective may not be adequately formulated as a quadratic cost functional to be minimized, but rather as a more fuzzy measure of the closeness with which the aircraft follows a reference trajectory. All model parameters, in the digital program simulating the pilot's behavior, were successfully compared in terms of standard-deviation and performance with those of professional pilots in IFR configuration. The first practical application of the model was in the study of its performance degradation when the aircraft model static margin decreases.
Arctic cognition: a study of cognitive performance in summer and winter at 69 degrees N
NASA Technical Reports Server (NTRS)
Brennen, T.; Martinussen, M.; Hansen, B. O.; Hjemdal, O.
1999-01-01
Evidence has accumulated over the past 15 years that affect in humans is cyclical. In winter there is a tendency to depression, with remission in summer, and this effect is stronger at higher latitudes. In order to determine whether human cognition is similarly rhythmical, this study investigated the cognitive processes of 100 participants living at 69 degrees N. Participants were tested in summer and winter on a range of cognitive tasks, including verbal memory, attention and simple reaction time tasks. The seasonally counterbalanced design and the very northerly latitude of this study provide optimal conditions for detecting impaired cognitive performance in winter, and the conclusion is negative: of five tasks with seasonal effects, four had disadvantages in summer. Like the menstrual cycle, the circannual cycle appears to influence mood but not cognition.
Automation of POST Cases via External Optimizer and "Artificial p2" Calculation
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.
2017-01-01
During early conceptual design of complex systems, speed and accuracy are often at odds with one another. While many characteristics of the design are fluctuating rapidly during this phase, there is nonetheless a need to acquire accurate data from which to down-select designs, as these decisions will have a large impact upon program life-cycle cost. Therefore, enabling the conceptual designer to produce accurate data in a timely manner is critical to program viability. For conceptual design of launch vehicles, trajectory analysis and optimization is a large hurdle. Tools such as the industry standard Program to Optimize Simulated Trajectories (POST) have traditionally required an expert in the loop for setting up inputs, running the program, and analyzing the output. The solution space for trajectory analysis is in general non-linear and multi-modal, requiring an experienced analyst to weed out sub-optimal designs in pursuit of the global optimum. While an experienced analyst presented with a vehicle similar to one which they have already worked on can likely produce optimal performance figures in a timely manner, as soon as the "experienced" or "similar" adjectives are invalid the process can become lengthy. In addition, an experienced analyst working on a similar vehicle may go into the analysis with preconceived ideas about what the vehicle's trajectory should look like, which can result in sub-optimal performance being recorded. Thus, in any case but the ideal, either time or accuracy can be sacrificed. In the authors' previous work a tool called multiPOST was created which captures the heuristics of a human analyst over the process of executing trajectory analysis with POST. However, without the instincts of a human in the loop, this method relied upon Monte Carlo simulation to find successful trajectories. Overall the method has mixed results, and in the context of optimizing multiple vehicles it is inefficient in comparison to the method presented here. POST's internal optimizer functions like any other gradient-based optimizer. It has a specified variable to optimize whose value is represented as optval, a set of dependent constraints to meet with associated forms and tolerances whose value is represented as p2, and a set of independent variables known as the u-vector to modify in pursuit of optimality. Each of these quantities is calculated or manipulated at a certain phase within the trajectory. The optimizer is further constrained by the requirement that the input u-vector must result in a trajectory which proceeds through each of the prescribed events in the input file. For example, if the input u-vector causes the vehicle to crash before it can achieve the orbital parameters required for a parking orbit, then the run will fail without engaging the optimizer, and a p2 value of exactly zero is returned. This poses a problem, as this "non-connecting" region of the u-vector space is far larger than the "connecting" region which returns a non-zero value of p2 and can be worked on by the internal optimizer. Finding this connecting region and more specifically the global optimum within this region has traditionally required the use of an expert analyst.
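The title refers to an "artificial p2" computed when a run fails to connect (POST returns a p2 of exactly zero). The sketch below shows one plausible way an external optimizer wrapper could substitute a penalty in that case so the search still receives a useful signal; the run_post stand-in, the penalty based on the fraction of events completed, and the choice of differential evolution are all hypothetical and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

def run_post(u):
    """Hypothetical stand-in for a POST execution.  Returns (p2, fraction of
    events completed); p2 == 0.0 means the trajectory failed to connect
    through all prescribed events."""
    connected = np.all(np.abs(u) < 1.2)            # toy "connecting region"
    if not connected:
        return 0.0, float(np.mean(np.abs(u) < 1.2))
    return float(np.sum(u**2)) + 1e-3, 1.0

def artificial_p2(u, fail_scale=1e3):
    """Objective seen by the external optimizer: the real p2 when the run
    connects, otherwise a large penalty that shrinks as more events complete."""
    p2, progress = run_post(np.asarray(u))
    if p2 == 0.0:
        return fail_scale * (2.0 - progress)
    return p2

result = differential_evolution(artificial_p2, bounds=[(-3, 3)] * 4, seed=0,
                                maxiter=50, tol=1e-8)
print(result.x.round(3), round(result.fun, 4))
```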
Multi-objective optimization integrated with life cycle assessment for rainwater harvesting systems
NASA Astrophysics Data System (ADS)
Li, Yi; Huang, Youyi; Ye, Quanliang; Zhang, Wenlong; Meng, Fangang; Zhang, Shanxue
2018-03-01
The major limitation of optimization models previously applied to rainwater harvesting (RWH) systems is the lack of systematic evaluation of environmental and human health impacts across all lifecycle stages. This study integrated life cycle assessment (LCA) into a multi-objective optimization model to optimize the construction areas of green rooftops, porous pavements and green lands in Beijing, China, considering the trade-offs among 24 h-interval RWH volume (QR), stormwater runoff volume control ratio (R), economic cost (EC), and environmental impacts (EI). Eleven life cycle impact indicators were assessed with a functional unit of 10,000 m2 of RWH construction area. The LCA results showed that green lands had the smallest lifecycle impacts for all assessment indicators; in contrast, porous pavements showed the largest impact values except for Abiotic Depletion Potential (ADP) elements. Based on the standardization results, ADP fossil was chosen as the representative indicator for the calculation of the EI objective in the multi-objective optimization model due to its largest value across the lifecycle of all RWH systems. The optimization results for QR, R, EC and EI were 238.80 million m3, 78.5%, 66.68 billion RMB Yuan, and 1.05E + 16 MJ, respectively. After construction of the optimal RWH system, 14.7% of annual domestic water consumption would be supplied and 78.5% of the maximum daily rainfall would be controlled in Beijing, which would make a great contribution to reducing the stress of water scarcity and waterlogging problems. Green lands were the first choice for RWH in Beijing according to their rainwater harvesting capacity and lower environmental and human impacts. Porous pavements performed well in waterlogging alleviation (R of 67.5%); however, they did not receive a large construction area in this study due to their huge lifecycle ADP fossil impact. Sensitivity analysis revealed the daily maximum precipitation to be the key factor for the robustness of the results for the construction of the three RWH systems in this study.
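To make the multi-objective structure concrete, the sketch below scalarizes four objectives (maximize QR and R, minimize EC and EI) over the three construction areas with a weighted sum and a total-area budget, solved as a linear program; every per-unit coefficient, weight, and bound is a hypothetical placeholder, not a value from the study, and the study itself explores trade-offs rather than a single weighted solution.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: constructed areas (arbitrary units) of green rooftops,
# porous pavements and green lands.  All coefficients are hypothetical.
harvest = np.array([30.0, 45.0, 60.0])    # QR contribution per unit area
runoff_ctrl = np.array([0.5, 0.8, 0.6])   # contribution to runoff control R
cost = np.array([9.0, 7.0, 4.0])          # economic cost EC per unit area
impact = np.array([2.0, 6.0, 1.0])        # lifecycle ADP-fossil impact EI

w = np.array([1.0, 1.0, 0.5, 0.5])        # weights for (QR, R, EC, EI)
# Weighted-sum scalarization: maximize w1*QR + w2*R - w3*EC - w4*EI,
# written as a minimization for linprog.
c = -(w[0] * harvest + w[1] * runoff_ctrl) + w[2] * cost + w[3] * impact

res = linprog(c, A_ub=[[1, 1, 1]], b_ub=[100.0],      # total-area budget
              bounds=[(0, 60), (0, 60), (0, 60)], method="highs")
print("construction areas:", res.x.round(1))
```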
Avdievich, Nikolai I; Giapitzakis, Ioannis-Angelos; Pfrommer, Andreas; Henning, Anke
2018-02-01
To improve the decoupling of a transceiver human head phased array at ultra-high fields (UHF, ≥ 7T) and to optimize its transmit (Tx) and receive (Rx) performance, a single-row eight-element (1 × 8) tight-fit transceiver overlapped loop array was developed and constructed. Overlapping the loops increases the RF field penetration depth but can compromise decoupling by generating substantial mutual resistance. Based on analytical modeling, we optimized the loop geometry and relative positioning to simultaneously minimize the resistive and inductive coupling and constructed a 9.4T eight-loop transceiver head phased array decoupled entirely by overlapping loops. We demonstrated that both the magnetic and electric coupling between adjacent loops is compensated at the same time by overlapping and nearly perfect decoupling (below -30 dB) can be obtained without additional decoupling strategies. Tx-efficiency and SNR of the overlapped array outperformed that of a common UHF gapped array of similar dimensions. Parallel Rx-performance was also not compromised due to overlapping the loops. As a proof of concept we developed and constructed a 9.4T (400 MHz) overlapped transceiver head array based on results of the analytical modeling. We demonstrated that at UHF overlapping loops not only provides excellent decoupling but also improves both Tx- and Rx-performance. Magn Reson Med 79:1200-1211, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Stochastic control approaches for sensor management in search and exploitation
NASA Astrophysics Data System (ADS)
Hitchings, Darin Chester
Recent improvements in the capabilities of autonomous vehicles have motivated their increased use in such applications as defense, homeland security, environmental monitoring, and surveillance. To enhance performance in these applications, new algorithms are required to control teams of robots autonomously and through limited interactions with human operators. In this dissertation we develop new algorithms for control of robots performing information-seeking missions in unknown environments. These missions require robots to control their sensors in order to discover the presence of objects, keep track of the objects, and learn what these objects are, given a fixed sensing budget. Initially, we investigate control of multiple sensors, with a finite set of sensing options and finite-valued measurements, to locate and classify objects given a limited resource budget. The control problem is formulated as a Partially Observed Markov Decision Problem (POMDP), but its exact solution requires excessive computation. Under the assumption that sensor error statistics are independent and time-invariant, we develop a class of algorithms using Lagrangian Relaxation techniques to obtain optimal mixed strategies using performance bounds developed in previous research. We investigate alternative Receding Horizon (RH) controllers to convert the mixed strategies to feasible adaptive-sensing strategies and evaluate the relative performance of these controllers in simulation. The resulting controllers provide superior performance to alternative algorithms proposed in the literature and obtain solutions to large-scale POMDP problems several orders of magnitude faster than optimal Dynamic Programming (DP) approaches with comparable performance quality. We extend our results for finite action, finite measurement sensor control to scenarios with moving objects. We use Hidden Markov Models (HMMs) for the evolution of objects, according to the dynamics of a birth-death process. We develop a new lower bound on the performance of adaptive controllers in these scenarios, develop algorithms for computing solutions to this lower bound, and use these algorithms as part of an RH controller for sensor allocation in the presence of moving objects. We also consider an adaptive search problem where sensing actions are continuous and the underlying measurement space is also continuous. We extend our previous hierarchical decomposition approach based on performance bounds to this problem and develop novel implementations of Stochastic Dynamic Programming (SDP) techniques to solve this problem. Our algorithms are nearly two orders of magnitude faster than previously proposed approaches and yield solutions of comparable quality. For supervisory control, we discuss how human operators can work with and augment robotic teams performing these tasks. Our focus is on how tasks are partitioned among teams of robots and how a human operator can make intelligent decisions for task partitioning. We explore these questions through the design of a game that involves robot automata controlled by our algorithms and a human supervisor who partitions tasks based on different levels of support information. This game can be used with human subject experiments to explore the effect of information on the quality of supervisory control.
ERIC Educational Resources Information Center
Ansari, Fazel; Seidenberg, Ulrich
2016-01-01
This paper discusses the complementarity of human and cyber physical production systems (CPPS). The discourse of complementarity is elaborated by defining five criteria for comparing the characteristics of human and CPPS. Finally, a management portfolio matrix is proposed for examining the feasibility of optimal collaboration between them. The…
NASA Technical Reports Server (NTRS)
Gernhardt, M.L.; Chappell, S.P.
2009-01-01
The EVA Physiology, Systems and Performance (EPSP) Project is performing tests in different analog environments to understand human performance during Extravehicular Activity (EVA) with the aim of developing safer and more efficient systems for lunar exploration missions and the Constellation Program. The project is characterizing human EVA performance in studies using several test beds, including the underwater NASA Extreme Environment Mission Operations (NEEMO) and Neutral Buoyancy Laboratory (NBL) facilities, JSC's Partial Gravity Simulator (POGO), and the NASA Reduced Gravity Office (RGO) parabolic flight aircraft. Using these varied testing environments, NASA can gain a more complete understanding of human performance issues related to EVA and the limitations of each testing environment. Tests are focused on identifying and understanding the EVA system factors that affect human performance such as center of gravity (CG), inertial mass, ground reaction forces (GRF), suit weight, and suit pressure. The test results will lead to the development of lunar EVA systems operations concepts and design requirements that optimize human performance and exploration capabilities. METHODS: Tests were conducted in the NBL and during NEEMO missions in the NOAA Aquarius Habitat. A reconfigurable back pack with repositionable mass was used to simulate Perfect, Low, Forward, High, Aft and NASA Baseline CG locations. Subjects performed simulated exploration tasks that included ambulation, kneel and recovery, rock pick-up, and shoveling. Testing using POGO, which simulates partial gravity via a pneumatic weight offload system and a similar reconfigurable rig, is underway for a subset of the same tasks. Additionally, test trials are being performed on the RGO parabolic flight aircraft. Subject performance was assessed using a modified Cooper-Harper scale to assess operator compensation required to achieve desired performance. All CG locations are based on the assumption of a standardized 6 ft 180 lb subject. RESULTS: The modified Cooper-Harper scale assesses desired task performance, defined as performance in a reduced gravity environment compared to a 1G environment. Modified Cooper-Harper ratings of ≤ 3 indicate no improvements are needed, ratings of 4-6 indicate improvements are desirable, and ratings ≥ 7 indicate improvements are mandatory. DISCUSSION: Differences were noted in suited CH results based on environment at the same CG and suit pressure. Additionally, results suggest that CG location affects unsuited human performance. Subjects preferred locations near their natural CG over those that are high, aft, or a combination of high and aft. Further testing and analyses are planned to compare these unsuited results to suited performance.
McCafferty, Sean J; Schwiegerling, Jim T
2015-04-01
We present an analysis methodology for developing and evaluating accommodating intraocular lenses incorporating a deformable interface. The next-generation design of an extruded gel interface intraocular lens is presented. A prototype based upon a similar, previously in vivo proven design was tested with measurements of actuation force, lens power, interface contour, optical transfer function, and visual Strehl ratio. Prototype-verified mathematical models were used to optimize optical and mechanical design parameters to maximize the image quality and minimize the required force to accommodate. The prototype lens produced adequate image quality with the available physiologic accommodating force. The iterative mathematical modeling based upon the prototype yielded maximized optical and mechanical performance through the maximum allowable gel thickness to extrusion diameter ratio, the maximum feasible refractive index change at the interface, and minimum gel material properties in Poisson's ratio and Young's modulus. The design prototype performed well. It operated within the physiologic constraints of the human eye, including the force available for full accommodative amplitude using the eye's natural focusing feedback, while maintaining image quality in the space available. The parameters that optimized optical and mechanical performance were delineated as those that minimize both asphericity and actuation pressure. The design parameters outlined herein can be used as a template to maximize the performance of a deformable interface intraocular lens. The article combines a multidisciplinary basic science approach from biomechanics, optical science, and ophthalmology to optimize an intraocular lens design suitable for preliminary animal trials.
CAMS as a tool for human factors research in spaceflight
NASA Astrophysics Data System (ADS)
Sauer, Juergen
2004-01-01
The paper reviews a number of research studies that were carried out with a PC-based task environment called Cabin Air Management System (CAMS) simulating the operation of a spacecraft's life support system. As CAMS was a multiple task environment, it allowed the measurement of performance at different levels. Four task components of different priority were embedded in the task environment: diagnosis and repair of system faults, maintaining atmospheric parameters in a safe state, acknowledgement of system alarms (reaction time), and keeping a record of critical system resources (prospective memory). Furthermore, the task environment permitted the examination of different task management strategies and changes in crew member state (fatigue, anxiety, mental effort). A major goal of the research programme was to examine how crew members adapted to various forms of sub-optimal working conditions, such as isolation and confinement, sleep deprivation and noise. None of the studies provided evidence for decrements in primary task performance. However, the results showed a number of adaptive responses of crew members to adjust to the different sub-optimal working conditions. There was evidence for adjustments in information sampling strategies (usually reductions in sampling frequency) as a result of unfavourable working conditions. The results also showed selected decrements in secondary task performance. Prospective memory seemed to be somewhat more vulnerable to sub-optimal working conditions than performance on the reaction time task. Finally, suggestions are made for future research with the CAMS environment.
NASA Astrophysics Data System (ADS)
Ouyang, Qin; Chen, Quansheng; Zhao, Jiewen
2016-02-01
The approach presented herein reports the application of near infrared (NIR) spectroscopy, in comparison with a human sensory panel, as a tool for estimating Chinese rice wine quality; concretely, to predict the overall sensory scores assigned by the trained sensory panel. A back propagation artificial neural network (BPANN) combined with the adaptive boosting (AdaBoost) algorithm, namely BP-AdaBoost, was proposed as a novel nonlinear modeling algorithm. First, the optimal spectral intervals were selected by synergy interval partial least squares (Si-PLS). Then, a BP-AdaBoost model based on the optimal spectral intervals was established, called the Si-BP-AdaBoost model. These models were optimized by cross validation, and the performance of each final model was evaluated according to the correlation coefficient (Rp) and the root mean square error of prediction (RMSEP) in the prediction set. Si-BP-AdaBoost showed excellent performance in comparison with other models. The best Si-BP-AdaBoost model was achieved with Rp = 0.9180 and RMSEP = 2.23 in the prediction set. It was concluded that NIR spectroscopy combined with Si-BP-AdaBoost was an appropriate method for the prediction of sensory quality in Chinese rice wine.
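As a rough illustration of the evaluation step described above, the sketch below fits a boosted regressor to synthetic "spectra" and scores it with Rp and RMSEP on a held-out prediction set. scikit-learn's AdaBoostRegressor (tree weak learners) stands in for the paper's BP-AdaBoost networks, and the Si-PLS interval selection is mimicked by simply keeping a subset of columns; all data and parameters are invented.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 40))                                # 120 samples x 40 NIR variables (synthetic)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.3, size=120)    # proxy for the panel's sensory score

# "Interval selection" stand-in: keep a block of informative columns.
X_sel = X[:, :10]

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)
model = AdaBoostRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

rp = np.corrcoef(y_te, pred)[0, 1]              # correlation coefficient in the prediction set
rmsep = np.sqrt(np.mean((y_te - pred) ** 2))    # root mean square error of prediction
print(f"Rp = {rp:.3f}, RMSEP = {rmsep:.3f}")
```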
Heinz, M G; Colburn, H S; Carney, L H
2001-10-01
The perceptual significance of the cochlear amplifier was evaluated by predicting level-discrimination performance based on stochastic auditory-nerve (AN) activity. Performance was calculated for three models of processing: the optimal all-information processor (based on discharge times), the optimal rate-place processor (based on discharge counts), and a monaural coincidence-based processor that uses a non-optimal combination of rate and temporal information. An analytical AN model included compressive magnitude and level-dependent-phase responses associated with the cochlear amplifier, and high-, medium-, and low-spontaneous-rate (SR) fibers with characteristic frequencies (CFs) spanning the AN population. The relative contributions of nonlinear magnitude and nonlinear phase responses to level encoding were compared by using four versions of the model, which included and excluded the nonlinear gain and phase responses in all possible combinations. Nonlinear basilar-membrane (BM) phase responses are robustly encoded in near-CF AN fibers at low frequencies. Strongly compressive BM responses at high frequencies near CF interact with the high thresholds of low-SR AN fibers to produce large dynamic ranges. Coincidence performance based on a narrow range of AN CFs was robust across a wide dynamic range at both low and high frequencies, and matched human performance levels. Coincidence performance based on all CFs demonstrated the "near-miss" to Weber's law at low frequencies and the high-frequency "mid-level bump." Monaural coincidence detection is a physiologically realistic mechanism that is extremely general in that it can utilize AN information (average-rate, synchrony, and nonlinear-phase cues) from all SR groups.
NASA Astrophysics Data System (ADS)
Ma, Jinlei; Zhou, Zhiqiang; Wang, Bo; Zong, Hua
2017-05-01
The goal of infrared (IR) and visible image fusion is to produce a more informative image for human observation or some other computer vision tasks. In this paper, we propose a novel multi-scale fusion method based on visual saliency map (VSM) and weighted least square (WLS) optimization, aiming to overcome some common deficiencies of conventional methods. Firstly, we introduce a multi-scale decomposition (MSD) using the rolling guidance filter (RGF) and Gaussian filter to decompose input images into base and detail layers. Compared with conventional MSDs, this MSD can achieve the unique property of preserving the information of specific scales and reducing halos near edges. Secondly, we argue that the base layers obtained by most MSDs would contain a certain amount of residual low-frequency information, which is important for controlling the contrast and overall visual appearance of the fused image, and the conventional "averaging" fusion scheme is unable to achieve desired effects. To address this problem, an improved VSM-based technique is proposed to fuse the base layers. Lastly, a novel WLS optimization scheme is proposed to fuse the detail layers. This optimization aims to transfer more visual details and less irrelevant IR details or noise into the fused image. As a result, the fused image details would appear more naturally and be suitable for human visual perception. Experimental results demonstrate that our method can achieve a superior performance compared with other fusion methods in both subjective and objective assessments.
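A heavily simplified sketch of the base/detail idea follows: a Gaussian filter stands in for the rolling-guidance multi-scale decomposition, a local-contrast saliency weight replaces the paper's VSM rule, and a per-pixel maximum substitutes for the WLS detail optimization. Images and parameter values are illustrative only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse(ir, vis, sigma=5.0):
    # 1. Decompose each image into a base (low-frequency) and a detail layer.
    base_ir, base_vis = gaussian_filter(ir, sigma), gaussian_filter(vis, sigma)
    det_ir, det_vis = ir - base_ir, vis - base_vis

    # 2. Fuse the base layers with saliency weights (local contrast magnitude)
    #    instead of plain averaging.
    sal_ir = np.abs(ir - gaussian_filter(ir, 2 * sigma)) + 1e-6
    sal_vis = np.abs(vis - gaussian_filter(vis, 2 * sigma)) + 1e-6
    w = sal_ir / (sal_ir + sal_vis)
    base = w * base_ir + (1 - w) * base_vis

    # 3. Fuse the detail layers, keeping the larger-magnitude detail per pixel.
    det = np.where(np.abs(det_ir) > np.abs(det_vis), det_ir, det_vis)
    return base + det

ir = np.random.rand(64, 64)    # stand-in infrared image
vis = np.random.rand(64, 64)   # stand-in visible image
print(fuse(ir, vis).shape)
```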
Evolutionary Optimization of a Quadrifilar Helical Antenna
NASA Technical Reports Server (NTRS)
Lohn, Jason D.; Kraus, William F.; Linden, Derek S.; Clancy, Daniel (Technical Monitor)
2002-01-01
Automated antenna synthesis via evolutionary design has recently garnered much attention in the research literature. Evolutionary algorithms show promise because, among search algorithms, they are able to effectively search large, unknown design spaces. NASA's Mars Odyssey spacecraft is due to reach final Martian orbit insertion in January, 2002. Onboard the spacecraft is a quadrifilar helical antenna that provides telecommunications in the UHF band with landed assets, such as robotic rovers. Each helix is driven by the same signal which is phase-delayed in 90 deg increments. A small ground plane is provided at the base. It is designed to operate in the frequency band of 400-438 MHz. Based on encouraging previous results in automated antenna design using evolutionary search, we wanted to see whether such techniques could improve upon Mars Odyssey antenna design. Specifically, a co-evolutionary genetic algorithm is applied to optimize the gain and size of the quadrifilar helical antenna. The optimization was performed in-situ in the presence of a neighboring spacecraft structure. On the spacecraft, a large aluminum fuel tank is adjacent to the antenna. Since this fuel tank can dramatically affect the antenna's performance, we leave it to the evolutionary process to see if it can exploit the fuel tank's properties advantageously. Optimizing in the presence of surrounding structures would be quite difficult for human antenna designers, and thus the actual antenna was designed for free space (with a small ground plane). In fact, when flying on the spacecraft, surrounding structures that are moveable (e.g., solar panels) may be moved during the mission in order to improve the antenna's performance.
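For flavor, here is a toy genetic-algorithm loop of the general kind described above, searching over helix radius, pitch, and number of turns. The fitness function is a made-up smooth surrogate; a real run would evaluate each candidate with an electromagnetic solver and, as in the paper, co-evolve populations in the presence of the neighboring spacecraft structure. All names, bounds, and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
LOW = np.array([0.02, 0.05, 2.0])    # radius (m), pitch (m), turns - assumed bounds
HIGH = np.array([0.10, 0.30, 8.0])

def fitness(x):
    # Placeholder for simulated antenna gain in the 400-438 MHz band.
    r, p, n = x
    return -((r - 0.05) ** 2 + (p - 0.15) ** 2 + 0.01 * (n - 5.0) ** 2)

def evolve(pop_size=40, generations=60, mutation=0.05):
    pop = rng.uniform(LOW, HIGH, size=(pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(x) for x in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]          # keep the best half
        children = parents[rng.integers(len(parents), size=pop_size - len(parents))]
        children = children + rng.normal(scale=mutation, size=children.shape) * (HIGH - LOW)
        pop = np.clip(np.vstack([parents, children]), LOW, HIGH)    # mutate and re-bound
    return pop[np.argmax([fitness(x) for x in pop])]

print(evolve())   # best radius, pitch, turns found for the toy fitness
```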
Mikhaylova, E; Kolstein, M; De Lorenzo, G; Chmeissani, M
2014-07-01
A novel positron emission tomography (PET) scanner design based on a room-temperature pixelated CdTe solid-state detector is being developed within the framework of the Voxel Imaging PET (VIP) Pathfinder project [1]. The simulation results show a great potential of the VIP to produce high-resolution images even in extremely challenging conditions such as the screening of a human head [2]. With unprecedented high channel density (450 channels/cm³) image reconstruction is a challenge. Therefore optimization is needed to find the best algorithm in order to exploit correctly the promising detector potential. The following reconstruction algorithms are evaluated: 2-D Filtered Backprojection (FBP), Ordered Subset Expectation Maximization (OSEM), List-Mode OSEM (LM-OSEM), and the Origin Ensemble (OE) algorithm. The evaluation is based on the comparison of a true image phantom with a set of reconstructed images obtained by each algorithm. This is achieved by calculation of image quality merit parameters such as the bias, the variance and the mean square error (MSE). A systematic optimization of each algorithm is performed by varying the reconstruction parameters, such as the cutoff frequency of the noise filters and the number of iterations. The region of interest (ROI) analysis of the reconstructed phantom is also performed for each algorithm and the results are compared. Additionally, the performance of the image reconstruction methods is compared by calculating the modulation transfer function (MTF). The reconstruction time is also taken into account to choose the optimal algorithm. The analysis is based on GAMOS [3] simulation including the expected CdTe and electronic specifics.
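The figure-of-merit comparison described above reduces to a few lines once the true phantom and a stack of reconstructions are available. The sketch below computes bias, variance, and MSE for one algorithm's reconstructions; the phantom and "reconstructions" are synthetic stand-ins, not VIP simulation output.

```python
import numpy as np

def image_quality_metrics(true_img, recon_stack):
    """recon_stack: (n_realizations, H, W) reconstructions of the same phantom."""
    mean_recon = recon_stack.mean(axis=0)
    bias = float(np.mean(mean_recon - true_img))            # mean signed error
    variance = float(np.mean(recon_stack.var(axis=0)))      # mean per-pixel variance
    mse = float(np.mean((recon_stack - true_img) ** 2))     # mean square error
    return bias, variance, mse

rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0                                  # simple box phantom
recons = phantom + rng.normal(scale=0.1, size=(20, 64, 64))  # e.g. 20 noisy OSEM realizations
print(image_quality_metrics(phantom, recons))
```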
NASA Astrophysics Data System (ADS)
Pathiraja, G. C.; Wijesingha, M. S.; Nanayakkara, N.
2017-05-01
Chlorpyrifos, a widely used organophosphate pesticide that can be found in surface water bodies, is harmful to the human body. Thus, treating water contaminated with chlorpyrifos is important. In our previous studies, a novel Ti/IrO2-SnO2 anode was successfully developed for electrochemical degradation of chlorpyrifos in chloride-free water. In this study, optimization of the previously developed Ti/IrO2-SnO2 anode for mineralization of chlorpyrifos was successfully performed through response surface methodology. During the optimization study, a two-level factorial design was used to determine the optimal coating solution concentrations for developing the Ti/IrO2-SnO2 anode. Cyclic voltammetry and open circuit potential measurements were performed to investigate the electrochemically active surface area and stability of these anodes. The response surface and contour plots show that the electrode coated with 0.3 M [Ir] and 7.5 mM [Sn] has both the highest anodic charge and the highest stability. Scanning electron microscopy (SEM) images show evidence of both compact and porous regions in the surface of the thin film, resulting in a larger surface area. Within 6 h, the best mineralization of chlorpyrifos (55.56%) was obtained with the anode coated with 0.3 M [Ir] and 7.5 mM [Sn], as measured with a total organic carbon (TOC) analyzer. Therefore, the optimum coating concentrations were found to be 0.3 M [Ir] and 7.5 mM [Sn]. The process would require an energy consumption of 6 kWh m⁻³.
Lenguito, Giovanni; Chaimov, Deborah; Weitz, Jonathan R; Rodriguez-Diaz, Rayner; Rawal, Siddarth A K; Tamayo-Garcia, Alejandro; Caicedo, Alejandro; Stabler, Cherie L; Buchwald, Peter; Agarwal, Ashutosh
2017-02-28
We report the design and fabrication of a robust fluidic platform built out of inert plastic materials and micromachined features that promote optimized convective fluid transport. The platform is tested for perfusion interrogation of rodent and human pancreatic islets, dynamic secretion of hormones, concomitant live-cell imaging, and optogenetic stimulation of genetically engineered islets. A coupled quantitative fluid dynamics computational model of glucose stimulated insulin secretion and fluid dynamics was first utilized to design device geometries that are optimal for complete perfusion of three-dimensional islets, effective collection of secreted insulin, and minimization of system volumes and associated delays. Fluidic devices were then fabricated through rapid prototyping techniques, such as micromilling and laser engraving, as two interlocking parts from materials that are non-absorbent and inert. Finally, the assembly was tested for performance using both rodent and human islets with multiple assays conducted in parallel, such as dynamic perfusion, staining and optogenetics on standard microscopes, as well as for integration with commercial perfusion machines. The optimized design of convective fluid flows, use of bio-inert and non-absorbent materials, reversible assembly, manual access for loading and unloading of islets, and straightforward integration with commercial imaging and fluid handling systems proved to be critical for perfusion assay, and particularly suited for time-resolved optogenetics studies.
Selection of optimal spectral sensitivity functions for color filter arrays.
Parmar, Manu; Reeves, Stanley J
2010-12-01
A color image meant for human consumption can be appropriately displayed only if at least three distinct color channels are present. Typical digital cameras acquire three-color images with only one sensor. A color filter array (CFA) is placed on the sensor such that only one color is sampled at a particular spatial location. This sparsely sampled signal is then reconstructed to form a color image with information about all three colors at each location. In this paper, we show that the wavelength sensitivity functions of the CFA color filters affect both the color reproduction ability and the spatial reconstruction quality of recovered images. We present a method to select perceptually optimal color filter sensitivity functions based upon a unified spatial-chromatic sampling framework. A cost function independent of particular scenes is defined that expresses the error between a scene viewed by the human visual system and the reconstructed image that represents the scene. A constrained minimization of the cost function is used to obtain optimal values of color-filter sensitivity functions for several periodic CFAs. The sensitivity functions are shown to perform better than typical RGB and CMY color filters in terms of both the s-CIELAB ∆E error metric and a qualitative assessment.
Modified Optimization Water Index (MOWI) for Landsat-8 OLI/TIRS
NASA Astrophysics Data System (ADS)
Moradi, M.; Sahebi, M.; Shokri, M.
2017-09-01
Water is one of the most important resources essential for human life. Due to population growth and the increasing human need for water, proper management of water resources will be one of the serious challenges of the next decades. Remote sensing data are the best means of managing water resources because of their time and cost effectiveness over a greater range of temporal and spatial scales. Among the many kinds of satellite data, from SAR to optical and from high resolution to low resolution, Landsat imagery is particularly attractive for water detection and for management of earth surface water. Landsat-8 OLI/TIRS is the newest satellite of the Landsat series. In this paper, we investigated the full spectral potential of Landsat-8 for water detection. Many kinds of methods have been developed for this purpose, and index-based methods have some advantages over the others. Previous indices use only a limited number of spectral bands. In this paper, the Modified Optimization Water Index (MOWI) is defined as a linear combination of bands whose coefficients are calculated by a particle swarm optimization algorithm. The results show that the modified optimization water index (MOWI) performs properly under different conditions such as cloud, cloud shadow, and mountain shadow.
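The coefficient search can be pictured with a plain particle swarm optimizer: particles are candidate band weights (plus a bias term here), and the fitness is how well the resulting linear index separates water from non-water training pixels. The reflectance samples, labels, zero threshold, and fitness choice below are all synthetic assumptions; the actual MOWI uses Landsat-8 OLI/TIRS bands and the paper's own fitness criterion.

```python
import numpy as np

rng = np.random.default_rng(3)
n_bands = 7

# Synthetic reflectance samples: water pixels darker overall than land pixels.
water = rng.normal(loc=0.15, scale=0.05, size=(200, n_bands))
land = rng.normal(loc=0.40, scale=0.05, size=(200, n_bands))
X = np.vstack([water, land])
y = np.array([1] * 200 + [0] * 200)          # 1 = water, 0 = non-water

def fitness(w):
    """Accuracy of the linear index X @ w[:-1] + w[-1], thresholded at zero."""
    pred = (X @ w[:-1] + w[-1] > 0).astype(int)
    return np.mean(pred == y)

def pso(n_particles=30, iters=100, inertia=0.7, c1=1.5, c2=1.5):
    dim = n_bands + 1                         # band weights plus a bias term
    pos = rng.uniform(-1, 1, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)]
    return gbest, pbest_val.max()

coefficients, accuracy = pso()
print(accuracy)
```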
Simulation of talking faces in the human brain improves auditory speech recognition
von Kriegstein, Katharina; Dogan, Özgür; Grüter, Martina; Giraud, Anne-Lise; Kell, Christian A.; Grüter, Thomas; Kleinschmidt, Andreas; Kiebel, Stefan J.
2008-01-01
Human face-to-face communication is essentially audiovisual. Typically, people talk to us face-to-face, providing concurrent auditory and visual input. Understanding someone is easier when there is visual input, because visual cues like mouth and tongue movements provide complementary information about speech content. Here, we hypothesized that, even in the absence of visual input, the brain optimizes both auditory-only speech and speaker recognition by harvesting speaker-specific predictions and constraints from distinct visual face-processing areas. To test this hypothesis, we performed behavioral and neuroimaging experiments in two groups: subjects with a face recognition deficit (prosopagnosia) and matched controls. The results show that observing a specific person talking for 2 min improves subsequent auditory-only speech and speaker recognition for this person. In both prosopagnosics and controls, behavioral improvement in auditory-only speech recognition was based on an area typically involved in face-movement processing. Improvement in speaker recognition was only present in controls and was based on an area involved in face-identity processing. These findings challenge current unisensory models of speech processing, because they show that, in auditory-only speech, the brain exploits previously encoded audiovisual correlations to optimize communication. We suggest that this optimization is based on speaker-specific audiovisual internal models, which are used to simulate a talking face. PMID:18436648
Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks.
Chande, Ruchi D; Hargraves, Rosalyn Hobson; Ortiz-Robinson, Norma; Wayne, Jennifer S
2017-01-01
Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter per mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.
Choosing colors for map display icons using models of visual search.
Shive, Joshua; Francis, Gregory
2013-04-01
We show how to choose colors for icons on maps to minimize search time using predictions of a model of visual search. The model analyzes digital images of a search target (an icon on a map) and a search display (the map containing the icon) and predicts search time as a function of target-distractor color distinctiveness and target eccentricity. We parameterized the model using data from a visual search task and performed a series of optimization tasks to test the model's ability to choose colors for icons to minimize search time across icons. Map display designs made by this procedure were tested experimentally. In a follow-up experiment, we examined the model's flexibility to assign colors in novel search situations. The model fits human performance, performs well on the optimization tasks, and can choose colors for icons on maps with novel stimuli to minimize search time without requiring additional model parameter fitting. Models of visual search can suggest color choices that produce search time reductions for display icons. Designers should consider constructing visual search models as a low-cost method of evaluating color assignments.
Yabalak, Erdal
2018-05-18
This study was performed to investigate the mineralization of ticarcillin in an artificially prepared aqueous solution representing ticarcillin-contaminated waters, which constitute a serious problem for human health. Removal of 81.99% of total organic carbon, 79.65% of chemical oxygen demand, and 94.35% of ticarcillin was achieved by using the eco-friendly, time-saving, powerful, and easy-to-apply subcritical water oxidation method in the presence of a safe-to-use oxidizing agent, hydrogen peroxide. Central composite design, which belongs to the response surface methodology, was applied to design the degradation experiments, to optimize the method, and to evaluate the effects of the system variables, namely temperature, hydrogen peroxide concentration, and treatment time, on the responses. In addition, theoretical equations were proposed for each removal process. ANOVA tests were utilized to evaluate the reliability of the fitted models. F values of 245.79, 88.74, and 48.22 were found for total organic carbon removal, chemical oxygen demand removal, and ticarcillin removal, respectively. Moreover, artificial neural network modeling was applied to estimate the response in each case, and its prediction and optimization performance was statistically examined and compared to the performance of the central composite design.
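The response-surface step can be illustrated with a small quadratic fit: code the three factors, fit a second-order polynomial by least squares, and read an optimum off a grid. The synthetic "removal" data and coefficients below are illustrative stand-ins, not the study's measurements or design points.

```python
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(20, 3))          # coded temperature, [H2O2], treatment time
y = (80 - 5 * (X[:, 0] - 0.4) ** 2 - 8 * (X[:, 1] - 0.2) ** 2
     - 3 * (X[:, 2] + 0.1) ** 2 + rng.normal(scale=0.5, size=20))   # synthetic removal (%)

def design_matrix(X):
    """Full second-order model: intercept, linear, interaction, and square terms."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
    cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(3), 2)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Locate the predicted optimum on a coarse grid of coded factor settings.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = design_matrix(grid) @ beta
print("optimum (coded units):", grid[np.argmax(pred)], "predicted removal:", pred.max())
```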
Optimism and Cardiovascular Function in Children with Congenital Heart Disease
2010-02-17
their theories. For example, Sigmund Freud (1856-1939) included references to both optimism and pessimism in his theory of human nature and...optimistic side of human nature and the drive towards death represents the pessimistic aspect of human nature (Freud, 1927/1961, p. 8). The...W. (1977). Longitudinal physique changes among healthy white veterans at Boston. Human Biology, 49, 541-558. Freud, S. (1927/1961). Civilization
Attenuation correction for the large non-human primate brain imaging using microPET.
Naidoo-Variawa, S; Lehnert, W; Kassiou, M; Banati, R; Meikle, S R
2010-04-21
Assessment of the biodistribution and pharmacokinetics of radiopharmaceuticals in vivo is often performed on animal models of human disease prior to their use in humans. The baboon brain is physiologically and neuro-anatomically similar to the human brain and is therefore a suitable model for evaluating novel CNS radioligands. We previously demonstrated the feasibility of performing baboon brain imaging on a dedicated small animal PET scanner provided that the data are accurately corrected for degrading physical effects such as photon attenuation in the body. In this study, we investigated factors affecting the accuracy and reliability of alternative attenuation correction strategies when imaging the brain of a large non-human primate (papio hamadryas) using the microPET Focus 220 animal scanner. For measured attenuation correction, the best bias versus noise performance was achieved using a (57)Co transmission point source with a 4% energy window. The optimal energy window for a (68)Ge transmission source operating in singles acquisition mode was 20%, independent of the source strength, providing bias-noise performance almost as good as for (57)Co. For both transmission sources, doubling the acquisition time had minimal impact on the bias-noise trade-off for corrected emission images, despite observable improvements in reconstructed attenuation values. In a [(18)F]FDG brain scan of a female baboon, both measured attenuation correction strategies achieved good results and similar SNR, while segmented attenuation correction (based on uncorrected emission images) resulted in appreciable regional bias in deep grey matter structures and the skull. We conclude that measured attenuation correction using a single pass (57)Co (4% energy window) or (68)Ge (20% window) transmission scan achieves an excellent trade-off between bias and propagation of noise when imaging the large non-human primate brain with a microPET scanner.
Modeling the biomechanical and injury response of human liver parenchyma under tensile loading.
Untaroiu, Costin D; Lu, Yuan-Chiao; Siripurapu, Sundeep K; Kemper, Andrew R
2015-01-01
The rapid advancement in computational power has made human finite element (FE) models one of the most efficient tools for assessing the risk of abdominal injuries in a crash event. In this study, specimen-specific FE models were employed to quantify material and failure properties of human liver parenchyma using an FE optimization approach. Uniaxial tensile tests were performed on 34 parenchyma coupon specimens prepared from two fresh human livers. Each specimen was tested to failure at one of four loading rates (0.01 s⁻¹, 0.1 s⁻¹, 1 s⁻¹, and 10 s⁻¹) to investigate the effects of rate dependency on the biomechanical and failure response of liver parenchyma. Each test was simulated by prescribing the end displacements of specimen-specific FE models based on the corresponding test data. The parameters of a first-order Ogden material model were identified for each specimen by an FE optimization approach while simulating the pre-tear loading region. The mean material model parameters were then determined for each loading rate from the characteristic averages of the stress-strain curves, and a stochastic optimization approach was utilized to determine the standard deviations of the material model parameters. A hyperelastic material model using a tabulated formulation for rate effects showed good predictions in terms of tensile material properties of human liver parenchyma. Furthermore, the tissue tearing was numerically simulated using a cohesive zone modeling (CZM) approach. A layer of cohesive elements was added at the failure location, and the CZM parameters were identified by fitting the post-tear force-time history recorded in each test. The results show that the proposed approach is able to capture both the biomechanical and failure response, and accurately model the overall force-deflection response of liver parenchyma over a large range of tensile loading rates. Copyright © 2014 Elsevier Ltd. All rights reserved.
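For reference, one common incompressible form of the first-order Ogden strain-energy function named above is sketched below; the abstract does not state which convention the authors used, and FE codes differ in how the shear-modulus term is scaled (some write it as 2μ/α²).

```latex
% First-order Ogden strain energy, incompressible form (one common convention):
W(\lambda_1,\lambda_2,\lambda_3)
  = \frac{\mu}{\alpha}\left(\lambda_1^{\alpha}+\lambda_2^{\alpha}+\lambda_3^{\alpha}-3\right),
\qquad \lambda_1\lambda_2\lambda_3 = 1,
```

where the λ_i are the principal stretches and μ, α are the two parameters identified per specimen by the FE optimization.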
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; Ronald Boring; Lew Hanes
2013-09-01
The U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) program is collaborating with a U.S. nuclear utility to bring about a systematic fleet-wide control room modernization. To facilitate this upgrade, a new distributed control system (DCS) is being introduced into the control rooms of these plants. The DCS will upgrade the legacy plant process computer and emergency response facility information system. In addition, the DCS will replace an existing analog turbine control system with a display-based system. With technology upgrades comes the opportunity to improve the overall human-system interaction between the operators and the control room. To optimize operator performance, the LWRS Control Room Modernization research team followed a human-centered approach published by the U.S. Nuclear Regulatory Commission. NUREG-0711, Rev. 3, Human Factors Engineering Program Review Model (O’Hara et al., 2012), prescribes four phases for human factors engineering. This report provides examples of the first phase, Planning and Analysis. The three elements of Planning and Analysis in NUREG-0711 that are most crucial to initiating control room upgrades are:
• Operating Experience Review: Identifies opportunities for improvement in the existing system and provides lessons learned from implemented systems.
• Function Analysis and Allocation: Identifies which functions at the plant may be optimally handled by the DCS vs. the operators.
• Task Analysis: Identifies how tasks might be optimized for the operators.
Each of these elements is covered in a separate chapter. Examples are drawn from workshops with reactor operators that were conducted at the LWRS Human System Simulation Laboratory (HSSL) and at the respective plants. The findings in this report represent generalized accounts of more detailed proprietary reports produced for the utility for each plant. The goal of this LWRS report is to disseminate the technique and provide examples sufficient to serve as a template for other utilities’ projects for control room modernization.
Novel transcranial magnetic stimulation coil for mice
NASA Astrophysics Data System (ADS)
March, Stephen; Stark, Spencer; Crowther, Lawrence; Hadimani, Ravi; Jiles, David
2014-03-01
Transcranial magnetic stimulation (TMS) shows potential for non-invasive treatment of various neurological disorders. Significant work has been performed on the design of coils used for TMS on human subjects but few reports have been made on the design of coils for use on the brains of animals such as mice. This work is needed as TMS studies utilizing mice can allow rapid preclinical development of TMS for human disorders but the coil designs developed for use on humans are inadequate for optimal stimulation of the much smaller mouse brain. A novel TMS coil has been developed with the goal of inducing strong and focused electric fields for the stimulation of small animals such as mice. Calculations of induced electric fields were performed utilizing an MRI derived inhomogeneous model of an adult male mouse. Mechanical and thermal analysis of this new TMS helmet-coil design have also been performed at anticipated TMS operating conditions to ensure mechanical stability of the new coil and establish expected linear attraction and rotational force values. Calculated temperature increases for typical stimulation periods indicate the helmet-coil system is capable of operating within established medical standards. A prototype of the coil has been fabricated and characterization results are presented.
Low contrast detection in abdominal CT: comparing single-slice and multi-slice tasks
NASA Astrophysics Data System (ADS)
Ba, Alexandre; Racine, Damien; Viry, Anaïs.; Verdun, Francis R.; Schmidt, Sabine; Bochud, François O.
2017-03-01
Image quality assessment is crucial for the optimization of computed tomography (CT) protocols. Human and mathematical model observers are increasingly used for the detection of low contrast signals in abdominal CT, but are frequently limited to the use of a single image slice. Another limitation is that most of them only consider the detection of a signal embedded in a uniform background phantom. The purpose of this paper was to test whether human observer performance is significantly different in CT images read in single or multiple slice modes and whether these differences are the same for anatomical and uniform clinical images. We investigated detection performance and scrolling trends of human observers for a simulated liver lesion embedded in anatomical and uniform CT backgrounds. Results show that observers do not benefit significantly from the additional information provided in multi-slice reading mode. Regarding the background, performance is moderately higher for uniform than for anatomical images. Our results suggest that for low contrast detection in abdominal CT, the use of multi-slice model observers would probably add only a marginal benefit. On the other hand, the quality of a CT image is more accurately estimated with clinical anatomical backgrounds.
Human Guidance Behavior Decomposition and Modeling
NASA Astrophysics Data System (ADS)
Feit, Andrew James
Trained humans are capable of high performance, adaptable, and robust first-person dynamic motion guidance behavior. This behavior is exhibited in a wide variety of activities such as driving, piloting aircraft, skiing, biking, and many others. Human performance in such activities far exceeds the current capability of autonomous systems in terms of adaptability to new tasks, real-time motion planning, robustness, and trading safety for performance. The present work investigates the structure of human dynamic motion guidance that enables these performance qualities. This work uses a first-person experimental framework that presents a driving task to the subject, measuring control inputs, vehicle motion, and operator visual gaze movement. The resulting data is decomposed into subspace segment clusters that form primitive elements of action-perception interactive behavior. Subspace clusters are defined by both agent-environment system dynamic constraints and operator control strategies. A key contribution of this work is to define transitions between subspace cluster segments, or subgoals, as points where the set of active constraints, either system or operator defined, changes. This definition provides necessary conditions to determine transition points for a given task-environment scenario that allow a solution trajectory to be planned from known behavior elements. In addition, human gaze behavior during this task contains predictive behavior elements, indicating that the identified control modes are internally modeled. Based on these ideas, a generative, autonomous guidance framework is introduced that efficiently generates optimal dynamic motion behavior in new tasks. The new subgoal planning algorithm is shown to generate solutions to certain tasks more quickly than existing approaches currently used in robotics.
Learning to detect and combine the features of an object
Suchow, Jordan W.; Pelli, Denis G.
2013-01-01
To recognize an object, it is widely supposed that we first detect and then combine its features. Familiar objects are recognized effortlessly, but unfamiliar objects—like new faces or foreign-language letters—are hard to distinguish and must be learned through practice. Here, we describe a method that separates detection and combination and reveals how each improves as the observer learns. We dissociate the steps by two independent manipulations: For each step, we do or do not provide a bionic crutch that performs it optimally. Thus, the two steps may be performed solely by the human, solely by the crutches, or cooperatively, when the human takes one step and a crutch takes the other. The crutches reveal a double dissociation between detecting and combining. Relative to the two-step ideal, the human observer’s overall efficiency for unconstrained identification equals the product of the efficiencies with which the human performs the steps separately. The two-step strategy is inefficient: Constraining the ideal to take two steps roughly halves its identification efficiency. In contrast, we find that humans constrained to take two steps perform just as well as when unconstrained, which suggests that they normally take two steps. Measuring threshold contrast (the faintness of a barely identifiable letter) as it improves with practice, we find that detection is inefficient and learned slowly. Combining is learned at a rate that is 4× higher and, after 1,000 trials, 7× more efficient. This difference explains much of the diversity of rates reported in perceptual learning studies, including effects of complexity and familiarity. PMID:23267067
NASA Astrophysics Data System (ADS)
Kalayeh, Mahdi M.; Marin, Thibault; Pretorius, P. Hendrik; Wernick, Miles N.; Yang, Yongyi; Brankov, Jovan G.
2011-03-01
In this paper, we present a numerical observer for image quality assessment, aiming to predict human observer accuracy in a cardiac perfusion defect detection task for single-photon emission computed tomography (SPECT). In medical imaging, image quality should be assessed by evaluating the human observer accuracy for a specific diagnostic task. This approach is known as task-based assessment. Such evaluations are important for optimizing and testing imaging devices and algorithms. Unfortunately, human observer studies with expert readers are costly and time-demanding. To address this problem, numerical observers have been developed as a surrogate for human readers to predict human diagnostic performance. The channelized Hotelling observer (CHO) with internal noise model has been found to predict human performance well in some situations, but does not always generalize well to unseen data. We have argued in the past that finding a model to predict human observers could be viewed as a machine learning problem. Following this approach, in this paper we propose a channelized relevance vector machine (CRVM) to predict human diagnostic scores in a detection task. We have previously used channelized support vector machines (CSVM) to predict human scores and have shown that this approach offers better and more robust predictions than the classical CHO method. The comparison of the proposed CRVM with our previously introduced CSVM method suggests that CRVM can achieve similar generalization accuracy, while dramatically reducing model complexity and computation time.
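For context, the channelized-observer family that these methods extend can be sketched in a few lines: images are reduced to a handful of channel responses, and a Hotelling template trained on signal-present and signal-absent samples yields a scalar test statistic per image. The channels, noise, and signal below are synthetic stand-ins, not the SPECT data or channel profiles used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
H = W = 32
n_channels = 6

# Simple concentric, frequency-like channels as stand-ins for the real channel set.
yy, xx = np.mgrid[:H, :W]
r = np.hypot(xx - W / 2, yy - H / 2)
channels = np.stack([np.exp(-((r - 3 * k) ** 2) / 8.0).ravel()
                     for k in range(n_channels)])              # (6, 1024)

signal = 2.0 * np.exp(-r ** 2 / 4.0).ravel()                   # low-contrast blob "defect"
absent = rng.normal(size=(200, H * W))                         # signal-absent images
present = rng.normal(size=(200, H * W)) + signal               # signal-present images

v_a = absent @ channels.T                                      # channel responses (200, 6)
v_p = present @ channels.T

# Hotelling template in channel space: inverse covariance times mean difference.
S = 0.5 * (np.cov(v_a, rowvar=False) + np.cov(v_p, rowvar=False))
w_hot = np.linalg.solve(S, v_p.mean(axis=0) - v_a.mean(axis=0))

t_a, t_p = v_a @ w_hot, v_p @ w_hot                            # observer test statistics
snr = (t_p.mean() - t_a.mean()) / np.sqrt(0.5 * (t_p.var() + t_a.var()))
print(f"channelized observer detectability (SNR): {snr:.2f}")
```

The CSVM and CRVM approaches replace the Hotelling template (and its internal-noise tuning) with learned nonlinear mappings from channel responses to predicted human scores.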
Tonello, Lucio; Gashi, Bekim; Scuotto, Alessandro; Cappello, Glenda; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A
2018-01-01
Living organisms tend to find viable strategies under ambient conditions that optimize their search for, and utilization of, life-sustaining resources. For plants, a leading role in this process is performed by auxin, a plant hormone that drives morphological development, dynamics, and movement to optimize the absorption of light (through branches and leaves) and chemical "food" (through roots). Similarly to auxin in plants, serotonin seems to play an important role in higher animals, especially humans. Here, it is proposed that morphological and functional similarities between (i) plant leaves and the animal/human brain and (ii) plant roots and the animal/human gastro-intestinal tract have general features in common. Plants interact with light and use it for biological energy, whereas, neurons in the central nervous system seem to interact with bio-photons and use them for proper brain function. Further, as auxin drives roots "arborescence" within the soil, similarly serotonin seems to facilitate enteric nervous system connectivity within the human gastro-intestinal tract. This auxin/serotonin parallel suggests the root-branches axis in plants may be an evolutionary precursor to the gastro-intestinal-brain axis in humans. Finally, we hypothesize that light might be an important factor, both in gastro-intestinal dynamics and brain function. Such a comparison may indicate a key role for the interaction of light and serotonin in neuronal physiology (possibly in both the central nervous system and the enteric nervous system), and according to recent work, mind and consciousness.
Dynamic modeling and optimization for space logistics using time-expanded networks
NASA Astrophysics Data System (ADS)
Ho, Koki; de Weck, Olivier L.; Hoffman, Jeffrey A.; Shishko, Robert
2014-12-01
This research develops a dynamic logistics network formulation for lifecycle optimization of mission sequences as a system-level integrated method to find an optimal combination of technologies to be used at each stage of the campaign. This formulation can find the optimal transportation architecture considering its technology trades over time. The proposed methodologies are inspired by ground logistics analysis techniques based on linear programming network optimization. In particular, the time-expanded network and its extension are developed for dynamic space logistics network optimization, trading the quality of the solution against the computational load. In this paper, the methodologies are applied to a human Mars exploration architecture design problem. The results reveal multiple dynamic system-level trades over time and give recommendations for the optimal strategy for the human Mars exploration architecture. The considered trades include those between In-Situ Resource Utilization (ISRU) and propulsion technologies as well as the orbit and depot location selections over time. This research serves as a precursor for eventual permanent settlement and colonization of other planets by humans, and for humanity becoming a multi-planet species.
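To make the time-expanded idea concrete, the toy sketch below builds a four-node (location, time) network and solves the resulting min-cost flow as a linear program. The nodes, arcs, and costs are invented for illustration; the paper's formulation handles propellant, ISRU, multiple commodities, and many more time steps.

```python
from scipy.optimize import linprog

# Nodes are (location, time) pairs; arcs are either "wait" or "transport" decisions.
nodes = ["Earth_t0", "Earth_t1", "Moon_t1", "Moon_t2"]
arcs = [("Earth_t0", "Earth_t1", 0.0),   # wait on Earth
        ("Earth_t0", "Moon_t1", 10.0),   # launch early (expensive window)
        ("Earth_t1", "Moon_t2", 7.0),    # launch later (cheaper window)
        ("Moon_t1", "Moon_t2", 1.0)]     # hold cargo on the Moon

supply = {"Earth_t0": 1.0, "Moon_t2": -1.0}   # 1 unit of cargo must reach the Moon by t2

cost = [c for _, _, c in arcs]
# Flow conservation: (outflow - inflow) at each node equals its net supply.
A_eq = [[(1.0 if a == n else 0.0) - (1.0 if b == n else 0.0) for a, b, _ in arcs]
        for n in nodes]
b_eq = [supply.get(n, 0.0) for n in nodes]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * len(arcs))
print(res.x, res.fun)   # expected: wait, then ship Earth_t1 -> Moon_t2 at total cost 7
```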
Acquisition and production of skilled behavior in dynamic decision-making tasks
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1993-01-01
Summaries of the four projects completed during the performance of this research are included. The four projects described are: Perceptual Augmentation Aiding for Situation Assessment, Perceptual Augmentation Aiding for Dynamic Decision-Making and Control, Action Advisory Aiding for Dynamic Decision-Making and Control, and Display Design to Support Time-Constrained Route Optimization. Papers based on each of these projects are currently in preparation. The theoretical framework upon which the first three projects are based, Ecological Task Analysis, was also developed during the performance of this research, and is described in a previous report. A project concerned with modeling strategies in human control of a dynamic system was also completed during the performance of this research.
Synergia: an accelerator modeling tool with 3-D space charge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amundson, James F.; Spentzouris, P.; /Fermilab
2004-07-01
High precision modeling of space-charge effects, together with accurate treatment of single-particle dynamics, is essential for designing future accelerators as well as optimizing the performance of existing machines. We describe Synergia, a high-fidelity parallel beam dynamics simulation package with fully three dimensional space-charge capabilities and a higher order optics implementation. We describe the computational techniques, the advanced human interface, and the parallel performance obtained using large numbers of macroparticles. We also perform code benchmarks comparing to semi-analytic results and other codes. Finally, we present initial results on particle tune spread, beam halo creation, and emittance growth in the Fermilab booster accelerator.
The Use of Human Factors Simulation to Conserve Operations Expense
NASA Technical Reports Server (NTRS)
Hamilton, George S.; Dischinger, H. Charles, Jr.; Wu, Hsin-I.
1999-01-01
In preparation for on-orbit operations, NASA performs experiments aboard a KC-135 which performs parabolic maneuvers, resulting in short periods of microgravity. While considerably less expensive than space operations, the use of this aircraft is costly. Simulation of tasks to be performed during the flight can allow the participants to optimize hardware configuration and crew interaction prior to flight. This presentation will demonstrate the utility of such simulation. The experiment simulated is the fluid dynamics of epoxy components which may be used in a patch kit in the event of meteoroid damage to the International Space Station. Improved configuration and operational efficiencies were reflected in early and increased data collection.
NASA Exploration Team (NExT) In-Space Transportation Overview
NASA Technical Reports Server (NTRS)
Drake, Bret G.; Cooke, Douglas R.; Kos, Larry D.; Brady, Hugh J. (Technical Monitor)
2002-01-01
This presentation provides an overview of NASA Exploration Team's (NEXT) vision of in-space transportation in the future. Hurdles facing in-space transportation include affordable power sources, crew health and safety, optimized robotic and human operations and space systems performance. Topics covered include: exploration of Earth's neighborhood, Earth's neighborhood architecture and elements, Mars mission trajectory options, delta-v variations, Mars mission duration options, Mars mission architecture, nuclear electric propulsion advantages and miscellaneous technology needs.
2010-04-30
combating market dynamism (Aldrich, 1979; Child, 1972), which is a result of evolving technology, shifting prices, or variance in product availability... principles: (1) human beings are bounded rationally, and (2), as a result of being rationally bound, will always choose to further their own self... principles to govern the relationship among the buyers and suppliers. Our conceptual model aligns the alternative governance structures derived
Communication System Architecture for Planetary Exploration
NASA Technical Reports Server (NTRS)
Braham, Stephen P.; Alena, Richard; Gilbaugh, Bruce; Glass, Brian; Norvig, Peter (Technical Monitor)
2001-01-01
Future human missions to Mars will require effective communications supporting exploration activities and scientific field data collection. Constraints on cost, size, weight and power consumption for all communications equipment make optimization of these systems very important. These information and communication systems connect people and systems together into coherent teams performing the difficult and hazardous tasks inherent in planetary exploration. The communication network supporting vehicle telemetry data, mission operations, and scientific collaboration must have excellent reliability and flexibility.
Zhang, Jian-Hua; Xia, Jia-Jun; Garibaldi, Jonathan M; Groumpos, Petros P; Wang, Ru-Bin
2017-06-01
In human-machine (HM) hybrid control systems, the human operator and the machine cooperate to achieve the control objectives. To enhance the overall HM system performance, the discrete manual control task-load of the operator must be dynamically allocated in accordance with the continuous-time fluctuation of the psychophysiological functional status of the operator, the so-called operator functional state (OFS). The behavior of the HM system is hybrid in nature due to the co-existence of a discrete task-load (control) variable and a continuous operator performance (system output) variable. The Petri net is an effective tool for modeling discrete event systems, but for a hybrid system the Petri net model generally has to be extended. Instead of using different tools to represent the continuous and discrete components of a hybrid system, this paper proposed a method of fuzzy inference Petri nets (FIPN) to represent the HM hybrid system, comprising a Mamdani-type fuzzy model of OFS and a logical switching controller in a unified framework, in which the task-load level is dynamically reallocated between the operator and the machine based on the model-predicted OFS. Furthermore, this paper used a multi-model approach to predict the operator performance based on three electroencephalographic (EEG) input variables (features) via the Wang-Mendel (WM) fuzzy modeling method. The membership function parameters of the fuzzy OFS model for each experimental participant were optimized using the artificial bee colony (ABC) evolutionary algorithm. Three performance indices, RMSE, MRE, and EPR, were computed to evaluate the overall modeling accuracy. Experimental data from six participants were analyzed. The results show that the proposed method (FIPN with adaptive task allocation) yields a lower breakdown rate (from 14.8% to 3.27%) and higher human performance (from 90.30% to 91.99%). The simulation results of the FIPN-based adaptive HM (AHM) system on six experimental participants demonstrate that the FIPN framework provides an effective way to model and regulate/optimize the OFS in HM hybrid systems composed of a continuous-time OFS model and a discrete-event switching controller. Copyright © 2017 Elsevier B.V. All rights reserved.
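A drastically reduced sketch of the switching idea follows: a toy Mamdani-style rule base maps three EEG-derived features to an OFS score, and a discrete rule reallocates task load when the predicted OFS degrades. The membership functions, rule consequents, thresholds, and feature values are all invented; the paper's FIPN formalism, WM model identification, and ABC tuning are not reproduced here.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return float(np.clip(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0, 1.0))

def predict_ofs(f1, f2, f3):
    """Toy two-rule Mamdani-style inference from EEG features to an OFS score in [0, 1]."""
    low_load = min(tri(f1, 0.0, 0.2, 0.5), tri(f2, 0.0, 0.3, 0.6))   # rule 1 -> good OFS (0.9)
    high_load = min(tri(f1, 0.4, 0.8, 1.0), tri(f3, 0.4, 0.7, 1.0))  # rule 2 -> poor OFS (0.3)
    weights = np.array([low_load, high_load]) + 1e-9
    return float(weights @ np.array([0.9, 0.3]) / weights.sum())     # weighted-average defuzzification

def allocate_task_load(ofs, current_level):
    """Discrete switching controller: shed load when the predicted OFS degrades."""
    if ofs < 0.5:
        return max(current_level - 1, 1)   # hand tasks over to the machine
    if ofs > 0.8:
        return min(current_level + 1, 5)   # give the operator more tasks
    return current_level

print(allocate_task_load(predict_ofs(0.7, 0.2, 0.8), current_level=3))
```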
2014-01-01
Background: Mechanical loads induced through muscle contraction, vibration, or compressive forces are thought to modulate tissue plasticity. With the emergence of regenerative medicine, there is a need to understand the optimal mechanical environment (vibration, load, or muscle force) that promotes cellular health. To our knowledge no mechanical system has been proposed to deliver these isolated mechanical stimuli in human tissue. We present the design, performance, and utilization of a new technology that may be used to study localized mechanical stimuli on human tissues. A servo-controlled vibration and limb loading system were developed and integrated into a single instrument to deliver vibration, compression, or muscle contractile loads to a single limb (tibia) in humans. The accuracy, repeatability, transmissibility, and safety of the mechanical delivery system were evaluated on eight individuals with spinal cord injury (SCI). Findings: The limb loading system was linear, repeatable, and accurate to less than 5, 1, and 1 percent of full scale, respectively, and transmissibility was excellent. The between-session tests on individuals with SCI showed high intra-class correlations (>0.9). Conclusions: All tests supported that therapeutic loads can be delivered to a lower limb (tibia) in a safe, accurate, and measurable manner. Future collaborations between engineers and cellular physiologists will be important as research programs strive to determine the optimal mechanical environment for developing cells and tissues in humans. PMID:24894666
Multi-kilobase homozygous targeted gene replacement in human induced pluripotent stem cells.
Byrne, Susan M; Ortiz, Luis; Mali, Prashant; Aach, John; Church, George M
2015-02-18
Sequence-specific nucleases such as TALEN and the CRISPR/Cas9 system have so far been used to disrupt, correct or insert transgenes at precise locations in mammalian genomes. We demonstrate efficient 'knock-in' targeted replacement of multi-kilobase genes in human induced pluripotent stem cells (iPSC). Using a model system replacing endogenous human genes with their mouse counterpart, we performed a comprehensive study of targeting vector design parameters for homologous recombination. A 2.7 kilobase (kb) homozygous gene replacement was achieved in up to 11% of iPSC without selection. The optimal homology arm length was around 2 kb, with homology length being especially critical on the arm not adjacent to the cut site. Homologous sequence inside the cut sites was detrimental to targeting efficiency, consistent with a synthesis-dependent strand annealing (SDSA) mechanism. Using two nuclease sites, we observed a high degree of gene excisions and inversions, which sometimes occurred more frequently than indel mutations. While homozygous deletions of 86 kb were achieved with up to 8% frequency, deletion frequencies were not solely a function of nuclease activity and deletion size. Our results analyzing the optimal parameters for targeting vector design will inform future gene targeting efforts involving multi-kilobase gene segments, particularly in human iPSC. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
Optimal Line Length in Reading--A Literature Review
ERIC Educational Resources Information Center
Nanavati, Anuj A.; Bias, Randolph G.
2005-01-01
One of the most important, and most studied, aspects of human perception is the act of reading. Reading has received much attention from researchers, both from a human information processing (HIP) approach and as a common, practical act that needs to be optimized, especially in the realm of human-computer interaction (HCI). One of the text …
Myths and realities of electronics maintenance.
Harris, Douglas H
2008-06-01
The author presents and discusses discoveries and developments contributing to enhanced electronics maintenance performance. This body of research is viewed from the vantage point of Nick Bond's 1970 Ely Award-winning article in Human Factors, "Some Persistent Myths About Military Electronics Maintenance." Bond identified a set of myths and summarized research that not only produced information and techniques leading to demonstrably improved maintenance performance but also exploded many unfounded beliefs that were commonly held before the research had been conducted and the findings disseminated. The period from 1964 through 1986, as reflected by publications in the journal, was a productive period of research that led to greater understanding of human factors in electronics maintenance and to numerous advances that contributed, ultimately, to more effective maintenance performance. Technological advances, combined with what we learned about maintenance performance, have substantially reduced the maintenance burden and enhanced the maintenance of electronic systems. Some of the principal lessons learned from this research on electronics maintenance apply to understanding the effects of equipment complexity, providing an optimal role for automation, designing more appropriate on-the-job training, and enhancing troubleshooting skills.
Mieog, J Sven D; Troyan, Susan L; Hutteman, Merlijn; Donohoe, Kevin J; van der Vorst, Joost R; Stockdale, Alan; Liefers, Gerrit-Jan; Choi, Hak Soo; Gibbs-Strauss, Summer L; Putter, Hein; Gioux, Sylvain; Kuppen, Peter J K; Ashitate, Yoshitomo; Löwik, Clemens W G M; Smit, Vincent T H B M; Oketokoun, Rafiou; Ngo, Long H; van de Velde, Cornelis J H; Frangioni, John V; Vahrmeijer, Alexander L
2011-09-01
Near-infrared (NIR) fluorescent sentinel lymph node (SLN) mapping in breast cancer requires optimized imaging systems and lymphatic tracers. A small, portable version of the FLARE imaging system, termed Mini-FLARE, was developed for capturing color video and two semi-independent channels of NIR fluorescence (700 and 800 nm) in real time. Initial optimization of lymphatic tracer dose was performed using 35-kg Yorkshire pigs and a 6-patient pilot clinical trial. More refined optimization was performed in 24 consecutive breast cancer patients. All patients received the standard of care using (99m)Technetium-nanocolloid and patent blue. In addition, 1.6 ml of indocyanine green adsorbed to human serum albumin (ICG:HSA) was injected directly after patent blue at the same location. Patients were allocated to 1 of 8 escalating ICG:HSA concentration groups from 50 to 1000 μM. The Mini-FLARE system was positioned easily in the operating room and could be used up to 13 in. from the patient. Mini-FLARE enabled visualization of lymphatic channels and SLNs in all patients. A total of 35 SLNs (mean = 1.45, range 1-3) were detected: 35 radioactive (100%), 30 blue (86%), and 35 NIR fluorescent (100%). Contrast agent quenching at the injection site and dilution within lymphatic channels were major contributors to signal strength of the SLN. Optimal injection dose of ICG:HSA ranged between 400 and 800 μM. No adverse reactions were observed. We describe the clinical translation of a new NIR fluorescence imaging system and define the optimal ICG:HSA dose range for SLN mapping in breast cancer.
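The observation that signal strength is shaped by quenching at the injection site and dilution within the lymphatic channels suggests why an intermediate dose is optimal. The toy self-quenching model below is purely illustrative and is not taken from the study; the functional form and the 600 μM quench constant are assumptions chosen only to show how a signal peak can arise between the 50 and 1000 μM doses tested.

    import math

    def toy_nir_signal(concentration_uM, quench_constant_uM=600.0):
        # Emitted signal rises with dye concentration but is attenuated by
        # concentration quenching, producing a maximum at an intermediate dose.
        return concentration_uM * math.exp(-concentration_uM / quench_constant_uM)

    doses_uM = [50, 100, 200, 400, 600, 800, 1000]
    best_dose = max(doses_uM, key=toy_nir_signal)
    print(best_dose)  # peaks near the quench constant under this toy model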
Luo, Danmei; Rong, Qiguo; Chen, Quan
2017-09-01
Reconstruction of segmental defects in the mandible remains a challenge for maxillofacial surgery. The use of porous scaffolds is a potential method for repairing these defects, and additive manufacturing techniques now provide a means of fabricating porous scaffolds with specific geometrical shapes and complex structures. The goal of this study was to design and optimize a three-dimensional tetrahedral titanium scaffold for the reconstruction of mandibular defects. With a fixed strut diameter of 0.45 mm and a mean cell size of 2.2 mm, a tetrahedral structural porous scaffold was designed for a simulated anatomical defect derived from computed tomography (CT) data of a human mandible. An optimization method based on the concept of uniform stress was applied to the initial scaffold to realize a minimal-weight design. Geometric and mechanical comparisons between the initial and optimized scaffolds show that the optimized scaffold exhibits a larger porosity, 81.90%, as well as a more homogeneous stress distribution. These results demonstrate that tetrahedral structural titanium scaffolds are feasible structures for repairing mandibular defects, and that the proposed optimization scheme can produce superior scaffolds for mandibular reconstruction with better stability, higher porosity, and less weight. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
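A uniform-stress, minimal-weight objective of this kind is often pursued with a fully-stressed-design iteration, in which each strut's cross-section is rescaled toward a target stress level. The sketch below illustrates that general idea together with a rough lattice porosity estimate; the strut count per cell, the length factor, the diameter bounds, and the square-root sizing law are illustrative assumptions and do not describe the authors' scheme.

    import math

    def lattice_porosity(strut_diameter_mm, cell_size_mm, struts_per_cell=6, length_factor=1.0):
        # Rough porosity estimate: 1 - (total strut volume / unit-cell volume),
        # ignoring strut overlap at the nodes of the tetrahedral cell.
        strut_len = length_factor * cell_size_mm
        strut_vol = struts_per_cell * math.pi * (strut_diameter_mm / 2.0) ** 2 * strut_len
        return 1.0 - strut_vol / cell_size_mm ** 3

    def fully_stressed_update(diameters_mm, strut_stresses, target_stress, d_min=0.2, d_max=1.0):
        # One fully-stressed-design step: scale each strut so its axial stress
        # moves toward the target ("uniform stress") level, within size bounds.
        updated = []
        for d, sigma in zip(diameters_mm, strut_stresses):
            d_new = d * math.sqrt(max(sigma, 1e-9) / target_stress)
            updated.append(min(max(d_new, d_min), d_max))
        return updated

    print(lattice_porosity(0.45, 2.2))  # ballpark porosity for the stated strut and cell sizes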
NASA Technical Reports Server (NTRS)
Garin, John; Matteo, Joseph; Jennings, Von Ayre
1988-01-01
The capability for a single operator to simultaneously control complex remote multi-degree-of-freedom robotic arms and associated dextrous end effectors is being developed. An optimal solution within the realm of current technology can be achieved by recognizing that: (1) machines/computer systems are more effective than humans when the task is routine and specified, and (2) humans process complex data sets and deal with the unpredictable better than machines. These observations lead naturally to a philosophy in which the human's role becomes a higher-level function associated with planning, teaching, initiating, monitoring, and intervening when the machine gets into trouble, while the machine performs the codifiable tasks with deliberate efficiency. This concept forms the basis for the integration of man and telerobotics, i.e., robotics with the operator in the control loop. The concept of integrating the human in the loop and maximizing the feed-forward and feed-back data flow is referred to as telepresence.
Theta-burst microstimulation in the human entorhinal area improves memory specificity.
Titiz, Ali S; Hill, Michael R H; Mankin, Emily A; M Aghajan, Zahra; Eliashiv, Dawn; Tchemodanov, Natalia; Maoz, Uri; Stern, John; Tran, Michelle E; Schuette, Peter; Behnke, Eric; Suthana, Nanthia A; Fried, Itzhak
2017-10-24
The hippocampus is critical for episodic memory, and synaptic changes induced by long-term potentiation (LTP) are thought to underlie memory formation. In rodents, hippocampal LTP may be induced through electrical stimulation of the perforant path. To test whether similar techniques could improve episodic memory in humans, we implemented a microstimulation technique that allowed delivery of low-current electrical stimulation via 100-μm-diameter microelectrodes. As thirteen neurosurgical patients performed a person recognition task, microstimulation was applied in a theta-burst pattern, shown to optimally induce LTP. Microstimulation in the right entorhinal area during learning significantly improved subsequent memory specificity for novel portraits; participants were able both to recognize previously viewed photos and to reject similar lures. These results suggest that microstimulation with physiologic-level currents, a radical departure from commonly used deep brain stimulation protocols, is sufficient to modulate human behavior and provides an avenue for refined interrogation of the circuits involved in human memory.
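Theta-burst stimulation is, at bottom, a timing protocol: brief high-frequency bursts of pulses repeated at a theta-band rate. The sketch below generates pulse times for such a pattern; the specific parameter values (a 5 Hz burst rate and 4 pulses per burst at 100 Hz) are typical of theta-burst protocols in the literature and are assumptions, not the settings used in this study.

    def theta_burst_pulse_times(duration_s=1.0, theta_hz=5.0, pulses_per_burst=4, intra_burst_hz=100.0):
        # Pulse onset times (s) for a theta-burst pattern: bursts repeat at
        # theta_hz, and pulses within a burst are spaced at 1/intra_burst_hz.
        burst_period = 1.0 / theta_hz
        intra_interval = 1.0 / intra_burst_hz
        times = []
        for b in range(int(duration_s * theta_hz)):
            for p in range(pulses_per_burst):
                times.append(b * burst_period + p * intra_interval)
        return times

    print(theta_burst_pulse_times()[:8])  # first two bursts of the 1-second train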
2007-04-12
Scholars in psychology and related disciplines incorporated optimistic or pessimistic views of human nature into their theories. For example, Sigmund ... Freud (1856-1939) included both optimism and pessimism as concepts in his theory of human nature and development. He asserted that humans have a drive...drive towards death represents the pessimistic aspect of human nature (Freud, 1964). Psychologist William James (1842-1910) was the first to consider