Sample records for computer tasks quantified

  1. The Composite Strain Index (COSI) and Cumulative Strain Index (CUSI): methodologies for quantifying biomechanical stressors for complex tasks and job rotation using the Revised Strain Index.

    PubMed

    Garg, Arun; Moore, J Steven; Kapellusch, Jay M

    2017-08-01

    The Composite Strain Index (COSI) quantifies biomechanical stressors for complex tasks consisting of exertions at different force levels and/or with different exertion times. The Cumulative Strain Index (CUSI) further integrates biomechanical stressors from different tasks to quantify exposure for the entire work shift. The paper provides methodologies to compute COSI and CUSI along with examples. Complex task simulation produced 169,214 distinct tasks. Use of average force, time-weighted average (TWA) force, peak force and COSI classified 66.9%, 28.2%, 100% and 38.9% of tasks as hazardous, respectively. For job rotation the simulation produced 10,920 distinct jobs. TWA COSI, peak task COSI and CUSI classified 36.5%, 78.1% and 66.6% of jobs as hazardous, respectively. The results suggest that the TWA approach systematically underestimates biomechanical stressors and the peak approach overestimates them, both at the task and job level. The COSI and CUSI are believed to partially address these underestimations and overestimations of biomechanical stressors. Practitioner Summary: COSI quantifies exposure when the applied hand force and/or the duration of that force changes during a task cycle. CUSI integrates physical exposures from job rotation. These should be valuable tools for designing and analysing tasks and job rotation to determine risk of musculoskeletal injuries.
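
    The gap between the TWA and peak classification rates follows directly from how the two statistics summarise a force profile; a minimal sketch below (the force and duration values are hypothetical, not taken from the paper):

    ```python
    # Hypothetical task cycle: three exertions at different force levels
    # (% of maximum strength) held for different durations (seconds).
    forces = [15.0, 45.0, 80.0]     # applied hand force per exertion
    durations = [6.0, 3.0, 1.0]     # exertion time per exertion

    twa_force = sum(f * d for f, d in zip(forces, durations)) / sum(durations)
    peak_force = max(forces)

    print(f"TWA force:  {twa_force:.1f}%")   # 30.5% -- dilutes the brief 80% exertion
    print(f"Peak force: {peak_force:.1f}%")  # 80.0% -- treats the whole cycle as worst-case
    ```

    COSI instead accumulates the incremental strain of each force/duration combination, which is consistent with its classification rate falling between the TWA and peak extremes reported above.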

  2. Methodology for Uncertainty Analysis of Dynamic Computational Toxicology Models

    EPA Science Inventory

    The task of quantifying the uncertainty in both parameter estimates and model predictions has become more important with the increased use of dynamic computational toxicology models by the EPA. Dynamic toxicological models include physiologically-based pharmacokinetic (PBPK) mode...

  3. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    ERIC Educational Resources Information Center

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  4. Computer task performance by subjects with Duchenne muscular dystrophy.

    PubMed

    Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira

    2016-01-01

    Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group across the study. DMD participants improved their performance after practicing a computational task; however, a difference in MT between the DMD and control groups was present in all attempts. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not their age or overall functionality.

  5. Learning the ideal observer for SKE detection tasks by use of convolutional neural networks (Cum Laude Poster Award)

    NASA Astrophysics Data System (ADS)

    Zhou, Weimin; Anastasio, Mark A.

    2018-03-01

    It has been advocated that task-based measures of image quality (IQ) should be employed to evaluate and optimize imaging systems. Task-based measures of IQ quantify the performance of an observer on a medically relevant task. The Bayesian Ideal Observer (IO), which employs complete statistical information of the object and noise, achieves the upper limit of the performance for a binary signal classification task. However, computing the IO performance is generally analytically intractable and can be computationally burdensome when Markov-chain Monte Carlo (MCMC) techniques are employed. In this paper, supervised learning with convolutional neural networks (CNNs) is employed to approximate the IO test statistics for a signal-known-exactly and background-known-exactly (SKE/BKE) binary detection task. The receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC) are compared to those produced by the analytically computed IO. The advantages of the proposed supervised learning approach for approximating the IO are demonstrated.
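
    A minimal sketch of the supervised-learning idea, not the authors' implementation: train a small CNN on labelled signal-present/absent images to output a scalar test statistic, then score it with ROC AUC. The 16×16 image size, Gaussian noise model, and architecture are illustrative assumptions; for this Gaussian SKE/BKE case the IO reduces to a matched filter, which makes a convenient sanity check.

    ```python
    import numpy as np
    import torch
    import torch.nn as nn
    from sklearn.metrics import roc_auc_score

    # Toy SKE/BKE data: a known signal in a known (flat) background with Gaussian noise.
    rng = np.random.default_rng(0)
    n, size = 2000, 16
    yy, xx = np.mgrid[:size, :size]
    signal = 0.8 * np.exp(-((xx - 8) ** 2 + (yy - 8) ** 2) / 8.0)   # known signal profile
    labels = rng.integers(0, 2, n)
    images = rng.normal(0.0, 1.0, (n, 1, size, size)) + labels[:, None, None, None] * signal

    x = torch.tensor(images, dtype=torch.float32)
    y = torch.tensor(labels, dtype=torch.float32)

    # A small CNN that maps each image to a scalar test statistic.
    net = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
        nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
        nn.Flatten(), nn.Linear(8 * size * size, 1),
    )
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()   # sigmoid of the statistic estimates the posterior

    for _ in range(200):               # full-batch training, for brevity
        opt.zero_grad()
        loss = loss_fn(net(x).squeeze(1), y)
        loss.backward()
        opt.step()

    with torch.no_grad():
        t = net(x).squeeze(1).numpy()  # learned test statistic per image
    print("AUC:", roc_auc_score(labels, t))  # compare against the analytically computed IO AUC
    ```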

  6. The Semantic Distance Task: Quantifying Semantic Distance with Semantic Network Path Length

    ERIC Educational Resources Information Center

    Kenett, Yoed N.; Levi, Effi; Anaki, David; Faust, Miriam

    2017-01-01

    Semantic distance is a determining factor in cognitive processes, such as semantic priming, operating upon semantic memory. The main computational approach to compute semantic distance is through latent semantic analysis (LSA). However, objections have been raised against this approach, mainly in its failure at predicting semantic priming. We…

  7. Quantifying the signals contained in heterogeneous neural responses and determining their relationships with task performance

    PubMed Central

    Pagan, Marino

    2014-01-01

    The responses of high-level neurons tend to be mixtures of many different types of signals. While this diversity is thought to allow for flexible neural processing, it presents a challenge for understanding how neural responses relate to task performance and to neural computation. To address these challenges, we have developed a new method to parse the responses of individual neurons into weighted sums of intuitive signal components. Our method computes the weights by projecting a neuron's responses onto a predefined orthonormal basis. Once determined, these weights can be combined into measures of signal modulation; however, in their raw form these signal modulation measures are biased by noise. Here we introduce and evaluate two methods for correcting this bias, and we report that an analytically derived approach produces performance that is robust and superior to a bootstrap procedure. Using neural data recorded from inferotemporal cortex and perirhinal cortex as monkeys performed a delayed-match-to-sample target search task, we demonstrate how the method can be used to quantify the amounts of task-relevant signals in heterogeneous neural populations. We also demonstrate how these intuitive quantifications of signal modulation can be related to single-neuron measures of task performance (d′). PMID:24920017
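
    The projection step described above is just an inner product of the response vector with each basis vector. A toy numpy sketch (the basis and responses here are random stand-ins, not the paper's predefined components):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_conditions = 8                 # e.g., task conditions spanning the response space

    # Predefined orthonormal basis: columns play the role of intuitive signal
    # components; here just an orthonormalized random matrix for illustration.
    basis, _ = np.linalg.qr(rng.normal(size=(n_conditions, n_conditions)))

    responses = rng.normal(size=n_conditions)      # one neuron's mean response per condition

    weights = basis.T @ responses                  # projection onto the basis
    reconstruction = basis @ weights               # weighted sum of components
    assert np.allclose(reconstruction, responses)  # orthonormal basis => exact recovery
    ```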

  8. The Effects of a Goal Setting Intervention on Productivity and Persistence in an Analogue Work Task

    ERIC Educational Resources Information Center

    Tammemagi, Triona; O'Hora, Denis; Maglieri, Kristen A.

    2013-01-01

    The authors of this study sought to quantify the beneficial effect of goal setting on work performance, and to characterize the persistence or deterioration of goal-directed behavior over time. Twenty-six participants completed a computer-based data entry task. Performance was measured during an initial baseline, a goal setting intervention that…

  9. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    PubMed

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  10. Toward a better understanding of task demands, workload, and performance during physician-computer interactions.

    PubMed

    Mazur, Lukasz M; Mosaly, Prithima R; Moore, Carlton; Comitz, Elizabeth; Yu, Fei; Falchook, Aaron D; Eblan, Michael J; Hoyle, Lesley M; Tracton, Gregg; Chera, Bhishamjit S; Marks, Lawrence B

    2016-11-01

    To assess the relationship between (1) task demands and workload, (2) task demands and performance, and (3) workload and performance, all during physician-computer interactions in a simulated environment. Two experiments were performed in 2 different electronic medical record (EMR) environments: WebCIS (n = 12) and Epic (n = 17). Each participant was instructed to complete a set of prespecified tasks on 3 routine clinical EMR-based scenarios: urinary tract infection (UTI), pneumonia (PN), and heart failure (HF). Task demands were quantified using behavioral responses (click and time analysis). At the end of each scenario, subjective workload was measured using the NASA Task Load Index (NASA-TLX). Physiological workload was measured using pupillary dilation and electroencephalography (EEG) data collected throughout the scenarios. Performance was quantified based on the maximum severity of omission errors. Data analysis indicated that the PN and HF scenarios were significantly more demanding than the UTI scenario for participants using WebCIS (P < .01), and that the PN scenario was significantly more demanding than the UTI and HF scenarios for participants using Epic (P < .01). In both experiments, the regression analysis indicated a significant relationship only between task demands and performance (P < .01). Results suggest that task demands as experienced by participants are related to participants' performance. Future work may support the notion that task demands could be used as a quality metric that is likely representative of performance, and perhaps patient outcomes. The present study is a reasonable next step in a systematic assessment of how task demands and workload are related to performance in EMR-evolving environments.

  11. Three-dimensional Aerodynamic Instability in Multi-stage Axial Compressors

    NASA Technical Reports Server (NTRS)

    Suder, Kenneth (Technical Monitor); Tan, Choon-Sooi

    2003-01-01

    Four separate tasks are reported. The first task: A Computational Model for Short Wavelength Stall Inception and Development in Multi-Stage Compressors; the second task: Three-dimensional Rotating Stall Inception and Effects of Rotating Tip Clearance Asymmetry in Axial Compressors; the third task: Development of an Effective Computational Methodology for Body Force Representation of High-speed Rotor 37; and the fourth task: Development of Circumferential Inlet Distortion through a Representative Eleven Stage High-speed Axial Compressor. The common theme threaded throughout these four tasks is a conceptual framework that consists of quantifying flow processes at the fan/compressor blade passage level to define the compressor performance characteristics needed for addressing physical phenomena, such as compressor aerodynamic instability and compressor response to flow distortion, with length scales larger than compressor blade-to-blade spacing at the system level. The results from these two levels can be synthesized to: (1) simulate compressor aerodynamic instability inception local to a blade rotor tip and its development from a local flow event into the nonlinear limit cycle instability that involves the entire compressor, as was demonstrated in the first task; (2) determine the conditions under which compressor stability assessment based on a two-dimensional model may not be adequate, and the effects of self-induced flow distortion on the compressor stability limit, as in the second task; (3) quantify multistage compressor response to inlet distortion in stagnation pressure, as illustrated in the fourth task; and (4) elucidate its potential applicability for compressor map generation under uniform as well as non-uniform inlet flow, given a three-dimensional Navier-Stokes solution for each individual blade row, as was demonstrated in the third task.

  12. Blending Velocities In Task Space In Computing Robot Motions

    NASA Technical Reports Server (NTRS)

    Volpe, Richard A.

    1995-01-01

    Blending of linear and angular velocities between sequential specified points in task space constitutes the theoretical basis of an improved method of computing trajectories followed by robotic manipulators. In this method, a generalized velocity-vector-blending technique provides a relatively simple, common conceptual framework for blending linear, angular, and other parametric velocities. Velocity vectors originate from straight-line segments connecting specified task-space points, called "via frames," which represent specified robot poses. Linear-velocity-blending functions are chosen from among first-order, third-order-polynomial, and cycloidal options. Angular velocities are blended by use of a first-order approximation of a previous orientation-matrix-blending formulation. The angular-velocity approximation yields a small residual error, which is quantified and corrected. The method offers both the relative simplicity and the speed needed for generation of robot-manipulator trajectories in real time.
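
    A generic sketch of what blending a task-space velocity between two segments can look like. The record does not give the paper's exact blend functions; the cubic and cycloidal forms below are standard motion-profile choices, used here as stand-ins:

    ```python
    import numpy as np

    def blend_profile(kind, tau):
        """Blending weight s(tau) in [0, 1] over normalized blend time tau in [0, 1]."""
        if kind == "first-order":
            return tau                                   # linear ramp
        if kind == "third-order":
            return 3 * tau**2 - 2 * tau**3               # cubic with zero end slopes
        if kind == "cycloidal":
            return tau - np.sin(2 * np.pi * tau) / (2 * np.pi)
        raise ValueError(kind)

    # Velocities of the incoming and outgoing straight-line segments (m/s), invented.
    v_in, v_out = np.array([0.2, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])

    tau = np.linspace(0.0, 1.0, 5)
    for kind in ("first-order", "third-order", "cycloidal"):
        s = blend_profile(kind, tau)[:, None]
        v = (1 - s) * v_in + s * v_out   # blended task-space velocity across the window
        print(kind, v.round(3))
    ```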

  13. Brain-computer interface control along instructed paths

    NASA Astrophysics Data System (ADS)

    Sadtler, P. T.; Ryu, S. I.; Tyler-Kabara, E. C.; Yu, B. M.; Batista, A. P.

    2015-02-01

    Objective. Brain-computer interfaces (BCIs) are being developed to assist paralyzed people and amputees by translating neural activity into movements of a computer cursor or prosthetic limb. Here we introduce a novel BCI task paradigm, intended to help accelerate improvements to BCI systems. Through this task, we can push the performance limits of BCI systems, we can quantify more accurately how well a BCI system captures the user’s intent, and we can increase the richness of the BCI movement repertoire. Approach. We have implemented an instructed path task, wherein the user must drive a cursor along a visible path. The instructed path task provides a versatile framework to increase the difficulty of the task and thereby push the limits of performance. Relative to traditional point-to-point tasks, the instructed path task allows more thorough analysis of decoding performance and greater richness of movement kinematics. Main results. We demonstrate that monkeys are able to perform the instructed path task in a closed-loop BCI setting. We further investigate how the performance under BCI control compares to native arm control, whether users can decrease their movement variability in the face of a more demanding task, and how the kinematic richness is enhanced in this task. Significance. The use of the instructed path task has the potential to accelerate the development of BCI systems and their clinical translation.

  14. Identification of task demands and usability issues in police use of mobile computing terminals.

    PubMed

    Zahabi, Maryam; Kaber, David

    2018-01-01

    Crash reports from various states in the U.S. have shown high numbers of emergency vehicle crashes, especially in law enforcement situations. This study identified the perceived importance and frequency of police mobile computing terminal (MCT) tasks, quantified the demands of different tasks using a cognitive performance modeling methodology, identified usability violations of current MCT interface designs, and formulated design recommendations for an enhanced interface. Results revealed that "access call notes", "plate number check" and "find location on map" are the most important and frequently performed tasks for officers. "Reading plate information" was also found to be the most visually and cognitively demanding task-method. Usability principles of "using simple and natural dialog" and "minimizing user memory load" were violated by the current MCT interface design. The enhanced design showed potential for reducing cognitive demands and task completion time. Findings should be further validated using a driving simulation study.

  15. Unobtrusive monitoring of divided attention in a cognitive health coaching intervention for the elderly.

    PubMed

    McKanna, James A; Pavel, Misha; Jimison, Holly

    2010-11-13

    Assessment of cognitive functionality is an important aspect of care for elders. Unfortunately, few tools exist to measure divided attention, the ability to allocate attention to different aspects of tasks. An accurate determination of divided attention would allow inference of generalized cognitive decline, as well as providing a quantifiable indicator of an important component of driving skill. We propose a new method for determining relative divided attention ability through unobtrusive monitoring of computer use. Specifically, we measure performance on a dual-task cognitive computer exercise as part of a health coaching intervention. This metric indicates whether the user has the ability to pay attention to both tasks at once, or is primarily attending to one task at a time (sacrificing optimal performance). The monitoring of divided attention in a home environment is a key component of both the early detection of cognitive problems and for assessing the efficacy of coaching interventions.

  16. A New Informatics Geography.

    PubMed

    Coiera, E

    2016-11-10

    Anyone with knowledge of information systems has experienced frustration when it comes to system implementation or use. Unanticipated challenges arise frequently and unanticipated consequences may follow. Working from first principles, this paper seeks to understand why information technology (IT) is often challenging, to identify which IT endeavors are more likely to succeed, and to predict the best role that technology can play in different tasks and settings. The fundamental purpose of IT is to enhance our ability to undertake tasks, supplying new information that changes what we decide and ultimately what occurs in the world. The value of this information (VOI) can be calculated at different stages of the decision-making process and will vary depending on how technology is used. We can imagine a task space that describes the relative benefits of task completion by humans or computers and that contains specific areas where humans or computers are superior. There is a third area where neither is strong and a final joint workspace where humans and computers working in partnership produce the best results. By understanding that information has value and that VOI can be quantified, we can make decisions about how best to support the work we do. Evaluation of the expected utility of task completion by humans or computers should allow us to decide whether solutions should depend on technology, humans, or a partnership between the two.
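
    VOI here follows the standard decision-theoretic definition: the expected utility of acting with the information minus the expected utility of the best action without it. A toy calculation with invented probabilities and utilities:

    ```python
    # Two states (condition present/absent), two actions (treat/wait); utilities invented.
    p_present = 0.3
    utility = {("treat", True): 80, ("treat", False): -10,
               ("wait", True): -50, ("wait", False): 100}

    def expected_utility(action, p):
        return p * utility[(action, True)] + (1 - p) * utility[(action, False)]

    # Best action without further information:
    eu_without = max(expected_utility(a, p_present) for a in ("treat", "wait"))

    # With a perfect test, the optimal action is taken in each state:
    eu_with = (p_present * max(utility[(a, True)] for a in ("treat", "wait"))
               + (1 - p_present) * max(utility[(a, False)] for a in ("treat", "wait")))

    voi = eu_with - eu_without   # expected value of (perfect) information: 94 - 55 = 39
    print(eu_without, eu_with, voi)
    ```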

  17. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolly, S; Mutic, S; Anastasio, M

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation of additional modules to include any aspect of the treatment process, and therefore has great potential for both assessment and optimization within radiation therapy.

  18. Quantitative assessment of arm tremor in people with neurological disorders.

    PubMed

    Jeonghee Kim; Parnell, Claire; Wichmann, Thomas; DeWeerth, Stephen P

    2016-08-01

    Abnormal oscillatory movement (i.e. tremor) is usually evaluated with qualitative assessment by clinicians, and quantified with subjective scoring methods. These methods are often inaccurate. We utilized a quantitative and standardized task based on Fitts' law to assess the performance of arm movement with tremor by controlling a gyration mouse on a computer. The experiment included the center-out tapping (COT) and rectangular track navigation (RTN) tasks. We report the results of a pilot study in which we measured the performance of healthy participants in whom tremor was simulated by imposing oscillatory movements on the arm with a vibration motor. We compared their movement speed and accuracy with and without the artificial "tremor." We found that the artificial tremor significantly affected the path efficiency for both tasks (COT: 56.8 vs. 46.2%, p < 0.05; RTN: 94.2 vs. 67.4%, p < 0.05), and we were able to distinguish the presence of tremor. From this result, we expect to quantify the severity of tremor and the effectiveness of therapy for tremor patients.
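
    Path efficiency, the metric reported above, is commonly computed as the straight-line distance divided by the distance actually travelled; the sketch below assumes that definition (the cursor paths are made up):

    ```python
    import numpy as np

    def path_efficiency(points):
        """Ratio of straight-line distance to actual path length, as a percentage."""
        points = np.asarray(points, dtype=float)
        steps = np.diff(points, axis=0)
        travelled = np.linalg.norm(steps, axis=1).sum()
        ideal = np.linalg.norm(points[-1] - points[0])
        return 100.0 * ideal / travelled

    smooth = [(0, 0), (1, 0.02), (2, -0.01), (3, 0)]        # near-straight cursor path
    tremor = [(0, 0), (1, 0.6), (2, -0.6), (3, 0)]          # oscillatory path
    print(path_efficiency(smooth), path_efficiency(tremor))  # tremor lowers efficiency
    ```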

  19. Impact of topographic mask models on scanner matching solutions

    NASA Astrophysics Data System (ADS)

    Tyminski, Jacek K.; Pomplun, Jan; Renwick, Stephen P.

    2014-03-01

    Of keen interest to the IC industry are advanced computational lithography applications such as Optical Proximity Correction of IC layouts (OPC), scanner matching by optical proximity effect matching (OPEM), and Source Optimization (SO) and Source-Mask Optimization (SMO) used as advanced reticle enhancement techniques. The success of these tasks is strongly dependent on the integrity of the lithographic simulators used in computational lithography (CL) optimizers. Lithographic mask models used by these simulators are key drivers impacting the accuracy of the image predictions and, as a consequence, determine the validity of these CL solutions. Much of the CL work involves Kirchhoff mask models, a.k.a. the thin-mask approximation, simplifying the treatment of the mask near-field images. On the other hand, imaging models for hyper-NA scanners require that the interactions of the illumination fields with the mask topography be rigorously accounted for, by numerically solving Maxwell's Equations. The simulators used to predict the image formation in hyper-NA scanners must rigorously treat the mask topography and its interaction with the scanner illuminators. Such imaging models come at a high computational cost and pose challenging accuracy vs. compute time tradeoffs. An additional complication comes from the fact that the performance metrics used in computational lithography tasks show highly non-linear responses to the optimization parameters. Finally, the number of patterns used for tasks such as OPC, OPEM, SO, or SMO ranges from tens to hundreds. These requirements determine the complexity and the workload of the lithography optimization tasks. The tools to build rigorous imaging optimizers based on first principles governing imaging in scanners are available, but the quantifiable benefits they might provide are not very well understood. To quantify the performance of OPE matching solutions, we have compared the results of various imaging optimization trials obtained with Kirchhoff mask models to those obtained with rigorous models involving solutions of Maxwell's Equations. In both sets of trials, we used sets of large numbers of patterns, with specifications representative of CL tasks commonly encountered in hyper-NA imaging. In this report we present OPEM solutions based on various mask models and discuss the models' impact on hyper-NA scanner matching accuracy. We draw conclusions on the accuracy of results obtained with thin-mask models vs. the topographic OPEM solutions. We present various examples representative of scanner image matching for patterns representative of the current generation of IC designs.

  20. Multifractal analysis of information processing in hippocampal neural ensembles during working memory under Δ9-tetrahydrocannabinol administration

    PubMed Central

    Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.

    2014-01-01

    Background: Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method: Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results: Neurons involved in memory processing ("Functional Cell Types" or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods: WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion: z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297

  1. Force-reflection and shared compliant control in operating telemanipulators with time delay

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Hannaford, Blake; Bejczy, Antal K.

    1992-01-01

    The performance of an advanced telemanipulation system in the presence of a wide range of time delays between a master control station and a slave robot is quantified. The contemplated applications include multiple satellite links to LEO, geosynchronous operation, spacecraft local area networks, and general-purpose computer-based short-distance designs. The results of high-precision peg-in-hole tasks performed by six test operators indicate that task performance decreased linearly with introduced time delays for both kinesthetic force feedback (KFF) and shared compliant control (SCC). The rate of this decrease was substantially improved with SCC compared to KFF. Task performance at delays above 1 s was not possible using KFF. SCC enabled task performance for such delays, which are realistic values for ground-controlled remote manipulation of telerobots in space.

  2. Multiplex Quantitative Histologic Analysis of Human Breast Cancer Cell Signaling and Cell Fate

    DTIC Science & Technology

    2010-05-01

    [Fragmentary DTIC record; only report-form fragments are recoverable.] Keywords: breast cancer, cell signaling, cell proliferation, histology, image analysis. Recoverable abstract fragments: "...revealed by individual stains in multiplex combinations; and (3) software (FARSIGHT) for automated multispectral image analysis that (i) segments..."; under Task 3 (develop computational algorithms for multispectral immunohistological image analysis): "FARSIGHT software was developed to quantify intrinsic..."

  3. Contextual Fraction as a Measure of Contextuality.

    PubMed

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-04

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.

  4. Contextual Fraction as a Measure of Contextuality

    NASA Astrophysics Data System (ADS)

    Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane

    2017-08-01

    We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
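
    The linear program mentioned in both copies of this record can be written out concretely for the simplest (2,2,2) Bell-type scenario: the noncontextual fraction is the largest total weight of a subdistribution explained by deterministic global assignments, and the contextual fraction is its complement. A sketch using scipy, with the standard PR-box empirical model as input (an illustrative choice, not an example from the paper):

    ```python
    import itertools
    import numpy as np
    from scipy.optimize import linprog

    # Scenario: contexts {a1,b1},{a1,b2},{a2,b1},{a2,b2}, outcomes in {0,1}.
    contexts = [("a1", "b1"), ("a1", "b2"), ("a2", "b1"), ("a2", "b2")]
    observables = ["a1", "a2", "b1", "b2"]

    # Deterministic global assignments: every observable gets a fixed outcome.
    assignments = [dict(zip(observables, bits))
                   for bits in itertools.product((0, 1), repeat=4)]

    # Incidence matrix M: rows index (context, joint outcome), columns index assignments.
    rows = [(c, o) for c in contexts for o in itertools.product((0, 1), repeat=2)]
    M = np.array([[1.0 if tuple(g[x] for x in c) == o else 0.0 for g in assignments]
                  for (c, o) in rows])

    # PR-box empirical model: perfectly correlated except anti-correlated in (a2, b2).
    def pr_box(c, o):
        want_equal = c != ("a2", "b2")
        return 0.5 if (o[0] == o[1]) == want_equal else 0.0

    p = np.array([pr_box(c, o) for (c, o) in rows])

    # Maximize 1'b subject to M b <= p, b >= 0 (linprog minimizes, so negate).
    res = linprog(c=-np.ones(len(assignments)), A_ub=M, b_ub=p,
                  bounds=[(0, None)] * len(assignments))
    ncf = -res.fun
    print("contextual fraction:", 1.0 - ncf)   # 1.0: the PR box is maximally contextual
    ```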

  5. The effect of topiramate plasma concentration on linguistic behavior, verbal recall and working memory.

    PubMed

    Marino, S E; Pakhomov, S V S; Han, S; Anderson, K L; Ding, M; Eberly, L E; Loring, D W; Hawkins-Taylor, C; Rarick, J O; Leppik, I E; Cibula, J E; Birnbaum, A K

    2012-07-01

    This is the first study of the effect of topiramate on linguistic behavior and verbal recall using a computational linguistics system for automated language and speech analysis to detect and quantify drug-induced changes in speech recorded during discourse-level tasks. Healthy volunteers were administered a single, 100-mg oral dose of topiramate in two double-blind, randomized, placebo-controlled, crossover studies. Subjects' topiramate plasma levels ranged from 0.23 to 2.81 μg/mL. We found a significant association between topiramate levels and impairment on measures of verbal fluency elicited during a picture description task, correct number of words recalled on a paragraph recall test, and reaction time recorded during a working memory task. Using the tools of clinical pharmacology and computational linguistics, we elucidated the relationship between the determinants of a drug's disposition as reflected in plasma concentrations and their impact on cognitive functioning as reflected in spoken language discourse.

  6. Speed in Information Processing with a Computer Driven Visual Display in a Real-time Digital Simulation. M.S. Thesis - Virginia Polytechnic Inst.

    NASA Technical Reports Server (NTRS)

    Kyle, R. G.

    1972-01-01

    Information transfer between the operator and computer-generated display systems is an area where the human factors engineer discovers little useful design data relating human performance to system effectiveness. This study utilized a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information processing task. The performance criterion was response time to the sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. An equally probable number code, from one to four, was assigned at random to the sixteen cells of the matrix and correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time, in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.

  7. Development and Testing of a Smartphone-Based Cognitive/Neuropsychological Evaluation System for Substance Abusers.

    PubMed

    Pal, Reshmi; Mendelson, John; Clavier, Odile; Baggott, Mathew J; Coyle, Jeremy; Galloway, Gantt P

    2016-01-01

    In methamphetamine (MA) users, drug-induced neurocognitive deficits may help to determine treatment, monitor adherence, and predict relapse. To measure these relationships, we developed an iPhone app (Neurophone) to compare lab and field performance of N-Back, Stop Signal, and Stroop tasks that are sensitive to MA-induced deficits. Twenty healthy controls and 16 MA-dependent participants performed the tasks in-lab using a validated computerized platform and the Neurophone before taking the latter home and performing the tasks twice daily for two weeks. N-Back task: there were no clear differences in performance between computer-based and phone-based in-lab tests, or between phone-based in-lab and phone-based in-field tests. Stop Signal task: differences in parameters prevented comparison of the computer-based and phone-based versions, and there was a significant difference in phone performance between field and lab. Stroop task: response time measured by the speech recognition engine lacked the precision to yield quantifiable results. There was no learning effect over time. On average, each participant completed 84.3% of the in-field N-Back tasks and 90.4% of the in-field Stop Signal tasks (MA-dependent participants: 74.8% and 84.3%; healthy controls: 91.4% and 95.0%, respectively). Participants rated Neurophone easy to use. Cognitive tasks performed in-field using Neurophone have the potential to yield results comparable to those obtained in a laboratory setting. Tasks need to be modified for use, as the app's voice-recognition system is not yet adequate for timed tests.

  8. Improving our understanding of multi-tasking in healthcare: Drawing together the cognitive psychology and healthcare literature.

    PubMed

    Douglas, Heather E; Raban, Magdalena Z; Walter, Scott R; Westbrook, Johanna I

    2017-03-01

    Multi-tasking is an important skill for clinical work which has received limited research attention. Its impacts on clinical work are poorly understood. In contrast, there is substantial multi-tasking research in cognitive psychology, driver distraction, and human-computer interaction. This review synthesises evidence of the extent and impacts of multi-tasking on efficiency and task performance from health and non-healthcare literature, to compare and contrast approaches, identify implications for clinical work, and to develop an evidence-informed framework for guiding the measurement of multi-tasking in future healthcare studies. The results showed healthcare studies using direct observation have focused on descriptive studies to quantify concurrent multi-tasking and its frequency in different contexts, with limited study of impact. In comparison, non-healthcare studies have applied predominantly experimental and simulation designs, focusing on interleaved and concurrent multi-tasking, and testing theories of the mechanisms by which multi-tasking impacts task efficiency and performance. We propose a framework to guide the measurement of multi-tasking in clinical settings that draws together lessons from these siloed research efforts.

  9. Fitts' Law in the Control of Isometric Grip Force With Naturalistic Targets.

    PubMed

    Thumser, Zachary C; Slifkin, Andrew B; Beckler, Dylan T; Marasco, Paul D

    2018-01-01

    Fitts' law models the relationship between amplitude, precision, and speed of rapid movements. It is widely used to quantify performance in pointing tasks, study human-computer interaction, and generally to understand perceptual-motor information processes, including research to model performance in isometric force production tasks. Applying Fitts' law to an isometric grip force task would allow for quantifying grasp performance in rehabilitative medicine and may aid research on prosthetic control and design. We examined whether Fitts' law would hold when participants attempted to accurately produce their intended force output while grasping a manipulandum when presented with images of various everyday objects (we termed this the implicit task). Although our main interest was the implicit task, to benchmark it and establish validity, we examined performance against a more standard visual feedback condition via a digital force-feedback meter on a video monitor (explicit task). Next, we progressed from visual force feedback with force meter targets to the same targets without visual force feedback (operating largely on feedforward control with tactile feedback). This provided an opportunity to see if Fitts' law would hold without vision, and allowed us to progress toward the more naturalistic implicit task (which does not include visual feedback). Finally, we changed the nature of the targets from requiring explicit force values presented as arrows on a force-feedback meter (explicit targets) to the more naturalistic and intuitive target forces implied by images of objects (implicit targets). With visual force feedback the relation between task difficulty and the time to produce the target grip force was predicted by Fitts' law (average r² = 0.82). Without vision, average grip force scaled accurately although force variability was insensitive to the target presented. In contrast, images of everyday objects generated more reliable grip forces without the visualized force meter. In sum, population means were well-described by Fitts' law for explicit targets with vision (r² = 0.96) and implicit targets (r² = 0.89), but not as well-described for explicit targets without vision (r² = 0.54). Implicit targets should provide a realistic see-object-squeeze-object test using Fitts' law to quantify the relative speed-accuracy relationship of any given grasper.
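
    Fitts' law itself is a one-line linear model, MT = a + b·ID with ID = log2(2A/W) (the Shannon form log2(A/W + 1) is a common alternative), so fitting it is ordinary linear regression. A sketch with invented amplitude/width/movement-time data:

    ```python
    import numpy as np

    # Invented pointing data: target amplitude A, target width W, mean movement time MT (s).
    A  = np.array([ 4.0,  8.0, 16.0, 16.0, 32.0])
    W  = np.array([ 2.0,  2.0,  2.0,  1.0,  1.0])
    MT = np.array([0.42, 0.55, 0.68, 0.81, 0.95])

    ID = np.log2(2 * A / W)            # Fitts' index of difficulty (bits)
    b, a = np.polyfit(ID, MT, 1)       # MT = a + b * ID
    pred = a + b * ID
    r2 = 1 - np.sum((MT - pred) ** 2) / np.sum((MT - MT.mean()) ** 2)
    print(f"a={a:.3f} s, b={b:.3f} s/bit, r^2={r2:.2f}")
    # 1/b gives a throughput-like index of performance in bits per second.
    ```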

  10. Duct flow nonuniformities: Effect of struts in SSME HGM 2+

    NASA Technical Reports Server (NTRS)

    Burke, Roger

    1988-01-01

    This study consists of an analysis of flow through the Space Shuttle Main Engine (SSME) Hot Gas Manifold (HGM) for the purpose of understanding and quantifying the flow environment and, in particular, the flow through a region of structural supports located between the inner and outer walls of the HGM. The primary task of the study, as defined by NASA-MSFC, is to assess and develop the computational capability for analyzing detailed three-dimensional flow through the HGM support strut region to be incorporated into a full fuelside HGM analysis. Secondarily, computed results are to be compared with available experimental results.

  11. POPEYE: A production rule-based model of multitask supervisory control (POPCORN)

    NASA Technical Reports Server (NTRS)

    Townsend, James T.; Kadlec, Helena; Kantowitz, Barry H.

    1988-01-01

    Recent studies of relationships between subjective ratings of mental workload, performance, and human operator and task characteristics have indicated that these relationships are quite complex. In order to study the various relationships and place subjective mental workload within a theoretical framework, we developed a production system model for the performance component of the complex supervisory task called POPCORN. The production system model is represented by a hierarchical structure of goals and subgoals, and the information flow is controlled by a set of condition-action rules. The implementation of this production system, called POPEYE, generates computer-simulated data under different task difficulty conditions which are comparable to those of human operators performing the task. This model is the performance aspect of an overall dynamic psychological model which we are developing to examine and quantify relationships between performance and psychological aspects in a complex environment.

  12. Reliably Discriminating Stock Structure with Genetic Markers: Mixture Models with Robust and Fast Computation.

    PubMed

    Foster, Scott D; Feutry, Pierre; Grewe, Peter M; Berry, Oliver; Hui, Francis K C; Davies, Campbell R

    2018-06-26

    Delineating naturally occurring and self-sustaining sub-populations (stocks) of a species is an important task, especially for species harvested from the wild. Despite its central importance to natural resource management, analytical methods used to delineate stocks are often, and increasingly, borrowed from superficially similar analytical tasks in human genetics, even though models specifically for stock identification have been previously developed. Unfortunately, the analytical tasks in resource management and human genetics are not identical: questions about humans are typically aimed at inferring ancestry (often referred to as 'admixture') rather than breeding stocks. In this article, we argue, and show through simulation experiments and an analysis of yellowfin tuna data, that ancestral analysis methods are not always appropriate for stock delineation. In this work, we advocate a variant of a previously introduced and simpler model that identifies stocks directly. We also highlight that the computational aspects of the analysis, irrespective of the model, are difficult. We introduce some alternative computational methods and quantitatively compare these methods to each other and to established methods. We also present a method for quantifying uncertainty in model parameters and in assignment probabilities. In doing so, we demonstrate that point estimates can be misleading. One of the computational strategies presented here, based on an expectation-maximisation algorithm with judiciously chosen starting values, is robust and has a modest computational cost.
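
    The advocated strategy (EM on a mixture model, restarted from several starting values, with assignment probabilities rather than point estimates) can be sketched generically with scikit-learn; the data are simulated and this is a stand-in, not the authors' software:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(42)

    # Simulated genetic-like features for two stocks with a subtle mean shift.
    stock_a = rng.normal(0.0, 1.0, size=(150, 5))
    stock_b = rng.normal(0.6, 1.0, size=(150, 5))
    X = np.vstack([stock_a, stock_b])

    # EM with multiple starts (n_init) to reduce sensitivity to initial values.
    gm = GaussianMixture(n_components=2, n_init=20, random_state=0).fit(X)

    assign_prob = gm.predict_proba(X)   # per-individual stock-membership probabilities
    hard_labels = gm.predict(X)
    print(gm.means_.round(2))
    print(assign_prob[:3].round(2))     # point assignments alone would hide this uncertainty
    ```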

  13. Influence of input device, work surface angle, and task on spine kinematics.

    PubMed

    Riddell, Maureen F; Gallagher, Kaitlin M; McKinnon, Colin D; Callaghan, Jack P

    2016-01-01

    With the increase of tablet usage in both office and industrial workplaces, it is critical to investigate the influence of tablet usage on spine posture and movement. The aim was to quantify spine kinematics while participants interacted with a tablet or desktop computer. Fourteen participants volunteered for this study. Marker clusters were fixed onto body regions to analyze cervical and lumbar spine posture, sampled at 32 Hz (Optotrak Certus, NDI, Waterloo, Canada). Participants sat for one hour in total. Cervical and lumbar median angles and ranges of motion (10th to 90th percentile angles) were extracted from amplitude probability distribution functions performed on the angle data. Using a 15° sloped desk surface, compared to a flat desk, influenced cervical flexion (p = 0.0228). Completing the form fill task resulted in the highest degree of cervical flexion (p = 0.0008) compared to the other tasks, with cervical angles 6.1°-8.5° higher than during emailing and reading, respectively. An interaction between device and task (p = 0.0061) was found for relative lumbar median spine angles. Increased lumbar flexion was recorded when using a computer versus a tablet to complete various tasks. Task influenced both cervical and lumbar spine posture, with the highest cervical flexion occurring while completing a simulated data entry task. A 15° work surface slope decreased cervical spine flexion compared to a horizontal work surface.
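
    The APDF summary used here reduces each posture time series to percentile angles; a numpy sketch with a simulated one-hour angle trace (the signal itself is invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    fs = 32                                   # sampling rate (Hz), as in the study
    n = fs * 3600                             # one hour of samples
    cervical_angle = 10 + 5 * np.sin(np.linspace(0, 60, n)) + rng.normal(0, 2, n)

    p10, median, p90 = np.percentile(cervical_angle, [10, 50, 90])
    rom = p90 - p10                           # range of motion as the 10th-90th %ile spread
    print(f"median={median:.1f} deg, APDF ROM={rom:.1f} deg")
    ```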

  14. Effects of portable computing devices on posture, muscle activation levels and efficiency.

    PubMed

    Werth, Abigail; Babski-Reeves, Kari

    2014-11-01

    Very little research exists on ergonomic exposures when using portable computing devices. This study quantified muscle activity (forearm and neck), posture (wrist, forearm and neck), and performance (gross typing speed and error rates) differences across three portable computing devices (laptop, netbook, and slate computer) and two work settings (desk and sofa) during data entry tasks. Twelve participants completed test sessions on a single computer using a test-rest-test protocol (30 min of work at one work setting, 15 min of rest, 30 min of work at the other work setting). The slate computer resulted in significantly more non-neutral wrist, elbow and neck postures, particularly when working on the sofa. Performance on the slate computer was four times lower than that of the other computers, though lower muscle activity levels were also found. Potential for injury or illness may be elevated when working on smaller, portable computers in non-traditional work settings.

  15. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    PubMed

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

    The methods employed to quantify the baseline pupil size and task-evoked pupillary response (TEPR) may affect the overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks: a constant load of 3-letters memorisation-recall (10 trials), and incremental load memorisation-recall (two trials of each load level), using two commonly used methods: (1) change from trial/load-specific baseline, and (2) change from a constant baseline. Results indicated that there was a significant shift in baseline between the trials for constant load, and between the load levels for incremental load. The TEPR was independent of shifts in baseline using method 1 only for the constant load, and method 2 only for higher levels of the incremental load condition. These findings suggest that the assessment of the baseline and the methods used to quantify TEPR are both critical in ergonomics applications, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size that are most likely driven by non-cognitive factors when other external factors are kept constant. Therefore, the quantification methods employed to compute both baseline and TEPR are critical in understanding the information processing of humans in practical ergonomics settings.
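
    The two quantification methods being compared differ only in the baseline subtracted from each trial's peak dilation; a numpy sketch with simulated values showing why they diverge when the baseline drifts:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_trials = 10
    drift = np.linspace(3.0, 3.4, n_trials)               # mm; slow shift across trials
    trial_baselines = drift + rng.normal(0, 0.02, n_trials)
    peak_dilation = trial_baselines + 0.25 + rng.normal(0, 0.03, n_trials)

    # Method 1: TEPR relative to each trial's own pre-trial baseline.
    tepr_trial = peak_dilation - trial_baselines

    # Method 2: TEPR relative to one constant (e.g., first-trial) baseline.
    tepr_const = peak_dilation - trial_baselines[0]

    print(tepr_trial.round(3))   # stable: immune to the baseline shift
    print(tepr_const.round(3))   # inflates over trials as the baseline drifts
    ```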

  16. Cross-Talk in Superconducting Transmon Quantum Computing Architecture

    NASA Astrophysics Data System (ADS)

    Abraham, David; Chow, Jerry; Corcoles, Antonio; Rothwell, Mary; Keefe, George; Gambetta, Jay; Steffen, Matthias; IBM Quantum Computing Team

    2013-03-01

    Superconducting transmon quantum computing test structures often exhibit significant undesired cross-talk. For experiments with only a handful of qubits this cross-talk can be quantified and understood, and therefore corrected. As quantum computing circuits become more complex, and thereby contain increasing numbers of qubits and resonators, it becomes more vital that the inadvertent coupling between these elements is minimized. The task of accurately controlling each single qubit to the level of precision required throughout the realization of a quantum algorithm is difficult by itself; coupled with the need to null out leakage signals from neighboring qubits or resonators, it would quickly become impossible. We discuss an approach to solve this critical problem. We acknowledge support from IARPA under contract W911NF-10-1-0324.

  17. Cognitive performance modeling based on general systems performance theory.

    PubMed

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  18. Reliability of fMRI for Studies of Language in Post-Stroke Aphasia Subjects

    PubMed Central

    Eaton, Kenneth P.; Szaflarski, Jerzy P.; Altaye, Mekibib; Ball, Angel L.; Kissela, Brett M.; Banks, Christi; Holland, Scott K.

    2008-01-01

    Quantifying change in brain activation patterns associated with post-stroke recovery and reorganization of language function over time requires accurate understanding of inter-scan and inter-subject variability. Here we report inter-scan variability measures for fMRI activation patterns associated with verb generation (VG) and semantic decision/tone decision (SDTD) tasks in 4 healthy controls and 4 aphasic left middle cerebral artery (LMCA) stroke subjects. A series of 10 fMRI scans was completed on a 4T Varian scanner for each task for each subject, except for one stroke subject who completed 5 and 6 scans for SDTD and VG, thus yielding 35 and 36 total stroke subject scans for SDTD and VG, respectively. Group composite and intraclass correlation coefficient (ICC) maps were computed across all subjects and trials for each task. The patterns of reliable activation for the VG and SDTD tasks correspond well to those regions typically activated by these tasks in healthy and aphasic subjects. ICCs for activation were consistently high (R0.05 ≈ 0.8) for individual tasks among both control and aphasic subjects. These voxel-wise measures of reliability highlight regions of low inter-scan variability within language circuitry for control and post-recovery stroke subjects. ICCs computed from the combination of the SDTD/VG data were markedly reduced for both control and aphasic subjects as compared with the ICCs for the individual tasks. These quantitative measures of inter-scan variability support the proposed use of these fMRI paradigms for longitudinal mapping of neural reorganization of language processing following left hemispheric insult. PMID:18411061
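
    One plausible way to compute an ICC of this kind from a subjects-by-scans matrix is the one-way random-effects form ICC(1) = (BMS − WMS) / (BMS + (k − 1)·WMS); the exact variant the authors used is not stated in this record, and the data below are simulated:

    ```python
    import numpy as np

    def icc_oneway(Y):
        """ICC(1): one-way random-effects intraclass correlation.
        Y has shape (n_subjects, k_repeated_scans)."""
        n, k = Y.shape
        grand = Y.mean()
        bms = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)               # between-subject MS
        wms = ((Y - Y.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))  # within-subject MS
        return (bms - wms) / (bms + (k - 1) * wms)

    rng = np.random.default_rng(5)
    subject_effect = rng.normal(0, 1.0, (8, 1))            # stable per-subject activation level
    scans = subject_effect + rng.normal(0, 0.4, (8, 10))   # 10 repeated scans per subject
    print(round(icc_oneway(scans), 2))                     # high value => low inter-scan variability
    ```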

  19. Acoustic Prediction State of the Art Assessment

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2007-01-01

    The acoustic assessment task for both the Subsonic Fixed Wing and the Supersonic projects under NASA's Fundamental Aeronautics Program was designed to assess the current state-of-the-art in noise prediction capability and to establish baselines for gauging future progress. The documentation of our current capabilities included quantifying the differences between predictions of noise from computer codes and measurements of noise from experimental tests. Quantifying the accuracy of both the computed and experimental results further enhanced the credibility of the assessment. This presentation gives sample results from codes representative of NASA's capabilities in aircraft noise prediction, both for systems and components. These include semi-empirical, statistical, analytical, and numerical codes. System level results are shown for both aircraft and engines. Component level results are shown for a landing gear prototype, for fan broadband noise, for jet noise from a subsonic round nozzle, and for propulsion airframe aeroacoustic interactions. Additional results are shown for modeling of the acoustic behavior of duct acoustic lining and the attenuation of sound in lined ducts with flow.

  20. Fitts’ Law in the Control of Isometric Grip Force With Naturalistic Targets

    PubMed Central

    Thumser, Zachary C.; Slifkin, Andrew B.; Beckler, Dylan T.; Marasco, Paul D.

    2018-01-01

    Fitts’ law models the relationship between amplitude, precision, and speed of rapid movements. It is widely used to quantify performance in pointing tasks, study human-computer interaction, and generally to understand perceptual-motor information processes, including research to model performance in isometric force production tasks. Applying Fitts’ law to an isometric grip force task would allow for quantifying grasp performance in rehabilitative medicine and may aid research on prosthetic control and design. We examined whether Fitts’ law would hold when participants attempted to accurately produce their intended force output while grasping a manipulandum when presented with images of various everyday objects (we termed this the implicit task). Although our main interest was the implicit task, to benchmark it and establish validity, we examined performance against a more standard visual feedback condition via a digital force-feedback meter on a video monitor (explicit task). Next, we progressed from visual force feedback with force meter targets to the same targets without visual force feedback (operating largely on feedforward control with tactile feedback). This provided an opportunity to see if Fitts’ law would hold without vision, and allowed us to progress toward the more naturalistic implicit task (which does not include visual feedback). Finally, we changed the nature of the targets from requiring explicit force values presented as arrows on a force-feedback meter (explicit targets) to the more naturalistic and intuitive target forces implied by images of objects (implicit targets). With visual force feedback the relation between task difficulty and the time to produce the target grip force was predicted by Fitts’ law (average r2 = 0.82). Without vision, average grip force scaled accurately although force variability was insensitive to the target presented. In contrast, images of everyday objects generated more reliable grip forces without the visualized force meter. In sum, population means were well-described by Fitts’ law for explicit targets with vision (r2 = 0.96) and implicit targets (r2 = 0.89), but not as well-described for explicit targets without vision (r2 = 0.54). Implicit targets should provide a realistic see-object-squeeze-object test using Fitts’ law to quantify the relative speed-accuracy relationship of any given grasper. PMID:29773999
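
    As a worked illustration of the modeling above, the sketch below (Python) computes the index of difficulty and fits MT = a + b·ID by least squares. The classic form ID = log2(2A/W) is assumed, with A standing for the force amplitude and W for the tolerance band; the function names are hypothetical and the paper may use a different (e.g., Shannon) formulation.

```python
import numpy as np

def index_of_difficulty(A, W):
    """Classic Fitts index of difficulty (bits): ID = log2(2A/W)."""
    return np.log2(2.0 * A / W)

def fit_fitts(ID, MT):
    """Least-squares fit of MT = a + b * ID; returns (a, b, r_squared)."""
    b, a = np.polyfit(ID, MT, 1)          # slope first, intercept second
    pred = a + b * ID
    ss_res = np.sum((MT - pred) ** 2)
    ss_tot = np.sum((MT - MT.mean()) ** 2)
    return a, b, 1.0 - ss_res / ss_tot
```

    With arrays of per-condition difficulty and mean production time, the returned r-squared plays the role of the r2 values quoted in the abstract.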

  1. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention

    PubMed Central

    2011-01-01

    Background Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment capable of quantifying the influence of red-green versus blue-yellow colour stimuli on people's performance in a fun and interactive way, making it appropriate for the target audience. Methods An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret hints scattered across different scenarios. Two versions of the game were implemented. In the first, hints and information boards were painted in red and green; in the second, these objects were painted in blue and yellow. The three-dimensional computer graphics tool Blender 3D was used for modelling, texturing, and animating virtual characters and objects. The textures were created with the GIMP editor to provide visual effects that increase the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers, divided into subgroups A1 and A2, and 20 volunteers with ADHD, divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with hints painted in green-red colours, and subgroups A2 and B2 used the second version with the same hints painted in blue-yellow. The time spent to complete each task of the game was measured. Results Data analyzed with two-way ANOVA and post-hoc Tukey LSD tests showed that the use of blue/yellow instead of green/red colours decreased the game performance of all participants. However, a greater decrease in performance was observed in ADHD participants, where tasks requiring attention were most affected. Conclusions The game proved to be a user-friendly tool capable of detecting and quantifying the influence of colour on the performance of people executing tasks that require attention, and it was attractive to people with ADHD. PMID:21854630

  2. Virtual environment to quantify the influence of colour stimuli on the performance of tasks requiring attention.

    PubMed

    Silva, Alessandro P; Frère, Annie F

    2011-08-19

    Recent studies indicate that blue-yellow colour discrimination is impaired in ADHD individuals. However, the relationship between colour and performance has not been investigated. This paper describes the development and testing of a virtual environment capable of quantifying the influence of red-green versus blue-yellow colour stimuli on people's performance in a fun and interactive way, making it appropriate for the target audience. An interactive computer game based on virtual reality was developed to evaluate the performance of the players. The game's storyline was based on the story of an old pirate who runs across islands and dangerous seas in search of a lost treasure. Within the game, the player must find and interpret hints scattered across different scenarios. Two versions of the game were implemented. In the first, hints and information boards were painted in red and green; in the second, these objects were painted in blue and yellow. The three-dimensional computer graphics tool Blender 3D was used for modelling, texturing, and animating virtual characters and objects. The textures were created with the GIMP editor to provide visual effects that increase the realism and immersion of the players. The games were tested on 20 non-ADHD volunteers, divided into subgroups A1 and A2, and 20 volunteers with ADHD, divided into subgroups B1 and B2. Subgroups A1 and B1 used the first version of the game with hints painted in green-red colours, and subgroups A2 and B2 used the second version with the same hints painted in blue-yellow. The time spent to complete each task of the game was measured. Data analyzed with two-way ANOVA and post-hoc Tukey LSD tests showed that the use of blue/yellow instead of green/red colours decreased the game performance of all participants. However, a greater decrease in performance was observed in ADHD participants, where tasks requiring attention were most affected. The game proved to be a user-friendly tool capable of detecting and quantifying the influence of colour on the performance of people executing tasks that require attention, and it was attractive to people with ADHD.

  3. Network Controllability in the Inferior Frontal Gyrus Relates to Controlled Language Variability and Susceptibility to TMS.

    PubMed

    Medaglia, John D; Harvey, Denise Y; White, Nicole; Kelkar, Apoorva; Zimmerman, Jared; Bassett, Danielle S; Hamilton, Roy H

    2018-06-08

    In language production, humans are confronted with considerable word selection demands. Often, we must select a word from among similar, acceptable, and competing alternative words in order to construct a sentence that conveys an intended meaning. In recent years, the left inferior frontal gyrus (LIFG) has been identified as critical to this ability. Despite a recent emphasis on network approaches to understanding language, how the LIFG interacts with the brain's complex networks to facilitate controlled language performance remains unknown. Here, we take a novel approach to understand word selection as a network control process in the brain. Using an anatomical brain network derived from high-resolution diffusion spectrum imaging (DSI), we computed network controllability underlying the site of transcranial magnetic stimulation in the LIFG between administrations of language tasks that vary in response (cognitive control) demands: open-response (word generation) vs. closed-response (number naming) tasks. We find that a statistic that quantifies the LIFG's theoretically predicted control of communication across modules in the human connectome explains TMS-induced changes in open-response language task performance only. Moreover, we find that a statistic that quantifies the LIFG's theoretically predicted control of difficult-to-reach states explains vulnerability to TMS in the closed-ended (but not open-ended) response task. These findings establish a link between network controllability, cognitive function, and TMS effects. SIGNIFICANCE STATEMENT This work illustrates that network control statistics applied to anatomical connectivity data demonstrate relationships with cognitive variability during controlled language tasks and TMS effects. Copyright © 2018 the authors.
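
    For readers unfamiliar with network control statistics, the sketch below (Python) illustrates one commonly used statistic, average controllability, derived from a structural adjacency matrix via the controllability Gramian of a discrete-time linear model. This is a simplified stand-in for the study's exact statistics, and the normalization convention shown is an assumption.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def average_controllability(A, node):
    """Average controllability of one region in the linear model
    x[t+1] = A x[t] + B u[t], with control input B = e_node and A
    scaled for stability (a common convention in this literature).
    """
    A = A / (1.0 + np.linalg.svd(A, compute_uv=False)[0])  # enforce stability
    n = A.shape[0]
    B = np.zeros((n, 1))
    B[node] = 1.0
    # The controllability Gramian W solves W = A W A^T + B B^T.
    W = solve_discrete_lyapunov(A, B @ B.T)
    return np.trace(W)   # larger trace -> node can drive more state space
```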

  4. Ideal AFROC and FROC observers.

    PubMed

    Khurd, Parmeshwar; Liu, Bin; Gindi, Gene

    2010-02-01

    Detection of multiple lesions in images is a medically important task and free-response receiver operating characteristic (FROC) analyses and its variants, such as alternative FROC (AFROC) analyses, are commonly used to quantify performance in such tasks. However, ideal observers that optimize FROC or AFROC performance metrics have not yet been formulated in the general case. If available, such ideal observers may turn out to be valuable for imaging system optimization and in the design of computer aided diagnosis techniques for lesion detection in medical images. In this paper, we derive ideal AFROC and FROC observers. They are ideal in that they maximize, amongst all decision strategies, the area, or any partial area, under the associated AFROC or FROC curve. Calculation of observer performance for these ideal observers is computationally quite complex. We can reduce this complexity by considering forms of these observers that use false positive reports derived from signal-absent images only. We also consider a Bayes risk analysis for the multiple-signal detection task with an appropriate definition of costs. A general decision strategy that minimizes Bayes risk is derived. With particular cost constraints, this general decision strategy reduces to the decision strategy associated with the ideal AFROC or FROC observer.

  5. Deaf Learners' Knowledge of English Universal Quantifiers

    ERIC Educational Resources Information Center

    Berent, Gerald P.; Kelly, Ronald R.; Porter, Jeffrey E.; Fonzi, Judith

    2008-01-01

    Deaf and hearing students' knowledge of English sentences containing universal quantifiers was compared through their performance on a 50-item, multiple-picture task that required students to decide whether each of five pictures represented a possible meaning of a target sentence. The task assessed fundamental knowledge of quantifier sentences,…

  6. Toward using alpha and theta brain waves to quantify programmer expertise.

    PubMed

    Crk, Igor; Kluthe, Timothy

    2014-01-01

    Empirical studies of programming language learnability and usability have thus far depended on indirect measures of human cognitive performance, attempting to capture what is in essence a purely cognitive exercise through various indicators of comprehension, such as the correctness of coding tasks or the time spent working out the meaning of code and producing acceptable solutions. Understanding program comprehension is essential to understanding the inherent complexity of programming languages, and ultimately, having a measure of mental effort based on direct observation of the brain at work will illuminate the nature of the work of programming. We provide evidence of direct observation of the cognitive effort associated with programming tasks, through a carefully constructed empirical study using a cross-section of undergraduate computer science students and an inexpensive, off-the-shelf brain-computer interface device. This study presents a link between expertise and programming language comprehension, draws conclusions about the observed indicators of cognitive effort using recent cognitive theories, and proposes directions for future work that is now possible.

  7. Optimal multisensory decision-making in a reaction-time task.

    PubMed

    Drugowitsch, Jan; DeAngelis, Gregory C; Klier, Eliana M; Angelaki, Dora E; Pouget, Alexandre

    2014-06-14

    Humans and animals can integrate sensory evidence from various sources to make decisions in a statistically near-optimal manner, provided that the stimulus presentation time is fixed across trials. Little is known about whether optimality is preserved when subjects can choose when to make a decision (reaction-time task), or when sensory inputs have time-varying reliability. Using a reaction-time version of a visual/vestibular heading discrimination task, we show that behavior is clearly sub-optimal when quantified with traditional optimality metrics that ignore reaction times. We created a computational model that accumulates evidence optimally across both cues and time, and trades off accuracy with decision speed. This model quantitatively explains subjects' choices and reaction times, supporting the hypothesis that subjects do, in fact, accumulate evidence optimally over time and across sensory modalities, even when the reaction time is under the subject's control.
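
    A minimal simulation of the idea is sketched below (Python), assuming a simplified reliability-weighted diffusion-to-bound model; the paper's actual model additionally handles time-varying reliability and an explicit speed-accuracy cost, and all parameter names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(d_vis, d_ves, sigma_vis, sigma_ves, bound, dt=0.01, t_max=3.0):
    """One trial of reliability-weighted accumulation of two noisy cues
    to a decision bound. Returns (choice, reaction_time).
    d_*: drift (signal strength) per cue; sigma_*: momentary noise SD.
    """
    w_vis, w_ves = 1.0 / sigma_vis**2, 1.0 / sigma_ves**2  # optimal weights
    x, t = 0.0, 0.0
    while abs(x) < bound and t < t_max:
        e_vis = d_vis * dt + sigma_vis * np.sqrt(dt) * rng.standard_normal()
        e_ves = d_ves * dt + sigma_ves * np.sqrt(dt) * rng.standard_normal()
        x += (w_vis * e_vis + w_ves * e_ves) / (w_vis + w_ves)
        t += dt
    return (1 if x > 0 else 0), t
```

    Weighting each momentary sample inversely by its variance is what makes the accumulation "optimal" across modalities; reaction time falls out as the first bound-crossing time.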

  8. Towards the quantitative evaluation of visual attention models.

    PubMed

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Performing a global barrier operation in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2014-12-09

    Executing computing tasks on a parallel computer that includes compute nodes coupled for data communications, where each compute node executes tasks, with one task on each compute node designated as a master task, including: for each task on each compute node until all master tasks have joined a global barrier: determining whether the task is a master task; if the task is not a master task, joining a single local barrier; if the task is a master task, joining the global barrier and the single local barrier only after all other tasks on the compute node have joined the single local barrier.
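
    The claimed two-level scheme can be sketched with ordinary threads standing in for tasks (Python; an illustrative analogue of the idea, not the patented implementation): every task joins a node-local barrier, and only the designated master task additionally joins the global barrier, necessarily after all its local peers have arrived.

```python
import threading

def make_node_task(global_barrier, tasks_per_node):
    """Build the task body for one 'compute node': a node-local barrier
    shared by its tasks, with only the master joining the global barrier.
    """
    local_barrier = threading.Barrier(tasks_per_node)

    def task(is_master):
        # ... the task's real computation would run here ...
        local_barrier.wait()          # every task joins the local barrier
        if is_master:
            global_barrier.wait()     # master alone joins the global barrier

    return task

# Toy run: 2 nodes x 3 tasks, rank 0 on each node acting as master.
n_nodes, tasks_per_node = 2, 3
global_barrier = threading.Barrier(n_nodes)
threads = []
for node in range(n_nodes):
    task = make_node_task(global_barrier, tasks_per_node)
    for rank in range(tasks_per_node):
        t = threading.Thread(target=task, args=(rank == 0,))
        threads.append(t)
        t.start()
for t in threads:
    t.join()
```

    Because the local barrier releases only once all local tasks have arrived, the master's subsequent global join satisfies the "only after all other tasks on the compute node have joined" condition, and global traffic is reduced to one participant per node.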

  10. Use of a Tracing Task to Assess Visuomotor Performance: Effects of Age, Sex, and Handedness

    PubMed Central

    2013-01-01

    Background. Visuomotor abnormalities are common in aging and age-related disease, yet difficult to quantify. This study investigated the effects of healthy aging, sex, and handedness on the performance of a tracing task. Participants (n = 150, aged 21–95 years, 75 females) used a stylus to follow a moving target around a circle on a tablet computer with their dominant and nondominant hands. Participants also performed the Trail Making Test (a measure of executive function). Methods. Deviations from the circular path were computed to derive an “error” time series. For each time series, absolute mean, variance, and complexity index (a proposed measure of system functionality and adaptability) were calculated. Using the moving target and stylus coordinates, the percentage of task time within the target region and the cumulative micropause duration (a measure of motion continuity) were computed. Results. All measures showed significant effects of aging (p < .0005). Post hoc age group comparisons showed that with increasing age, the absolute mean and variance of the error increased, complexity index decreased, percentage of time within the target region decreased, and cumulative micropause duration increased. Only complexity index showed a significant difference between dominant versus nondominant hands within each age group (p < .0005). All measures showed relationships to the Trail Making Test (p < .05). Conclusions. Measures derived from a tracing task identified performance differences in healthy individuals as a function of age, sex, and handedness. Studies in populations with specific neuromotor syndromes are warranted to test the utility of measures based on the dynamics of tracking a target as a clinical assessment tool. PMID:23388876
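
    Two of the error measures described above can be computed directly from the stylus samples; a minimal sketch (Python) follows, assuming the target path is a circle of known centre and radius (the complexity index and micropause measures are omitted, and all names are illustrative).

```python
import numpy as np

def tracing_error_metrics(x, y, cx, cy, radius):
    """Signed radial deviation of stylus samples (x, y) from a circular
    path centred at (cx, cy): the 'error' time series, plus its absolute
    mean and variance as summary measures.
    """
    error = np.hypot(x - cx, y - cy) - radius   # + outside, - inside the circle
    return np.mean(np.abs(error)), np.var(error), error
```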

  11. Quantifying the association between white matter integrity changes and subconcussive head impact exposure from a single season of youth and high school football using 3D convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Saghafi, Behrouz; Murugesan, Gowtham; Davenport, Elizabeth; Wagner, Ben; Urban, Jillian; Kelley, Mireille; Jones, Derek; Powers, Alexander; Whitlow, Christopher; Stitzel, Joel; Maldjian, Joseph; Montillo, Albert

    2018-02-01

    The effect of subconcussive head impact exposure during contact sports, including American football, on brain health is poorly understood, particularly in young and adolescent players, who may be more vulnerable to brain injury during periods of rapid brain maturation. This study aims to quantify the association between cumulative head impact exposure from a single season of football and white matter (WM) integrity as measured with diffusion MRI. The study targets football players aged 9-18 years. All players were imaged pre- and post-season with structural MRI and diffusion tensor MRI (DTI). Fractional Anisotropy (FA) maps, shown to be closely correlated with WM integrity, were computed for each subject, co-registered, and subtracted to compute the change in FA per subject. Biomechanical metrics were collected at every practice and game using helmet-mounted accelerometers. Each head impact was converted into a risk of concussion, and the risk of concussion-weighted cumulative exposure (RWE) was computed for each player for the season. Athletes with high and low RWE were selected for a two-category classification task. This task was addressed by developing a 3D Convolutional Neural Network (CNN) to automatically classify players into high and low impact exposure groups from the change in FA maps. Using the proposed model, high classification performance was achieved, including an ROC Area Under Curve score of 85.71% and an F1 score of 83.33%. This work adds to the growing body of evidence for the presence of detectable neuroimaging changes in white matter integrity from a single season of contact sports play, even in the absence of a clinically diagnosed concussion.
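
    As a rough illustration of the classification approach, a small 3D CNN for two-class (high vs. low RWE) labelling of change-in-FA volumes might look like the PyTorch sketch below; the architecture, layer sizes, and names are assumptions, not the network reported in the paper.

```python
import torch
import torch.nn as nn

class FAChangeNet(nn.Module):
    """Tiny 3D CNN: volumetric delta-FA input -> two-class logits."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.AdaptiveAvgPool3d(1),   # global pooling -> (N, 16, 1, 1, 1)
        )
        self.classifier = nn.Linear(16, 2)

    def forward(self, x):              # x: (N, 1, D, H, W) delta-FA maps
        h = self.features(x).flatten(1)
        return self.classifier(h)

# Example forward pass on a dummy 32^3 volume:
logits = FAChangeNet()(torch.randn(1, 1, 32, 32, 32))
```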

  12. EEG-Based Brain-Computer Interface for Decoding Motor Imagery Tasks within the Same Hand Using Choi-Williams Time-Frequency Distribution

    PubMed Central

    Alwanni, Hisham; Baslan, Yara; Alnuman, Nasim; Daoud, Mohammad I.

    2017-01-01

    This paper presents an EEG-based brain-computer interface system for classifying eleven motor imagery (MI) tasks within the same hand. The proposed system utilizes the Choi-Williams time-frequency distribution (CWD) to construct a time-frequency representation (TFR) of the EEG signals. The constructed TFR is used to extract five categories of time-frequency features (TFFs). The TFFs are processed using a hierarchical classification model to identify the MI task encapsulated within the EEG signals. To evaluate the performance of the proposed approach, EEG data were recorded for eighteen intact subjects and four amputated subjects while imagining to perform each of the eleven hand MI tasks. Two performance evaluation analyses, namely channel- and TFF-based analyses, are conducted to identify the best subset of EEG channels and the TFFs category, respectively, that enable the highest classification accuracy between the MI tasks. In each evaluation analysis, the hierarchical classification model is trained using two training procedures, namely subject-dependent and subject-independent procedures. These two training procedures quantify the capability of the proposed approach to capture both intra- and inter-personal variations in the EEG signals for different MI tasks within the same hand. The results demonstrate the efficacy of the approach for classifying the MI tasks within the same hand. In particular, the classification accuracies obtained for the intact and amputated subjects are as high as 88.8% and 90.2%, respectively, for the subject-dependent training procedure, and 80.8% and 87.8%, respectively, for the subject-independent training procedure. These results suggest the feasibility of applying the proposed approach to control dexterous prosthetic hands, which can be of great benefit for individuals suffering from hand amputations. PMID:28832513
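
    To illustrate the general TFR-feature idea, the sketch below (Python) extracts band-energy features from one EEG channel. Note that the spectrogram here is only a simple stand-in for the Choi-Williams distribution actually used in the paper, and the band edges and names are assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def band_energy_features(eeg, fs, bands=((4, 8), (8, 13), (13, 30))):
    """Band-energy features of one EEG channel from a time-frequency
    representation (|STFT|^2 used as a simplified stand-in for the CWD).
    bands: (low, high) frequency edges in Hz.
    """
    f, t, S = spectrogram(eeg, fs=fs, nperseg=int(fs))
    return np.array([S[(f >= lo) & (f < hi)].sum() for lo, hi in bands])
```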

  13. The influence of deliberate practice on musical achievement: a meta-analysis.

    PubMed

    Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C; Wolf, Anna

    2014-01-01

    Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors in individual differences in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of r_c = 0.61, 95% CI [0.54, 0.67], for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music.

  14. Selecting Tasks for Evaluating Human Performance as a Function of Gravity

    NASA Technical Reports Server (NTRS)

    Norcross, Jason R.; Gernhardt, Michael L.

    2011-01-01

    A challenge in understanding human performance as a function of gravity is determining which tasks to research. Initial studies began with treadmill walking, which was easy to quantify and control. However, with the development of pressurized rovers, it is less important to optimize human performance for ambulation, as pressurized rovers will likely perform gross translation for the crew. Future crews are likely to spend much of their extravehicular activity (EVA) time performing geology, construction, and maintenance tasks. With these types of tasks, people have different performance strategies, and it is often difficult to quantify the task and measure steady-state metabolic rates or perform biomechanical analysis. For many of these types of tasks, subjective feedback may be the only data that can be collected. However, subjective data may not fully support a rigorous scientific comparison of human performance across different gravity levels and suit factors. NASA would benefit from having a wide variety of quantifiable tasks that allow human performance comparison across different conditions. In order to determine which tasks will effectively support scientific studies, many different tasks and data analysis techniques will need to be employed. Many of these tasks and techniques will not be effective, but some will produce quantifiable results that are sensitive enough to show performance differences. One of the primary concerns related to EVA performance is metabolic rate. The higher the metabolic rate, the faster the astronaut will exhaust consumables. The focus of this poster will be on how different tasks affect metabolic rate across different gravity levels.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Performing a global barrier operation in a parallel computer that includes compute nodes coupled for data communications, where each compute node executes tasks, with one task on each compute node designated as a master task, including: for each task on each compute node until all master tasks have joined a global barrier: determining whether the task is a master task; if the task is not a master task, joining a single local barrier; if the task is a master task, joining the global barrier and the single local barrier only after all other tasks on the compute node have joined the single local barrier.

  16. Impact of number of repeated scans on model observer performance for a low-contrast detection task in computed tomography.

    PubMed

    Ma, Chi; Yu, Lifeng; Chen, Baiyu; Favazza, Christopher; Leng, Shuai; McCollough, Cynthia

    2016-04-01

    Channelized Hotelling observer (CHO) models have been shown to correlate well with human observers for several phantom-based detection/classification tasks in clinical computed tomography (CT). A large number of repeated scans were used to achieve an accurate estimate of the model's template. The purpose of this study is to investigate how the experimental and CHO model parameters affect the minimum required number of repeated scans. A phantom containing 21 low-contrast objects was scanned on a 128-slice CT scanner at three dose levels. Each scan was repeated 100 times. For each experimental configuration, the low-contrast detectability, quantified as the area under receiver operating characteristic curve, Az, was calculated using a previously validated CHO with randomly selected subsets of scans, ranging from 10 to 100. Using Az from the 100 scans as the reference, the accuracy from a smaller number of scans was determined. Our results demonstrated that the minimum number of repeated scans increased when the radiation dose level decreased, object size and contrast level decreased, and the number of channels increased. As a general trend, it increased as the low-contrast detectability decreased. This study provides a basis for the experimental design of task-based image quality assessment in clinical CT using CHO.
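
    For concreteness, a generic CHO computation is sketched below (Python): channelize the images, form the Hotelling template from the channelized statistics, and estimate Az nonparametrically from the decision variables. The channel matrix U (e.g., Gabor channels) and all names are assumptions, and details of the validated CHO in the study may differ.

```python
import numpy as np

def cho_auc(signal_imgs, noise_imgs, U):
    """Channelized Hotelling observer sketch.
    signal_imgs, noise_imgs: (n, p) arrays of flattened ROIs;
    U: (p, c) matrix of channel templates. Returns the estimated Az.
    """
    vs = signal_imgs @ U                 # channel outputs, signal present
    vn = noise_imgs @ U                  # channel outputs, signal absent
    S = 0.5 * (np.cov(vs.T) + np.cov(vn.T))            # mean intra-class covariance
    w = np.linalg.solve(S, vs.mean(0) - vn.mean(0))    # Hotelling template
    ts, tn = vs @ w, vn @ w              # decision variables per image
    # Nonparametric AUC (Mann-Whitney): P(ts > tn) + 0.5 * P(ts == tn)
    diff = ts[:, None] - tn[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()
```

    Re-running this estimate on random subsets of scans of increasing size mirrors the study's question of how many repeats are needed for a stable Az.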

  17. Impact of number of repeated scans on model observer performance for a low-contrast detection task in computed tomography

    PubMed Central

    Ma, Chi; Yu, Lifeng; Chen, Baiyu; Favazza, Christopher; Leng, Shuai; McCollough, Cynthia

    2016-01-01

    Channelized Hotelling observer (CHO) models have been shown to correlate well with human observers for several phantom-based detection/classification tasks in clinical computed tomography (CT). A large number of repeated scans were used to achieve an accurate estimate of the model’s template. The purpose of this study is to investigate how the experimental and CHO model parameters affect the minimum required number of repeated scans. A phantom containing 21 low-contrast objects was scanned on a 128-slice CT scanner at three dose levels. Each scan was repeated 100 times. For each experimental configuration, the low-contrast detectability, quantified as the area under receiver operating characteristic curve, Az, was calculated using a previously validated CHO with randomly selected subsets of scans, ranging from 10 to 100. Using Az from the 100 scans as the reference, the accuracy from a smaller number of scans was determined. Our results demonstrated that the minimum number of repeated scans increased when the radiation dose level decreased, object size and contrast level decreased, and the number of channels increased. As a general trend, it increased as the low-contrast detectability decreased. This study provides a basis for the experimental design of task-based image quality assessment in clinical CT using CHO. PMID:27284547

  18. JPL Electronic Nose: From Sniffing Brain Cancer to Trouble in Space

    NASA Technical Reports Server (NTRS)

    Homer, Margie L.

    2011-01-01

    What Is An Electronic Nose? An array of non-specific chemical sensors, controlled and analyzed electronically, which mimics the action of the mammalian nose by recognizing patterns of response. An ENose: (1.) The ENose measures background resistance in each sensor and establishes a baseline. (2.) A contaminant comes in contact with sensors on the sensing head. (3.) The sensing films change physical properties, such as thickness or color, as air composition changes. (4.) Sensor response is recorded by a computer, the change in resistance is computed, and the distributed response pattern of the sensor array is used to identify gases and mixtures of gases. (5.) Responses of the sensor array are analyzed and quantified using software developed for the task.

  19. Computational approaches to protein inference in shotgun proteomics

    PubMed Central

    2012-01-01

    Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high-throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization, and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area. PMID:23176300
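
    Step (2) above is often cast as a minimal set cover. A greedy sketch of that parsimony heuristic follows (Python; an illustration of the combinatorial-optimization formulation, not one of the reviewed tools):

```python
def greedy_parsimony(peptide_to_proteins):
    """Greedy minimal-protein-set (parsimony) inference: repeatedly pick
    the protein that explains the most still-unexplained peptides.
    peptide_to_proteins: dict mapping each peptide to its candidate proteins.
    """
    # Invert the map: protein -> set of its identified peptides.
    prot_peps = {}
    for pep, prots in peptide_to_proteins.items():
        for prot in prots:
            prot_peps.setdefault(prot, set()).add(pep)
    uncovered = set(peptide_to_proteins)
    chosen = []
    while uncovered:
        best = max(prot_peps, key=lambda p: len(prot_peps[p] & uncovered))
        gained = prot_peps[best] & uncovered
        if not gained:          # defensive: nothing left to explain
            break
        chosen.append(best)
        uncovered -= gained
    return chosen
```

    Greedy set cover is a standard approximation here; exact formulations solve the same cover as an integer program, as the review's framework describes.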

  20. A Method for Measuring the Effective Throughput Time Delay in Simulated Displays Involving Manual Control

    NASA Technical Reports Server (NTRS)

    Jewell, W. F.; Clement, W. F.

    1984-01-01

    The advent and widespread use of the computer-generated image (CGI) device to simulate visual cues has had a mixed impact on the realism and fidelity of flight simulators. On the plus side, CGIs provide greater flexibility in scene content than terrain boards and closed-circuit television based visual systems, and they have the potential for a greater field of view. However, on the minus side, CGIs introduce relatively long time delays into the visual simulation. In many CGIs, this delay is as much as 200 ms, which is comparable to the inherent delay time of the pilot. Because most CGIs use multiloop processing and smoothing algorithms and are linked to a multiloop host computer, it is seldom possible to identify a unique throughput time delay, and it is therefore difficult to quantify the performance of the closed-loop pilot simulator system relative to the real-world task. A method to address these issues using the critical task tester is described. Some empirical results from applying the method are presented, and a novel technique for improving the performance of CGIs is discussed.

  1. Error regions in quantum state tomography: computational complexity caused by geometry of quantum states

    NASA Astrophysics Data System (ADS)

    Suess, Daniel; Rudnicki, Łukasz; Maciel, Thiago O.; Gross, David

    2017-09-01

    The outcomes of quantum mechanical measurements are inherently random. It is therefore necessary to develop stringent methods for quantifying the degree of statistical uncertainty about the results of quantum experiments. For the particularly relevant task of quantum state tomography, it has been shown that a significant reduction in uncertainty can be achieved by taking the positivity of quantum states into account. However—the large number of partial results and heuristics notwithstanding—no efficient general algorithm is known that produces an optimal uncertainty region from experimental data, while making use of the prior constraint of positivity. Here, we provide a precise formulation of this problem and show that the general case is NP-hard. Our result leaves room for the existence of efficient approximate solutions, and therefore does not in itself imply that the practical task of quantum uncertainty quantification is intractable. However, it does show that there exists a non-trivial trade-off between optimality and computational efficiency for error regions. We prove two versions of the result: one for frequentist and one for Bayesian statistics.

  2. A methodology for assessing the effect of correlations among muscle synergy activations on task-discriminating information.

    PubMed

    Delis, Ioannis; Berret, Bastien; Pozzo, Thierry; Panzeri, Stefano

    2013-01-01

    Muscle synergies have been hypothesized to be the building blocks used by the central nervous system to generate movement. According to this hypothesis, the accomplishment of various motor tasks relies on the ability of the motor system to recruit a small set of synergies on a single-trial basis and combine them in a task-dependent manner. It is conceivable that this requires a fine tuning of the trial-to-trial relationships between the synergy activations. Here we develop an analytical methodology to address the nature and functional role of trial-to-trial correlations between synergy activations, which is designed to help to better understand how these correlations may contribute to generating appropriate motor behavior. The algorithm we propose first divides correlations between muscle synergies into types (noise correlations, quantifying the trial-to-trial covariations of synergy activations at fixed task, and signal correlations, quantifying the similarity of task tuning of the trial-averaged activation coefficients of different synergies), and then uses single-trial methods (task-decoding and information theory) to quantify their overall effect on the task-discriminating information carried by muscle synergy activations. We apply the method to both synchronous and time-varying synergies and exemplify it on electromyographic data recorded during performance of reaching movements in different directions. Our method reveals the robust presence of information-enhancing patterns of signal and noise correlations among pairs of synchronous synergies, and shows that they enhance by 9-15% (depending on the set of tasks) the task-discriminating information provided by the synergy decompositions. We suggest that the proposed methodology could be useful for assessing whether single-trial activations of one synergy depend on activations of other synergies and quantifying the effect of such dependences on the task-to-task differences in muscle activation patterns.
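
    The two correlation types can be made concrete with a small sketch (Python; the variable names are illustrative and the paper's decoding and information-theoretic steps are omitted): signal correlation compares the two synergies' task-averaged tuning, while noise correlation is computed on the within-task trial residuals.

```python
import numpy as np

def signal_noise_correlations(act1, act2, tasks):
    """Split the relation between two synergy activation series into
    signal and noise correlations.
    act1, act2: (n_trials,) activation coefficients; tasks: (n_trials,) labels.
    """
    labels = np.unique(tasks)
    m1 = np.array([act1[tasks == t].mean() for t in labels])
    m2 = np.array([act2[tasks == t].mean() for t in labels])
    # Signal correlation: similarity of task tuning of the trial averages.
    signal_r = np.corrcoef(m1, m2)[0, 1]
    # Noise correlation: trial-to-trial covariation at fixed task,
    # computed on residuals after removing each task's mean.
    r1, r2 = act1.astype(float), act2.astype(float)
    for t in labels:
        r1 = np.where(tasks == t, r1 - act1[tasks == t].mean(), r1)
        r2 = np.where(tasks == t, r2 - act2[tasks == t].mean(), r2)
    noise_r = np.corrcoef(r1, r2)[0, 1]
    return signal_r, noise_r
```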

  3. The Use Of Videography For Three-Dimensional Motion Analysis

    NASA Astrophysics Data System (ADS)

    Hawkins, D. A.; Hawthorne, D. L.; DeLozier, G. S.; Campbell, K. R.; Grabiner, M. D.

    1988-02-01

    Special video path editing capabilities, with custom hardware and software, have been developed for use in conjunction with existing video acquisition hardware and firmware. This system has simplified the task of quantifying the kinematics of human movement. A set of retro-reflective markers is secured to a subject performing a given task (i.e., walking, throwing, swinging a golf club, etc.). Multiple cameras, a video processor, and a computer workstation collect video data while the task is performed. Software has been developed to edit video files, create centroid data, and identify marker paths. Multi-camera path files are combined to form a 3D path file using the DLT method of cinematography. A separate program converts the 3D path file into kinematic data by creating a set of local coordinate axes and performing a series of coordinate transformations from one local system to the next. The kinematic data are then displayed for appropriate review and/or comparison.
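
    The 3D reconstruction step can be illustrated with the standard 11-parameter DLT equations, solved per marker and per frame as a linear least-squares problem (Python; a generic textbook formulation, not the authors' code):

```python
import numpy as np

def dlt_reconstruct(L, uv):
    """3D point from two or more calibrated cameras via the 11-parameter DLT.
    L:  (n_cams, 11) DLT coefficients per camera.
    uv: (n_cams, 2) image coordinates of the marker in each camera.
    Each camera contributes two linear equations in (X, Y, Z):
      (L1 - u*L9)X + (L2 - u*L10)Y + (L3 - u*L11)Z = u - L4
      (L5 - v*L9)X + (L6 - v*L10)Y + (L7 - v*L11)Z = v - L8
    """
    A, b = [], []
    for (l1, l2, l3, l4, l5, l6, l7, l8, l9, l10, l11), (u, v) in zip(L, uv):
        A.append([l1 - u * l9, l2 - u * l10, l3 - u * l11])
        A.append([l5 - v * l9, l6 - v * l10, l7 - v * l11])
        b += [u - l4, v - l8]
    xyz, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return xyz
```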

  4. Behavioral and Brain Measures of Phasic Alerting Effects on Visual Attention.

    PubMed

    Wiegand, Iris; Petersen, Anders; Finke, Kathrin; Bundesen, Claus; Lansner, Jon; Habekost, Thomas

    2017-01-01

    In the present study, we investigated effects of phasic alerting on visual attention in a partial report task, in which half of the displays were preceded by an auditory warning cue. Based on the computational Theory of Visual Attention (TVA), we estimated parameters of spatial and non-spatial aspects of visual attention and measured event-related lateralizations (ERLs) over visual processing areas. We found that the TVA parameter sensory effectiveness a, which is thought to reflect visual processing capacity, significantly increased with phasic alerting. By contrast, the distribution of visual processing resources according to task relevance and spatial position, as quantified in the parameters top-down control α and spatial bias w_index, was not modulated by phasic alerting. On the electrophysiological level, the latencies of ERLs in response to the task displays were reduced following the warning cue. These results suggest that phasic alerting facilitates visual processing in a general, unselective manner and that this effect originates in early stages of visual information processing.

  5. Effects of computer keyboarding on ultrasonographic measures of the median nerve

    PubMed Central

    Toosi, KK; Impink, BG; Baker, NA; Boninger, ML

    2011-01-01

    Background Keyboarding is a highly repetitive daily task and has been linked to musculoskeletal disorders of the upper extremity. However, the effect of keyboarding on median nerve injuries is not well understood. The purpose of this study was to use ultrasonographic measurements to determine whether continuous keyboarding can cause acute changes in the median nerve. Methods Ultrasound images of the median nerve from twenty-one volunteers were captured at the levels of the pisiform and distal radius prior to and following a prolonged keyboarding task (i.e., one hour of continuous keyboarding). Images were analyzed by a blinded investigator to quantify the median nerve characteristics. Changes in the median nerve ultrasonographic measures as a result of continuous keyboarding task were evaluated. Results Cross-sectional areas at the pisiform level were significantly larger in both dominant (p=0.004) and non-dominant (p=0.001) hands following the keyboarding task. Swelling ratio was significantly greater in the dominant hand (p=0.020) after 60 minutes of keyboarding when compared to the baseline measures. Flattening ratios were not significantly different in either hand as a result of keyboarding. Conclusion We were able to detect an acute increase in the area of the median nerve following one hour of keyboarding with a computer keyboard. This suggests that keyboarding has an impact on the median nerve. Further studies are required to understand this relationship, which would provide insight into the pathophysiology of median neuropathies such as carpal tunnel syndrome. PMID:21739468

  6. The semantic distance task: Quantifying semantic distance with semantic network path length.

    PubMed

    Kenett, Yoed N; Levi, Effi; Anaki, David; Faust, Miriam

    2017-09-01

    Semantic distance is a determining factor in cognitive processes, such as semantic priming, operating upon semantic memory. The main computational approach to computing semantic distance is latent semantic analysis (LSA). However, objections have been raised against this approach, mainly for its failure to predict semantic priming. We propose a novel approach to computing semantic distance, based on network science methodology. Path length in a semantic network represents the number of steps needed to traverse from one word in the network to another. We examine whether path length can be used as a measure of semantic distance by investigating how path length affects performance in a semantic relatedness judgment task and in recall from memory. Our results show a differential effect on performance: up to 4 steps separating word-pairs, participants exhibit an increase in reaction time (RT) and a decrease in the percentage of word-pairs judged as related; from 4 steps onward, participants exhibit a significant decrease in RT and the word-pairs are dominantly judged as unrelated. Furthermore, we show that as the path length between word-pairs increases, success in free and cued recall decreases. Finally, we demonstrate how our measure outperforms computational methods for measuring semantic distance (LSA and positive pointwise mutual information) in predicting participants' RT and subjective judgments of semantic strength. Thus, we provide a computational alternative for computing semantic distance. Furthermore, this approach addresses key issues in cognitive theory, namely the breadth of the spreading activation process and the effect of semantic distance on memory retrieval. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
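
    Computing the proposed measure is straightforward once a semantic network is available; a minimal sketch using networkx follows (the toy word network below is hypothetical):

```python
import networkx as nx

def semantic_distance(G, w1, w2):
    """Path length between two words in an unweighted semantic network,
    used here as the semantic-distance measure described above.
    Returns None when the words are not connected.
    """
    try:
        return nx.shortest_path_length(G, source=w1, target=w2)
    except nx.NetworkXNoPath:
        return None

# Toy usage on a hypothetical association network:
G = nx.Graph([("dog", "cat"), ("cat", "mouse"), ("mouse", "cheese")])
print(semantic_distance(G, "dog", "cheese"))   # -> 3 steps
```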

  7. Psychometric properties of startle and corrugator response in NPU, Affective Picture Viewing, and Resting State tasks

    PubMed Central

    Kaye, Jesse T.; Bradford, Daniel E.; Curtin, John J.

    2016-01-01

    The current study provides a comprehensive evaluation of critical psychometric properties of commonly used psychophysiology laboratory tasks/measures within the NIMH RDoC. Participants (N = 128) completed the No Shock, Predictable Shock, Unpredictable Shock (NPU) task, Affective Picture Viewing task, and Resting State task at two study visits separated by one week. We examined potentiation/modulation scores in NPU (predictable or unpredictable shock vs. no shock) and Affective Picture Viewing tasks (pleasant or unpleasant vs. neutral pictures) for startle and corrugator responses with two commonly used quantification methods. We quantified startle potentiation/modulation scores with raw and standardized responses. We quantified corrugator potentiation/modulation in the time and frequency domains. We quantified general startle reactivity in the Resting State Task as the mean raw startle response during the task. For these three tasks, two measures, and two quantification methods we evaluated effect size robustness and stability, internal consistency (i.e., split-half reliability), and one-week temporal stability. The psychometric properties of startle potentiation in the NPU task were good but concerns were noted for corrugator potentiation in this task. Some concerns also were noted for the psychometric properties of both startle and corrugator modulation in the Affective Picture Viewing task, in particular for pleasant picture modulation. Psychometric properties of general startle reactivity in the Resting State task were good. Some salient differences in the psychometric properties of the NPU and Affective Picture Viewing tasks were observed within and across quantification methods. PMID:27167717

  8. A framework for optimizing micro-CT in dual-modality micro-CT/XFCT small-animal imaging system

    NASA Astrophysics Data System (ADS)

    Vedantham, Srinivasan; Shrestha, Suman; Karellas, Andrew; Cho, Sang Hyun

    2017-09-01

    Dual-modality Computed Tomography (CT)/X-ray Fluorescence Computed Tomography (XFCT) can be a valuable tool for imaging and quantifying the organ and tissue distribution of small concentrations of high atomic number materials in a small-animal system. In this work, the framework for optimizing the micro-CT imaging component of the dual-modality system is described, either when the micro-CT images are acquired concurrently with XFCT using the x-ray spectral conditions for XFCT, or when the micro-CT images are acquired sequentially and independently of XFCT. This framework utilizes cascaded systems analysis for task-specific determination of the detectability index using numerical observer models at a given radiation dose, where the radiation dose is determined using Monte Carlo simulations.

  9. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    PubMed

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas (the most general synthesis scenario), the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.

  10. Quantifying Narrative Ability in Autism Spectrum Disorder: A Computational Linguistic Analysis of Narrative Coherence

    PubMed Central

    Losh, Molly; Gordon, Peter C.

    2014-01-01

    Autism Spectrum Disorder (ASD) is characterized by difficulties with social communication and functioning, and ritualistic/repetitive behaviors (American Psychiatric Association, 2013). While substantial heterogeneity exists in symptom expression, impairments in language discourse skills, including narrative, are universally observed (Tager-Flusberg, Paul, & Lord, 2005). This study applied a computational linguistic tool, Latent Semantic Analysis (LSA), to objectively characterize narrative performance in ASD across two narrative contexts differing in interpersonal and cognitive demands. Results indicated that individuals with ASD produced narratives comparable in semantic content to those from controls when narrating from a picture book, but produced narratives diminished in semantic quality in a more demanding narrative recall task. Results are discussed in terms of the utility of LSA as a quantitative, objective, and efficient measure of narrative ability. PMID:24915929

  11. Influence of dual-tasking with different levels of attention diversion on characteristics of the movement-related cortical potential.

    PubMed

    Aliakbaryhosseinabadi, Susan; Kamavuako, Ernest Nlandu; Jiang, Ning; Farina, Dario; Mrachacz-Kersting, Natalie

    2017-11-01

    Dual tasking is defined as performing two tasks concurrently and has been shown to have a significant effect on the attention directed to the performance of the main task. In this study, an attention diversion task with two different levels was administered while participants had to complete a cue-based motor task consisting of foot dorsiflexion. An auditory oddball task with two levels of complexity was implemented to divert the user's attention. Electroencephalographic (EEG) recordings were made from nine single channels. Event-related potentials (ERPs) confirmed that the oddball task of counting a sequence of two tones decreased the auditory P300 amplitude more than the oddball task of counting one target tone among three different tones. Pre-movement features quantified from the movement-related cortical potential (MRCP) changed significantly between single and dual-task conditions in motor and fronto-central channels. There was a significant delay in movement detection for the case of single tone counting in two motor channels only (237.1-247.4 ms). For the task of sequence counting, motor cortex and frontal channels showed a significant delay in MRCP detection (232.1-250.5 ms). This study investigated the effect of attention diversion in dual-task conditions by analysing both ERPs and MRCPs in single channels. The higher attention diversion led to a significant reduction in specific MRCP features of the motor task. These results suggest that attention division in dual-tasking situations plays an important role in movement execution and detection. This has important implications for designing real-time brain-computer interface systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Measurement of the effect of physical exercise on the concentration of individuals with ADHD.

    PubMed

    Silva, Alessandro P; Prado, Sueli O S; Scardovelli, Terigi A; Boschi, Silvia R M S; Campos, Luiz C; Frère, Annie F

    2015-01-01

    Attention Deficit Hyperactivity Disorder (ADHD) mainly affects the academic performance of children and adolescents. In addition to bringing physical and mental health benefits, physical activity has been used to prevent and improve ADHD comorbidities; however, its effectiveness has not been quantified. In this study, the effect of physical activity on children's attention was measured using a computer game. Intense physical activity was promoted by a relay race, which required a 5-min run without a rest interval. The proposed physical stimulus was performed by 28 volunteers: 14 with ADHD (GE-EF) and 14 without ADHD symptoms (GC-EF). After 5 min of rest, these volunteers accessed the computer game to accomplish the tasks in the shortest time possible. The computer game was also accessed by another 28 volunteers: 14 with ADHD (GE) and 14 without these symptoms (GC). The response time to solve the tasks that require attention was recorded. The results of the four groups were analyzed using D'Agostino normality tests, Kruskal-Wallis analyses of variance, and post-hoc Dunn tests. The volunteers with ADHD who performed exercise (GE-EF) showed improved performance for the tasks that require attention, with a difference of 30.52% compared with the volunteers with ADHD who did not perform the exercise (GE). The GE-EF group showed performance similar (2.5% difference) to that of the volunteers in the GC group, who have no ADHD symptoms and did not exercise. This study shows that intense exercise can improve the attention of children with ADHD and may help their school performance.

  13. Brain computer interfaces for neurorehabilitation – its current status as a rehabilitation strategy post-stroke.

    PubMed

    van Dokkum, L E H; Ward, T; Laffont, I

    2015-02-01

    The idea of using brain computer interfaces (BCI) for rehabilitation emerged relatively recently. Basically, BCI for neurorehabilitation involves the recording and decoding of local brain signals generated by the patient as he or she tries to perform a particular task (even if imperfectly), or during a mental imagery task. The main objective is to promote the recruitment of the selected brain areas involved and to facilitate neural plasticity. The recorded signal can be used in several ways: (i) to objectify and strengthen motor imagery-based training, by providing the patient feedback on the imagined motor task, for example, in a virtual environment; (ii) to generate a desired motor task via functional electrical stimulation or rehabilitative robotic orthoses attached to the patient's limb, encouraging and optimizing task execution as well as "closing" the disrupted sensorimotor loop by giving the patient the appropriate sensory feedback; (iii) to understand cerebral reorganizations after lesion, in order to influence or even quantify plasticity-induced changes in brain networks. For example, applying cerebral stimulation to re-equilibrate inter-hemispheric imbalance, as shown by functional recording of brain activity during movement, may help recovery. Its potential usefulness for a patient population has been demonstrated on various levels, and its diversity of interface applications makes it adaptable to a large population. The position and status of these very new rehabilitation systems should now be considered with respect to our current and more or less validated traditional methods, as well as in light of the wide range of possible brain damage. The heterogeneity in post-damage expression inevitably complicates the decoding of brain signals and thus their use in pathological conditions, calling for controlled clinical trials. Copyright © 2015. Published by Elsevier Masson SAS.

  14. Quantitative Tools for Examining the Vocalizations of Juvenile Songbirds

    PubMed Central

    Wellock, Cameron D.; Reeke, George N.

    2012-01-01

    The singing of juvenile songbirds is highly variable and not well stereotyped, a feature that makes it difficult to analyze with existing computational techniques. We present here a method suitable for analyzing such vocalizations, windowed spectral pattern recognition (WSPR). Rather than performing pairwise sample comparisons, WSPR measures the typicality of a sample against a large sample set. We also illustrate how WSPR can be used to perform a variety of tasks, such as sample classification, song ontogeny measurement, and song variability measurement. Finally, we present a novel measure, based on WSPR, for quantifying the apparent complexity of a bird's singing. PMID:22701474

  15. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  16. Quantifying Leg Movement Activity During Sleep.

    PubMed

    Ferri, Raffaele; Fulda, Stephany

    2016-12-01

    Currently, 2 sets of similar rules exist for recording and scoring leg movement (LM), including periodic LM during sleep (PLMS) and periodic LM during wakefulness. The former were published in 2006 by a task force of the International Restless Legs Syndrome Study Group, and the latter in 2007 by the American Academy of Sleep Medicine. This article reviews the basic recording methods, scoring rules, and computer-based programs for PLMS. Less frequent LM activities, such as alternating leg muscle activation, hypnagogic foot tremor, high-frequency LMs, and excessive fragmentary myoclonus, are briefly described. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Video-Based Method of Quantifying Performance and Instrument Motion During Simulated Phonosurgery

    PubMed Central

    Conroy, Ellen; Surender, Ketan; Geng, Zhixian; Chen, Ting; Dailey, Seth; Jiang, Jack

    2015-01-01

    Objectives/Hypothesis To investigate the use of the Video-Based Phonomicrosurgery Instrument Tracking System to collect instrument position data during simulated phonomicrosurgery and calculate motion metrics using these data. We used this system to determine if novice subject motion metrics improved over 1 week of training. Study Design Prospective cohort study. Methods Ten subjects performed simulated surgical tasks once per day for 5 days. Instrument position data were collected and used to compute motion metrics (path length, depth perception, and motion smoothness). Data were analyzed to determine if motion metrics improved with practice time. Task outcome was also determined each day, and relationships between task outcome and motion metrics were used to evaluate the validity of motion metrics as indicators of surgical performance. Results Significant decreases over time were observed for path length (P <.001), depth perception (P <.001), and task outcome (P <.001). No significant change was observed for motion smoothness. Significant relationships were observed between task outcome and path length (P <.001), depth perception (P <.001), and motion smoothness (P <.001). Conclusions Our system can estimate instrument trajectory and provide quantitative descriptions of surgical performance. It may be useful for evaluating phonomicrosurgery performance. Path length and depth perception may be particularly useful indicators. PMID:24737286
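
    The path-length and smoothness metrics named above can be illustrated directly from position samples; the definitions below are common choices and assumptions, not necessarily those implemented in the tracking system.

```python
# Illustrative versions of the reported motion metrics from sampled 3D
# instrument positions; the depth axis and the jerk-based smoothness
# measure are assumptions, as exact definitions vary between studies.
import numpy as np

def path_length(pos):
    """Total distance traveled; pos is (n_samples, 3) in mm."""
    return np.linalg.norm(np.diff(pos, axis=0), axis=1).sum()

def depth_perception(pos):
    """Distance traveled along the instrument's depth axis (here: z)."""
    return np.abs(np.diff(pos[:, 2])).sum()

def smoothness(pos, fs):
    """Negative mean squared jerk: values closer to 0 mean smoother motion."""
    jerk = np.diff(pos, n=3, axis=0) * fs**3
    return -np.mean(np.sum(jerk**2, axis=1))

rng = np.random.default_rng(2)
trajectory = np.cumsum(rng.standard_normal((500, 3)) * 0.1, axis=0)
print(path_length(trajectory), depth_perception(trajectory),
      smoothness(trajectory, fs=30))
```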

  18. Classification and Feature Selection Algorithms for Modeling Ice Storm Climatology

    NASA Astrophysics Data System (ADS)

    Swaminathan, R.; Sridharan, M.; Hayhoe, K.; Dobbie, G.

    2015-12-01

    Ice storms account for billions of dollars of winter storm loss across the continental US and Canada. In the future, increasing concentration of human populations in areas vulnerable to ice storms such as the northeastern US will only exacerbate the impacts of these extreme events on infrastructure and society. Quantifying the potential impacts of global climate change on ice storm prevalence and frequency is challenging, as ice storm climatology is driven by complex and incompletely defined atmospheric processes, processes that are in turn influenced by a changing climate. This makes the underlying atmospheric and computational modeling of ice storm climatology a formidable task. We propose a novel computational framework that uses sophisticated stochastic classification and feature selection algorithms to model ice storm climatology and quantify storm occurrences from both reanalysis and global climate model outputs. The framework is based on an objective identification of ice storm events by key variables derived from vertical profiles of temperature, humidity and geopotential height. Historical ice storm records are used to identify days with synoptic-scale upper air and surface conditions associated with ice storms. Evaluation using NARR reanalysis and historical ice storm records corresponding to the northeastern US demonstrates that an objective computational model with standard performance measures can, with a relatively high degree of accuracy, identify ice storm events based on upper-air circulation patterns and provide insights into the relationships between key climate variables associated with ice storms.
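
    A hedged sketch of the classification-with-feature-selection idea: train a stochastic classifier on per-day profile-derived variables and rank them by importance. The variable names and model below are stand-ins, not the framework described in the abstract.

```python
# Toy classification of ice-storm days from profile-derived predictors,
# with feature importances as a simple feature-selection signal.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
# per-day predictors derived from vertical profiles (synthetic stand-ins)
X = rng.standard_normal((2000, 4))
names = ["sfc_temp", "warm_layer_depth", "humidity_850", "geopot_thickness"]
# toy rule: a freezing surface under a warm layer aloft favors an ice storm
y = ((X[:, 0] < 0) & (X[:, 1] > 0)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(names, clf.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name:20s} {imp:.3f}")
```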

  19. A novel method for quantifying arm motion similarity.

    PubMed

    Zhi Li; Hauser, Kris; Roldan, Jay Ryan; Milutinovic, Dejan; Rosen, Jacob

    2015-08-01

    This paper proposes a novel task-independent method for quantifying arm motion similarity that can be applied to any kinematic or dynamic variable of interest. Given two arm motions for the same task, not necessarily with the same completion time, the method plots the time-normalized curves against one another and generates four real-valued features. To validate these features, we apply them to quantify the relationship between the healthy and paretic arm motions of chronic stroke patients. Studying both unimanual and bimanual arm motions of eight chronic stroke patients, we find that inter-arm coupling, which tends to synchronize the motions of both arms in bimanual motions, has a stronger effect at task-relevant joints than at task-irrelevant joints. The analysis also revealed that the paretic arm suppresses the shoulder flexion of the non-paretic arm, while the latter encourages the shoulder rotation of the former.
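
    The recipe (time-normalize, overlay, extract a handful of real-valued features) can be sketched as follows; the four features in the paper are not specified here, so the ones below are illustrative assumptions.

```python
# Comparing two motions without a shared completion time: resample each
# curve to a normalized time base, then derive simple similarity features.
import numpy as np

def time_normalize(curve, n=101):
    t = np.linspace(0, 1, len(curve))
    return np.interp(np.linspace(0, 1, n), t, curve)

def similarity_features(a, b):
    a, b = time_normalize(a), time_normalize(b)
    return {
        "correlation": np.corrcoef(a, b)[0, 1],
        "rms_difference": np.sqrt(np.mean((a - b) ** 2)),
        "range_ratio": np.ptp(a) / np.ptp(b),
        "mean_offset": np.mean(a - b),
    }

healthy = np.sin(np.linspace(0, np.pi, 120))        # e.g., shoulder flexion
paretic = 0.7 * np.sin(np.linspace(0, np.pi, 90))   # slower, reduced range
print(similarity_features(healthy, paretic))
```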

  20. A Study on the Validity of a Task Complexity Measure for Emergency Operating Procedures of Nuclear Power Plants—Comparing With a Subjective Workload

    NASA Astrophysics Data System (ADS)

    Park, J.; Jung, W.

    2006-10-01

    In this study, the appropriateness of the task complexity (TACOM) measure that can quantify the complexity of emergency tasks was investigated by comparing subjective workload scores with the associated TACOM scores. To this end, based on the NASA-TLX (task load index) technique, 18 operators were asked to subjectively estimate perceived workload for 23 emergency tasks that were specified in the emergency operating procedures of the reference nuclear power plants. As a result of these comparisons, it was observed that subjective workload scores increase in proportion to TACOM scores. Therefore, it is expected that the TACOM measure can be used as a serviceable method to quantify the complexity of emergency tasks.

  1. Task-based modeling and optimization of a cone-beam CT scanner for musculoskeletal imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prakash, P.; Zbijewski, W.; Gang, G. J.

    2011-10-15

    Purpose: This work applies a cascaded systems model for cone-beam CT imaging performance to the design and optimization of a system for musculoskeletal extremity imaging. The model provides a quantitative guide to the selection of system geometry, source and detector components, acquisition techniques, and reconstruction parameters. Methods: The model is based on cascaded systems analysis of the 3D noise-power spectrum (NPS) and noise-equivalent quanta (NEQ) combined with factors of system geometry (magnification, focal spot size, and scatter-to-primary ratio) and anatomical background clutter. The model was extended to task-based analysis of detectability index (d') for tasks ranging in contrast and frequency content, and d' was computed as a function of system magnification, detector pixel size, focal spot size, kVp, dose, electronic noise, voxel size, and reconstruction filter to examine trade-offs and optima among such factors in multivariate analysis. The model was tested quantitatively versus the measured NPS and qualitatively in cadaver images as a function of kVp, dose, pixel size, and reconstruction filter under conditions corresponding to the proposed scanner. Results: The analysis quantified trade-offs among factors of spatial resolution, noise, and dose. System magnification (M) was a critical design parameter with strong effect on spatial resolution, dose, and x-ray scatter, and a fairly robust optimum was identified at M ≈ 1.3 for the imaging tasks considered. The results suggested kVp selection in the range of ≈65-90 kVp, the lower end (65 kVp) maximizing subject contrast and the upper end maximizing NEQ (90 kVp). The analysis quantified fairly intuitive results, e.g., ≈0.1-0.2 mm pixel size (and a sharp reconstruction filter) optimal for high-frequency tasks (bone detail) compared to ≈0.4 mm pixel size (and a smooth reconstruction filter) for low-frequency (soft-tissue) tasks. This result suggests a specific protocol for 1 x 1 (full-resolution) projection data acquisition followed by full-resolution reconstruction with a sharp filter for high-frequency tasks along with 2 x 2 binning reconstruction with a smooth filter for low-frequency tasks. The analysis guided selection of specific source and detector components implemented on the proposed scanner. The analysis also quantified the potential benefits and points of diminishing return in focal spot size, reduced electronic noise, finer detector pixels, and low-dose limits of detectability. Theoretical results agreed quantitatively with the measured NPS and qualitatively with evaluation of cadaver images by a musculoskeletal radiologist. Conclusions: A fairly comprehensive model for 3D imaging performance in cone-beam CT combines factors of quantum noise, system geometry, anatomical background, and imaging task. The analysis provided a valuable, quantitative guide to design, optimization, and technique selection for a musculoskeletal extremities imaging system under development.
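
    For reference, the prewhitening-observer form of the detectability index commonly used in cascaded-systems analysis combines the task function with the NEQ; the paper's exact variant (e.g., with anatomical background folded into a generalized NEQ) may differ.

```latex
% Prewhitening-observer detectability index from the task function W_task
% and the (possibly generalized) noise-equivalent quanta NEQ:
d'^{\,2} = \iint \left| W_{\mathrm{task}}(f_x, f_y) \right|^{2}
           \mathrm{NEQ}(f_x, f_y)\, df_x\, df_y
```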

  2. Reactive transport modeling in the subsurface environment with OGS-IPhreeqc

    NASA Astrophysics Data System (ADS)

    He, Wenkui; Beyer, Christof; Fleckenstein, Jan; Jang, Eunseon; Kalbacher, Thomas; Naumov, Dimitri; Shao, Haibing; Wang, Wenqing; Kolditz, Olaf

    2015-04-01

    Worldwide, sustainable water resource management is becoming an increasingly challenging task due to population growth and the extensive application of fertilizer in agriculture. Moreover, climate change causes further stresses to both water quantity and quality. Reactive transport modeling in the coupled soil-aquifer system is a viable approach to assess the impacts of different land use and groundwater exploitation scenarios on water resources. However, the application of this approach is usually limited in spatial scale and to simplified geochemical systems due to the huge computational expense involved. Such computational expense is not only caused by solving the high non-linearity of the initial boundary value problems of water flow in the unsaturated zone numerically with rather fine spatial and temporal discretization for the correct mass balance and numerical stability, but also by the intensive computational task of quantifying geochemical reactions. In the present study, a flexible and efficient tool for large scale reactive transport modeling in variably saturated porous media and its applications are presented. The open source scientific software OpenGeoSys (OGS) is coupled with the IPhreeqc module of the geochemical solver PHREEQC. The new coupling approach makes full use of advantages from both codes: OGS provides a flexible choice of different numerical approaches for simulation of water flow in the vadose zone such as the pressure-based or mixed forms of Richards equation; whereas the IPhreeqc module leads to a simplification of data storage and its communication with OGS, which greatly facilitates the coupling and code updating. Moreover, a parallelization scheme with MPI (Message Passing Interface) is applied, in which the computational task of water flow and mass transport is partitioned through domain decomposition, whereas the efficient parallelization of geochemical reactions is achieved by smart allocation of computational workload over multiple compute nodes. The plausibility of the new coupling is verified by several benchmark tests. In addition, the efficiency of the new coupling approach is demonstrated by its application in a large scale scenario, in which the environmental fate of pesticides in a complex soil-aquifer system is studied.
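
    The described split, domain decomposition for flow and transport versus load-balanced distribution of chemistry, can be illustrated with a minimal mpi4py sketch; this shows the general idea only, not the OGS-IPhreeqc implementation.

```python
# Minimal mpi4py sketch: the root rank balances per-cell chemistry workload
# across ranks greedily, then scatters the cell lists (illustration only).
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    cost = np.random.default_rng(0).uniform(1, 10, 1000)  # per-cell cost
    buckets = [[] for _ in range(size)]
    loads = np.zeros(size)
    for cell in np.argsort(cost)[::-1]:   # heaviest cells placed first
        k = int(np.argmin(loads))
        buckets[k].append(int(cell))
        loads[k] += cost[cell]
else:
    buckets = None

my_cells = comm.scatter(buckets, root=0)
# ... solve geochemistry for my_cells, then exchange results ...
print(f"rank {rank}: {len(my_cells)} chemistry cells")
```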

  3. Reactive transport modeling in variably saturated porous media with OGS-IPhreeqc

    NASA Astrophysics Data System (ADS)

    He, W.; Beyer, C.; Fleckenstein, J. H.; Jang, E.; Kalbacher, T.; Shao, H.; Wang, W.; Kolditz, O.

    2014-12-01

    Worldwide, sustainable water resource management is becoming an increasingly challenging task due to population growth and the extensive application of fertilizer in agriculture. Moreover, climate change causes further stresses to both water quantity and quality. Reactive transport modeling in the coupled soil-aquifer system is a viable approach to assess the impacts of different land use and groundwater exploitation scenarios on water resources. However, the application of this approach is usually limited in spatial scale and to simplified geochemical systems due to the huge computational expense involved. Such computational expense is not only caused by solving the high non-linearity of the initial boundary value problems of water flow in the unsaturated zone numerically with rather fine spatial and temporal discretization for the correct mass balance and numerical stability, but also by the intensive computational task of quantifying geochemical reactions. In the present study, a flexible and efficient tool for large scale reactive transport modeling in variably saturated porous media and its applications are presented. The open source scientific software OpenGeoSys (OGS) is coupled with the IPhreeqc module of the geochemical solver PHREEQC. The new coupling approach makes full use of advantages from both codes: OGS provides a flexible choice of different numerical approaches for simulation of water flow in the vadose zone such as the pressure-based or mixed forms of Richards equation; whereas the IPhreeqc module leads to a simplification of data storage and its communication with OGS, which greatly facilitates the coupling and code updating. Moreover, a parallelization scheme with MPI (Message Passing Interface) is applied, in which the computational task of water flow and mass transport is partitioned through domain decomposition, whereas the efficient parallelization of geochemical reactions is achieved by smart allocation of computational workload over multiple compute nodes. The plausibility of the new coupling is verified by several benchmark tests. In addition, the efficiency of the new coupling approach is demonstrated by its application in a large scale scenario, in which the environmental fate of pesticides in a complex soil-aquifer system is studied.

  4. Quantifying Learning in Young Infants: Tracking Leg Actions During a Discovery-learning Task.

    PubMed

    Sargent, Barbara; Reimann, Hendrik; Kubo, Masayoshi; Fetters, Linda

    2015-06-01

    Task-specific actions emerge from spontaneous movement during infancy. It has been proposed that task-specific actions emerge through a discovery-learning process. Here a method is described in which 3-4 month old infants learn a task by discovery and their leg movements are captured to quantify the learning process. This discovery-learning task uses an infant activated mobile that rotates and plays music based on specified leg action of infants. Supine infants activate the mobile by moving their feet vertically across a virtual threshold. This paradigm is unique in that as infants independently discover that their leg actions activate the mobile, the infants' leg movements are tracked using a motion capture system allowing for the quantification of the learning process. Specifically, learning is quantified in terms of the duration of mobile activation, the position variance of the end effectors (feet) that activate the mobile, changes in hip-knee coordination patterns, and changes in hip and knee muscle torque. This information describes infant exploration and exploitation at the interplay of person and environmental constraints that support task-specific action. Subsequent research using this method can investigate how specific impairments of different populations of infants at risk for movement disorders influence the discovery-learning process for task-specific action.

  5. Sex differences on a computerized mental rotation task disappear with computer familiarization.

    PubMed

    Roberts, J E; Bell, M A

    2000-12-01

    The area of cognitive research that has produced the most consistent sex differences is spatial ability. In particular, men consistently perform better on mental rotation tasks than do women. This study examined the effects of familiarization with a computer on performance of a computerized two-dimensional mental rotation task. Two groups of college students (N=44) performed the rotation task, with one group performing a color-matching task that allowed them to be familiarized with the computer prior to the rotation task. Among the participants who only performed the rotation task, the 11 men performed better than the 11 women. Among the participants who performed the computer familiarization task before the rotation task, however, there were no sex differences on the mental rotation task between the 10 men and 12 women. These data indicate that sex differences on this two-dimensional task may reflect familiarization with the computer, not the mental rotation component of the task. Further research with larger samples and increased range of task difficulty is encouraged.

  6. Choosing MUSE: Validation of a Low-Cost, Portable EEG System for ERP Research

    PubMed Central

    Krigolson, Olave E.; Williams, Chad C.; Norton, Angela; Hassall, Cameron D.; Colino, Francisco L.

    2017-01-01

    In recent years there has been an increase in the number of portable low-cost electroencephalographic (EEG) systems available to researchers. However, to date the validation of the use of low-cost EEG systems has focused on continuous recording of EEG data and/or the replication of large system EEG setups reliant on event-markers to afford examination of event-related brain potentials (ERP). Here, we demonstrate that it is possible to conduct ERP research without being reliant on event markers using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system—one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite component to the feedback-related negativity) in the reward-learning task. Specifically, single sample t-tests of component existence (all p's < 0.05), computation of Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. We provide with this research paper an open source website with all the instructions, methods, and software to replicate our findings and to provide researchers with an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that with a single computer and a portable EEG system such as the MUSE one can conduct ERP research with ease thus greatly extending the possible use of the ERP methodology to a variety of novel contexts. PMID:28344546
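
    A bare-bones version of the ERP computation behind these analyses (epoch, baseline-correct, average, measure a component window) is sketched below; the sampling rate, windows, and single-channel layout are assumptions, not the study's pipeline.

```python
# Epoch continuous EEG around event onsets, baseline-correct, average, and
# measure the mean amplitude in a component window (synthetic data).
import numpy as np

def erp(eeg, events, fs, tmin=-0.2, tmax=0.6):
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = np.array([eeg[e - pre : e + post] for e in events])
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)   # baseline correct
    return epochs.mean(axis=0)

def mean_amplitude(wave, fs, t0, t1, tmin=-0.2):
    i0, i1 = int((t0 - tmin) * fs), int((t1 - tmin) * fs)
    return wave[i0:i1].mean()

fs = 256
rng = np.random.default_rng(4)
eeg = rng.standard_normal(fs * 60)                    # 1 min, one channel
oddball_events = rng.integers(fs, fs * 59, 40)        # stimulus onsets
wave = erp(eeg, oddball_events, fs)
print("P300 window mean:", mean_amplitude(wave, fs, 0.30, 0.50))
```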

  7. Choosing MUSE: Validation of a Low-Cost, Portable EEG System for ERP Research.

    PubMed

    Krigolson, Olave E; Williams, Chad C; Norton, Angela; Hassall, Cameron D; Colino, Francisco L

    2017-01-01

    In recent years there has been an increase in the number of portable low-cost electroencephalographic (EEG) systems available to researchers. However, to date the validation of the use of low-cost EEG systems has focused on continuous recording of EEG data and/or the replication of large system EEG setups reliant on event-markers to afford examination of event-related brain potentials (ERP). Here, we demonstrate that it is possible to conduct ERP research without being reliant on event markers using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system: one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite component to the feedback-related negativity) in the reward-learning task. Specifically, single sample t-tests of component existence (all p's < 0.05), computation of Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. We provide with this research paper an open source website with all the instructions, methods, and software to replicate our findings and to provide researchers with an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that with a single computer and a portable EEG system such as the MUSE one can conduct ERP research with ease, thus greatly extending the possible use of the ERP methodology to a variety of novel contexts.

  8. Passive motion paradigm: an alternative to optimal control.

    PubMed

    Mohan, Vishwanathan; Morasso, Pietro

    2011-01-01

    In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the "degrees of freedom (DoFs) problem," the common core of production, observation, reasoning, and learning of "actions." OCT, directly derived from engineering design techniques of control systems, quantifies task goals as "cost functions" and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative, "softer" approach, the passive motion paradigm (PMP), that we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that "animates" the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints "at runtime," hence solving the "DoFs problem" without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not only restricted to shaping motor output during action execution but also to provide the self with information on the feasibility, consequence, understanding and meaning of "potential actions." In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures.
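
    The attractor dynamics at the heart of PMP can be made concrete with a toy simulation, sketched below under stated assumptions (a 2-link planar arm with hand-picked link lengths and gains): the goal generates a force field at the hand, and joint motion follows the Jacobian transpose, with no explicit inverse kinematics and no cost function.

```python
# Toy passive motion paradigm on a 2-link planar arm (illustrative only).
import numpy as np

L1, L2 = 0.3, 0.25                       # link lengths in meters (assumed)

def fkine(q):
    """Hand position of the 2-link arm for joint angles q."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

q = np.array([0.5, 0.5])                 # initial joint configuration
goal = np.array([0.1, 0.45])             # reachable target for the hand
K, dt = 20.0, 0.01
for _ in range(500):                     # internal simulation "at runtime"
    force = K * (goal - fkine(q))        # attractor force field at the hand
    q = q + jacobian(q).T @ force * dt   # passive motion of the body schema
print("hand:", fkine(q).round(3), "goal:", goal)
```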

  9. MIST VR. A laparoscopic surgery procedures trainer and evaluator.

    PubMed

    Sutton, C; McCloy, R; Middlebrook, A; Chater, P; Wilson, M; Stone, R

    1997-01-01

    The key bimanual instrument tasks involved in laparoscopic surgery have been abstracted for use in a virtual reality surgical skills evaluator and trainer. The trainer uses two laparoscopic instruments mounted on a frame with position sensors which provide instrument movement data that is translated into interactive real time graphics on a PC (P133, 16 Mb RAM, graphics acceleration card). An accurately scaled operating volume of 10 cm³ is represented by a 3D cube on the computer screen. "Camera" position and size of target objects can be varied for different skill levels. Targets appear randomly within the operating volume according to the skill task and can be grasped and manipulated with the instruments. Accuracy and errors during the tasks and time to completion are logged. MIST VR has tutorial, training, examination, analysis and configuration modes. Six tasks have been selected and include combinations of instrument approach, target acquisition, target manipulation and placement, transfer between instruments, target contact with optional diathermy, and controlled instrument withdrawal/replacement. Tasks can be configured for varying degrees of difficulty and the configurations saved to a library for reuse. Specific task configurations can be assigned to individual students. In the examination mode the supervisor can select the tasks, repetitions and order and save to a specific file for that trainee. Progress can be assessed and there is the option for playback of the training session or examination. Data analyses permit overall, per-task, and right- or left-hand performances to be quantified. MIST VR represents a significant advance over the subjective assessment of training performances with existing "plastic box" basic trainers.

  10. Neural control of cursor trajectory and click by a human with tetraplegia 1000 days after implant of an intracortical microelectrode array

    NASA Astrophysics Data System (ADS)

    Simeral, J. D.; Kim, S.-P.; Black, M. J.; Donoghue, J. P.; Hochberg, L. R.

    2011-04-01

    The ongoing pilot clinical trial of the BrainGate neural interface system aims in part to assess the feasibility of using neural activity obtained from a small-scale, chronically implanted, intracortical microelectrode array to provide control signals for a neural prosthesis system. Critical questions include how long implanted microelectrodes will record useful neural signals, how reliably those signals can be acquired and decoded, and how effectively they can be used to control various assistive technologies such as computers and robotic assistive devices, or to enable functional electrical stimulation of paralyzed muscles. Here we examined these questions by assessing neural cursor control and BrainGate system characteristics on five consecutive days 1000 days after implant of a 4 × 4 mm array of 100 microelectrodes in the motor cortex of a human with longstanding tetraplegia subsequent to a brainstem stroke. On each of five prospectively-selected days we performed time-amplitude sorting of neuronal spiking activity, trained a population-based Kalman velocity decoding filter combined with a linear discriminant click state classifier, and then assessed closed-loop point-and-click cursor control. The participant performed both an eight-target center-out task and a random target Fitts metric task which was adapted from a human-computer interaction ISO standard used to quantify performance of computer input devices. The neural interface system was further characterized by daily measurement of electrode impedances, unit waveforms and local field potentials. Across the five days, spiking signals were obtained from 41 of 96 electrodes and were successfully decoded to provide neural cursor point-and-click control with a mean task performance of 91.3% ± 0.1% (mean ± s.d.) correct target acquisition. Results across five consecutive days demonstrate that a neural interface system based on an intracortical microelectrode array can provide repeatable, accurate point-and-click control of a computer interface to an individual with tetraplegia 1000 days after implantation of this sensor.
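
    The decoding approach named here, a Kalman filter mapping binned spike counts to cursor velocity, can be sketched generically; all matrices and dimensions below are invented for illustration, not the trained BrainGate filter.

```python
# Skeleton of Kalman-filter velocity decoding from binned spike counts.
import numpy as np

rng = np.random.default_rng(5)
n_units = 40
A = np.eye(2) * 0.95                      # velocity dynamics model
W = np.eye(2) * 0.03                      # process noise covariance
H = rng.standard_normal((n_units, 2))     # tuning: counts ~ H @ velocity
Q = np.eye(n_units)                       # observation noise covariance

v, P, cursor = np.zeros(2), np.eye(2), np.zeros(2)
true_v = np.array([0.5, -0.2])
for _ in range(200):                      # one spike-count bin per iteration
    z = H @ true_v + rng.standard_normal(n_units)   # binned spike counts
    v, P = A @ v, A @ P @ A.T + W                   # predict
    S = H @ P @ H.T + Q
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    v = v + K @ (z - H @ v)                         # update
    P = (np.eye(2) - K @ H) @ P
    cursor += v * 0.02                              # integrate to position
print("decoded velocity:", v.round(2), "true:", true_v)
```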

  11. Neural control of cursor trajectory and click by a human with tetraplegia 1000 days after implant of an intracortical microelectrode array

    PubMed Central

    Simeral, J D; Kim, S-P; Black, M J; Donoghue, J P; Hochberg, L R

    2013-01-01

    The ongoing pilot clinical trial of the BrainGate neural interface system aims in part to assess the feasibility of using neural activity obtained from a small-scale, chronically implanted, intracortical microelectrode array to provide control signals for a neural prosthesis system. Critical questions include how long implanted microelectrodes will record useful neural signals, how reliably those signals can be acquired and decoded, and how effectively they can be used to control various assistive technologies such as computers and robotic assistive devices, or to enable functional electrical stimulation of paralyzed muscles. Here we examined these questions by assessing neural cursor control and BrainGate system characteristics on five consecutive days 1000 days after implant of a 4 × 4 mm array of 100 microelectrodes in the motor cortex of a human with longstanding tetraplegia subsequent to a brainstem stroke. On each of five prospectively-selected days we performed time-amplitude sorting of neuronal spiking activity, trained a population-based Kalman velocity decoding filter combined with a linear discriminant click state classifier, and then assessed closed-loop point-and-click cursor control. The participant performed both an eight-target center-out task and a random target Fitts metric task which was adapted from a human-computer interaction ISO standard used to quantify performance of computer input devices. The neural interface system was further characterized by daily measurement of electrode impedances, unit waveforms and local field potentials. Across the five days, spiking signals were obtained from 41 of 96 electrodes and were successfully decoded to provide neural cursor point-and-click control with a mean task performance of 91.3% ± 0.1% (mean ± s.d.) correct target acquisition. Results across five consecutive days demonstrate that a neural interface system based on an intracortical microelectrode array can provide repeatable, accurate point-and-click control of a computer interface to an individual with tetraplegia 1000 days after implantation of this sensor. PMID:21436513

  12. Graph-based similarity concepts in virtual screening.

    PubMed

    Hutter, Michael C

    2011-03-01

    Applying similarity to find promising new compounds is a key issue in drug design. However, quantifying similarity between molecules has remained a difficult task despite numerous approaches. Here, some general aspects along with recent developments regarding similarity criteria are collected. For the purpose of virtual screening, the compounds have to be encoded into a computer-readable format that permits a comparison, according to given similarity criteria, comprising the use of the 3D structure, fingerprints, graph-based and alignment-based approaches. Whereas finding the most common substructures is the most obvious method, more recent approaches take into account chemical modifications that appear throughout existing drugs, from various therapeutic categories and targets.
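
    For the fingerprint route specifically, a minimal example with RDKit (the example molecules are arbitrary):

```python
# Morgan (circular) fingerprints compared with the Tanimoto coefficient,
# a standard fingerprint-based similarity measure. Requires RDKit.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

aspirin = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
salicylic = Chem.MolFromSmiles("Oc1ccccc1C(=O)O")

fp1 = AllChem.GetMorganFingerprintAsBitVect(aspirin, 2, nBits=2048)
fp2 = AllChem.GetMorganFingerprintAsBitVect(salicylic, 2, nBits=2048)

print(f"Tanimoto similarity: {DataStructs.TanimotoSimilarity(fp1, fp2):.2f}")
```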

  13. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.
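
    The general recipe evaluated in the paper, tile-level classification with multiresolution features, might look roughly as follows; the features, classifier, and toy labels are assumptions for illustration.

```python
# Toy tile-level necrosis classification with simple multiresolution
# intensity features (not the paper's feature sets or postprocessing).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def tile_features(tile):
    """Mean/std intensity at three resolution levels of a grayscale tile."""
    feats, t = [], tile
    for _ in range(3):
        feats += [t.mean(), t.std()]
        t = t[::2, ::2]                        # cheap downsampling
    return feats

rng = np.random.default_rng(6)
tiles = rng.uniform(0, 1, (500, 64, 64))
labels = rng.integers(0, 2, 500)               # 1 = necrotic (toy labels)
X = np.array([tile_features(t) for t in tiles])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

necrosis_fraction = clf.predict(X).mean()      # fraction of necrotic tiles
print(f"necrosis fraction: {necrosis_fraction:.2%}")
```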

  14. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    NASA Astrophysics Data System (ADS)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize cloud computing task scheduling, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is established and a fitness function is defined for it; the fitness function is then optimized with the improved differential evolution algorithm, in which a generation-dependent dynamic selection strategy and a dynamic mutation strategy ensure both global and local search ability. Performance tests were carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving effective optimal scheduling of cloud computing tasks.
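
    For reference, a plain (unimproved) differential evolution applied to task-to-VM assignment, minimizing makespan, is sketched below; the encoding and DE settings are common defaults, not the paper's proposed improvements.

```python
# Baseline differential evolution for task-to-VM assignment (illustrative).
import numpy as np

rng = np.random.default_rng(7)
n_tasks, n_vms = 30, 5
length = rng.uniform(1, 10, n_tasks)              # task lengths
speed = rng.uniform(1, 3, n_vms)                  # VM processing speeds

def makespan(x):
    """Decode a continuous vector into VM indices and return the makespan."""
    assign = np.clip(x.astype(int), 0, n_vms - 1)
    return max(length[assign == k].sum() / speed[k] for k in range(n_vms))

NP, F, CR, generations = 40, 0.6, 0.9, 200        # classic DE settings
pop = rng.uniform(0, n_vms, (NP, n_tasks))
fit = np.array([makespan(ind) for ind in pop])
for _ in range(generations):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        trial = np.where(rng.random(n_tasks) < CR, a + F * (b - c), pop[i])
        f = makespan(trial)
        if f <= fit[i]:                           # greedy selection
            pop[i], fit[i] = trial, f
print(f"best makespan: {fit.min():.2f}")
```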

  15. Use of Terrestrial Laser Scanner for Rigid Airport Pavement Management.

    PubMed

    Barbarella, Maurizio; D'Amico, Fabrizio; De Blasiis, Maria Rosaria; Di Benedetto, Alessandro; Fiani, Margherita

    2017-12-26

    The evaluation of the structural efficiency of airport infrastructures is a complex task. Faulting is one of the most important indicators of rigid pavement performance. The aim of our study is to provide a new method for faulting detection and computation on jointed concrete pavements. Nowadays, the assessment of faulting is performed with the use of laborious and time-consuming measurements that strongly hinder aircraft traffic. We proposed a field procedure for Terrestrial Laser Scanner data acquisition and a computation flow chart in order to identify and quantify the fault size at each joint of apron slabs. The total point cloud has been used to compute the least-squares plane fitting those points. The best-fit plane for each slab has also been computed. The attitude of each slab plane with respect to both the adjacent ones and the apron reference plane has been determined by the normal vectors to the surfaces. Faulting has been evaluated as the difference in elevation between the slab planes along chosen sections. For a more accurate evaluation of the faulting value, we have then considered a few strips of data covering rectangular areas of different sizes across the joints. The accuracy of the estimated quantities has also been assessed.
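
    The plane-fitting core of the workflow can be sketched as follows, with synthetic slab geometry:

```python
# Least-squares plane fit per slab (via SVD) and faulting as the elevation
# difference of the adjacent slab planes at the joint.
import numpy as np

def fit_plane(pts):
    """Return centroid and unit normal of the best-fit plane."""
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[2]            # right singular vector, smallest s.v.

def elevation_at(centroid, normal, xy):
    """Solve n . (p - c) = 0 for z at horizontal position xy."""
    nx, ny, nz = normal
    return centroid[2] - (nx * (xy[0] - centroid[0]) +
                          ny * (xy[1] - centroid[1])) / nz

rng = np.random.default_rng(8)
xy = rng.uniform(0, 5, (2000, 2))                 # coordinates in meters
slab_a = np.c_[xy, 0.002 * xy[:, 0] + rng.normal(0, 0.001, 2000)]
slab_b = np.c_[xy + [5, 0], 0.020 + rng.normal(0, 0.001, 2000)]  # higher slab

joint_point = np.array([5.0, 2.5])    # a point on the joint between slabs
za = elevation_at(*fit_plane(slab_a), joint_point)
zb = elevation_at(*fit_plane(slab_b), joint_point)
print(f"faulting at joint: {abs(za - zb) * 1000:.1f} mm")  # ~10 mm
```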

  16. Influence of computer work under time pressure on cardiac activity.

    PubMed

    Shi, Ping; Hu, Sijung; Yu, Hongliu

    2015-03-01

    Computer users are often under stress when required to complete computer work within a set time. Work stress has repeatedly been associated with an increased risk for cardiovascular disease. The present study examined the effects of time pressure workload during computer tasks on cardiac activity in 20 healthy subjects. Heart rate, time domain and frequency domain indices of heart rate variability (HRV) and Poincaré plot parameters were compared among five computer tasks and two rest periods. Faster heart rates and a decreased standard deviation of R-R intervals were noted in response to computer tasks under time pressure. The Poincaré plot parameters showed significant differences between different levels of time pressure workload during computer tasks, and between computer tasks and the rest periods. In contrast, no significant differences were identified for the frequency domain indices of HRV. The results suggest that the quantitative Poincaré plot analysis used in this study was able to reveal the intrinsic nonlinear nature of the autonomically regulated cardiac rhythm. Specifically, heightened vagal tone occurred during the computer tasks without time pressure. In contrast, the stressful computer tasks with added time pressure stimulated cardiac sympathetic activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
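
    The Poincaré descriptors referred to are typically SD1 and SD2, computed from successive R-R intervals as below; the synthetic series only illustrate the direction of the reported effect.

```python
# Standard Poincare plot descriptors from an R-R interval series (ms):
# SD1 = spread across the identity line (short-term, vagally mediated),
# SD2 = spread along the identity line (longer-term variability).
import numpy as np

def poincare_sd(rr):
    x, y = rr[:-1], rr[1:]
    sd1 = np.std(y - x, ddof=1) / np.sqrt(2)
    sd2 = np.std(y + x, ddof=1) / np.sqrt(2)
    return sd1, sd2

rng = np.random.default_rng(9)
rr_rest = 850 + rng.normal(0, 40, 300)          # relaxed: more variability
rr_stress = 700 + rng.normal(0, 15, 300)        # time pressure: less
for name, rr in [("rest", rr_rest), ("stress", rr_stress)]:
    sd1, sd2 = poincare_sd(rr)
    print(f"{name}: SD1={sd1:.1f} ms, SD2={sd2:.1f} ms")
```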

  17. The effects of dual tasking on gait synchronization during over-ground side-by-side walking.

    PubMed

    Zivotofsky, Ari Z; Bernad-Elazari, Hagar; Grossman, Pnina; Hausdorff, Jeffrey M

    2018-06-01

    Recent studies have shown that gait synchronization during natural walking is not merely anecdotal but a repeatable, quantifiable phenomenon apparently related to available sensory feedback modalities. However, the mechanisms underlying this phase-locking of gait have only recently begun to be investigated. For example, it is not known what role, if any, attention plays. We employed a dual tasking paradigm in order to investigate the role attention plays in gait synchronization. Sixteen pairs of subjects walked under six conditions that manipulated the available sensory feedback and the degree of difficulty of the dual task, i.e., the attentional demand. Movement was quantified using a trunk-mounted tri-axial accelerometer. A gait synchronization index (GSI) was calculated in order to quantify the degree of synchronization of the gait pattern. A simple dual task resulted in an increased level of synchronization, whereas a more complex dual task led to a reduction in synchronization. Handholding increased synchronization, compared to the same attention condition without handholding. These results indicate that in order for two walkers to synchronize, some level of attention is apparently required, such that a relatively complex dual task utilizes enough attentional resources to reduce the occurrence of synchronization. Copyright © 2018 Elsevier B.V. All rights reserved.
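
    One plausible way to compute a GSI from two trunk-mounted accelerometer signals is phase-locking of Hilbert phases, as sketched below; the paper's exact definition may differ, so treat this as an assumption.

```python
# Phase-locking formulation of a gait synchronization index: Hilbert phases
# of the two walkers' signals, then the mean resultant length of their
# phase difference (0 = no synchronization, 1 = perfectly locked).
import numpy as np
from scipy.signal import hilbert

def gait_sync_index(acc_a, acc_b):
    pha = np.angle(hilbert(acc_a - acc_a.mean()))
    phb = np.angle(hilbert(acc_b - acc_b.mean()))
    return np.abs(np.mean(np.exp(1j * (pha - phb))))

fs = 100
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(10)
walker_a = np.sin(2 * np.pi * 1.8 * t) + 0.3 * rng.standard_normal(t.size)
walker_b = np.sin(2 * np.pi * 1.8 * t + 0.4) + 0.3 * rng.standard_normal(t.size)
print(f"GSI, matched cadence: {gait_sync_index(walker_a, walker_b):.2f}")
```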

  18. Describing functional diversity of brain regions and brain networks

    PubMed Central

    Anderson, Michael L.; Kinnison, Josh; Pessoa, Luiz

    2013-01-01

    Despite the general acceptance that functional specialization plays an important role in brain function, there is little consensus about its extent in the brain. We sought to advance the understanding of this question by employing a data-driven approach that capitalizes on the existence of large databases of neuroimaging data. We quantified the diversity of activation in brain regions as a way to characterize the degree of functional specialization. To do so, brain activations were classified in terms of task domains, such as vision, attention, and language, which determined a region’s functional fingerprint. We found that the degree of diversity varied considerably across the brain. We also quantified novel properties of regions and of networks that inform our understanding of several task-positive and task-negative networks described in the literature, including defining functional fingerprints for entire networks and measuring their functional assortativity, namely the degree to which they are composed of regions with similar functional fingerprints. Our results demonstrate that some brain networks exhibit strong assortativity, whereas other networks consist of relatively heterogeneous parts. In sum, rather than characterizing the contributions of individual brain regions using task-based functional attributions, we instead quantified their dispositional tendencies, and related those to each region’s affiliative properties in both task-positive and task-negative contexts. PMID:23396162
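
    One natural way to turn a functional fingerprint into a single diversity number is normalized entropy over task domains; the paper's exact measure is not reproduced here, so treat the formulation below as an assumption.

```python
# Diversity of a region's functional fingerprint as normalized Shannon
# entropy of its activation counts over task domains (toy fingerprints).
import numpy as np

def diversity(fingerprint):
    p = np.asarray(fingerprint, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(len(fingerprint))

domains = ["vision", "attention", "language", "memory", "emotion"]
specialized_region = [120, 3, 2, 1, 1]     # dominated by one domain
diverse_region = [25, 22, 28, 24, 26]      # engaged across domains
print(f"specialized: {diversity(specialized_region):.2f}")
print(f"diverse:     {diversity(diverse_region):.2f}")
```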

  19. Prestimulus influences on auditory perception from sensory representations and decision processes.

    PubMed

    Kayser, Stephanie J; McNair, Steven W; Kayser, Christoph

    2016-04-26

    The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task.

  20. Prestimulus influences on auditory perception from sensory representations and decision processes

    PubMed Central

    McNair, Steven W.

    2016-01-01

    The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task. PMID:27071110

  1. Significance of noisy signals in periodograms

    NASA Astrophysics Data System (ADS)

    Süveges, Maria

    2015-08-01

    The detection of tiny periodic signals in noisy and irregularly sampled time series is a challenging task. Once a small peak is found in the periodogram, the next step is to see how probable it is that pure noise produced a peak so extreme, that is, to compute its False Alarm Probability (FAP). This useful measure quantifies the statistical plausibility of the detected signal amid the noise. However, its derivation from statistical principles is very hard due to the specificities of astronomical periodograms, such as oversampling and the ensuing strong correlation among its values at different frequencies. I will present a method to compute the FAP based on extreme-value statistics (Süveges 2014), and compare it to two other methods, proposed by Baluev (2008) and by Paltani (2004) and Schwarzenberg-Czerny (2012), on signals with various signal shapes and at different signal-to-noise ratios.
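
    In practice, a peak's FAP can be estimated directly with astropy's Lomb-Scargle implementation, which includes the Baluev bound among its methods (the timings and the injected signal below are synthetic):

```python
# FAP of the highest periodogram peak for an irregularly sampled series.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(11)
t = np.sort(rng.uniform(0, 100, 150))              # irregular sampling
y = 0.05 * np.sin(2 * np.pi * t / 7.3) + rng.normal(0, 0.2, t.size)

ls = LombScargle(t, y)
freq, power = ls.autopower()
fap = ls.false_alarm_probability(power.max(), method="baluev")
print(f"peak at P={1 / freq[np.argmax(power)]:.2f} d, FAP={fap:.3g}")
```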

  2. Respiratory sinus arrhythmia responses to cognitive tasks: effects of task factors and RSA indices.

    PubMed

    Overbeek, Thérèse J M; van Boxtel, Anton; Westerink, Joyce H D M

    2014-05-01

    Many studies show that respiratory sinus arrhythmia (RSA) decreases while performing cognitive tasks. However, there is uncertainty about the role of contaminating factors such as physical activity and stress-inducing task variables. Different methods to quantify RSA may also contribute to variable results. In 83 healthy subjects, we studied RSA responses to a working memory task requiring varying levels of cognitive control and a perceptual attention task not requiring strong cognitive control. RSA responses were quantified in the time and frequency domain and were additionally corrected for differences in mean interbeat interval and respiration rate, resulting in eight different RSA indices. The two tasks were clearly differentiated by heart rate and facial EMG reference measures. Cognitive control induced inhibition of RSA whereas perceptual attention generally did not. However, the results show several differences between different RSA indices, emphasizing the importance of methodological variables. Age and sex did not influence the results. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Quantification of signal detection performance degradation induced by phase-retrieval in propagation-based x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Chou, Cheng-Ying; Anastasio, Mark A.

    2016-04-01

    In propagation-based X-ray phase-contrast (PB XPC) imaging, the measured image contains a mixture of absorption- and phase-contrast. To obtain separate images of the projected absorption and phase (i.e., refractive) properties of a sample, phase retrieval methods can be employed. It has been suggested that phase-retrieval can always improve image quality in PB XPC imaging. However, when objective (task-based) measures of image quality are employed, this is not necessarily true and phase retrieval can be detrimental. In this work, signal detection theory is utilized to quantify the performance of a Hotelling observer (HO) for detecting a known signal in a known background. Two cases are considered. In the first case, the HO acts directly on the measured intensity data. In the second case, the HO acts on either the retrieved phase or absorption image. We demonstrate that the performance of the HO is superior when acting on the measured intensity data. The loss of task-specific information induced by phase-retrieval is quantified by computing the efficiency of the HO as the ratio of the test statistic signal-to-noise ratio (SNR) for the two cases. The effect of the system geometry on this efficiency is systematically investigated. Our findings confirm that phase-retrieval can impair signal detection performance in XPC imaging.
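
    For concreteness, the standard Hotelling observer quantities referenced above are, for a known difference signal and Gaussian data covariance, as follows; expressing the efficiency as a ratio of squared SNRs is an assumption about the abstract's convention.

```latex
% Hotelling observer test statistic and SNR for known difference signal
% \Delta\bar{g} and data covariance K_g, and the efficiency quantifying
% the task-specific information lost to phase retrieval:
t(\mathbf{g}) = \Delta\bar{\mathbf{g}}^{\,T} \mathbf{K}_g^{-1}\, \mathbf{g},
\qquad
\mathrm{SNR}^2_{\mathrm{HO}} =
  \Delta\bar{\mathbf{g}}^{\,T} \mathbf{K}_g^{-1}\, \Delta\bar{\mathbf{g}},
\qquad
\epsilon = \frac{\mathrm{SNR}^2_{\mathrm{HO,\ retrieved}}}
                {\mathrm{SNR}^2_{\mathrm{HO,\ intensity}}}
```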

  4. Quantifying transfer after perceptual-motor sequence learning: how inflexible is implicit learning?

    PubMed

    Sanchez, Daniel J; Yarnik, Eric N; Reber, Paul J

    2015-03-01

    Studies of implicit perceptual-motor sequence learning have often shown learning to be inflexibly tied to the training conditions during learning. Since sequence learning is seen as a model task of skill acquisition, limits on the ability to transfer knowledge from the training context to a performance context indicate important constraints on skill learning approaches. Lack of transfer across contexts has been demonstrated by showing that when task elements are changed following training, this leads to a disruption in performance. These results have typically been taken as suggesting that the sequence knowledge relies on integrated representations across task elements (Abrahamse, Jiménez, Verwey, & Clegg, Psychon Bull Rev 17:603-623, 2010a). Using a relatively new sequence learning task, serial interception sequence learning, three experiments are reported that quantify the magnitude of performance disruption after selectively manipulating individual aspects of motor performance or perceptual information. In Experiment 1, selective disruption of the timing or order of sequential actions was examined using a novel response manipulandum that allowed for separate analysis of these two motor response components. In Experiments 2 and 3, transfer was examined after selective disruption of perceptual information that left the motor response sequence intact. All three experiments provided quantifiable estimates of partial transfer to novel contexts that suggest some level of information integration across task elements. However, the ability to identify quantifiable levels of successful transfer indicates that integration is not all-or-none and that measurement sensitivity is key to understanding sequence knowledge representations.

  5. Processing of Numerical and Proportional Quantifiers

    ERIC Educational Resources Information Center

    Shikhare, Sailee; Heim, Stefan; Klein, Elise; Huber, Stefan; Willmes, Klaus

    2015-01-01

    Quantifier expressions like "many" and "at least" are part of a rich repository of words in language representing magnitude information. The role of numerical processing in comprehending quantifiers was studied in a semantic truth value judgment task, asking adults to quickly verify sentences about visual displays using…

  6. Computer usage and task-switching during resident's working day: Disruptive or not?

    PubMed

    Méan, Marie; Garnier, Antoine; Wenger, Nathalie; Castioni, Julien; Waeber, Gérard; Marques-Vidal, Pedro

    2017-01-01

    Recent implementation of electronic health records (EHR) has dramatically changed medical ward organization. While residents in general internal medicine use EHR systems half of their working time, whether computer usage impacts residents' workflow remains uncertain. We aimed to observe the frequency of task-switches occurring during residents' work and to assess whether computer usage was associated with task-switching. In a large Swiss academic university hospital, we conducted, between May 26 and July 24, 2015, a time-motion study to assess how residents in general internal medicine organize their working day. We observed 49 day and 17 evening shifts of 36 residents, amounting to 697 working hours. During day shifts, residents spent 5.4 hours using a computer (mean total working time: 11.6 hours per day). On average, residents switched 15 times per hour from one task to another. Task-switching peaked between 8:00-9:00 and 16:00-17:00. Task-switching was not associated with residents' characteristics, and no association was found between task-switching and extra hours (Spearman r = 0.220, p = 0.137 for day and r = 0.483, p = 0.058 for evening shifts). Computer usage occurred more frequently at the beginning or end of day shifts and was associated with decreased overall task-switching. Task-switching occurs very frequently during residents' working day. Despite the fact that residents used a computer half of their working time, computer usage was associated with decreased task-switching. Whether frequent task-switches and computer usage impact the quality of patient care and residents' work must be evaluated in further studies.

  7. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.

  8. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    PubMed

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns a different priority value to every task node based on its scheduling order as affected by the constraint relations among task nodes, and the task node list is generated from these priority values. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of each task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first, as the completion time of the task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task makespan in most cases and meet a high-quality performance objective.
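
    The core of the essential-path idea, the longest computation-plus-communication path through a task DAG, can be sketched as below; the dynamic, mid-schedule recomputation described in the paper is not reproduced.

```python
# Longest (critical) path through a task DAG from per-node computation
# costs and per-edge communication costs, used to order ready task nodes.
from functools import lru_cache

comp = {"A": 3, "B": 2, "C": 4, "D": 1}            # computation cost
succ = {"A": {"B": 1, "C": 2}, "B": {"D": 3}, "C": {"D": 1}, "D": {}}

@lru_cache(maxsize=None)
def essential_path(node):
    """Longest-cost path starting at node (computation + communication)."""
    tails = [comm + essential_path(nxt) for nxt, comm in succ[node].items()]
    return comp[node] + (max(tails) if tails else 0)

# schedule nodes with the longest essential path first
print(sorted(comp, key=essential_path, reverse=True))
# A: 3 + max(1 + 6, 2 + 6) = 11; B and C both 6; D is 1
```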

  9. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path

    PubMed Central

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns a different priority value to every task node based on its scheduling order as affected by the constraint relations among task nodes, and the task node list is generated from these priority values. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of each task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first, as the completion time of the task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task makespan in most cases and meet a high-quality performance objective. PMID:27490901

  10. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    ERIC Educational Resources Information Center

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  11. Monitoring task loading with multivariate EEG measures during complex forms of human-computer interaction

    NASA Technical Reports Server (NTRS)

    Smith, M. E.; Gevins, A.; Brown, H.; Karnik, A.; Du, R.

    2001-01-01

    Electroencephalographic (EEG) recordings were made while 16 participants performed versions of a personal-computer-based flight simulation task of low, moderate, or high difficulty. As task difficulty increased, frontal midline theta EEG activity increased and alpha band activity decreased. A participant-specific function that combined multiple EEG features to create a single load index was derived from a sample of each participant's data and then applied to new test data from that participant. Index values were computed for every 4 s of task data. Across participants, mean task load index values increased systematically with increasing task difficulty and differed significantly between the different task versions. Actual or potential applications of this research include the use of multivariate EEG-based methods to monitor task loading during naturalistic computer-based work.
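
    An index of this kind might be built as follows: per-epoch theta and alpha band powers combined by a participant-specific function fit on calibration blocks. The bands, epoch length, and logistic combiner below are assumptions, not the study's derivation.

```python
# Per-epoch task load index from frontal theta and alpha band power,
# combined by a participant-specific logistic function (synthetic data).
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

def band_power(epoch, fs, lo, hi):
    f, pxx = welch(epoch, fs=fs, nperseg=2 * fs)
    return pxx[(f >= lo) & (f <= hi)].mean()

def epoch_features(epoch, fs):
    return [np.log(band_power(epoch, fs, 4, 7)),    # frontal midline theta
            np.log(band_power(epoch, fs, 8, 12))]   # alpha

fs = 128
rng = np.random.default_rng(12)
epochs = rng.standard_normal((60, fs * 4))          # 4 s calibration epochs
labels = np.r_[np.zeros(30), np.ones(30)]           # 0 = easy, 1 = hard block
X = np.array([epoch_features(e, fs) for e in epochs])
index_fn = LogisticRegression().fit(X, labels)

new_epoch = rng.standard_normal(fs * 4)             # every 4 s of task data
load_index = index_fn.predict_proba([epoch_features(new_epoch, fs)])[0, 1]
print(f"task load index: {load_index:.2f}")
```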

  12. Deep learning based classification of breast tumors with shear-wave elastography.

    PubMed

    Zhang, Qi; Xiao, Yang; Dai, Wei; Suo, Jingfeng; Wang, Congzhi; Shi, Jun; Zheng, Hairong

    2016-12-01

    This study aims to build a deep learning (DL) architecture for automated extraction of learned-from-data image features from shear-wave elastography (SWE), and to evaluate the DL architecture in differentiating between benign and malignant breast tumors. We construct a two-layer DL architecture for SWE feature extraction, composed of a point-wise gated Boltzmann machine (PGBM) and a restricted Boltzmann machine (RBM). The PGBM contains task-relevant and task-irrelevant hidden units, and the task-relevant units are connected to the RBM. Experimental evaluation was performed with five-fold cross-validation on a set of 227 SWE images (135 of benign tumors and 92 of malignant tumors) from 121 patients. The features learned with our DL architecture were compared with statistical features quantifying image intensity and texture. Results showed that the DL features achieved better classification performance, with an accuracy of 93.4%, a sensitivity of 88.6%, a specificity of 97.1%, and an area under the receiver operating characteristic curve of 0.947. The DL-based method integrates feature learning with feature selection on SWE. It may potentially be used in clinical computer-aided diagnosis of breast cancer. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Validated Predictions of Metabolic Energy Consumption for Submaximal Effort Movement

    PubMed Central

    Tsianos, George A.; MacFadden, Lisa N.

    2016-01-01

    Physical performance emerges from complex interactions among many physiological systems that are largely driven by the metabolic energy demanded. Quantifying metabolic demand is an essential step for revealing the many mechanisms of physical performance decrement, but accurate predictive models do not exist. The goal of this study was to investigate if a recently developed model of muscle energetics and force could be extended to reproduce the kinematics, kinetics, and metabolic demand of submaximal effort movement. Upright dynamic knee extension against various levels of ergometer load was simulated. Task energetics were estimated by combining the model of muscle contraction with validated models of lower limb musculotendon paths and segment dynamics. A genetic algorithm was used to compute the muscle excitations that reproduced the movement with the lowest energetic cost, which was determined to be an appropriate criterion for this task. Model predictions of oxygen uptake rate (VO2) were well within experimental variability for the range over which the model parameters were confidently known. The model's accurate estimates of metabolic demand make it useful for assessing the likelihood and severity of physical performance decrement for a given task as well as investigating underlying physiologic mechanisms. PMID:27248429

  14. Investigation of the impact of main control room digitalization on operators cognitive reliability in nuclear power plants.

    PubMed

    Zhou, Yong; Mu, Haiying; Jiang, Jianjun; Zhang, Li

    2012-01-01

    Currently, there is a trend in nuclear power plants (NPPs) toward introducing digital and computer technologies into main control rooms (MCRs). Safe generation of electric power in NPPs requires reliable performance of cognitive tasks such as fault detection, diagnosis, and response planning. The digitalization of MCRs has dramatically changed the whole operating environment and the ways operators interact with the plant systems. If the design and implementation of the digital technology are incompatible with operators' cognitive characteristics, they may have negative effects on operators' cognitive reliability. First, on the basis of three essential prerequisites for successful cognitive tasks, a causal model is constructed to reveal the typical human performance issues arising from digitalization. The cognitive mechanisms by which these issues impact cognitive reliability are analyzed in detail. Then, Bayesian inference is used to quantify and prioritize the influences of these factors. The results suggest that interface management and unbalanced workload distribution have the most significant impacts on operators' cognitive reliability.

  15. The potential of virtual reality and gaming to assist successful aging with disability.

    PubMed

    Lange, B S; Requejo, P; Flynn, S M; Rizzo, A A; Valero-Cuevas, F J; Baker, L; Winstein, C

    2010-05-01

    Using advances in computing power and in software and hardware technologies, virtual reality (VR) and gaming applications have the potential to address clinical challenges for a range of disabilities. VR-based games can potentially provide the ability to assess and augment cognitive and motor rehabilitation under a range of stimulus conditions that are not easily controllable and quantifiable in the real world. This article discusses an approach for maximizing function and participation for those aging with and into a disability by combining task-specific training with advances in VR and gaming technologies to enable positive behavioral modifications for independence in the home and community. There is potential for the use of VR and game applications for rehabilitating, maintaining, and enhancing those processes that are affected by aging with and into disability, particularly the need to attain a balance in the interplay between sensorimotor function and cognitive demands and to reap the benefits of task-specific training and regular physical activity and exercise.

  16. Task allocation in a distributed computing system

    NASA Technical Reports Server (NTRS)

    Seward, Walter D.

    1987-01-01

    A conceptual framework is examined for task allocation in distributed systems. Application and computing system parameters critical to task allocation decision processes are discussed. Task allocation techniques are addressed which focus on achieving a balance in the load distribution among the system's processors. Equalization of computing load among the processing elements is the goal. Examples of system performance are presented for specific applications. Both static and dynamic allocation of tasks are considered and system performance is evaluated using different task allocation methodologies.

  17. Measuring Social Motivation Using Signal Detection and Reward Responsiveness.

    PubMed

    Chevallier, Coralie; Tonge, Natasha; Safra, Lou; Kahn, David; Kohls, Gregor; Miller, Judith; Schultz, Robert T

    2016-01-01

    Recent trends in psychiatry have emphasized the need for a shift from categorical to dimensional approaches. Of critical importance to this transformation is the availability of tools to objectively quantify behaviors dimensionally. The present study focuses on social motivation, a dimension of behavior that is central to a range of psychiatric conditions but for which a particularly small number of assays currently exist. In Study 1 (N = 48), healthy adults completed a monetary reward task and a social reward task, followed by completion of the Chapman Physical and Social Anhedonia Scales. In Study 2 (N = 26), an independent sample was recruited to assess the robustness of Study 1's findings. The reward tasks were analyzed using signal detection theory to quantify how much reward cues bias participants' responses. In both Study 1 and Study 2, social anhedonia scores were negatively correlated with change in response bias in the social reward task but not in the monetary reward task. A median split on social anhedonia scores confirmed that participants with high social anhedonia showed less change in response bias in the social reward task compared to participants with low social anhedonia. This study confirms that social anhedonia selectively affects how much an individual changes their behavior based on the presence of socially rewarding cues and establishes a tool to quantify social reward responsiveness dimensionally.
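    For readers unfamiliar with the signal-detection quantities involved, the sketch below computes the standard sensitivity (d') and criterion (c) measures from response counts; a change in a bias measure of this general kind, under reward cues, is what the study tracks. The specific bias statistic used by the authors and the counts below are assumptions.

    ```python
    # Signal detection: sensitivity (d') and response bias (criterion c).
    from scipy.stats import norm

    def sdt_measures(hits, misses, false_alarms, correct_rejections):
        # Log-linear correction keeps rates strictly between 0 and 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
        d_prime = z_h - z_f            # sensitivity
        criterion = -(z_h + z_f) / 2   # bias: negative = liberal responding
        return d_prime, criterion

    print(sdt_measures(hits=30, misses=10, false_alarms=12, correct_rejections=28))
    ```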

  18. Primary School Children's Collaboration: Task Presentation and Gender Issues.

    ERIC Educational Resources Information Center

    Fitzpatrick, Helen; Hardman, Margaret

    2000-01-01

    Explores the characteristics of social interaction during an English-language-based task in the primary classroom, and the role of the computer in structuring collaboration when compared to a non-computer mode. Explains that seven- and nine-year-old boys and girls (n=120) completed a computer and a non-computer task. (CMK)

  19. A scientific workflow framework for (13)C metabolic flux analysis.

    PubMed

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Quantifying transfer after perceptual-motor sequence learning: how inflexible is implicit learning?

    PubMed Central

    Sanchez, Daniel J.; Yarnik, Eric N.

    2015-01-01

    Studies of implicit perceptual-motor sequence learning have often shown learning to be inflexibly tied to the training conditions during learning. Since sequence learning is seen as a model task of skill acquisition, limits on the ability to transfer knowledge from the training context to a performance context indicates important constraints on skill learning approaches. Lack of transfer across contexts has been demonstrated by showing that when task elements are changed following training, this leads to a disruption in performance. These results have typically been taken as suggesting that the sequence knowledge relies on integrated representations across task elements (Abrahamse, Jiménez, Verwey, & Clegg, Psychon Bull Rev 17:603–623, 2010a). Using a relatively new sequence learning task, serial interception sequence learning, three experiments are reported that quantify this magnitude of performance disruption after selectively manipulating individual aspects of motor performance or perceptual information. In Experiment 1, selective disruption of the timing or order of sequential actions was examined using a novel response manipulandum that allowed for separate analysis of these two motor response components. In Experiments 2 and 3, transfer was examined after selective disruption of perceptual information that left the motor response sequence intact. All three experiments provided quantifiable estimates of partial transfer to novel contexts that suggest some level of information integration across task elements. However, the ability to identify quantifiable levels of successful transfer indicates that integration is not all-or-none and that measurement sensitivity is a key in understanding sequence knowledge representations. PMID:24668505

  1. Quantifying the Correctness, Computational Complexity, and Security of Privacy-Preserving String Comparators for Record Linkage

    PubMed Central

    Durham, Elizabeth; Xue, Yuan; Kantarcioglu, Murat; Malin, Bradley

    2011-01-01

    Record linkage is the task of identifying records from disparate data sources that refer to the same entity. It is an integral component of data processing in distributed settings, where the integration of information from multiple sources can prevent duplication and enrich overall data quality, thus enabling more detailed and correct analysis. Privacy-preserving record linkage (PPRL) is a variant of the task in which data owners wish to perform linkage without revealing identifiers associated with the records. This task is desirable in various domains, including healthcare, where it may not be possible to reveal patient identity due to confidentiality requirements, and in business, where it could be disadvantageous to divulge customers' identities. To perform PPRL, it is necessary to apply string comparators that function in the privacy-preserving space. A number of privacy-preserving string comparators (PPSCs) have been proposed, but little research has compared them in the context of a real record linkage application. This paper performs a principled and comprehensive evaluation of six PPSCs in terms of three key properties: 1) correctness of record linkage predictions, 2) computational complexity, and 3) security. We utilize a real publicly-available dataset, derived from the North Carolina voter registration database, to evaluate the tradeoffs between the aforementioned properties. Among our results, we find that PPSCs that partition, encode, and compare strings yield highly accurate record linkage results. However, as a tradeoff, we observe that such PPSCs are less secure than those that map and compare strings in a reduced dimensional space. PMID:22904698

  2. Quantifying the Correctness, Computational Complexity, and Security of Privacy-Preserving String Comparators for Record Linkage.

    PubMed

    Durham, Elizabeth; Xue, Yuan; Kantarcioglu, Murat; Malin, Bradley

    2012-10-01

    Record linkage is the task of identifying records from disparate data sources that refer to the same entity. It is an integral component of data processing in distributed settings, where the integration of information from multiple sources can prevent duplication and enrich overall data quality, thus enabling more detailed and correct analysis. Privacy-preserving record linkage (PPRL) is a variant of the task in which data owners wish to perform linkage without revealing identifiers associated with the records. This task is desirable in various domains, including healthcare, where it may not be possible to reveal patient identity due to confidentiality requirements, and in business, where it could be disadvantageous to divulge customers' identities. To perform PPRL, it is necessary to apply string comparators that function in the privacy-preserving space. A number of privacy-preserving string comparators (PPSCs) have been proposed, but little research has compared them in the context of a real record linkage application. This paper performs a principled and comprehensive evaluation of six PPSCs in terms of three key properties: 1) correctness of record linkage predictions, 2) computational complexity, and 3) security. We utilize a real publicly-available dataset, derived from the North Carolina voter registration database, to evaluate the tradeoffs between the aforementioned properties. Among our results, we find that PPSCs that partition, encode, and compare strings yield highly accurate record linkage results. However, as a tradeoff, we observe that such PPSCs are less secure than those that map and compare strings in a reduced dimensional space.
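    One family of comparators evaluated in work of this kind partitions strings into n-grams, encodes the n-grams into a hashed bit structure, and compares the encodings. The sketch below shows a Bloom-filter bigram comparator scored with a Dice coefficient; the hash scheme and parameters are assumptions, not the paper's exact construction.

    ```python
    # Privacy-preserving string comparison: bigrams -> Bloom filter -> Dice.
    import hashlib

    def bloom_encode(s, size=100, num_hashes=4):
        bigrams = [s[i:i + 2] for i in range(len(s) - 1)]
        bits = set()
        for g in bigrams:
            for k in range(num_hashes):
                digest = hashlib.sha256(f"{k}:{g}".encode()).hexdigest()
                bits.add(int(digest, 16) % size)  # set one filter position
        return bits

    def dice(a, b):
        return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

    # Similar names yield a high score without revealing the raw strings.
    print(dice(bloom_encode("kathryn"), bloom_encode("katherine")))
    ```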

  3. A Framework for a Supervisory Expert System for Robotic Manipulators with Joint-Position Limits and Joint-Rate Limits

    NASA Technical Reports Server (NTRS)

    Mutambara, Arthur G. O.; Litt, Jonathan

    1998-01-01

    This report addresses the problem of path planning and control of robotic manipulators which have joint-position limits and joint-rate limits. The manipulators move autonomously and carry out variable tasks in a dynamic, unstructured and cluttered environment. The issue considered is whether the robotic manipulator can achieve all its tasks, and if it cannot, the objective is to identify the closest achievable goal. This problem is formalized and systematically solved for generic manipulators by using inverse kinematics and forward kinematics. Inverse kinematics are employed to define the subspace, workspace and constrained workspace, which are then used to identify when a task is not achievable. The closest achievable goal is obtained by determining weights for an optimal control redistribution scheme. These weights are quantified by using forward kinematics. Conditions leading to joint-rate limits are identified; in particular, it is established that all generic manipulators have singularities at the boundary of their workspace, while some have loci of singularities inside their workspace. Once the manipulator singularity is identified, the command redistribution scheme is used to compute the closest achievable Cartesian velocities. Two examples are used to illustrate the use of the algorithm: a three-link planar manipulator and the Unimation Puma 560. Implementation of the derived algorithm is effected by using a supervisory expert system to check whether the desired goal lies in the constrained workspace and, if not, to invoke the redistribution scheme, which determines the constraint relaxation between end-effector position and orientation and then computes optimal gains.

  4. Quantifying tasks, ergonomic exposures and injury rates among school custodial workers.

    PubMed

    Village, J; Koehoorn, M; Hossain, S; Ostry, A

    2009-06-01

    A job exposure matrix of ergonomics risk factors was constructed for school custodial workers in one large school district in the province of British Columbia using 100 h of 1-min fixed-interval observations, participatory worker consensus on task durations and existing employment and school characteristic data. Significant differences in ergonomics risk factors were found by tasks and occupations. Cleaning and moving furniture, handling garbage, cleaning washrooms and cleaning floors were associated with the most physical risks and the exposure was often higher during the summer vs. the school year. Injury rates over a 4-year period showed the custodian injury rate was four times higher than the overall injury rate across all occupations in the school district. Injury rates were significantly higher in the school year compared with summer (12.2 vs. 7.0 per 100 full-time equivalents per year, p < 0.05). Custodial workers represent a considerable proportion of the labour force and have high injury rates, yet ergonomic studies are disproportionately few. Previous studies that quantified risk factors in custodial workers tended to focus on a few tasks or specific risk factors. This study, using participatory ergonomics and observational methods, systematically quantifies the broad range of musculoskeletal risk factors across multiple tasks performed by custodial workers in schools, adding considerably to the methodological literature.

  5. Statistical benchmark for BosonSampling

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas

    2016-03-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.

  6. A Distributed Computing Framework for Real-Time Detection of Stress and of Its Propagation in a Team.

    PubMed

    Pandey, Parul; Lee, Eun Kyung; Pompili, Dario

    2016-11-01

    Stress is one of the key factors that impact the quality of our daily life: from productivity and efficiency in production processes to the ability of (civilian and military) individuals to make rational decisions. Stress can also propagate from one individual to others working in close proximity or toward a common goal, e.g., in a military operation or workforce. Real-time assessment of the stress of individuals alone is, however, not sufficient, as understanding its source and the direction in which it propagates in a group of people is equally, if not more, important. A continuous, near real-time, in situ personal stress monitoring system to quantify the stress level of individuals and its direction of propagation in a team is envisioned. However, stress monitoring of an individual via his/her mobile device may not always be possible for extended periods of time due to the limited battery capacity of these devices. To overcome this challenge, a novel distributed mobile computing framework is proposed to organize the resources in the vicinity and form a mobile device cloud that enables offloading of computation tasks in the stress detection algorithm from resource-constrained devices (low residual battery, limited CPU cycles) to resource-rich devices. Our framework also supports computation parallelization and workflows that define how data and tasks are divided and assigned among the entities of the framework. The direction of propagation and magnitude of influence of stress in a group of individuals are studied by applying real-time, in situ analysis of Granger causality. Tangible benefits (in terms of energy expenditure and execution time) of the proposed framework in comparison to a centralized framework are presented via thorough simulations and real experiments.
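    The propagation analysis mentioned above rests on Granger causality: one stress series "Granger-causes" another if its past values improve prediction of the other's future. A minimal sketch with synthetic series follows; the column convention, lag choice, and data are illustrative assumptions.

    ```python
    # Does person A's stress series help predict person B's?
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    stress_a = rng.normal(size=200)
    # B's stress lags A's by one step, plus noise.
    stress_b = np.r_[0.0, 0.8 * stress_a[:-1]] + 0.2 * rng.normal(size=200)

    # Tests whether the second column Granger-causes the first.
    data = np.column_stack([stress_b, stress_a])
    results = grangercausalitytests(data, maxlag=2, verbose=False)
    print(results[1][0]["ssr_ftest"])  # (F statistic, p-value, df_denom, df_num)
    ```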

  7. Use of computer games as an intervention for stroke.

    PubMed

    Proffitt, Rachel M; Alankus, Gazihan; Kelleher, Caitlin L; Engsberg, Jack R

    2011-01-01

    Current rehabilitation for persons with hemiparesis after stroke requires high numbers of repetitions to be in accordance with contemporary motor learning principles. The motivational characteristics of computer games can be harnessed to create engaging interventions for persons with hemiparesis after stroke that incorporate this high number of repetitions. The purpose of this case report was to test the feasibility of using computer games as a 6-week home therapy intervention to improve upper extremity function for a person with stroke. One person with left upper extremity hemiparesis after stroke participated in a 6-week home therapy computer game intervention. The games were customized to her preferences and abilities and modified weekly. Her performance was tracked and analyzed. Data from pre-, mid-, and postintervention testing using standard upper extremity measures and the Reaching Performance Scale (RPS) were analyzed. After 3 weeks, the participant demonstrated increased upper extremity range of motion at the shoulder and decreased compensatory trunk movements during reaching tasks. After 6 weeks, she showed functional gains in activities of daily living (ADLs) and instrumental ADLs despite no further improvements on the RPS. Results indicate that computer games have the potential to be a useful intervention for people with stroke. Future work will add additional support to quantify the effectiveness of the games as a home therapy intervention for persons with stroke.

  8. Use of Terrestrial Laser Scanner for Rigid Airport Pavement Management

    PubMed Central

    Di Benedetto, Alessandro; Fiani, Margherita

    2017-01-01

    The evaluation of the structural efficiency of airport infrastructures is a complex task. Faulting is one of the most important indicators of rigid pavement performance. The aim of our study is to provide a new method for faulting detection and computation on jointed concrete pavements. Nowadays, the assessment of faulting is performed with laborious and time-consuming measurements that strongly hinder aircraft traffic. We propose a field procedure for Terrestrial Laser Scanner data acquisition and a computation flow chart to identify and quantify the fault size at each joint of apron slabs. The total point cloud was used to compute the least-squares plane fitting those points, and the best-fit plane for each slab was computed as well. The attitude of each slab plane with respect to both the adjacent ones and the apron reference plane was determined from the normal vectors to the surfaces. Faulting was evaluated as the difference in elevation between the slab planes along chosen sections. For a more accurate evaluation of the faulting value, we then considered a few strips of data covering rectangular areas of different sizes across the joints. The accuracy of the estimated quantities was computed as well. PMID:29278386
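    The geometric core of the procedure can be sketched compactly: fit a least-squares plane to each slab's points (here via SVD) and evaluate the elevation difference between adjacent slab planes at the joint. The synthetic point clouds and the single evaluation point are assumptions; the study evaluates faulting along chosen sections and data strips.

    ```python
    # Least-squares plane fit per slab, then faulting across the joint.
    import numpy as np

    def fit_plane(points):
        """Return (centroid, unit normal) of the best-fit plane."""
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)
        return centroid, vt[2]      # normal = direction of least variance

    def plane_z(centroid, normal, x, y):
        cx, cy, cz = centroid
        nx, ny, nz = normal
        return cz - (nx * (x - cx) + ny * (y - cy)) / nz

    rng = np.random.default_rng(1)
    slab_a = np.column_stack([rng.uniform(0, 5, 500), rng.uniform(0, 5, 500),
                              rng.normal(0.000, 0.002, 500)])
    slab_b = np.column_stack([rng.uniform(5, 10, 500), rng.uniform(0, 5, 500),
                              rng.normal(0.008, 0.002, 500)])   # ~8 mm higher

    ca, na = fit_plane(slab_a)
    cb, nb = fit_plane(slab_b)
    # Faulting at a point on the joint between the slabs (x = 5 m):
    print(plane_z(cb, nb, 5.0, 2.5) - plane_z(ca, na, 5.0, 2.5))  # ~0.008 m
    ```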

  9. Testing cognition in the wild: factors affecting performance and individual consistency in two measures of avian cognition.

    PubMed

    Shaw, Rachael C

    2017-01-01

    Developing cognitive tasks to reliably quantify individual differences in cognitive ability is critical to advance our understanding of the fitness consequences of cognition in the wild. Several factors may influence individual performance in a cognitive task, with some being unrelated to the cognitive ability that is the target of the test. It is therefore essential to assess how extraneous factors may affect task performance, particularly for those tasks that are frequently used to quantify individual differences in cognitive ability. The current study therefore measured the performance of wild North Island robins in two tasks commonly used to measure individual differences in avian cognition: a novel motor task and a detour reaching task. The robins' performance in the motor task was affected by prior experience; individuals that had previously participated in a similar task that required a different motor action pattern outperformed naïve subjects. By contrast, detour reaching performance was influenced by an individual's body condition, suggesting that energetic state may affect inhibitory control in robins. Designing tasks that limit the influence of past experience and developing means of standardising motivation across animals tested in the wild remain key challenges to improving current measurements of cognitive ability in birds. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Task allocation model for minimization of completion time in distributed computer systems

    NASA Astrophysics Data System (ADS)

    Wang, Jai-Ping; Steidley, Carl W.

    1993-08-01

    A task in a distributed computing system consists of a set of related modules. Each of the modules executes on one of the processors of the system and communicates with some other modules. In addition, precedence relationships may exist among the modules. Task allocation is an essential activity in distributed-software design, and it is of importance to all phases of the development of a distributed system. This paper establishes task completion-time models and task allocation models for minimizing task completion time. Current work in this area is either at the experimental level or does not consider precedence relationships among modules. The development of mathematical models for the computation of task completion time and task allocation will benefit many real-time computer applications such as radar systems, navigation systems, industrial process control systems, image processing systems, and artificial intelligence oriented systems.

  11. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    PubMed

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of scheduling multiple computer tasks in cross-layer cloud computing systems. Unfortunately, commonly employed frameworks fail to adapt to the new patterns of the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and of computer tasks. Then, we design the scheduling framework based on this analysis and present detailed models to illustrate the procedures for using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, algorithms are given based on the framework, and extensive experiments validate its effectiveness as well as its superiority.

  12. Structural constraints on pronoun binding and coreference: evidence from eye movements during reading

    PubMed Central

    Cunnings, Ian; Patterson, Clare; Felser, Claudia

    2015-01-01

    A number of recent studies have investigated how syntactic and non-syntactic constraints combine to cue memory retrieval during anaphora resolution. In this paper we investigate how syntactic constraints and gender congruence interact to guide memory retrieval during the resolution of subject pronouns. Subject pronouns are always technically ambiguous, and the application of syntactic constraints on their interpretation depends on properties of the antecedent that is to be retrieved. While pronouns can freely corefer with non-quantified referential antecedents, linking a pronoun to a quantified antecedent is only possible in certain syntactic configurations via variable binding. We report the results from a judgment task and three online reading comprehension experiments investigating pronoun resolution with quantified and non-quantified antecedents. Results from both the judgment task and participants' eye movements during reading indicate that comprehenders freely allow pronouns to corefer with non-quantified antecedents, but that retrieval of quantified antecedents is restricted to specific syntactic environments. We interpret our findings as indicating that syntactic constraints constitute highly weighted cues to memory retrieval during anaphora resolution. PMID:26157400

  13. Structural constraints on pronoun binding and coreference: evidence from eye movements during reading.

    PubMed

    Cunnings, Ian; Patterson, Clare; Felser, Claudia

    2015-01-01

    A number of recent studies have investigated how syntactic and non-syntactic constraints combine to cue memory retrieval during anaphora resolution. In this paper we investigate how syntactic constraints and gender congruence interact to guide memory retrieval during the resolution of subject pronouns. Subject pronouns are always technically ambiguous, and the application of syntactic constraints on their interpretation depends on properties of the antecedent that is to be retrieved. While pronouns can freely corefer with non-quantified referential antecedents, linking a pronoun to a quantified antecedent is only possible in certain syntactic configurations via variable binding. We report the results from a judgment task and three online reading comprehension experiments investigating pronoun resolution with quantified and non-quantified antecedents. Results from both the judgment task and participants' eye movements during reading indicate that comprehenders freely allow pronouns to corefer with non-quantified antecedents, but that retrieval of quantified antecedents is restricted to specific syntactic environments. We interpret our findings as indicating that syntactic constraints constitute highly weighted cues to memory retrieval during anaphora resolution.

  14. SHC Project 3.63, Task 2, Beneficial Use of Waste Materials

    EPA Science Inventory

    SHC Project 3.63, Task 2, “Beneficial Use of Waste Materials”, is designed to conduct research and analyses to characterize and quantify the risks and benefits of using or reusing waste materials. There are 6 primary research areas in Task 2 that cover a broad spectr...

  15. Temporal Sequences Quantify the Contributions of Individual Fixations in Complex Perceptual Matching Tasks

    ERIC Educational Resources Information Center

    Busey, Thomas; Yu, Chen; Wyatte, Dean; Vanderkolk, John

    2013-01-01

    Perceptual tasks such as object matching, mammogram interpretation, mental rotation, and satellite imagery change detection often require the assignment of correspondences to fuse information across views. We apply techniques developed for machine translation to the gaze data recorded from a complex perceptual matching task modeled after…

  16. Perceptual decision related activity in the lateral geniculate nucleus

    PubMed Central

    Jiang, Yaoguang; Yampolsky, Dmitry; Purushothaman, Gopathy

    2015-01-01

    Fundamental to neuroscience is the understanding of how the language of neurons relates to behavior. In the lateral geniculate nucleus (LGN), cells show distinct properties such as selectivity for particular wavelengths, increments or decrements in contrast, or preference for fine detail versus rapid motion. No studies, however, have measured how LGN cells respond when an animal is challenged to make a perceptual decision using information within the receptive fields of those LGN cells. In this study we measured neural activity in the macaque LGN during a two-alternative, forced-choice (2AFC) contrast detection task or during a passive fixation task and found that a small proportion (13.5%) of single LGN parvocellular (P) and magnocellular (M) neurons matched the psychophysical performance of the monkey. The majority of LGN neurons measured in both tasks were not as sensitive as the monkey. The covariation between neural response and behavior (quantified as choice probability) was significantly above chance during active detection, even when there was no external stimulus. Interneuronal correlations and task-related gain modulations were negligible under the same condition. A bottom-up pooling model that used sensory neural responses to compute perceptual choices in the absence of interneuronal correlations could fully explain these results at the level of the LGN, supporting the hypothesis that the perceptual decision pool consists of multiple sensory neurons and that response fluctuations in these neurons can influence perception. PMID:26019309
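    "Choice probability" as used above is the area under the ROC curve separating a neuron's firing-rate distributions conditioned on the animal's two choices, with 0.5 indicating chance. Below is a minimal sketch using the equivalent Mann-Whitney formulation; the spike counts are invented.

    ```python
    # Choice probability: ROC area between choice-conditioned rate distributions.
    import numpy as np

    def choice_probability(rates_choice1, rates_choice2):
        # P(random choice-1 rate > choice-2 rate), ties counted half.
        r1 = np.asarray(rates_choice1)[:, None]
        r2 = np.asarray(rates_choice2)[None, :]
        return (r1 > r2).mean() + 0.5 * (r1 == r2).mean()

    rng = np.random.default_rng(0)
    print(choice_probability(rng.poisson(12, 80), rng.poisson(10, 80)))  # > 0.5
    ```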

  17. Evaluation of Ground Vibrations Induced by Military Noise Sources

    DTIC Science & Technology

    2006-08-01

    Report tasks include: Task 2—Determine the acoustic-to-seismic coupling coefficients C1 and C2; Task 3—Computational modeling of acoustically induced ground motion, using a simple model of blast sound interaction with the ground.

  18. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    PubMed

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large tasks. Optimal scheduling of tasks in a cloud computing environment has been proven to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan, and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min, and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.

  19. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    PubMed Central

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting the computational needs of large tasks. Optimal scheduling of tasks in a cloud computing environment has been proven to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan, and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min, and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
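    To illustrate how these rule-based heuristics operate, the sketch below implements one of them, Min-min, over an assumed expected-time-to-compute (ETC) matrix and reports the resulting makespan. The data model is a common simplification, not necessarily the papers' exact experimental setup.

    ```python
    # Min-min: repeatedly schedule the task with the smallest achievable
    # completion time on its best machine.
    def min_min(etc):
        """etc[t][m] = expected time of task t on machine m."""
        n_tasks, n_machines = len(etc), len(etc[0])
        ready = [0.0] * n_machines              # machine-available times
        unscheduled = set(range(n_tasks))
        assignment = {}
        while unscheduled:
            best = {t: min((ready[m] + etc[t][m], m) for m in range(n_machines))
                    for t in unscheduled}       # each task's best option
            task, (finish, machine) = min(best.items(), key=lambda kv: kv[1][0])
            assignment[task] = machine
            ready[machine] = finish
            unscheduled.remove(task)
        return assignment, max(ready)           # schedule and makespan

    etc = [[3, 5], [2, 4], [6, 1], [4, 4]]      # 4 tasks on 2 machines
    print(min_min(etc))
    ```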

  20. Robust quantum network architectures and topologies for entanglement distribution

    NASA Astrophysics Data System (ADS)

    Das, Siddhartha; Khatri, Sumeet; Dowling, Jonathan P.

    2018-01-01

    Entanglement distribution is a prerequisite for several important quantum information processing and computing tasks, such as quantum teleportation, quantum key distribution, and distributed quantum computing. In this work, we focus on two-dimensional quantum networks based on optical quantum technologies using dual-rail photonic qubits for the building of a fail-safe quantum internet. We lay out a quantum network architecture for entanglement distribution between distant parties using a Bravais lattice topology, with the technological constraint that quantum repeaters equipped with quantum memories are not easily accessible. We provide a robust protocol for simultaneous entanglement distribution between two distant groups of parties on this network. We also discuss a memory-based quantum network architecture that can be implemented on networks with an arbitrary topology. We examine networks with bow-tie lattice and Archimedean lattice topologies and use percolation theory to quantify the robustness of the networks. In particular, we provide figures of merit on the loss parameter of the optical medium that depend only on the topology of the network and quantify the robustness of the network against intermittent photon loss and intermittent failure of nodes. These figures of merit can be used to compare the robustness of different network topologies in order to determine the best topology in a given real-world scenario, which is critical in the realization of the quantum internet.

  1. Passive Motion Paradigm: An Alternative to Optimal Control

    PubMed Central

    Mohan, Vishwanathan; Morasso, Pietro

    2011-01-01

    In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the “degrees of freedom (DoFs) problem,” the common core of production, observation, reasoning, and learning of “actions.” OCT, directly derived from engineering design techniques for control systems, quantifies task goals as “cost functions” and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative, “softer” approach, the passive motion paradigm (PMP), which we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that “animates” the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints “at runtime,” hence solving the “DoFs problem” without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not restricted to shaping motor output during action execution but also provides the self with information on the feasibility, consequence, understanding, and meaning of “potential actions.” In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, the mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures. PMID:22207846
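    A minimal sketch of the inversion-free mechanism at the heart of PMP: the goal induces an attractor force field on the end effector, and joint motion follows the Jacobian transpose of that field, so redundancy is resolved "at runtime" without explicit kinematic inversion or cost-function minimization. The 2-link planar arm, gains, and iteration count are assumptions.

    ```python
    # Passive motion paradigm, schematically: q_dot = J(q)^T * F(goal, q).
    import numpy as np

    L1 = L2 = 1.0
    def forward(q):                 # end-effector position of a 2-link arm
        return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                         L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

    def jacobian(q):
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    q = np.array([0.3, 0.5])
    goal = np.array([0.5, 1.2])
    for _ in range(2000):
        force = goal - forward(q)             # attractor field toward the goal
        q = q + 0.01 * jacobian(q).T @ force  # no inversion, no cost function
    print(forward(q), "~", goal)
    ```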

  2. A Time-Motion Study of ICU Workflow and the Impact of Strain.

    PubMed

    Hefter, Yosefa; Madahar, Purnema; Eisen, Lewis A; Gong, Michelle N

    2016-08-01

    Understanding ICU workflow and how it is impacted by ICU strain is necessary for implementing effective improvements. This study aimed to quantify how ICU physicians spend time and to examine the impact of ICU strain on workflow. Prospective, observational time-motion study. Five ICUs in two hospitals at an academic medical center. Thirty attending and resident physicians. None. In 137 hours of field observations, the most time, 84 hours (62% of total observation time), was spent on professional communication. Reviewing patient data and documentation occupied a combined 52 hours (38%), whereas direct patient care and education occupied 24 hours (17%) and 13 hours (9%), respectively. The most frequently used tool was the computer, used in tasks that occupied 51 hours (37%). Severity of illness of the ICU on day of observation was the only strain factor that significantly impacted work patterns. In a linear regression model, increase in average ICU Sequential Organ Failure Assessment was associated with more time spent on direct patient care (β = 4.3; 95% CI, 0.9-7.7) and education (β = 3.2; 95% CI, 0.7-5.8), and less time spent on documentation (β = -7.4; 95% CI, -11.6 to -3.2) and on tasks using the computer (β = -7.8; 95% CI, -14.1 to -1.6). These results were more pronounced with a combined strain score that took into account unit census and Sequential Organ Failure Assessment score. After accounting for ICU type (medical vs surgical) and staffing structure (resident staffed vs physician assistant staffed), results changed minimally. Clinicians spend the bulk of their time in the ICU on professional communication and tasks involving computers. With the strain of high severity of illness and a full unit, clinicians reallocate time from documentation to patient care and education. Further efforts are needed to examine system-related aspects of care to understand the impact of workflow and strain on patient care.

  3. The effect of bracing availability on one-hand isometric force exertion capability.

    PubMed

    Jones, Monica L H; Reed, Matthew P; Chaffin, Don B

    2013-01-01

    Environmental obstructions that workers encounter can kinematically limit the postures that they can achieve. However, such obstructions can also provide an opportunity for additional support by bracing with the hand, thigh or other body part. The reaction forces on bracing surfaces, which are in addition to those acting at the feet and task hand, are hypothesised to improve force exertion capability, and become required inputs to biomechanical analysis of tasks with bracing. The effects of kinematic constraints and associated bracing opportunities on isometric hand force were quantified in a laboratory study of 22 men and women. Analyses of one-hand maximal push, pull and lift tasks demonstrated that bracing surfaces available at the thighs and non-task hand enabled participants to exert an average of 43% more force at the task hand. Task hand force direction deviated significantly from the nominal direction for exertions performed with bracing at both medium and low task hand locations. This study quantifies the effect of bracing on kinematically constrained force exertions. Knowledge that appropriate bracing surfaces can substantially increase hand force is critical to the evaluation of task-oriented strength capability. Force estimates may also involve large off-axis components, which have clear implications for ergonomic analyses of manual tasks.

  4. Quantifying the Physiological Stress Response to Simulated Maritime Pilotage Tasks: The Influence of Task Complexity and Pilot Experience.

    PubMed

    Main, Luana C; Wolkow, Alexander; Chambers, Timothy P

    2017-11-01

    The aim of this study was to quantify the stress associated with performing maritime pilotage tasks in a high-fidelity simulator. Eight trainee and 13 maritime pilots completed two simulated pilotage tasks of varying complexity. Salivary cortisol samples were collected pre- and post-simulation for both trials. Heart rate was measured continuously throughout the study. Significant changes in salivary cortisol (P = 0.000, η² = 0.139), average heart rate (P = 0.006, η² = 0.087), and peak heart rate (P = 0.013, η² = 0.077) from pre- to post-simulation were found. Varying task complexity did partially influence the stress response; average (P = 0.016, η² = 0.026) and peak heart rate (P = 0.034, η² = 0.020) were higher in the experimental condition. Trainees also recorded higher average (P = 0.000, η² = 0.054) and peak heart rates (P = 0.027, η² = 0.022). Performing simulated pilotage tasks evoked a measurable stress response in both trainee and expert maritime pilots.

  5. The Time on Task Effect in Reading and Problem Solving Is Moderated by Task Difficulty and Skill: Insights from a Computer-Based Large-Scale Assessment

    ERIC Educational Resources Information Center

    Goldhammer, Frank; Naumann, Johannes; Stelter, Annette; Tóth, Krisztina; Rölke, Heiko; Klieme, Eckhard

    2014-01-01

    Computer-based assessment can provide new insights into behavioral processes of task completion that cannot be uncovered by paper-based instruments. Time presents a major characteristic of the task completion process. Psychologically, time on task has 2 different interpretations, suggesting opposing associations with task outcome: Spending more…

  6. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease

    PubMed Central

    Devleesschauwer, Brecht; Haagsma, Juanita A.; Angulo, Frederick J.; Bellinger, David C.; Cole, Dana; Döpfer, Dörte; Fazil, Aamir; Fèvre, Eric M.; Gibb, Herman J.; Hald, Tine; Kirk, Martyn D.; Lake, Robin J.; Maertens de Noordhout, Charline; Mathers, Colin D.; McDonald, Scott A.; Pires, Sara M.; Speybroeck, Niko; Thomas, M. Kate; Torgerson, Paul R.; Wu, Felicia; Havelaar, Arie H.; Praet, Nicolas

    2015-01-01

    Background The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. Methods and Findings The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. Conclusions We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. PMID:26633883
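    For orientation, the hazard- and incidence-based Disability-Adjusted Life Year computation referenced above sums years of life lost (YLL) and years lived with disability (YLD). A toy sketch with invented inputs follows.

    ```python
    # DALY = YLL + YLD for one hazard (incidence-based approach).
    def daly(deaths, life_expectancy_at_death, incident_cases,
             disability_weight, mean_duration_years):
        yll = deaths * life_expectancy_at_death          # years of life lost
        yld = incident_cases * disability_weight * mean_duration_years
        return yll + yld

    # Hypothetical foodborne hazard; all numbers are made up.
    print(daly(deaths=120, life_expectancy_at_death=35.0,
               incident_cases=50_000, disability_weight=0.093,
               mean_duration_years=0.022))   # ~4302 DALYs
    ```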

  7. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    ERIC Educational Resources Information Center

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  8. Time delays in flight simulator visual displays

    NASA Technical Reports Server (NTRS)

    Crane, D. F.

    1980-01-01

    It is pointed out that the effects of delays of less than 100 msec in visual displays on pilot dynamic response and system performance are of particular interest at this time because improvements in the latest computer-generated imagery (CGI) systems are expected to reduce CGI display delays to this range. Attention is given to data which quantify the effects of display delays in the range of 0-100 msec on system stability and performance, and on pilot dynamic response, for a particular choice of aircraft dynamics, display, controller, and task. The conventional control system design methods reviewed, the pilot response data presented, and the data for long delays all suggest lead-filter compensation of the display delay. Pilot-aircraft system crossover frequency information guides compensation filter specification.
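    To make the lead-filter suggestion concrete: a pure delay tau contributes a phase lag of omega*tau radians at frequency omega, and a first-order lead filter (1 + aTs)/(1 + Ts) can restore phase near the pilot-vehicle crossover frequency. The numbers below are illustrative assumptions, not values from the study.

    ```python
    # Phase lag of an 80 ms display delay vs. phase lead of a lead filter.
    import numpy as np

    delay = 0.080                      # display delay, seconds
    wc = 3.0                           # assumed crossover frequency, rad/s
    lag_deg = np.degrees(wc * delay)   # lag of e^(-s*delay) at wc

    a, T = 3.0, 0.1                    # lead filter (1 + a*T*s) / (1 + T*s)
    lead_deg = np.degrees(np.arctan(a * T * wc) - np.arctan(T * wc))
    print(f"delay lag {lag_deg:.1f} deg, lead filter adds {lead_deg:.1f} deg")
    ```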

  9. Q4 Titanium 6-4 Material Properties Development

    NASA Technical Reports Server (NTRS)

    Cooper, Kenneth; Nettles, Mindy

    2015-01-01

    This task involves development and characterization of selective laser melting (SLM) parameters for additive manufacturing of titanium-6%aluminum-4%vanadium (Ti-6Al-4V or Ti64). SLM is a relatively new manufacturing technology that fabricates complex metal components by fusing thin layers of powder with a high-powered laser beam, utilizing a 3D computer design to direct the energy and form the shape without traditional tools, dies, or molds. There are several metal SLM technologies and materials on the market today, and various efforts to quantify their mechanical properties; however, nothing consolidated or formal exists to date. Meanwhile, fatigue properties of SLM Ti64 material are currently highly sought after by NASA propulsion designers for rotating turbomachinery components.

  10. The Human Serum Metabolome

    PubMed Central

    Psychogios, Nikolaos; Hau, David D.; Peng, Jun; Guo, An Chi; Mandal, Rupasri; Bouatra, Souhaila; Sinelnikov, Igor; Krishnamurthy, Ramanarayan; Eisner, Roman; Gautam, Bijaya; Young, Nelson; Xia, Jianguo; Knox, Craig; Dong, Edison; Huang, Paul; Hollander, Zsuzsanna; Pedersen, Theresa L.; Smith, Steven R.; Bamforth, Fiona; Greiner, Russ; McManus, Bruce; Newman, John W.; Goodfriend, Theodore; Wishart, David S.

    2011-01-01

    Continuing improvements in analytical technology, along with increased interest in performing comprehensive, quantitative metabolic profiling, are leading to mounting pressure within the metabolomics community to develop centralized metabolite reference resources for certain clinically important biofluids, such as cerebrospinal fluid, urine and blood. As part of an ongoing effort to systematically characterize the human metabolome through the Human Metabolome Project, we have undertaken the task of characterizing the human serum metabolome. In doing so, we have combined targeted and non-targeted NMR, GC-MS and LC-MS methods with computer-aided literature mining to identify and quantify a comprehensive, if not absolutely complete, set of metabolites commonly detected and quantified (with today's technology) in the human serum metabolome. Our use of multiple metabolomics platforms and technologies allowed us to substantially enhance the level of metabolome coverage while critically assessing the relative strengths and weaknesses of these platforms or technologies. Tables containing the complete set of 4229 confirmed and highly probable human serum compounds, their concentrations, related literature references and links to their known disease associations are freely available at http://www.serummetabolome.ca. PMID:21359215

  11. A Computational Approach to Quantifiers as an Explanation for Some Language Impairments in Schizophrenia

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Styla, Rafal; Szymanik, Jakub

    2011-01-01

    We compared the processing of natural language quantifiers in a group of patients with schizophrenia and a healthy control group. In both groups, the difficulty of the quantifiers was consistent with computational predictions, and patients with schizophrenia took more time to solve the problems. However, they were significantly less accurate only…

  12. Method and system for benchmarking computers

    DOEpatents

    Gustafson, John L.

    1993-09-14

    A testing system and method for benchmarking computer systems. The system includes a store containing a scalable set of tasks to be performed to produce a solution in ever-increasing degrees of resolution as a larger number of the tasks are performed. A timing and control module allots to each computer a fixed benchmarking interval in which to perform the stored tasks. Means are provided for determining, after completion of the benchmarking interval, the degree of progress through the scalable set of tasks and for producing a benchmarking rating relating to the degree of progress for each computer.
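    A toy rendering of the patented scheme's logic: every machine gets the same fixed time budget, works through a task set whose solution refines as more tasks complete, and is rated by how far it progressed. The midpoint-rule pi workload below is an invented stand-in for the patent's scalable task store.

    ```python
    # Fixed-interval benchmark: rating = degree of progress through
    # ever-finer tasks (the last task may slightly overrun the budget).
    import time

    def benchmark(budget_seconds=1.0):
        deadline = time.perf_counter() + budget_seconds
        resolution, solution = 0, 0.0
        while time.perf_counter() < deadline:
            resolution += 1
            n = 2 ** resolution        # each task doubles the resolution
            h = 1.0 / n                # midpoint-rule estimate of pi
            solution = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2)
                           for i in range(n)) * h
        return resolution, solution    # benchmark rating and last solution

    rating, pi_estimate = benchmark()
    print(rating, pi_estimate)
    ```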

  13. Checkpointing for a hybrid computing node

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
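    A schematic sketch of the claimed flow in ordinary Python: create a checkpoint in accelerator-local memory, resume the task, and transfer the state to the host while execution continues. The class layout and threading model are illustrative assumptions, not the patent's implementation.

    ```python
    # Checkpoint locally, resume, transfer to host concurrently.
    import copy
    import threading

    class Accelerator:
        def __init__(self):
            self.state = {"iteration": 0, "partial_sum": 0.0}

        def step(self):                        # the task being executed
            self.state["iteration"] += 1
            self.state["partial_sum"] += 1.0 / self.state["iteration"]

        def checkpoint(self, host_store):
            # 1) Create checkpoint in accelerator-local memory.
            snapshot = copy.deepcopy(self.state)
            # 2) Execution resumes; 3) transfer runs while the task continues.
            t = threading.Thread(target=host_store.append, args=(snapshot,))
            t.start()
            return t

    host_checkpoints = []                      # main-processor storage
    acc = Accelerator()
    for i in range(10):
        acc.step()
        if i % 5 == 4:
            acc.checkpoint(host_checkpoints).join()
    print(host_checkpoints[-1])                # state usable for restart
    ```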

  14. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    NASA Astrophysics Data System (ADS)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-12-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text and figures without any additional tasks. Participants were 196 ninth-grade students who learned with a self-developed multimedia program in a pretest-posttest control group design. Research results reveal that gap-fill and matching tasks were most effective in promoting knowledge acquisition, followed by multiple-choice tasks, and no tasks at all. The findings are in line with previous research on this topic. The effects can possibly be explained by the generation-recognition model, which predicts that gap-fill and matching tasks trigger more encompassing learning processes than multiple-choice tasks. It is concluded that instructional designers should incorporate more challenging study tasks for enhancing the effectiveness of computer-based learning environments.

  15. Prediction of Human Cytochrome P450 Inhibition Using a Multitask Deep Autoencoder Neural Network.

    PubMed

    Li, Xiang; Xu, Youjun; Lai, Luhua; Pei, Jianfeng

    2018-05-30

    Adverse side effects of drug-drug interactions induced by human cytochrome P450 (CYP450) inhibition are an important consideration in drug discovery. It is highly desirable to develop computational models that can predict the inhibitive effect of a compound against a specific CYP450 isoform. In this study, we developed a multitask model for concurrent inhibition prediction of five major CYP450 isoforms, namely, 1A2, 2C9, 2C19, 2D6, and 3A4. The model was built by training a multitask autoencoder deep neural network (DNN) on a large dataset containing more than 13,000 compounds, extracted from the PubChem BioAssay Database. We demonstrate that the multitask model gave better prediction results than single-task models, previously reported classifiers, and traditional machine learning methods, averaged over the five prediction tasks. Our multitask DNN model gave average prediction accuracies of 86.4% for the 10-fold cross-validation and 88.7% for the external test datasets. In addition, we built linear regression models to quantify how the other tasks contributed to the prediction difference of a given task between single-task and multitask models, and we explained under what conditions the multitask model will outperform the single-task model, which suggests how to use multitask DNN models more effectively. We applied sensitivity analysis to extract useful knowledge about CYP450 inhibition, which may shed light on the structural features of these isoforms and give hints about how to avoid side effects during drug development. Our models are freely available at http://repharma.pku.edu.cn/deepcyp/home.php or http://www.pkumdl.cn/deepcyp/home.php .
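
    The shared-representation idea behind such a multitask model can be sketched with a shared encoder feeding one binary head per isoform. The PyTorch sketch below is a hedged illustration only: layer sizes, fingerprint inputs and the loss are assumptions, not the authors' autoencoder-based architecture.

    ```python
    # Minimal sketch of a multitask network with a shared encoder and one
    # binary head per CYP450 isoform; hypothetical layer sizes and inputs.
    import torch
    import torch.nn as nn

    ISOFORMS = ["1A2", "2C9", "2C19", "2D6", "3A4"]

    class MultitaskCYP(nn.Module):
        def __init__(self, n_features=1024, hidden=256):
            super().__init__()
            # Shared encoder learns a representation common to all tasks.
            self.encoder = nn.Sequential(
                nn.Linear(n_features, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
            )
            # One output head per isoform (inhibitor / non-inhibitor).
            self.heads = nn.ModuleDict(
                {iso: nn.Linear(hidden, 1) for iso in ISOFORMS}
            )

        def forward(self, x):
            z = self.encoder(x)
            return {iso: head(z).squeeze(-1) for iso, head in self.heads.items()}

    model = MultitaskCYP()
    x = torch.randn(8, 1024)           # e.g. 1024-bit molecular fingerprints
    logits = model(x)                  # dict: isoform -> batch of logits
    loss = sum(nn.functional.binary_cross_entropy_with_logits(
        logits[iso], torch.randint(0, 2, (8,)).float()) for iso in ISOFORMS)
    loss.backward()                    # all five tasks train the shared encoder
    ```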

  16. 29 CFR 778.313 - Computing overtime pay under the Act for employees compensated on task basis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Computing overtime pay under the Act for employees compensated on task basis. 778.313 Section 778.313 Labor Regulations Relating to Labor (Continued) WAGE AND... TO REGULATIONS OVERTIME COMPENSATION Special Problems "Task" Basis of Payment § 778.313 Computing...

  17. The comprehension and production of quantifiers in isiXhosa-speaking Grade 1 learners

    PubMed Central

    Southwood, Frenette

    2016-01-01

    Background Quantifiers form part of the discourse-internal linguistic devices that children need to access and produce narratives and other classroom discourse. Little is known about the development - especially the production - of quantifiers in child language, specifically in speakers of an African language. Objectives The study aimed to ascertain how well Grade 1 isiXhosa first language (L1) learners perform at the beginning and at the end of Grade 1 on quantifier comprehension and production tasks. Method Two low socioeconomic groups of L1 isiXhosa learners with either isiXhosa or English as language of learning and teaching (LOLT) were tested in February and November of their Grade 1 year with tasks targeting several quantifiers. Results The isiXhosa LOLT group comprehended no/none, any and all fully in either February or November of Grade 1, and they produced all assessed quantifiers in February of Grade 1. For the English LOLT group, neither the comprehension nor the production of quantifiers was mastered by the end of Grade 1, although there was a significant increase in both their comprehension and production scores. Conclusion The English LOLT group made significant progress in the comprehension and production of quantifiers, but still performed worse than peers who had their L1 as LOLT. Generally, children with no or very little prior knowledge of the LOLT need either (1) more deliberate exposure to quantifier-rich language or (2) longer exposure to general classroom language before quantifiers can be expected to be mastered sufficiently to allow access to quantifier-related curriculum content. PMID:27245132

  18. Reinforcement learning in computer vision

    NASA Astrophysics Data System (ADS)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving the corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is therefore not surprising that, when solving computer vision tasks, we should take into account the special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as the processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes reinforcement learning technology and its use for solving computer vision problems.

  19. Rail Inspection Systems Analysis and Technology Survey

    DOT National Transportation Integrated Search

    1977-09-01

    The study was undertaken to identify existing rail inspection system capabilities and methods which might be used to improve these capabilities. Task I was a study to quantify existing inspection parameters and Task II was a cost effectiveness study ...

  20. Pupillary transient responses to within-task cognitive load variation.

    PubMed

    Wong, Hoe Kin; Epps, Julien

    2016-12-01

    Changes in physiological signals due to task-evoked cognitive load have been reported extensively. However, pupil-size-based approaches for estimating cognitive load on a moment-to-moment basis are not as well understood as estimating cognitive load on a task-to-task basis, despite the appeal these approaches have for continuous load estimation. In particular, the pupillary transient response to instantaneous changes in induced load has not been experimentally quantified, and the within-task changes in pupil dilation have not been investigated in a manner that allows their consistency to be quantified with a view to biomedical system design. In this paper, a variation of the digit span task is developed which reliably induces rapid changes of cognitive load to generate task-evoked pupillary responses (TEPRs) associated with large, within-task load changes. Linear modelling and one-way ANOVA reveal that increasing the rate of cognitive loading, while keeping task demands constant, results in a steeper pupillary response. Instantaneous drops in cognitive load are shown to produce statistically significantly different transient pupillary responses relative to sustained load, and when characterised using an exponential decay response, the task-evoked pupillary response time constant is on the order of 1-5 s. Within-task test-retest analysis confirms the reliability of the moment-to-moment measurements. Based on these results, estimates of pupil diameter can be employed with considerably more confidence in moment-to-moment cognitive load estimation systems. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
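
    The exponential-decay characterization mentioned above is easy to reproduce on synthetic data: fit a decaying exponential to the post-drop pupil trace and read off the time constant. A minimal sketch, assuming a simple three-parameter model and synthetic noise (not the authors' data or fitting pipeline):

    ```python
    # Hedged sketch: fit an exponential-decay transient to a pupil trace
    # after a drop in load, recovering a time constant comparable to the
    # 1-5 s range reported; the data here are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def tepr_decay(t, amplitude, tau, baseline):
        return baseline + amplitude * np.exp(-t / tau)

    t = np.linspace(0, 10, 200)                       # seconds after load drop
    rng = np.random.default_rng(2)
    pupil = tepr_decay(t, 0.6, 2.5, 3.8) + rng.normal(0, 0.02, t.size)  # mm

    (amp, tau, base), _ = curve_fit(tepr_decay, t, pupil, p0=(0.5, 1.0, 3.5))
    print(f"estimated time constant: {tau:.2f} s")    # ~2.5 s for this demo
    ```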

  1. Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research

    NASA Technical Reports Server (NTRS)

    Arnegard, Ruth J.; Comstock, J. R., Jr.

    1991-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  2. The multi-attribute task battery for human operator workload and strategic behavior research

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Arnegard, Ruth J.

    1992-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  3. Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.

    PubMed

    Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge

    2015-01-01

    Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. REBA indicated a high risk for MSDs in every subtask. Although biomechanical analysis found an acceptable mass-center displacement during tasks, a hazardous level of compression on the lower back was detected while computers were being transported. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.

  4. Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track

    DTIC Science & Technology

    2015-11-20

    Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track. Paul N. Bennett, Microsoft Research, Redmond, USA. ... the anchor text graph has proven useful in the general realm of query reformulation [2]; we sought to quantify the value of extracting key phrases from ... anchor text in the broader setting of the task understanding track. Given a query, our approach considers a simple method for identifying a relevant ...

  5. Cortico-striatal language pathways dynamically adjust for syntactic complexity: A computational study.

    PubMed

    Szalisznyó, Krisztina; Silverstein, David; Teichmann, Marc; Duffau, Hugues; Smits, Anja

    2017-01-01

    A growing body of literature supports a key role of fronto-striatal circuits in language perception. It is now known that the striatum plays a role in engaging attentional resources and linguistic rule computation while also serving phonological short-term memory capabilities. The ventral semantic and the dorsal phonological stream dichotomy assumed for spoken language processing also seems to play a role in cortico-striatal perception. Based on recent studies that correlate deep Broca-striatal pathways with complex syntax performance, we used a previously developed computational model of frontal-striatal syntax circuits and hypothesized that different parallel language pathways may contribute to canonical and non-canonical sentence comprehension separately. We modified and further analyzed a thematic role assignment task and corresponding reservoir computing model of language circuits, as previously developed by Dominey and coworkers. We examined the model's performance under various parameter regimes, by influencing how fast the presented language input decays and altering the temporal dynamics of activated word representations. This enabled us to quantify canonical and non-canonical sentence comprehension abilities. The modeling results suggest that separate cortico-cortical and cortico-striatal circuits may be recruited differently for processing syntactically more difficult and less complicated sentences. Alternatively, a single circuit would need to dynamically and adaptively adjust to syntactic complexity. Copyright © 2016. Published by Elsevier Inc.

  6. An integrated system for rainfall induced shallow landslides modeling

    NASA Astrophysics Data System (ADS)

    Formetta, Giuseppe; Capparelli, Giovanna; Rigon, Riccardo; Versace, Pasquale

    2014-05-01

    Rainfall induced shallow landslides (RISL) cause significant damage involving loss of life and property. Predicting susceptible locations for RISL is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually, two main approaches are used to accomplish this task: statistical or physically based models. In this work an open source (OS), 3-D, fully distributed hydrological model was integrated into an OS modeling framework (Object Modeling System). The chain is closed by linking the system to a component for safety factor computation under the infinite slope approximation, able to take into account layered soils and the contribution of suction to hillslope stability. The model composition was tested on a case study in Calabria (Italy) in order to simulate the triggering of a landslide that occurred in the Province of Cosenza. The integration in OMS allows the use of other components, such as a GIS to manage input-output processes, and automatic calibration algorithms to estimate model parameters. Finally, model performance was quantified by comparing modelled and observed trigger times. This research is supported by the Ambito/Settore AMBIENTE E SICUREZZA (PON01_01503) project.

  7. The Effects of Aging and Dual Tasking on Human Gait Complexity During Treadmill Walking: A Comparative Study Using Quantized Dynamical Entropy and Sample Entropy.

    PubMed

    Ahmadi, Samira; Wu, Christine; Sepehri, Nariman; Kantikar, Anuprita; Nankar, Mayur; Szturm, Tony

    2018-01-01

    Quantized dynamical entropy (QDE) has recently been proposed as a new measure to quantify the complexity of dynamical systems with the purpose of offering better computational efficiency. This paper further investigates the viability of this method using five different human gait signals, recorded during normal walking and while performing secondary tasks in two age groups (young and older). The results are compared with the outcomes of the previously established sample entropy (SampEn) measure for the same signals. We also study how analyzing segmented, spatially and temporally normalized signals differs from analyzing the whole data. Our findings show that human gait signals become more complex as people age and while they are cognitively loaded. Center of pressure (COP) displacement in the mediolateral direction is the best signal for showing the gait changes. Moreover, the results suggest that by segmenting the data, more information about intrastride dynamical features is obtained. Most importantly, QDE is shown to be a reliable measure for human gait complexity analysis.
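
    For readers unfamiliar with the comparison measure, sample entropy counts how often patterns of length m that match within a tolerance r keep matching at length m+1. A minimal numpy sketch with common default parameters (m = 2, r = 0.2·SD), not necessarily those used by the authors:

    ```python
    # Minimal sample entropy (SampEn) sketch on a synthetic COP-like signal.
    import numpy as np

    def sample_entropy(x, m=2, r_factor=0.2):
        x = np.asarray(x, dtype=float)
        r = r_factor * x.std()
        def count_matches(m):
            # All length-m templates of the signal.
            templ = np.array([x[i:i + m] for i in range(len(x) - m)])
            # Chebyshev distance between every pair of templates.
            d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
            # Count matching pairs (exclude self-matches on the diagonal).
            return (np.sum(d <= r) - len(templ)) / 2
        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(1)
    cop = np.cumsum(rng.normal(size=500))   # synthetic COP displacement
    print(f"SampEn = {sample_entropy(cop):.3f}")
    ```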

  8. Motor cortical activity changes during neuroprosthetic-controlled object interaction.

    PubMed

    Downey, John E; Brane, Lucas; Gaunt, Robert A; Tyler-Kabara, Elizabeth C; Boninger, Michael L; Collinger, Jennifer L

    2017-12-05

    Brain-computer interface (BCI) controlled prosthetic arms are being developed to restore function to people with upper-limb paralysis. This work provides an opportunity to analyze human cortical activity during complex tasks. Previously we observed that BCI control became more difficult during interactions with objects, although we did not quantify the neural origins of this phenomenon. Here, we investigated how motor cortical activity changed in the presence of an object independently of the kinematics that were being generated using intracortical recordings from two people with tetraplegia. After identifying a population-wide increase in neural firing rates that corresponded with the hand being near an object, we developed an online scaling feature in the BCI system that operated without knowledge of the task. Online scaling increased the ability of two subjects to control the robotic arm when reaching to grasp and transport objects. This work suggests that neural representations of the environment, in this case the presence of an object, are strongly and consistently represented in motor cortex but can be accounted for to improve BCI performance.

  9. Healthy Aging Delays Scalp EEG Sensitivity to Noise in a Face Discrimination Task

    PubMed Central

    Rousselet, Guillaume A.; Gaspar, Carl M.; Pernet, Cyril R.; Husk, Jesse S.; Bennett, Patrick J.; Sekuler, Allison B.

    2010-01-01

    We used a single-trial ERP approach to quantify age-related changes in the time-course of noise sensitivity. A total of 62 healthy adults, aged between 19 and 98, performed a non-speeded discrimination task between two faces. Stimulus information was controlled by parametrically manipulating the phase spectrum of these faces. Behavioral 75% correct thresholds increased with age. This result may be explained by lower signal-to-noise ratios in older brains. ERPs from each subject were entered into a single-trial general linear regression model to identify variations in neural activity statistically associated with changes in image structure. The fit of the model, indexed by R2, was computed at multiple post-stimulus time points. The time-course of the R2 function showed significantly delayed noise sensitivity in older observers. This age effect is reliable, as demonstrated by test-retest in 24 subjects, and started about 120 ms after stimulus onset. Our analyses also suggest a qualitative change from a young to an older pattern of brain activity at around 47 ± 4 years old. PMID:21833194

  10. Health literacy and task environment influence parents' burden for data entry on child-specific health information: randomized controlled trial.

    PubMed

    Porter, Stephen C; Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia

    2011-01-26

    Health care systems increasingly rely on patients' data entry efforts to organize and assist in care delivery through health information exchange. We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents' health literacy on the task burden. We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F(1,178) = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents' TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden.

  11. The ability of non-computer tasks to increase biomechanical exposure variability in computer-intensive office work.

    PubMed

    Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Nogueira, Helen Cristina; Oliveira, Ana Beatriz

    2015-01-01

    Postures and muscle activity in the upper body were recorded from 50 academic office workers during 2 hours of normal work, categorised by observation into computer work (CW) and three non-computer (NC) tasks (NC seated work, NC standing/walking work and breaks). NC tasks differed significantly in exposures from CW, with standing/walking NC tasks representing the largest contrasts for most of the exposure variables. For the majority of workers, exposure variability was larger in their present job than in CW alone, as measured by the job variance ratio (JVR), i.e. the ratio between minute-to-minute variabilities in the job and in CW. Calculations of JVRs for simulated jobs containing different proportions of CW showed that variability could, indeed, be increased by redistributing available tasks, but that substantial increases could only be achieved by introducing more vigorous tasks into the job, in this case illustrated by cleaning.

  12. International standards for neurological classification of spinal cord injury: classification skills of clinicians versus computational algorithms.

    PubMed

    Schuld, C; Franz, S; van Hedel, H J A; Moosburger, J; Maier, D; Abel, R; van de Meent, H; Curt, A; Weidner, N; Rupp, R

    2015-04-01

    This is a retrospective analysis. The objective of this study was to describe and quantify the discrepancy in the classification of the International Standards for Neurological Classification of Spinal Cord Injury (ISNCSCI) by clinicians versus a validated computational algorithm. European Multicenter Study on Human Spinal Cord Injury (EMSCI). Fully documented ISNCSCI data sets from EMSCI's first years (2003-2005), classified by clinicians (mostly spinal cord medicine residents, who received in-house ISNCSCI training by senior SCI physicians), were computationally reclassified. Any differences in the scoring of sensory and motor levels, American Spinal Injury Association Impairment Scale (AIS) or the zone of partial preservation (ZPP) were quantified. Four hundred and twenty ISNCSCI data sets were evaluated. The lowest agreement was found in motor levels (right: 62.1%, P=0.002; left: 61.8%, P=0.003), followed by motor ZPP (right: 81.6%, P=0.74; left 80.0%, P=0.27) and then AIS (83.4%, P=0.001). Sensory levels and sensory ZPP showed the best concordance (right sensory level: 90.8%, P=0.66; left sensory level: 90.0%, P=0.30; right sensory ZPP: 91.0%, P=0.18; left sensory ZPP: 92.2%, P=0.03). AIS B was most often misinterpreted as AIS C and vice versa (AIS B as C: 29.4% and AIS C as B: 38.6%). The most difficult classification tasks were the correct determination of motor levels and the differentiation between AIS B and AIS C/D. These issues should be addressed in upcoming ISNCSCI revisions. Training is strongly recommended to improve classification skills for clinical practice, as well as for clinical investigators conducting spinal cord studies. This study is partially funded by the International Foundation for Research in Paraplegia, Zurich, Switzerland.

  13. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks.

    PubMed

    Devi, D Chitra; Uthariaraj, V Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence it has to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM's multiple cores. Also, the jobs arrive during the run time of the server in varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make the cloud computing more efficient and thus it improves the user satisfaction. Objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. Performance of the proposed algorithm is studied by comparing with the existing methods.

  14. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    PubMed Central

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence it has to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM's multiple cores. Also, the jobs arrive during the run time of the server in varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make the cloud computing more efficient and thus it improves the user satisfaction. Objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. Performance of the proposed algorithm is studied by comparing with the existing methods. PMID:26955656
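
    The baseline that the improved algorithm builds on, weighted round robin, can be sketched briefly: each VM is given a capability weight and receives tasks in proportion to it. The weights and task counts in the sketch below are invented for the demo; it illustrates the baseline, not the paper's improved algorithm.

    ```python
    # Minimal sketch of weighted round-robin task placement: VM weights
    # stand for capability, and each VM receives tasks in proportion to
    # its weight.
    from itertools import cycle

    vms = {"vm1": 4, "vm2": 2, "vm3": 1}          # capability weights

    def weighted_round_robin(weights):
        # Expand each VM into `weight` slots and cycle through them.
        slots = [vm for vm, w in weights.items() for _ in range(w)]
        return cycle(slots)

    placer = weighted_round_robin(vms)
    assignments = {vm: [] for vm in vms}
    for task_id in range(14):                     # 14 nonpreemptive tasks
        assignments[next(placer)].append(task_id)

    for vm, tasks in assignments.items():
        print(vm, tasks)                          # vm1 gets ~4/7 of the tasks
    ```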

  15. Women and Computers: Effects of Stereotype Threat on Attribution of Failure

    ERIC Educational Resources Information Center

    Koch, Sabine C.; Muller, Stephanie M.; Sieverding, Monika

    2008-01-01

    This study investigated whether stereotype threat can influence women's attributions of failure in a computer task. Male and female college-age students (n = 86, 16-21 years old) from Germany were asked to work on a computer task and were told beforehand that in this task, either (a) men usually perform better than women do (negative threat…

  16. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  17. Diagnosing Pre-Service Science Teachers' Understanding of Chemistry Concepts by Using Computer-Mediated Predict-Observe-Explain Tasks

    ERIC Educational Resources Information Center

    Sesen, Burcin Acar

    2013-01-01

    The purpose of this study was to investigate pre-service science teachers' understanding of surface tension, cohesion and adhesion forces by using computer-mediated predict-observe-explain tasks. 22 third-year pre-service science teachers participated in this study. Three computer-mediated predict-observe-explain tasks were developed and applied…

  18. Report of the Task Force on Computer Charging.

    ERIC Educational Resources Information Center

    Computer Co-ordination Group, Ottawa (Ontario).

    The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…

  19. An evaluation method of computer usability based on human-to-computer information transmission model.

    PubMed

    Ogawa, K

    1992-01-01

    This paper proposes a new evaluation and prediction method for computer usability. This method is based on our two previously proposed information transmission measures created from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device independent information measure (DI) and the computer independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined as DI/T and CI/T, where T is the task completion time: the device independent information transmission rate (RDI), and the computer independent information transmission rate (RCI). The method utilizes the RDI and RCI rates to evaluate the relative usability of software and device operations on different computer systems. Experiments using three different systems on a graphical information input task confirm that the method offers an efficient way of determining computer usability.
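
    A worked example of the two rates defined above: with DI and CI measured in bits and T the task completion time, the rates RDI = DI/T and RCI = CI/T can be compared across systems running the same task. The bit values in the sketch are invented for illustration.

    ```python
    # Minimal sketch of the two usability rates defined in the record:
    # RDI = DI/T and RCI = CI/T, where T is task completion time.
    DI = 120.0   # device-independent information transmitted (bits)
    CI = 90.0    # computer-independent information transmitted (bits)

    def rates(T):
        """Return (RDI, RCI) in bits/s for a task completed in T seconds."""
        return DI / T, CI / T

    # Comparing the same task on two systems: higher rates = better usability.
    for system, T in [("system_a", 40.0), ("system_b", 55.0)]:
        rdi, rci = rates(T)
        print(f"{system}: RDI={rdi:.2f} bit/s  RCI={rci:.2f} bit/s")
    ```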

  20. Parallel and Efficient Sensitivity Analysis of Microscopy Image Segmentation Workflows in Hybrid Systems

    PubMed Central

    Barreiros, Willian; Teodoro, George; Kurc, Tahsin; Kong, Jun; Melo, Alba C. M. A.; Saltz, Joel

    2017-01-01

    We investigate efficient sensitivity analysis (SA) of algorithms that segment and classify image features in a large dataset of high-resolution images. Algorithm SA is the process of evaluating variations of methods and parameter values to quantify differences in the output. An SA can be very compute-demanding because it requires re-processing the input dataset several times with different parameters to assess variations in output. In this work, we introduce strategies to efficiently speed up SA via runtime optimizations targeting distributed hybrid systems and reuse of computations from runs with different parameters. We evaluate our approach using a cancer image analysis workflow on a hybrid cluster with 256 nodes, each with an Intel Phi and a dual socket CPU. The SA attained a parallel efficiency of over 90% on 256 nodes. The cooperative execution using the CPUs and the Phi available in each node with smart task assignment strategies resulted in an additional speedup of about 2×. Finally, multi-level computation reuse led to an additional speedup of up to 2.46× on the parallel version. The level of performance attained with the proposed optimizations will allow the use of SA in large-scale studies. PMID:29081725

  1. Report of the Nuclear Propulsion Mission Analysis, Figures of Merit Subpanel: Quantifiable figures of merit for nuclear thermal propulsion

    NASA Technical Reports Server (NTRS)

    Haynes, Davy A.

    1991-01-01

    The results of an inquiry by the Nuclear Propulsion Mission Analysis, Figures of Merit subpanel are given. The subpanel was tasked to consider the question of what are the appropriate and quantifiable parameters to be used in the definition of an overall figure of merit (FoM) for Mars transportation system (MTS) nuclear thermal rocket engines (NTR). Such a characterization is needed to resolve the NTR engine design trades by a logical and orderly means, and to provide a meaningful method for comparison of the various NTR engine concepts. The subpanel was specifically tasked to identify the quantifiable engine parameters which would be the most significant engine factors affecting an overall FoM for a MTS, and was not tasked with determining 'acceptable' or 'recommended' values for the identified parameters. In addition, the subpanel was asked not to define an overall FoM for a MTS. Thus, the selection of a specific approach, applicable weighting factors, and any interrelationships for establishing an overall numerical FoM were considered beyond the scope of the subpanel inquiry.

  2. Age-Related Differences in Listening Effort During Degraded Speech Recognition.

    PubMed

    Ward, Kristina M; Shen, Jing; Souza, Pamela E; Grieco-Calub, Tina M

    The purpose of the present study was to quantify age-related differences in executive control as it relates to dual-task performance, which is thought to represent listening effort, during degraded speech recognition. Twenty-five younger adults (YA; 18-24 years) and 21 older adults (OA; 56-82 years) completed a dual-task paradigm that consisted of a primary speech recognition task and a secondary visual monitoring task. Sentence material in the primary task was either unprocessed or spectrally degraded into 8, 6, or 4 spectral channels using noise-band vocoding. Performance on the visual monitoring task was assessed by the accuracy and reaction time of participants' responses. Performance on the primary and secondary task was quantified in isolation (i.e., single task) and during the dual-task paradigm. Participants also completed a standardized psychometric measure of executive control, including attention and inhibition. Statistical analyses were implemented to evaluate changes in listeners' performance on the primary and secondary tasks (1) per condition (unprocessed vs. vocoded conditions); (2) per task (single task vs. dual task); and (3) per group (YA vs. OA). Speech recognition declined with increasing spectral degradation for both YA and OA when they performed the task in isolation or concurrently with the visual monitoring task. OA were slower and less accurate than YA on the visual monitoring task when performed in isolation, which paralleled age-related differences in standardized scores of executive control. When compared with single-task performance, OA experienced greater declines in secondary-task accuracy, but not reaction time, than YA. Furthermore, results revealed that age-related differences in executive control significantly contributed to age-related differences on the visual monitoring task during the dual-task paradigm. OA experienced significantly greater declines in secondary-task accuracy during degraded speech recognition than YA. These findings are interpreted as suggesting that OA expended greater listening effort than YA, which may be partially attributed to age-related differences in executive control.

  3. Job Management and Task Bundling

    NASA Astrophysics Data System (ADS)

    Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André

    2018-03-01

    High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
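
    The bundling idea can be sketched as a greedy packer: sort the small tasks and backfill them into a machine-sized allocation until no more fit. The sketch below is only an illustration of the concept with invented job sizes, not the METAQ/mpi_jm algorithm.

    ```python
    # Hedged sketch of task bundling: pack many small, independent tasks
    # into one machine-sized job, backfilling idle nodes greedily.
    tasks = [("measurement_%d" % i, nodes) for i, nodes in
             enumerate([32, 16, 64, 8, 8, 128, 16, 32])]
    partition_nodes = 256

    bundle, free = [], partition_nodes
    for name, need in sorted(tasks, key=lambda t: -t[1]):
        if need <= free:                  # backfill whatever still fits
            bundle.append(name)
            free -= need

    print("bundled:", bundle, "| idle nodes:", free)
    ```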

  4. Behavioral and functional strategies during tool use tasks in bonobos.

    PubMed

    Bardo, Ameline; Borel, Antony; Meunier, Hélène; Guéry, Jean-Pascal; Pouydebat, Emmanuelle

    2016-09-01

    Different primate species have developed extensive capacities for grasping and manipulating objects. However, the manual abilities of primates remain poorly known from a dynamic point of view. The aim of the present study was to quantify the functional and behavioral strategies used by captive bonobos (Pan paniscus) during tool use tasks. The study was conducted on eight captive bonobos which we observed during two tool use tasks: food extraction from a large piece of wood and food recovery from a maze. We focused on grasping postures, in-hand movements, the sequences of grasp postures used that have not been studied in bonobos, and the kind of tools selected. Bonobos used a great variety of grasping postures during both tool use tasks. They were capable of in-hand movement, demonstrated complex sequences of contacts, and showed more dynamic manipulation during the maze task than during the extraction task. They arrived on the location of the task with the tool already modified and used different kinds of tools according to the task. We also observed individual manual strategies. Bonobos were thus able to develop in-hand movements similar to humans and chimpanzees, demonstrated dynamic manipulation, and they responded to task constraints by selecting and modifying tools appropriately, usually before they started the tasks. These results show the necessity to quantify object manipulation in different species to better understand their real manual specificities, which is essential to reconstruct the evolution of primate manual abilities. © 2016 Wiley Periodicals, Inc.

  5. Selective Inhibition and Naming Performance in Semantic Blocking, Picture-Word Interference, and Color-Word Stroop Tasks

    ERIC Educational Resources Information Center

    Shao, Zeshu; Roelofs, Ardi; Martin, Randi C.; Meyer, Antje S.

    2015-01-01

    In 2 studies, we examined whether explicit distractors are necessary and sufficient to evoke selective inhibition in 3 naming tasks: the semantic blocking, picture-word interference, and color-word Stroop task. Delta plots were used to quantify the size of the interference effects as a function of reaction time (RT). Selective inhibition was…
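
    Delta plots as used here can be sketched directly: bin each condition's reaction times into quantiles and report the interference effect per bin against the bin's mean RT. A minimal numpy sketch on synthetic RTs; the quintile binning follows common practice and is an assumption, not the authors' exact procedure.

    ```python
    # Hedged sketch of a delta plot: interference effect (incongruent -
    # congruent) per RT quantile bin; the RT samples here are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    congruent = rng.lognormal(mean=6.3, sigma=0.2, size=400)      # RTs in ms
    incongruent = rng.lognormal(mean=6.4, sigma=0.25, size=400)

    edges_c = np.quantile(congruent, np.linspace(0, 1, 6))        # quintiles
    edges_i = np.quantile(incongruent, np.linspace(0, 1, 6))

    for b in range(5):
        mean_c = congruent[(congruent >= edges_c[b]) & (congruent <= edges_c[b + 1])].mean()
        mean_i = incongruent[(incongruent >= edges_i[b]) & (incongruent <= edges_i[b + 1])].mean()
        # Effect size per bin vs. the overall speed of that bin.
        print(f"bin {b + 1}: RT={0.5 * (mean_c + mean_i):7.1f} ms  "
              f"effect={mean_i - mean_c:6.1f} ms")
    ```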

  6. Does the medium matter? The interaction of task type and technology on group performance and member reactions.

    PubMed

    Straus, S G; McGrath, J E

    1994-02-01

    The authors investigated the hypothesis that as group tasks pose greater requirements for member interdependence, communication media that transmit more social context cues will foster group performance and satisfaction. Seventy-two 3-person groups of undergraduate students worked in either computer-mediated or face-to-face meetings on 3 tasks with increasing levels of interdependence: an idea-generation task, an intellective task, and a judgment task. Results showed few differences between computer-mediated and face-to-face groups in the quality of the work completed but large differences in productivity favoring face-to-face groups. Analysis of productivity and of members' reactions supported the predicted interaction of tasks and media, with greater discrepancies between media conditions for tasks requiring higher levels of coordination. Results are discussed in terms of the implications of using computer-mediated communications systems for group work.

  7. The effect of psychosocial stress on muscle activity during computer work: Comparative study between desktop computer and mobile computing products.

    PubMed

    Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan

    2016-06-27

    The popularity of mobile computing products is well known. Thus, it is crucial to evaluate their contribution to musculoskeletal disorders during computer usage in both comfortable and stressful environments. This study explores the effect of using different computer products on muscle activity, with different tasks used to induce psychosocial stress. Fourteen male subjects performed computer tasks: sixteen combinations of four different computer products with four different tasks used to induce stress. Electromyography for four muscles in the forearm, shoulder and neck regions and task performance were recorded. The increment of trapezius muscle activity depended on the task used to induce the stress, with a higher level of stress producing a greater increment. However, this relationship was not found in the other three muscles. Besides that, compared to desktop and laptop use, the lowest activity for all muscles was obtained during the use of a tablet or smart phone. The best net performance was obtained in a comfortable environment. However, during stressful conditions, the best performance can be obtained using the device that a user is most comfortable with or has the most experience with. Different computer products and different levels of stress play a big role in muscle activity during computer work. Both of these factors must be taken into account in order to reduce the occurrence of musculoskeletal disorders or problems.

  8. Stability of hand force production. I. Hand level control variables and multifinger synergies.

    PubMed

    Reschechtko, Sasha; Latash, Mark L

    2017-12-01

    We combined the theory of neural control of movement with referent coordinates and the uncontrolled manifold hypothesis to explore synergies stabilizing the hand action in accurate four-finger pressing tasks. In particular, we tested a hypothesis on two classes of synergies, those among the four fingers and those within a pair of control variables, stabilizing hand action under visual feedback and disappearing without visual feedback. Subjects performed four-finger total force and moment production tasks under visual feedback; the feedback was later partially or completely removed. The "inverse piano" device was used to lift and lower the fingers smoothly at the beginning and at the end of each trial. These data were used to compute pairs of hypothetical control variables. Intertrial analysis of variance within the finger force space was used to quantify multifinger synergies stabilizing both force and moment. A data permutation method was used to quantify synergies among control variables. Under visual feedback, synergies in the spaces of finger forces and hypothetical control variables were found to stabilize total force. Without visual feedback, the subjects showed a force drift to lower magnitudes and a moment drift toward pronation. This was accompanied by disappearance of the four-finger synergies and strong attenuation of the control variable synergies. The indexes of the two types of synergies correlated with each other. The findings are interpreted within the scheme with multiple levels of abundant variables. NEW & NOTEWORTHY We extended the idea of hierarchical control with referent spatial coordinates for the effectors and explored two types of synergies stabilizing multifinger force production tasks. We observed synergies among finger forces and synergies between hypothetical control variables that stabilized performance under visual feedback but failed to stabilize it after visual feedback had been removed. Indexes of two types of synergies correlated with each other. The data suggest the existence of multiple mechanisms stabilizing motor actions. Copyright © 2017 the American Physiological Society.
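
    The synergy analysis described above rests on the uncontrolled manifold (UCM) decomposition: for total force, inter-trial variance is split between directions that leave the total unchanged (the null space of the Jacobian [1, 1, 1, 1]) and the single direction that changes it. A minimal numpy sketch on synthetic finger forces; the synergy index follows common UCM practice and is an assumption here, not the authors' exact computation.

    ```python
    # Minimal sketch of a UCM variance decomposition for a four-finger
    # total-force task; demo data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical inter-trial finger forces (trials x 4 fingers), in newtons.
    forces = rng.normal(loc=[4.0, 5.0, 3.5, 2.5], scale=0.5, size=(40, 4))
    # Make the demo data covary so that total force is partly stabilized.
    forces -= forces.mean(axis=1, keepdims=True) * 0.8

    J = np.ones((1, 4))              # Jacobian of total force w.r.t. fingers
    _, _, Vt = np.linalg.svd(J)
    ucm_basis = Vt[1:].T             # 3 directions leaving total force unchanged
    ort_basis = Vt[:1].T             # 1 direction changing total force

    dev = forces - forces.mean(axis=0)          # inter-trial deviations
    v_ucm = (dev @ ucm_basis).var(axis=0, ddof=1).sum() / ucm_basis.shape[1]
    v_ort = (dev @ ort_basis).var(axis=0, ddof=1).sum() / ort_basis.shape[1]
    v_tot = dev.var(axis=0, ddof=1).sum() / 4

    # Positive index => variance is channeled into directions that do not
    # affect total force, i.e. a force-stabilizing synergy.
    delta_v = (v_ucm - v_ort) / v_tot
    print(f"V_UCM={v_ucm:.3f}  V_ORT={v_ort:.3f}  dV={delta_v:.2f}")
    ```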

  9. Concussion is associated with altered preparatory postural adjustments during gait initiation.

    PubMed

    Doherty, Cailbhe; Zhao, Liang; Ryan, John; Komaba, Yusuke; Inomata, Akihiro; Caulfield, Brian

    2017-04-01

    Gait initiation is a useful surrogate measure of supraspinal motor control mechanisms but has never been evaluated in a cohort following concussion. The aim of this study was to quantify the preparatory postural adjustments (PPAs) of gait initiation (GI) in fifteen concussion patients (4 females, 11 males) in comparison to a group of fifteen age- and sex-matched controls. All participants completed variants of the GI task in which their dominant and non-dominant limbs each served as the 'stepping' and 'support' limbs. Task performance was quantified using the centre of pressure (COP) trajectory of each foot (computed from a force plate) and a surrogate of the centre of mass (COM) trajectory (estimated from an inertial measurement unit placed on the sacrum). Concussed patients exhibited decreased COP excursion on their dominant foot, both when it was the stepping limb (sagittal plane: 9.71mm [95% CI: 8.14-11.27mm] vs 14.9mm [95% CI: 12.31-17.49mm]; frontal plane: 36.95mm [95% CI: 30.87-43.03mm] vs 54.24mm [95% CI: 46.99-61.50mm]) and when it was the support limb (sagittal plane: 10.43mm [95% CI: 8.73-12.13mm] vs 18.13mm [95% CI: 14.92-21.35mm]; frontal plane: 66.51mm [95% CI: 60.45-72.57mm] vs 88.43mm [95% CI: 78.53-98.32mm]). This was reflected in the trajectory of the COM, wherein concussion patients exhibited lower posterior displacement (19.67mm [95% CI: 19.65mm-19.7mm]) compared with controls (23.62mm [95% CI: 23.6-23.64]). On this basis, we conclude that individuals with concussion display deficits during a GI task which are potentially indicative of supraspinal impairments in motor control. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Shoulder Strength Requirements for Upper Limb Functional Tasks: Do Age and Rotator Cuff Tear Status Matter?

    PubMed

    Santago, Anthony C; Vidt, Meghan E; Li, Xiaotong; Tuohy, Christopher J; Poehling, Gary G; Freehill, Michael T; Saul, Katherine R

    2017-12-01

    Understanding upper limb strength requirements for daily tasks is imperative for early detection of strength loss that may progress to disability due to age or rotator cuff tear. We quantified shoulder strength requirements for 5 upper limb tasks performed by 3 groups: uninjured young adults and older adults, and older adults with a degenerative supraspinatus tear prior to repair. Musculoskeletal models were developed for each group representing age, sex, and tear-related strength losses. Percentage of available strength used was quantified for the subset of tasks requiring the largest amount of shoulder strength. Significant differences in strength requirements existed across tasks: upward reach 105° required the largest average strength; axilla wash required the largest peak strength. However, there were limited differences across participant groups. Older adults with and without a tear used a larger percentage of their shoulder elevation (p < .001, p < .001) and external rotation (p < .001, p = .017) strength than the young adults, respectively. Presence of a tear significantly increased percentage of internal rotation strength compared to young (p < .001) and uninjured older adults (p = .008). Marked differences in strength demand across tasks indicate the need for evaluating a diversity of functional tasks to effectively detect early strength loss, which may lead to disability.

  11. Categories of Computer Use and Their Relationships with Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Mitra, Ananda

    1998-01-01

    Analysis of attitude and use questionnaires completed by undergraduates (n = 1,444) at Wake Forest University determined that computers were used most frequently for word processing. Other uses were e-mail for task and non-task activities and mathematical and statistical computation. Results suggest that the level of computer use was related to…

  12. Exploring methodological frameworks for a mental task-based near-infrared spectroscopy brain-computer interface.

    PubMed

    Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom

    2015-10-30

    Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Quantifying Postural Control during Exergaming Using Multivariate Whole-Body Movement Data: A Self-Organizing Maps Approach

    PubMed Central

    van Diest, Mike; Stegenga, Jan; Wörtche, Heinrich J.; Roerdink, Jos B. T. M; Verkerke, Gijsbertus J.; Lamoth, Claudine J. C.

    2015-01-01

    Background Exergames are becoming an increasingly popular tool for training balance ability, thereby preventing falls in older adults. Automatic, real time, assessment of the user’s balance control offers opportunities in terms of providing targeted feedback and dynamically adjusting the gameplay to the individual user, yet algorithms for quantification of balance control remain to be developed. The aim of the present study was to identify movement patterns, and variability therein, of young and older adults playing a custom-made weight-shifting (ice-skating) exergame. Methods Twenty older adults and twenty young adults played a weight-shifting exergame under five conditions of varying complexity, while multi-segmental whole-body movement data were captured using Kinect. Movement coordination patterns expressed during gameplay were identified using Self Organizing Maps (SOM), an artificial neural network, and variability in these patterns was quantified by computing Total Trajectory Variability (TTvar). Additionally a k Nearest Neighbor (kNN) classifier was trained to discriminate between young and older adults based on the SOM features. Results Results showed that TTvar was significantly higher in older adults than in young adults, when playing the exergame under complex task conditions. The kNN classifier showed a classification accuracy of 65.8%. Conclusions Older adults display more variable sway behavior than young adults, when playing the exergame under complex task conditions. The SOM features characterizing movement patterns expressed during exergaming allow for discriminating between young and older adults with limited accuracy. Our findings contribute to the development of algorithms for quantification of balance ability during home-based exergaming for balance training. PMID:26230655
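
    The SOM-plus-variability pipeline can be sketched compactly: train a small map on movement frames, convert each repetition into a trajectory of winning nodes, and quantify dispersion across repetitions. The sketch below hand-rolls a minimal SOM on synthetic data; grid size, learning schedule and the variability index are assumptions, not the authors' exact TTvar computation.

    ```python
    # Hand-rolled minimal self-organizing map (SOM) sketch: map movement
    # frames onto a small grid, then quantify variability of the
    # winning-node trajectory across repetitions.
    import numpy as np

    rng = np.random.default_rng(4)
    frames = rng.normal(size=(2000, 15))          # synthetic Kinect-like features

    grid, dim = (8, 8), frames.shape[1]
    weights = rng.normal(size=(*grid, dim))
    coords = np.stack(np.meshgrid(np.arange(8), np.arange(8), indexing="ij"), -1)

    for t, x in enumerate(frames):
        lr = 0.5 * (1 - t / len(frames))           # decaying learning rate
        sigma = 3.0 * (1 - t / len(frames)) + 0.5  # decaying neighborhood radius
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), grid)
        h = np.exp(-((coords - bmu) ** 2).sum(-1) / (2 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)

    def bmu_path(data):
        # Trajectory of best-matching units for a sequence of frames.
        return np.array([np.unravel_index(
            np.argmin(((weights - x) ** 2).sum(-1)), grid) for x in data])

    # TTvar-like index: dispersion of node trajectories across repetitions.
    reps = [bmu_path(frames[i:i + 100]) for i in range(0, 500, 100)]
    ttvar = np.mean([np.std(np.stack(reps)[:, t], axis=0).sum() for t in range(100)])
    print(f"trajectory variability index: {ttvar:.2f}")
    ```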

  14. Robotic assessment of neuromuscular characteristics using musculoskeletal models: A pilot study.

    PubMed

    Jayaneththi, V R; Viloria, J; Wiedemann, L G; Jarrett, C; McDaid, A J

    2017-07-01

    Non-invasive neuromuscular characterization aims to provide greater insight into the effectiveness of existing and emerging rehabilitation therapies by quantifying neuromuscular characteristics relating to force production, muscle viscoelasticity and voluntary neural activation. In this paper, we propose a novel approach to evaluate neuromuscular characteristics, such as muscle fiber stiffness and viscosity, by combining robotic and HD-sEMG measurements with computational musculoskeletal modeling. This pilot study investigates the efficacy of this approach on a healthy population and provides new insight on potential limitations of conventional musculoskeletal models for this application. Subject-specific neuromuscular characteristics of the biceps and triceps brachii were evaluated using robot-measured kinetics, kinematics and EMG activity as inputs to a musculoskeletal model. Repeatability experiments in five participants revealed large variability within each subject's evaluated characteristics, with almost all experiencing variation greater than 50% of full scale when repeating the same task. The use of robotics and HD-sEMG, in conjunction with musculoskeletal modeling, to quantify neuromuscular characteristics has been explored. Despite the ability to predict joint kinematics with relatively high accuracy, parameter characterization was inconsistent, i.e. many parameter combinations gave rise to minimal kinematic error. The proposed technique is a novel approach for in vivo neuromuscular characterization and is a step towards the realization of objective in-home robot-assisted rehabilitation. Importantly, the results have confirmed the technical (robot and HD-sEMG) feasibility while highlighting the need to develop new musculoskeletal models and optimization techniques capable of achieving consistent results across a range of dynamic tasks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Quantifying Postural Control during Exergaming Using Multivariate Whole-Body Movement Data: A Self-Organizing Maps Approach.

    PubMed

    van Diest, Mike; Stegenga, Jan; Wörtche, Heinrich J; Roerdink, Jos B T M; Verkerke, Gijsbertus J; Lamoth, Claudine J C

    2015-01-01

    Exergames are becoming an increasingly popular tool for training balance ability, thereby preventing falls in older adults. Automatic, real time, assessment of the user's balance control offers opportunities in terms of providing targeted feedback and dynamically adjusting the gameplay to the individual user, yet algorithms for quantification of balance control remain to be developed. The aim of the present study was to identify movement patterns, and variability therein, of young and older adults playing a custom-made weight-shifting (ice-skating) exergame. Twenty older adults and twenty young adults played a weight-shifting exergame under five conditions of varying complexity, while multi-segmental whole-body movement data were captured using Kinect. Movement coordination patterns expressed during gameplay were identified using Self Organizing Maps (SOM), an artificial neural network, and variability in these patterns was quantified by computing Total Trajectory Variability (TTvar). Additionally a k Nearest Neighbor (kNN) classifier was trained to discriminate between young and older adults based on the SOM features. Results showed that TTvar was significantly higher in older adults than in young adults, when playing the exergame under complex task conditions. The kNN classifier showed a classification accuracy of 65.8%. Older adults display more variable sway behavior than young adults, when playing the exergame under complex task conditions. The SOM features characterizing movement patterns expressed during exergaming allow for discriminating between young and older adults with limited accuracy. Our findings contribute to the development of algorithms for quantification of balance ability during home-based exergaming for balance training.

  16. Quantum correlations in a family of bipartite separable qubit states

    NASA Astrophysics Data System (ADS)

    Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun

    2017-03-01

    Quantum correlations (QCs) in some separable states have been proposed as a key resource for certain quantum communication tasks and quantum computational models without entanglement. In this paper, a family of nine-parameter separable states, obtained from arbitrary mixtures of two sets of bi-qubit product pure states, is considered. QCs in these separable states are studied analytically or numerically using four QC quantifiers, i.e., measurement-induced disturbance (Luo in Phys Rev A 77:022301, 2008), ameliorated MID (Girolami et al. in J Phys A Math Theor 44:352002, 2011), quantum dissonance (DN) (Modi et al. in Phys Rev Lett 104:080501, 2010), and new quantum dissonance (Rulli in Phys Rev A 84:042109, 2011), respectively. First, an inherent symmetry in the separable states concerned is revealed: any nine-parameter separable state considered in this paper can be transformed into a three-parameter kernel state via certain local unitary operations. Then, four different QC expressions are concretely derived with the four QC quantifiers. Furthermore, some comparative studies of the QCs are presented, discussed and analyzed, and some distinct features are exposed. We find that, in the framework of all four QC quantifiers, the more mixed the original two pure product states, the greater the QCs possessed by the separable states. Our results reveal some intrinsic features of QCs in separable systems in quantum information.

  17. Schedule Risk Assessment

    NASA Technical Reports Server (NTRS)

    Smith, Greg

    2003-01-01

    Schedule risk assessment determines the probability of finishing on or before a given point in time. Tasks in a schedule should reflect the "most likely" duration for each task. In reality, each task is different and has a varying probability of finishing within or beyond the specified duration. Schedule risk assessment attempts to quantify these probabilities by assigning values to each task. It bridges the gap between CPM scheduling and the project's need to know the likelihood of "when".
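
    The quantification this record gestures at is usually done with Monte Carlo simulation over per-task duration distributions. A minimal sketch, assuming three-point (triangular) estimates for tasks on the critical path; the numbers are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # (optimistic, most likely, pessimistic) durations in days, critical-path tasks
    tasks = [(4, 5, 9), (2, 3, 6), (7, 10, 16)]
    target = 21.0
    n_sims = 100_000

    # Sample each task's duration and sum along the path for every simulation
    totals = sum(rng.triangular(lo, mode, hi, n_sims) for lo, mode, hi in tasks)
    print(f"P(finish <= {target} days) = {np.mean(totals <= target):.3f}")
    ```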

  18. Understanding neuromotor strategy during functional upper extremity tasks using symbolic dynamics.

    PubMed

    Nathan, Dominic E; Guastello, Stephen J; Prost, Robert W; Jeutter, Dean C

    2012-01-01

    The ability to model and quantify brain activation patterns that pertain to natural neuromotor strategy of the upper extremities during functional task performance is critical to the development of therapeutic interventions such as neuroprosthetic devices. The mechanisms of information flow, activation sequence and patterns, and the interaction between anatomical regions of the brain that are specific to movement planning, intention and execution of voluntary upper extremity motor tasks were investigated here. This paper presents a novel method using symbolic dynamics (orbital decomposition) and nonlinear dynamic tools of entropy, self-organization and chaos to describe the underlying structure of activation shifts in regions of the brain that are involved with the cognitive aspects of functional upper extremity task performance. Several questions were addressed: (a) How is it possible to distinguish deterministic or causal patterns of activity in brain fMRI from those that are really random or non-contributory to the neuromotor control process? (b) Can the complexity of activation patterns over time be quantified? (c) What are the optimal ways of organizing fMRI data to preserve patterns of activation, activation levels, and extract meaningful temporal patterns as they evolve over time? Analysis was performed using data from a custom developed time resolved fMRI paradigm involving human subjects (N=18) who performed functional upper extremity motor tasks with varying time delays between the onset of intention and onset of actual movements. The results indicate that there is structure in the data that can be quantified through entropy and dimensional complexity metrics and statistical inference, and furthermore, orbital decomposition is sensitive in capturing the transition of states that correlate with the cognitive aspects of functional task performance.
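
    The entropy step underlying such symbolic-dynamics analyses can be illustrated compactly: coarse-grain a continuous signal into discrete symbols, then measure the regularity of the symbol sequence. This is a simplified sketch of that step only, not the orbital-decomposition procedure itself; the signal is simulated.

    ```python
    import numpy as np
    from collections import Counter

    def symbolize(signal, n_bins=4):
        """Map a continuous signal to integer symbols by amplitude quantile."""
        edges = np.quantile(signal, np.linspace(0, 1, n_bins + 1)[1:-1])
        return np.digitize(signal, edges)

    def shannon_entropy(symbols, word_len=2):
        """Entropy (bits) of overlapping symbol words of length word_len."""
        words = [tuple(symbols[i:i + word_len])
                 for i in range(len(symbols) - word_len + 1)]
        counts = np.array(list(Counter(words).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(1)
    bold = rng.normal(size=500)        # hypothetical fMRI ROI time course
    print(shannon_entropy(symbolize(bold)))
    ```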

  19. Decoding human mental states by whole-head EEG+fNIRS during category fluency task performance

    NASA Astrophysics Data System (ADS)

    Omurtag, Ahmet; Aghajani, Haleh; Onur Keles, Hasan

    2017-12-01

    Objective. Concurrent scalp electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS), which we refer to as EEG+fNIRS, promises greater accuracy than the individual modalities while remaining nearly as convenient as EEG. We sought to quantify the hybrid system’s ability to decode mental states and compare it with its unimodal components. Approach. We recorded from healthy volunteers taking the category fluency test and applied machine learning techniques to the data. Main results. EEG+fNIRS’s decoding accuracy was greater than that of its subsystems, partly due to the new type of neurovascular features made available by hybrid data. Significance. Availability of an accurate and practical decoding method has potential implications for medical diagnosis, brain-computer interface design, and neuroergonomics.
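
    A minimal sketch of the hybrid decoding comparison described above, assuming scikit-learn: per-trial EEG and fNIRS feature vectors are classified separately and then concatenated, and cross-validated accuracies are compared. Feature shapes, labels and the linear-SVM choice are illustrative assumptions, not the paper's pipeline.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_trials = 120
    eeg = rng.normal(size=(n_trials, 64))    # hypothetical EEG band-power features
    fnirs = rng.normal(size=(n_trials, 32))  # hypothetical HbO/HbR features
    y = rng.integers(0, 2, n_trials)         # task vs. rest labels

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    for name, X in [("EEG", eeg), ("fNIRS", fnirs),
                    ("EEG+fNIRS", np.hstack([eeg, fnirs]))]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: cross-validated accuracy = {acc:.2f}")
    ```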

  20. Scoring functions for protein-protein interactions.

    PubMed

    Moal, Iain H; Moretti, Rocco; Baker, David; Fernández-Recio, Juan

    2013-12-01

    The computational evaluation of protein-protein interactions will play an important role in organising the wealth of data being generated by high-throughput initiatives. Here we discuss future applications, report recent developments and identify areas requiring further investigation. Many functions have been developed to quantify the structural and energetic properties of interacting proteins, finding use in interrelated challenges revolving around the relationship between sequence, structure and binding free energy. These include loop modelling, side-chain refinement, docking, multimer assembly, affinity prediction, affinity change upon mutation, hotspots location and interface design. Information derived from models optimised for one of these challenges can be used to benefit the others, and can be unified within the theoretical frameworks of multi-task learning and Pareto-optimal multi-objective learning. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Computer Assistance in Information Work. Part I: Conceptual Framework for Improving the Computer/User Interface in Information Work. Part II: Catalog of Acceleration, Augmentation, and Delegation Functions in Information Work.

    ERIC Educational Resources Information Center

    Paisley, William; Butler, Matilda

    This study of the computer/user interface investigated the role of the computer in performing information tasks that users now perform without computer assistance. Users' perceptual/cognitive processes are to be accelerated or augmented by the computer; a long-term goal is to delegate information tasks entirely to the computer. Cybernetic and…

  2. Assessment of Computer and Information Literacy in ICILS 2013: Do Different Item Types Measure the Same Construct?

    ERIC Educational Resources Information Center

    Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia

    2017-01-01

    The combination of different item formats is found quite often in large scale assessments, and analyses on the dimensionality often indicate multi-dimensionality of tests regarding the task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…

  3. Costs and benefits of integrating information between the cerebral hemispheres: a computational perspective.

    PubMed

    Belger, A; Banich, M T

    1998-07-01

    Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.

  4. Subjective and objective quantification of physician's workload and performance during radiation therapy planning tasks.

    PubMed

    Mazur, Lukasz M; Mosaly, Prithima R; Hoyle, Lesley M; Jones, Ellen L; Marks, Lawrence B

    2013-01-01

    To quantify, and compare, workload for several common physician-based treatment planning tasks using objective and subjective measures of workload. To assess the relationship between workload and performance to define workload levels where performance could be expected to decline. Nine physicians performed the same 3 tasks on each of 2 cases ("easy" vs "hard"). Workload was assessed objectively throughout the tasks (via monitoring of pupil size and blink rate), and subjectively at the end of each case (via the National Aeronautics and Space Administration Task Load Index; NASA-TLX). NASA-TLX assesses 6 dimensions (mental, physical, and temporal demands, frustration, effort, and performance); scores of approximately 50 or higher are associated with reduced performance in other industries. Performance was measured using participants' stated willingness to approve the treatment plan. Differences in subjective and objective workload between cases, tasks, and experience were assessed using analysis of variance (ANOVA). Correlations between subjective and objective workload measures were assessed via the Pearson correlation test. The relationships between workload and performance measures were assessed using the t test. Eighteen case-wise and 54 task-wise assessments were obtained. Subjective NASA-TLX scores (P < .001), but not time-weighted averages of objective scores (P > .1), were significantly lower for the easy vs hard case. Most correlations between the subjective and objective measures were not significant, except between average blink rate and NASA-TLX scores (r = -0.34, P = .02) for task-wise assessments. Performance appeared to decline at NASA-TLX scores of ≥55. The NASA-TLX may provide a reasonable method to quantify subjective workload for broad activities, and objective physiologic eye-based measures may be useful to monitor workload for more granular tasks within activities. The subjective and objective measures, as herein quantified, do not necessarily track each other, and more work is needed to assess their utilities. From a series of controlled experiments, we found that performance appears to decline at subjective workload levels ≥55 (as measured via NASA-TLX), which is consistent with findings from other industries. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
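
    For reference, the NASA-TLX total weighted score used above is conventionally computed by weighting six subscale ratings (0-100) by the number of times each dimension is selected across 15 pairwise comparisons (weights sum to 15). A small sketch with made-up ratings and weights:

    ```python
    def tlx_weighted_score(ratings, weights):
        """Total weighted TLX: subscale ratings (0-100) x pairwise weights (sum 15)."""
        assert set(ratings) == set(weights) and sum(weights.values()) == 15
        return sum(ratings[d] * weights[d] for d in ratings) / 15.0

    ratings = {"mental": 70, "physical": 20, "temporal": 55,
               "performance": 40, "effort": 65, "frustration": 50}
    weights = {"mental": 5, "physical": 0, "temporal": 3,
               "performance": 2, "effort": 4, "frustration": 1}
    print(tlx_weighted_score(ratings, weights))   # -> 60.33...
    ```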

  5. A new parallel DNA algorithm to solve the task scheduling problem based on inspired computational model.

    PubMed

    Wang, Zhaocai; Ji, Zuwen; Wang, Xiaoming; Wu, Tunhua; Huang, Wei

    2017-12-01

    As a promising approach to solving computationally intractable problems, DNA computing is an emerging research area spanning mathematics, computer science and molecular biology. The task scheduling problem, a well-known NP-complete problem, arranges n jobs among m individuals and finds the minimum execution time of the last finished individual. In this paper, we use a biologically inspired computational model and describe a new parallel algorithm to solve the task scheduling problem by basic DNA molecular operations. In turn, we skillfully design flexible-length DNA strands to represent elements of the allocation matrix, take appropriate biological experiment operations and obtain solutions of the task scheduling problem in the proper length range with less than O(n²) time complexity. Copyright © 2017. Published by Elsevier B.V.
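
    For contrast with the molecular approach, a conventional brute-force sketch of the same problem: assign n jobs to m individuals and minimize the completion time of the last to finish (the makespan). The m^n-sized search space is exactly what motivates the massive parallelism of DNA computing; the job durations here are arbitrary.

    ```python
    from itertools import product

    def min_makespan(jobs, m):
        """Exhaustively search all m**n assignments of jobs to m workers."""
        best = float("inf")
        for assignment in product(range(m), repeat=len(jobs)):
            loads = [0.0] * m
            for job, worker in zip(jobs, assignment):
                loads[worker] += job
            best = min(best, max(loads))   # makespan of this assignment
        return best

    print(min_makespan([3, 7, 2, 5, 4], m=2))   # -> 11.0
    ```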

  6. Two aspects of feedforward postural control: anticipatory postural adjustments and anticipatory synergy adjustments.

    PubMed

    Klous, Miriam; Mikulic, Pavle; Latash, Mark L

    2011-05-01

    We used the framework of the uncontrolled manifold hypothesis to explore the relations between anticipatory synergy adjustments (ASAs) and anticipatory postural adjustments (APAs) during feedforward control of vertical posture. ASAs represent a drop in the index of a multimuscle-mode synergy stabilizing the coordinate of the center of pressure in preparation for an action. ASAs reflect early changes in an index of covariation among variables reflecting muscle activation, whereas APAs reflect early changes in muscle activation levels averaged across trials. The assumed purpose of ASAs is to modify stability of performance variables, whereas the purpose of APAs is to change magnitudes of those variables. We hypothesized that ASAs would be seen before APAs and that this finding would be consistent with regard to the muscle-mode composition defined on the basis of different tasks and phases of action. Subjects performed a voluntary body sway task and a quick, bilateral shoulder flexion task under self-paced and reaction time conditions. Surface muscle activity of 12 leg and trunk muscles was analyzed to identify sets of 4 muscle modes for each task and for different phases within the shoulder flexion task. Variance components in the muscle-mode space and indexes of multimuscle-mode synergy stabilizing shift of the center of pressure were computed. ASAs were seen ∼100-150 ms prior to task initiation, before APAs. The results were consistent with respect to different sets of muscle modes defined over the two tasks and different shoulder flexion phases. We conclude that the preparation for a self-triggered postural perturbation is associated with two types of anticipatory adjustments, ASAs and APAs. They reflect different feedforward processes within the hypothetical hierarchical control scheme, resulting in changes in patterns of covariation of elemental variables and in their patterns averaged across trials, respectively. The results show that synergies quantified using dissimilar sets of muscle modes show similar feedforward changes in preparation for action.
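
    The uncontrolled-manifold variance split referenced above can be sketched compactly: trial-to-trial deviations of muscle-mode vectors are projected onto the null space of a Jacobian linking modes to the center-of-pressure shift (the UCM) and onto its orthogonal complement, and a per-degree-of-freedom synergy index is formed. The Jacobian and mode data below are illustrative assumptions, not values from the study.

    ```python
    import numpy as np
    from scipy.linalg import null_space

    rng = np.random.default_rng(3)
    J = np.array([[0.8, -0.3, 0.5, 0.1]])        # 1 x 4 Jacobian: muscle modes -> CoP shift
    modes = rng.normal(size=(50, 4))             # 50 trials of 4 muscle-mode magnitudes

    dev = modes - modes.mean(axis=0)             # trial-to-trial deviations
    ucm_basis = null_space(J)                    # 4 x 3 basis of the UCM
    within = dev @ ucm_basis                     # components leaving CoP unchanged
    ortho = dev @ (J.T / np.linalg.norm(J))      # component that changes CoP

    v_ucm = within.var(axis=0, ddof=1).sum() / ucm_basis.shape[1]   # per-DOF variance
    v_ort = ortho.var(axis=0, ddof=1).sum()
    v_tot = (v_ucm * ucm_basis.shape[1] + v_ort) / 4.0
    print("synergy index:", (v_ucm - v_ort) / v_tot)
    ```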

  7. Temporal coding of reward-guided choice in the posterior parietal cortex

    PubMed Central

    Hawellek, David J.; Wong, Yan T.; Pesaran, Bijan

    2016-01-01

    Making a decision involves computations across distributed cortical and subcortical networks. How such distributed processing is performed remains unclear. We test how the encoding of choice in a key decision-making node, the posterior parietal cortex (PPC), depends on the temporal structure of the surrounding population activity. We recorded spiking and local field potential (LFP) activity in the PPC while two rhesus macaques performed a decision-making task. We quantified the mutual information that neurons carried about an upcoming choice and its dependence on LFP activity. The spiking of PPC neurons was correlated with LFP phases at three distinct time scales in the theta, beta, and gamma frequency bands. Importantly, activity at these time scales encoded upcoming decisions differently. Choice information contained in neural firing varied with the phase of beta and gamma activity. For gamma activity, maximum choice information occurred at the same phase as the maximum spike count. However, for beta activity, choice information and spike count were greatest at different phases. In contrast, theta activity did not modulate the encoding properties of PPC units directly but was correlated with beta and gamma activity through cross-frequency coupling. We propose that the relative timing of local spiking and choice information reveals temporal reference frames for computations in either local or large-scale decision networks. Differences between the timing of task information and activity patterns may be a general signature of distributed processing across large-scale networks. PMID:27821752
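
    A toy sketch of the phase-resolved information analysis described above: spikes are binned by the concurrent LFP phase, and the mutual information between spike counts and choice is estimated within each phase bin (here with scikit-learn's discrete MI estimator). The simulated data build in a phase-dependent code so the effect is visible; none of this reproduces the paper's estimators.

    ```python
    import numpy as np
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(7)
    n = 400
    choice = rng.integers(0, 2, n)                          # left/right choice per trial
    phase = rng.uniform(-np.pi, np.pi, n)                   # LFP phase at spike time
    spikes = rng.poisson(2 + choice * (np.cos(phase) + 1))  # built-in phase-dependent code

    edges = np.linspace(-np.pi, np.pi, 5)[1:-1]             # four phase bins
    bins = np.digitize(phase, edges)
    for b in range(4):
        sel = bins == b
        mi = mutual_info_score(choice[sel], spikes[sel])
        print(f"phase bin {b}: MI = {mi:.3f} nats")
    ```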

  8. Learning rational temporal eye movement strategies.

    PubMed

    Hoppe, David; Rothkopf, Constantin A

    2016-07-19

    During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.

  9. Health Literacy and Task Environment Influence Parents' Burden for Data Entry on Child-Specific Health Information: Randomized Controlled Trial

    PubMed Central

    Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia

    2011-01-01

    Background Health care systems increasingly rely on patients’ data entry efforts to organize and assist in care delivery through health information exchange. Objectives We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents’ health literacy on the task burden. Methods We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. Results We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F1,178 = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents’ TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. Conclusions A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden. Trial registration Clinicaltrials.gov NCT00543257; http://www.clinicaltrials.gov/ct2/show/NCT00543257 (Archived by WebCite at http://www.webcitation.org/5vUVH2DYR) PMID:21269990

  10. Computational Virtual Reality (VR) as a human-computer interface in the operation of telerobotic systems

    NASA Technical Reports Server (NTRS)

    Bejczy, Antal K.

    1995-01-01

    This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.

  11. Development of Physical Employment Standards for the Royal Australian Navy: Validation of Identified Whole-of-ship Tasks

    DTIC Science & Technology

    2014-11-01

    …to determining individual CS tasks, this partially satisfied Navy's request to quantify the physical demands of the course in order to draw parity … will enable comparison between task demands on the course and during on-board duties. These data will be used to determine whether there is parity … between the physical and physiological demands of the ACSC (or components of it) and CS tasks performed on-board each platform. If parity is drawn…

  12. Aging may negatively impact movement smoothness during stair negotiation.

    PubMed

    Dixon, P C; Stirling, L; Xu, X; Chang, C C; Dennerlein, J T; Schiffman, J M

    2018-05-26

    Stairs represent a barrier to safe locomotion for some older adults, potentially leading to the adoption of a cautious gait strategy that may lack fluidity. This strategy may be characterized as unsmooth; however, stair negotiation smoothness has yet to be quantified. The aims of this study were to assess age- and task-related differences in head and body center of mass (COM) acceleration patterns and smoothness during stair negotiation and to determine if smoothness was associated with the timed "Up and Go" (TUG) test of functional movement. Motion data from nineteen older and twenty young adults performing stair ascent, stair descent, and overground straight walking trials were analyzed and used to compute smoothness based on the log-normalized dimensionless jerk (LDJ) and the velocity spectral arc length (SPARC) metrics. The associations between TUG and smoothness measures were evaluated using Pearson's correlation coefficient (r). Stair tasks increased head and body COM acceleration pattern differences across groups, compared to walking (p < 0.05). LDJ smoothness for the head and body COM decreased in older adults during stair descent, compared to young adults (p ≤ 0.015) and worsened with increasing TUG for all tasks (-0.60 ≤ r ≤ -0.43). SPARC smoothness of the head and body COM increased in older adults, regardless of task (p < 0.001), while correlations showed improved SPARC smoothness with increasing TUG for some tasks (0.33 ≤ r ≤ 0.40). The LDJ outperforms SPARC in identifying age-related stair negotiation adaptations and is associated with performance on a clinical test of gait. Copyright © 2018 Elsevier B.V. All rights reserved.
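
    The LDJ metric used above can be illustrated from a one-dimensional velocity profile following the common Hogan-and-Sternad-style formulation (duration- and amplitude-normalized integrated squared jerk, log-transformed and negated, so higher values mean smoother movement). This is an illustration of the metric, not a re-implementation of the study's pipeline.

    ```python
    import numpy as np

    def log_dimensionless_jerk(vel, fs):
        """LDJ of a 1-D velocity series sampled at fs Hz (higher = smoother)."""
        dt = 1.0 / fs
        T = len(vel) * dt
        jerk = np.gradient(np.gradient(vel, dt), dt)   # second derivative of velocity
        dlj = (T ** 5 / np.max(np.abs(vel)) ** 2) * np.sum(jerk ** 2) * dt
        return -np.log(dlj)

    fs = 100
    t = np.arange(fs) / fs                             # 1 s movement
    smooth = np.sin(np.pi * t)                         # bell-shaped speed profile
    noisy = smooth + 0.05 * np.random.default_rng(0).normal(size=fs)
    print(log_dimensionless_jerk(smooth, fs))          # less negative
    print(log_dimensionless_jerk(noisy, fs))           # more negative (less smooth)
    ```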

  13. Study to design and develop remote manipulator system. [computer simulation of human performance

    NASA Technical Reports Server (NTRS)

    Hill, J. W.; Mcgovern, D. E.; Sword, A. J.

    1974-01-01

    Modeling of human performance in remote manipulation tasks is reported by automated procedures using computers to analyze and count motions during a manipulation task. Performance is monitored by an on-line computer capable of measuring the joint angles of both master and slave and in some cases the trajectory and velocity of the hand itself. In this way the operator's strategies with different transmission delays, displays, tasks, and manipulators can be analyzed in detail for comparison. Some progress is described in obtaining a set of standard tasks and difficulty measures for evaluating manipulator performance.

  14. Motivation and Performance within a Collaborative Computer-Based Modeling Task: Relations between Students' Achievement Goal Orientation, Self-Efficacy, Cognitive Processing, and Achievement

    ERIC Educational Resources Information Center

    Sins, Patrick H. M.; van Joolingen, Wouter R.; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette

    2008-01-01

    Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of…

  15. Examination of the perceived agility and balance during a reactive agility task.

    PubMed

    Stirling, Leia; Eke, Chika; Cain, Stephen M

    2018-01-01

    In vehicle dynamics, it is commonly understood that there is an inverse relationship between stability and maneuverability. However, animal studies have found that stability and maneuverability can coincide. In this study, we examine humans running a reactive agility obstacle and consider the relationship between observational perceived agility and balance, as well as the relationship between quantified surrogates of agility and balance. Recreational athletes (n = 18) completed the agility task while wearing inertial measurement units (IMUs) on their bodies. The task was also video-recorded. An observational study was completed by a separate group of adults (n = 33) who were asked to view the videos and score each athlete on a Likert scale for balance and for agility. The data from the body-worn IMUs were used to estimate quantified surrogate measures for agility and balance, and to assess whether the relationship between the quantified agility and balance was in the same direction as the perceived relationship from the Likert scale responses. Results indicate that athletes who were given a higher Likert agility score were also given a higher balance score (rs = 0.75, p < 0.001). Quantitative surrogates of agility and balance showed this same relationship. Additional insights on technique for this reactive agility task were informed by the quantitative surrogates. We observed the importance of stepping technique in achieving faster completion times. The fast-performing athletes spent a greater proportion of the task in double support and less overall time in single support, indicating increased periods of static stability. The fast-performing athletes did not have a higher body speed, but performed the task with a more efficient technique, using foot placement to enable heading changes, and thus may have had a more efficient path. Similar to animal studies, people use technique to enable agile strategies while also enabling increased balance across the task.

  16. Modeling Cognitive Strategies during Complex Task Performing Process

    ERIC Educational Resources Information Center

    Mazman, Sacide Guzin; Altun, Arif

    2012-01-01

    The purpose of this study is to examine individuals' computer based complex task performing processes and strategies in order to determine the reasons of failure by cognitive task analysis method and cued retrospective think aloud with eye movement data. Study group was five senior students from Computer Education and Instructional Technologies…

  17. Automated Instructional Monitors for Complex Operational Tasks. Final Report.

    ERIC Educational Resources Information Center

    Feurzeig, Wallace

    A computer-based instructional system is described which incorporates diagnosis of students' difficulties in acquiring complex concepts and skills. A computer automatically generated a simulated display. It then monitored and analyzed a student's work in the performance of assigned training tasks. Two major tasks were studied. The first,…

  18. Simple, efficient allocation of modelling runs on heterogeneous clusters with MPI

    USGS Publications Warehouse

    Donato, David I.

    2017-01-01

    In scientific modelling and computation, the choice of an appropriate method for allocating tasks for parallel processing depends on the computational setting and on the nature of the computation. The allocation of independent but similar computational tasks, such as modelling runs or Monte Carlo trials, among the nodes of a heterogeneous computational cluster is a special case that has not been specifically evaluated previously. A simulation study shows that a method of on-demand (that is, worker-initiated) pulling from a bag of tasks in this case leads to reliably short makespans for computational jobs despite heterogeneity both within and between cluster nodes. A simple reference implementation in the C programming language with the Message Passing Interface (MPI) is provided.
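
    A minimal Python/mpi4py sketch of the worker-initiated, on-demand bag-of-tasks pattern the paper evaluates (the paper's reference implementation is in C with MPI, so treat this as an illustrative analogue): rank 0 hands out the next task whenever any worker reports in, so faster nodes naturally absorb more work. Task contents are placeholders.

    ```python
    # Run with e.g.: mpiexec -n 4 python bag_of_tasks.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()
    TASK, STOP = 1, 2

    if rank == 0:                                   # coordinator holds the bag
        bag = list(range(20))                       # 20 modelling runs (placeholders)
        status = MPI.Status()
        active_workers = size - 1
        while active_workers:
            comm.recv(source=MPI.ANY_SOURCE, status=status)   # a worker asks for work
            if bag:
                comm.send(bag.pop(), dest=status.Get_source(), tag=TASK)
            else:
                comm.send(None, dest=status.Get_source(), tag=STOP)
                active_workers -= 1
    else:                                           # workers pull on demand
        status = MPI.Status()
        while True:
            comm.send(None, dest=0)                 # request the next task
            task = comm.recv(source=0, status=status)
            if status.Get_tag() == STOP:
                break
            # ... run the model for `task` here; faster nodes loop back sooner ...
    ```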

  19. Evaluation schemes for video and image anomaly detection algorithms

    NASA Astrophysics Data System (ADS)

    Parameswaran, Shibin; Harguess, Josh; Barngrover, Christopher; Shafer, Scott; Reese, Michael

    2016-05-01

    Video anomaly detection is a critical research area in computer vision. It is a natural first step before applying object recognition algorithms. Many algorithms that detect anomalies (outliers) in videos and images have been introduced in recent years. However, these algorithms behave and perform differently based on differences in the domains and tasks to which they are applied. In order to better understand the strengths and weaknesses of outlier algorithms and their applicability in a particular domain/task of interest, it is important to measure and quantify their performance using appropriate evaluation metrics. Many evaluation metrics have been used in the literature, such as precision curves, precision-recall curves, and receiver operating characteristic (ROC) curves. In order to construct these different metrics, it is also important to choose an appropriate evaluation scheme that decides when a proposed detection is considered a true or a false detection. Choosing the right evaluation metric and the right scheme is very critical, since the choice can introduce positive or negative bias in the measuring criterion and may favor (or work against) a particular algorithm or task. In this paper, we review evaluation metrics and popular evaluation schemes that are used to measure the performance of anomaly detection algorithms on videos and imagery with one or more anomalies. We analyze the biases introduced by these choices by measuring the performance of an existing anomaly detection algorithm.
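
    The metric choices discussed above are easy to demonstrate with scikit-learn: the same per-frame anomaly scores yield both a ROC AUC and a precision-recall AUC, and the two can disagree markedly when anomalies are rare, which is one of the biases at issue. Scores and labels below are simulated.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, average_precision_score

    rng = np.random.default_rng(0)
    y_true = (rng.uniform(size=5000) < 0.02).astype(int)   # ~2% anomalous frames
    scores = rng.normal(size=5000) + 2.0 * y_true          # detector output

    print("ROC AUC:", roc_auc_score(y_true, scores))
    print("PR AUC (average precision):", average_precision_score(y_true, scores))
    ```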

  20. Parallel processing using an optical delay-based reservoir computer

    NASA Astrophysics Data System (ADS)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs). Semiconductor ring lasers are semiconductor lasers where the laser cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be simultaneously computed using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time series prediction and classification for nonlinear channel equalization. We take advantage of the different directional modes to process the individual tasks: each directional mode processes one task, to mitigate possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained even in the presence of noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].

  1. Importance of baseline in event-related desynchronization during a combination task of motor imagery and motor observation

    NASA Astrophysics Data System (ADS)

    Tangwiriyasakul, Chayanin; Verhagen, Rens; van Putten, Michel J. A. M.; Rutten, Wim L. C.

    2013-04-01

    Objective. Event-related desynchronization (ERD) or synchronization (ERS) refers to the modulation of any EEG rhythm in response to a particular event. It is typically quantified as the ratio between a baseline and a task condition (the event). Here, we focused on the sensorimotor mu-rhythm. We explored the effects of different baselines on mu-power and ERD of the mu-rhythm during a motor imagery task. Methods. Eighteen healthy subjects performed motor imagery tasks while EEGs were recorded. Five different baseline movies were shown. For the imagery task a right-hand opening/closing movie was shown. Power and ERD of the mu-rhythm recorded over C3 and C4 for the different baselines were estimated. Main Results. 50% of the subjects showed relatively high mu-power for specific baselines only, and ERDs of these subjects were strongly dependent on the baseline used. In 17% of the subjects no preference was found. Contralateral ERD of the mu-rhythm was found in about 67% of the healthy volunteers, with a significant baseline preference in about 75% of that subgroup. Significance. The sensorimotor ERD quantifies activity of the brain during motor imagery tasks. Selection of the optimal baseline increases ERD.
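
    As a reference point for the quantity being measured above, a small sketch of an ERD computation: mu-band (8-12 Hz) power is estimated for a baseline and a task segment via Welch's method, and ERD% is the relative power drop from baseline. The sampling rate and signals are simulated.

    ```python
    import numpy as np
    from scipy.signal import welch

    def bandpower(x, fs, lo=8.0, hi=12.0):
        """Integrated Welch PSD in the [lo, hi] Hz band."""
        f, pxx = welch(x, fs=fs, nperseg=fs * 2)
        mask = (f >= lo) & (f <= hi)
        return np.sum(pxx[mask]) * (f[1] - f[0])

    fs = 256
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    baseline = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)    # strong mu
    task = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # suppressed mu

    p_base, p_task = bandpower(baseline, fs), bandpower(task, fs)
    print(f"ERD = {100.0 * (p_base - p_task) / p_base:.1f}%")
    ```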

  2. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    PubMed

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  3. Morphosyntax and Logical Abilities in Italian Poor Readers: The Problem of SLI Under-Identification

    ERIC Educational Resources Information Center

    Arosio, Fabrizio; Pagliarini, Elena; Perugini, Maria; Barbieri, Lina; Guasti, Maria Teresa

    2016-01-01

    The study investigated morphosyntactic abilities and semantic-pragmatic competence in 24 children with developmental dyslexia aged 7-12 years. Morphosyntactic abilities were investigated in a direct object clitic production task, and semantic-pragmatic competence in a quantifier comprehension task. Children with dyslexia produced fewer clitics than…

  4. Big climate data analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to find the suitable method, that is, the mode of estimation and uncertainty-measure determination that optimizes a selected measure for prescribed values close to the initial estimates. Also here, intelligent exploration methods (gradient, Brent, etc.) are useful. The third task is to apply the optimal estimation method to the climate dataset. This conference paper illustrates by means of three examples that optimal estimation has the potential to shape future big climate data analysis. First, we consider various hypothesis tests to study whether climate extremes are increasing in their occurrence. Second, we compare Pearson's and Spearman's correlation measures. Third, we introduce a novel estimator of the tail index, which helps to better quantify climate-change related risks.

  5. Real-Time Non-Intrusive Assessment of Viewing Distance during Computer Use.

    PubMed

    Argilés, Marc; Cardona, Genís; Pérez-Cabré, Elisabet; Pérez-Magrané, Ramon; Morcego, Bernardo; Gispets, Joan

    2016-12-01

    To develop and test the sensitivity of an ultrasound-based sensor to assess the viewing distance of visual display terminals operators in real-time conditions. A modified ultrasound sensor was attached to a computer display to assess viewing distance in real time. Sensor functionality was tested on a sample of 20 healthy participants while they conducted four 10-minute randomly presented typical computer tasks (a match-three puzzle game, a video documentary, a task requiring participants to complete a series of sentences, and a predefined internet search). The ultrasound sensor offered good measurement repeatability. Game, text completion, and web search tasks were conducted at shorter viewing distances (54.4 cm [95% CI 51.3-57.5 cm], 54.5 cm [95% CI 51.1-58.0 cm], and 54.5 cm [95% CI 51.4-57.7 cm], respectively) than the video task (62.3 cm [95% CI 58.9-65.7 cm]). Statistically significant differences were found between the video task and the other three tasks (all p < 0.05). Range of viewing distances (from 22 to 27 cm) was similar for all tasks (F = 0.996; p = 0.413). Real-time assessment of the viewing distance of computer users with a non-intrusive ultrasonic device disclosed a task-dependent pattern.

  6. Failure Impact Analysis of Key Management in AMI Using Cybernomic Situational Assessment (CSA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R

    2013-01-01

    In earlier work, we presented a computational framework for quantifying the security of a system in terms of the average loss a stakeholder stands to sustain as a result of threats to the system. We named this system the Cyberspace Security Econometrics System (CSES). In this paper, we refine the framework and apply it to cryptographic key management within the Advanced Metering Infrastructure (AMI) as an example. The stakeholders, requirements, components, and threats are determined. We then populate the matrices with justified values by addressing the AMI at a higher level, rather than trying to consider every piece of hardware and software involved. We accomplish this task by leveraging the recently established NISTIR 7628 guideline for smart grid security. This allowed us to choose the stakeholders, requirements, components, and threats realistically. We reviewed the literature and selected an industry technical working group to select three representative threats from a collection of 29 threats. From this subset, we populate the stakes, dependency, and impact matrices, and the threat vector with realistic numbers. Each stakeholder's Mean Failure Cost is then computed.

  7. A Biosequence-based Approach to Software Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehmen, Christopher S.; Peterson, Elena S.; Phillips, Aaron R.

    For many applications, it is desirable to have some process for recognizing when software binaries are closely related without relying on them to be identical or have identical segments. Some examples include monitoring utilization of high performance computing centers or service clouds, detecting freeware in licensed code, and enforcing application whitelists. But doing so in a dynamic environment is a nontrivial task because most approaches to software similarity require extensive and time-consuming analysis of a binary, or they fail to recognize executables that are similar but nonidentical. Presented herein is a novel biosequence-based method for quantifying similarity of executable binaries. Using this method, it is shown in an example application on large-scale multi-author codes that 1) the biosequence-based method has a statistical performance in recognizing and distinguishing between a collection of real-world high performance computing applications better than 90% of ideal; and 2) an example of using family tree analysis to tune identification for a code subfamily can achieve better than 99% of ideal performance.

  8. Information presentation through a head-worn display (“smart glasses”) has a smaller influence on the temporal structure of gait variability during dual-task gait compared to handheld displays (paper-based system and smartphone)

    PubMed Central

    Sedighi, Alireza; Ulman, Sophia M.

    2018-01-01

    The need to complete multiple tasks concurrently is a common occurrence in both daily life and occupational activities, which can often include simultaneous cognitive and physical demands. As one example, there is increasing availability of head-worn display technologies that can be employed when a user is mobile (e.g., while walking). This new method of information presentation may, however, introduce risks of adverse outcomes such as a decrement to gait performance. The goal of this study was thus to quantify the effects of a head-worn display (i.e., smart glasses) on motor variability during gait and to compare these effects with those of other common information displays (i.e., smartphone and paper-based system). Twenty participants completed four walking conditions, as a single task and in three dual-task conditions (three information displays). In the dual-task conditions, the information display was used to present several cognitive tasks. Three different measures were used to quantify variability in gait parameters for each walking condition (the cycle-to-cycle standard deviation, sample entropy, and the "goal-equivalent manifold" approach). Our results indicated that participants used less adaptable gait strategies in dual-task walking with the paper-based system and smartphone compared with single-task walking. Gait performance, however, was less affected during dual-task walking with the smart glasses. We conclude that the risk of an adverse gait event (e.g., a fall) in head-down walking conditions (i.e., the paper-based system and smartphone conditions) was higher than in single-task walking, and that head-worn displays might help reduce the risk of such events during dual-task gait conditions. PMID:29630614

  9. Differences in the activation and co-activation ratios of the four subdivisions of trapezius between genders following a computer typing task.

    PubMed

    Szucs, Kimberly A; Molnar, Megan

    2017-04-01

    The aim of this study was to provide a description of gender differences in the activation patterns of the four subdivisions of the trapezius (clavicular, upper, middle, lower) following a 60 min computer work task. Surface EMG was collected from these subdivisions from 21 healthy subjects during bilateral arm elevation pre-/post-task. Subjects completed a standardized 60 min computer work task at a standard, ergonomic workstation. Normalized activation and activation ratios of each trapezius subdivision were compared between genders and conditions with repeated measures ANOVAs. The interaction effect of Gender × Condition for upper trapezius % activation approached significance at p = 0.051, with males demonstrating greater activation post-task. The main effect of Condition was statistically significant for % activation of the middle and lower trapezius (p < 0.05), with both muscles demonstrating increased activation post-task. There was a statistically significant interaction effect of Gender × Condition for the Middle Trapezius/Upper Trapezius ratio and a main effect of Condition for the Clavicular Trapezius/Upper Trapezius ratio, with a decreased ratio post-typing. Gender differences exist following 60 min of a low-force computer typing task. Imbalances in muscle activation and activation ratios following computer work may affect total shoulder kinematics and should be further explored. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Teaching Scientists to Communicate: Evidence-based assessment for undergraduate science education

    NASA Astrophysics Data System (ADS)

    Mercer-Mapstone, Lucy; Kuchel, Louise

    2015-07-01

    Communication skills are one of five nationally recognised learning outcomes for an Australian Bachelor of Science (BSc) degree. Previous evidence indicates that communication skills taught in Australian undergraduate science degrees are not developed sufficiently to meet the requirements of the modern-day workplace-a problem faced in the UK and USA also. Curriculum development in this area, however, hinges on first evaluating how communication skills are taught currently as a base from which to make effective changes. This study aimed to quantify the current standard of communication education within BSc degrees at Australian research-intensive universities. A detailed evidential baseline for not only what but also how communication skills are being taught was established. We quantified which communication skills were taught and assessed explicitly, implicitly, or were absent in a range of undergraduate science assessment tasks (n = 35) from four research-intensive Australian universities. Results indicate that 10 of the 12 core science communication skills used for evaluation were absent from more than 50% of assessment tasks and 77.14% of all assessment tasks taught less than 5 core communication skills explicitly. The design of assessment tasks significantly affected whether communication skills were taught explicitly. Prominent trends were that communication skills in tasks aimed at non-scientific audiences were taught more explicitly than in tasks aimed at scientific audiences, and the majority of group and multimedia tasks taught communication elements more explicitly than individual, or written and oral tasks. Implications for science communication in the BSc and further research are discussed.

  11. PowerPlay: Training an Increasingly General Problem Solver by Continually Searching for the Simplest Still Unsolvable Problem

    PubMed Central

    Schmidhuber, Jürgen

    2013-01-01

    Most of computer science focuses on automatically solving given computational problems. I focus on automatically inventing or discovering problems in a way inspired by the playful behavior of animals and humans, to train a more and more general problem solver from scratch in an unsupervised fashion. Consider the infinite set of all computable descriptions of tasks with possibly computable solutions. Given a general problem-solving architecture, at any given time, the novel algorithmic framework PowerPlay (Schmidhuber, 2011) searches the space of possible pairs of new tasks and modifications of the current problem solver, until it finds a more powerful problem solver that provably solves all previously learned tasks plus the new one, while the unmodified predecessor does not. Newly invented tasks may require achieving a wow-effect by making previously learned skills more efficient such that they require less time and space. New skills may (partially) re-use previously learned skills. The greedy search of typical PowerPlay variants uses time-optimal program search to order candidate pairs of tasks and solver modifications by their conditional computational (time and space) complexity, given the stored experience so far. The new task and its corresponding task-solving skill are those first found and validated. This biases the search toward pairs that can be described compactly and validated quickly. The computational costs of validating new tasks need not grow with task repertoire size. Standard problem solver architectures of personal computers or neural networks tend to generalize by solving numerous tasks outside the self-invented training set; PowerPlay’s ongoing search for novelty keeps breaking the generalization abilities of its present solver. This is related to Gödel’s sequence of increasingly powerful formal theories based on adding formerly unprovable statements to the axioms without affecting previously provable theorems. The continually increasing repertoire of problem-solving procedures can be exploited by a parallel search for solutions to additional externally posed tasks. PowerPlay may be viewed as a greedy but practical implementation of basic principles of creativity (Schmidhuber, 2006a, 2010). A first experimental analysis can be found in separate papers (Srivastava et al., 2012a,b, 2013). PMID:23761771

  12. Aviation Technician Training I and Task Analyses: Semester II. Field Review Copy.

    ERIC Educational Resources Information Center

    Upchurch, Richard

    This guide for aviation technician training begins with a course description, resource information, and a course outline. Tasks/competencies are categorized into 16 concept/duty areas: understanding technical symbols and abbreviations; understanding mathematical terms, symbols, and formulas; computing decimals; computing fractions; computing ratio…

  13. Quantifying the Physiological Stress Response to Simulated Maritime Pilotage Tasks

    PubMed Central

    Main, Luana C.; Wolkow, Alexander; Chambers, Timothy P.

    2017-01-01

    Objective: The aim of this study was to quantify the stress associated with performing maritime pilotage tasks in a high-fidelity simulator. Methods: Eight trainee and 13 maritime pilots completed two simulated pilotage tasks of varying complexity. Salivary cortisol samples were collected pre- and post-simulation for both trials. Heart rate was measured continuously throughout the study. Results: Significant changes in salivary cortisol (P = 0.000, η2 = 0.139), average (P = 0.006, η2 = 0.087), and peak heart rate (P = 0.013, η2 = 0.077) from pre- to postsimulation were found. Varying task complexity did partially influence stress response; average (P = 0.016, η2 = 0.026) and peak heart rate (P = 0.034, η2 = 0.020) were higher in the experimental condition. Trainees also recorded higher average (P = 0.000, η2 = 0.054) and peak heart rates (P = 0.027, η2 = 0.022). Conclusion: Performing simulated pilotage tasks evoked a measurable stress response in both trainee and expert maritime pilots. PMID:28922309

  14. Complexity quantification of dense array EEG using sample entropy analysis.

    PubMed

    Ramanand, Pravitha; Nampoori, V P N; Sreenivasan, R

    2004-09-01

    In this paper, a time series complexity analysis of dense array electroencephalogram signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to purely stochastic realm. The present analysis is conducted with an objective of gaining insight into complexity variations related to changing brain dynamics for EEG recorded from the three cases of passive, eyes closed condition, a mental arithmetic task and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited for short physiological signals such as the EEG and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue inducing exercise period. This enhances its utility in detecting subtle changes in the brain state that can find wider scope for applications in EEG based brain studies.
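
    A compact sketch of the SampEn statistic described above: SampEn(m, r) is the negative logarithm of the conditional probability that sequences matching for m points (within tolerance r, scaled here by the signal's standard deviation) also match for m + 1 points. This is a simplified illustration; production implementations handle edge cases and long records more carefully.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """SampEn(m, r) with tolerance r given as a fraction of the signal SD."""
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def match_pairs(mm):
            tpl = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(tpl[:, None] - tpl[None, :]), axis=2)  # Chebyshev distance
            return (np.sum(d <= tol) - len(tpl)) / 2                 # exclude self-matches

        return -np.log(match_pairs(m + 1) / match_pairs(m))

    rng = np.random.default_rng(0)
    print(sample_entropy(rng.normal(size=500)))          # white noise: high SampEn
    print(sample_entropy(np.sin(0.1 * np.arange(500))))  # regular signal: near zero
    ```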

  15. A comparison of symptoms after viewing text on a computer screen and hardcopy.

    PubMed

    Chu, Christina; Rosenfield, Mark; Portello, Joan K; Benzoni, Jaclyn A; Collier, Juanita D

    2011-01-01

    Computer vision syndrome (CVS) is a complex of eye and vision problems experienced during or related to computer use. Ocular symptoms may include asthenopia, accommodative and vergence difficulties and dry eye. CVS occurs in up to 90% of computer workers, and given the almost universal use of these devices, it is important to identify whether these symptoms are specific to computer operation, or are simply a manifestation of performing a sustained near-vision task. This study compared ocular symptoms immediately following a sustained near task. 30 young, visually-normal subjects read text aloud either from a desktop computer screen or a printed hardcopy page at a viewing distance of 50 cm for a continuous 20 min period. Identical text was used in the two sessions, which was matched for size and contrast. Target viewing angle and luminance were similar for the two conditions. Immediately following completion of the reading task, subjects completed a written questionnaire asking about their level of ocular discomfort during the task. When comparing the computer and hardcopy conditions, significant differences in median symptom scores were reported with regard to blurred vision during the task (t = 147.0; p = 0.03) and the mean symptom score (t = 102.5; p = 0.04). In both cases, symptoms were higher during computer use. Symptoms following sustained computer use were significantly worse than those reported after hard copy fixation under similar viewing conditions. A better understanding of the physiology underlying CVS is critical to allow more accurate diagnosis and treatment. This will allow practitioners to optimize visual comfort and efficiency during computer operation.

  16. Pattern of Non-Task Interactions in Asynchronous Computer-Supported Collaborative Learning Courses

    ERIC Educational Resources Information Center

    Abedin, Babak; Daneshgar, Farhad; D'Ambra, John

    2014-01-01

    Despite the importance of the non-task interactions in computer-supported collaborative learning (CSCL) environments as emphasized in the literature, few studies have investigated online behavior of people in the CSCL environments. This paper studies the pattern of non-task interactions among postgraduate students in an Australian university. The…

  17. Strategy Generalization across Orientation Tasks: Testing a Computational Cognitive Model

    ERIC Educational Resources Information Center

    Gunzelmann, Glenn

    2008-01-01

    Humans use their spatial information processing abilities flexibly to facilitate problem solving and decision making in a variety of tasks. This article explores the question of whether a general strategy can be adapted for performing two different spatial orientation tasks by testing the predictions of a computational cognitive model. Human…

  18. Learner Use of Holistic Language Units in Multimodal, Task-Based Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Collentine, Karina

    2009-01-01

    Second language acquisition (SLA) researchers strive to understand the language and exchanges that learners generate in synchronous computer-mediated communication (SCMC). Doughty and Long (2003) advocate replacing open-ended SCMC with task-based language teaching (TBLT) design principles. Since most task-based SCMC (TB-SCMC) research addresses an…

  19. Integrating human and machine intelligence in galaxy morphology classification tasks

    NASA Astrophysics Data System (ADS)

    Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl

    2018-06-01

    Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold, classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
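
    The core of a SWAP-style aggregation step can be written compactly: each volunteer carries an empirically tracked confusion matrix, and each vote updates a subject's posterior via Bayes' rule until it crosses a retirement threshold. The sketch below is a hedged reconstruction of that idea for a binary "smooth vs. featured" question; all names, priors, skills, and thresholds are illustrative, not the published SWAP code.

```python
def swap_update(p, label, user_skill):
    """One Bayesian update of a subject's probability of being 'smooth'.

    p          -- current posterior P(smooth)
    label      -- volunteer's vote: 'smooth' or 'featured'
    user_skill -- (P(says smooth | smooth), P(says featured | featured)),
                  the volunteer's empirically estimated confusion matrix.
    """
    p_s_given_s, p_f_given_f = user_skill
    if label == 'smooth':
        num = p_s_given_s * p
        den = num + (1.0 - p_f_given_f) * (1.0 - p)
    else:
        num = (1.0 - p_s_given_s) * p
        den = num + p_f_given_f * (1.0 - p)
    return num / den

# Illustrative retirement loop: start at an assumed class prior and
# retire a subject once its posterior crosses either threshold.
p = 0.3                                  # hypothetical prior P(smooth)
votes = [('smooth', (0.8, 0.7)), ('smooth', (0.9, 0.85))]
for label, skill in votes:
    p = swap_update(p, label, skill)
    if p > 0.99 or p < 0.01:             # hypothetical retirement thresholds
        break
print(f"posterior P(smooth) = {p:.3f}")
```

    Retiring subjects as soon as their posterior is decisive is what produces the large classification-rate gains reported above: confident subjects stop consuming volunteer effort early.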

  20. Optimal beamforming in ultrasound using the ideal observer.

    PubMed

    Abbey, Craig K; Nguyen, Nghia Q; Insana, Michael F

    2010-08-01

    Beamforming of received pulse-echo data generally involves the compression of signals from multiple channels within an aperture. This compression is irreversible, and therefore allows the possibility that information relevant for performing a diagnostic task is irretrievably lost. The purpose of this study was to evaluate information transfer in beamforming using a previously developed ideal observer model to quantify diagnostic information relevant to performing a task. We describe an elaborated statistical model of image formation for fixed-focus transmission and single-channel reception within a moving aperture, and we use this model on a panel of tasks related to breast sonography to evaluate receive-beamforming approaches that optimize the transfer of information. Under the assumption that acquisition noise is well described as an additive wide-band Gaussian white-noise process, we show that signal compression across receive-aperture channels after a 2-D matched-filtering operation results in no loss of diagnostic information. Across tasks, the matched-filter beamformer yields a factor of two more information in the subsequent radio-frequency signal than standard delay-and-sum beamforming. We also show that for this matched filter, 68% of the information gain can be attributed to the phase of the matched filter and 21% can be attributed to the amplitude. A 1-D matched filtering along axial lines shows no advantage over delay-and-sum, suggesting an important role for incorporating correlations across different aperture windows in beamforming. We also show that post-compression processing before the computation of an envelope is necessary to pass the diagnostic information in the beamformed radio-frequency signal to the final envelope image.
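
    The information-preservation claim rests on a standard sufficiency argument, sketched here in hedged form for a binary task with equal-covariance Gaussian statistics (our notation, not the paper's):

```latex
% Received channel data under hypothesis i, with white Gaussian noise:
%   y = s_i + n,   n ~ N(0, sigma^2 I).
% The ideal observer's log-likelihood ratio reduces to
\lambda(\mathbf{y}) =
  \frac{1}{\sigma^{2}}\,(\mathbf{s}_{1}-\mathbf{s}_{0})^{\mathsf T}\mathbf{y}
  + \text{const},
% which depends on the data only through the matched-filter output
% (s_1 - s_0)^T y: compressing across aperture channels *after* matched
% filtering therefore discards no task-relevant information.
```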

  1. A Framework for Load Balancing of Tensor Contraction Expressions via Dynamic Task Partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Pai-Wei; Stock, Kevin; Rajbhandari, Samyam

    In this paper, we introduce Dynamic Load-balanced Tensor Contractions (DLTC), a domain-specific library for efficient task-parallel execution of tensor contraction expressions, a class of computation encountered in quantum chemistry and physics. Our framework decomposes each contraction into smaller units of work, represented by an abstraction referred to as iterators. We exploit an extra level of parallelism by having tasks across independent contractions executed concurrently through a dynamic load-balancing runtime. We demonstrate improved performance, scalability, and flexibility for the computation of tensor contraction expressions on parallel computers using examples from coupled cluster methods.
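
    A minimal sketch of the decomposition idea, with a NumPy matrix multiplication standing in for a general tensor contraction and a thread pool's shared queue standing in for the dynamic load balancer; the tile size, names, and executor choice are illustrative assumptions, not the DLTC API.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def tile_tasks(A, B, tile=64):
    """Decompose C[i,j] = sum_k A[i,k] * B[k,j] into independent tile tasks."""
    n = A.shape[0]
    for i in range(0, n, tile):
        for j in range(0, n, tile):
            yield (i, j)

def run_contraction(A, B, tile=64, workers=4):
    n = A.shape[0]
    C = np.zeros((n, n))

    def unit(ij):
        i, j = ij
        # Each task computes one disjoint output tile; tasks from
        # independent contractions could be mixed into the same pool.
        C[i:i + tile, j:j + tile] = A[i:i + tile, :] @ B[:, j:j + tile]

    # The executor's shared work queue plays the role of the dynamic
    # load balancer: idle workers pull the next unit of work.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(unit, tile_tasks(A, B, tile)))
    return C

A = np.random.rand(256, 256)
B = np.random.rand(256, 256)
assert np.allclose(run_contraction(A, B), A @ B)
```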

  2. Mobile and fixed computer use by doctors and nurses on hospital wards: multi-method study on the relationships between clinician role, clinical task, and device choice.

    PubMed

    Andersen, Pia; Lindgaard, Anne-Mette; Prgomet, Mirela; Creswick, Nerida; Westbrook, Johanna I

    2009-08-04

    Selecting the right mix of stationary and mobile computing devices is a significant challenge for system planners and implementers. There is very limited research evidence upon which to base such decisions. We aimed to investigate the relationships between clinician role, clinical task, and selection of a computer hardware device in hospital wards. Twenty-seven nurses and eight doctors were observed for a total of 80 hours as they used a range of computing devices to access a computerized provider order entry system on two wards at a major Sydney teaching hospital. Observers used a checklist to record the clinical tasks completed, devices used, and location of the activities. Field notes were also documented during observations. Semi-structured interviews were conducted after observation sessions. Assessment was made of the physical attributes of three devices: stationary PCs, computers on wheels (COWs), and tablet PCs. Two types of COWs were available on the wards: generic COWs (laptops mounted on trolleys) and ergonomic COWs (an integrated computer and cart device). Heuristic evaluation of the user interfaces was also carried out. The majority (93.1%) of observed nursing tasks were conducted using generic COWs. Most nursing tasks were performed in patients' rooms (57%) or in the corridors (36%), with a small percentage at a patient's bedside (5%). Most nursing tasks related to the preparation and administration of drugs. Doctors on ward rounds conducted 57.3% of observed clinical tasks on generic COWs and 35.9% on tablet PCs. On rounds, 56% of doctors' tasks were performed in the corridors, 29% in patients' rooms, and 3% at the bedside. Doctors not on a ward round conducted 93.6% of tasks using stationary PCs, most often within the doctors' office. Nurses and doctors were observed performing workarounds, such as transcribing medication orders from the computer to paper. The choice of device was related to clinical role, nature of the clinical task, degree of mobility required, including where task completion occurs, and device design. Nurses' work, and clinical tasks performed by doctors during ward rounds, require highly mobile computer devices. Nurses and doctors on ward rounds showed a strong preference for generic COWs over all other devices. Tablet PCs were selected by doctors for only a small proportion of clinical tasks. Even when using mobile devices clinicians completed a very low proportion of observed tasks at the bedside. The design of the devices and ward space configurations place limitations on how and where devices are used and on the mobility of clinical work. In such circumstances, clinicians will initiate workarounds to compensate. In selecting hardware devices, consideration should be given to who will be using the devices, the nature of their work, and the physical layout of the ward.

  3. Mobile and Fixed Computer Use by Doctors and Nurses on Hospital Wards: Multi-method Study on the Relationships Between Clinician Role, Clinical Task, and Device Choice

    PubMed Central

    Andersen, Pia; Lindgaard, Anne-Mette; Prgomet, Mirela; Creswick, Nerida

    2009-01-01

    Background Selecting the right mix of stationary and mobile computing devices is a significant challenge for system planners and implementers. There is very limited research evidence upon which to base such decisions. Objective We aimed to investigate the relationships between clinician role, clinical task, and selection of a computer hardware device in hospital wards. Methods Twenty-seven nurses and eight doctors were observed for a total of 80 hours as they used a range of computing devices to access a computerized provider order entry system on two wards at a major Sydney teaching hospital. Observers used a checklist to record the clinical tasks completed, devices used, and location of the activities. Field notes were also documented during observations. Semi-structured interviews were conducted after observation sessions. Assessment of the physical attributes of three devices—stationary PCs, computers on wheels (COWs) and tablet PCs—was made. Two types of COWs were available on the wards: generic COWs (laptops mounted on trolleys) and ergonomic COWs (an integrated computer and cart device). Heuristic evaluation of the user interfaces was also carried out. Results The majority (93.1%) of observed nursing tasks were conducted using generic COWs. Most nursing tasks were performed in patients’ rooms (57%) or in the corridors (36%), with a small percentage at a patient’s bedside (5%). Most nursing tasks related to the preparation and administration of drugs. Doctors on ward rounds conducted 57.3% of observed clinical tasks on generic COWs and 35.9% on tablet PCs. On rounds, 56% of doctors’ tasks were performed in the corridors, 29% in patients’ rooms, and 3% at the bedside. Doctors not on a ward round conducted 93.6% of tasks using stationary PCs, most often within the doctors’ office. Nurses and doctors were observed performing workarounds, such as transcribing medication orders from the computer to paper. Conclusions The choice of device was related to clinical role, nature of the clinical task, degree of mobility required, including where task completion occurs, and device design. Nurses’ work, and clinical tasks performed by doctors during ward rounds, require highly mobile computer devices. Nurses and doctors on ward rounds showed a strong preference for generic COWs over all other devices. Tablet PCs were selected by doctors for only a small proportion of clinical tasks. Even when using mobile devices clinicians completed a very low proportion of observed tasks at the bedside. The design of the devices and ward space configurations place limitations on how and where devices are used and on the mobility of clinical work. In such circumstances, clinicians will initiate workarounds to compensate. In selecting hardware devices, consideration should be given to who will be using the devices, the nature of their work, and the physical layout of the ward. PMID:19674959

  4. Age-related differences in listening effort during degraded speech recognition

    PubMed Central

    Ward, Kristina M.; Shen, Jing; Souza, Pamela E.; Grieco-Calub, Tina M.

    2016-01-01

    Objectives The purpose of the current study was to quantify age-related differences in executive control as it relates to dual-task performance, which is thought to represent listening effort, during degraded speech recognition. Design Twenty-five younger adults (18–24 years) and twenty-one older adults (56–82 years) completed a dual-task paradigm that consisted of a primary speech recognition task and a secondary visual monitoring task. Sentence material in the primary task was either unprocessed or spectrally degraded into 8, 6, or 4 spectral channels using noise-band vocoding. Performance on the visual monitoring task was assessed by the accuracy and reaction time of participants’ responses. Performance on the primary and secondary task was quantified in isolation (i.e., single task) and during the dual-task paradigm. Participants also completed a standardized psychometric measure of executive control, including attention and inhibition. Statistical analyses were implemented to evaluate changes in listeners’ performance on the primary and secondary tasks (1) per condition (unprocessed vs. vocoded conditions); (2) per task (baseline vs. dual task); and (3) per group (younger vs. older adults). Results Speech recognition declined with increasing spectral degradation for both younger and older adults when they performed the task in isolation or concurrently with the visual monitoring task. Older adults were slower and less accurate than younger adults on the visual monitoring task when performed in isolation, which paralleled age-related differences in standardized scores of executive control. When compared to single-task performance, older adults experienced greater declines in secondary-task accuracy, but not reaction time, than younger adults. Furthermore, results revealed that age-related differences in executive control significantly contributed to age-related differences on the visual monitoring task during the dual-task paradigm. Conclusions Older adults experienced significantly greater declines in secondary-task accuracy during degraded speech recognition than younger adults. These findings are interpreted as suggesting that older listeners expended greater listening effort than younger listeners, and may be partially attributed to age-related differences in executive control. PMID:27556526

  5. Incremental generation of answers during the comprehension of questions with quantifiers.

    PubMed

    Bott, Oliver; Augurzky, Petra; Sternefeld, Wolfgang; Ulrich, Rolf

    2017-09-01

    The paper presents a study on the online interpretation of quantified questions involving complex domain restriction, for instance, are all triangles blue that are in the circle. Two probe reaction time (RT) task experiments were conducted to study the incremental nature of answer generation while manipulating visual contexts and response hand overlap between tasks. We manipulated the contexts in such a way that the incremental answer to the question changed from 'yes' to 'no' or remained the same before and after encountering the extraposed relative clause. The findings of both experiments provide evidence for incremental answer preparation but only if the context did not involve the risk of answer revision. Our results show that preliminary output from incremental semantic interpretation results in response priming that facilitates congruent responses in the probe RT task. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Measurement of tremor transmission during microsurgery.

    PubMed

    Verrelli, David I; Qian, Yi; Wood, James; Wilson, Michael K

    2016-12-01

    Tremor is a major impediment to performing fine motor tasks, as in microsurgery. However, conventional measurements do not involve tasks representative of microsurgery. We developed a low-cost surgical simulator incorporating a force transducer capable of detecting and quantifying the effects of tremor upon high-fidelity silicone replicas of cardiac vessels and substrate muscle. Experienced and trainee surgeons performed simulated anastomoses on this rig. We characterized procedures in terms of tremor intensity, based on Lomb-Scargle periodograms. Distinctive force oscillations occurred at 8-12 Hz, characteristic of enhanced physiological tremor, yielding peaks in power spectral density. These early results suggest a significantly lower transmission of tremor to the operative field by the experienced surgeon in comparison to the trainees. This new device quantifies the action of tremor upon a manipulandum during a complex task, which may be used for assessment and providing feedback to trainee surgeons. Copyright © 2015 John Wiley & Sons, Ltd.
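
    A hedged sketch of the spectral step described above: computing a Lomb-Scargle periodogram of a force trace and summing the power in the 8-12 Hz enhanced-physiological-tremor band. SciPy's lombscargle takes angular frequencies; the frequency grid and function name here are our choices.

```python
import numpy as np
from scipy.signal import lombscargle

def tremor_band_power(t, force, f_lo=8.0, f_hi=12.0):
    """Fraction of spectral power in the 8-12 Hz physiological tremor band.

    t     -- sample times in seconds (may be unevenly spaced)
    force -- force-transducer readings at those times
    """
    freqs = np.linspace(0.5, 20.0, 400)        # Hz
    ang = 2 * np.pi * freqs                    # lombscargle expects rad/s
    pgram = lombscargle(t, force - force.mean(), ang, normalize=True)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return pgram[band].sum() / pgram.sum()
```

    A higher band fraction would indicate stronger tremor transmission to the operative field, the quantity on which experts and trainees differed.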

  7. The Differential Effects of Two Types of Task Repetition on the Complexity, Accuracy, and Fluency in Computer-Mediated L2 Written Production: A Focus on Computer Anxiety

    ERIC Educational Resources Information Center

    Amiryousefi, Mohammad

    2016-01-01

    Previous task repetition studies have primarily focused on how task repetition characteristics affect the complexity, accuracy, and fluency in L2 oral production with little attention to L2 written production. The main purpose of the study reported in this paper was to examine the effects of task repetition versus procedural repetition on the…

  8. A comparison of visuomotor cue integration strategies for object placement and prehension.

    PubMed

    Greenwald, Hal S; Knill, David C

    2009-01-01

    Visual cue integration strategies are known to depend on cue reliability and how rapidly the visual system processes incoming information. We investigated whether these strategies also depend on differences in the information demands for different natural tasks. Using two common goal-oriented tasks, prehension and object placement, we determined whether monocular and binocular information influence estimates of three-dimensional (3D) orientation differently depending on task demands. Both tasks rely on accurate 3D orientation estimates, but 3D position is potentially more important for grasping. Subjects placed an object on or picked up a disc in a virtual environment. On some trials, the monocular cues (aspect ratio and texture compression) and binocular cues (e.g., binocular disparity) suggested slightly different 3D orientations for the disc; these conflicts either were present upon initial stimulus presentation or were introduced after movement initiation, which allowed us to quantify how information from the cues accumulated over time. We analyzed the time-varying orientations of subjects' fingers in the grasping task and those of the object in the object placement task to quantify how different visual cues influenced motor control. In the first experiment, different subjects performed each task, and those performing the grasping task relied on binocular information more when orienting their hands than those performing the object placement task. When subjects in the second experiment performed both tasks in interleaved sessions, binocular cues were still more influential during grasping than object placement, and the different cue integration strategies observed for each task in isolation were maintained. In both experiments, the temporal analyses showed that subjects processed binocular information faster than monocular information, but task demands did not affect the time course of cue processing. How one uses visual cues for motor control depends on the task being performed, although how quickly the information is processed appears to be task invariant.

  9. Integration of active pauses and pattern of muscular activity during computer work.

    PubMed

    St-Onge, Nancy; Samani, Afshin; Madeleine, Pascal

    2017-09-01

    Submaximal isometric muscle contractions have been reported to increase variability of muscle activation during computer work; however, other types of active contractions may be more beneficial. Our objective was to determine which type of active pause vs. rest is more efficient in changing muscle activity pattern during a computer task. Asymptomatic regular computer users performed a standardised 20-min computer task four times, integrating a different type of pause: sub-maximal isometric contraction, dynamic contraction, postural exercise and rest. Surface electromyographic (SEMG) activity was recorded bilaterally from five neck/shoulder muscles. Root-mean-square decreased with isometric pauses in the cervical paraspinals, upper trapezius and middle trapezius, whereas it increased with rest. Variability in the pattern of muscular activity was not affected by any type of pause. Overall, no detrimental effects on the level of SEMG during active pauses were found suggesting that they could be implemented without a cost on activation level or variability. Practitioner Summary: We aimed to determine which type of active pause vs. rest is best in changing muscle activity pattern during a computer task. Asymptomatic computer users performed a standardised computer task integrating different types of pauses. Muscle activation decreased with isometric pauses in neck/shoulder muscles, suggesting their implementation during computer work.
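
    For readers unfamiliar with the quantities involved, the sketch below computes a sliding-window root-mean-square envelope of one SEMG channel and a simple coefficient-of-variation index of activation variability; the window length and sampling rate are illustrative assumptions, not the study's processing pipeline.

```python
import numpy as np

def emg_rms(semg, fs=1000, win_ms=100):
    """Root-mean-square envelope of a surface EMG signal in sliding windows."""
    win = int(fs * win_ms / 1000)
    semg = semg - semg.mean()                  # remove DC offset
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(semg ** 2, kernel, mode='valid'))

def variability(rms_env):
    """Coefficient of variation: a simple index of activation variability."""
    return rms_env.std() / rms_env.mean()
```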

  10. Anatomical background and generalized detectability in tomosynthesis and cone-beam CT.

    PubMed

    Gang, G J; Tward, D J; Lee, J; Siewerdsen, J H

    2010-05-01

    Anatomical background presents a major impediment to detectability in 2D radiography as well as 3D tomosynthesis and cone-beam CT (CBCT). This article incorporates theoretical and experimental analysis of anatomical background "noise" in cascaded systems analysis of 2D and 3D imaging performance to yield "generalized" metrics of noise-equivalent quanta (NEQ) and detectability index as a function of the orbital extent of the (circular arc) source-detector orbit. A physical phantom was designed based on principles of fractal self-similarity to exhibit power-law spectral density (κ/f^β) comparable to various anatomical sites (e.g., breast and lung). Background power spectra [S_B(f)] were computed as a function of source-detector orbital extent, including tomosynthesis (approximately 10°-180°) and CBCT (180° + fan angle to 360°), under two acquisition schemes: (1) constant angular separation between projections (variable dose) and (2) constant total number of projections (constant dose). The resulting S_B was incorporated in the generalized NEQ, and detectability index was computed from 3D cascaded systems analysis for a variety of imaging tasks. The phantom yielded power-law spectra within the expected spatial frequency range, quantifying the dependence of clutter magnitude (κ) and correlation (β) on increasing tomosynthesis angle. Incorporation of S_B in the 3D NEQ provided a useful framework for analyzing the tradeoffs among anatomical, quantum, and electronic noise with dose and orbital extent. Distinct implications are posed for breast and chest tomosynthesis imaging system design: the applications vary significantly in κ, β, and imaging task and, therefore, in optimal selection of orbital extent, number of projections, and dose. For example, low-frequency tasks (e.g., soft-tissue masses or nodules) tend to benefit from larger orbital extent and more fully 3D tomographic imaging, whereas high-frequency tasks (e.g., microcalcifications) require careful, application-specific selection of orbital extent and number of projections to minimize negative effects of quantum and electronic noise. The complex tradeoffs among anatomical background, quantum noise, and electronic noise in projection imaging, tomosynthesis, and CBCT can be described by generalized cascaded systems analysis, providing a useful framework for system design and optimization.
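
    One common hedged form of the quantities named above (notation assumed for illustration; the paper's exact gains and normalizations are not reproduced here): anatomical clutter S_B(f) enters the generalized NEQ alongside the quantum and electronic noise-power spectra, and detectability follows by weighting with a task function.

```latex
\mathrm{NEQ}_{\mathrm{gen}}(f) =
  \frac{\mathrm{MTF}^{2}(f)}
       {\mathrm{NPS}_{Q}(f) + \mathrm{NPS}_{E}(f)
         + \mathrm{MTF}^{2}(f)\, S_{B}(f)}
\qquad
{d'}^{2} = \int \mathrm{NEQ}_{\mathrm{gen}}(f)\,
           \bigl|W_{\mathrm{task}}(f)\bigr|^{2}\, df
```

    Because power-law clutter κ/f^β concentrates at low spatial frequencies, it penalizes low-frequency tasks most, consistent with the orbital-extent tradeoffs described above.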

  11. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  12. Sympathetic nervous system activity measured by skin conductance quantifies the challenge of walking adaptability tasks after stroke.

    PubMed

    Clark, David J; Chatterjee, Sudeshna A; McGuirk, Theresa E; Porges, Eric C; Fox, Emily J; Balasubramanian, Chitralakshmi K

    2018-02-01

    Walking adaptability tasks are challenging for people with motor impairments. The construct of perceived challenge is typically measured by self-report assessments, which are susceptible to subjective measurement error. The development of an objective physiologically-based measure of challenge may help to improve the ability to assess this important aspect of mobility function. The objective of this study was to investigate the use of sympathetic nervous system (SNS) activity measured by skin conductance to gauge the physiological stress response to challenging walking adaptability tasks in people post-stroke. Thirty adults with chronic post-stroke hemiparesis performed a battery of seventeen walking adaptability tasks. SNS activity was measured by skin conductance from the palmar surface of each hand. The primary outcome variable was the percent change in skin conductance level (ΔSCL) between the baseline resting and walking phases of each task. Task difficulty was measured by performance speed and by physical therapist scoring of performance. Walking function and balance confidence were measured by preferred walking speed and the Activities-specific Balance Confidence Scale, respectively. There was a statistically significant negative association between ΔSCL and task performance speed and between ΔSCL and clinical score, indicating that tasks with greater SNS activity had slower performance speed and poorer clinical scores. ΔSCL was significantly greater for low functioning participants versus high functioning participants, particularly during the most challenging walking adaptability tasks. This study supports the use of SNS activity measured by skin conductance as a valuable approach for objectively quantifying the perceived challenge of walking adaptability tasks in people post-stroke. Published by Elsevier B.V.
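
    The primary outcome above is simple to compute once the resting and walking skin conductance levels are in hand; a minimal sketch (function name and example values are ours):

```python
def delta_scl(scl_rest, scl_walk):
    """Percent change in skin conductance level from rest to walking."""
    return 100.0 * (scl_walk - scl_rest) / scl_rest

# e.g. a rise from 4.0 to 5.2 microsiemens during a walking task:
print(delta_scl(4.0, 5.2))   # 30.0 (% increase)
```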

  13. Sympathetic nervous system activity measured by skin conductance quantifies the challenge of walking adaptability tasks after stroke

    PubMed Central

    Clark, David J.; Chatterjee, Sudeshna A.; McGuirk, Theresa E.; Porges, Eric C.; Fox, Emily J.; Balasubramanian, Chitralakshmi K.

    2018-01-01

    Background Walking adaptability tasks are challenging for people with motor impairments. The construct of perceived challenge is typically measured by self-report assessments, which are susceptible to subjective measurement error. The development of an objective physiologically-based measure of challenge may help to improve the ability to assess this important aspect of mobility function. The objective of this study was to investigate the use of sympathetic nervous system (SNS) activity measured by skin conductance to gauge the physiological stress response to challenging walking adaptability tasks in people post-stroke. Methods Thirty adults with chronic post-stroke hemiparesis performed a battery of seventeen walking adaptability tasks. SNS activity was measured by skin conductance from the palmar surface of each hand. The primary outcome variable was the percent change in skin conductance level (ΔSCL) between the baseline resting and walking phases of each task. Task difficulty was measured by performance speed and by physical therapist grading of performance. Walking function and balance confidence were measured by preferred walking speed and the Activities-specific Balance Confidence Scale, respectively. Results There was a statistically significant negative association between ΔSCL and task performance speed and between ΔSCL and clinical score, indicating that tasks with greater SNS activity had slower performance speed and poorer clinical scores. ΔSCL was significantly greater for low functioning participants versus high functioning participants, particularly during the most challenging walking adaptability tasks. Conclusion This study supports the use of SNS activity measured by skin conductance as a valuable approach for objectively quantifying the perceived challenge of walking adaptability tasks in people post-stroke. PMID:29216598

  14. User-Driven Sampling Strategies in Image Exploitation

    DOE PAGES

    Harvey, Neal R.; Porter, Reid B.

    2013-12-23

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. We found that user-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. Furthermore, in preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
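
    A toy sketch of the distinction drawn above, using scikit-learn: in user-driven sampling the person, not the algorithm, picks each example to label. Here a margin heuristic merely simulates the user's choice for illustration; the data, seed set, and iteration count are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.2 * rng.normal(size=500) > 0).astype(int)

# Baseline: a tool trained on all of the labeled data at once.
full_model = LogisticRegression().fit(X, y)

# User-driven sampling: the *user* chooses what to label next. A
# near-boundary heuristic stands in for the human choice here.
order = np.argsort(X[:, 0])
labeled = list(order[:5]) + list(order[-5:])   # seed set with both classes
model = LogisticRegression().fit(X[labeled], y[labeled])
for _ in range(5):
    margin = np.abs(model.decision_function(X))
    pick = int(np.argmin(margin))              # the "user's" next choice
    if pick not in labeled:
        labeled.append(pick)
    model = LogisticRegression().fit(X[labeled], y[labeled])
```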

  15. Identification of differences between finite element analysis and experimental vibration data

    NASA Technical Reports Server (NTRS)

    Lawrence, C.

    1986-01-01

    An important problem that has emerged from combined analytical/experimental investigations is the task of identifying and quantifying the differences between results predicted by F.E. analysis and results obtained from experiment. The objective of this study is to extend and evaluate the procedure developed by Sidhu for correlation of linear F.E. and modal test data to include structures with viscous damping. The advantage of this procedure is that the differences are identified in terms of physical mass, damping, and stiffness parameters instead of in terms of frequencies and mode shapes. Since the differences are computed in terms of physical parameters, locations of modeling problems can be directly identified in the F.E. model. From simulated data it was determined that the accuracy of the computed differences increases as the number of experimentally measured modes included in the calculations is increased. When the number of experimental modes is at least equal to the number of translational degrees of freedom in the F.E. model, both the location and magnitude of the differences can be computed very accurately. When the number of modes is less than this amount, the location of the differences may still be determined even though their magnitudes will be underestimated.

  16. User-driven sampling strategies in image exploitation

    NASA Astrophysics Data System (ADS)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  17. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are widely applied to data processing, application failure probability has become an important indicator of application quality and an important consideration for operators. This paper presents a task-based method for analyzing application failure probability in optical grid. The failure probability of an entire application can then be quantified, and the reduction in failure probability achieved by different backup strategies can be compared, so that the different requirements of different clients can be satisfied. In optical grid, when an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, striking a compromise between the network operator and the submitted applications. Differentiated services can thus be achieved in optical grid.
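
    Under the simplest assumption of independent task failures, the whole-application failure probability and the effect of replicating a task on a backup resource can be quantified in a few lines; the probabilities below are illustrative, not from the paper.

```python
def app_failure_probability(task_fail_probs):
    """Failure probability of an application whose tasks must all succeed.

    With independent task failure probabilities p_i, the application
    succeeds only if every task succeeds: P(fail) = 1 - prod(1 - p_i).
    """
    p_success = 1.0
    for p in task_fail_probs:
        p_success *= (1.0 - p)
    return 1.0 - p_success

def with_backup(p_primary, p_backup):
    """A task replicated on a backup resource fails only if both copies fail."""
    return p_primary * p_backup

tasks = [0.01, 0.02, 0.05]
print(app_failure_probability(tasks))                           # no backups
print(app_failure_probability([0.01, 0.02, with_backup(0.05, 0.05)]))
```

    Comparing the two outputs shows how backing up only the least reliable task already cuts the application failure probability substantially, which is the kind of strategy tradeoff the MDSA scheduler weighs against resource utilization.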

  18. Psychomotor Impairment Detection via Finger Interactions with a Computer Keyboard During Natural Typing

    NASA Astrophysics Data System (ADS)

    Giancardo, L.; Sánchez-Ferro, A.; Butterworth, I.; Mendoza, C. S.; Hooker, J. M.

    2015-04-01

    Modern digital devices and appliances are capable of monitoring the timing of button presses, or finger interactions in general, with sub-millisecond accuracy. However, the massive amount of high-resolution temporal information that these devices could collect is currently being discarded. Multiple studies have shown that the act of pressing a button triggers well-defined brain areas which are known to be affected by motor-compromised conditions. In this study, we demonstrate that daily interaction with a computer keyboard can be employed as a means to observe and potentially quantify psychomotor impairment. We induced a psychomotor impairment via a sleep inertia paradigm in 14 healthy subjects, which is detected by our classifier with an Area Under the ROC Curve (AUC) of 0.93/0.91. The detection relies on novel features derived from key-hold times acquired on standard computer keyboards during an uncontrolled typing task. These features correlate with the progression to psychomotor impairment (p < 0.001) regardless of the content and language of the text typed, and perform consistently with different keyboards. The ability to acquire longitudinal measurements of subtle motor changes from a digital device without altering its functionality may allow for early screening and follow-up of motor-compromised neurodegenerative conditions, psychological disorders or intoxication at a negligible cost in the general population.
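
    A hedged sketch of the kind of key-hold-time features involved; the study's exact feature set is not specified here, so these summary statistics are illustrative stand-ins.

```python
import numpy as np

def key_hold_features(events):
    """Summary features of key-hold times from (press, release) pairs.

    events -- list of (press_time, release_time) in seconds, one per
              keystroke, as could be logged during natural typing.
    """
    holds = np.array([r - p for p, r in events])
    return {
        'mean_hold': holds.mean(),
        'std_hold': holds.std(),
        'p95_hold': np.percentile(holds, 95),    # slow outlier presses
        'cov_hold': holds.std() / holds.mean(),  # scale-free dispersion
    }
```

    Because such features are ratios and dispersions of timing rather than of text, they plausibly generalize across content, language, and keyboards, which is the property the abstract reports.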

  19. Introducing anisotropic Minkowski functionals and quantitative anisotropy measures for local structure analysis in biomedical imaging

    NASA Astrophysics Data System (ADS)

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2013-03-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10⁻⁴). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications.
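
    The fractional-anisotropy measure borrowed from diffusion tensor imaging has a compact closed form; a minimal sketch (the eigenvalues here are illustrative):

```python
import numpy as np

def fractional_anisotropy(eigvals):
    """Fractional anisotropy of a set of eigenvalues, as in diffusion MRI.

    FA = sqrt(3/2) * ||lam - mean(lam)|| / ||lam||, ranging from 0
    (isotropic) to 1 (fully anisotropic).
    """
    lam = np.asarray(eigvals, dtype=float)
    dev = lam - lam.mean()
    return np.sqrt(1.5) * np.linalg.norm(dev) / np.linalg.norm(lam)

print(fractional_anisotropy([1.0, 1.0, 1.0]))   # 0.0: isotropic
print(fractional_anisotropy([1.0, 0.1, 0.1]))   # ~0.89: strongly anisotropic
```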

  20. Automated detection and quantification of residual brain tumor using an interactive computer-aided detection scheme

    NASA Astrophysics Data System (ADS)

    Gaffney, Kevin P.; Aghaei, Faranak; Battiste, James; Zheng, Bin

    2017-03-01

    Detection of residual brain tumor is important to evaluate the efficacy of brain cancer surgery, determine the optimal strategy for further radiation therapy if needed, and assess the ultimate prognosis of the patient. Brain MR is a commonly used imaging modality for this task. In order to distinguish between residual tumor and surgery-induced scar tissue, two sets of MRI scans are conducted pre- and post-gadolinium contrast injection. Residual tumors are enhanced only in the post-contrast-injection images. However, subjectively reading and quantifying these brain MR images is difficult: detecting true residual tumor regions and measuring total residual tumor volume are both challenging. To help address this clinical difficulty, we developed and tested a new interactive computer-aided detection scheme, which consists of three consecutive image processing steps, namely 1) segmentation of the intracranial region, 2) image registration and subtraction, and 3) tumor segmentation and refinement. The scheme also includes a specially designed and implemented graphical user interface (GUI) platform. When using this scheme, two sets of pre- and post-contrast injection images are first automatically processed to detect and quantify residual tumor volume. A user can then visually examine segmentation results and conveniently guide the scheme to correct any detection or segmentation errors if needed. The scheme has been repeatedly tested on five cases. Given the high performance and robustness observed in testing, the scheme is ready for clinical studies investigating the association between this quantitative image marker and patient outcome.
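
    Once the two volumes are co-registered, the registration-and-subtraction core of the scheme reduces to a voxelwise difference and threshold. The sketch below shows only that step and omits the scheme's segmentation, refinement, and GUI stages; the names and threshold convention are our assumptions.

```python
import numpy as np

def residual_tumor_volume(pre, post, voxel_mm3, threshold):
    """Rough residual-tumor volume from registered pre/post-contrast volumes.

    pre, post -- 3-D arrays, already co-registered and intensity-normalized
    threshold -- enhancement level above which a voxel counts as tumor
    """
    enhancement = post.astype(float) - pre.astype(float)
    mask = enhancement > threshold         # enhanced only post-injection
    return mask.sum() * voxel_mm3, mask    # volume in mm^3, plus the mask
```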

  1. Introducing Anisotropic Minkowski Functionals and Quantitative Anisotropy Measures for Local Structure Analysis in Biomedical Imaging

    PubMed Central

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2017-01-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10⁻⁴). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications. PMID:29170580

  2. Mapping whole-brain activity with cellular resolution by light-sheet microscopy and high-throughput image analysis (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Silvestri, Ludovico; Rudinskiy, Nikita; Paciscopi, Marco; Müllenbroich, Marie Caroline; Costantini, Irene; Sacconi, Leonardo; Frasconi, Paolo; Hyman, Bradley T.; Pavone, Francesco S.

    2016-03-01

    Mapping neuronal activity patterns across the whole brain with cellular resolution is a challenging task for state-of-the-art imaging methods. Indeed, despite a number of technological efforts, quantitative cellular-resolution activation maps of the whole brain have not yet been obtained. Many techniques are limited by coarse resolution or by a narrow field of view. High-throughput imaging methods, such as light sheet microscopy, can be used to image large specimens with high resolution and in reasonable times. However, the bottleneck is then moved from image acquisition to image analysis, since many TeraBytes of data have to be processed to extract meaningful information. Here, we present a full experimental pipeline to quantify neuronal activity in the entire mouse brain with cellular resolution, based on a combination of genetics, optics and computer science. We used a transgenic mouse strain (Arc-dVenus mouse) in which neurons that have been active in the last hours before brain fixation are fluorescently labelled. Samples were cleared with CLARITY and imaged with a custom-made confocal light sheet microscope. To automatically localize fluorescent cells in the large images produced, we used a novel computational approach called semantic deconvolution. The combined approach presented here allows quantifying the number of Arc-expressing neurons throughout the whole mouse brain. When applied to cohorts of mice subject to different stimuli and/or environmental conditions, this method helps find correlations in activity between different neuronal populations, opening the possibility to infer a sort of brain-wide 'functional connectivity' with cellular resolution.

  3. Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction

    ERIC Educational Resources Information Center

    Zoanetti, Nathan

    2010-01-01

    This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…

  4. Item Mass and Complexity and the Arithmetic Computation of Students with Learning Disabilities.

    ERIC Educational Resources Information Center

    Cawley, John F.; Shepard, Teri; Smith, Maureen; Parmar, Rene S.

    1997-01-01

    The performance of 76 students (ages 10 to 15) with learning disabilities on four tasks of arithmetic computation within each of the four basic operations was examined. Tasks varied in difficulty level and number of strokes needed to complete all items. Intercorrelations between task sets and operations were examined as was the use of…

  5. Task Scheduling in Desktop Grids: Open Problems

    NASA Astrophysics Data System (ADS)

    Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny

    2017-12-01

    We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.

  6. Computer-Mediated Communication in English for Specific Purposes: A Case Study with Computer Science Students at Universiti Teknologi Malaysia

    ERIC Educational Resources Information Center

    Shamsudin, Sarimah; Nesi, Hilary

    2006-01-01

    This paper will describe an ESP approach to the design and implementation of computer-mediated communication (CMC) tasks for computer science students at Universiti Teknologi Malaysia, and discuss the effectiveness of the chat feature of Windows NetMeeting as a tool for developing specified language skills. CMC tasks were set within a programme of…

  7. Simplified Distributed Computing

    NASA Astrophysics Data System (ADS)

    Li, G. G.

    2006-05-01

    Distributed computing ranges from high-performance parallel computing and grid computing to environments where idle CPU cycles and storage space of numerous networked systems are harnessed to work together through the Internet. In this work we focus on building an easy and affordable solution for computationally intensive problems in scientific applications, based on existing technology and hardware resources. The system consists of a series of controllers. When a job request is detected by a monitor or initiated by an end user, the job manager launches the specific job handler for that job. The job handler pre-processes the job, partitions it into relatively independent tasks, and distributes the tasks into the processing queue. A task handler picks up the related tasks, processes them, and puts the results back into the processing queue. The job handler also monitors and examines the tasks and the results, and assembles the task results into the overall solution for the job request once all tasks for the job are finished. A resource manager configures and monitors all participating nodes. A distributed agent is deployed on all participating nodes to manage software downloads and report status. The processing queue is the key to the success of this distributed system. We use BEA's Weblogic JMS queue in our implementation; it guarantees message delivery and has message-priority and retry features so that tasks never get lost. The entire system is built on J2EE technology and can be deployed on heterogeneous platforms. It can handle algorithms and applications developed in any language on any platform. J2EE adaptors are provided to connect existing applications to the system so that applications and algorithms running on Unix, Linux, and Windows can all work together. The system is easy and fast to develop based on well-adopted industry technology. It is highly scalable and heterogeneous. It is an open system: any number and type of machines can join to provide computational power. This asynchronous, message-based system can achieve response times on the order of a second. For efficiency, communication between distributed tasks is often done at the start and end of the tasks, but intermediate task status can also be provided.
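
    The job-handler/task-handler pattern described above can be sketched with Python's thread-safe queue standing in for the Weblogic JMS processing queue (which adds the guaranteed delivery, priorities, and retries this toy version lacks):

```python
import queue
import threading

tasks = queue.Queue()       # stands in for the JMS processing queue
results = queue.Queue()

def job_handler(job, n_parts):
    """Partition a job into independent tasks and enqueue them."""
    for part in range(n_parts):
        tasks.put((job, part))

def task_handler():
    """Worker: pull tasks, process them, put results back on a queue."""
    while True:
        job, part = tasks.get()
        results.put((job, part, f"done-{part}"))   # placeholder "work"
        tasks.task_done()

for _ in range(4):                                 # participating nodes
    threading.Thread(target=task_handler, daemon=True).start()

job_handler("job-1", n_parts=10)
tasks.join()                                       # wait for all tasks
solution = sorted(results.get() for _ in range(10))  # assemble the job
```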

  8. Improving communication when seeking informed consent: a randomised controlled study of a computer-based method for providing information to prospective clinical trial participants.

    PubMed

    Karunaratne, Asuntha S; Korenman, Stanley G; Thomas, Samantha L; Myles, Paul S; Komesaroff, Paul A

    2010-04-05

    To assess the efficacy, with respect to participant understanding of information, of a computer-based approach to communication about complex, technical issues that commonly arise when seeking informed consent for clinical research trials. An open, randomised controlled study of 60 patients with diabetes mellitus, aged 27-70 years, recruited between August 2006 and October 2007 from the Department of Diabetes and Endocrinology at the Alfred Hospital and Baker IDI Heart and Diabetes Institute, Melbourne. Participants were asked to read information about a mock study via a computer-based presentation (n = 30) or a conventional paper-based information statement (n = 30). The computer-based presentation contained visual aids, including diagrams, video, hyperlinks and quiz pages. Understanding of information as assessed by quantitative and qualitative means. Assessment scores used to measure level of understanding were significantly higher in the group that completed the computer-based task than the group that completed the paper-based task (82% v 73%; P = 0.005). More participants in the group that completed the computer-based task expressed interest in taking part in the mock study (23 v 17 participants; P = 0.01). Most participants from both groups preferred the idea of a computer-based presentation to the paper-based statement (21 in the computer-based task group, 18 in the paper-based task group). A computer-based method of providing information may help overcome existing deficiencies in communication about clinical research, and may reduce costs and improve efficiency in recruiting participants for clinical trials.

  9. A Familiar Pattern? Semantic Memory Contributes to the Enhancement of Visuo-Spatial Memories

    ERIC Educational Resources Information Center

    Riby, Leigh M.; Orme, Elizabeth

    2013-01-01

    In this study we quantify for the first time electrophysiological components associated with incorporating long-term semantic knowledge with visuo-spatial information using two variants of a traditional matrix patterns task. Results indicated that the matrix task with greater semantic content was associated with enhanced accuracy and RTs in a…

  10. Schedule Risk Assessment

    NASA Technical Reports Server (NTRS)

    Smith, Greg

    2003-01-01

    Schedule risk assessments determine the likelihood of finishing on time. Each task in a schedule has a varying degree of probability of being finished on time. A schedule risk assessment quantifies these probabilities by assigning values to each task. This viewgraph presentation contains a flow chart for conducting a schedule risk assessment and profiles several applicable methods of data analysis.
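
    One common way to assign and combine per-task probabilities is a Monte Carlo simulation over three-point duration estimates. The sketch below assumes a serial task chain and triangular distributions, both illustrative simplifications rather than the presentation's specific method.

```python
import numpy as np

def p_on_time(task_low, task_mode, task_high, deadline, n=100_000):
    """Monte Carlo schedule risk: probability that a serial chain of tasks
    finishes by the deadline, given per-task optimistic / most-likely /
    pessimistic duration estimates modeled as triangular distributions."""
    rng = np.random.default_rng(1)
    total = np.zeros(n)
    for lo, mode, hi in zip(task_low, task_mode, task_high):
        total += rng.triangular(lo, mode, hi, size=n)
    return (total <= deadline).mean()

# Three tasks with optimistic / likely / pessimistic durations (days):
print(p_on_time([2, 4, 1], [3, 5, 2], [6, 9, 4], deadline=12))
```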

  11. Use of Computed Tomography Imaging for Qualifying Coarse Roots, Rhizomes, Peat, and Particle Densities in Marsh Soils

    EPA Science Inventory

    Computed tomography (CT) imaging has been used to describe and quantify subtidal, benthic animals such as polychaetes, amphipods, and shrimp. Here, for the first time, CT imaging is used to successfully quantify wet mass of coarse roots, rhizomes, and peat in cores collected from...

  12. Use of Computer-Aided Tomography (CT) Imaging for Quantifying Coarse Roots, Rhizomes, Peat, and Particle Densities in Marsh Soils

    EPA Science Inventory

    Computer-aided Tomography (CT) imaging was utilized to quantify wet mass of coarse roots, rhizomes, and peat in cores collected from organic-rich (Jamaica Bay, NY) and mineral (North Inlet, SC) Spartina alterniflora soils. Calibration rods composed of materials with standard dens...

  13. Soil structure characterized using computed tomographic images

    Treesearch

    Zhanqi Cheng; Stephen H. Anderson; Clark J. Gantzer; J. W. Van Sambeek

    2003-01-01

    Fractal analysis of soil structure is a relatively new method for quantifying the effects of management systems on soil properties and quality. The objective of this work was to explore several methods of studying images to describe and quantify structure of soils under forest management. This research uses computed tomography and a topological method called Multiple...

  14. APPLICATION OF COMPUTER-AIDED TOMOGRAPHY TO VISUALIZE AND QUANTIFY BIOGENIC STRUCTURES IN MARINE SEDIMENTS

    EPA Science Inventory

    We used computer-aided tomography (CT) for 3D visualization and 2D analysis of marine sediment cores from 3 stations (at 10, 75 and 118 m depths) with different environmental impact. Biogenic structures such as tubes and burrows were quantified and compared among st...

  15. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output, system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.

  16. A resource management architecture based on complex network theory in cloud computing federation

    NASA Astrophysics Data System (ADS)

    Zhang, Zehua; Zhang, Xuejie

    2011-10-01

    Cloud Computing Federation is a major trend in Cloud Computing, and Resource Management has a significant effect on the design, realization, and efficiency of a Cloud Computing Federation. Because a Cloud Computing Federation has the typical characteristics of a Complex System, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers to evolve the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance, and adaptive ability. The results of a model experiment confirmed the advantage of RMABC in resource discovery performance.
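
    The abstract credits a Task Manager's discovery speed to its reuse of historical information from other Task Managers. The toy sketch below is an assumption-laden stand-in rather than the paper's RMABC mechanism; it captures only that one idea: peers are ranked by a smoothed success rate from past queries, and the most promising peers are asked first.

    ```python
    class TaskManager:
        """Hypothetical manager that ranks peers by historical discovery success."""

        def __init__(self, name, resources=()):
            self.name = name
            self.resources = set(resources)
            self.history = {}               # peer name -> (successes, attempts)

        def score(self, peer):
            s, a = self.history.get(peer.name, (0, 0))
            return (s + 1) / (a + 2)        # Laplace-smoothed success rate

        def discover(self, peers, wanted):
            # Query the most promising peers first; record the outcome either way.
            for peer in sorted(peers, key=self.score, reverse=True):
                s, a = self.history.get(peer.name, (0, 0))
                hit = wanted in peer.resources
                self.history[peer.name] = (s + hit, a + 1)
                if hit:
                    return peer.name
            return None

    tm = TaskManager("tm0")
    peers = [TaskManager("tm1", {"gpu"}), TaskManager("tm2", {"disk"})]
    print(tm.discover(peers, "gpu"))   # -> "tm1"
    ```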

  17. Adaptive Allocation of Decision Making Responsibility Between Human and Computer in Multi-Task Situations. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chu, Y. Y.

    1978-01-01

    A unified formulation of computer-aided, multi-task decision making is presented. A strategy for the allocation of decision making responsibility between human and computer is developed. The plans of a flight management system are studied. A model based on queueing theory was implemented.

  18. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    ERIC Educational Resources Information Center

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  19. Evaluating the Efficacy of the Cloud for Cluster Computation

    NASA Technical Reports Server (NTRS)

    Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom

    2012-01-01

    Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promises improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
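
    The throughput figures above imply a per-core peak that is easy to back out. The arithmetic below uses only numbers quoted in the abstract; the per-core value is derived here, not stated in the source.

    ```python
    # Back-of-the-envelope check of the reported HPL numbers.
    cores = 240
    achieved_tflops = 2.0
    efficiency = 0.70                                      # fraction of theoretical peak

    theoretical_tflops = achieved_tflops / efficiency      # ~2.86 TFLOPS peak
    per_core_gflops = theoretical_tflops * 1e3 / cores     # ~11.9 GFLOPS per core
    print(f"peak ~ {theoretical_tflops:.2f} TFLOPS, ~{per_core_gflops:.1f} GFLOPS/core")
    ```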

  20. Adapting to the surface: A comparison of handwriting measures when writing on a tablet computer and on paper.

    PubMed

    Gerth, Sabrina; Dolk, Thomas; Klassert, Annegret; Fliesser, Michael; Fischer, Martin H; Nottbusch, Guido; Festman, Julia

    2016-08-01

    Our study addresses the following research questions: Are there differences between handwriting movements on paper and on a tablet computer? Can experienced writers, such as most adults, adapt their graphomotor execution during writing to a rather unfamiliar surface for instance a tablet computer? We examined the handwriting performance of adults in three tasks with different complexity: (a) graphomotor abilities, (b) visuomotor abilities and (c) handwriting. Each participant performed each task twice, once on paper and once on a tablet computer with a pen. We tested 25 participants by measuring their writing duration, in air time, number of pen lifts, writing velocity and number of inversions in velocity. The data were analyzed using linear mixed-effects modeling with repeated measures. Our results reveal differences between writing on paper and on a tablet computer which were partly task-dependent. Our findings also show that participants were able to adapt their graphomotor execution to the smoother surface of the tablet computer during the tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. MAT - MULTI-ATTRIBUTE TASK BATTERY FOR HUMAN OPERATOR WORKLOAD AND STRATEGIC BEHAVIOR RESEARCH

    NASA Technical Reports Server (NTRS)

    Comstock, J. R.

    1994-01-01

    MAT, a Multi-Attribute Task battery, gives the researcher the capability of performing multi-task workload and performance experiments. The battery provides a benchmark set of tasks for use in a wide range of laboratory studies of operator performance and workload. MAT incorporates tasks analogous to activities that aircraft crew members perform in flight, while providing a high degree of experiment control, performance data on each subtask, and freedom to use non-pilot test subjects. The MAT battery primary display is composed of four separate task windows which are as follows: a monitoring task window which includes gauges and warning lights, a tracking task window for the demands of manual control, a communication task window to simulate air traffic control communications, and a resource management task window which permits maintaining target levels on a fuel management task. In addition, a scheduling task window gives the researcher information about future task demands. The battery also provides the option of manual or automated control of tasks. The task generates performance data for each subtask. The task battery may be paused and onscreen workload rating scales presented to the subject. The MAT battery was designed to use a serially linked second computer to generate the voice messages for the Communications task. The MATREMX program and support files, which are included in the MAT package, were designed to work with the Heath Voice Card (Model HV-2000, available through the Heath Company, Benton Harbor, Michigan 49022); however, the MATREMX program and support files may easily be modified to work with other voice synthesizer or digitizer cards. The MAT battery task computer may also be used independent of the voice computer if no computer synthesized voice messages are desired or if some other method of presenting auditory messages is devised. MAT is written in QuickBasic and assembly language for IBM PC series and compatible computers running MS-DOS. The code in MAT is written for Microsoft QuickBasic 4.5 and Microsoft Macro Assembler 5.1. This package requires a joystick and EGA or VGA color graphics. An 80286, 386, or 486 processor machine is highly recommended. The standard distribution medium for MAT is a 5.25 inch 360K MS-DOS format diskette. The files are compressed using the PKZIP file compression utility. PKUNZIP is included on the distribution diskette. MAT was developed in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS, Microsoft QuickBasic, and Microsoft Macro Assembler are registered trademarks of Microsoft Corporation. PKZIP and PKUNZIP are registered trademarks of PKWare, Inc.

  2. Checkpoint triggering in a computer system

    DOEpatents

    Cher, Chen-Yong

    2016-09-06

    According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
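
    A minimal sketch of the trigger logic described above: periodically read a monitor for a task metric and create a checkpoint when the metric crosses a threshold. All names are invented, and for simplicity the threshold here is fixed, whereas the patented method derives the threshold from the metric value.

    ```python
    import random

    def read_monitor(task):
        # Stand-in for reading a task metric (e.g., an error or activity count).
        return random.expovariate(1.0)

    def save_checkpoint(task):
        # Stand-in for persisting task state so execution can restart from here.
        print(f"checkpoint created for {task}")

    def run(task, steps, read_period=5, threshold=3.0):
        for t in range(steps):
            if t % read_period:
                continue                       # not yet time to read the monitor
            if read_monitor(task) >= threshold:
                save_checkpoint(task)          # metric crossed the threshold

    run("task-0", steps=50)
    ```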

  3. [Computer-assisted image processing for quantifying histopathologic variables in the healing of colonic anastomosis in dogs].

    PubMed

    Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C

    1997-01-01

    The authors present the experimental results of computerized quantification of the tissue structures involved in the reparative process of colonic anastomoses performed by manual suture and by biofragmentable ring. The variables quantified in this study were: oedema fluid, myofiber tissue, blood vessels and cellular nuclei. Image processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. As a counterproof measure, the results were compared to diagnoses made in the traditional way by two pathologists. The criteria for these diagnoses were graded as absent, light, moderate or intense, and were compared to the analysis performed by the computer. There was a statistically significant difference between the two techniques: the biofragmentable ring technique exhibited less oedema fluid, more organized myofiber tissue and a higher number of elongated cellular nuclei than the manual suture technique. The analysis of histometric variables through computational image processing was considered an efficient and powerful way to quantify the main inflammatory and reparative tissue changes.

  4. Quantifiers are incrementally interpreted in context, more than less

    PubMed Central

    Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta

    2015-01-01

    Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only the real-time N400 effects in Experiment 2 mirrored the offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285

  5. Novel in Vitro Modification of Bone for an Allograft with Improved Toughness Osteoconductivity

    DTIC Science & Technology

    2015-06-01

    osteocalcin, Runx2, and col1a1 by RT-PCR. Spectrophotometry and fluorescence microscopy were used to quantify AGEs. 2. KEYWORDS Fracture toughness, R...markers (alkaline phosphatase, osteocalcin, RUNX2 and COL1A1 ) Completed Task 10 Data analysis, publications, reports Completed Task 1. Retrieval...FEMALE 25 Task 9. Measure expression of molecular markers of mineralization, osteocalcin, RUNX2 and COL1A1 using quantitative RT-PCR with specific

  6. Performance Analysis of and Tool Support for Transactional Memory on BG/Q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schindewolf, M

    2011-12-08

    Martin Schindewolf worked during his internship at the Lawrence Livermore National Laboratory (LLNL) under the guidance of Martin Schulz at the Computer Science Group of the Center for Applied Scientific Computing. We studied the performance of the TM subsystem of BG/Q and researched the possibilities for tool support for TM. To study the performance, we ran CLOMP-TM, a benchmark designed to quantify the overhead of OpenMP and to compare different synchronization primitives. To advance CLOMP-TM, we added Message Passing Interface (MPI) routines for hybrid parallelization, which makes it possible to run multiple MPI tasks, each running OpenMP, on one node. With these enhancements, a beneficial ratio of MPI tasks to OpenMP threads is determined, and the synchronization primitives are ranked as a function of the application characteristics. To demonstrate the usefulness of these results, we investigated a real Monte Carlo simulation called the Monte Carlo Benchmark (MCB). Applying the lessons learned yields the best task-to-thread ratio, and we were able to tune the synchronization by transactifying the MCB. We also developed tools that capture the performance of the TM run time system and present it to the application developer. The performance of the TM run time system relies on its built-in statistics. These tools use the Blue Gene Performance Monitoring (BGPM) interface to correlate the statistics from the TM run time system with performance counter values. This combination provides detailed insight into the run time behavior of the application and makes it possible to track down the causes of degraded performance. One tool separates the performance counters into three categories: Successful Speculation, Unsuccessful Speculation and No Speculation. All of the tools are crafted around IBM's xlc compiler for C and C++ and have been run and tested on a Q32 early access system.

  7. An assessment of the physical impact of complex surgical tasks on surgeon errors and discomfort: a comparison between robot-assisted, laparoscopic and open approaches.

    PubMed

    Elhage, Oussama; Challacombe, Ben; Shortland, Adam; Dasgupta, Prokar

    2015-02-01

    To evaluate, in a simulated suturing task, individual surgeons' performance using three surgical approaches: open, laparoscopic and robot-assisted. Subjects and methods: Six urological surgeons made an in vitro simulated vesico-urethral anastomosis. All surgeons performed the simulated suturing task using all three surgical approaches (open, laparoscopic and robot-assisted), and the time taken to perform each task was recorded. Participants were evaluated for perceived discomfort using the self-reporting Borg scale. Errors made by surgeons were quantified by studying the video recordings of the tasks. Anastomosis quality was quantified using scores for knot security, symmetry of suture, position of suture and apposition of anastomosis. The time taken to complete the task by the laparoscopic approach was on average 221 s, compared with 55 s for the open approach and 116 s for the robot-assisted approach (ANOVA, P < 0.005). The number of errors and the level of self-reported discomfort were highest for the laparoscopic approach (ANOVA, P < 0.005). Limitations of the present study include the small sample size and variation in the prior surgical experience of the participants. In an in vitro model of anastomosis surgery, robot-assisted surgery combines the accuracy of open surgery with less surgeon discomfort than laparoscopy, while maintaining minimal access.

  8. Understanding self-reported difficulties in decision-making by people with autism spectrum disorders.

    PubMed

    Vella, Lydia; Ring, Howard A; Aitken, Mike Rf; Watson, Peter C; Presland, Alexander; Clare, Isabel Ch

    2017-04-01

    Autobiographical accounts and a limited research literature suggest that adults with autism spectrum disorders can experience difficulties with decision-making. We examined whether some of the difficulties they describe correspond to quantifiable differences in decision-making when compared to adults in the general population. The participants (38 intellectually able adults with autism spectrum disorders and 40 neurotypical adults) were assessed on three tasks of decision-making (Iowa Gambling Task, Cambridge Gamble Task and Information Sampling Task), which quantified, respectively, decision-making performance and relative attention to negative and positive outcomes, speed and flexibility, and information sampling. As a caution, all analyses were repeated with a subset of participants (n_ASD = 29 and n_neurotypical = 39) who were not taking antidepressant or anxiolytic medication. Compared to the neurotypical participants, participants with autism spectrum disorders demonstrated slower decision-making on the Cambridge Gamble Task, and superior performance on the Iowa Gambling Task. When those taking the medications were excluded, participants with autism spectrum disorders also sampled more information. There were no other differences between the groups. These processing tendencies may contribute to the difficulties self-reported in some contexts; however, the results also highlight strengths in autism spectrum disorders, such as a more logical approach to, and care in, decision-making. The findings lead to recommendations for how adults with autism spectrum disorders may be better supported with decision-making.

  9. ITKids part II: variation of postures and muscle activity in children using different information and communication technologies.

    PubMed

    Ciccarelli, Marina; Straker, Leon; Mathiassen, Svend Erik; Pollock, Clare

    2011-01-01

    There are concerns that insufficient variation in postural and muscle activity associated with use of modern information and communication technology (ICT) presents a risk for musculoskeletal ill-health among school children. However, scientific knowledge on physical exposure variation in this group is limited. The purpose of this study was to quantify postures and muscle activity of school children using different types of ICT. Postures of the head, upper back and upper arm, and muscle activity of the right and left upper trapezius and right forearm extensors were measured over 10-12 hours in nine school children using different types of ICT at school and away-from-school. Variation in postures and muscle activity was quantified using two indices, EVA{sd} and APDF₉₀-₁₀. Paper-based (Old) ICT tasks produced postures that were less neutral but more variable than electronics-based (New ICT) and Non-ICT tasks. Non-ICT tasks involved mean postures similar to New ICT tasks, but with greater variation. Variation of muscle activity was similar between ICT types in the right and left upper trapezius muscles. Non-ICT tasks produced more muscle activity variation in the right forearm extensor group compared to New and Old ICT tasks. Different ICT tasks produce different degrees of variation in posture and muscle activity. Combining tasks that use different ICT may increase overall exposure variation. More research is needed to determine what degree of postural and muscle activity variation is associated with reduced risk of musculoskeletal ill-health.

  10. Ergonomics and human factors in endoscopic surgery: a comparison of manual vs telerobotic simulation systems.

    PubMed

    Lee, E C; Rafiq, A; Merrell, R; Ackerman, R; Dennerlein, J T

    2005-08-01

    Minimally invasive surgical techniques expose surgeons to a variety of occupational hazards that may promote musculoskeletal disorders. Telerobotic systems for minimally invasive surgery may help to reduce these stressors. The objective of this study was to compare manual and telerobotic endoscopic surgery in terms of postural and mental stress. Thirteen participants with no experience as primary surgeons in endoscopic surgery performed a set of simulated surgical tasks using two different techniques: a telerobotic master-slave system and a manual endoscopic surgery system. The tasks consisted of passing a soft spherical object through a series of parallel rings, suturing along a line 5-cm long, running a 32-in ribbon, and cannulation. The Job Strain Index (JSI) and Rapid Upper Limb Assessment (RULA) were used to quantify upper extremity exposure to postural and force risk factors. Task duration was quantified in seconds. A questionnaire provided measures of the participants' intuitiveness and mental stress. The JSI and RULA scores for all four tasks were significantly lower for the telerobotic technique than for the manual one. Task duration was significantly longer for telerobotic than for manual tasks. Participants reported that the telerobotic technique was as intuitive as, and no more stressful than, the manual technique. Given identical tasks, the time to completion is longer using the telerobotic technique than its manual counterpart. For the given simulated tasks in the laboratory setting, the better scores for the upper extremity postural analysis indicate that telerobotic surgery provides a more comfortable environment for the surgeon without any additional mental stress.

  11. Airborne Intelligent Display (AID) Phase I Software Description,

    DTIC Science & Technology

    1983-10-24

    Board Computer Characteristics 10 3.0 SOFTWARE GENERAL DESCRIPTION 13 3.1 Overview 13 3.2 System Software 14 3.2.1 System Startup 14 3.2.1.1 Initial...3 A-2 Task States A-4 A-3 Task Program Structure A-6 A-4 Task States and State Change Mechanisms A-7 A-5 Computing Return Addresses: RUNADR, SLPADR A...techniques. 2.2 Design Approach The stated objectives were met by: 1. distributing the processing load among multiple Z80 single-board computers (SBC’s). This

  12. One Task, Divergent Solutions: High- versus Low-Status Sources and Social Comparison Guide Adaptation in a Computer-Supported Socio-Cognitive Conflict Task

    ERIC Educational Resources Information Center

    Baumeister, Antonia E.; Engelmann, Tanja; Hesse, Friedrich W.

    2017-01-01

    This experimental study extends conflict elaboration theory (1) by revealing social influence dynamics for a knowledge-rich computer-supported socio-cognitive conflict task not investigated in the context of this theory before and (2) by showing the impact of individual differences in social comparison orientation. Students in two conditions…

  13. What and When Second-Language Learners Revise When Responding to Timed Writing Tasks on the Computer: The Roles of Task Type, Second Language Proficiency, and Keyboarding Skills

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2016-01-01

    This study contributes to the literature on second language (L2) learners' revision behavior by describing what, when, and how often L2 learners revise their texts when responding to timed writing tasks on the computer and by examining the effects of task type, L2 proficiency, and keyboarding skills on what and when L2 learners revise. Each of 54…

  14. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task

    NASA Astrophysics Data System (ADS)

    Revechkis, Boris; Aflalo, Tyson NS; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A.

    2014-12-01

    Objective. To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. Approach. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like ‘Face in a Crowd’ task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the ‘Crowd’) using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a ‘Crowd Off’ condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Main results. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Significance. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.

  15. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task.

    PubMed

    Revechkis, Boris; Aflalo, Tyson N S; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A

    2014-12-01

    To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like 'Face in a Crowd' task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the 'Crowd') using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a 'Crowd Off' condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.

  16. The Logical Extension

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The same software controlling autonomous and crew-assisted operations for the International Space Station (ISS) is enabling commercial enterprises to integrate and automate manual operations, also known as decision logic, in real time across complex and disparate networked applications, databases, servers, and other devices, all with quantifiable business benefits. Auspice Corporation, of Framingham, Massachusetts, developed the Auspice TLX (The Logical Extension) software platform to effectively mimic the human decision-making process. Auspice TLX automates operations across extended enterprise systems, where any given infrastructure can include thousands of computers, servers, switches, and modems that are connected, and therefore, dependent upon each other. The concept behind the Auspice software spawned from a computer program originally developed in 1981 by Cambridge, Massachusetts-based Draper Laboratory for simulating tasks performed by astronauts aboard the Space Shuttle. At the time, the Space Shuttle Program was dependent upon paper-based procedures for its manned space missions, which typically averaged 2 weeks in duration. As the Shuttle Program progressed, NASA began increasing the length of manned missions in preparation for a more permanent space habitat. Acknowledging the need to relinquish paper-based procedures in favor of an electronic processing format to properly monitor and manage the complexities of these longer missions, NASA realized that Draper's task simulation software could be applied to its vision of year-round space occupancy. In 1992, Draper was awarded a NASA contract to build User Interface Language software to enable autonomous operations of a multitude of functions on Space Station Freedom (the station was redesigned in 1993 and converted into the international venture known today as the ISS)

  17. A Functional Cartography of Cognitive Systems

    PubMed Central

    Mattar, Marcelo G.; Cole, Michael W.; Thompson-Schill, Sharon L.; Bassett, Danielle S.

    2015-01-01

    One of the most remarkable features of the human brain is its ability to adapt rapidly and efficiently to external task demands. Novel and non-routine tasks, for example, are implemented faster than structural connections can be formed. The neural underpinnings of these dynamics are far from understood. Here we develop and apply novel methods in network science to quantify how patterns of functional connectivity between brain regions reconfigure as human subjects perform 64 different tasks. By applying dynamic community detection algorithms, we identify groups of brain regions that form putative functional communities, and we uncover changes in these groups across the 64-task battery. We summarize these reconfiguration patterns by quantifying the probability that two brain regions engage in the same network community (or putative functional module) across tasks. These tools enable us to demonstrate that classically defined cognitive systems—including visual, sensorimotor, auditory, default mode, fronto-parietal, cingulo-opercular and salience systems—engage dynamically in cohesive network communities across tasks. We define the network role that a cognitive system plays in these dynamics along the following two dimensions: (i) stability vs. flexibility and (ii) connected vs. isolated. The role of each system is therefore summarized by how stably that system is recruited over the 64 tasks, and how consistently that system interacts with other systems. Using this cartography, classically defined cognitive systems can be categorized as ephemeral integrators, stable loners, and anything in between. Our results provide a new conceptual framework for understanding the dynamic integration and recruitment of cognitive systems in enabling behavioral adaptability across both task and rest conditions. This work has important implications for understanding cognitive network reconfiguration during different task sets and its relationship to cognitive effort, individual variation in cognitive performance, and fatigue. PMID:26629847
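
    The summary statistic described above, the probability that two regions share a network community across tasks, reduces to a simple average over per-task community labels. Below is a sketch with synthetic labels; the study's data and its dynamic community-detection step are not reproduced here.

    ```python
    import numpy as np

    # labels[t, i] = community assignment of brain region i during task t.
    rng = np.random.default_rng(0)
    n_tasks, n_regions = 64, 10
    labels = rng.integers(0, 3, size=(n_tasks, n_regions))   # synthetic data

    # Allegiance-style matrix: fraction of tasks in which regions i and j
    # carry the same community label.
    allegiance = np.zeros((n_regions, n_regions))
    for t in range(n_tasks):
        allegiance += labels[t][:, None] == labels[t][None, :]
    allegiance /= n_tasks

    print(np.round(allegiance, 2))
    ```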

  18. Endpoint-based parallel data processing in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael E; Ratterman, Joseph D; Smith, Brian E

    2014-02-11

    Endpoint-based parallel data processing in a parallel active messaging interface ('PAMI') of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.

  19. Psychological Issues in Online Adaptive Task Allocation

    NASA Technical Reports Server (NTRS)

    Morris, N. M.; Rouse, W. B.; Ward, S. L.; Frey, P. R.

    1984-01-01

    Adaptive aiding is an idea that offers potential for improvement over many current approaches to aiding in human-computer systems. The expected return of tailoring the system to fit the user could be in the form of improved system performance and/or increased user satisfaction. Issues such as the manner in which information is shared between human and computer, the appropriate division of labor between them, and the level of autonomy of the aid are explored. A simulated visual search task was developed. Subjects are required to identify targets in a moving display while performing a compensatory sub-critical tracking task. By manipulating characteristics of the situation such as imposed task-related workload and effort required to communicate with the computer, it is possible to create conditions in which interaction with the computer would be more or less desirable. The results of preliminary research using this experimental scenario are presented, and future directions for this research effort are discussed.

  20. Endpoint-based parallel data processing in a parallel active messaging interface of a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2014-08-12

    Endpoint-based parallel data processing in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.

  1. Task-Based Assessment of Students' Computational Thinking Skills Developed through Visual Programming or Tangible Coding Environments

    ERIC Educational Resources Information Center

    Djambong, Takam; Freiman, Viktor

    2016-01-01

    While today's schools in several countries, like Canada, are about to bring back programming to their curricula, a new conceptual angle, namely one of computational thinking, draws attention of researchers. In order to understand the articulation between computational thinking tasks in one side, student's targeted skills, and the types of problems…

  2. Mediated Activity in the Primary Classroom: Girls, Boys and Computers.

    ERIC Educational Resources Information Center

    Fitzpatrick, Helen; Hardman, Margaret

    2000-01-01

    Studied the social interaction of 7- and 9-year-olds working in the same or mixed gender pairs on language-based computer and noncomputer tasks. At both ages, mixed gender pairs showed more assertive and less transactive (collaborative) interaction than same gender pairs on both tasks. Discusses the mediational role of the computer and the social…

  3. Task-Relevant Sound and User Experience in Computer-Mediated Firefighter Training

    ERIC Educational Resources Information Center

    Houtkamp, Joske M.; Toet, Alexander; Bos, Frank A.

    2012-01-01

    The authors added task-relevant sounds to a computer-mediated instructor in-the-loop virtual training for firefighter commanders in an attempt to raise the engagement and arousal of the users. Computer-mediated training for crew commanders should provide a sensory experience that is sufficiently intense to make the training viable and effective.…

  4. Distributed computation of graphics primitives on a transputer network

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1988-01-01

    A method is developed for distributing the computation of graphics primitives on a parallel processing network. Off-the-shelf transputer boards are used to perform the graphics transformations and scan-conversion tasks that would normally be assigned to a single transputer based display processor. Each node in the network performs a single graphics primitive computation. Frequently requested tasks can be duplicated on several nodes. The results indicate that the current distribution of commands on the graphics network shows a performance degradation when compared to the graphics display board alone. A change to more computation per node for every communication (perform more complex tasks on each node) may cause the desired increase in throughput.

  5. Climate Change to the Year 2000: A Survey of Expert Opinion.

    ERIC Educational Resources Information Center

    Institute for the Future, Menlo Park, CA.

    This survey of expert opinion was conducted by the National Defense University, Washington, D.C. to quantify the likelihood of significant changes in climate and their practical consequences. The major objectives of the study are embodied in four tasks. This publication presents the results of the first task only: the definition and estimation of…

  6. Measuring Search Efficiency in Complex Visual Search Tasks: Global and Local Clutter

    ERIC Educational Resources Information Center

    Beck, Melissa R.; Lohrenz, Maura C.; Trafton, J. Gregory

    2010-01-01

    Set size and crowding affect search efficiency by limiting attention for recognition and attention against competition; however, these factors can be difficult to quantify in complex search tasks. The current experiments use a quantitative measure of the amount and variability of visual information (i.e., clutter) in highly complex stimuli (i.e.,…

  7. Characterization of a laboratory model of computer mouse use - applications for studying risk factors for musculoskeletal disorders.

    PubMed

    Flodgren, G; Heiden, M; Lyskov, E; Crenshaw, A G

    2007-03-01

    In the present study, we assessed the wrist kinetics (range of motion, mean position, velocity and mean power frequency in radial/ulnar deviation, flexion/extension, and pronation/supination) associated with performing a mouse-operated computerized task involving painting rectangles on a computer screen. Furthermore, we evaluated the effects of the painting task on subjective perception of fatigue and wrist position sense. The results showed that the painting task required constrained wrist movements, and repetitive movements of about the same magnitude as those performed in mouse-operated design tasks. In addition, the painting task induced a perception of muscle fatigue in the upper extremity (Borg CR-scale: 3.5, p<0.001) and caused a reduction in the position sense accuracy of the wrist (error before: 4.6 degrees, error after: 5.6 degrees, p<0.05). This standardized painting task appears suitable for studying relevant risk factors, and therefore it offers a potential for investigating the pathophysiological mechanisms behind musculoskeletal disorders related to computer mouse use.

  8. A quantitative three-dimensional dose attenuation analysis around Fletcher-Suit-Delclos due to stainless steel tube for high-dose-rate brachytherapy by Monte Carlo calculations.

    PubMed

    Parsai, E Ishmael; Zhang, Zhengdong; Feldmeier, John J

    2009-01-01

    Commercially available brachytherapy treatment-planning systems today usually neglect the attenuation effect of the stainless steel (SS) tube when a Fletcher-Suit-Delclos (FSD) applicator is used in the treatment of cervical and endometrial cancers. This could lead to potential inaccuracies in computing dwell times and dose distribution. A more accurate analysis quantifying the level of attenuation for a high-dose-rate (HDR) iridium-192 (192Ir) source is presented through Monte Carlo simulation verified by measurement. In this investigation a general Monte Carlo N-Particle (MCNP) transport code was used to construct a typical FSD geometry through simulation and to compare the doses delivered to point A of the Manchester System with and without the SS tubing. A quantitative assessment of inaccuracies in delivered dose vs. the computed dose is presented. In addition, this investigation was expanded to examine the attenuation-corrected radial and anisotropy dose functions in a form parallel to the updated AAPM Task Group No. 43 Report (AAPM TG-43) formalism. This delineates quantitatively the inaccuracies in dose distributions in three-dimensional space. The changes in dose deposition and distribution caused by the increased attenuation resulting from the presence of SS are quantified using MCNP Monte Carlo simulations in coupled photon/electron transport. The source geometry was that of the VariSource wire model VS2000, and the FSD was that of the Varian medical system. In this model, the bending angles of the tandem and colpostats are 15 degrees and 120 degrees, respectively. We assigned 10 dwell positions to the tandem and 4 dwell positions to the right and left colpostats (ovoids) to represent a typical treatment case. The typical dose delivered to point A was determined according to the Manchester dosimetry system. Based on our computations, the reduction of dose to point A was shown to be at least 3%, so the effect of SS FSD systems on patient dose is of concern.

  9. A resource-sharing model based on a repeated game in fog computing.

    PubMed

    Sun, Yan; Zhang, Nan

    2017-03-01

    With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.
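
    As a toy companion to the repeated-game framing (not the paper's model), the standard grim-trigger condition shows when sharing is self-enforcing: a resource owner keeps sharing if the discounted value of continued cooperation outweighs the one-shot gain from defecting. The payoffs below use generic prisoner's-dilemma notation with invented values.

    ```python
    # Grim-trigger sustainability check for an infinitely repeated game:
    # cooperation (sharing) holds when delta >= (T - R) / (T - P), where
    # T = payoff for defecting on a sharer, R = mutual sharing, P = mutual defection.
    def sharing_sustainable(T, R, P, delta):
        return delta >= (T - R) / (T - P)

    print(sharing_sustainable(T=5.0, R=3.0, P=1.0, delta=0.6))   # True: 0.6 >= 0.5
    ```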

  10. The Differential Effects of Collaborative vs. Individual Prewriting Planning on Computer-Mediated L2 Writing: Transferability of Task-Based Linguistic Skills in Focus

    ERIC Educational Resources Information Center

    Amiryousefi, Mohammad

    2017-01-01

    The current study aimed at investigating the effects of three types of prewriting planning conditions, namely teacher-monitored collaborative planning (TMCP), student-led collaborative planning (SLCP), and individual planning (IP) on EFL learners' computer-mediated L2 written production and learning transfer from a pedagogic task to a new task of…

  11. A queueing model of pilot decision making in a multi-task flight management situation

    NASA Technical Reports Server (NTRS)

    Walden, R. S.; Rouse, W. B.

    1977-01-01

    Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
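
    As a rough illustration of the modeling approach named above, the snippet below evaluates the standard M/M/1 formulas for a single decision maker serving a Poisson stream of tasks. The rates are invented, and the paper's actual model and parameters are not reproduced here.

    ```python
    # M/M/1 queue: Poisson arrivals at rate lam, exponential service at rate mu.
    lam, mu = 2.0, 3.0                 # task arrival and service rates (per second)
    rho = lam / mu                     # server utilization; must be < 1 for stability
    L = rho / (1 - rho)                # mean number of tasks in the system
    W = 1 / (mu - lam)                 # mean time a task spends in the system (s)
    print(f"utilization={rho:.2f}, tasks in system={L:.2f}, time in system={W:.2f}s")
    ```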

  12. Quantifying Human Performance of a Dynamic Military Target Detection Task: An Application of the Theory of Signal Detection.

    DTIC Science & Technology

    1995-06-01

    applied to analyze numerous experimental tasks (Macmillan and Creelman, 1991). One of these tasks, target detection, is the subject research. In...between each associated pair of false alarm rate and hit rate z-scores is d' for the bias level associated with the pairing (Macmillan and Creelman, 1991...unequal variance in normal distributions (Macmillan and Creelman, 1991). ...1966). It is described in detail for the interested reader by Green and
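
    The excerpt's central quantity is easy to compute directly: under the equal-variance Gaussian signal detection model, d' is the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch follows; the example rates are invented.

    ```python
    from statistics import NormalDist

    def d_prime(hit_rate, fa_rate):
        # d' = z(hit rate) - z(false-alarm rate), equal-variance Gaussian model.
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(fa_rate)

    print(round(d_prime(0.85, 0.20), 2))   # ~1.88
    ```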

  13. Study of electrical and chemical propulsion systems for auxiliary propulsion of large space systems, volume 2

    NASA Technical Reports Server (NTRS)

    Smith, W. W.

    1981-01-01

    The five major tasks of the program are reported. Task 1 is a literature search followed by selection and definition of seven generic spacecraft classes. Task 2 covers the determination and description of important disturbance effects. Task 3 applies the disturbances to the generic spacecraft and adds maneuver and stationkeeping functions to define total auxiliary propulsion systems requirements for control. The important auxiliary propulsion system characteristics are identified and sensitivities to control functions and large space system characteristics determined. In Task 4, these sensitivities are quantified and the optimum auxiliary propulsion system characteristics determined. Task 5 compares the desired characteristics with those available for both electrical and chemical auxiliary propulsion systems to identify the directions technology advances should take.

  14. Quantifying fast optical signal and event-related potential relationships during a visual oddball task.

    PubMed

    Proulx, Nicole; Samadani, Ali-Akbar; Chau, Tom

    2018-05-16

    Event-related potentials (ERPs) have previously been used to confirm the existence of the fast optical signal (FOS) but validation methods have mainly been limited to exploring the temporal correspondence of FOS peaks to those of ERPs. The purpose of this study was to systematically quantify the relationship between FOS and ERP responses to a visual oddball task in both time and frequency domains. Near-infrared spectroscopy (NIRS) and electroencephalography (EEG) sensors were co-located over the prefrontal cortex while participants performed a visual oddball task. Fifteen participants completed 2 data collection sessions each, where they were instructed to keep a mental count of oddball images. The oddball condition produced a positive ERP at 200 ms followed by a negativity 300-500 ms after image onset in the frontal electrodes. In contrast to previous FOS studies, a FOS response was identified only in DC intensity signals and not in phase delay signals. A decrease in DC intensity was found 150-250 ms after oddball image onset with a 400-trial average in 10 of 15 participants. The latency of the positive 200 ms ERP and the FOS DC intensity decrease were significantly correlated for only 6 (out of 15) participants due to the low signal-to-noise ratio of the FOS response. Coherence values between the FOS and ERP oddball responses were found to be significant in the 3-5 Hz frequency band for 10 participants. A significant Granger causal influence of the ERP on the FOS oddball response was uncovered in the 2-6 Hz frequency band for 7 participants. Collectively, our findings suggest that, for a majority of participants, the ERP and the DC intensity signal of the FOS are spectrally coherent, specifically in narrow frequency bands previously associated with event-related oscillations in the prefrontal cortex. However, these electro-optical relationships were only found in a subset of participants. Further research on enhancing the quality of the event-related FOS signal is required before it can be practically exploited in applications such as brain-computer interfacing. Copyright © 2018. Published by Elsevier Inc.
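
    Below is a sketch of the kind of coherence analysis reported above: magnitude-squared coherence between two co-located signals, averaged over the 3-5 Hz band. The signals, sampling rate, and window length are synthetic stand-ins, not the study's data or pipeline.

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 250.0                                    # sampling rate (Hz), assumed
    t = np.arange(0, 60, 1 / fs)
    common = np.sin(2 * np.pi * 4 * t)            # shared 4 Hz component
    erp_like = common + 0.5 * np.random.randn(t.size)   # noisy electrical signal
    fos_like = common + 0.5 * np.random.randn(t.size)   # noisy optical signal

    f, Cxy = coherence(erp_like, fos_like, fs=fs, nperseg=1024)
    band = (f >= 3) & (f <= 5)
    print(f"mean coherence in 3-5 Hz: {Cxy[band].mean():.2f}")
    ```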

  15. Alterations in Masticatory Muscle Activation in People with Persistent Neck Pain Despite the Absence of Orofacial Pain or Temporomandibular Disorders.

    PubMed

    Testa, Marco; Geri, Tommaso; Gizzi, Leonardo; Petzke, Frank; Falla, Deborah

    2015-01-01

    To assess whether patients with persistent neck pain display evidence of altered masticatory muscle behavior during a jaw-clenching task, despite the absence of orofacial pain or temporomandibular disorders. Ten subjects with persistent, nonspecific neck pain and 10 age- and sex-matched healthy controls participated. Maximal voluntary contractions (MVCs) of unilateral jaw clenching followed by 5-second submaximal contractions at 10%, 30%, 50%, and 70% MVC were recorded by two flexible force transducers positioned between the first molar teeth. Task performance was quantified by mean distance and offset error from the reference target force as error indices, and standard deviation of force was used as an index of force steadiness. Electromyographic (EMG) activity was recorded bilaterally from the masseter muscle with 13 × 5 grids of electrodes and from the anterior temporalis with bipolar electrodes. Normalized EMG root mean square (RMS) was computed for each location of the grid to form a map of the EMG amplitude distribution, and the average normalized RMS was determined for the bipolar acquisition. Between-group differences were analyzed with the Kruskal-Wallis analysis of variance. Task performance was similar in patients and controls. However, patients displayed greater masseter EMG activity bilaterally at higher force levels (P<.05). This study has provided novel evidence of altered motor control of the jaw in people with neck pain despite the absence of orofacial pain or temporomandibular disorders.

  16. Multitasking During Degraded Speech Recognition in School-Age Children

    PubMed Central

    Ward, Kristina M.; Brehm, Laurel

    2017-01-01

    Multitasking requires individuals to allocate their cognitive resources across different tasks. The purpose of the current study was to assess school-age children’s multitasking abilities during degraded speech recognition. Children (8 to 12 years old) completed a dual-task paradigm including a sentence recognition (primary) task containing speech that was either unprocessed or noise-band vocoded with 8, 6, or 4 spectral channels and a visual monitoring (secondary) task. Children’s accuracy and reaction time on the visual monitoring task was quantified during the dual-task paradigm in each condition of the primary task and compared with single-task performance. Children experienced dual-task costs in the 6- and 4-channel conditions of the primary speech recognition task with decreased accuracy on the visual monitoring task relative to baseline performance. In all conditions, children’s dual-task performance on the visual monitoring task was strongly predicted by their single-task (baseline) performance on the task. Results suggest that children’s proficiency with the secondary task contributes to the magnitude of dual-task costs while multitasking during degraded speech recognition. PMID:28105890

  17. Multitasking During Degraded Speech Recognition in School-Age Children.

    PubMed

    Grieco-Calub, Tina M; Ward, Kristina M; Brehm, Laurel

    2017-01-01

    Multitasking requires individuals to allocate their cognitive resources across different tasks. The purpose of the current study was to assess school-age children's multitasking abilities during degraded speech recognition. Children (8 to 12 years old) completed a dual-task paradigm including a sentence recognition (primary) task containing speech that was either unprocessed or noise-band vocoded with 8, 6, or 4 spectral channels and a visual monitoring (secondary) task. Children's accuracy and reaction time on the visual monitoring task was quantified during the dual-task paradigm in each condition of the primary task and compared with single-task performance. Children experienced dual-task costs in the 6- and 4-channel conditions of the primary speech recognition task with decreased accuracy on the visual monitoring task relative to baseline performance. In all conditions, children's dual-task performance on the visual monitoring task was strongly predicted by their single-task (baseline) performance on the task. Results suggest that children's proficiency with the secondary task contributes to the magnitude of dual-task costs while multitasking during degraded speech recognition.

  18. Learners' Field Dependence and the Effects of Personalized Narration on Learners' Computer Perceptions and Task-Related Attitudes in Multimedia Learning

    ERIC Educational Resources Information Center

    Liew, Tze Wei; Tan, Su-Mae; Seydali, Rouzbeh

    2014-01-01

    In this article, the effects of personalized narration in multimedia learning on learners' computer perceptions and task-related attitudes were examined. Twenty-six field independent and 22 field dependent participants studied the computer-based multimedia lessons on C-Programming, either with personalized narration or non-personalized narration.…

  19. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
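
    As one concrete example of the analytical models discussed above, the Keystroke-Level Model (a simplified member of the GOMS family) estimates task time by summing standard operator times. The sketch below uses commonly cited operator estimates and a made-up operator sequence; it is illustrative and is not drawn from this paper.

    ```python
    # Keystroke-Level Model operators with commonly cited times (seconds).
    KLM = {"K": 0.28,   # keystroke (average typist)
           "P": 1.10,   # point at a target with the mouse
           "H": 0.40,   # home hands between keyboard and mouse
           "M": 1.35}   # mental preparation

    def estimate_seconds(ops):
        return sum(KLM[op] for op in ops)

    # e.g., think, point at a field, click, home to keyboard, type 5 characters
    print(estimate_seconds(["M", "P", "K", "H"] + ["K"] * 5))   # ~4.5 s
    ```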

  20. Characterizing quantum supremacy in near-term devices

    NASA Astrophysics Data System (ADS)

    Boixo, Sergio; Isakov, Sergei V.; Smelyanskiy, Vadim N.; Babbush, Ryan; Ding, Nan; Jiang, Zhang; Bremner, Michael J.; Martinis, John M.; Neven, Hartmut

    2018-06-01

    A critical question for quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of supercomputers. Such a demonstration of what is referred to as quantum supremacy requires a reliable evaluation of the resources required to solve tasks with classical approaches. Here, we propose the task of sampling from the output distribution of random quantum circuits as a demonstration of quantum supremacy. We extend previous results in computational complexity to argue that this sampling task must take exponential time in a classical computer. We introduce cross-entropy benchmarking to obtain the experimental fidelity of complex multiqubit dynamics. This can be estimated and extrapolated to give a success metric for a quantum supremacy demonstration. We study the computational cost of relevant classical algorithms and conclude that quantum supremacy can be achieved with circuits in a two-dimensional lattice of 7 × 7 qubits and around 40 clock cycles. This requires an error rate of around 0.5% for two-qubit gates (0.05% for one-qubit gates), and it would demonstrate the basic building blocks for a fault-tolerant quantum computer.
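
    A hedged sketch of the benchmarking idea: estimate fidelity from the ideal probabilities of the observed bitstrings. The snippet uses the linear cross-entropy form F = 2^n · mean(p_ideal) − 1, a common variant of the technique rather than the logarithmic cross-entropy the paper develops, and it substitutes a synthetic distribution for the classical circuit simulation that would normally supply p_ideal.

    ```python
    import numpy as np

    def linear_xeb(n_qubits, p_ideal_of_samples):
        # F = 2^n * mean(p_ideal(x_i)) - 1 over observed bitstrings x_i.
        return 2**n_qubits * np.mean(p_ideal_of_samples) - 1

    # Sanity check: sampling from the ideal distribution itself gives F ~ 1
    # for a Porter-Thomas-like (flat Dirichlet) output distribution.
    n = 10
    probs = np.random.dirichlet(np.ones(2**n))            # stand-in distribution
    samples = np.random.choice(2**n, size=20_000, p=probs)
    print(linear_xeb(n, probs[samples]))                  # close to 1.0
    ```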

  1. Application of a fast skyline computation algorithm for serendipitous searching problems

    NASA Astrophysics Data System (ADS)

    Koizumi, Kenichi; Hiraki, Kei; Inaba, Mary

    2018-02-01

    Skyline computation is a method of extracting interesting entries from a large population with multiple attributes. These entries, called skyline or Pareto optimal entries, are known to have extreme characteristics that cannot be found by outlier detection methods. Skyline computation is an important task for characterizing large amounts of data and selecting interesting entries with extreme features. When the population changes dynamically, the task of calculating a sequence of skyline sets is called continuous skyline computation. This task is known to be difficult to perform for the following reasons: (1) information on non-skyline entries must be stored, since they may join the skyline in the future; (2) the appearance or disappearance of even a single entry can change the skyline drastically; (3) it is difficult to adopt a geometric acceleration algorithm for skyline computation tasks with high-dimensional datasets. Our new algorithm, called jointed rooted-tree (JR-tree), manages entries using a rooted-tree structure. JR-tree delays extending the tree to deep levels in order to accelerate tree construction and traversal. In this study, we presented the difficulties in extracting entries tagged with a rare label in high-dimensional space and the potential of fast skyline computation in low-latency cell identification technology.
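
    For reference, the baseline that tree-based methods like JR-tree accelerate is the naive quadratic-time dominance scan sketched below; the convention that larger attribute values are preferred is an assumption of the sketch.

    ```python
    def dominates(a, b):
        """a dominates b if a is at least as good in every attribute and
        strictly better in at least one (larger values preferred here)."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    def skyline(points):
        """Naive O(n^2) skyline: keep entries no other entry dominates."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    pts = [(9, 1), (7, 7), (3, 9), (5, 5), (1, 2)]
    print(skyline(pts))  # [(9, 1), (7, 7), (3, 9)]
    ```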

  2. Nondestructive microimaging during preclinical pin-on-plate testing of novel materials for arthroplasty.

    PubMed

    Teeter, Matthew G; Langohr, G Daniel G; Medley, John B; Holdsworth, David W

    2014-02-01

    The purpose of this study was to determine the ability of micro-computed tomography to quantify wear in preclinical pin-on-plate testing of materials for use in joint arthroplasty. Wear testing of CoCr pins articulating against six polyetheretherketone plates was performed using a pin-on-plate apparatus over 2 million cycles. Change in volume due to wear was quantified with gravimetric analysis and with micro-computed tomography, and the volumes were compared. Separately, the volume of polyetheretherketone pin-on-plate specimens that had been soaking in fluid for 52 weeks was quantified with both gravimetric analysis and micro-computed tomography, and repeated after drying. The volume change with micro-computed tomography was compared to the mass change with gravimetric analysis. The mean wear volume measured was 8.02 ± 6.38 mm³ with gravimetric analysis and 6.76 ± 5.38 mm³ with micro-computed tomography (p = 0.06). Micro-computed tomography volume measurements did not show a statistically significant change with drying for either the plates (p = 0.60) or the pins (p = 0.09), yet drying had a significant effect on the gravimetric mass measurements for both the plates (p = 0.03) and the pins (p = 0.04). Micro-computed tomography provided accurate measurements of wear in polyetheretherketone pin-on-plate test specimens, and no statistically significant change was caused by fluid uptake. Micro-computed tomography quantifies wear depth and wear volume, mapped to the specific location of damage on the specimen, and is also capable of examining subsurface density as well as cracking. Its noncontact, nondestructive nature makes it ideal for preclinical testing of materials, in which additional analysis techniques may be utilized.

  3. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    NASA Technical Reports Server (NTRS)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit of tactile feedback in GECO glove use for data entry. This first-ever investigation of the use of a pressurized EVA glove as a human-computer interface opens up a wide range of future applications, including text "chat" communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  5. Interlimb transfer of motor skill learning during walking: No evidence for asymmetric transfer.

    PubMed

    Krishnan, Chandramouli; Ranganathan, Rajiv; Tetarbe, Manik

    2017-07-01

    Several studies have shown that learning a motor skill in one limb can transfer to the opposite limb, a phenomenon called interlimb transfer. The transfer of motor skills between limbs, however, has been shown to be asymmetric, with one side benefiting to a greater extent than the other. While this phenomenon has been well documented in the upper extremity, evidence for interlimb transfer in the lower extremity is limited and mixed. This study investigated the extent of interlimb transfer during walking, and tested whether this transfer was asymmetric, using a foot trajectory-tracking paradigm that has been specifically used for gait rehabilitation. The paradigm involved learning a new gait pattern that required greater hip and knee flexion during the swing phase of gait while walking on a treadmill. Twenty young adults were randomized into two equal groups: one group (right-to-left: RL) practiced the task initially with the dominant right leg, and the other group (left-to-right: LR) practiced the task initially with the non-dominant left leg. After training, both groups practiced the task with the opposite leg to test for transfer effects. The changes in tracking error on each leg were computed to quantify learning and transfer effects. The results indicated that practice with one leg improved the motor performance of the other leg; however, the amount of transfer was similar across groups, indicating that there was no asymmetry in transfer. This finding contradicts most upper-extremity studies (where asymmetric transfer has been reported) and suggests that differences in both neural processes and task type may mediate interlimb transfer. Copyright © 2017 Elsevier B.V. All rights reserved.
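
    A minimal sketch of the kind of tracking-error measure used here to quantify learning and transfer; the (frames x 2) trajectory layout and the sample values are hypothetical.

    ```python
    import numpy as np

    def tracking_error(target, actual):
        """Root-mean-square distance between the target foot trajectory and
        the trajectory actually produced; lower means better tracking."""
        diff = np.asarray(target) - np.asarray(actual)
        return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

    target = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 3.0]])
    early = np.array([[0.3, 0.4], [1.4, 2.5], [2.5, 3.3]])  # before practice
    late = np.array([[0.1, 0.1], [1.1, 2.1], [2.1, 3.1]])   # after practice
    print(tracking_error(target, early) > tracking_error(target, late))  # True
    ```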

  6. Functional Neuroanatomy Involved in Automatic order Mental Arithmetic and Recitation of the Multiplication Table

    NASA Astrophysics Data System (ADS)

    Wang, Li-Qun; Saito, Masao

    We used 1.5T functional magnetic resonance imaging (fMRI) to explore which brain areas contribute uniquely to numeric computation. The BOLD activation pattern of a mental arithmetic task (successive subtraction: an actual calculation task) was compared with the response to a multiplication-table repetition task (a rote verbal arithmetic memory task). The activation found in the right parietal lobule during the mental arithmetic task suggested that quantitative cognition, or numeric computation, may need the assistance of sensory conversion, such as spatial imagination and spatial sensory conversion. In addition, this mechanism may be an 'analog algorithm' in simple mental arithmetic processing.

  7. Assessing the effects of manual dexterity and playing computer games on catheter-wire manipulation for inexperienced operators.

    PubMed

    Alsafi, Z; Hameed, Y; Amin, P; Shamsad, S; Raja, U; Alsafi, A; Hamady, M S

    2017-09-01

    To investigate the effect of playing computer games and manual dexterity on catheter-wire manipulation in a mechanical aortic model. Medical student volunteers filled in a preprocedure questionnaire assessing their exposure to computer games. Their manual dexterity was measured using a smartphone game. They were then shown a video clip demonstrating renal artery cannulation and were asked to reproduce this. All attempts were timed. Two-tailed Student's t-test was used to compare continuous data, while Fisher's exact test was used for categorical data. Fifty students aged 18-22 years took part in the study. Forty-six completed the task at an average of 168 seconds (range 103-301 seconds). There was no significant difference in the dexterity score or time to cannulate the renal artery between male and female students. Students who played computer games for >10 hours per week had better dexterity scores than those who did not play computer games: 9.1 versus 10.2 seconds (p=0.0237). Four of 19 students who did not play computer games failed to complete the task, while all of those who played computer games regularly completed the task (p=0.0168). Playing computer games is associated with better manual dexterity and ability to complete a basic interventional radiology task for novices. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
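
    The two tests named in the abstract map onto standard SciPy calls, as sketched below; the dexterity scores are invented, while the 2 x 2 completion table is reconstructed from the abstract's counts (all 31 game-players completed; 4 of 19 non-players failed).

    ```python
    from scipy import stats

    # Hypothetical dexterity times in seconds (lower is better).
    gamers = [8.8, 9.4, 9.0, 8.9, 9.5]
    non_gamers = [10.4, 9.9, 10.6, 10.0]
    print(stats.ttest_ind(gamers, non_gamers))  # two-tailed Student's t-test

    # Rows: gamers / non-gamers; columns: completed / failed the cannulation.
    print(stats.fisher_exact([[31, 0], [15, 4]]))
    ```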

  8. 29 CFR 541.707 - Occasional tasks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...

  9. 29 CFR 541.707 - Occasional tasks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...

  10. 29 CFR 541.707 - Occasional tasks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...

  11. 29 CFR 541.707 - Occasional tasks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...

  12. Multi-segmental movement patterns reflect juggling complexity and skill level.

    PubMed

    Zago, Matteo; Pacifici, Ilaria; Lovecchio, Nicola; Galli, Manuela; Federolf, Peter Andreas; Sforza, Chiarella

    2017-08-01

    The juggling action of six expert and six intermediate jugglers was recorded with a motion capture system and decomposed into its fundamental components through Principal Component Analysis. The aim was to quantify trends in movement dimensionality, multi-segmental patterns and rhythmicity as a function of proficiency level and task complexity. Dimensionality was quantified in terms of Residual Variance, while the Relative Amplitude was introduced to account for individual differences in movement components. We observed that experience-related modifications in multi-segmental actions exist, such as the progressive reduction of error-correction movements, especially in the complex task condition. The systematic identification of motor patterns sensitive to the acquisition of specific experience could accelerate the learning process. Copyright © 2017 Elsevier B.V. All rights reserved.
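
    Quantifying dimensionality by residual variance has a compact formulation: decompose the centered motion data into principal components and report the variance share left outside the first k. A minimal sketch, assuming the data are arranged as frames x marker coordinates:

    ```python
    import numpy as np

    def residual_variance(X, k):
        """Fraction of movement variance not captured by the first k
        principal components of the centered data matrix X."""
        Xc = X - X.mean(axis=0)                  # center each coordinate
        s = np.linalg.svd(Xc, compute_uv=False)  # singular values
        var = s ** 2
        return float(var[k:].sum() / var.sum())

    # Toy data: two dominant movement components plus a little noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 200)
    X = np.column_stack([np.sin(t), np.cos(t), 0.05 * rng.standard_normal(200)])
    print(residual_variance(X, 2))  # small: two components explain the motion
    ```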

  13. Dual-Arm Generalized Compliant Motion With Shared Control

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.

    1994-01-01

    Dual-Arm Generalized Compliant Motion (DAGCM) primitive computer program implementing improved unified control scheme for two manipulator arms cooperating in task in which both grasp same object. Provides capabilities for autonomous, teleoperation, and shared control of two robot arms. Unifies cooperative dual-arm control with multi-sensor-based task control and makes complete task-control capability available to higher-level task-planning computer system via large set of input parameters used to describe desired force and position trajectories followed by manipulator arms. Some concepts discussed in "A Generalized-Compliant-Motion Primitive" (NPO-18134).

  14. Numerical Study of Boundary-Layer in Aerodynamics

    NASA Technical Reports Server (NTRS)

    Shih, Tom I-P.

    1997-01-01

    The accomplishments made in the following three tasks are described: (1) The first task was to study shock-wave boundary-layer interactions with bleed - this study is relevant to boundary-layer control in external and mixed-compression inlets of supersonic aircraft; (2) The second task was to test RAAKE, a code developed for computing turbulence quantities; and (3) The third task was to compute flow around the Ames ER-2 aircraft that has been retrofitted with containers over its wings and fuselage. The appendices include two reports submitted to AIAA for publication.

  15. Quantifying the effects of on-the-fly changes of seating configuration on the stability of a manual wheelchair.

    PubMed

    Thomas, Louise; Borisoff, Jaimie; Sparrey, Carolyn J

    2017-07-01

    In general, manual wheelchairs are designed with a fixed frame, which is not optimal for every situation. Adjustable on-the-fly seating allows users to rapidly adapt their wheelchair configuration to suit different tasks. These changes move the center of gravity (CoG) of the system, altering the wheelchair's stability and maneuverability. To assess these changes, a computer simulation of a manual wheelchair was created with an adjustable seat, backrest, rear axle position and user position, and validated with experimental testing. The stability of the wheelchair was most affected by the position of the rear axle, but adjustments to the backrest and seat angles also resulted in stability improvements that could be used when wheeling in the community. These findings identify the most influential parameters for wheelchair stability and maneuverability, and provide quantitative guidelines for the use of manual wheelchairs with on-the-fly adjustable seats.
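
    A first-order feel for why the rear-axle position dominates stability comes from elementary statics: the rearward tip threshold is the incline angle at which the CoG passes over the rear-wheel contact point. The sketch below uses hypothetical geometry and ignores dynamics, so it is an illustration rather than the paper's simulation.

    ```python
    import math

    def rear_stability_angle(cog_ahead_of_axle_m, cog_height_m):
        """Incline angle (degrees) at which the system CoG passes over the
        rear-wheel ground contact, i.e. the static rearward tipping limit."""
        return math.degrees(math.atan2(cog_ahead_of_axle_m, cog_height_m))

    # Moving the rear axle 5 cm rearward (CoG further ahead of the axle)
    # raises the static tipping limit noticeably.
    print(round(rear_stability_angle(0.10, 0.60), 1))  # ~9.5 degrees
    print(round(rear_stability_angle(0.15, 0.60), 1))  # ~14.0 degrees
    ```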

  16. Comparing Phylogenetic Trees by Matching Nodes Using the Transfer Distance Between Partitions

    PubMed Central

    Bogdanowicz, Damian; Giaro, Krzysztof

    2017-01-01

    Ability to quantify dissimilarity of different phylogenetic trees describing the relationship between the same group of taxa is required in various types of phylogenetic studies. For example, such metrics are used to assess the quality of phylogeny construction methods, to define optimization criteria in supertree building algorithms, or to find horizontal gene transfer (HGT) events. Among the set of metrics described so far in the literature, the most commonly used seems to be the Robinson–Foulds distance. In this article, we define a new metric for rooted trees—the Matching Pair (MP) distance. The MP metric uses the concept of the minimum-weight perfect matching in a complete bipartite graph constructed from partitions of all pairs of leaves of the compared phylogenetic trees. We analyze the properties of the MP metric and present computational experiments showing its potential applicability in tasks related to finding the HGT events. PMID:28177699

  17. Comparing Phylogenetic Trees by Matching Nodes Using the Transfer Distance Between Partitions.

    PubMed

    Bogdanowicz, Damian; Giaro, Krzysztof

    2017-05-01

    Ability to quantify dissimilarity of different phylogenetic trees describing the relationship between the same group of taxa is required in various types of phylogenetic studies. For example, such metrics are used to assess the quality of phylogeny construction methods, to define optimization criteria in supertree building algorithms, or to find horizontal gene transfer (HGT) events. Among the set of metrics described so far in the literature, the most commonly used seems to be the Robinson-Foulds distance. In this article, we define a new metric for rooted trees: the Matching Pair (MP) distance. The MP metric uses the concept of the minimum-weight perfect matching in a complete bipartite graph constructed from partitions of all pairs of leaves of the compared phylogenetic trees. We analyze the properties of the MP metric and present computational experiments showing its potential applicability in tasks related to finding the HGT events.
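
    The core computational step of the MP metric, a minimum-weight perfect matching on a complete bipartite graph, can be delegated to a standard assignment solver, as sketched below. The toy cost matrix is a placeholder; in the actual metric each entry would be the transfer distance between a partition of one tree and a partition of the other.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def min_weight_perfect_matching(cost):
        """Minimum-weight perfect matching for a square cost matrix
        (rows: partitions of tree 1, columns: partitions of tree 2)."""
        rows, cols = linear_sum_assignment(cost)
        pairs = [(int(r), int(c)) for r, c in zip(rows, cols)]
        return pairs, float(cost[rows, cols].sum())

    cost = np.array([[0.0, 2.0, 3.0],
                     [2.0, 0.0, 1.0],
                     [3.0, 1.0, 0.0]])
    print(min_weight_perfect_matching(cost))  # ([(0, 0), (1, 1), (2, 2)], 0.0)
    ```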

  18. Quantifying driver's field-of-view in tractors: methodology and case study.

    PubMed

    Gilad, Issachar; Byran, Eyal

    2015-01-01

    When driving a car, visual awareness is important for operating and controlling the vehicle. When operating a tractor, it is even more complex, because the driving is always accompanied by another task (e.g., ploughing) that demands constant changes of body posture to achieve the needed Field-of-View (FoV). Therefore, the cockpit must be well designed to provide the best FoV. Today, the driver's FoV is analyzed mostly by computer simulations of a cockpit model and a Digital Human Model (DHM) positioned inside. The outcome is an 'Eye view' that displays what the DHM 'sees'. This paper suggests a new approach that adds quantitative information to the current display, presented on three tractor models as case studies. Based on the results, the design can be modified. This may assist the engineer to analyze, compare and improve the design, to better address the driver's needs.

  19. Dynamically allocating sets of fine-grained processors to running computations

    NASA Technical Reports Server (NTRS)

    Middleton, David

    1988-01-01

    Researchers explore an approach to using general purpose parallel computers which involves mapping hardware resources onto computations instead of mapping computations onto hardware. Problems such as processor allocation, task scheduling and load balancing, which have traditionally proven to be challenging, change significantly under this approach and may become amenable to new attacks. Researchers describe the implementation of this approach used by the FFP Machine whose computation and communication resources are repeatedly partitioned into disjoint groups that match the needs of available tasks from moment to moment. Several consequences of this system are examined.

  20. Image Processing and Computer Aided Diagnosis in Computed Tomography of the Breast

    DTIC Science & Technology

    2007-03-01

    Keywords: breast imaging, breast CT, scatter compensation, denoising, CAD, cone-beam CT. ...clinical projection images. The CAD tool based on the signal-known-exactly (SKE) scenario is under development. Task 6: test and compare the performance of the CAD developed in Task 5 applied to processed projection data from Task 1 with the CAD performance on the projection data without Bayesian...

  1. The Effects of Synchronous Text-Based Computer-Mediated Communication Tasks on the Development of L2 and Academic Literacy: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Li, Jinrong

    2012-01-01

    The dissertation examines how synchronous text-based computer-mediated communication (SCMC) tasks may affect English as a Second Language (ESL) learners' development of second language (L2) and academic literacy. The study is motivated by two issues concerning the use of SCMC tasks in L2 writing classes. First, although some of the alleged…

  2. Design and Analysis of Self-Adapted Task Scheduling Strategies in Wireless Sensor Networks

    PubMed Central

    Guo, Wenzhong; Xiong, Naixue; Chao, Han-Chieh; Hussain, Sajid; Chen, Guolong

    2011-01-01

    In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks have been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to limitations such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO) algorithm for the dynamic alliance (DPSO-DA), with a well-designed particle position code and fitness function, is proposed. A mutation operator that can effectively improve the algorithm's global search ability and population diversity is also introduced. Finally, the simulation results show that the proposed solution achieves significantly better performance than other algorithms. PMID:22163971

  3. Dynamic stability requirements during gait and standing exergames on the wii fit® system in the elderly

    PubMed Central

    2012-01-01

    Background: In rehabilitation, training intensity is usually adapted to optimize the trained system to attain better performance (overload principle). However, in balance rehabilitation, the level of intensity required during training exercises to optimize improvement in balance has rarely been studied, probably due to the difficulty in quantifying the stability level during these exercises. The goal of the present study was to test whether the stabilizing/destabilizing forces model could be used to analyze how stability is challenged during several exergames, which are increasingly used in balance rehabilitation, and during a dynamic functional task such as gait. Methods: Seven healthy older adults were evaluated with three-dimensional motion analysis during gait at natural and fast speed, and during three balance exergames (50/50 Challenge, Ski Slalom and Soccer). Mean and extreme values for the stabilizing force, destabilizing force and the ratio of the two forces (stability index) were computed from kinematic and kinetic data to determine the mean and least level of dynamic, postural and overall balance stability, respectively. Results: Mean postural stability was lower (lower mean destabilizing force) during the 50/50 Challenge game than during all the other tasks, but peak postural instability moments were less challenging during this game than during any of the other tasks, as shown by the minimum destabilizing force values. Dynamic stability was progressively more challenged (higher mean and maximum stabilizing force) from the 50/50 Challenge to the Soccer and Slalom games, to the natural gait speed task and to the fast gait speed task, increasing the overall stability difficulty (mean and minimum stability index) in the same manner. Conclusions: The stabilizing/destabilizing forces model can be used to rate the level of balance requirements during different tasks such as gait or exergames. The results of our study showed that postural stability did not differ much between the evaluated tasks (except for the 50/50 Challenge), compared to dynamic stability, which was significantly less challenged during the games than during the functional tasks. Games with greater centre of mass displacements and changes in the base of support are likely to stimulate balance control enough to see improvements in balance during dynamic functional tasks, and could be tested in pathological populations with the approach used here. PMID:22607025

  4. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  5. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  6. Quantification of Load Dependent Brain Activity in Parametric N-Back Working Memory Tasks using Pseudo-continuous Arterial Spin Labeling (pCASL) Perfusion Imaging.

    PubMed

    Zou, Qihong; Gu, Hong; Wang, Danny J J; Gao, Jia-Hong; Yang, Yihong

    2011-04-01

    Brain activation and deactivation induced by N-back working memory tasks and their load effects have been extensively investigated using positron emission tomography (PET) and blood-oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI). However, the underlying mechanisms of BOLD fMRI are still not completely understood and PET imaging requires injection of radioactive tracers. In this study, a pseudo-continuous arterial spin labeling (pCASL) perfusion imaging technique was used to quantify cerebral blood flow (CBF), a well understood physiological index reflective of cerebral metabolism, in N-back working memory tasks. Using pCASL, we systematically investigated brain activation and deactivation induced by the N-back working memory tasks and further studied the load effects on brain activity based on quantitative CBF. Our data show increased CBF in the fronto-parietal cortices, thalamus, caudate, and cerebellar regions, and decreased CBF in the posterior cingulate cortex and medial prefrontal cortex, during the working memory tasks. Most of the activated/deactivated brain regions show an approximately linear relationship between CBF and task loads (0, 1, 2 and 3 back), although several regions show non-linear relationships (quadratic and cubic). The CBF-based spatial patterns of brain activation/deactivation and load effects from this study agree well with those obtained from BOLD fMRI and PET techniques. These results demonstrate the feasibility of ASL techniques to quantify human brain activity during high cognitive tasks, suggesting its potential application to assessing the mechanisms of cognitive deficits in neuropsychiatric and neurological disorders.
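
    Testing whether regional CBF scales linearly, quadratically or cubically with load reduces to comparing polynomial fits over the four load levels; a minimal sketch with hypothetical CBF values:

    ```python
    import numpy as np

    loads = np.array([0, 1, 2, 3])             # N-back levels
    cbf = np.array([52.0, 55.1, 58.3, 60.9])   # hypothetical CBF, ml/100 g/min

    # Compare fits of increasing order by their residual sum of squares.
    for degree in (1, 2, 3):
        coeffs = np.polyfit(loads, cbf, degree)
        rss = float(((cbf - np.polyval(coeffs, loads)) ** 2).sum())
        print(degree, np.round(coeffs, 3), rss)
    ```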

  7. Dopamine D3 Receptor Availability Is Associated with Inflexible Decision Making.

    PubMed

    Groman, Stephanie M; Smith, Nathaniel J; Petrulli, J Ryan; Massi, Bart; Chen, Lihui; Ropchan, Jim; Huang, Yiyun; Lee, Daeyeol; Morris, Evan D; Taylor, Jane R

    2016-06-22

    Dopamine D2/3 receptor signaling is critical for flexible adaptive behavior; however, it is unclear whether D2, D3, or both receptor subtypes modulate precise signals of feedback and reward history that underlie optimal decision making. Here, PET with the radioligand [¹¹C]-(+)-PHNO was used to quantify individual differences in putative D3 receptor availability in rodents trained on a novel three-choice spatial acquisition and reversal-learning task with probabilistic reinforcement. Binding of [¹¹C]-(+)-PHNO in the midbrain was negatively related to the ability of rats to adapt to changes in rewarded locations, but not to the initial learning. Computational modeling of choice behavior in the reversal phase indicated that [¹¹C]-(+)-PHNO binding in the midbrain was related to the learning rate and sensitivity to positive, but not negative, feedback. Administration of a D3-preferring agonist likewise impaired reversal performance by reducing the learning rate and sensitivity to positive feedback. These results demonstrate a previously unrecognized role for D3 receptors in select aspects of reinforcement learning and suggest that individual variation in midbrain D3 receptors influences flexible behavior. Our combined neuroimaging, behavioral, pharmacological, and computational approach implicates the dopamine D3 receptor in decision-making processes that are altered in psychiatric disorders. Flexible decision-making behavior is dependent upon dopamine D2/3 signaling in corticostriatal brain regions. However, the role of D3 receptors in adaptive, goal-directed behavior has not been thoroughly investigated. By combining PET imaging with the D3-preferring radioligand [¹¹C]-(+)-PHNO, pharmacology, a novel three-choice probabilistic discrimination and reversal task and computational modeling of behavior in rats, we report that naturally occurring variation in [¹¹C]-(+)-PHNO receptor availability relates to specific aspects of flexible decision making. We confirm these relationships using a D3-preferring agonist, thus identifying a unique role of midbrain D3 receptors in decision-making processes. Copyright © 2016 the authors.

  8. Computer control improves ethylene plant operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitehead, B.D.; Parnis, M.

    ICI Australia ordered a turnkey 250,000-tpy ethylene plant to be built at the Botany site, Sydney, Australia. Following a feasibility study, an additional order was placed for a process computer system for advanced process control and optimization. This article gives a broad outline of the process computer tasks, how the tasks were implemented, what problems were met, what lessons were learned and what results were achieved.

  9. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    PubMed

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses few parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), reducing makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  10. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment

    PubMed Central

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses few parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), reducing makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan. PMID:27348127
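
    The simulated-annealing ingredient of the hybrid can be sketched in isolation: anneal a random task-to-VM assignment toward lower makespan, occasionally accepting worse moves to escape local minima. This is only the SA local-search idea the paper borrows, not the full SASOS algorithm, and the task lengths and VM speeds are illustrative.

    ```python
    import math
    import random

    def makespan(assign, task_len, vm_speed):
        """Completion time of the busiest VM under a task->VM assignment."""
        load = [0.0] * len(vm_speed)
        for task, vm in enumerate(assign):
            load[vm] += task_len[task] / vm_speed[vm]
        return max(load)

    def sa_schedule(task_len, vm_speed, iters=20000, temp=10.0, alpha=0.9995):
        assign = [random.randrange(len(vm_speed)) for _ in task_len]
        cost = makespan(assign, task_len, vm_speed)
        best, best_cost = assign[:], cost
        for _ in range(iters):
            cand = assign[:]
            cand[random.randrange(len(cand))] = random.randrange(len(vm_speed))
            cand_cost = makespan(cand, task_len, vm_speed)
            # Accept improvements always, worse moves with Boltzmann probability.
            if cand_cost < cost or random.random() < math.exp((cost - cand_cost) / temp):
                assign, cost = cand, cand_cost
                if cost < best_cost:
                    best, best_cost = assign[:], cost
            temp *= alpha  # cool down
        return best, best_cost

    random.seed(1)
    tasks = [random.uniform(1.0, 10.0) for _ in range(40)]
    print(sa_schedule(tasks, vm_speed=[1.0, 1.5, 2.0])[1])
    ```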

  11. On the development of a computer-based handwriting assessment tool to objectively quantify handwriting proficiency in children.

    PubMed

    Falk, Tiago H; Tam, Cynthia; Schellnus, Heidi; Chau, Tom

    2011-12-01

    Standardized writing assessments such as the Minnesota Handwriting Assessment (MHA) can inform interventions for handwriting difficulties, which are prevalent among school-aged children. However, these tests usually involve the laborious task of subjectively rating the legibility of the written product, precluding their practical use in some clinical and educational settings. This study describes a portable computer-based handwriting assessment tool to objectively measure MHA quality scores and to detect handwriting difficulties in children. Several measures are proposed based on spatial, temporal, and grip force measurements obtained from a custom-built handwriting instrument. Thirty-five first and second grade students participated in the study, nine of whom exhibited handwriting difficulties. Students performed the MHA test and were subjectively scored based on speed and handwriting quality using five primitives: legibility, form, alignment, size, and space. Several spatial parameters are shown to correlate significantly (p<0.001) with subjective scores obtained for alignment, size, space, and form. Grip force and temporal measures, in turn, serve as useful indicators of handwriting legibility and speed, respectively. Using only size and space parameters, promising discrimination between proficient and non-proficient handwriting can be achieved. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Identification and restoration in 3D fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dieterlen, Alain; Xu, Chengqi; Haeberle, Olivier; Hueber, Nicolas; Malfara, R.; Colicchio, B.; Jacquey, Serge

    2004-06-01

    3-D optical fluorescence microscopy has now become an efficient tool for volumetric investigation of living biological samples. The 3-D data can be acquired by optical sectioning microscopy, which is performed by axial stepping of the object versus the objective. For any instrument, each recorded image can be described by a convolution equation between the original object and the Point Spread Function (PSF) of the acquisition system. To assess performance and ensure data reproducibility, as for any 3-D quantitative analysis, system identification is mandatory. The PSF characterizes the properties of the image acquisition system; it can be computed or acquired experimentally. Statistical tools and Zernike moments are shown to be appropriate and complementary for describing a 3-D system PSF and quantifying the variation of the PSF as a function of the optical parameters. Some critical experimental parameters can be identified with these tools. This is helpful for biologists in defining an acquisition protocol that optimizes the use of the system. Reduction of out-of-focus light is the central task of 3-D microscopy; it is carried out computationally by a deconvolution process. Pre-filtering the images improves the stability of deconvolution results, making them less dependent on the regularization parameter; this helps biologists use the restoration process.
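
    The underlying convolution model, image = object ⊗ PSF (plus noise), is typically inverted iteratively; below is a standard Richardson-Lucy sketch in 2-D (the 3-D case is analogous), offered as a generic example of the deconvolution step rather than the authors' specific pipeline.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(image, psf, iters=30):
        """Richardson-Lucy deconvolution for the model image = object ⊗ PSF."""
        image = image.astype(float)
        est = np.full_like(image, image.mean())         # flat initial estimate
        psf_mirror = psf[::-1, ::-1]
        for _ in range(iters):
            blurred = fftconvolve(est, psf, mode="same")
            ratio = image / np.maximum(blurred, 1e-12)  # guard against /0
            est *= fftconvolve(ratio, psf_mirror, mode="same")
        return est

    # Round trip: blur a point source with a 5x5 box PSF, then restore it.
    obj = np.zeros((32, 32)); obj[16, 16] = 1.0
    psf = np.ones((5, 5)) / 25.0
    restored = richardson_lucy(fftconvolve(obj, psf, mode="same"), psf)
    print(np.unravel_index(restored.argmax(), restored.shape))  # (16, 16)
    ```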

  13. Grid Task Execution

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2007-01-01

    IPG Execution Service is a framework that reliably executes complex jobs on a computational grid, and is part of the IPG service architecture designed to support location-independent computing. The new grid service enables users to describe the platform on which they need a job to run, which allows the service to locate the desired platform, configure it for the required application, and execute the job. After a job is submitted, users can monitor it through periodic notifications, or through queries. Each job consists of a set of tasks that performs actions such as executing applications and managing data. Each task is executed based on a starting condition that is an expression of the states of other tasks. This formulation allows tasks to be executed in parallel, and also allows a user to specify tasks to execute when other tasks succeed, fail, or are canceled. The two core components of the Execution Service are the Task Database, which stores tasks that have been submitted for execution, and the Task Manager, which executes tasks in the proper order, based on the user-specified starting conditions, and avoids overloading local and remote resources while executing tasks.
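
    A toy version of the condition-driven execution model: each task exposes a starting condition over the states of the other tasks, and the manager repeatedly starts whatever has become eligible. All names here are invented for illustration; the actual service adds queues, notifications and remote resource management.

    ```python
    def run_tasks(tasks):
        """tasks maps a name to (condition, action); a condition inspects the
        dict of task states ('pending'/'succeeded'/'failed') and returns True
        when the task may start. Runs until no pending task becomes eligible."""
        states = {name: "pending" for name in tasks}
        progressed = True
        while progressed:
            progressed = False
            for name, (condition, action) in tasks.items():
                if states[name] == "pending" and condition(states):
                    try:
                        action()
                        states[name] = "succeeded"
                    except Exception:
                        states[name] = "failed"
                    progressed = True
        return states

    demo = {
        "stage": (lambda s: True, lambda: None),
        "run": (lambda s: s["stage"] == "succeeded", lambda: None),
        # Cleanup runs whether the main job succeeds or fails.
        "cleanup": (lambda s: s["run"] in ("succeeded", "failed"), lambda: None),
    }
    print(run_tasks(demo))  # all three tasks end 'succeeded'
    ```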

  14. Crew/computer communications study. Volume 1: Final report. [onboard computerized communications system for spacecrews

    NASA Technical Reports Server (NTRS)

    Johannes, J. D.

    1974-01-01

    Techniques, methods, and system requirements are reported for an onboard computerized communications system that provides on-line computing capability during manned space exploration. Communications between man and computer take place by sequential execution of each discrete step of a procedure, by interactive progression through a tree-type structure to initiate tasks or by interactive optimization of a task requiring man to furnish a set of parameters. Effective communication between astronaut and computer utilizes structured vocabulary techniques and a word recognition system.

  15. Semiautomatic computer-aided classification of degenerative lumbar spine disease in magnetic resonance imaging.

    PubMed

    Ruiz-España, Silvia; Arana, Estanislao; Moratal, David

    2015-07-01

    Computer-aided diagnosis (CAD) methods for detecting and classifying lumbar spine disease in Magnetic Resonance Imaging (MRI) can assist radiologists in their decision-making tasks. In this paper, CAD software was developed that is able to classify and quantify spine disease (disc degeneration, herniation and spinal stenosis) in two-dimensional MRI. A set of 52 lumbar discs from 14 patients was used for training and 243 lumbar discs from 53 patients for testing in conventional two-dimensional MRI of the lumbar spine. To classify disc degeneration according to the gold standard, the Pfirrmann classification, a method based on the measurement of disc signal intensity and structure was developed. A Gradient Vector Flow algorithm was used to extract disc shape features and to detect contour abnormalities. Also, a signal intensity method was used for segmenting and detecting spinal stenosis. Novel algorithms were also developed to quantify the severity of these pathologies. Variability was evaluated by kappa (k) and intra-class correlation (ICC) statistics. Segmentation inaccuracy was below 1%. Almost perfect agreement, as measured by the k and ICC statistics, was obtained for all the analyzed pathologies: disc degeneration (k=0.81 with 95% CI=[0.75..0.88]) with a sensitivity of 95.8% and a specificity of 92.6%, disc herniation (k=0.94 with 95% CI=[0.87..1]) with a sensitivity of 60% and a specificity of 87.1%, categorical stenosis (k=0.94 with 95% CI=[0.90..0.98]) and quantitative stenosis (ICC=0.98 with 95% CI=[0.97..0.98]) with a sensitivity of 70% and a specificity of 81.7%. The proposed methods are reproducible and should be considered as a possible alternative to reference standards. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Non-Relative Value Unit-Generating Activities Represent One-Fifth of Academic Neuroradiologist Productivity.

    PubMed

    Wintermark, M; Zeineh, M; Zaharchuk, G; Srivastava, A; Fischbein, N

    2016-07-01

    A neuroradiologist's activity includes many tasks beyond interpreting relative value unit-generating imaging studies. Our aim was to test a simple method to record and quantify the non-relative value unit-generating clinical activity represented by consults and clinical conferences, including tumor boards. Four full-time neuroradiologists, each working an average of 50% clinical and 50% academic activity, systematically recorded all the non-relative value unit-generating consults and conferences in which they were involved during 3 months, using a simple Web-based application accessible from smartphones, tablets, or computers. The number and type of imaging studies they interpreted during the same period, and the associated relative value units, were extracted from our billing system. During the 3 months, the 4 neuroradiologists, working an average of 50% clinical activity, interpreted 4241 relative value unit-generating imaging studies, representing 8152 work relative value units. During the same period, they recorded 792 non-relative value unit-generating study reviews as part of consults and conferences (not including reading room consults), representing 19% of the interpreted relative value unit-generating imaging studies. We propose a simple Web-based smartphone app to record and quantify non-relative value unit-generating activities, including consults, clinical conferences, and tumor boards. The quantification of non-relative value unit-generating activities is paramount in this time of a paradigm shift from volume to value. It also represents an important tool for determining staffing levels, which cannot be determined on the basis of relative value units only, considering the importance of the time spent by radiologists on non-relative value unit-generating activities. It may also influence payment models from medical centers to radiology departments or practices. © 2016 by American Journal of Neuroradiology.

  17. Inter-rater reliability of kinesthetic measurements with the KINARM robotic exoskeleton.

    PubMed

    Semrau, Jennifer A; Herter, Troy M; Scott, Stephen H; Dukelow, Sean P

    2017-05-22

    Kinesthesia (sense of limb movement) has been extremely difficult to measure objectively, especially in individuals who have survived a stroke. The development of valid and reliable measurements for proprioception is important to developing a better understanding of proprioceptive impairments after stroke and their impact on the ability to perform daily activities. We recently developed a robotic task to evaluate kinesthetic deficits after stroke and found that the majority (~60%) of stroke survivors exhibit significant deficits in kinesthesia within the first 10 days post-stroke. Here we aim to determine the inter-rater reliability of this robotic kinesthetic matching task. Twenty-five neurologically intact control subjects and 15 individuals with first-time stroke were evaluated on a robotic kinesthetic matching task (KIN). Subjects sat in a robotic exoskeleton with their arms supported against gravity. In the KIN task, the robot moved the subjects' stroke-affected arm at a preset speed, direction and distance. As soon as subjects felt the robot begin to move their affected arm, they matched the robot movement with the unaffected arm. Subjects were tested in two sessions on the KIN task: initial session and then a second session (within an average of 18.2 ± 13.8 h of the initial session for stroke subjects), which were supervised by different technicians. The task was performed both with and without the use of vision in both sessions. We evaluated intra-class correlations of spatial and temporal parameters derived from the KIN task to determine the reliability of the robotic task. We evaluated 8 spatial and temporal parameters that quantify kinesthetic behavior. We found that the parameters exhibited moderate to high intra-class correlations between the initial and retest conditions (Range, r-value = [0.53-0.97]). The robotic KIN task exhibited good inter-rater reliability. This validates the KIN task as a reliable, objective method for quantifying kinesthesia after stroke.
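
    The reliability statistic itself is mechanical: one plausible reading of the reported intra-class correlations is the two-way random-effects, single-measure ICC(2,1) of Shrout and Fleiss, sketched below with toy session scores.

    ```python
    import numpy as np

    def icc_2_1(x):
        """ICC(2,1): two-way random effects, absolute agreement, single
        measure. x is an (n_subjects, k_sessions) array of scores."""
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # sessions
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
        ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                     + k * (ms_cols - ms_err) / n)

    scores = np.array([[4.1, 4.3], [2.0, 2.2], [3.5, 3.4], [5.0, 4.8]])
    print(round(icc_2_1(scores), 3))  # ~0.99: the two sessions agree closely
    ```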

  18. CREASE 6.0 Catalog of Resources for Education in Ada and Software Engineering

    DTIC Science & Technology

    1992-02-01

    Keywords: Programming, Software Engineering, Strong Typing, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J., Programming in Ada, 3rd ed., Addison-Wesley. ...Ada. Concepts: Abstract Data Types, Management Overview, Package, Real-Time Programming, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J.

  19. Strategy generalization across orientation tasks: testing a computational cognitive model.

    PubMed

    Gunzelmann, Glenn

    2008-07-08

    Humans use their spatial information processing abilities flexibly to facilitate problem solving and decision making in a variety of tasks. This article explores the question of whether a general strategy can be adapted for performing two different spatial orientation tasks by testing the predictions of a computational cognitive model. Human performance was measured on an orientation task requiring participants to identify the location of a target either on a map (find-on-map) or within an egocentric view of a space (find-in-scene). A general strategy instantiated in a computational cognitive model of the find-on-map task, based on the results from Gunzelmann and Anderson (2006), was adapted to perform both tasks and used to generate performance predictions for a new study. The qualitative fit of the model to the human data supports the view that participants were able to tailor a general strategy to the requirements of particular spatial tasks. The quantitative differences between the predictions of the model and the performance of human participants in the new experiment expose individual differences in sample populations. The model provides a means of accounting for those differences and a framework for understanding how human spatial abilities are applied to naturalistic spatial tasks that involve reasoning with maps. 2008 Cognitive Science Society, Inc.

  20. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at

    PubMed Central

    Bukowski, Henryk; Hietanen, Jari K.; Samson, Dana

    2015-01-01

    Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations. PMID:26924936

  1. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at.

    PubMed

    Bukowski, Henryk; Hietanen, Jari K; Samson, Dana

    2015-09-14

    Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations.

  2. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investments in new super-computers, the energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists launch the computing task. Next, data are atomized and distributed to computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built as proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of the participants and the status of tasks on each client node. A group of users has been invited to test functions such as the forums, blogs, and computing resource monitoring.

  3. Do monkeys choose to choose?

    PubMed

    Perdue, Bonnie M; Evans, Theodore A; Washburn, David A; Rumbaugh, Duane M; Beran, Michael J

    2014-06-01

    Both empirical and anecdotal evidence supports the idea that choice is preferred by humans. Previous research has demonstrated that this preference extends to nonhuman animals, but it remains largely unknown whether animals will actively seek out or prefer opportunities to choose. Here we explored the issue of whether capuchin and rhesus monkeys choose to choose. We used a modified version of the SELECT task, a computer program in which monkeys can choose the order of completion of various psychomotor and cognitive tasks. In the present experiments, each trial began with a choice between two icons, one of which allowed the monkey to select the order of task completion, and the other of which led to the assignment of a task order by the computer. In either case, subjects still had to complete the same number of tasks and the same number of task trials. The tasks were relatively easy, and the monkeys responded correctly on most trials. Thus, global reinforcement rates were approximately equated across conditions. The only difference was whether the monkey chose the task order or it was assigned, thus isolating the act of choosing. Given sufficient experience with the task icons, all monkeys showed a significant preference for choice when the alternative was a randomly assigned order of tasks. To a lesser extent, some of the monkeys maintained a preference for choice over a preferred, but computer-assigned, task order that was yoked to their own previous choice selection. The results indicated that monkeys prefer to choose when all other aspects of the task are equated.

  4. Active Nodal Task Seeking for High-Performance, Ultra-Dependable Computing

    DTIC Science & Technology

    1994-07-01

    implementation. Figure 1 shows a hardware organization of ANTS: stand-alone computing nodes interconnected by buses. 2.1 Run Time Partitioning The...nodes in 14 respond to changing loads [27] or system reconfiguration [26]. Existing techniques are all source-initiated or server-initiated [27]. 5.1...short-running task segments. The task segments must be short-running in order that processors will become available often enough to satisfy changing

  5. Physical risk factors identification based on body sensor network combined to videotaping.

    PubMed

    Vignais, Nicolas; Bernard, Fabien; Touvenot, Gérard; Sagot, Jean-Claude

    2017-11-01

    The aim of this study was to perform an ergonomic analysis of a material handling task by combining a subtask video analysis with a RULA computation, implemented continuously through a motion capture system combining inertial sensors and electrogoniometers. Five workers participated in the experiment. Seven inertial measurement units, placed on the worker's upper body (pelvis, thorax, head, arms, forearms), were used with a biomechanical model of the upper body to continuously provide trunk, neck, shoulder and elbow joint angles. Wrist joint angles were derived from electrogoniometers synchronized with the inertial measurement system. The worker's activity was simultaneously recorded on video. During post-processing, joint angles were used as inputs to a computationally implemented ergonomic evaluation based on the RULA method. Consequently, a RULA score was calculated at each time step to characterize the risk of exposure of the upper body (right and left sides). Local risk scores were also computed to identify the anatomical origin of the exposure. Moreover, the video-recorded work activity was time-studied in order to classify and quantify all subtasks involved in the task. Results showed that mean RULA scores were at high risk for all participants (6 and 6.2 for the right and left sides, respectively). A temporal analysis demonstrated that workers spent most of the work time at a RULA score of 7 (right: 49.19 ± 35.27%; left: 55.5 ± 29.69%). Mean local scores revealed that the joints most exposed during the task were the elbows, lower arms, wrists and hands. Elbows and lower arms were indeed at a high level of risk during the total time of a work cycle (100% for right and left sides). Wrists and hands were also exposed to a risky level for much of the period of work (right: 82.13 ± 7.46%; left: 77.85 ± 12.46%). Concerning the subtask analysis, the subtasks called 'snow thrower', 'opening the vacuum sealer', 'cleaning' and 'storing' were identified as the most awkward for the right and left sides, given mean RULA scores and percentages of time spent at risky levels. The analysis of these results made it possible to suggest ergonomic recommendations for the redesign of the workstation. Contributions of the proposed innovative system dedicated to physical ergonomic assessment are further discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
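
    A minimal sketch of the continuous scoring idea, assuming simplified angle thresholds rather than the actual RULA scoring tables: each time step of a joint-angle series is mapped to a coarse risk score, and the percent of cycle time at a risky level is reported.

        import numpy as np

        # Illustrative only: the thresholds below are simplified placeholders,
        # not the RULA scoring tables used in the study.
        def local_risk_scores(angles_deg, thresholds=(20.0, 45.0, 90.0)):
            """Score 1-4 at each time step from the magnitude of joint deviation."""
            return np.digitize(np.abs(angles_deg), thresholds) + 1

        def percent_time_at_risk(scores, risky_level=3):
            """Percent of time steps at or above the risky score level."""
            return 100.0 * np.mean(scores >= risky_level)

        elbow = np.random.default_rng(0).uniform(-120, 120, size=1000)  # fake joint angles
        scores = local_risk_scores(elbow)
        print(f"time at risky level: {percent_time_at_risk(scores):.1f}%")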

  6. Neural control of finger movement via intracortical brain-machine interface

    NASA Astrophysics Data System (ADS)

    Irwin, Z. T.; Schroeder, K. E.; Vu, P. P.; Bullard, A. J.; Tat, D. M.; Nu, C. S.; Vaskov, A.; Nason, S. R.; Thompson, D. E.; Bentley, J. N.; Patil, P. G.; Chestek, C. A.

    2017-12-01

    Objective. Intracortical brain-machine interfaces (BMIs) are a promising source of prosthesis control signals for individuals with severe motor disabilities. Previous BMI studies have primarily focused on predicting and controlling whole-arm movements; precise control of hand kinematics, however, has not been fully demonstrated. Here, we investigate the continuous decoding of precise finger movements in rhesus macaques. Approach. In order to elicit precise and repeatable finger movements, we have developed a novel behavioral task paradigm which requires the subject to acquire virtual fingertip position targets. In the physical control condition, four rhesus macaques performed this task by moving all four fingers together in order to acquire a single target. This movement was equivalent to controlling the aperture of a power grasp. During this task performance, we recorded neural spikes from intracortical electrode arrays in primary motor cortex. Main results. Using a standard Kalman filter, we could reconstruct continuous finger movement offline with an average correlation of ρ = 0.78 between actual and predicted position across four rhesus macaques. For two of the monkeys, this movement prediction was performed in real-time to enable direct brain control of the virtual hand. Compared to physical control, neural control performance was slightly degraded; however, the monkeys were still able to successfully perform the task with an average target acquisition rate of 83.1%. The monkeys' ability to arbitrarily specify fingertip position was also quantified using an information throughput metric. During brain control task performance, the monkeys achieved an average 1.01 bits s^-1 throughput, similar to that achieved in previous studies which decoded upper-arm movements to control computer cursors using a standard Kalman filter. Significance. This is, to our knowledge, the first demonstration of brain control of finger-level fine motor skills. We believe that these results represent an important step towards full and dexterous control of neural prosthetic devices.
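
    The standard Kalman filter used for such decoding can be sketched as follows; the model matrices here are illustrative placeholders, whereas in the study they would be fit to training data relating kinematics to recorded firing rates.

        import numpy as np

        def kalman_step(x, P, z, A, W, C, Q):
            # Predict: propagate the kinematic state and its uncertainty.
            x_pred = A @ x
            P_pred = A @ P @ A.T + W
            # Update: correct the prediction using the observed firing rates z.
            S = C @ P_pred @ C.T + Q
            K = P_pred @ C.T @ np.linalg.inv(S)          # Kalman gain
            x_new = x_pred + K @ (z - C @ x_pred)
            P_new = (np.eye(len(x)) - K @ C) @ P_pred
            return x_new, P_new

        # Toy dimensions: 2-D state (position, velocity), 10 recorded channels.
        rng = np.random.default_rng(1)
        A = np.array([[1.0, 0.1], [0.0, 0.9]]); W = 0.01 * np.eye(2)
        C = rng.normal(size=(10, 2));            Q = 0.1 * np.eye(10)

        x_true = np.array([1.0, 0.0])                    # simulated true kinematics
        x, P = np.zeros(2), np.eye(2)
        for _ in range(50):
            x_true = A @ x_true
            z = C @ x_true + rng.normal(scale=0.3, size=10)  # simulated spikes
            x, P = kalman_step(x, P, z, A, W, C, Q)
        print("decoded position:", x[0], "true position:", x_true[0])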

  7. In vitro quantification of the performance of model-based mono-planar and bi-planar fluoroscopy for 3D joint kinematics estimation.

    PubMed

    Tersi, Luca; Barré, Arnaud; Fantozzi, Silvia; Stagni, Rita

    2013-03-01

    Model-based mono-planar and bi-planar 3D fluoroscopy methods can quantify intact joint kinematics with different performance/cost trade-offs. The aim of this study was to compare the performances of mono- and bi-planar setups to a marker-based gold standard during dynamic phantom knee acquisitions. Absolute pose errors for in-plane parameters were lower than 0.6 mm or 0.6° for both mono- and bi-planar setups. Mono-planar setups were critical in quantifying the out-of-plane translation (error < 6.5 mm), and bi-planar setups in quantifying the rotation about the bone longitudinal axis (error < 1.3°). These errors propagated to joint angles and translations differently depending on the alignment of the anatomical axes and the fluoroscopic reference frames. Internal-external rotation was the least accurate angle both with mono-planar (error < 4.4°) and bi-planar (error < 1.7°) setups, due to bone longitudinal symmetries. Results highlighted that accuracy for mono-planar in-plane pose parameters is comparable to bi-planar, but with halved computational costs, halved segmentation time and halved ionizing radiation dose. Bi-planar analysis better compensated for the out-of-plane uncertainty, which propagates to relative kinematics differently depending on the setup. To realize its full benefits, the motion task to be investigated should be designed to keep the joint inside the visible volume, which introduces constraints relative to mono-planar analysis.

  8. Non-Evolutionary Algorithms for Scheduling Dependent Tasks in Distributed Heterogeneous Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wayne F. Boyer; Gurdeep S. Hura

    2005-09-01

    The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be NP-hard. In a DHC system, task execution time depends on the machine to which a task is assigned, and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of interdependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. A randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, and requires less memory and fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
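
    A minimal sketch of the two RS ingredients, under illustrative data structures: a randomized topological sort that respects the precedence DAG, and a greedy mapping from the resulting order to a schedule with machine-dependent execution times. The heuristic mapping used in the paper may differ in detail.

        import random

        def random_topological_order(succ, n_tasks, rng):
            """Random task order that respects the precedence DAG."""
            indeg = [0] * n_tasks
            for u in succ:
                for v in succ[u]:
                    indeg[v] += 1
            ready = [t for t in range(n_tasks) if indeg[t] == 0]
            order = []
            while ready:
                t = rng.choice(ready)          # random pick keeps precedence valid
                ready.remove(t)
                order.append(t)
                for v in succ.get(t, ()):
                    indeg[v] -= 1
                    if indeg[v] == 0:
                        ready.append(v)
            return order

        def makespan(order, succ, exec_time, n_machines):
            """Greedy mapping: start each task as early as possible on the best machine."""
            pred = {t: [] for t in range(len(exec_time))}
            for u in succ:
                for v in succ[u]:
                    pred[v].append(u)
            free_at = [0.0] * n_machines
            finish = {}
            for t in order:
                ready = max((finish[p] for p in pred[t]), default=0.0)
                m = min(range(n_machines),
                        key=lambda i: max(free_at[i], ready) + exec_time[t][i])
                start = max(free_at[m], ready)
                finish[t] = start + exec_time[t][m]   # machine-dependent duration
                free_at[m] = finish[t]
            return max(finish.values())

        rng = random.Random(42)
        succ = {0: [2], 1: [2], 2: [3]}               # DAG: 0,1 -> 2 -> 3
        times = [[3, 4], [2, 5], [4, 2], [1, 1]]      # rows: tasks; cols: machines
        best = min(makespan(random_topological_order(succ, 4, rng), succ, times, 2)
                   for _ in range(100))
        print("best makespan (Cmax) over 100 random orderings:", best)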

  9. A Decentralized Eigenvalue Computation Method for Spectrum Sensing Based on Average Consensus

    NASA Astrophysics Data System (ADS)

    Mohammadi, Jafar; Limmer, Steffen; Stańczak, Sławomir

    2016-07-01

    This paper considers eigenvalue estimation for the decentralized inference problem in spectrum sensing. We propose a decentralized eigenvalue computation algorithm based on the power method, referred to as the generalized power method (GPM); it is capable of estimating the eigenvalues of a given covariance matrix under certain conditions. Furthermore, we have developed a decentralized implementation of GPM by splitting the iterative operations into local and global computation tasks. The global tasks require data exchange to be performed among the nodes. For this task, we apply an average consensus algorithm to efficiently perform the global computations. As a special case, we consider a structured graph that is a tree with clusters of nodes at its leaves. For an accelerated distributed implementation, we propose to use computation over the multiple access channel (CoMAC) as a building block of the algorithm. Numerical simulations are provided to illustrate the performance of the two algorithms.
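
    The local/global split can be illustrated with a plain power iteration in which the only global quantity is a norm, computable by average consensus; the code below simulates that consensus step with an exact average, and is an illustrative sketch rather than the GPM of the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 6                                        # number of nodes / matrix dimension
        A = rng.normal(size=(n, n))
        R = A @ A.T / n                              # toy covariance matrix

        x = rng.normal(size=n)
        for _ in range(200):
            # Local task: node i computes the inner product of its row of R with x.
            y = np.array([R[i] @ x for i in range(n)])
            # Global task: normalization needs ||y||; each node contributes y_i**2,
            # and average consensus yields the mean, so n * mean gives ||y||^2.
            norm = np.sqrt(n * np.mean(y ** 2))
            x = y / norm

        lam = x @ R @ x                              # Rayleigh quotient estimate
        print("estimated:", lam, "exact:", np.max(np.linalg.eigvalsh(R)))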

  10. An opportunity cost model of subjective effort and task performance

    PubMed Central

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775

  11. Citizen science: A new perspective to advance spatial pattern evaluation in hydrology.

    PubMed

    Koch, Julian; Stisen, Simon

    2017-01-01

    Citizen science opens new pathways that can complement traditional scientific practice. Intuition and reasoning often make humans more effective than computer algorithms in various realms of problem solving. In particular, a simple visual comparison of spatial patterns is a task where humans are often considered more reliable than computer algorithms. In practice, however, science still largely depends on computer-based solutions, which bring benefits such as speed and the possibility of automating processes. Human vision can nevertheless be harnessed to evaluate the reliability of algorithms that are tailored to quantify similarity in spatial patterns. We established a citizen science project that employs human perception to rate similarity and dissimilarity between simulated spatial patterns of several scenarios of a hydrological catchment model. In total, more than 2,500 volunteers provided over 43,000 classifications of 1,095 individual subjects. We investigate the capability of a set of advanced statistical performance metrics to mimic the human perception to distinguish between similarity and dissimilarity. Results suggest that more complex metrics are not necessarily better at emulating human perception, but they clearly provide auxiliary information that is valuable for model diagnostics. The metrics differ markedly in their ability to unambiguously distinguish between similar and dissimilar patterns, which is regarded as a key feature of a reliable metric. The obtained dataset can provide an insightful benchmark for the community to test novel spatial metrics.
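
    As a flavour of what such metrics compute, the sketch below rates a pair of synthetic 2-D patterns with two of the simplest candidates (RMSE and cell-wise Pearson correlation); the advanced metrics evaluated in the study go well beyond these, so this is purely illustrative.

        import numpy as np

        def rmse(a, b):
            """Root-mean-square difference over all grid cells."""
            return float(np.sqrt(np.mean((a - b) ** 2)))

        def pattern_correlation(a, b):
            """Pearson correlation over all grid cells (ignores spatial structure)."""
            return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

        rng = np.random.default_rng(3)
        base = rng.normal(size=(50, 50))                   # one simulated pattern
        similar = base + 0.1 * rng.normal(size=(50, 50))   # a near-identical scenario
        print(rmse(base, similar), pattern_correlation(base, similar))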

  12. Introducing Co-Activation Pattern Metrics to Quantify Spontaneous Brain Network Dynamics

    PubMed Central

    Chen, Jingyuan E.; Chang, Catie; Greicius, Michael D.; Glover, Gary H.

    2015-01-01

    Recently, fMRI researchers have begun to realize that the brain's intrinsic network patterns may undergo substantial changes during a single resting state (RS) scan. However, despite the growing interest in brain dynamics, metrics that can quantify the variability of network patterns are still quite limited. Here, we first introduce various quantification metrics based on the extension of co-activation pattern (CAP) analysis, a recently proposed point-process analysis that tracks state alternations at each individual time frame and relies on very few assumptions; then apply these proposed metrics to quantify changes of brain dynamics during a sustained 2-back working memory (WM) task compared to rest. We focus on the functional connectivity of two prominent RS networks, the default-mode network (DMN) and executive control network (ECN). We first demonstrate less variability of global Pearson correlations with respect to the two chosen networks using a sliding-window approach during WM task compared to rest; then we show that the macroscopic decrease in variations in correlations during a WM task is also well characterized by the combined effect of a reduced number of dominant CAPs, increased spatial consistency across CAPs, and increased fractional contributions of a few dominant CAPs. These CAP metrics may provide alternative and more straightforward quantitative means of characterizing brain network dynamics than time-windowed correlation analyses. PMID:25662866
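
    The core CAP recipe described above can be sketched as follows, assuming synthetic data and a generic k-means step: collect the volumes at time frames where a seed signal is supra-threshold, cluster them into CAPs, and report each CAP's fractional contribution. The authors' preprocessing and metric definitions are richer than this sketch.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        n_frames, n_voxels = 300, 500
        data = rng.normal(size=(n_frames, n_voxels))    # fake fMRI time series
        seed = data[:, 0]                               # seed-region signal

        # Point-process step: keep only frames where the seed is supra-threshold.
        frames = data[seed > np.percentile(seed, 85)]

        # Cluster the retained frames into k co-activation patterns (CAPs).
        k = 4
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(frames)

        # Fractional contribution of each CAP to the supra-threshold frames.
        fractions = np.bincount(labels, minlength=k) / len(labels)
        print("fractional contribution of each CAP:", np.round(fractions, 2))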

  13. Geo-information processing service composition for concurrent tasks: A QoS-aware game theory approach

    NASA Astrophysics Data System (ADS)

    Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong

    2012-10-01

    Typical characteristics of remote sensing applications are concurrent tasks, such as those found in disaster rapid response. Existing approaches to composing a geographical information processing service chain search for an optimisation solution for each chain in isolation, in what can be deemed a 'selfish' way. This leads to conflicts amongst concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model to analyse the competitive relationships between tasks is proposed. A best response function is used to ensure that each task maintains utility optimisation by considering the composition strategies of other tasks and quantifying the conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and maximise the utility of all tasks under concurrent task conditions. Theoretical analyses and experiments showed that the newly proposed method, when compared to existing service composition methods, has better practical utility in all tasks.
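
    Best-response dynamics of this kind are easy to sketch as a congestion game: in the toy model below, each task repeatedly picks the service instance minimizing its own latency given the others' choices, and iteration stops at a pure Nash equilibrium. The cost model is an illustrative assumption, not the paper's QoS model.

        def best_response_equilibrium(n_tasks=4, n_services=3,
                                      base_cost=(1.0, 1.2, 2.0), max_rounds=100):
            choice = [0] * n_tasks                        # initial strategy profile
            for _ in range(max_rounds):
                changed = False
                for t in range(n_tasks):
                    # Load on each service from the *other* tasks.
                    load = [sum(1 for u, c in enumerate(choice) if c == s and u != t)
                            for s in range(n_services)]
                    # Congestion-dependent cost: base latency scaled by co-users.
                    best = min(range(n_services),
                               key=lambda s: base_cost[s] * (1 + load[s]))
                    if best != choice[t]:
                        choice[t], changed = best, True
                if not changed:                           # no task wants to deviate:
                    return choice                         # Nash equilibrium reached
            return choice

        print("equilibrium assignment:", best_response_equilibrium())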

  14. SHC Project 3.63, Task 2, Beneficial Use of Waste Materials ...

    EPA Pesticide Factsheets

    SHC Project 3.63, Task 2, “Beneficial Use of Waste Materials”, is designed to conduct research and analyses to characterize and quantify the risks and benefits of using or reusing waste materials. There are 6 primary research areas in Task 2 that cover a broad spectrum of topics germane to the beneficial use of waste materials and address Agency, Office, Region and other client needs. The 6 research areas are: 1) Materials Recovery Technology, 2) Beneficial Use of Materials Optimization, 3) Novel Products from Waste Materials, 4) Land Application of Biosolids, 5) Soil Remediation Amendments and 6) Improved Leaching Methods for More Accurate Prediction of Environmental Release of Metals. The objectives of each research area, their intended products and progress to date will be presented. The products of this Task will enable communities and the Agency to better protect and enhance human health, well-being and the environment for current and future generations, through the reduction in material consumption, reuse, and recycling of materials. This presentation is designed to convey the rationale, purpose and planned research of EPA's Safe and Healthy Communities (SHC) National Research Program Project 3.63 (Sustainable Materials Management) Task 2.

  15. Computer Anxiety: How to Measure It?

    ERIC Educational Resources Information Center

    McPherson, Bill

    1997-01-01

    Provides an overview of five scales that are used to measure computer anxiety: Computer Anxiety Index, Computer Anxiety Scale, Computer Attitude Scale, Attitudes toward Computers, and Blombert-Erickson-Lowrey Computer Attitude Task. Includes background information and scale specifics. (JOW)

  16. Integrated command, control, communication and computation system design study. Summary of tasks performed

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A summary of tasks performed on an integrated command, control, communication, and computation system design study is given. The Tracking and Data Relay Satellite System command and control system study, an automated real-time operations study, and image processing work are discussed.

  17. Student Computer Use in Selected Undergraduate Agriculture Courses: An Examination of Required Tasks.

    ERIC Educational Resources Information Center

    Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.

    2000-01-01

    Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)

  18. Cognitive Support for Learning Computer-Based Tasks Using Animated Demonstration

    ERIC Educational Resources Information Center

    Chen, Chun-Ying

    2016-01-01

    This study investigated the influence of cognitive support for learning computer-based tasks using animated demonstration (AD) on instructional efficiency. Cognitive support included (1) segmentation and learner control introducing interactive devices that allow content sequencing through a navigational menu, and content pacing through stop and…

  19. Embodiment of Learning in Electro-Optical Signal Processors

    NASA Astrophysics Data System (ADS)

    Hermans, Michiel; Antonik, Piotr; Haelterman, Marc; Massar, Serge

    2016-09-01

    Delay-coupled electro-optical systems have received much attention for their dynamical properties and their potential use in signal processing. In particular, it has recently been demonstrated, using the artificial intelligence algorithm known as reservoir computing, that photonic implementations of such systems solve complex tasks such as speech recognition. Here, we show how the backpropagation algorithm can be physically implemented on the same electro-optical delay-coupled architecture used for computation with only minor changes to the original design. We find that, compared to when the backpropagation algorithm is not used, the error rate of the resulting computing device, evaluated on three benchmark tasks, decreases considerably. This demonstrates that electro-optical analog computers can embody a large part of their own training process, allowing them to be applied to new, more difficult tasks.
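
    For intuition, the reservoir-computing setup these photonic systems implement can be sketched with a tiny software echo-state network in which only the linear readout is trained; the physical backpropagation refinement reported in the paper is not shown. All sizes and the toy one-step-memory task are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        n_res, n_in, T = 100, 1, 1000
        W = rng.normal(scale=0.1, size=(n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state spectral scaling
        W_in = rng.normal(scale=0.5, size=(n_res, n_in))

        u = rng.uniform(-1, 1, size=(T, n_in))            # input signal
        y_target = np.roll(u[:, 0], 1)                    # toy task: 1-step memory

        # Drive the fixed, random reservoir and record its states.
        x = np.zeros(n_res)
        states = np.zeros((T, n_res))
        for t in range(T):
            x = np.tanh(W @ x + W_in @ u[t])
            states[t] = x

        # Train only the linear readout with ridge regression.
        lam = 1e-6
        W_out = np.linalg.solve(states.T @ states + lam * np.eye(n_res),
                                states.T @ y_target)
        print("train MSE:", np.mean((states @ W_out - y_target) ** 2))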

  20. Job-shop scheduling applied to computer vision

    NASA Astrophysics Data System (ADS)

    Sebastian y Zuniga, Jose M.; Torres-Medina, Fernando; Aracil, Rafael; Reinoso, Oscar; Jimenez, Luis M.; Garcia, David

    1997-09-01

    This paper presents a method for minimizing the total elapsed time spent by n tasks running on m different processors working in parallel. The developed algorithm not only minimizes the total elapsed time but also reduces the idle time and the waiting time of in-process tasks. This condition is very important in some applications of computer vision in which the time to finish the total process is particularly critical -- quality control in industrial inspection, real-time computer vision, guided robots. The scheduling algorithm is based on two matrices obtained from the precedence relationships between tasks, and on the data derived from those matrices. The scheduling algorithm has been tested in an application of quality control using computer vision. The results obtained were satisfactory when applying different image processing algorithms.

  1. Embodiment of Learning in Electro-Optical Signal Processors.

    PubMed

    Hermans, Michiel; Antonik, Piotr; Haelterman, Marc; Massar, Serge

    2016-09-16

    Delay-coupled electro-optical systems have received much attention for their dynamical properties and their potential use in signal processing. In particular, it has recently been demonstrated, using the artificial intelligence algorithm known as reservoir computing, that photonic implementations of such systems solve complex tasks such as speech recognition. Here, we show how the backpropagation algorithm can be physically implemented on the same electro-optical delay-coupled architecture used for computation with only minor changes to the original design. We find that, compared to when the backpropagation algorithm is not used, the error rate of the resulting computing device, evaluated on three benchmark tasks, decreases considerably. This demonstrates that electro-optical analog computers can embody a large part of their own training process, allowing them to be applied to new, more difficult tasks.

  2. A design fix to supervisory control for fault-tolerant scheduling of real-time multiprocessor systems with aperiodic tasks

    NASA Astrophysics Data System (ADS)

    Devaraj, Rajesh; Sarkar, Arnab; Biswas, Santosh

    2015-11-01

    In the article 'Supervisory control for fault-tolerant scheduling of real-time multiprocessor systems with aperiodic tasks', Park and Cho presented a systematic way of computing a largest fault-tolerant and schedulable language that provides information on whether the scheduler (i.e., supervisor) should accept or reject a newly arrived aperiodic task. The computation of such a language depends mainly on the task execution model presented in their paper. However, the task execution model is unable to capture the situation in which a processor fault occurs even before a task has arrived. Consequently, under a task execution model that does not capture this fact, a task may be assigned for execution on a faulty processor. This problem is illustrated with an appropriate example. The task execution model of Park and Cho is then modified to enforce the requirement that no task is assigned for execution on a faulty processor.

  3. IGT-Open: An open-source, computerized version of the Iowa Gambling Task.

    PubMed

    Dancy, Christopher L; Ritter, Frank E

    2017-06-01

    The Iowa Gambling Task (IGT) is commonly used to understand the processes involved in decision-making. Though the task was originally run without a computer, using a computerized version of the task has become typical. These computerized versions of the IGT are useful, because they can make the task more standardized across studies and allow for the task to be used in environments where a physical version of the task may be difficult or impossible to use (e.g., while collecting brain imaging data). Though these computerized versions of the IGT have been useful for experimentation, having multiple software implementations of the task could present reliability issues. We present an open-source software version of the Iowa Gambling Task (called IGT-Open) that allows for millisecond visual presentation accuracy and is freely available to be used and modified. This software has been used to collect data from human subjects and also has been used to run model-based simulations with computational process models developed to run in the ACT-R architecture.

  4. Slot-like capacity and resource-like coding in a neural model of multiple-item working memory.

    PubMed

    Standage, Dominic; Pare, Martin

    2018-06-27

    For the past decade, research on the storage limitations of working memory has been dominated by two fundamentally different hypotheses. On the one hand, the contents of working memory may be stored in a limited number of 'slots', each with a fixed resolution. On the other hand, any number of items may be stored, but with decreasing resolution. These two hypotheses have been invaluable in characterizing the computational structure of working memory, but neither provides a complete account of the available experimental data, nor speaks to the neural basis of the limitations it characterizes. To address these shortcomings, we simulated a multiple-item working memory task with a cortical network model, the cellular resolution of which allowed us to quantify the coding fidelity of memoranda as a function of memory load, as measured by the discriminability, regularity and reliability of simulated neural spiking. Our simulations account for a wealth of neural and behavioural data from human and non-human primate studies, and they demonstrate that feedback inhibition lowers both capacity and coding fidelity. Because the strength of inhibition scales with the number of items stored by the network, increasing this number progressively lowers fidelity until capacity is reached. Crucially, the model makes specific, testable predictions for neural activity on multiple-item working memory tasks.

  5. A Direct Brain-to-Brain Interface in Humans

    PubMed Central

    Rao, Rajesh P. N.; Stocco, Andrea; Bryan, Matthew; Sarma, Devapratim; Youngquist, Tiffany M.; Wu, Joseph; Prat, Chantel S.

    2014-01-01

    We describe the first direct brain-to-brain interface in humans and present results from experiments involving six different subjects. Our non-invasive interface, demonstrated originally in August 2013, combines electroencephalography (EEG) for recording brain signals with transcranial magnetic stimulation (TMS) for delivering information to the brain. We illustrate our method using a visuomotor task in which two humans must cooperate through direct brain-to-brain communication to achieve a desired goal in a computer game. The brain-to-brain interface detects motor imagery in EEG signals recorded from one subject (the “sender”) and transmits this information over the internet to the motor cortex region of a second subject (the “receiver”). This allows the sender to cause a desired motor response in the receiver (a press on a touchpad) via TMS. We quantify the performance of the brain-to-brain interface in terms of the amount of information transmitted as well as the accuracies attained in (1) decoding the sender’s signals, (2) generating a motor response from the receiver upon stimulation, and (3) achieving the overall goal in the cooperative visuomotor task. Our results provide evidence for a rudimentary form of direct information transmission from one human brain to another using non-invasive means. PMID:25372285

  6. Beta Oscillatory Dynamics in the Prefrontal and Superior Temporal Cortices Predict Spatial Working Memory Performance.

    PubMed

    Proskovec, Amy L; Wiesman, Alex I; Heinrichs-Graham, Elizabeth; Wilson, Tony W

    2018-05-31

    The oscillatory dynamics serving spatial working memory (SWM), and how such dynamics relate to performance, are poorly understood. To address these topics, the present study recruited 22 healthy adults to perform a SWM task during magnetoencephalography (MEG). The resulting MEG data were transformed into the time-frequency domain, and significant oscillatory responses were imaged using a beamformer. Voxel time series data were extracted from the cluster peaks to quantify the dynamics, while whole-brain partial correlation maps were computed to identify regions where oscillatory strength varied with accuracy on the SWM task. The results indicated transient theta oscillations in spatially distinct subregions of the prefrontal cortices at the onset of encoding and maintenance, which may underlie selection of goal-relevant information. Additionally, strong and persistent decreases in alpha and beta oscillations were observed throughout encoding and maintenance in parietal, temporal, and occipital regions, which could serve sustained attention and maintenance processes during SWM performance. The neuro-behavioral correlations revealed that beta activity within left dorsolateral prefrontal control regions and bilateral superior temporal integration regions was negatively correlated with SWM accuracy. Notably, this is the first study to employ a whole-brain approach to significantly link neural oscillations to behavioral performance in the context of SWM.

  7. Evaluation of various mental task combinations for near-infrared spectroscopy-based brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Hwang, Han-Jeong; Lim, Jeong-Hwan; Kim, Do-Won; Im, Chang-Hwan

    2014-07-01

    A number of recent studies have demonstrated that near-infrared spectroscopy (NIRS) is a promising neuroimaging modality for brain-computer interfaces (BCIs). So far, most NIRS-based BCI studies have focused on enhancing the accuracy of the classification of different mental tasks. In the present study, we evaluated the performances of a variety of mental task combinations in order to determine the mental task pairs that are best suited for customized NIRS-based BCIs. To this end, we recorded event-related hemodynamic responses while seven participants performed eight different mental tasks. Classification accuracies were then estimated for all possible pairs of the eight mental tasks (8C2 = 28). Based on this analysis, mental task combinations with relatively high classification accuracies frequently included the following three mental tasks: "mental multiplication," "mental rotation," and "right-hand motor imagery." Specifically, mental task combinations consisting of two of these three mental tasks showed the highest mean classification accuracies. It is expected that our results will be a useful reference to reduce the time needed for preliminary tests when discovering individual-specific mental task combinations.
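
    Enumerating and scoring the pairs is straightforward; in the sketch below the accuracy function is a stub standing in for the study's cross-validated classifier, and the five task names beyond the three quoted above are invented placeholders.

        from itertools import combinations

        # Eight mental tasks give 8C2 = 28 candidate pairs.
        tasks = ["mental multiplication", "mental rotation", "RH motor imagery",
                 "mental singing", "word generation", "mental subtraction",
                 "LH motor imagery", "rest"]

        def classification_accuracy(pair):
            """Placeholder for the real cross-validated classification pipeline."""
            return (len(pair[0]) * 7 + len(pair[1]) * 13) % 50 / 100.0 + 0.5

        pairs = list(combinations(tasks, 2))
        assert len(pairs) == 28
        best_pair = max(pairs, key=classification_accuracy)
        print("best task pair for this user:", best_pair)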

  8. Dual task cost of walking is related to fall risk in persons with multiple sclerosis.

    PubMed

    Wajda, Douglas A; Motl, Robert W; Sosnoff, Jacob J

    2013-12-15

    Persons with multiple sclerosis (MS) commonly have walking and cognitive impairments. While walking with a simultaneous cognitive task, persons with MS experience a greater decline in walking performance than healthy controls. This change in performance is termed dual task cost or dual task interference and has been associated with fall risk in older adults. We examined whether dual task cost during walking was related to fall risk in persons with MS. Thirty-three ambulatory persons with MS performed walking tasks with and without a concurrent cognitive task (dual task condition) as well as underwent a fall risk assessment. Dual task cost was operationalized as the percent change in velocity from normal walking conditions to dual task walking conditions. Fall risk was quantified using the Physiological Profile Assessment. A Spearman correlation analysis revealed a significant positive correlation between dual task cost of walking velocity and fall risk as well as dual task cost of stride length and fall risk. Overall, the findings indicate that dual task cost is associated with fall risk and may be an important target for falls prevention strategies. © 2013.
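
    The operationalization of dual task cost described above is a one-line computation, sketched here with illustrative walking velocities (m/s):

        # Dual task cost: percent change in gait velocity from single-task
        # to dual-task walking. The velocities below are made-up examples.
        def dual_task_cost(single_task, dual_task):
            return 100.0 * (single_task - dual_task) / single_task

        print(f"{dual_task_cost(single_task=1.20, dual_task=0.95):.1f}% cost")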

  9. Eigen Spreading

    DTIC Science & Technology

    2008-02-27

    between the PHY layer and, for example, a host PC computer. The PC wants to generate and receive a sequence of data packets. The PC may also want to send...the testbed is quite similar. Given the intense computational requirements of SVD and other matrix-mode operations needed to support eigen spreading a...platform for real time operation. This task is probably the major challenge in the development of the testbed. All compute-intensive tasks will be

  10. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on the critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.

  11. Computer architecture for efficient algorithmic executions in real-time systems: new technology for avionics systems and advanced space vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, C.C.; Youngblood, J.N.; Saha, A.

    1987-12-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on the critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.

  12. Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid

    NASA Astrophysics Data System (ADS)

    Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration

    2014-06-01

    The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.

  13. Study to design and develop remote manipulator system

    NASA Technical Reports Server (NTRS)

    Hill, J. W.; Sword, A. J.

    1973-01-01

    Human performance measurement techniques for remote manipulation tasks and remote sensing techniques for manipulators are described. For common manipulation tasks, performance is monitored by means of an on-line computer capable of measuring the joint angles of both master and slave arms as a function of time. The computer programs allow measurements of the operator's strategy and of physical quantities such as task time and power consumed. The results are printed out after a test run to compare different experimental conditions. For tracking tasks, a method of displaying errors in three dimensions and measuring the end-effector position in three dimensions is described.

  14. Virtual reality computer simulation.

    PubMed

    Grantcharov, T P; Rosenberg, J; Pahle, E; Funch-Jensen, P

    2001-03-01

    Objective assessment of psychomotor skills should be an essential component of a modern surgical training program. There are computer systems that can be used for this purpose, but their wide application is not yet generally accepted. The aim of this study was to validate the role of virtual reality computer simulation as a method for evaluating surgical laparoscopic skills. The study included 14 surgical residents. On day 1, they performed two runs of all six tasks on the Minimally Invasive Surgical Trainer, Virtual Reality (MIST VR). On day 2, they performed a laparoscopic cholecystectomy on living pigs; afterward, they were tested again on the MIST VR. A group of experienced surgeons evaluated the trainees' performance on the animal operation, giving scores for total performance error and economy of motion. During the tasks on the MIST VR, errors and non-economy of movement for the left and right hands were also recorded. There were significant correlations between error scores in vivo and three of the six in vitro tasks (p < 0.05). In vivo economy scores correlated significantly with non-economy right-hand scores for five of the six tasks and with non-economy left-hand scores for one of the six tasks (p < 0.05). In this study, laparoscopic performance in the animal model correlated significantly with performance on the computer simulator. Thus, the computer model seems to be a promising objective method for the assessment of laparoscopic psychomotor skills.

  15. Fault recovery for real-time, multi-tasking computer system

    NASA Technical Reports Server (NTRS)

    Hess, Richard (Inventor); Kelly, Gerald B. (Inventor); Rogers, Randy (Inventor); Stange, Kent A. (Inventor)

    2011-01-01

    System and methods for providing a recoverable real time multi-tasking computer system are disclosed. In one embodiment, a system comprises a real time computing environment, wherein the real time computing environment is adapted to execute one or more applications and wherein each application is time and space partitioned. The system further comprises a fault detection system adapted to detect one or more faults affecting the real time computing environment and a fault recovery system, wherein upon the detection of a fault the fault recovery system is adapted to restore a backup set of state variables.

  16. Physical Medicine and Rehabilitation Resident Use of iPad Mini Mobile Devices.

    PubMed

    Niehaus, William; Boimbo, Sandra; Akuthota, Venu

    2015-05-01

    Previous research on the use of tablet devices in residency programs has been undertaken in radiology and medicine or with standard-sized tablet devices. With new, smaller tablet devices, there is an opportunity to assess their effect on resident behavior. This prospective study attempts to evaluate resident behavior after receiving a smaller tablet device. To evaluate whether smaller tablet computers facilitate residents' daily tasks. Prospective study that administered surveys to evaluate tablet computer use. Residency program. Thirteen physical medicine and rehabilitation residents. Residents were provided 16-GB iPad Minis and surveyed using REDCap to collect usage information at baseline, 3, and 6 months. Survey analysis was conducted using SAS (SAS Institute, Cary, NC) for descriptive analysis. To evaluate multiple areas of resident education, the following tasks were selected: accessing e-mail, logging duty hours, logging procedures, researching clinical information, accessing medical journals, reviewing didactic presentations, and completing evaluations. Measurements were then taken of: (1) residents' responses to how tablet computers made it easier to complete the aforementioned tasks; and (2) residents' responses to how tablet computers affected the frequency with which they performed the aforementioned tasks. After being provided tablet computers, our physical medicine and rehabilitation residents reported significantly greater access to e-mail, medical journals, and didactic material. Receiving tablet computers was also reported to increase the frequency with which residents accessed e-mail, researched clinical information, accessed medical journals, reviewed didactic presentations, and completed evaluations. After receiving a tablet computer, residents reported an increase in the use of calendar programs, note-taking programs, PDF readers, online storage programs, and file organization programs. These physical medicine and rehabilitation residents reported that tablet computers increased access to e-mail, presentation material, and medical journals. Tablet computers were also reported to increase the frequency with which residents were able to complete tasks associated with residency training. Copyright © 2015 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  17. Spatial Memory: Behavioral Determinants of Persistence in the Watermaze Delayed Matching-to-Place Task

    ERIC Educational Resources Information Center

    da Silva, Bruno M.; Bast, Tobias; Morris, Richard G. M.

    2014-01-01

    The watermaze delayed matching-to-place (DMP) task was modified to include probe trials, to quantify search preference for the correct place. Using a zone analysis of search preference, a gradual decay of one-trial memory in rats was observed over 24 h with weak memory consistently detected at a retention interval of 6 h, but unreliably at 24 h.…

  18. Multiloop Manual Control of Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1984-01-01

    Human interaction with a simple, multiloop dynamic system in which the human's activity was systematically varied by changing the levels of automation was studied. The control loop structure resulting from the task definition parallels that of any multiloop manual control system and can be considered a stereotype. Simple models of the human in the task were developed, and a technique for describing the manner in which the human subjectively quantifies his opinion of task difficulty was extended. A man-in-the-loop simulation which provides data to support and direct the analytical effort is presented.

  19. Quantitative analysis of task selection for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  20. Human-computer dialogue: Interaction tasks and techniques. Survey and categorization

    NASA Technical Reports Server (NTRS)

    Foley, J. D.

    1983-01-01

    Interaction techniques are described. Six basic interaction tasks, the requirements for each task, requirements related to interaction techniques, and a technique's hardware prerequisites affecting device selection are discussed.

  1. An Exploration of Cognitive Agility as Quantified by Attention Allocation in a Complex Environment

    DTIC Science & Technology

    2017-03-01

    quantified by eye-tracking data collected while subjects played a military-relevant cognitive agility computer game (Make Goal), to determine whether certain patterns are associated with effective performance. Comparisons were made between the treatment group and control group, and between high and low performers, on eye tracking and game performance.

  2. Risk assessments using the Strain Index and the TLV for HAL, Part I: Task and multi-task job exposure classifications.

    PubMed

    Kapellusch, Jay M; Bao, Stephen S; Silverstein, Barbara A; Merryweather, Andrew S; Thiese, Mathew S; Hegmann, Kurt T; Garg, Arun

    2017-12-01

    The Strain Index (SI) and the American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Value for Hand Activity Level (TLV for HAL) use different constituent variables to quantify task physical exposures. Similarly, time-weighted-average (TWA), Peak, and Typical exposure techniques to quantify physical exposure from multi-task jobs make different assumptions about each task's contribution to the whole job exposure. Thus, task and job physical exposure classifications differ depending upon which model and technique are used for quantification. This study examines exposure classification agreement, disagreement, correlation, and magnitude of classification differences between these models and techniques. Data from 710 multi-task job workers performing 3,647 tasks were analyzed using the SI and TLV for HAL models, as well as with the TWA, Typical and Peak job exposure techniques. Physical exposures were classified as low, medium, and high using each model's recommended, or a priori limits. Exposure classification agreement and disagreement between models (SI, TLV for HAL) and between job exposure techniques (TWA, Typical, Peak) were described and analyzed. Regardless of technique, the SI classified more tasks as high exposure than the TLV for HAL, and the TLV for HAL classified more tasks as low exposure. The models agreed on 48.5% of task classifications (kappa = 0.28) with 15.5% of disagreement between low and high exposure categories. Between-technique (i.e., TWA, Typical, Peak) agreement ranged from 61-93% (kappa: 0.16-0.92) depending on whether the SI or TLV for HAL was used. There was disagreement between the SI and TLV for HAL and between the TWA, Typical and Peak techniques. Disagreement creates uncertainty for job design, job analysis, risk assessments, and developing interventions. Task exposure classifications from the SI and TLV for HAL might complement each other. However, TWA, Typical, and Peak job exposure techniques all have limitations. Part II of this article examines whether the observed differences between these models and techniques produce different exposure-response relationships for predicting prevalence of carpal tunnel syndrome.
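
    Agreement statistics like those reported above (percent agreement and kappa) can be computed as below; the data are invented, and the function implements standard Cohen's kappa rather than anything specific to the SI or TLV for HAL.

        from collections import Counter

        def cohens_kappa(a, b):
            """Cohen's kappa for two raters classifying the same items."""
            n = len(a)
            p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
            ca, cb = Counter(a), Counter(b)
            p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n ** 2  # chance
            return (p_obs - p_exp) / (1 - p_exp)

        # Invented low/medium/high exposure classifications for six tasks.
        si  = ["high", "high", "medium", "low", "high", "medium"]
        tlv = ["medium", "high", "low",  "low", "high", "low"]
        print(f"kappa = {cohens_kappa(si, tlv):.2f}")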

  3. Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?

    PubMed

    Wenzel, Markus A; Almeida, Inês; Blankertz, Benjamin

    2016-01-01

    Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could be also beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed if the detectable neural activity is specific for silent counting, or if it can be evoked also by other tasks that direct the attention to certain stimuli. Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). The neural activity detected by the classifiers is not strictly task specific but can be generalized over tasks and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.

  4. Modeling User Behavior in Computer Learning Tasks.

    ERIC Educational Resources Information Center

    Mantei, Marilyn M.

    Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…

  5. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    ERIC Educational Resources Information Center

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  6. Group Formation in Mobile Computer Supported Collaborative Learning Contexts: A Systematic Literature Review

    ERIC Educational Resources Information Center

    Amara, Sofiane; Macedo, Joaquim; Bendella, Fatima; Santos, Alexandre

    2016-01-01

    Learners are becoming increasingly diverse. They may differ considerably in personal, social, cultural, psychological, and cognitive terms. Forming suitable learning groups is therefore a hard and time-consuming task. In Mobile Computer Supported Collaborative Learning (MCSCL) environments, this task is even more difficult. Instructors need to consider…

  7. Development of a personal computer-based secondary task procedure as a surrogate for a driving simulator

    DOT National Transportation Integrated Search

    2007-08-01

    This research was conducted to develop and test a personal computer-based study procedure (PCSP) with secondary task loading for use in human factors laboratory experiments in lieu of a driving simulator to test reading time and understanding of traf...

  8. Task Assignment Heuristics for Parallel and Distributed CFD Applications

    NASA Technical Reports Server (NTRS)

    Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used to not only balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, the smallest task first (STF) and the largest task first (LTF), are first presented. They are then systematically improved to reduce inter-grid communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.
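
    As a toy illustration of the two baseline assignments named above, the sketch below follows one common reading of these heuristics: greedily place weighted tasks on the currently least-loaded processor, in smallest-first (STF) or largest-first (LTF) order. The grid-point weights and processor count are invented, and inter-grid communication costs are ignored.

        # STF vs. LTF greedy assignment; tasks weighted by grid-point counts.
        import heapq

        def assign(task_weights, n_procs, largest_first):
            loads = [(0.0, p) for p in range(n_procs)]   # min-heap of (load, proc)
            heapq.heapify(loads)
            for w in sorted(task_weights, reverse=largest_first):
                load, p = heapq.heappop(loads)           # least-loaded processor
                heapq.heappush(loads, (load + w, p))
            return max(load for load, _ in loads)        # makespan

        grids = [8e6, 3e6, 2.5e6, 1e6, 0.8e6, 0.5e6]     # grid points per task
        print("STF makespan:", assign(grids, 4, largest_first=False))
        print("LTF makespan:", assign(grids, 4, largest_first=True))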

  9. Bayesian neural adjustment of inhibitory control predicts emergence of problem stimulant use.

    PubMed

    Harlé, Katia M; Stewart, Jennifer L; Zhang, Shunan; Tapert, Susan F; Yu, Angela J; Paulus, Martin P

    2015-11-01

    Bayesian ideal observer models quantify individuals' context- and experience-dependent beliefs and expectations about their environment, which provides a powerful approach (i) to link basic behavioural mechanisms to neural processing; and (ii) to generate clinical predictors for patient populations. Here, we focus on (ii) and determine whether individual differences in the neural representation of the need to stop in an inhibitory task can predict the development of problem use (i.e. abuse or dependence) in individuals experimenting with stimulants. One hundred and fifty-seven non-dependent occasional stimulant users, aged 18-24, completed a stop-signal task while undergoing functional magnetic resonance imaging. These individuals were prospectively followed for 3 years and evaluated for stimulant use and abuse/dependence symptoms. At follow-up, 38 occasional stimulant users met criteria for a stimulant use disorder (problem stimulant users), while 50 had discontinued use (desisted stimulant users). We found that those individuals who showed greater neural responses associated with Bayesian prediction errors, i.e. the difference between actual and expected need to stop on a given trial, in right medial prefrontal cortex/anterior cingulate cortex, caudate, anterior insula, and thalamus were more likely to exhibit problem use 3 years later. Importantly, these computationally based neural predictors outperformed clinical measures and non-model based neural variables in predicting clinical status. In conclusion, young adults who show exaggerated brain processing underlying whether to 'stop' or to 'go' are more likely to develop stimulant abuse. Thus, Bayesian cognitive models provide both a computational explanation and potential predictive biomarkers of belief processing deficits in individuals at risk for stimulant addiction. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
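
    A deliberately simplified stand-in for the model quantity described above: a beta-Bernoulli belief about the probability of a stop trial, with the trial-wise Bayesian prediction error taken as outcome minus expectation. The paper's ideal-observer model is richer than this; the sketch only illustrates the construct.

        # Track a belief about P(stop trial) and compute trial-wise
        # prediction errors (actual minus expected need to stop).
        import numpy as np

        rng = np.random.default_rng(2)
        trials = rng.random(20) < 0.25           # True = stop signal occurred

        a, b = 1.0, 1.0                          # Beta(1, 1) prior on P(stop)
        for t, is_stop in enumerate(trials):
            expected = a / (a + b)               # current belief: need to stop
            pe = float(is_stop) - expected       # Bayesian prediction error
            a, b = a + is_stop, b + (not is_stop)
            print(f"trial {t:2d}  expected={expected:.2f}  PE={pe:+.2f}")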

  10. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogenous distributed systems, including local, batch farms, opportunistic clusters and Grids. Solution method: High-level job management interface, including command line, scripting and GUI components. Restrictions: Access to the distributed resources depends on the installed, 3rd party software such as batch system client or Grid user interface.
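
    The following snippet illustrates the kind of job specification GANGA's interactive interface supports; Job, Executable, Local and LCG are objects provided by the GANGA session itself, and exact attribute names can vary between versions, so this should be read as a sketch of the develop-locally-then-move-to-the-Grid workflow rather than version-exact code.

        # Run inside an interactive GANGA session (no imports needed there).
        j = Job()
        j.application = Executable(exe='/bin/echo', args=['hello'])
        j.backend = Local()          # develop and test on the local machine
        j.submit()

        j2 = j.copy()                # same task, different processing system
        j2.backend = LCG()           # transparently move to a Grid backend
        j2.submit()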

  11. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  12. Defining and quantifying users' mental Imagery-based BCI skills: a first step.

    PubMed

    Lotte, Fabien; Jeunet, Camille

    2018-05-17

    While promising for many applications, Electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs) are still scarcely used outside laboratories due to poor reliability. It is thus necessary to study and fix this reliability issue. Doing so requires appropriate reliability metrics to quantify the performance of both the classification algorithm and the BCI user. So far, Classification Accuracy (CA) is the typical metric used for both aspects. However, we argue in this paper that CA is a poor metric for studying BCI users' skills. Here, we propose a definition and new metrics to quantify such BCI skills for Mental Imagery (MI) BCIs, independently of any classification algorithm. Approach: We first show that CA is notably unspecific, discrete, and dependent on the training data and classifier, and as such may not always reflect successful self-modulation of EEG patterns by the user. We then propose a definition of MI-BCI skills that reflects how well the user can self-modulate EEG patterns, and thus how well they could control an MI-BCI. Finally, we propose new performance metrics, classDis, restDist and classStab, that specifically measure how distinct and stable the EEG patterns produced by the user are, independently of any classifier. Main results: By re-analyzing EEG data sets with these new metrics, we confirmed that CA may hide an increase in MI-BCI skills or a user's inability to self-modulate a given EEG pattern. On the other hand, our new metrics could reveal such skill improvements, as well as identify when a mental task performed by a user was no different from rest EEG. Significance: Our results showed that when studying MI-BCI users' skills, CA should be used with care and complemented with metrics such as the new ones proposed. Our results also stressed the need to redefine BCI user training by considering the different BCI subskills and their measures. To promote the complementary use of our new metrics, we provide free and open-source Matlab code to compute them. © 2018 IOP Publishing Ltd.
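
    The sketch below conveys the general idea of a classifier-independent distinctiveness measure in the spirit of classDis, under the strong simplification that trials are plain feature vectors; the function and data are invented here, and the authors' actual metric definitions should be taken from the paper and its released Matlab code.

        # Distinctiveness of two mental-imagery classes: distance between
        # class means, normalized by within-class dispersion. No classifier
        # is trained at any point.
        import numpy as np

        def class_distinctiveness(X_a, X_b):
            mu_a, mu_b = X_a.mean(axis=0), X_b.mean(axis=0)
            spread = 0.5 * (X_a.std(axis=0).mean() + X_b.std(axis=0).mean())
            return np.linalg.norm(mu_a - mu_b) / spread

        rng = np.random.default_rng(1)
        left  = rng.normal(0.0, 1.0, size=(60, 16))   # trials x features, class 1
        right = rng.normal(0.4, 1.0, size=(60, 16))   # trials x features, class 2
        print("distinctiveness:", class_distinctiveness(left, right))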

  13. Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500 GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.

  14. Dynamic balance during walking adaptability tasks in individuals post-stroke.

    PubMed

    Vistamehr, Arian; Balasubramanian, Chitralakshmi K; Clark, David J; Neptune, Richard R; Fox, Emily J

    2018-06-06

    Maintaining dynamic balance during community ambulation is a major challenge post-stroke. Community ambulation requires performance of steady-state level walking as well as tasks that require walking adaptability. Prior studies on balance control post-stroke have mainly focused on steady-state walking, but walking adaptability tasks have received little attention. The purpose of this study was to quantify and compare dynamic balance requirements during common walking adaptability tasks post-stroke and in healthy adults and identify differences in underlying mechanisms used for maintaining dynamic balance. Kinematic data were collected from fifteen individuals with post-stroke hemiparesis during steady-state forward and backward walking, obstacle negotiation, and step-up tasks. In addition, data from ten healthy adults provided the basis for comparison. Dynamic balance was quantified using the peak-to-peak range of whole-body angular-momentum in each anatomical plane during the paretic, nonparetic and healthy control single-leg-stance phase of the gait cycle. To understand differences in some of the key underlying mechanisms for maintaining dynamic balance, foot placement and plantarflexor muscle activation were examined. Individuals post-stroke had significant dynamic balance deficits in the frontal plane across most tasks, particularly during the paretic single-leg-stance. Frontal plane balance deficits were associated with wider paretic foot placement, elevated body center-of-mass, and lower soleus activity. Further, the obstacle negotiation task imposed a higher balance requirement, particularly during the trailing leg single-stance. Thus, improving paretic foot placement and ankle plantarflexor activity, particularly during obstacle negotiation, may be important rehabilitation targets to enhance dynamic balance during post-stroke community ambulation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Task-Induced Development of Hinting Behaviors in Online Task-Oriented L2 Interaction

    ERIC Educational Resources Information Center

    Balaman, Ufuk

    2018-01-01

    Technology-mediated task settings are rich interactional domains in which second language (L2) learners manage a multitude of interactional resources for task accomplishment. The affordances of these settings have been repeatedly addressed in computer-assisted language learning (CALL) literature mainly based on theory-informed task design…

  16. Has computational creativity successfully made it "Beyond the Fence" in musical theatre?

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2017-10-01

    A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.

  17. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1977-01-01

    Models, measures and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability and worth. Specifically, a model hierarchy was developed in detail at the mission, functional task, and computational task levels. An appropriate class of stochastic models was investigated to serve as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.

  18. Emergent Leadership and Team Effectiveness on a Team Resource Allocation Task

    DTIC Science & Technology

    1987-10-01

    equivalent training and experience on this task, but they had different levels of experience with computers and video games. This differential experience...typed: that is, it is sex-typed to the extent that males spend more time on related instruments like computers and video games. However, the sex...perform better or worse than less talkative teams? Did teams with much computer and/or video game experience perform better than inexperienced teams

  19. Translational Genomics Research Institute (TGen): Quantified Cancer Cell Line Encyclopedia (CCLE) RNA-seq Data | Office of Cancer Genomics

    Cancer.gov

    Many applications analyze quantified transcript-level abundances to make inferences.  Having completed this computation across the large sample set, the CTD2 Center at the Translational Genomics Research Institute presents the quantified data in a straightforward, consolidated form for these types of analyses.

  20. The Urban Forest Effects (UFORE) model: quantifying urban forest structure and functions

    Treesearch

    David J. Nowak; Daniel E. Crane

    2000-01-01

    The Urban Forest Effects (UFORE) computer model was developed to help managers and researchers quantify urban forest structure and functions. The model quantifies species composition and diversity, diameter distribution, tree density and health, leaf area, leaf biomass, and other structural characteristics; hourly volatile organic compound emissions (emissions that...

  1. After-effects of human-computer interaction indicated by P300 of the event-related brain potential.

    PubMed

    Trimmel, M; Huber, R

    1998-05-01

    After-effects of human-computer interaction (HCI) were investigated by using the P300 component of the event-related brain potential (ERP). Forty-nine subjects (naive non-users, beginners, experienced users, programmers) completed three paper/pencil tasks (text editing, solving intelligence test items, filling out a questionnaire on sensation seeking) and three HCI tasks (text editing, executing a tutor program or programming, playing Tetris). The sequence of 7-min tasks was randomized between subjects and balanced between groups. After each experimental condition ERPs were recorded during an acoustic discrimination task at F3, F4, Cz, P3 and P4. Data indicate that: (1) mental after-effects of HCI can be detected by P300 of the ERP; (2) HCI showed in general a reduced amplitude; (3) P300 amplitude varied also with type of task, mainly at F4 where it was smaller after cognitive tasks (intelligence test/programming) and larger after emotion-based tasks (sensation seeking/Tetris); (4) cognitive tasks showed shorter latencies; (5) latencies were widely location-independent (within the range of 356-358 ms at F3, F4, P3 and P4) after executing the tutor program or programming; and (6) all observed after-effects were independent of the user's experience in operating computers and may therefore reflect short-term after-effects only and no structural changes of information processing caused by HCI.

  2. Mental workload during brain-computer interface training.

    PubMed

    Felton, Elizabeth A; Williams, Justin C; Vanderheiden, Gregg C; Radwin, Robert G

    2012-01-01

    It is not well understood how people perceive the difficulty of performing brain-computer interface (BCI) tasks, which specific aspects of mental workload contribute the most, and whether there is a difference in perceived workload between participants who are able-bodied and disabled. This study evaluated mental workload using the NASA Task Load Index (TLX), a multi-dimensional rating procedure with six subscales: Mental Demands, Physical Demands, Temporal Demands, Performance, Effort, and Frustration. Able-bodied and motor disabled participants completed the survey after performing EEG-based BCI Fitts' law target acquisition and phrase spelling tasks. The NASA-TLX scores were similar for able-bodied and disabled participants. For example, overall workload scores (range 0-100) for 1D horizontal tasks were 48.5 (SD = 17.7) and 46.6 (SD = 10.3), respectively. The TLX can be used to inform the design of BCIs that will have greater usability by evaluating subjective workload between BCI tasks, participant groups, and control modalities. Mental workload of brain-computer interfaces (BCI) can be evaluated with the NASA Task Load Index (TLX). The TLX is an effective tool for comparing subjective workload between BCI tasks, participant groups (able-bodied and disabled), and control modalities. The data can inform the design of BCIs that will have greater usability.
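
    For reference, the standard NASA-TLX scoring procedure combines the six 0-100 subscale ratings with weights obtained from 15 pairwise comparisons (the weights sum to 15). The ratings and weights below are invented example values, not data from this study.

        # Overall NASA-TLX workload (0-100) from subscale ratings and the
        # pairwise-comparison weights. Values are illustrative only.
        ratings = {"mental": 70, "physical": 20, "temporal": 55,
                   "performance": 40, "effort": 65, "frustration": 35}
        weights = {"mental": 5, "physical": 1, "temporal": 2,
                   "performance": 3, "effort": 3, "frustration": 1}

        assert sum(weights.values()) == 15   # 15 pairwise comparisons in total
        overall = sum(ratings[k] * weights[k] for k in ratings) / 15
        print(f"overall workload: {overall:.1f}")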

  3. Magnetic Tunnel Junction Mimics Stochastic Cortical Spiking Neurons

    NASA Astrophysics Data System (ADS)

    Sengupta, Abhronil; Panda, Priyadarshini; Wijesinghe, Parami; Kim, Yusung; Roy, Kaushik

    2016-07-01

    Brain-inspired computing architectures attempt to mimic the computations performed in the neurons and the synapses in the human brain in order to achieve its efficiency in learning and cognitive tasks. In this work, we demonstrate the mapping of the probabilistic spiking nature of pyramidal neurons in the cortex to the stochastic switching behavior of a Magnetic Tunnel Junction in the presence of thermal noise. We present results to illustrate the efficiency of neuromorphic systems based on such probabilistic neurons for pattern recognition tasks in the presence of lateral inhibition and homeostasis. Such stochastic MTJ neurons can also potentially provide a direct mapping to the probabilistic computing elements in Belief Networks for performing regenerative tasks.
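
    A toy model of the stochastic neuron described above, assuming that thermal noise gives the MTJ a sigmoidal switching probability in its input current; the gain constant, weights and inputs are illustrative, not device parameters from the paper.

        # Probabilistic spiking unit: switching probability grows
        # sigmoidally with the net synaptic drive.
        import numpy as np

        rng = np.random.default_rng(7)

        def mtj_neuron(inputs, weights, gain=4.0):
            drive = np.dot(weights, inputs)               # net synaptic current
            p_switch = 1.0 / (1.0 + np.exp(-gain * drive))
            return int(rng.random() < p_switch)           # stochastic spike

        w = np.array([0.6, -0.3, 0.8])
        x = np.array([1.0, 0.5, 0.2])
        spikes = [mtj_neuron(x, w) for _ in range(1000)]
        print("empirical firing rate:", sum(spikes) / 1000)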

  4. Childhood obesity affects postural control and aiming performance during an upper limb movement.

    PubMed

    Boucher, François; Handrigan, Grant A; Mackrous, Isabelle; Hue, Olivier

    2015-07-01

    Obesity reduces the efficiency of postural and movement control mechanisms. However, the effects of obesity on a functional motor task and on postural control in standing and seated positions have not been closely quantified among children. The aim of this study is to examine the effects of obesity on the execution of aiming tasks performed in standing and seated conditions in children. Twelve healthy-weight children and eleven obese children aged between 8 and 11 years pointed to a target in standing and seated positions. The difficulty of the aiming task was varied by using 2 target sizes (1.0 cm and 5.0 cm width; pointing to the smaller target requires a more precise movement and constitutes a more difficult task). Hand movement time (MT) and its phases were measured to quantify the aiming task. Mean speed of the center of pressure displacement (COP speed) was calculated to assess postural stability during the movement. Obese children had significantly higher MTs than healthy-weight children in both seated and standing conditions, explained by longer deceleration phases when aiming. Concerning COP speed during the movement, obese children showed significantly higher values than healthy-weight children when standing; this was also observed in the seated position. In conclusion, obesity adds a postural constraint during an aiming task in both seated and standing conditions and requires obese children to take more time to correct their movements, due to greater postural instability of the body when pointing to a target with the upper limb. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Flame-Vortex Studies to Quantify Markstein Numbers Needed to Model Flame Extinction Limits

    NASA Technical Reports Server (NTRS)

    Driscoll, James F.; Feikema, Douglas A.

    2003-01-01

    This work has quantified a database of Markstein numbers for unsteady flames; future work will quantify a database of flame extinction limits for unsteady conditions. Unsteady extinction limits have not been documented previously; both a stretch rate and a residence time must be measured, since extinction requires that the stretch rate be sufficiently large for a sufficiently long residence time. The Markstein number (Ma) was measured for an inwardly-propagating flame (IPF) that is negatively stretched under microgravity conditions. Computations also were performed using RUN-1DL to explain the measurements. The Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame (OPF). The computed profiles of the various species within the flame suggest reasons for this difference: computed hydrogen concentrations build up ahead of the IPF but not the OPF. Understanding was gained by running the computations for both simplified and full-chemistry conditions. Numerical simulations: to explain the experimental findings, numerical simulations of both inwardly and outwardly propagating spherical flames (with complex chemistry) were generated using the RUN-1DL code, which includes 16 species and 46 reactions.

  6. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  7. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    NASA Astrophysics Data System (ADS)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in recent decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communication among computing nodes. Therefore, the task allocation method is the key factor that may determine the feasibility of parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost of each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically: 1) to obtain optimized solutions, a quadratic programming based modeling method is proposed; this algorithm performs well with a small number of computing tasks, but its efficiency decreases significantly as the numbers of subdomains and computing nodes increase. 2) To compensate for this performance degradation on large-scale tasks, a K-Means clustering based algorithm is introduced; instead of seeking optimal solutions, this method obtains reasonably good feasible solutions within acceptable time, but it may introduce imbalanced communication among nodes or node-isolated subdomains. This research shows that both algorithms have their own strengths and weaknesses for task allocation. A combination of the two algorithms is under study to obtain better performance. Keywords: Scheduling; Parallel Computing; Load Balance; Optimization; Cost Model
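
    The sketch below shows only the K-Means half of the story: clustering subdomains by geographic location so that adjacent subdomains tend to share a computing node, which reduces inter-node communication. The coordinates and node count are made up, and plain K-Means does not by itself enforce the balanced per-node computing cost discussed above.

        # Cluster subdomain centers so geographically close subdomains
        # land on the same computing node.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        centroids = rng.uniform(0, 100, size=(48, 2))   # subdomain centers (x, y)

        n_nodes = 6
        labels = KMeans(n_clusters=n_nodes, n_init=10,
                        random_state=0).fit_predict(centroids)
        for node in range(n_nodes):
            print(f"node {node}: {np.sum(labels == node)} subdomains")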

  8. An evaluation of nursing tasks.

    PubMed

    Baptiste, Andrea

    2011-01-01

    Functional capacity evaluations have been criticized as being too general in theory and not accurate enough to determine which tasks an employee can perform. This paper describes the results of a descriptive study conducted in a laboratory setting to objectively determine the physical demands of patient transfer tasks performed by nurses. Fifty-three tasks were analyzed and broken down into sub-tasks to quantify the peak force required to perform each sub-task, in order to determine which tasks place healthcare workers at the highest risk of injury. Dissecting a transfer task into segments shows which part of the task requires high forces from the caregiver. The task can then be modified to eliminate the risk of injury to the caregiver. This modification can be accomplished by using healthcare technology, such as floor-based or overhead lifts, friction-reducing devices, sit-to-stand lifts, properly designed slings, and motorized beds/trolleys. Technological solutions are available for some of these high-risk tasks and should be implemented where applicable to reduce the force demand and eliminate or reduce the risk of injury to healthcare workers in nursing.

  9. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds

    PubMed Central

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources changes dynamically over time, provisioning resources so that a given budget is efficiently utilized while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction of up to 70% in the overall task slowdown rate by the proposed algorithms. PMID:27501046

  11. Quadcopter control in three-dimensional space using a noninvasive motor imagery based brain-computer interface

    PubMed Central

    LaFleur, Karl; Cassady, Kaitlin; Doud, Alexander; Shades, Kaleb; Rogin, Eitan; He, Bin

    2013-01-01

    Objective: At the balanced intersection of human and machine adaptation is found the optimally functioning brain-computer interface (BCI). In this study, we report a novel experiment of BCI controlling a robotic quadcopter in three-dimensional physical space using noninvasive scalp EEG in human subjects. We then quantify the performance of this system using metrics suitable for asynchronous BCI. Lastly, we examine the impact that operation of a real world device has on subjects' control with comparison to a two-dimensional virtual cursor task. Approach: Five human subjects were trained to modulate their sensorimotor rhythms to control an AR Drone navigating a three-dimensional physical space. Visual feedback was provided via a forward facing camera on the hull of the drone. Individual subjects were able to accurately acquire up to 90.5% of all valid targets presented while travelling at an average straight-line speed of 0.69 m/s. Significance: Freely exploring and interacting with the world around us is a crucial element of autonomy that is lost in the context of neurodegenerative disease. Brain-computer interfaces are systems that aim to restore or enhance a user's ability to interact with the environment via a computer and through the use of only thought. We demonstrate for the first time the ability to control a flying robot in three-dimensional physical space using noninvasive scalp recorded EEG in humans. Our work indicates the potential of noninvasive EEG based BCI systems to accomplish complex control in three-dimensional physical space. The present study may serve as a framework for the investigation of multidimensional non-invasive brain-computer interface control in a physical environment using telepresence robotics. PMID:23735712

  12. Design of Computer-aided Instruction for Radiology Interpretation: The Role of Cognitive Task Analysis

    PubMed Central

    Pusic, Martin V.; LeBlanc, Vicki; Patel, Vimla L.

    2001-01-01

    Traditional task analysis for instructional design has emphasized the importance of precisely defining behavioral educational objectives and working back to select objective-appropriate instructional strategies. However, this approach may miss effective strategies. Cognitive task analysis, on the other hand, breaks a process down into its component knowledge representations. Selection of instructional strategies based on all such representations in a domain is likely to lead to optimal instructional design. In this demonstration, using the interpretation of cervical spine x-rays as an educational example, we show how a detailed cognitive task analysis can guide the development of computer-aided instruction.

  13. Convolutional neural networks and face recognition task

    NASA Astrophysics Data System (ADS)

    Sochenkova, A.; Sochenkov, I.; Makovetskii, A.; Vokhmintsev, A.; Melnikov, A.

    2017-09-01

    Computer vision tasks have remained very important over the last couple of years. One of the most complicated problems in computer vision is face recognition, which can be used in security systems to provide safety and to identify a person among others. There is a variety of approaches to solving this task, but there is still no universal solution that gives adequate results in all cases. The current paper presents the following approach. First, we extract the area containing the face; then we apply a Canny edge detector. At the next stage, we use convolutional neural networks (CNN) to finally solve the face recognition and person identification task.
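
    A sketch of the three-stage pipeline as described (face localization, Canny edge detection, then a CNN). The input file name, cascade detector, image size and network architecture are all placeholder assumptions, since the abstract does not specify them; the snippet also assumes at least one face is found.

        # Stage 1: locate the face; stage 2: Canny edges; stage 3: small CNN.
        import cv2
        import torch
        import torch.nn as nn

        img = cv2.imread("photo.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical input
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        x, y, w, h = cascade.detectMultiScale(img, 1.1, 5)[0]  # first face found
        edges = cv2.Canny(img[y:y + h, x:x + w], 100, 200)     # edge map of face
        edges = cv2.resize(edges, (64, 64))

        cnn = nn.Sequential(                                   # toy identifier
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(16 * 16 * 16, 10))         # 10 hypothetical IDs
        logits = cnn(torch.from_numpy(edges).float().div(255).view(1, 1, 64, 64))
        print("predicted identity:", logits.argmax(dim=1).item())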

  14. Mission-based Scenario Research: Experimental Design And Analysis

    DTIC Science & Technology

    2012-01-01

    neurotechnologies called Brain-Computer Interaction Technologies. Subject terms: neuroimaging, EEG, task loading, neurotechnologies, ground... Imagine a system that can identify operator fatigue during a long-term...(BCIT), a class of neurotechnologies that aim to improve task performance by incorporating measures of brain activity to optimize the interactions

  15. An Interaction of Screen Colour and Lesson Task in CAL

    ERIC Educational Resources Information Center

    Clariana, Roy B.

    2004-01-01

    Colour is a common feature in computer-aided learning (CAL), though the instructional effects of screen colour are not well understood. This investigation considers the effects of different CAL study tasks with feedback on posttest performance and on posttest memory of the lesson colour scheme. Graduate students (n=68) completed a computer-based…

  16. Nonoccurrence of Negotiation of Meaning in Task-Based Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Van Der Zwaard, Rose; Bannink, Anne

    2016-01-01

    This empirical study investigated the occurrence of meaning negotiation in an interactive synchronous computer-mediated second language (L2) environment. Sixteen dyads (N = 32) consisting of nonnative speakers (NNSs) and native speakers (NSs) of English performed 2 different tasks using videoconferencing and written chat. The data were coded and…

  17. Studying Parental Decision Making with Micro-Computers: The CPSI Technique.

    ERIC Educational Resources Information Center

    Holden, George W.

    A technique for studying how parents think, make decisions, and solve childrearing problems, Computer-Presented Social Interactions (CPSI), is described. Two studies involving CPSI are presented. The first study concerns a common parental cognitive task: causal analysis of an undesired behavior. The task was to diagnose the cause of non-contingent…

  18. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  19. From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task

    ERIC Educational Resources Information Center

    Öman, Anne

    2017-01-01

    Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…

  20. An Undergraduate Course on Operating Systems Principles.

    ERIC Educational Resources Information Center

    National Academy of Engineering, Washington, DC. Commission on Education.

    This report is from Task Force VIII of the COSINE Committee of the Commission on Education of the National Academy of Engineering. The task force was established to formulate subject matter for an elective undergraduate subject on computer operating systems principles for students whose major interest is in the engineering of computer systems and…

  1. Negotiation of Meaning in Synchronous Computer-Mediated Communication in Relation to Task Types

    ERIC Educational Resources Information Center

    Cho, Hye-jin

    2011-01-01

    The present study explored how negotiation of meaning occurred in task-based synchronous computer-mediated communication (SCMC) environment among college English learners. Based on the theoretical framework of the interaction hypothesis and negotiation of meaning, four research questions arose: (1) how negotiation of meaning occur in non-native…

  2. Computer Task Application Use by Professional Health Educators: Implications for Professional Preparation.

    ERIC Educational Resources Information Center

    Hanks, Walter A.; Barnes, Michael D.; Merrill, Ray M.; Neiger, Brad L.

    2000-01-01

    Investigated how health educators currently used computers and how they expected to use them in the future. Surveys of practicing health educators at many types of sites indicated that important current abilities included Internet, word processing, and electronic presentation skills. Important future tasks and skills included developing computer…

  3. Soldier-Computer Interface

    DTIC Science & Technology

    2015-01-27

    placed on the user by the required tasks. Design areas that are of concern include seating, input and output device location and design, ambient...software, hardware, and workspace design for the test function of operability that influence operator performance in a computer-based system. Appendices provide sample design checklists and sample task checklists.

  4. BASIC, Logo, and Pilot: A Comparison of Three Computer Languages.

    ERIC Educational Resources Information Center

    Maddux, Cleborne D.; Cummings, Rhoda E.

    1985-01-01

    Following a brief history of Logo, BASIC, and Pilot programing languages, common educational programing tasks (input from keyboard, evaluation of keyboard input, and computation) are presented in each language to illustrate how each can be used to perform the same tasks and to demonstrate each language's strengths and weaknesses. (MBR)

  5. ESL Students' Interaction in Second Life: Task-Based Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Jee, Min Jung

    2010-01-01

    The purpose of the present study was to explore ESL students' interactions in task-based synchronous computer-mediated communication (SCMC) in Second Life, a virtual environment by which users can interact through representational figures. I investigated Low-Intermediate and High-Intermediate ESL students' interaction patterns before, during, and…

  6. Oral Computer-Mediated Interaction between L2 Learners: It's about Time!

    ERIC Educational Resources Information Center

    Yanguas, Inigo

    2010-01-01

    This study explores task-based, synchronous oral computer-mediated communication (CMC) among intermediate-level learners of Spanish. In particular, this paper examines (a) how learners in video and audio CMC groups negotiate for meaning during task-based interaction, (b) possible differences between both oral CMC modes and traditional face-to-face…

  7. Control-display mapping in brain-computer interfaces.

    PubMed

    Thurlings, Marieke E; van Erp, Jan B F; Brouwer, Anne-Marie; Blankertz, Benjamin; Werkhoven, Peter

    2012-01-01

    Event-related potential (ERP) based brain-computer interfaces (BCIs) employ differences in brain responses to attended and ignored stimuli. When using a tactile ERP-BCI for navigation, mapping is required between navigation directions on a visual display and unambiguously corresponding tactile stimuli (tactors) from a tactile control device: control-display mapping (CDM). We investigated the effect of congruent (both display and control horizontal, or both vertical) and incongruent (vertical display, horizontal control) CDMs on task performance, the ERP, and potential BCI performance. Ten participants attended to a target (determined via CDM) in a stream of sequentially vibrating tactors. We show that congruent CDM yields the best task performance, enhances the P300, and results in increased estimated BCI performance. This suggests a reduced availability of attentional resources when operating an ERP-BCI with incongruent CDM. Additionally, we found an enhanced N2 for incongruent CDM, which indicates a conflict between visual display and tactile control orientations. Incongruency in control-display mapping reduces task performance. In this study, brain responses, task and system performance are related to (in)congruent mapping of command options and the corresponding stimuli in a brain-computer interface (BCI). Directional congruency reduces task errors, increases available attentional resources, improves BCI performance and thus facilitates human-computer interaction.

  8. Population-based learning of load balancing policies for a distributed computer system

    NASA Technical Reports Server (NTRS)

    Mehra, Pankaj; Wah, Benjamin W.

    1993-01-01

    Effective load-balancing policies use dynamic resource information to schedule tasks in a distributed computer system. We present a novel method for automatically learning such policies. At each site in our system, we use a comparator neural network to predict the relative speedup of an incoming task using only the resource-utilization patterns obtained prior to the task's arrival. Outputs of these comparator networks are broadcast periodically over the distributed system, and the resource schedulers at each site use these values to determine the best site for executing an incoming task. The delays incurred in propagating workload information and tasks from one site to another, as well as the dynamic and unpredictable nature of workloads in multiprogrammed multiprocessors, may cause the workload pattern at the time of execution to differ from patterns prevailing at the times of load-index computation and decision making. Our load-balancing policy accommodates this uncertainty by using certain tunable parameters. We present a population-based machine-learning algorithm that adjusts these parameters in order to achieve high average speedups with respect to local execution. Our results show that our load-balancing policy, when combined with the comparator neural network for workload characterization, is effective in exploiting idle resources in a distributed computer system.
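
    A minimal sketch of the dispatch rule described above: each site broadcasts a load index, standing in here for the comparator network's predicted relative speedup of an incoming task, and the scheduler routes the task to the most promising site. The index formula and the utilization figures are invented for illustration; the paper learns this mapping from resource-utilization patterns with a neural network.

        # Route an incoming task to the site with the best load index.
        def load_index(cpu_util, queue_len):
            # Higher is better: idle CPU and a short queue predict speedup.
            return (1.0 - cpu_util) / (1.0 + queue_len)

        sites = {
            "site_a": {"cpu_util": 0.90, "queue_len": 4},
            "site_b": {"cpu_util": 0.35, "queue_len": 1},
            "site_c": {"cpu_util": 0.60, "queue_len": 0},
        }
        indices = {name: load_index(**s) for name, s in sites.items()}
        best = max(indices, key=indices.get)
        print("dispatch incoming task to:", best, indices)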

  9. Effects of Dual Monitor Computer Work Versus Laptop Work on Cervical Muscular and Proprioceptive Characteristics of Males and Females.

    PubMed

    Farias Zuniga, Amanda M; Côté, Julie N

    2017-06-01

    The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computer sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain (p < .01) and right upper trapezius RMS (p = .03) increased significantly over time regardless of workstation. Right cervical erector spinae RMS and cervical NMI were smaller, while degrees of overshoot (mean = 4.15°) and end position error (mean = 1.26°) were larger in DualMon regardless of time. Effects on muscle activity were more pronounced in males, whereas effects on proprioception were more pronounced in females. Results suggest that compared to laptop, DualMon work is effective in reducing cervical muscle activity, dissociating cervical connectivity, and maintaining more typical neck repositioning patterns, suggesting some health-protective effects. This evidence could be considered when deciding on computer workstation designs.

  10. EPIC Computational Models of Psychological Refractory-Period Effects in Human Multiple-Task Performance.

    ERIC Educational Resources Information Center

    Meyer, David E.; Kieras, David E.

    Perceptual-motor and cognitive processes whereby people perform multiple concurrent tasks have been studied through an overlapping-tasks procedure in which two successive choice-reaction tasks are performed with a variable interval (stimulus onset asynchrony, or SOA) between the beginning of the first and second tasks. The increase in subjects'…

  11. The employment of a spoken language computer applied to an air traffic control task.

    NASA Technical Reports Server (NTRS)

    Laveson, J. I.; Silver, C. A.

    1972-01-01

    Assessment of the merits of a limited spoken language (56 words) computer in a simulated air traffic control (ATC) task. An airport zone approximately 60 miles in diameter with a traffic flow simulation ranging from single-engine to commercial jet aircraft provided the workload for the controllers. This research determined that, under the circumstances of the experiments carried out, the use of a spoken-language computer would not improve the controller performance.

  12. Productivity associated with visual status of computer users.

    PubMed

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
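
    The reported cost-benefit figure follows directly from the quoted numbers; as a worked check:

        % 2.5% productivity gain on a $25,000/year salary vs. a $268 correction cost:
        \[
          \frac{0.025 \times \$25{,}000}{\$268} \;=\; \frac{\$625}{\$268} \;\approx\; 2.3
        \]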

  13. Refueling Strategies for a Team of Cooperating AUVs

    DTIC Science & Technology

    2011-01-01

    manager, and thus the constraint a centrally managed underwater network imposes on the mission. Task management utilizing Robust Decentralized Task...the computational complexity. A bid-based approach to task management has also been studied as a possible means of decentralization of group task...currently performing another task. In [18], ground robots perform distributed task allocation using the ASyMTRy-D algorithm, which is based on CNP

  14. Reducing radiation dose to the female breast during CT coronary angiography: A simulation study comparing breast shielding, angular tube current modulation, reduced kV, and partial angle protocols using an unknown-location signal-detectability metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rupcich, Franco; Gilat Schmidt, Taly; Badal, Andreu

    2013-08-15

    Purpose: The authors compared the performance of five protocols intended to reduce dose to the breast during computed tomography (CT) coronary angiography scans using a model observer unknown-location signal-detectability metric. Methods: The authors simulated CT images of an anthropomorphic female thorax phantom for a 120 kV reference protocol and five “dose reduction” protocols intended to reduce dose to the breast: 120 kV partial angle (posteriorly centered), 120 kV tube-current modulated (TCM), 120 kV with shielded breasts, 80 kV, and 80 kV partial angle (posteriorly centered). Two image quality tasks were investigated: the detection and localization of 4-mm, 3.25 mg/ml and 1-mm, 6.0 mg/ml iodine contrast signals randomly located in the heart region. For each protocol, the authors plotted the signal detectability, as quantified by the area under the exponentially transformed free response characteristic curve estimator (Â_FE), as well as noise and contrast-to-noise ratio (CNR) versus breast and lung dose. In addition, the authors quantified each protocol's dose performance as the percent difference in dose relative to the reference protocol achieved while maintaining equivalent Â_FE. Results: For the 4-mm signal-size task, the 80 kV full scan and 80 kV partial angle protocols decreased dose to the breast (80.5% and 85.3%, respectively) and lung (80.5% and 76.7%, respectively) with Â_FE = 0.96, but also resulted in an approximate three-fold increase in image noise. The 120 kV partial protocol reduced dose to the breast (17.6%) at the expense of increased lung dose (25.3%). The TCM algorithm decreased dose to the breast (6.0%) and lung (10.4%). Breast shielding increased breast dose (67.8%) and lung dose (103.4%). The 80 kV and 80 kV partial protocols demonstrated greater dose reductions for the 4-mm task than for the 1-mm task, and the shielded protocol showed a larger increase in dose for the 4-mm task than for the 1-mm task. In general, the CNR curves indicate a similar relative ranking of protocol performance as the corresponding Â_FE curves; however, the CNR metric overestimated the performance of the shielded protocol for both tasks, leading to corresponding underestimates in the relative dose increases compared to those obtained when using the Â_FE metric. Conclusions: The 80 kV and 80 kV partial angle protocols demonstrated the greatest reduction to breast and lung dose; however, the subsequent increase in image noise may be deemed clinically unacceptable. Tube output for these protocols can be adjusted to achieve a more desirable noise level with lesser breast dose savings. Breast shielding increased breast and lung dose when maintaining equivalent Â_FE. The results demonstrated that comparisons of dose performance depend on both the image quality metric and the specific task, and that CNR may not be a reliable metric of signal detectability.

  15. Efficiency of the human observer detecting random signals in random backgrounds

    PubMed Central

    Park, Subok; Clarkson, Eric; Kupinski, Matthew A.; Barrett, Harrison H.

    2008-01-01

    The efficiencies of the human observer and the channelized-Hotelling observer relative to the ideal observer for signal-detection tasks are discussed. Both signal-known-exactly (SKE) tasks and signal-known-statistically (SKS) tasks are considered. Signal location is uncertain for the SKS tasks, and lumpy backgrounds are used for background uncertainty in both cases. Markov chain Monte Carlo methods are employed to determine ideal-observer performance on the detection tasks. Psychophysical studies are conducted to compute human-observer performance on the same tasks. Efficiency is computed as the squared ratio of the detectabilities of the observer of interest to the ideal observer. Human efficiencies are approximately 2.1% and 24%, respectively, for the SKE and SKS tasks. The results imply that human observers are not affected as much as the ideal observer by signal-location uncertainty even though the ideal observer outperforms the human observer for both tasks. Three different simplified pinhole imaging systems are simulated, and the humans and the model observers rank the systems in the same order for both the SKE and the SKS tasks. PMID:15669610
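
    Written out, the efficiency defined above is the squared ratio of detectability indices, with the reported human values for the two tasks shown alongside:

        % Efficiency relative to the ideal observer, as defined in the abstract:
        \[
          \eta \;=\; \left( \frac{d'_{\text{obs}}}{d'_{\text{ideal}}} \right)^{2},
          \qquad
          \eta_{\text{SKE}} \approx 0.021, \quad \eta_{\text{SKS}} \approx 0.24
        \]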

  16. Measuring exertion time, duty cycle and hand activity level for industrial tasks using computer vision.

    PubMed

    Akkas, Oguz; Lee, Cheng Hsien; Hu, Yu Hen; Harris Adamson, Carisa; Rempel, David; Radwin, Robert G

    2017-12-01

    Two computer vision algorithms were developed to automatically estimate exertion time, duty cycle (DC) and hand activity level (HAL) from videos of workers performing 50 industrial tasks. The average DC difference between manual frame-by-frame analysis and the computer vision DC was -5.8% for the Decision Tree (DT) algorithm, and 1.4% for the Feature Vector Training (FVT) algorithm. The average HAL difference was 0.5 for the DT algorithm and 0.3 for the FVT algorithm. A sensitivity analysis, conducted to examine the influence that deviations in DC have on HAL, found that HAL remained unaffected when DC error was less than 5%, and that a DC error of less than 10% changed HAL by less than 0.5, a negligible amount. Automatic computer vision HAL estimates were therefore comparable to manual frame-by-frame estimates. Practitioner Summary: Computer vision was used to automatically estimate exertion time, duty cycle and hand activity level from videos of workers performing industrial tasks.
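
    Duty cycle itself is simple to compute once per-cycle exertion times are available. A minimal sketch with hypothetical numbers; the video-based estimation of the exertion times is the hard part the algorithms above address.

        def duty_cycle(exertion_times, cycle_time):
            """Duty cycle (%): total exertion time within a work cycle over cycle time."""
            return 100.0 * sum(exertion_times) / cycle_time

        # e.g., three exertions totalling 1.5 s in a 5 s cycle -> 30.0
        print(duty_cycle([0.4, 0.8, 0.3], 5.0))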

  17. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event and data driven environment. A large grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic allocation of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.

  18. Recurrence quantification analysis of electroencephalograph signals during standard tasks of Waterloo-Stanford group scale of hypnotic susceptibility.

    PubMed

    Yargholi, Elahe'; Nasrabadi, Ali Motie

    2015-01-01

    The purpose of this study was to apply RQA (recurrence quantification analysis) to hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction while subjects were performing standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility. Recurrence quantifiers were then used to analyse the influence of hypnotic depth on the EEGs. The application of this method determined the capability of the tasks to distinguish subjects of different hypnotizability levels. In addition, medium-hypnotizable subjects showed the highest disposition to be induced by the hypnotist. Similarities between the brain dynamics governing tasks of the same type were also observed. The present study offers two notable innovations: investigating the EEGs of hypnotized subjects while they performed mental tasks of the Waterloo-Stanford Group Scale (WSGS), and applying RQA to hypnotic EEGs.
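
    To give a flavor of the quantifiers RQA produces, the sketch below computes the recurrence rate of a one-dimensional signal. It omits the time-delay embedding a full EEG analysis would use, and the threshold eps is an assumed parameter.

        import numpy as np

        def recurrence_rate(x, eps):
            """Fraction of off-diagonal point pairs closer than eps."""
            x = np.asarray(x, dtype=float)
            d = np.abs(x[:, None] - x[None, :])           # pairwise distances, 1-D embedding
            recurrent = d <= eps
            n = len(x)
            return (recurrent.sum() - n) / (n * (n - 1))  # exclude the main diagonal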

  19. Differences in muscle load between computer and non-computer work among office workers.

    PubMed

    Richter, J M; Mathiassen, S E; Slijper, H P; Over, E A B; Frens, M A

    2009-12-01

    Introduction of more non-computer tasks has been suggested to increase exposure variation and thus reduce musculoskeletal complaints (MSC) in computer-intensive office work. This study investigated whether muscle activity did, indeed, differ between computer and non-computer activities. Whole-day logs of input device use in 30 office workers were used to identify computer and non-computer work, using a range of classification thresholds (non-computer thresholds (NCTs)). Exposure during these activities was assessed by bilateral electromyography recordings from the upper trapezius and lower arm. Contrasts in muscle activity between computer and non-computer work were distinct but small, even at the individualised, optimal NCT. Using an average group-based NCT resulted in less contrast, even in smaller subgroups defined by job function or MSC. Thus, computer activity logs should be used cautiously as proxies of biomechanical exposure. Conventional non-computer tasks may have a limited potential to increase variation in muscle activity during computer-intensive office work.

  20. Effects of precision demands and mental pressure on muscle activation and hand forces in computer mouse tasks.

    PubMed

    Visser, Bart; De Looze, Michiel; De Graaff, Matthijs; Van Dieën, Jaap

    2004-02-05

    The objective of the present study was to gain insight into the effects of precision demands and mental pressure on the load of the upper extremity. Two computer mouse tasks were used: an aiming and a tracking task. Upper extremity loading was operationalized as the myo-electric activity of the wrist flexor and extensor and of the trapezius descendens muscles and the applied grip- and click-forces on the computer mouse. Performance measures, reflecting the accuracy in both tasks and the clicking rate in the aiming task, indicated that the levels of the independent variables resulted in distinguishable levels of accuracy and work pace. Precision demands had a small effect on upper extremity loading with a significant increase in the EMG-amplitudes (21%) of the wrist flexors during the aiming tasks. Precision had large effects on performance. Mental pressure had substantial effects on EMG-amplitudes with an increase of 22% in the trapezius when tracking and increases of 41% in the trapezius and 45% and 140% in the wrist extensors and flexors, respectively, when aiming. During aiming, grip- and click-forces increased by 51% and 40% respectively. Mental pressure had small effects on accuracy but large effects on tempo during aiming. Precision demands and mental pressure in aiming and tracking tasks with a computer mouse were found to coincide with increased muscle activity in some upper extremity muscles and increased force exertion on the computer mouse. Mental pressure caused significant effects on these parameters more often than precision demands. Precision and mental pressure were found to have effects on performance, with precision effects being significant for all performance measures studied and mental pressure effects for some of them. The results of this study suggest that precision demands and mental pressure increase upper extremity load, with mental pressure effects being larger than precision effects. The possible role of precision demands as an indirect mental stressor in working conditions is discussed.

  1. Translational Genomics Research Institute: Quantified Cancer Cell Line Encyclopedia (CCLE) RNA-seq Data | Office of Cancer Genomics

    Cancer.gov

    Many applications analyze quantified transcript-level abundances to make inferences. Having completed this computation across the large sample set, the CTD2 Center at the Translational Genomics Research Institute presents the quantified data in a straightforward, consolidated form for these types of analyses.

  2. Evaluating a Computerized Aid for Conducting a Cognitive Task Analysis

    DTIC Science & Technology

    2000-01-01

    in conducting a cognitive task analysis. The conduct of a cognitive task analysis is costly and labor intensive. As a result, a few computerized aids ... evaluation of a computerized aid, specifically CAT-HCI (Cognitive Analysis Tool - Human Computer Interface), for the conduct of a detailed cognitive task analysis.

  3. Taking Over Control From Highly Automated Vehicles in Complex Traffic Situations: The Role of Traffic Density.

    PubMed

    Gold, Christian; Körber, Moritz; Lechner, David; Bengler, Klaus

    2016-06-01

    The aim of this study was to quantify the impact of traffic density and verbal tasks on takeover performance in highly automated driving. In highly automated vehicles, the driver has to occasionally take over vehicle control when approaching system limits. To ensure safety, the ability of the driver to regain control of the driving task under various driving situations and different driver states needs to be quantified. Seventy-two participants experienced takeover situations requiring an evasive maneuver on a three-lane highway with varying traffic density (zero, 10, and 20 vehicles per kilometer). In a between-subjects design, half of the participants were engaged in a verbal 20-Questions Task, representing speaking on the phone while driving in a highly automated vehicle. The presence of traffic in takeover situations led to longer takeover times and worse takeover quality in the form of shorter time to collision and more collisions. The 20-Questions Task did not influence takeover time but seemed to have minor effects on the takeover quality. For the design and evaluation of human-machine interaction in takeover situations of highly automated vehicles, the traffic state seems to play a major role, compared to the driver state, manipulated by the 20-Questions Task. The present results can be used by developers of highly automated systems to appropriately design human-machine interfaces and to assess the driver's time budget for regaining control. © 2016, Human Factors and Ergonomics Society.

  4. Effects on Training Using Illumination in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian

    1999-01-01

    Camera based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high contrast shadowing and glare, are a factor in performance. Computer based training using virtual environments is a common tool used to make and keep crew members proficient. If computer based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer based training increases proficiency if one trains for a camera based task using computer generated virtual environments with enhanced lighting conditions such as shadows and glare rather than the color shaded computer images normally used in simulators. Previous experiments were conducted using a two degree of freedom docking system. Test subjects had to align a boresight camera using a hand controller with one axis of translation and one axis of rotation. Two sets of subjects were trained on two computer simulations using computer generated virtual environments, one with simulated lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.

  5. Toward quantifying the abuse liability of ultraviolet tanning: A behavioral economic approach to tanning addiction.

    PubMed

    Reed, Derek D; Kaplan, Brent A; Becirevic, Amel; Roma, Peter G; Hursh, Steven R

    2016-07-01

    Many adults engage in ultraviolet indoor tanning despite evidence of its association with skin cancer. The constellation of behaviors associated with ultraviolet indoor tanning is analogous to that in other behavioral addictions. Despite a growing literature on ultraviolet indoor tanning as an addiction, there remains no consensus on how to identify ultraviolet indoor tanning addictive tendencies. The purpose of the present study was to translate a behavioral economic task more commonly used in substance abuse to quantify the "abuse liability" of ultraviolet indoor tanning, establish construct validity, and determine convergent validity with the most commonly used diagnostic tools for ultraviolet indoor tanning addiction (i.e., mCAGE and mDSM-IV-TR). We conducted a between-groups study using a novel hypothetical Tanning Purchase Task to quantify intensity and elasticity of ultraviolet indoor tanning demand and permit statistical comparisons with the mCAGE and mDSM-IV-TR. Results suggest that behavioral economic demand is related to ultraviolet indoor tanning addiction status and adequately discriminates between potential addicted individuals from nonaddicted individuals. Moreover, we provide evidence that the Tanning Purchase Task renders behavioral economic indicators that are relevant to public health research. The present findings are limited to two ultraviolet indoor tanning addiction tools and a relatively small sample of high-risk ultraviolet indoor tanning users; however, these pilot data demonstrate the potential for behavioral economic assessment tools as diagnostic and research aids in ultraviolet indoor tanning addiction studies. © 2016 Society for the Experimental Analysis of Behavior.
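
    The demand indicators named above (intensity and elasticity) are conventionally obtained by fitting an exponential demand curve to purchase-task data. A minimal sketch using the Hursh-Silberberg exponential demand equation; the price and consumption values are hypothetical, and the range constant k is fixed at an assumed value.

        import numpy as np
        from scipy.optimize import curve_fit

        K = 2.0  # assumed range constant

        def log_demand(price, q0, alpha):
            # Hursh & Silberberg: log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * C) - 1)
            return np.log10(q0) + K * (np.exp(-alpha * q0 * price) - 1.0)

        price = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])        # hypothetical prices
        consumption = np.array([10.0, 9.0, 8.0, 6.0, 3.0, 1.0])  # sessions purchased

        (q0, alpha), _ = curve_fit(log_demand, price, np.log10(consumption), p0=[10.0, 0.01])
        print(f"demand intensity Q0 = {q0:.1f}, elasticity alpha = {alpha:.4f}")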

  6. An Execution Service for Grid Computing

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Hu, Chaumin

    2004-01-01

    This paper describes the design and implementation of the IPG Execution Service that reliably executes complex jobs on a computational grid. Our Execution Service is part of the IPG service architecture whose goal is to support location-independent computing. In such an environment, once a user ports an application to one or more hardware/software platforms, the user can describe this environment to the grid; the grid can locate instances of this platform, configure the platform as required for the application, and then execute the application. Our Execution Service runs jobs that set up such environments for applications and executes them. These jobs consist of a set of tasks for executing applications and managing data. The tasks have user-defined starting conditions that allow users to specify complex dependencies, including tasks to execute when tasks fail, a frequent occurrence in a large distributed system, or are cancelled. The execution task provided by our service also configures the application environment exactly as specified by the user and captures the exit code of the application, features that many grid execution services do not support due to difficulties interfacing to local scheduling systems.

  7. Methodology development for evaluation of selective-fidelity rotorcraft simulation

    NASA Technical Reports Server (NTRS)

    Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel

    1992-01-01

    This paper addressed the initial step toward the goal of establishing performance and handling qualities acceptance criteria for realtime rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is then classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique, but also provides a technique to determine the required levels of subsystem fidelity for a specific task.

  8. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology

    PubMed Central

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experiment task mimics route-tracing when using a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when people are instructed to move them intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this. PMID:28777822

  9. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology.

    PubMed

    Demšar, Urška; Çöltekin, Arzu

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experiment task mimics route-tracing when using a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when people are instructed to move them intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
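
    To illustrate what an interaction level between two synchronized trajectories can look like, the sketch below computes a crude proximity index between gaze and mouse samples. This is an assumed simplification for illustration only, not the space-time density method the authors introduce; the pixel threshold delta is arbitrary.

        import numpy as np

        def proximity_interaction(gaze_xy, mouse_xy, delta=50.0):
            """Fraction of synchronized samples with gaze and pointer within delta pixels."""
            d = np.linalg.norm(np.asarray(gaze_xy) - np.asarray(mouse_xy), axis=1)
            return float((d <= delta).mean())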

  10. Quantum information processing by a continuous Maxwell demon

    NASA Astrophysics Data System (ADS)

    Stevens, Josey; Deffner, Sebastian

    Quantum computing is believed to be fundamentally superior to classical computing; however, quantifying the specific thermodynamic advantage has been elusive. Experimentally motivated, we generalize previous minimal models of discrete demons to continuous state space. Analyzing our model allows one to quantify the thermodynamic resources necessary to process quantum information. By further invoking the semi-classical limit, we compare the quantum demon with its classical analogue. Finally, this model also serves as a starting point to study open quantum systems.

  11. Implementation of a Message Passing Interface into a Cloud-Resolving Model for Massively Parallel Computing

    NASA Technical Reports Server (NTRS)

    Juang, Hann-Ming Henry; Tao, Wei-Kuo; Zeng, Xi-Ping; Shie, Chung-Lin; Simpson, Joanne; Lang, Steve

    2004-01-01

    The capability for massively parallel programming (MPP) using a message passing interface (MPI) has been implemented into a three-dimensional version of the Goddard Cumulus Ensemble (GCE) model. The design for the MPP with MPI uses the concept of maintaining similar code structure between the whole domain as well as the portions after decomposition. Hence the model follows the same integration for single and multiple tasks (CPUs). Also, it provides for minimal changes to the original code, so it is easily modified and/or managed by the model developers and users who have little knowledge of MPP. The entire model domain can be sliced into a one- or two-dimensional decomposition with a halo regime, which is overlaid on partial domains. The halo regime requires that no data be fetched across tasks during the computational stage, but it must be updated before the next computational stage through data exchange via MPI. For reproducibility, transposing data among tasks is required for the spectral transform (Fast Fourier Transform, FFT), which is used in the anelastic version of the model for solving the pressure equation. The performance of the MPI-implemented codes (i.e., the compressible and anelastic versions) was tested on three different computing platforms. The major results are: 1) both versions have speedups of about 99% up to 256 tasks but not for 512 tasks; 2) the anelastic version has better speedup and efficiency because it requires more computations than the compressible version; 3) equal or approximately equal numbers of slices between the x- and y-directions provide the fastest integration due to fewer data exchanges; and 4) one-dimensional slices in the x-direction result in the slowest integration due to the need for more memory relocation for computation.
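
    The halo-exchange pattern described here (compute on a partial domain, then update halos via MPI before the next stage) can be sketched in a few lines. A minimal illustration using mpi4py and a one-dimensional decomposition; the array size and periodic neighbours are assumptions of the example, not details of the GCE implementation.

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each task owns n interior cells plus one halo cell on each side.
        n = 8
        local = np.full(n + 2, float(rank))
        left, right = (rank - 1) % size, (rank + 1) % size  # periodic neighbours

        # Update halos via MPI before the next computational stage.
        comm.Sendrecv(sendbuf=local[1:2], dest=left, recvbuf=local[-1:], source=right)
        comm.Sendrecv(sendbuf=local[-2:-1], dest=right, recvbuf=local[0:1], source=left)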

  12. NSI security task: Overview

    NASA Technical Reports Server (NTRS)

    Tencati, Ron

    1991-01-01

    An overview is presented of the NASA Science Internet (NSI) security task. The task includes the following: policies and security documentation; risk analysis and management; computer emergency response team; incident handling; toolkit development; user consulting; and working groups, conferences, and committees.

  13. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals

    NASA Astrophysics Data System (ADS)

    Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.
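
    For concreteness, the quantities involved reduce to a few lines of arithmetic. A minimal sketch using the Shannon formulation of the index of difficulty (the original Fitts form log2(2D/W) is an alternative); the distance, width, and movement-time values are hypothetical.

        import numpy as np

        def index_of_difficulty(distance, width):
            # Shannon formulation: ID = log2(D / W + 1), in bits
            return np.log2(distance / width + 1.0)

        def information_transfer_rate(distance, width, movement_time):
            """Bits per second for a single target acquisition."""
            return index_of_difficulty(distance, width) / movement_time

        # e.g., a 12 cm movement to a 3 cm target acquired in 1.6 s -> ~1.45 bits/s
        print(information_transfer_rate(12.0, 3.0, 1.6))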

  14. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.

  15. A neuronal model of a global workspace in effortful cognitive tasks.

    PubMed

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  16. Characterizing and Mitigating Work Time Inflation in Task Parallel Programs

    DOE PAGES

    Olivier, Stephen L.; de Supinski, Bronis R.; Schulz, Martin; ...

    2013-01-01

    Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation – additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.

  17. Location and acquisition of objects in unpredictable locations. [a teleoperator system with a computer for manipulator control

    NASA Technical Reports Server (NTRS)

    Sword, A. J.; Park, W. T.

    1975-01-01

    A teleoperator system with a computer for manipulator control to combine the capabilities of both man and computer to accomplish a task is described. This system allows objects in unpredictable locations to be successfully located and acquired. By using a method of characterizing the work-space together with man's ability to plan a strategy and coarsely locate an object, the computer is provided with enough information to complete the tedious part of the task. In addition, the use of voice control is shown to be a useful component of the man/machine interface.

  18. Adaptive neuron-to-EMG decoder training for FES neuroprostheses

    NASA Astrophysics Data System (ADS)

    Ethier, Christian; Acuna, Daniel; Solla, Sara A.; Miller, Lee E.

    2016-08-01

    Objective. We have previously demonstrated a brain-machine interface neuroprosthetic system that provided continuous control of functional electrical stimulation (FES) and restoration of grasp in a primate model of spinal cord injury (SCI). Predicting intended EMG directly from cortical recordings provides a flexible high-dimensional control signal for FES. However, no peripheral signal such as force or EMG is available for training EMG decoders in paralyzed individuals. Approach. Here we present a method for training an EMG decoder in the absence of muscle activity recordings; the decoder relies on mapping behaviorally relevant cortical activity to the inferred EMG activity underlying an intended action. Monkeys were trained on a 2D isometric wrist force task, controlling a computer cursor by applying force in the flexion, extension, ulnar, and radial directions to execute a center-out task. We used a generic muscle force-to-endpoint force model based on muscle pulling directions to relate each target force to an optimal EMG pattern that attained the target force while minimizing overall muscle activity. We trained EMG decoders during the target hold periods using a gradient descent algorithm that compared EMG predictions to optimal EMG patterns. Main results. We tested this method both offline and online. We quantified both the accuracy of offline force predictions and the ability of a monkey to use these real-time force predictions for closed-loop cursor control. We compared both offline and online results to those obtained with several other direct force decoders, including an optimal decoder computed from concurrently measured neural and force signals. Significance. This novel approach to training an adaptive EMG decoder could make a brain-controlled FES neuroprosthesis an effective tool to restore the hand function of paralyzed individuals. Clinical implementation would make use of individualized EMG-to-force models. Broad generalization could be achieved by including data from multiple grasping tasks in the training of the neuron-to-EMG decoder. Our approach would make it possible for persons with SCI to grasp objects with their own hands, using near-normal motor intent.
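
    The decoder-training idea (nudge predicted EMG toward the optimal patterns by gradient descent) can be illustrated with a linear decoder on synthetic data. The dimensions, learning rate, and iteration count below are assumptions for the sketch; a real system would use recorded firing rates and the optimal EMG patterns derived from the muscle model described above.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-ins: 100 neural channels, 8 muscles, 5000 hold-period samples
        X = rng.standard_normal((5000, 100))   # firing-rate features
        emg_opt = rng.random((5000, 8))        # optimal EMG patterns (targets)

        W = np.zeros((100, 8))                 # linear neuron-to-EMG decoder
        lr = 0.1
        for _ in range(200):                   # gradient descent on squared error
            err = X @ W - emg_opt
            W -= lr * (X.T @ err) / len(X)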

  19. A Movement Monitor Based on Magneto-Inertial Sensors for Non-Ambulant Patients with Duchenne Muscular Dystrophy: A Pilot Study in Controlled Environment.

    PubMed

    Le Moing, Anne-Gaëlle; Seferian, Andreea Mihaela; Moraux, Amélie; Annoussamy, Mélanie; Dorveaux, Eric; Gasnier, Erwan; Hogrel, Jean-Yves; Voit, Thomas; Vissière, David; Servais, Laurent

    2016-01-01

    Measurement of muscle strength and activity of the upper limbs of non-ambulant patients with neuromuscular diseases is a major challenge. ActiMyo® is an innovative device that uses magneto-inertial sensors to record angular velocities and linear accelerations and that can be used over long periods of time in the home environment. The device was designed to ensure long-term stability and a good signal-to-noise ratio, even for very weak movements. In order to determine clinically relevant variables with potential for use as outcome measures in clinical trials or to guide therapy decisions, we performed a pilot study in non-ambulant neuromuscular patients. We report here data from seven Duchenne Muscular Dystrophy (DMD) patients (mean age 18.5 ± 5.5 years) collected in a clinical setting. Patients were assessed while wearing the device during performance of validated tasks (MoviPlate, Box and Block test and Minnesota test) and tasks mimicking daily living. The ActiMyo® sensors were placed on the wrists during all the tests. Software designed for use with the device computed several variables to qualify and quantify muscular activity in the non-ambulant subjects. Four variables representative of upper limb activity were studied: the rotation rate, the ratio of the vertical component in the overall acceleration, the hand elevation rate, and an estimate of the power of the upper limb. The correlations between clinical data and physical activity and the ActiMyo® movement parameters were analyzed. The mean rotation rate and mean elevation rate appeared promising, since these variables had the best reliability scores and correlations with task scores. Parameters could be computed even in a patient with a Brooke functional score of 6. The variables chosen are good candidates as potential outcome measures in non-ambulant patients with Duchenne Muscular Dystrophy, and use of the ActiMyo® is currently being explored in the home environment. ClinicalTrials.gov NCT01611597.

  20. Task-Based Oral Computer-Mediated Communication and L2 Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Yanguas, Inigo

    2012-01-01

    The present study adds to the computer-mediated communication (CMC) literature by exploring oral learner-to-learner interaction using Skype, a free and widely used Internet software program. In particular, this task-based study has a two-fold goal. Firstly, it explores possible differences between two modes of oral CMC (audio and video) and…

  1. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    ERIC Educational Resources Information Center

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  2. Energy and Power Aware Computing Through Management of Computational Entropy

    DTIC Science & Technology

    2008-01-01

    2.4.1 ACIP living framework forum task ... This research focused on two sub-tasks: (1) Assessing the need and planning for a potential “Living Framework Forum” (LFF) software architecture ... probabilistic switching with plausible device realizations to save energy in our patent application [35]. In [35], we showed an introverted switch in

  3. Computer-Mediated Training Tools to Enhance Joint Task Force Cognitive Leadership Skills

    DTIC Science & Technology

    2007-04-01

    Only report-documentation and table-of-contents fragments are available for this record; recoverable headings include a sample gaming platform (Decisive Action for Training), performance metrics, and an automated performance measurement system.

  4. The Modulation of Visual and Task Characteristics of a Writing System on Hemispheric Lateralization in Visual Word Recognition--A Computational Exploration

    ERIC Educational Resources Information Center

    Hsiao, Janet H.; Lam, Sze Man

    2013-01-01

    Through computational modeling, here we examine whether visual and task characteristics of writing systems alone can account for lateralization differences in visual word recognition between different languages without assuming influence from left hemisphere (LH) lateralized language processes. We apply a hemispheric processing model of face…

  5. Web-Based Seamless Migration for Task-Oriented Mobile Distance Learning

    ERIC Educational Resources Information Center

    Zhang, Degan; Li, Yuan-chao; Zhang, Huaiyu; Zhang, Xinshang; Zeng, Guangping

    2006-01-01

    As a new kind of computing paradigm, pervasive computing will meet the requirement that anybody may obtain services anywhere and at any time; task-oriented seamless migration is one of its applications. The function of seamless mobility is clearly suitable for mobile services, such as mobile Web-based learning. In this…

  6. Using Higher Order Computer Tasks with Disadvantaged Students.

    ERIC Educational Resources Information Center

    Anderson, Neil

    A pilot program initially designed for a 12-year-old girl with mild to moderate intellectual disabilities in higher order computer tasks was developed for a larger group of students with similar disabilities enrolled in fifth and sixth grades (ages 9-12) at three different schools. An examination of the original pilot study was undertaken to…

  7. The Effects of Computer-Mediated Synchronous and Asynchronous Direct Corrective Feedback on Writing: A Case Study

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2016-01-01

    This case study investigated the characteristics of computer-mediated synchronous corrective feedback (SCF, provided while students wrote) and asynchronous corrective feedback (ACF, provided after students had finished writing) in an EFL writing task. The task, designed to elicit the use of the hypothetical conditional, was completed by two…

  8. Computer-mediated communication and time pressure induce higher cardiovascular responses in the preparatory and execution phases of cooperative tasks.

    PubMed

    Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia

    2010-11-01

    The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and with different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.

  9. On the usage of ultrasound computational models for decision making under ambiguity

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.

  10. Development and preliminary reliability of a multitasking assessment for executive functioning after concussion.

    PubMed

    Smith, Laurel B; Radomski, Mary Vining; Davidson, Leslie Freeman; Finkelstein, Marsha; Weightman, Margaret M; McCulloch, Karen L; Scherer, Matthew R

    2014-01-01

    OBJECTIVES. Executive functioning deficits may result from concussion. The Charge of Quarters (CQ) Duty Task is a multitask assessment designed to assess executive functioning in servicemembers after concussion. In this article, we discuss the rationale and process used in the development of the CQ Duty Task and present pilot data from the preliminary evaluation of interrater reliability (IRR). METHOD. Three evaluators observed as 12 healthy participants performed the CQ Duty Task and measured performance using various metrics. Intraclass correlation coefficient (ICC) quantified IRR. RESULTS. The ICC for task completion was .94. ICCs for other assessment metrics were variable. CONCLUSION. Preliminary IRR data for the CQ Duty Task are encouraging, but further investigation is needed to improve IRR in some domains. Lessons learned in the development of the CQ Duty Task could benefit future test development efforts with populations other than the military. Copyright © 2014 by the American Occupational Therapy Association, Inc.

  11. Development and Preliminary Reliability of a Multitasking Assessment for Executive Functioning After Concussion

    PubMed Central

    Radomski, Mary Vining; Davidson, Leslie Freeman; Finkelstein, Marsha; Weightman, Margaret M.; McCulloch, Karen L.; Scherer, Matthew R.

    2014-01-01

    OBJECTIVES. Executive functioning deficits may result from concussion. The Charge of Quarters (CQ) Duty Task is a multitask assessment designed to assess executive functioning in servicemembers after concussion. In this article, we discuss the rationale and process used in the development of the CQ Duty Task and present pilot data from the preliminary evaluation of interrater reliability (IRR). METHOD. Three evaluators observed as 12 healthy participants performed the CQ Duty Task and measured performance using various metrics. Intraclass correlation coefficient (ICC) quantified IRR. RESULTS. The ICC for task completion was .94. ICCs for other assessment metrics were variable. CONCLUSION. Preliminary IRR data for the CQ Duty Task are encouraging, but further investigation is needed to improve IRR in some domains. Lessons learned in the development of the CQ Duty Task could benefit future test development efforts with populations other than the military. PMID:25005507
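
    For reference, the single-measure, two-way random-effects ICC used in IRR studies like this one can be computed directly from the ratings matrix. A minimal sketch following the Shrout-Fleiss ICC(2,1) formulas; the subjects-by-raters layout is an assumption of the example.

        import numpy as np

        def icc_2_1(ratings):
            """ICC(2,1): two-way random effects, absolute agreement, single measure."""
            x = np.asarray(ratings, dtype=float)   # rows: subjects, columns: raters
            n, k = x.shape
            grand = x.mean()
            row_m, col_m = x.mean(axis=1), x.mean(axis=0)
            msr = k * ((row_m - grand) ** 2).sum() / (n - 1)   # between subjects
            msc = n * ((col_m - grand) ** 2).sum() / (k - 1)   # between raters
            mse = ((x - row_m[:, None] - col_m[None, :] + grand) ** 2).sum() \
                  / ((n - 1) * (k - 1))                        # residual
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)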

  12. Task-specific image partitioning.

    PubMed

    Kim, Sungwoong; Nowozin, Sebastian; Kohli, Pushmeet; Yoo, Chang D

    2013-02-01

    Image partitioning is an important preprocessing step for many of the state-of-the-art algorithms used for performing high-level computer vision tasks. Typically, partitioning is conducted without regard to the task at hand. We propose a task-specific image partitioning framework to produce a region-based image representation that leads to higher task performance than that reached using task-oblivious partitioning frameworks or the few existing supervised partitioning frameworks. The proposed method partitions the image by means of correlation clustering, maximizing a linear discriminant function defined over a superpixel graph. The parameters of the discriminant function that define task-specific similarity/dissimilarity among superpixels are estimated based on structured support vector machine (S-SVM) using task-specific training data. The S-SVM learning leads to a better generalization ability while the construction of the superpixel graph used to define the discriminant function allows a rich set of features to be incorporated to improve discriminability and robustness. We evaluate the learned task-aware partitioning algorithms on three benchmark datasets. Results show that task-aware partitioning leads to better labeling performance than the partitioning computed by the state-of-the-art general-purpose and supervised partitioning algorithms. We believe that the task-specific image partitioning paradigm is widely applicable to improving performance in high-level image understanding tasks.

  13. Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors

    DTIC Science & Technology

    2012-05-02

    function of Legendre-type on int(dom S) [29]. From (7) the following properties of dφ(x, y) are apparent: strict convexity in x; asymmetry; non ... tensor imaging. An important task in all of these applications is to compute the distance between covariance matrices using a (dis)similarity function, for which the natural
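
    For symmetric positive definite matrices X and Y, the divergence named in the title has the compact closed form JBLD(X, Y) = log det((X + Y)/2) - (1/2) log det(XY). A minimal sketch:

        import numpy as np

        def jbld(x, y):
            """Jensen-Bregman LogDet divergence between SPD matrices x and y."""
            _, ld_mid = np.linalg.slogdet((x + y) / 2.0)  # log det of the midpoint
            _, ld_x = np.linalg.slogdet(x)
            _, ld_y = np.linalg.slogdet(y)
            return ld_mid - 0.5 * (ld_x + ld_y)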

  14. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.

  15. Achievement Goals in a Presentation Task: Performance Expectancy, Achievement Goals, State Anxiety, and Task Performance

    ERIC Educational Resources Information Center

    Tanaka, Ayumi; Takehara, Takuma; Yamauchi, Hirotsugu

    2006-01-01

    The aims of the study were to test the linkages between achievement goals and task performance, as mediated by state anxiety arousal. Performance expectancy was also examined as an antecedent of achievement goals. A presentation task in a computer practice class was used as the achievement task. Fifty-three undergraduates (37 females and 16 males) were…

  16. Sort-Mid tasks scheduling algorithm in grid computing.

    PubMed

    Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M

    2015-11-01

    Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. Many researchers aim to develop scheduling-algorithm variants that approach optimality, and these have shown good task-scheduling performance with respect to resource selection. However, using the full power of the resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The first step is to obtain the average value of each task's sorted list of completion times. Then, the maximum average is obtained. Finally, the task with the maximum average is allocated to the machine with the minimum completion time. The allocated task is removed, and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms most other algorithms in terms of resource utilization and makespan.

  17. Sort-Mid tasks scheduling algorithm in grid computing

    PubMed Central

    Reda, Naglaa M.; Tawfik, A.; Marzok, Mohamed A.; Khamis, Soheir M.

    2014-01-01

    Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. Many researchers aim to develop scheduling-algorithm variants that approach optimality, and these have shown good task-scheduling performance with respect to resource selection. However, using the full power of the resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The first step is to obtain the average value of each task's sorted list of completion times. Then, the maximum average is obtained. Finally, the task with the maximum average is allocated to the machine with the minimum completion time. The allocated task is removed, and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms most other algorithms in terms of resource utilization and makespan. PMID:26644937
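
    The steps in the abstract translate into a short heuristic loop. The sketch below is one plausible reading, assuming an expected-completion-time table etc[task][machine] and accounting for machine ready times; it is an illustration, not the authors' reference implementation.

        def sort_mid(etc):
            """etc: {task: {machine: execution_time}} for all tasks and machines."""
            etc = {t: dict(ms) for t, ms in etc.items()}
            ready = {m: 0.0 for m in next(iter(etc.values()))}
            schedule = {}
            while etc:
                # Average completion time (ready time + execution time) per task
                avg = {t: sum(ready[m] + c for m, c in ms.items()) / len(ms)
                       for t, ms in etc.items()}
                task = max(avg, key=avg.get)   # task with the maximum average
                machine = min(etc[task], key=lambda m: ready[m] + etc[task][m])
                ready[machine] += etc[task][machine]
                schedule[task] = machine
                del etc[task]                  # repeat until all tasks are allocated
            return schedule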

  18. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.

    PubMed

    Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel

    2016-08-30

    Distributed Computing has achieved tremendous development since cloud computing was proposed in 2006 and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were analyzed in depth, and a detailed analysis report was made. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
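
    As a rough illustration of two-phase prediction, the sketch below fits a linear model to late progress samples and extrapolates it to completion. The phase split and the sample values are assumptions; the paper's actual TPR method is more elaborate.

        import numpy as np

        def tpr_finish_time(t, progress, split=0.5):
            """Extrapolate a task's finishing time from (time, progress) samples."""
            t, progress = np.asarray(t, float), np.asarray(progress, float)
            late = progress >= split               # second-phase samples
            if late.sum() < 2:                     # fall back to a single phase
                late = np.ones_like(progress, dtype=bool)
            slope, intercept = np.polyfit(progress[late], t[late], 1)
            return slope * 1.0 + intercept         # predicted time at 100% progress

        # e.g., a task that slows after its first phase (hypothetical samples)
        print(tpr_finish_time([0, 10, 20, 35, 55], [0.0, 0.25, 0.5, 0.75, 0.9]))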

  19. Summary of synfuel characterization and combustion studies

    NASA Technical Reports Server (NTRS)

    Schultz, D. F.

    1983-01-01

    Combustion component research studies aimed at evolving environmentally acceptable approaches for burning coal-derived fuels for ground power applications were performed at the NASA Lewis Research Center under a program titled the "Critical Research and Support Technology Program" (CRT). The work was funded by the Department of Energy and was performed in four tasks. This report summarizes these tasks, which have all been previously reported; in addition, some previously unreported data from Task 4 are presented. The first, Task 1, consisted of a literature survey aimed at determining the properties of synthetic fuels. This was followed by a computer modeling effort, Task 2, to predict the exhaust emissions resulting from burning coal liquids by various combustion techniques such as lean and rich-lean combustion. The computer predictions were then compared to the results of a flame tube rig, Task 3, in which the fuel properties were varied to simulate coal liquids. Two actual SRC 2 coal liquids were tested in this flame tube task.

  20. Academic physicians' assessment of the effects of computers on health care.

    PubMed Central

    Detmer, W. M.; Friedman, C. P.

    1994-01-01

    We assessed the attitudes of academic physicians towards computers in health care at two academic medical centers that are in the early stages of clinical information-system deployment. We distributed a 4-page questionnaire to 470 subjects, and a total of 272 physicians (58%) responded. Our results show that respondents use computers frequently, primarily to perform academic-oriented tasks as opposed to clinical tasks. Overall, respondents viewed computers as being slightly beneficial to health care. They perceive self-education and access to up-to-date information as the most beneficial aspects of computers and are most concerned about privacy issues and the effect of computers on the doctor-patient relationship. Physicians with prior computer training and greater knowledge of informatics concepts had more favorable attitudes towards computers in health care. We suggest that negative attitudes towards computers can be addressed by careful system design as well as targeted educational activities. PMID:7949990
