ERIC Educational Resources Information Center
Groff, Warren H.
As our society evolves from an industrial society to a computer literate, high technology, information society, educational planners must reexamine the role of postsecondary education in economic development and in intellectual capital formation. In response to this need, a task force on high technology was established to examine the following…
Creating a New Model Curriculum: A Rationale for "Computing Curricula 1990".
ERIC Educational Resources Information Center
Bruce, Kim B.
1991-01-01
Describes a model for the design of undergraduate curricula in the discipline of computing that was developed by the ACM/IEEE (Association for Computing Machinery/Institute of Electrical and Electronics Engineers) Computer Society Joint Curriculum Task Force. Institutional settings and structures in which computing degrees are awarded are…
Siegel, Marilyn J; Kaza, Ravi K; Bolus, David N; Boll, Daniel T; Rofsky, Neil M; De Cecco, Carlo N; Foley, W Dennis; Morgan, Desiree E; Schoepf, U Joseph; Sahani, Dushyant V; Shuman, William P; Vrtiska, Terri J; Yeh, Benjamin M; Berland, Lincoln L
This is the first of a series of 4 white papers that represent Expert Consensus Documents developed by the Society of Computed Body Tomography and Magnetic Resonance through its task force on dual-energy computed tomography (DECT). This article, part 1, describes the fundamentals of the physical basis for DECT and the technology of DECT and proposes uniform nomenclature to account for differences in proprietary terms among manufacturers.
Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia
2010-11-01
The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and with different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.
Strategy generalization across orientation tasks: testing a computational cognitive model.
Gunzelmann, Glenn
2008-07-08
Humans use their spatial information processing abilities flexibly to facilitate problem solving and decision making in a variety of tasks. This article explores the question of whether a general strategy can be adapted for performing two different spatial orientation tasks by testing the predictions of a computational cognitive model. Human performance was measured on an orientation task requiring participants to identify the location of a target either on a map (find-on-map) or within an egocentric view of a space (find-in-scene). A general strategy instantiated in a computational cognitive model of the find-on-map task, based on the results from Gunzelmann and Anderson (2006), was adapted to perform both tasks and used to generate performance predictions for a new study. The qualitative fit of the model to the human data supports the view that participants were able to tailor a general strategy to the requirements of particular spatial tasks. The quantitative differences between the predictions of the model and the performance of human participants in the new experiment expose individual differences in sample populations. The model provides a means of accounting for those differences and a framework for understanding how human spatial abilities are applied to naturalistic spatial tasks that involve reasoning with maps. 2008 Cognitive Science Society, Inc.
Postural dynamism during computer mouse and keyboard use: A pilot study.
Van Niekerk, S M; Fourie, S M; Louw, Q A
2015-09-01
Prolonged sedentary computer use is a risk factor for musculoskeletal pain. The aim of this study was to explore postural dynamism during two common computer tasks, namely mouse use and keyboard typing. Postural dynamism was described as the total number of postural changes that occurred during the data capture period. Twelve participants were recruited to perform a mouse and a typing task. The data of only eight participants could be analysed. A 3D motion analysis system measured the number of cervical and thoracic postural changes as well as the range in which the postural changes occurred. The study findings illustrate that there is less postural dynamism of the cervical and thoracic spinal regions during computer mouse use when compared to keyboard typing. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
A computational and neural model of momentary subjective well-being
Rutledge, Robb B.; Skandali, Nikolina; Dayan, Peter; Dolan, Raymond J.
2014-01-01
The subjective well-being or happiness of individuals is an important metric for societies. Although happiness is influenced by life circumstances and population demographics such as wealth, we know little about how the cumulative influence of daily life events is aggregated into subjective feelings. Using computational modeling, we show that emotional reactivity in the form of momentary happiness in response to outcomes of a probabilistic reward task is explained not by current task earnings, but by the combined influence of recent reward expectations and prediction errors arising from those expectations. The robustness of this account was evident in a large-scale replication involving 18,420 participants. Using functional MRI, we show that the very same influences account for task-dependent striatal activity in a manner akin to the influences underpinning changes in happiness. PMID:25092308
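The model described here has a compact form: momentary happiness is a baseline plus exponentially forgotten sums of certain rewards, gamble expected values, and reward prediction errors. Below is a minimal sketch of that computation; the weights and forgetting factor are illustrative (in the published work they are fit to each participant's ratings), and the trial history is invented for the example.

```python
import numpy as np

def momentary_happiness(w0, w_cr, w_ev, w_rpe, gamma, CR, EV, RPE):
    """Happiness after trial t as a baseline plus exponentially
    discounted sums of certain rewards (CR), chosen-gamble expected
    values (EV), and reward prediction errors (RPE = outcome - EV)."""
    t = len(CR)
    decay = gamma ** np.arange(t - 1, -1, -1)  # gamma^(t-j) for j = 1..t
    return (w0
            + w_cr * np.dot(decay, CR)
            + w_ev * np.dot(decay, EV)
            + w_rpe * np.dot(decay, RPE))

# Illustrative history: certain reward on trial 1, gambles on trials 2-3.
CR  = [20.0, 0.0, 0.0]
EV  = [0.0, 15.0, 15.0]
RPE = [0.0, 30.0 - 15.0, 0.0 - 15.0]   # win of 30, then loss of 0
print(momentary_happiness(70.0, 0.5, 0.3, 0.6, 0.8, CR, EV, RPE))
```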
Tutorial on Generalized Programming Languages and Systems. Instructor Edition.
ERIC Educational Resources Information Center
Fasana, Paul J., Ed.; Shank, Russell, Ed.
This instructor's manual is a comparative analysis and review of the various computer programming languages currently available and their capabilities for performing text manipulation, information storage, and data retrieval tasks. Based on materials presented at the 1967 Convention of the American Society for Information Science, the manual…
Funder, John W; Carey, Robert M; Mantero, Franco; Murad, M Hassan; Reincke, Martin; Shibata, Hirotaka; Stowasser, Michael; Young, William F
2016-05-01
To develop clinical practice guidelines for the management of patients with primary aldosteronism. The Task Force included a chair, selected by the Clinical Guidelines Subcommittee of the Endocrine Society, six additional experts, a methodologist, and a medical writer. The guideline was cosponsored by American Heart Association, American Association of Endocrine Surgeons, European Society of Endocrinology, European Society of Hypertension, International Association of Endocrine Surgeons, International Society of Endocrinology, International Society of Hypertension, Japan Endocrine Society, and The Japanese Society of Hypertension. The Task Force received no corporate funding or remuneration. We searched for systematic reviews and primary studies to formulate the key treatment and prevention recommendations. We used the Grading of Recommendations, Assessment, Development, and Evaluation group criteria to describe both the quality of evidence and the strength of recommendations. We used "recommend" for strong recommendations and "suggest" for weak recommendations. We achieved consensus by collecting the best available evidence and conducting one group meeting, several conference calls, and multiple e-mail communications. With the help of a medical writer, the Endocrine Society's Clinical Guidelines Subcommittee, Clinical Affairs Core Committee, and Council successfully reviewed the drafts prepared by the Task Force. We placed the version approved by the Clinical Guidelines Subcommittee and Clinical Affairs Core Committee on the Endocrine Society's website for comments by members. At each stage of review, the Task Force received written comments and incorporated necessary changes. For high-risk groups of hypertensive patients and those with hypokalemia, we recommend case detection of primary aldosteronism by determining the aldosterone-renin ratio under standard conditions and recommend that a commonly used confirmatory test should confirm/exclude the condition. We recommend that all patients with primary aldosteronism undergo adrenal computed tomography as the initial study in subtype testing and to exclude adrenocortical carcinoma. We recommend that an experienced radiologist should establish/exclude unilateral primary aldosteronism using bilateral adrenal venous sampling, and if confirmed, this should optimally be treated by laparoscopic adrenalectomy. We recommend that patients with bilateral adrenal hyperplasia or those unsuitable for surgery should be treated primarily with a mineralocorticoid receptor antagonist.
Computerized Education in a Low Birth Rate Society.
ERIC Educational Resources Information Center
Asimov, Isaac
1979-01-01
In Asimov's scenario of the future, the percentage of older people will increase and computer technology will relieve humanity of many tasks. The concept of education as a lifelong, natural, and enjoyable process will thus become paramount, and university extension and continuing education institutions will be challenged to meet the needs of an…
Narrowing the scope of failure prediction using targeted fault load injection
NASA Astrophysics Data System (ADS)
Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.
2018-05-01
As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.
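The prediction step the abstract alludes to can be framed as supervised classification over labeled telemetry windows collected under injected fault loads. A hedged sketch of that framing, with entirely hypothetical feature names and synthetic data standing in for the fault-injection logs:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical telemetry windows: [cpu, memory, handle_count, queue_len]
healthy = rng.normal([0.3, 0.4, 200, 5], [0.1, 0.1, 30, 2], size=(500, 4))
faulted = rng.normal([0.7, 0.8, 600, 40], [0.2, 0.15, 120, 15], size=(60, 4))

X = np.vstack([healthy, faulted])
y = np.array([0] * len(healthy) + [1] * len(faulted))  # 1 = pre-failure

# Class weighting matters because, as the abstract notes, failure is rare.
clf = RandomForestClassifier(n_estimators=100, class_weight="balanced")
print("mean F1:", cross_val_score(clf, X, y, cv=5, scoring="f1").mean())
```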
Allen Newell's Program of Research: The Video-Game Test.
Gobet, Fernand
2017-04-01
Newell (1973) argued that progress in psychology was slow because research focused on experiments trying to answer binary questions, such as serial versus parallel processing. In addition, not enough attention was paid to the strategies used by participants, and there was a lack of theories implemented as computer models offering sufficient precision to be tested rigorously. He proposed a three-headed research program: to develop computational models able to carry out the task they aimed to explain; to study one complex task in detail, such as chess; and to build computational models that can account for multiple tasks. This article assesses the extent to which the papers in this issue advance Newell's program. While half of the papers devote much attention to strategies, several papers still average across them, a capital sin according to Newell. The three courses of action he proposed were not popular in these papers: Only two papers used computational models, with no model being both able to carry out the task and to account for human data; there was no systematic analysis of a specific video game; and no paper proposed a computational model accounting for human data in several tasks. It is concluded that, while they use sophisticated methods of analysis and discuss interesting results, overall these papers contribute little to Newell's program of research. In this respect, they reflect the current state of psychology and cognitive science. This is a shame, as Newell's ideas might help address the current crisis of lack of replication and fraud in psychology. Copyright © 2017 The Author. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Tolaymat, Thabet; El Badawy, Amro; Sequeira, Reynold; Genaidy, Ash
2015-11-15
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and an algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from an ENM risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal of building a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. Published by Elsevier B.V.
Kingston, David C; Riddell, Maureen F; McKinnon, Colin D; Gallagher, Kaitlin M; Callaghan, Jack P
2016-02-01
We evaluated the effect of work surface angle and input hardware on upper-limb posture when using a hybrid computer workstation. Offices use sit-stand and/or tablet workstations to increase worker mobility. These workstations may have negative effects on upper-limb joints by increasing time spent in non-neutral postures, but a hybrid standing workstation may improve working postures. Fourteen participants completed office tasks in four workstation configurations: a horizontal or sloped 15° working surface with computer or tablet hardware. Three-dimensional right upper-limb postures were recorded during three tasks: reading, form filling, and writing e-mails. Amplitude probability distribution functions determined the median and range of upper-limb postures. The sloped-surface tablet workstation decreased wrist ulnar deviation by 5° when compared to the horizontal-surface computer when reading. When using computer input devices (keyboard and mouse), the shoulder, elbow, and wrist were closest to neutral joint postures when working on a horizontal work surface. The elbow was 23° and 15° more extended, whereas the wrist was 6° less ulnar deviated, when reading compared to typing forms or e-mails. We recommend that the horizontal-surface computer configuration be used for typing and the sloped-surface tablet configuration be used for intermittent reading tasks in this hybrid workstation. Offices with mobile employees could use this workstation for alternating their upper-extremity postures; however, other aspects of the device need further investigation. © 2015, Human Factors and Ergonomics Society.
Progress towards a world-wide code of conduct
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, J.A.N.; Berleur, J.
1994-12-31
In this paper the work of the International Federation for Information Processing (IFIP) Task Group on Ethics is described and the recommendations presented to the General Assembly are reviewed. While a common code of ethics or conduct has not been recommended for consideration by the member societies of IFIP, a set of guidelines for the establishment and evaluation of codes has been produced, and procedures for the assistance of code development have been established within IFIP. This paper proposes that the data collected by the Task Group and the proposed guidelines can be used as a tool for the study of codes of practice, providing a teachable, learnable educational module in courses related to the ethics of computing and computation, and looks at the next steps in bringing ethical awareness to the IT community.
Quantum robots and environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benioff, P.
1998-08-01
Quantum robots and their interactions with environments of quantum systems are described, and their study justified. A quantum robot is a mobile quantum system that includes an on-board quantum computer and needed ancillary systems. Quantum robots carry out tasks whose goals include specified changes in the state of the environment, or carrying out measurements on the environment. Each task is a sequence of alternating computation and action phases. Computation phase activities include determination of the action to be carried out in the next phase, and recording of information on neighborhood environmental system states. Action phase activities include motion of the quantum robot and changes in the neighborhood environment system states. Models of quantum robots and their interactions with environments are described using discrete space and time. A unitary step operator T that gives the single time step dynamics is associated with each task. T = T_a + T_c is a sum of action phase and computation phase step operators. Conditions that T_a and T_c should satisfy are given along with a description of the evolution as a sum over paths of completed phase input and output states. A simple example of a task, carrying out a measurement on a very simple environment, is analyzed in detail. A decision tree for the task is presented and discussed in terms of the sums over phase paths. It is seen that no definite times or durations are associated with the phase steps in the tree, and that the tree describes the successive phase steps in each path in the sum over phase paths. © 1998 The American Physical Society.
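In standard notation, the step-operator decomposition and the resulting sum over phase paths can be written as below. The path expansion shown is simply the generic expansion of powers of T = T_a + T_c, not necessarily Benioff's exact expression.

```latex
\[
  T = T_a + T_c, \qquad
  |\psi(n)\rangle = T^{\,n}|\psi(0)\rangle
    = \sum_{s \in \{a,c\}^{n}} T_{s_n}\cdots T_{s_2}T_{s_1}\,|\psi(0)\rangle ,
\]
% Each sequence s of action (a) and computation (c) phase operators is
% one "phase path" in the sum over paths described in the abstract.
```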
Opportunity costs of reward delays and the discounting of hypothetical money and cigarettes.
Johnson, Patrick S; Herrmann, Evan S; Johnson, Matthew W
2015-01-01
Humans are reported to discount delayed rewards at lower rates than nonhumans. However, nonhumans are studied in tasks that restrict reinforcement during delays, whereas humans are typically studied in tasks that do not restrict reinforcement during delays. In nonhuman tasks, the opportunity cost of restricted reinforcement during delays may increase delay discounting rates. The present within-subjects study used online crowdsourcing (Amazon Mechanical Turk, or MTurk) to assess the discounting of hypothetical delayed money (and cigarettes in smokers) under four hypothetical framing conditions differing in the availability of reinforcement during delays. At one extreme, participants were free to leave their computer without returning, and engage in any behavior during reward delays (modeling typical human tasks). At the opposite extreme, participants were required to stay at their computer and engage in little other behavior during reward delays (modeling typical nonhuman tasks). Discounting rates increased as an orderly function of opportunity cost. Results also indicated predominantly hyperbolic discounting, the "magnitude effect," steeper discounting of cigarettes than money, and positive correlations between discounting rates of these commodities. This is the first study to test the effects of opportunity costs on discounting, and suggests that procedural differences may partially account for observed species differences in discounting. © Society for the Experimental Analysis of Behavior.
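The two discounting forms the abstract invokes have standard closed forms: Mazur-style hyperbolic delay discounting, V = A/(1 + kD), and probability discounting over the odds against receipt, theta = (1 - p)/p. A small sketch of both, with purely illustrative parameter values:

```python
def hyperbolic_value(amount, k, delay):
    """Mazur-style hyperbolic delay discounting: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

def probability_discounted_value(amount, h, p):
    """Probability discounting over odds against receipt,
    theta = (1 - p) / p (Rachlin-style form)."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

# A larger k mimics steeper discounting, as reported under higher
# opportunity cost; the "magnitude effect" is a smaller k for larger A.
for k in (0.01, 0.1):  # illustrative low- vs high-cost discount rates
    print(k, [round(hyperbolic_value(100, k, d), 1)
              for d in (0, 30, 180, 365)])
print(round(probability_discounted_value(100, h=1.0, p=0.25), 1))
```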
Taylor, Allen J; Cerqueira, Manuel; Hodgson, John McB; Mark, Daniel; Min, James; O'Gara, Patrick; Rubin, Geoffrey D; Kramer, Christopher M; Berman, Daniel; Brown, Alan; Chaudhry, Farooq A; Cury, Ricardo C; Desai, Milind Y; Einstein, Andrew J; Gomes, Antoinette S; Harrington, Robert; Hoffmann, Udo; Khare, Rahul; Lesser, John; McGann, Christopher; Rosenberg, Alan; Schwartz, Robert; Shelton, Marc; Smetana, Gerald W; Smith, Sidney C
2010-11-23
The American College of Cardiology Foundation (ACCF), along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac computed tomography (CCT) is frequently considered. The present document is an update to the original CCT/cardiac magnetic resonance (CMR) appropriateness criteria published in 2006, written to reflect changes in test utilization, to incorporate new clinical data, and to clarify CCT use where omissions or lack of clarity existed in the original criteria (1). The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Ninety-three clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of CCT angiography for diagnosis and risk assessment in patients with low or intermediate risk or pretest probability for coronary artery disease (CAD) was viewed favorably, whereas testing in high-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Use of noncontrast computed tomography (CT) for calcium scoring was rated as appropriate within intermediate- and selected low-risk patients. Appropriate applications of CCT are also within the category of cardiac structural and functional evaluation. It is anticipated that these results will have an impact on physician decision making, performance, and reimbursement policy, and that they will help guide future research.
Accommodation and convergence during sustained computer work.
Collier, Juanita D; Rosenfield, Mark
2011-07-01
With computer usage becoming almost universal in contemporary society, the reported prevalence of computer vision syndrome (CVS) is extremely high. However, the precise physiological mechanisms underlying CVS remain unclear. Although abnormal accommodation and vergence responses have been cited as being responsible for the symptoms produced, there is little objective evidence to support this claim. Accordingly, this study measured both of these oculomotor parameters during a sustained period of computer use. Subjects (N = 20) were required to read text aloud from a laptop computer at a viewing distance of 50 cm for a sustained 30-minute period through their habitual refractive correction. At 2-minute intervals, the accommodative response (AR) to the computer screen was measured objectively using a Grand Seiko WAM 5500 optometer (Grand Seiko, Hiroshima, Japan). Additionally, the vergence response was assessed by measuring the associated phoria (AP), i.e., prism to eliminate fixation disparity, using a customized fixation disparity target that appeared on the computer screen. Subjects were asked to rate the degree of difficulty of the reading task on a scale from 1 to 10. Mean accommodation and AP values during the task were 1.07 diopters and 0.74∆ base-in (BI), respectively. The mean discomfort score was 4.9. No significant changes in accommodation or vergence were observed during the course of the 30-minute test period. There was no significant difference in the AR as a function of subjective difficulty. However, the mean AP for the subjects who reported the least and greatest discomfort during the task was 1.55∆ BI and 0, respectively (P = 0.02). CVS after 30 minutes was worse in subjects exhibiting zero fixation disparity than in those having a BI AP, but does not appear to be related to differences in accommodation. A slightly reduced vergence response increases subject comfort during the task. Copyright © 2011 American Optometric Association. Published by Elsevier Inc. All rights reserved.
Greenhow, Anna K; Hunt, Maree J; Macaskill, Anne C; Harper, David N
2015-09-01
Delay and uncertainty of receipt both reduce the subjective value of reinforcers. Delay has a greater impact on the subjective value of smaller reinforcers than of larger ones while the reverse is true for uncertainty. We investigated the effect of reinforcer magnitude on discounting of delayed and uncertain reinforcers using a novel approach: embedding relevant choices within a computer game. Participants made repeated choices between smaller, certain, immediate outcomes and larger, but delayed or uncertain outcomes while experiencing the result of each choice. Participants' choices were generally well described by the hyperbolic discounting function. Smaller numbers of points were discounted more steeply than larger numbers as a function of delay but not probability. The novel experiential choice task described is a promising approach to investigating both delay and probability discounting in humans. © Society for the Experimental Analysis of Behavior.
Performing a global barrier operation in a parallel computer
Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E
2014-12-09
Executing computing tasks on a parallel computer that includes compute nodes coupled for data communications, where each compute node executes tasks, with one task on each compute node designated as a master task, including: for each task on each compute node until all master tasks have joined a global barrier: determining whether the task is a master task; if the task is not a master task, joining a single local barrier; if the task is a master task, joining the global barrier and the single local barrier only after all other tasks on the compute node have joined the single local barrier.
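A minimal simulation of the claimed scheme, using Python threads as stand-ins for tasks and a condition-variable "local barrier" per simulated compute node. This is an illustration of the hierarchical-barrier idea described in the abstract, not the patented implementation:

```python
import threading

NODES, TASKS_PER_NODE = 3, 4
global_barrier = threading.Barrier(NODES)  # one master per node joins

class Node:
    """One simulated compute node: a counter-based local barrier plus
    an event the master uses to release the waiting non-master tasks."""
    def __init__(self):
        self.all_joined = threading.Condition()
        self.joined = 0
        self.released = threading.Event()

def task(node, is_master, name):
    if not is_master:
        with node.all_joined:
            node.joined += 1              # join the single local barrier
            node.all_joined.notify()
        node.released.wait()              # blocked until master releases
    else:
        with node.all_joined:             # master joins only after others
            while node.joined < TASKS_PER_NODE - 1:
                node.all_joined.wait()
        global_barrier.wait()             # join the global barrier
        node.released.set()               # complete the local barrier
    print(name, "passed the barrier")

threads = []
for n in range(NODES):
    node = Node()
    for t in range(TASKS_PER_NODE):
        th = threading.Thread(target=task,
                              args=(node, t == 0, f"node{n}/task{t}"))
        threads.append(th)
        th.start()
for th in threads:
    th.join()
```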
Effects of portable computing devices on posture, muscle activation levels and efficiency.
Werth, Abigail; Babski-Reeves, Kari
2014-11-01
Very little research exists on ergonomic exposures when using portable computing devices. This study quantified muscle activity (forearm and neck), posture (wrist, forearm and neck), and performance (gross typing speed and error rates) differences across three portable computing devices (laptop, netbook, and slate computer) and two work settings (desk and sofa) during data entry tasks. Twelve participants completed test sessions on a single computer using a test-rest-test protocol (30min of work at one work setting, 15min of rest, 30min of work at the other work setting). The slate computer resulted in significantly more non-neutral wrist, elbow and neck postures, particularly when working on the sofa. Performance on the slate computer was four times lower than that of the other computers, though lower muscle activity levels were also found. Potential for injury or illness may be elevated when working on smaller, portable computers in non-traditional work settings. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.
Yamauchi, Takashi; Xiao, Kunchen
2018-04-01
Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Participants were induced for positive or negative emotions by music, film clips, or emotional pictures, and they indicated their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance of positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under curve and direction change help infer emotions of computer users. Copyright © 2017 Cognitive Science Society, Inc.
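Two of the named trajectory features are easy to make concrete. The sketch below uses common formulations, which may differ in detail from the authors' exact feature definitions: area under curve as the summed perpendicular deviation from the straight start-to-end path, and direction changes as sign flips in horizontal movement.

```python
import numpy as np

def trajectory_features(xy):
    """Compute two illustrative cursor-trajectory features from an
    (N, 2) array of sampled cursor positions."""
    xy = np.asarray(xy, dtype=float)
    start, end = xy[0], xy[-1]
    line = end - start
    norm = np.linalg.norm(line)
    # Area under curve: summed perpendicular distance of each sample
    # from the straight start-to-end line (unit time steps assumed).
    rel = xy - start
    dev = np.abs(line[0] * rel[:, 1] - line[1] * rel[:, 0]) / norm
    auc = float(dev.sum())
    # Direction changes: sign flips of the step-wise x displacement.
    dx = np.diff(xy[:, 0])
    nz = dx[dx != 0]
    flips = int(np.count_nonzero(np.diff(np.sign(nz))))
    return {"area_under_curve": auc, "x_direction_changes": flips}

path = [(0, 0), (2, 1), (3, 4), (2, 7), (4, 9), (5, 10)]
print(trajectory_features(path))
```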
A computer vision for animal ecology.
Weinstein, Ben G
2018-05-01
A central goal of animal ecology is to observe species in the natural world. The cost and challenge of data collection often limit the breadth and scope of ecological study. Ecologists often use image capture to bolster data collection in time and space. However, the ability to process these images remains a bottleneck. Computer vision can greatly increase the efficiency, repeatability and accuracy of image review. Computer vision uses image features, such as colour, shape and texture to infer image content. I provide a brief primer on ecological computer vision to outline its goals, tools and applications to animal ecology. I reviewed 187 existing applications of computer vision and divided articles into ecological description, counting and identity tasks. I discuss recommendations for enhancing the collaboration between ecologists and computer scientists and highlight areas for future growth of automated image analysis. © 2017 The Author. Journal of Animal Ecology © 2017 British Ecological Society.
Charalambous, Charalambos C; Alcantara, Carolina C; French, Margaret A; Li, Xin; Matt, Kathleen S; Kim, Hyosub E; Morton, Susanne M; Reisman, Darcy S
2018-05-15
Previous work demonstrated an effect of a single high-intensity exercise bout coupled with motor practice on the retention of a newly acquired skilled arm movement, in both neurologically intact and impaired adults. In the present study, using behavioural and computational analyses, we demonstrated that a single exercise bout, regardless of its intensity and timing, did not increase the retention of a novel locomotor task after stroke. Considering both present and previous work, we postulate that the benefit of exercise may depend on the type of motor learning (e.g. skill learning, sensorimotor adaptation) and/or task (e.g. arm accuracy-tracking task, walking). Acute high-intensity exercise coupled with motor practice improves the retention of motor learning in neurologically intact adults. However, whether exercise could improve the retention of locomotor learning after stroke is still unknown. Here, we investigated the effect of exercise intensity and timing on the retention of a novel locomotor learning task (i.e. split-belt treadmill walking) after stroke. Thirty-seven people post stroke participated in two sessions, 24 h apart, and were allocated to active control (CON), treadmill walking (TMW), or total body exercise on a cycle ergometer (TBE). In session 1, all groups exercised for a short bout (∼5 min) at low (CON) or high (TMW and TBE) intensity and before (CON and TMW) or after (TBE) the locomotor learning task. In both sessions, the locomotor learning task was to walk on a split-belt treadmill in a 2:1 speed ratio (100% and 50% fast-comfortable walking speed) for 15 min. To test the effect of exercise on 24 h retention, we applied behavioural and computational analyses. Behavioural data showed that neither high-intensity group showed greater 24 h retention compared to CON, and computational data showed that 24 h retention was attributable to a slow learning process for sensorimotor adaptation. Our findings demonstrated that acute exercise coupled with a locomotor adaptation task, regardless of its intensity and timing, does not improve retention of the novel locomotor task after stroke. We postulate that exercise effects on motor learning may be context specific (e.g. type of motor learning and/or task) and interact with the presence of a genetic variant (BDNF Val66Met). © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.
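The "slow learning process" attribution is typically formalized with a two-state adaptation model, in which fast and slow processes share the same error signal but differ in learning and retention rates. The sketch below uses that standard formulation (in the style of Smith et al.'s two-state model) with illustrative parameters; it is an assumption about the kind of computational analysis meant here, not a reproduction of the authors' model.

```python
import numpy as np

def two_state_adaptation(perturbation, A_f=0.92, B_f=0.10,
                         A_s=0.996, B_s=0.02):
    """Two-state model: x_{t+1} = A*x_t + B*e_t for a fast and a slow
    process; net adaptation is their sum (parameters illustrative)."""
    x_f = x_s = 0.0
    net = []
    for p in perturbation:
        e = p - (x_f + x_s)          # error experienced on this stride
        x_f = A_f * x_f + B_f * e    # fast: learns quickly, forgets fast
        x_s = A_s * x_s + B_s * e    # slow: learns slowly, retains
        net.append(x_f + x_s)
    return np.array(net), x_s

# A constant perturbation stands in for the 2:1 belt-speed ratio,
# scaled to 1; the slow state at the end indexes what is retained.
adapt, slow_final = two_state_adaptation(np.ones(600))
print(f"final adaptation {adapt[-1]:.2f}, slow-process share {slow_final:.2f}")
```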
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leverage these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, that have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open source, computing-resource management, and job management software, HTCondor, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
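As a flavor of the workflow, here is a sketch of queuing an ensemble of model runs through CondorPy. The attribute and method names follow the project's documented examples as best they can be reconstructed here and should be treated as assumptions; the executable and file names are hypothetical.

```python
# Sketch: dispatch a batch of hydrologic model runs through HTCondor
# via CondorPy. Verify names against the condorpy documentation.
from condorpy import Job, Templates

job = Job('hydrologic_ensemble', Templates.vanilla_transfer_files)
job.executable = 'run_model.py'          # hypothetical model driver
job.arguments = '--member $(Process)'    # one ensemble member per process
job.transfer_input_files = 'basin.geojson, forcing.nc'

job.submit(queue=100)                    # queue 100 runs on the cluster
job.wait()                               # block until the ensemble finishes
```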
Simmering, Vanessa R
2016-09-01
Working memory is a vital cognitive skill that underlies a broad range of behaviors. Higher cognitive functions are reliably predicted by working memory measures from two domains: children's performance on complex span tasks, and infants' performance in looking paradigms. Despite the similar predictive power across these research areas, theories of working memory development have not connected these different task types and developmental periods. The current project takes a first step toward bridging this gap by presenting a process-oriented theory, focusing on two tasks designed to assess visual working memory capacity in infants (the change-preference task) versus children and adults (the change detection task). Previous studies have shown inconsistent results, with capacity estimates increasing from one to four items during infancy, but only two to three items during early childhood. A probable source of this discrepancy is the different task structures used with each age group, but prior theories were not sufficiently specific to explain how performance relates across tasks. The current theory focuses on cognitive dynamics, that is, how memory representations are formed, maintained, and used within specific task contexts over development. This theory was formalized in a computational model to generate three predictions: 1) capacity estimates in the change-preference task should continue to increase beyond infancy; 2) capacity estimates should be higher in the change-preference versus change detection task when tested within individuals; and 3) performance should correlate across tasks because both rely on the same underlying memory system. I also tested a fourth prediction, that development across tasks could be explained through increasing real-time stability, realized computationally as strengthening connectivity within the model. Results confirmed these predictions, supporting the cognitive dynamics account of performance and developmental changes in real-time stability. The monograph concludes with implications for understanding memory, behavior, and development in a broader range of cognitive development. © 2016 The Society for Research in Child Development, Inc.
Feature Statistics Modulate the Activation of Meaning During Spoken Word Processing.
Devereux, Barry J; Taylor, Kirsten I; Randall, Billi; Geertzen, Jeroen; Tyler, Lorraine K
2016-03-01
Understanding spoken words involves a rapid mapping from speech to conceptual representations. One distributed feature-based conceptual account assumes that the statistical characteristics of concepts' features, namely the number of concepts they occur in (distinctiveness/sharedness) and their likelihood of co-occurrence (correlational strength), determine conceptual activation. To test these claims, we investigated the role of distinctiveness/sharedness and correlational strength in speech-to-meaning mapping, using a lexical decision task and computational simulations. Responses were faster for concepts with higher sharedness, suggesting that shared features are facilitatory in tasks like lexical decision that require access to them. Correlational strength facilitated responses for slower participants, suggesting a time-sensitive co-occurrence-driven settling mechanism. The computational simulation showed similar effects, with early effects of shared features and later effects of correlational strength. These results support a general-to-specific account of conceptual processing, whereby early activation of shared features is followed by the gradual emergence of a specific target representation. Copyright © 2015 The Authors. Cognitive Science published by Cognitive Science Society, Inc.
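Both statistics can be computed directly from a concept-by-feature matrix. A toy sketch with illustrative definitions (reciprocal concept count for distinctiveness, mean inter-feature correlation for correlational strength), which may differ in detail from the feature norms used in the study:

```python
import numpy as np

# Toy concept-by-feature matrix (1 = concept has the feature).
concepts = ["dog", "cat", "whale", "fish"]
features = ["has_fur", "barks", "lives_in_water", "has_gills"]
M = np.array([[1, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 1, 1]])

# Distinctiveness: reciprocal of the number of concepts a feature
# occurs in (shared features have low distinctiveness).
distinctiveness = 1.0 / M.sum(axis=0)

# Correlational strength of a feature: its mean correlation with the
# other features' occurrence patterns across concepts.
corr = np.corrcoef(M.T)
np.fill_diagonal(corr, np.nan)
corr_strength = np.nanmean(corr, axis=1)

for f, d, c in zip(features, distinctiveness, corr_strength):
    print(f"{f:15s} distinctiveness={d:.2f} corr_strength={c:+.2f}")
```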
Characterization of posture and comfort in laptop users in non-desk settings.
Gold, J E; Driban, J B; Yingling, V R; Komaroff, E
2012-03-01
Laptop computers may be used in a variety of postures not coupled to the office workstation. Using passive motion analysis, this study examined mean joint angles during a short typing/editing task in college students (n=20), in up to seven positions. Comfort was assessed after task execution through a body map. For three required postures, joint angles in a prone posture were different than those while seated at a couch with feet either on floor or on ottoman. Specifically, the prone posture was characterized by comparatively non-neutral shoulders, elbows and wrists, and pronounced neck extension. Significantly greater intensity and more regions of discomfort were marked for the prone posture than for the seated postures. It is recommended that the prone posture only be assumed briefly during laptop use. Exposure to laptops outside of the office setting should be assessed in future epidemiologic studies of musculoskeletal complaints and computer use. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Elekes, Fruzsina; Varga, Máté; Király, Ildikó
2017-11-01
It has been widely assumed that computing how a scene looks from another perspective (level-2 perspective taking, PT) is an effortful process, as opposed to the automatic capacity of tracking visual access to objects (level-1 PT). Recently, adults have been found to compute both forms of visual perspectives in a quick but context-sensitive way, indicating that the two functions share more features than previously assumed. However, the developmental literature still shows the dissociation between automatic level-1 and effortful level-2 PT. In the current paper, we report an experiment showing that in a minimally social situation, participating in a number verification task with an adult confederate, eight- to 9.5-year-old children demonstrate similar online level-2 PT capacities as adults. Future studies need to address whether online PT shows selectivity in children as well and develop paradigms that are adequate to test preschoolers' online level-2 PT abilities. Statement of Contribution. What is already known on this subject? Adults can access how objects appear to others (level-2 perspective) spontaneously and online; online level-1, but not level-2, perspective taking (PT) has been documented in school-aged children. What the present study adds? Eight- to 9.5-year-olds performed a number verification task with a confederate who had the same task; children showed similar perspective interference as adults, indicating spontaneous level-2 PT; not only agent-object relations but also object appearances are computed online by eight- to 9.5-year-olds. © 2017 The British Psychological Society.
Herdağdelen, Amaç; Marelli, Marco
2017-05-01
Corpus-based word frequencies are one of the most important predictors in language processing tasks. Frequencies based on conversational corpora (such as movie subtitles) are shown to better capture the variance in lexical decision tasks compared to traditional corpora. In this study, we show that frequencies computed from social media are currently the best frequency-based estimators of lexical decision reaction times (up to 3.6% increase in explained variance). The results are robust (observed for Twitter- and Facebook-based frequencies on American English and British English datasets) and are still substantial when we control for corpus size. © 2016 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
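The evaluation logic here is a regression of lexical decision reaction times on log frequency, comparing explained variance across corpora. A sketch on synthetic data; all numbers are hypothetical, chosen only so that the social media counts track the latent word familiarity slightly better:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 1000

# Hypothetical per-word log10 frequencies from two corpora; the social
# media counts are given slightly less measurement noise here.
true_familiarity = rng.normal(3.0, 1.0, n)
subtitle_logf = true_familiarity + rng.normal(0, 0.55, n)
social_logf   = true_familiarity + rng.normal(0, 0.45, n)
rt = 900 - 80 * true_familiarity + rng.normal(0, 60, n)   # RT in ms

for name, f in [("subtitles", subtitle_logf),
                ("social media", social_logf)]:
    r2 = LinearRegression().fit(f[:, None], rt).score(f[:, None], rt)
    print(f"{name:12s} R^2 = {r2:.3f}")
```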
Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.
Costello, Fintan; Watts, Paul
2018-01-01
We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
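In the probability-theory-plus-noise formulation, the regressive effect follows directly from the noise model: if each retrieved instance is misread with some probability d, the expected estimate is pulled toward 0.5. The expression below reconstructs that published result from the model's assumptions, so treat the notation as a close paraphrase rather than a quotation:

```latex
% Expected value of a noisy frequency-based probability estimate when
% each retrieved instance is read incorrectly with probability d:
\[
  \mathbb{E}\!\left[\hat{P}(A)\right] = (1-2d)\,P(A) + d .
\]
% This is regressive: with d = 0.1, a true probability of 0.9 yields an
% expected estimate of 0.82, while 0.5 remains at 0.5.
```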
Bonow, Robert O; Brown, Alan S; Gillam, Linda D; Kapadia, Samir R; Kavinsky, Clifford J; Lindman, Brian R; Mack, Michael J; Thourani, Vinod H; Dehmer, Gregory J; Bonow, Robert O; Lindman, Brian R; Beaver, Thomas M; Bradley, Steven M; Carabello, Blase A; Desai, Milind Y; George, Isaac; Green, Philip; Holmes, David R; Johnston, Douglas; Leipsic, Jonathon; Mick, Stephanie L; Passeri, Jonathan J; Piana, Robert N; Reichek, Nathaniel; Ruiz, Carlos E; Taub, Cynthia C; Thomas, James D; Turi, Zoltan G; Doherty, John U; Dehmer, Gregory J; Bailey, Steven R; Bhave, Nicole M; Brown, Alan S; Daugherty, Stacie L; Dean, Larry S; Desai, Milind Y; Duvernoy, Claire S; Gillam, Linda D; Hendel, Robert C; Kramer, Christopher M; Lindsay, Bruce D; Manning, Warren J; Mehrotra, Praveen; Patel, Manesh R; Sachdeva, Ritu; Wann, L Samuel; Winchester, David E; Allen, Joseph M
2018-02-01
The American College of Cardiology collaborated with the American Association for Thoracic Surgery, American Heart Association, American Society of Echocardiography, European Association for Cardio-Thoracic Surgery, Heart Valve Society, Society of Cardiovascular Anesthesiologists, Society for Cardiovascular Angiography and Interventions, Society of Cardiovascular Computed Tomography, Society for Cardiovascular Magnetic Resonance, and Society of Thoracic Surgeons to develop and evaluate Appropriate Use Criteria (AUC) for the treatment of patients with severe aortic stenosis (AS). This is the first AUC to address the topic of AS and its treatment options, including surgical aortic valve replacement (SAVR) and transcatheter aortic valve replacement (TAVR). A number of common patient scenarios experienced in daily practice were developed along with assumptions and definitions for those scenarios, which were all created using guidelines, clinical trial data, and expert opinion in the field of AS. The 2014 AHA/ACC guideline for the management of patients with valvular heart disease: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines(1) and its 2017 focused update paper (2) were used as the primary guiding references in developing these indications. The writing group identified 95 clinical scenarios based on patient symptoms and clinical presentation, and up to 6 potential treatment options for those patients. A separate, independent rating panel was asked to score each indication from 1 to 9, with 1-3 categorized as "Rarely Appropriate," 4-6 as "May Be Appropriate," and 7-9 as "Appropriate." After considering factors such as symptom status, left ventricular (LV) function, surgical risk, and the presence of concomitant coronary or other valve disease, the rating panel determined that either SAVR or TAVR is Appropriate in most patients with symptomatic AS at intermediate or high surgical risk; however, situations commonly arise in clinical practice in which the indications for SAVR or TAVR are less clear, including situations in which 1 form of valve replacement would appear reasonable when the other is less so, as do other circumstances in which neither intervention is the suitable treatment option. The purpose of this AUC is to provide guidance to clinicians in the care of patients with severe AS by identifying the reasonable treatment and intervention options available based on the myriad clinical scenarios with which patients present. This AUC document also serves as an educational and quality improvement tool to identify patterns of care and reduce the number of rarely appropriate interventions in clinical practice. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Phillips, Lawrence; Pearl, Lisa
2015-11-01
The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.
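For concreteness, here is a toy stand-in for the kind of Bayesian segmentation strategy being evaluated: a Viterbi search under a unigram word model with a smoothed lexicon. The real models use Dirichlet-process lexicons and richer inference; everything below (the phoneme-like symbols, counts, smoothing, and length cap) is illustrative.

```python
import math

def segment(utterance, lexicon, alpha=0.5, unseen=1e-6):
    """Viterbi segmentation under a unigram word model: pick the word
    boundaries maximizing the summed log probability of the words."""
    n = len(utterance)
    total = sum(lexicon.values()) + alpha

    def word_logp(w):
        return math.log((lexicon.get(w, 0) + alpha * unseen) / total)

    best = [(0.0, 0)] + [(-math.inf, 0)] * n   # (score, backpointer)
    for j in range(1, n + 1):
        for i in range(max(0, j - 8), j):      # cap word length at 8
            score = best[i][0] + word_logp(utterance[i:j])
            if score > best[j][0]:
                best[j] = (score, i)

    words, j = [], n                           # recover the best path
    while j > 0:
        i = best[j][1]
        words.append(utterance[i:j])
        j = i
    return words[::-1]

# Toy child-directed utterance, "you want to see the book", in a
# Brent-style phonemic encoding with invented counts.
lexicon = {"yu": 50, "want": 30, "tu": 40, "si": 20, "D6": 60, "bUk": 15}
print(segment("yuwanttusiD6bUk", lexicon))
```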
Exploring Initiative as a Signal of Knowledge Co-Construction During Collaborative Problem Solving.
Howard, Cynthia; Di Eugenio, Barbara; Jordan, Pamela; Katz, Sandra
2017-08-01
Peer interaction has been found to be conducive to learning in many settings. Knowledge co-construction (KCC) has been proposed as one explanatory mechanism. However, KCC is a theoretical construct that is too abstract to guide the development of instructional software that can support peer interaction. In this study, we present an extensive analysis of a corpus of peer dialogs that we collected in the domain of introductory Computer Science. We show that the notion of task initiative shifts correlates with both KCC and learning. Speakers take task initiative when they contribute new content that advances problem solving and that is not invited by their partner; if initiative shifts between the partners, it indicates they both contribute to problem solving. We found that task initiative shifts occur more frequently within KCC episodes than outside. In addition, task initiative shifts within KCC episodes correlate with learning for low pre-testers, and total task initiative shifts correlate with learning for high pre-testers. As recognizing task initiative shifts does not require as much deep knowledge as recognizing KCC, task initiative shifts as an indicator of productive collaboration are potentially easier to model in instructional software that simulates a peer. Copyright © 2016 Cognitive Science Society, Inc.
Pheochromocytoma and paraganglioma: an endocrine society clinical practice guideline.
Lenders, Jacques W M; Duh, Quan-Yang; Eisenhofer, Graeme; Gimenez-Roqueplo, Anne-Paule; Grebe, Stefan K G; Murad, Mohammad Hassan; Naruse, Mitsuhide; Pacak, Karel; Young, William F
2014-06-01
The aim was to formulate clinical practice guidelines for pheochromocytoma and paraganglioma (PPGL). The Task Force included a chair selected by the Endocrine Society Clinical Guidelines Subcommittee (CGS), seven experts in the field, and a methodologist. The authors received no corporate funding or remuneration. This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) system to describe both the strength of recommendations and the quality of evidence. The Task Force reviewed primary evidence and commissioned two additional systematic reviews. One group meeting, several conference calls, and e-mail communications enabled consensus. Committees and members of the Endocrine Society, European Society of Endocrinology, and American Association for Clinical Chemistry reviewed drafts of the guidelines. The Task Force recommends that initial biochemical testing for PPGLs should include measurements of plasma free or urinary fractionated metanephrines. Consideration should be given to preanalytical factors leading to false-positive or false-negative results. All positive results require follow-up. Computed tomography is suggested for initial imaging, but magnetic resonance is a better option in patients with metastatic disease or when radiation exposure must be limited. (123)I-metaiodobenzylguanidine scintigraphy is a useful imaging modality for metastatic PPGLs. We recommend consideration of genetic testing in all patients, with testing by accredited laboratories. Patients with paraganglioma should be tested for SDHx mutations, and those with metastatic disease for SDHB mutations. All patients with functional PPGLs should undergo preoperative blockade to prevent perioperative complications. Preparation should include a high-sodium diet and fluid intake to prevent postoperative hypotension. We recommend minimally invasive adrenalectomy for most pheochromocytomas with open resection for most paragangliomas. Partial adrenalectomy is an option for selected patients. Lifelong follow-up is suggested to detect recurrent or metastatic disease. We suggest personalized management with evaluation and treatment by multidisciplinary teams with appropriate expertise to ensure favorable outcomes.
Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T
2012-03-01
This study evaluated the use of simple inclines as a portable peripheral for improving head and neck postures during notebook computer use on tables in portable environments such as hotel rooms, cafés, and airport lounges. A 3D motion analysis system measured head, neck and right upper extremity postures of 15 participants as they completed a 10 min computer task in six different configurations, all on a fixed height desk: no-incline, 12° incline, 25° incline, no-incline with external mouse, 25° incline with an external mouse, and a commercially available riser with external mouse and keyboard. After completion of the task, subjects rated the configuration for comfort and ease of use and indicated perceived discomfort in several body segments. Compared to the no-incline configuration, use of the 12° incline reduced forward head tilt and neck flexion while increasing wrist extension. The 25° incline further reduced head tilt and neck flexion while further increasing wrist extension. The 25° incline received the lowest comfort and ease of use ratings and the highest perceived discomfort score. For portable, temporary computing environments where internal input devices are used, users may find improved head and neck postures with acceptable wrist extension postures with the utilization of a 12° incline. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Hendel, Robert C; Berman, Daniel S; Di Carli, Marcelo F; Heidenreich, Paul A; Henkin, Robert E; Pellikka, Patricia A; Pohost, Gerald M; Williams, Kim A
2009-06-09
The American College of Cardiology Foundation (ACCF), along with key specialty and subspecialty societies, conducted an appropriate use review of common clinical scenarios where cardiac radionuclide imaging (RNI) is frequently considered. This document is a revision of the original Single-Photon Emission Computed Tomography Myocardial Perfusion Imaging (SPECT MPI) Appropriateness Criteria, published 4 years earlier, written to reflect changes in test utilization and new clinical data, and to clarify RNI use where omissions or lack of clarity existed in the original criteria. This is in keeping with the commitment to revise and refine appropriate use criteria (AUC) on a frequent basis. The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Sixty-seven clinical scenarios were developed by a writing group and scored by a separate technical panel on a scale of 1 to 9 to designate appropriate use, inappropriate use, or uncertain use. In general, use of cardiac RNI for diagnosis and risk assessment in intermediate- and high-risk patients with coronary artery disease (CAD) was viewed favorably, while testing in low-risk patients, routine repeat testing, and general screening in certain clinical scenarios were viewed less favorably. Additionally, use for perioperative testing was found to be inappropriate except for highly selected groups of patients. It is anticipated that these results will have a significant impact on physician decision making, test performance, and reimbursement policy, and will help guide future research.
Sex differences on a computerized mental rotation task disappear with computer familiarization.
Roberts, J E; Bell, M A
2000-12-01
The area of cognitive research that has produced the most consistent sex differences is spatial ability. In particular, men consistently perform better on mental rotation tasks than do women. This study examined the effects of familiarization with a computer on performance of a computerized two-dimensional mental rotation task. Two groups of college students (N=44) performed the rotation task, with one group performing a color-matching task that allowed them to be familiarized with the computer prior to the rotation task. Among the participants who only performed the rotation task, the 11 men performed better than the 11 women. Among the participants who performed the computer familiarization task before the rotation task, however, there were no sex differences on the mental rotation task between the 10 men and 12 women. These data indicate that sex differences on this two-dimensional task may reflect familiarization with the computer, not the mental rotation component of the task. Further research with larger samples and an increased range of task difficulty is encouraged.
User-centric incentive design for participatory mobile phone sensing
NASA Astrophysics Data System (ADS)
Gao, Wei; Lu, Haoyang
2014-05-01
Mobile phone sensing is a critical underpinning of pervasive mobile computing, and is one of the key factors for improving people's quality of life in modern society via collective utilization of the on-board sensing capabilities of people's smartphones. The increasing demands for sensing services and ambient awareness in mobile environments highlight the necessity of active participation of individual mobile users in sensing tasks. User incentives for such participation have been continuously offered from an application-centric perspective, i.e., as payments from the sensing server, to compensate users' sensing costs. These payments, however, are manipulated to maximize the benefits of the sensing server, ignoring the runtime flexibility and benefits of participating users. This paper presents a novel framework of user-centric incentive design, and develops a universal sensing platform which translates heterogeneous sensing tasks to a generic sensing plan specifying the task-independent requirements of sensing performance. We use this sensing plan as input to reduce three categories of sensing costs, which together cover the possible sources hindering users' participation in sensing.
Work and freedom? Working self-objectification and belief in personal free will.
Baldissarri, Cristina; Andrighetto, Luca; Gabbiadini, Alessandro; Volpato, Chiara
2017-06-01
The current work aimed to extend the burgeoning literature on working objectification by investigating the effects of particular job activities on self-perception. By integrating relevant theoretical reflections with recent empirical evidence, we expected that performing objectifying (i.e., repetitive, fragmented, and other-directed) tasks would affect participants' self-objectification and, in turn, their belief in personal free will. In three studies, we consistently found that performing a manual (Study 1 and Study 2) or a computer (Study 3) objectifying task (vs. a non-objectifying task and vs. the baseline condition) led participants to objectify themselves in terms of both decreased self-attribution of human mental states (Study 1 and Study 3) and increased self-perception of being instrument-like (Study 2 and Study 3). Crucially, this increased self-objectification mediated the relationship between performing an objectifying activity and the participants' decreased belief in personal free will. The theoretical and practical implications of these findings are considered. © 2016 The British Psychological Society.
Comparing two types of engineering visualizations: task-related manipulations matter.
Cölln, Martin C; Kusch, Kerstin; Helmert, Jens R; Kohler, Petra; Velichkovsky, Boris M; Pannasch, Sebastian
2012-01-01
This study focuses on the comparison of traditional engineering drawings with a CAD (computer aided design) visualization in terms of user performance and eye movements in an applied context. Twenty-five students of mechanical engineering completed search tasks for measures in two distinct depictions of a car engine component (engineering drawing vs. CAD model). Besides spatial dimensionality, the display types most notably differed in terms of information layout, access and interaction options. The CAD visualization yielded better performance if users directly manipulated the object, but was inferior if employed in a conventional static manner, i.e. inspecting only predefined views. An additional eye movement analysis revealed longer fixation durations and a stronger increase of task-relevant fixations over time when interacting with the CAD visualization. This suggests a more focused extraction and filtering of information. We conclude that the three-dimensional CAD visualization can be advantageous if its direct-manipulation capability is used. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Logic as Marr's Computational Level: Four Case Studies.
Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter
2015-04-01
We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.
Cloud computing task scheduling strategy based on improved differential evolution algorithm
NASA Astrophysics Data System (ADS)
Ge, Junwei; He, Qian; Fang, Yiqiu
2017-04-01
In order to optimize cloud computing task scheduling, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is established and a fitness function is derived from it; the improved differential evolution algorithm then optimizes this fitness function, applying a generation-dependent dynamic selection strategy together with a dynamic mutation strategy to preserve both global and local search ability. Performance tests were carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving effective optimal scheduling of cloud computing tasks.
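A minimal sketch of the kind of scheduler the abstract describes, assuming tasks are mapped to virtual machines and fitness is makespan. The decaying mutation factor stands in for the paper's dynamic mutation strategy, and every name and parameter below is illustrative rather than taken from the paper:

```python
import random

def makespan(assignment, task_len, vm_speed):
    """Fitness: completion time of the most heavily loaded VM."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load)

def de_schedule(task_len, vm_speed, pop=30, gens=200, CR=0.9):
    """Discrete differential evolution over task-to-VM mappings."""
    n, m = len(task_len), len(vm_speed)
    P = [[random.randrange(m) for _ in range(n)] for _ in range(pop)]
    for g in range(gens):
        F_g = 0.9 - 0.5 * g / gens  # mutation factor shrinks over generations
        for i in range(pop):
            a, b, c = random.sample([p for p in range(pop) if p != i], 3)
            trial = [P[i][j] if random.random() > CR
                     else int(P[a][j] + F_g * (P[b][j] - P[c][j])) % m
                     for j in range(n)]
            # greedy selection: keep the trial only if it improves fitness
            if makespan(trial, task_len, vm_speed) < makespan(P[i], task_len, vm_speed):
                P[i] = trial
    return min(P, key=lambda s: makespan(s, task_len, vm_speed))
```

Because the decision variables are discrete VM indices, the usual real-valued DE mutation has to be folded back into a valid index, which the modulo step above does crudely; the paper's CloudSim experiments would also fold user cost into the fitness.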
Influence of computer work under time pressure on cardiac activity.
Shi, Ping; Hu, Sijung; Yu, Hongliu
2015-03-01
Computer users are often under stress when required to complete computer work within a required time. Work stress has repeatedly been associated with an increased risk for cardiovascular disease. The present study examined the effects of time pressure workload during computer tasks on cardiac activity in 20 healthy subjects. Heart rate, time domain and frequency domain indices of heart rate variability (HRV) and Poincaré plot parameters were compared among five computer tasks and two rest periods. Faster heart rate and decreased standard deviation of R-R interval were noted in response to computer tasks under time pressure. The Poincaré plot parameters showed significant differences between different levels of time pressure workload during computer tasks, and between computer tasks and the rest periods. In contrast, no significant differences were identified for the frequency domain indices of HRV. The results suggest that the quantitative Poincaré plot analysis used in this study was able to reveal the intrinsic nonlinear nature of the autonomically regulated cardiac rhythm. Specifically, heightened vagal tone occurred during the relaxation computer tasks without time pressure. In contrast, the stressful computer tasks with added time pressure stimulated cardiac sympathetic activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
An integrated science-based methodology to assess potential ...
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished: knowledge synthesis and algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from an ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long-term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. The following specific aims are formulated to achieve the study objective: (1) to propose a system of systems (SoS) architecture that builds a network management among the different entities in the large SEE system to track the flow of ENMs emission, fate and transport from the source to the receptor; (2) to establish a staged approach for knowledge synthesis methodology…
ERIC Educational Resources Information Center
Austin, Bobby William, Ed.
This report of the National Task Force on African-American Men and Boys is the beginning of an approach to repair society's breaches and restore the streets to safety. The Task Force, headed by Andrew J. Young and established in 1994, conceived its mission as one of reclamation. The Task Force made 61 specific recommendations, and three general…
Towards Modeling False Memory With Computational Knowledge Bases.
Li, Justin; Kohanyi, Emma
2017-01-01
One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
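For intuition, here is a toy spreading-activation pass over a hand-coded semantic network; the hard-coded edges stand in for the WordNet/DBpedia links the paper actually uses, and the decay and step count are illustrative assumptions:

```python
from collections import defaultdict

# Toy network: in the paper these edges come from WordNet or DBpedia.
edges = {
    "bed": ["sleep", "rest", "pillow"],
    "rest": ["sleep", "tired"],
    "dream": ["sleep", "night"],
}

def spread(studied, decay=0.5, steps=2):
    """Fan activation out from the studied list; unstudied nodes that
    accumulate high activation are candidate false memories (DRM lures)."""
    act = defaultdict(float)
    frontier = {w: 1.0 for w in studied}
    for _ in range(steps):
        nxt = defaultdict(float)
        for node, a in frontier.items():
            for nb in edges.get(node, []):
                nxt[nb] += a * decay / len(edges[node])
        for node, a in nxt.items():
            act[node] += a
        frontier = nxt
    return sorted(act.items(), key=lambda kv: -kv[1])

print(spread(["bed", "rest", "dream"]))  # "sleep", the unstudied lure, tops the list
```

The noise problem the paper reports shows up as soon as real knowledge-base edges replace this toy dictionary: high-degree nodes fan activation out to thousands of irrelevant neighbors.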
A multimodal dataset for authoring and editing multimedia content: The MAMEM project.
Nikolopoulos, Spiros; Petrantonakis, Panagiotis C; Georgiadis, Kostas; Kalaganis, Fotis; Liaros, Georgios; Lazarou, Ioulietta; Adam, Katerina; Papazoglou-Chalikias, Anastasios; Chatzilari, Elisavet; Oikonomou, Vangelis P; Kumar, Chandan; Menges, Raphael; Staab, Steffen; Müller, Daniel; Sengupta, Korok; Bostantjopoulou, Sevasti; Katsarou, Zoe; Zeilig, Gabi; Plotnik, Meir; Gotlieb, Amihai; Kizoni, Racheli; Fountoukidou, Sofia; Ham, Jaap; Athanasiou, Dimitrios; Mariakaki, Agnes; Comanducci, Dario; Sabatini, Edoardo; Nistico, Walter; Plank, Markus; Kompatsiaris, Ioannis
2017-12-01
We present a dataset that combines multimodal biosignals and eye tracking information gathered under a human-computer interaction framework. The dataset was developed in the vein of the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, and during imaginary movement tasks. The presented dataset will contribute towards the development and evaluation of modern human-computer interaction systems that would foster the integration of people with severe motor impairments back into society.
Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.
Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie
2016-12-01
An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated, real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise ratio was first increased by wavelet denoising. Action potentials were detected, and features were extracted (spike count, mean absolute value, entropy, and combinations of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was made based on a combination of features rather than on a single feature. However, window length had no statistically significant effect (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment compared to environments commonly applied within invasive BCI research, and they showed the feasibility of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.
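A sketch of the windowed feature-extraction step, assuming a denoised multiunit trace sampled at fs Hz; the robust spike threshold and the histogram-based entropy estimate are illustrative choices, not the paper's exact recipe:

```python
import numpy as np

def window_features(x, fs, win_ms=300, k=3.5):
    """Slide non-overlapping windows over a denoised trace and extract
    spike count, mean absolute value, and entropy per window."""
    n = int(fs * win_ms / 1000)
    feats = []
    for start in range(0, len(x) - n + 1, n):
        w = x[start:start + n]
        sigma = np.median(np.abs(w)) / 0.6745        # robust noise estimate
        spikes = int(np.sum(np.abs(w) > k * sigma))  # supra-threshold samples (crude spike-count proxy)
        mav = float(np.mean(np.abs(w)))
        hist, _ = np.histogram(w, bins=32)
        p = hist[hist > 0] / hist.sum()
        entropy = float(-np.sum(p * np.log2(p)))
        feats.append((spikes, mav, entropy))
    return np.array(feats)  # one (spike count, MAV, entropy) row per window
```

A detector in the spirit of the study would then classify each feature row as "Hit" versus background, combining all three columns rather than thresholding any single one.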
Limited Benefits of Heterogeneous Dual-Task Training on Transfer Effects in Older Adults.
Lussier, Maxime; Brouillard, Philippe; Bherer, Louis
2017-09-01
It has often been reported that cognitive training has limited transfer effects. The present study addresses training context variability as a factor that could increase transfer effects, as well as the manifestation through time of transfer effects. Fifty-eight older adults were assigned to an active placebo or two dual-task training conditions, one in which the training context varies between sessions (heterogeneous training) and the other in a fixed training context (homogeneous training). Transfer was assessed with near and far-modality transfer tasks. Results show that heterogeneous and homogeneous training led to larger near-modality transfer effects than an active placebo (computer lessons). Transfer effects were roughly comparable in both training groups, but heterogeneous training led to a steeper improvement of the dual-task coordination learning curve within training sessions. Also, results indicated that dual-task cost did not improve in the active placebo group from the pre- to the post-training sessions. Heterogeneous training showed modest advantages over homogeneous training. Results also suggest that transfer effects on dual-task cost induced by training take place early on in the post-training session. These findings provide valuable insights on benefits arising from variability in the training protocol for maximizing transfer effects. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Talke, Pekka O; Sharma, Deepak; Heyer, Eric J; Bergese, Sergio D; Blackham, Kristine A; Stevens, Robert D
2014-04-01
Literature on the anesthetic management of endovascular treatment of acute ischemic stroke (AIS) is limited. Anesthetic management during these procedures is still mostly dependent on individual or institutional preferences. Thus, the Society of Neuroscience in Anesthesiology and Critical Care (SNACC) created a task force to provide expert consensus recommendations on anesthetic management of endovascular treatment of AIS. The task force conducted a systematic literature review (up to August 2012). Because of the limited number of research articles relating to this subject, the task force solicited opinions from experts in this area. The task force created a draft consensus statement based on the available data. Classes of recommendations and levels of evidence were assigned to articles specifically addressing anesthetic management during endovascular treatment of stroke using the standard American Heart Association evidence rating scheme. The draft consensus statement was reviewed by the Task Force, SNACC Executive Committee and representatives of Society of NeuroInterventional Surgery (SNIS) and Neurocritical Care Society (NCS) reaching consensus on the final document. For this consensus statement the anesthetic management of endovascular treatment of AIS was subdivided into 12 topics. Each topic includes a summary of available data followed by recommendations. This consensus statement is intended for use by individuals involved in the care of patients with acute ischemic stroke, such as anesthesiologists, interventional neuroradiologists, neurologists, neurointensivists, and neurosurgeons.
Computer usage and task-switching during resident's working day: Disruptive or not?
Méan, Marie; Garnier, Antoine; Wenger, Nathalie; Castioni, Julien; Waeber, Gérard; Marques-Vidal, Pedro
2017-01-01
Recent implementation of electronic health records (EHR) has dramatically changed medical ward organization. While residents in general internal medicine use EHR systems half of their working time, whether computer usage impacts residents' workflow remains uncertain. We aimed to observe the frequency of task-switches occurring during residents' work and to assess whether computer usage was associated with task-switching. In a large Swiss academic university hospital, we conducted, between May 26 and July 24, 2015, a time-motion study to assess how residents in general internal medicine organize their working day. We observed 49 day and 17 evening shifts of 36 residents, amounting to 697 working hours. During day shifts, residents spent 5.4 hours using a computer (mean total working time: 11.6 hours per day). On average, residents switched 15 times per hour from one task to another. Task-switching peaked between 8:00-9:00 and 16:00-17:00. Task-switching was not associated with residents' characteristics, and no association was found between task-switching and extra hours (Spearman r = 0.220, p = 0.137 for day and r = 0.483, p = 0.058 for evening shifts). Computer usage occurred more frequently at the beginning or end of day shifts and was associated with decreased overall task-switching. Task-switching occurs very frequently during residents' working day. Despite the fact that residents used a computer half of their working time, computer usage was associated with decreased task-switching. Whether frequent task-switches and computer usage impact the quality of patient care and residents' work must be evaluated in further studies.
A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.
Xie, Zhiqiang; Shao, Xia; Xin, Yu
2016-01-01
To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective.
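A simplified sketch of the two strategies, assuming the task graph is given as predecessor/successor maps with per-node costs: nodes are ordered by predecessor-layer priority, with ties broken by the longest remaining path. The static tail-path used here is a stand-in for the dynamic essential path, which the real algorithm recomputes from actual computation and communication costs during scheduling:

```python
def schedule_order(preds, succs, cost):
    """Order task nodes: lower predecessor layer first; within a layer,
    the node heading the longest remaining path goes first."""
    layer, tail = {}, {}

    def L(n):  # layer = length of the longest chain of predecessors
        if n not in layer:
            layer[n] = 0 if not preds[n] else 1 + max(L(p) for p in preds[n])
        return layer[n]

    def T(n):  # tail = cost of the longest path from n to an exit node
        if n not in tail:
            tail[n] = cost[n] + max((T(s) for s in succs[n]), default=0)
        return tail[n]

    return sorted(preds, key=lambda n: (L(n), -T(n)))

# Tiny diamond DAG: a feeds b and c, which both feed d.
preds = {"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]}
succs = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
cost  = {"a": 2, "b": 5, "c": 1, "d": 3}
print(schedule_order(preds, succs, cost))  # ['a', 'b', 'c', 'd']
```

Here b and c share a layer, and b is scheduled first because its remaining path (5 + 3) dominates c's (1 + 3), mirroring the tie-break rule described in the abstract.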
Kumar, Anand; Ciccarese, Paolo; Quaglini, Silvana; Stefanelli, Mario; Caffi, Ezio; Boiocchi, Lorenzo
2003-01-01
Medical knowledge in clinical practice guideline (GL) texts is the source of task-based computer-interpretable clinical guideline models (CIGMs). We have used Unified Medical Language System (UMLS) semantic types (STs) to understand the percentage of GL text which belongs to a particular ST. We also use the UMLS semantic network together with the CIGM-specific ontology to derive a semantic meaning behind the GL text. In order to achieve this objective, we took nine GL texts from the National Guideline Clearinghouse (NGC) and marked up the text dealing with a particular ST. The STs we took into consideration were restricted taking into account the requirements of a task-based CIGM. We used DARPA Agent Markup Language and Ontology Inference Layer (DAML + OIL) to create the UMLS and CIGM specific semantic network. For the latter, as a bench test, we used the 1999 WHO-International Society of Hypertension Guidelines for the Management of Hypertension. We took into consideration the UMLS STs closest to the clinical tasks. The percentage of the GL text dealing with the ST "Health Care Activity" and subtypes "Laboratory Procedure", "Diagnostic Procedure" and "Therapeutic or Preventive Procedure" were measured. The parts of text belonging to other STs or comments were separated. A mapping of terms belonging to other STs was done to the STs under "HCA" for representation in DAML + OIL. As a result, we found that the three STs under "HCA" were the predominant STs present in the GL text. In cases where the terms of related STs existed, they were mapped into one of the three STs. The DAML + OIL representation was able to describe the hierarchy in task-based CIGMs. To conclude, we understood that the three STs could be used to represent the semantic network of the task-based CIGMs. We identified some mapping operators which could be used for the mapping of other STs into these.
2010-12-01
A European Federation of Neurological Societies/Peripheral Nerve Society consensus guideline on the definition, investigation, and treatment of multifocal motor neuropathy (MMN) was published in 2006. The aim is to revise this guideline. Disease experts considered references retrieved from MEDLINE and Cochrane Systematic Reviews published between August 2004 and July 2009 and prepared statements that were agreed to in an iterative fashion. The Task Force agreed on Good Practice Points to define clinical and electrophysiological diagnostic criteria for MMN, investigations to be considered, and principal recommendations for treatment. © 2010 Peripheral Nerve Society.
Task Selection, Task Switching and Multitasking during Computer-Based Independent Study
ERIC Educational Resources Information Center
Judd, Terry
2015-01-01
Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…
NASA Technical Reports Server (NTRS)
Smith, M. E.; Gevins, A.; Brown, H.; Karnik, A.; Du, R.
2001-01-01
Electroencephalographic (EEG) recordings were made while 16 participants performed versions of a personal-computer-based flight simulation task of low, moderate, or high difficulty. As task difficulty increased, frontal midline theta EEG activity increased and alpha band activity decreased. A participant-specific function that combined multiple EEG features to create a single load index was derived from a sample of each participant's data and then applied to new test data from that participant. Index values were computed for every 4 s of task data. Across participants, mean task load index values increased systematically with increasing task difficulty and differed significantly between the different task versions. Actual or potential applications of this research include the use of multivariate EEG-based methods to monitor task loading during naturalistic computer-based work.
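One way to realize such a participant-specific index, assuming per-epoch band-power features (say, frontal midline theta and parietal alpha) and labeled low-/high-load calibration epochs; the logistic-regression combiner and the synthetic data are illustrative, not the function the study derived:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Calibration sample: one row of band-power features per 4-s epoch,
# labeled 0 (low load) or 1 (high load). Synthetic stand-in data.
X_cal = rng.normal(size=(200, 2))  # [theta power, alpha power]
y_cal = (X_cal[:, 0] - X_cal[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

index_fn = LogisticRegression().fit(X_cal, y_cal)  # participant-specific weights

# New test data from the same participant: one load-index value per 4-s epoch.
X_test = rng.normal(size=(50, 2))
load_index = index_fn.predict_proba(X_test)[:, 1]
print(load_index[:5])
```

The key property mirrored here is the train-once, apply-later split: the weights are fit on one sample of a participant's data and then produce an index value for every subsequent 4-s epoch.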
A learnable parallel processing architecture towards unity of memory and computing
NASA Astrophysics Data System (ADS)
Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.
2015-08-01
Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.
Radiological protection in computed tomography and cone beam computed tomography.
Rehani, M M
2015-06-01
The International Commission on Radiological Protection (ICRP) has sustained interest in radiological protection in computed tomography (CT), and ICRP Publications 87 and 102 focused on the management of patient doses in CT and multi-detector CT (MDCT) respectively. ICRP forecasted and 'sounded the alarm' on increasing patient doses in CT, and recommended actions for manufacturers and users. One of the approaches was that safety is best achieved when it is built into the machine, rather than left as a matter of choice for users. In view of upcoming challenges posed by newer systems that use cone beam geometry for CT (CBCT), and their widened usage, often by untrained users, a new ICRP task group has been working on radiological protection issues in CBCT. Some of the issues identified by the task group are: lack of standardisation of dosimetry in CBCT; the false belief within the medical and dental community that CBCT is a 'light', low-dose CT whereas mobile CBCT units and newer applications, particularly C-arm CT in interventional procedures, involve higher doses; lack of training in radiological protection among clinical users; and lack of dose information and tracking in many applications. This paper provides a summary of approaches used in CT and MDCT, and preliminary information regarding work just published for radiological protection in CBCT. © The International Society for Prosthetics and Orthotics Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Enhanced delegated computing using coherence
NASA Astrophysics Data System (ADS)
Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.
2016-03-01
A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
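The classical slice of this idea can be seen in a toy one-time-pad exchange: a client restricted to XOR can delegate any XOR-linear function while keeping its data hidden; the universality result, which needs the superposed information carriers, is deliberately not captured in this sketch:

```python
import secrets

def delegate_parity(bits):
    """Client hides its input with a random pad (XOR only); the server
    evaluates the linear function on masked data and learns nothing
    about the true bits; the client unmasks with one more XOR."""
    pad = [secrets.randbits(1) for _ in bits]
    masked = [b ^ p for b, p in zip(bits, pad)]  # client side: XOR only

    server_out = 0
    for m in masked:                             # server side, blind
        server_out ^= m

    pad_parity = 0
    for p in pad:                                # client side again
        pad_parity ^= p
    return server_out ^ pad_parity

print(delegate_parity([1, 0, 1, 1]))  # 1, the true parity
```

Classically this trick stops at linear functions; the experiment's point is that coherent carriers lift an XOR-limited client to universal, secure delegated computation.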
Order of stimulus presentation influences children's acquisition in receptive identification tasks.
Petursdottir, Anna Ingeborg; Aguilar, Gabriella
2016-03-01
Receptive identification is usually taught in matching-to-sample format, which entails the presentation of an auditory sample stimulus and several visual comparison stimuli in each trial. Conflicting recommendations exist regarding the order of stimulus presentation in matching-to-sample trials. The purpose of this study was to compare acquisition in receptive identification tasks under 2 conditions: when the sample was presented before the comparisons (sample first) and when the comparisons were presented before the sample (comparison first). Participants included 4 typically developing kindergarten-age boys. Stimuli, which included birds and flags, were presented on a computer screen. Acquisition in the 2 conditions was compared in an adapted alternating-treatments design combined with a multiple baseline design across stimulus sets. All participants took fewer trials to meet the mastery criterion in the sample-first condition than in the comparison-first condition. © 2015 Society for the Experimental Analysis of Behavior.
Cuing consumerism: situational materialism undermines personal and social well-being.
Bauer, Monika A; Wilkie, James E B; Kim, Jung K; Bodenhausen, Galen V
2012-05-01
Correlational evidence indicates that materialistic individuals experience relatively low levels of well-being. Across four experiments, we found that situational cuing can also trigger materialistic mind-sets, with similarly negative personal and social consequences. Merely viewing desirable consumer goods resulted in increases in materialistic concerns and led to heightened negative affect and reduced social involvement (Experiment 1). Framing a computer task as a "Consumer Reaction Study" led to a stronger automatic bias toward values reflecting self-enhancement, compared with framing the same task as a "Citizen Reaction Study" (Experiment 2). Consumer cues also increased competitiveness (Experiment 3) and selfishness in a water-conservation dilemma (Experiment 4). Thus, the costs of materialism are not localized only in particularly materialistic people, but can also be found in individuals who happen to be exposed to environmental cues that activate consumerism-cues that are commonplace in contemporary society.
Task allocation in a distributed computing system
NASA Technical Reports Server (NTRS)
Seward, Walter D.
1987-01-01
A conceptual framework is examined for task allocation in distributed systems. Application and computing system parameters critical to task allocation decision processes are discussed. Task allocation techniques are addressed which focus on achieving a balance in the load distribution among the system's processors. Equalization of computing load among the processing elements is the goal. Examples of system performance are presented for specific applications. Both static and dynamic allocation of tasks are considered and system performance is evaluated using different task allocation methodologies.
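As one concrete instance of the load-balancing techniques surveyed, a greedy longest-processing-time-first allocator assigns each task to the currently least-loaded processor; the heuristic and the toy costs are illustrative:

```python
import heapq

def lpt_allocate(task_costs, n_procs):
    """Longest-processing-time-first: visit tasks in decreasing cost and
    place each on the processor with the smallest current load."""
    heap = [(0.0, p) for p in range(n_procs)]  # (load, processor id)
    heapq.heapify(heap)
    assignment = {}
    for task, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        load, p = heapq.heappop(heap)
        assignment[task] = p
        heapq.heappush(heap, (load + cost, p))
    return assignment

print(lpt_allocate([8, 7, 6, 5, 4], 2))
# {0: 0, 1: 1, 2: 1, 3: 0, 4: 0} -> final loads 17 and 13
```

Dynamic allocation, also covered by the paper, would instead re-run such a placement decision at runtime as loads drift.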
Primary School Children's Collaboration: Task Presentation and Gender Issues.
ERIC Educational Resources Information Center
Fitzpatrick, Helen; Hardman, Margaret
2000-01-01
Explores the characteristics of social interaction during an English language based task in the primary classroom, and the role of the computer in structuring collaboration when compared to a non-computer mode. Explains that seven and nine year old boys and girls (n=120) completed a computer and non-computer task. (CMK)
Matsumoto, Keiichi; Endo, Keigo
2013-06-01
Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy was examined for numerous applications of PETquact. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is well suited to analysis under the two kinds of Japanese guidelines, showing excellent performance in both performance measurement and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.
Prisman, Eitan; Daly, Michael J; Chan, Harley; Siewerdsen, Jeffrey H; Vescan, Allan; Irish, Jonathan C
2011-01-01
Custom software was developed to integrate intraoperative cone-beam computed tomography (CBCT) images with endoscopic video for surgical navigation and guidance. A cadaveric head was used to assess the accuracy and potential clinical utility of the following functionality: (1) real-time tracking of the endoscope in intraoperative 3-dimensional (3D) CBCT; (2) projecting an orthogonal reconstructed CBCT image, at or beyond the endoscope, which is parallel to the tip of the endoscope corresponding to the surgical plane; (3) virtual reality fusion of endoscopic video and 3D CBCT surface rendering; and (4) overlay of preoperatively defined contours of anatomical structures of interest. Anatomical landmarks were contoured in CBCT of a cadaveric head. An experienced endoscopic surgeon was oriented to the software and asked to rate the utility of the navigation software in carrying out predefined surgical tasks. Utility was evaluated using a rating scale for: (1) safely completing the task; and (2) potential for surgical training. Surgical tasks included: (1) uncinectomy; (2) ethmoidectomy; (3) sphenoidectomy/pituitary resection; and (4) clival resection. CBCT images were updated following each ablative task. As a teaching tool, the software was evaluated as "very useful" for all surgical tasks. Regarding safety and task completion, the software was evaluated as "no advantage" for task (1), "minimal" for task (2), and "very useful" for tasks (3) and (4). Landmark identification for structures behind bone was "very useful" for both categories. The software increased surgical confidence in safely completing challenging ablative tasks by presenting real-time image guidance for highly complex ablative procedures. In addition, such technology offers a valuable teaching aid to surgeons in training. Copyright © 2011 American Rhinologic Society-American Academy of Otolaryngic Allergy, LLC.
Douglas, Pamela S; Khandheria, Bijoy; Stainback, Raymond F; Weissman, Neil J; Peterson, Eric D; Hendel, Robert C; Stainback, Raymond F; Blaivas, Michael; Des Prez, Roger D; Gillam, Linda D; Golash, Terry; Hiratzka, Loren F; Kussmaul, William G; Labovitz, Arthur J; Lindenfeld, JoAnn; Masoudi, Frederick A; Mayo, Paul H; Porembka, David; Spertus, John A; Wann, L Samuel; Wiegers, Susan E; Brindis, Ralph G; Douglas, Pamela S; Hendel, Robert C; Patel, Manesh R; Peterson, Eric D; Wolk, Michael J; Allen, Joseph M
2008-03-18
The American College of Cardiology Foundation (ACCF) and the American Society of Echocardiography (ASE) together with key specialty and subspecialty societies, conducted an appropriateness review for stress echocardiography. The review assessed the risks and benefits of stress echocardiography for several indications or clinical scenarios and scored them on a scale of 1 to 9 (based upon methodology developed by the ACCF to assess imaging appropriateness). The upper range (7 to 9) implies that the test is generally acceptable and is a reasonable approach, and the lower range (1 to 3) implies that the test is generally not acceptable and is not a reasonable approach. The midrange (4 to 6) indicates a clinical scenario for which the indication for a stress echocardiogram is uncertain. The indications for this review were drawn from common applications or anticipated uses, as well as from current clinical practice guidelines. Use of stress echocardiography for risk assessment in patients with coronary artery disease (CAD) was viewed favorably, while routine repeat testing and general screening in certain clinical scenarios were viewed less favorably. It is anticipated that these results will have a significant impact on physician decision making and performance, reimbursement policy, and will help guide future research.
Influence of rotator cuff tears on glenohumeral stability during abduction tasks.
Hölscher, Thomas; Weber, Tim; Lazarev, Igor; Englert, Carsten; Dendorfer, Sebastian
2016-09-01
One of the main goals in reconstructing rotator cuff tears is the restoration of glenohumeral joint stability, which is subsequently of utmost importance in order to prevent degenerative damage such as superior labral anterior posterior (SLAP) lesion, arthrosis, and malfunction. The goal of the current study was to use musculoskeletal models to estimate the glenohumeral instability introduced by muscle weakness due to cuff lesions. Inverse dynamics simulations were used to compute joint reaction forces for several static abduction tasks with different muscle weaknesses. Results were compared with the existing literature in order to ensure model validity. Further arm positions taken from activities of daily living, requiring the rotator cuff muscles, were modeled and their contribution to joint kinetics computed. Weakness of the superior rotator cuff muscles (supraspinatus; infraspinatus) leads to a deviation of the joint reaction force to the cranial dorsal rim of the glenoid. Massive rotator cuff defects showed higher potential for glenohumeral instability in contrast to single muscle ruptures. The teres minor muscle seems to substitute lost joint torque during several simulated muscle tears to maintain joint stability. Joint instability increases with cuff tear size. Weakness of the upper part of the rotator cuff leads to a joint reaction force closer to the upper glenoid rim. This indicates the comorbidity of cuff tears with SLAP lesions. The teres minor is crucial for maintaining joint stability in case of massive cuff defects and should be uprated in clinical decision-making. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:1628-1635, 2016.
Hormonal Replacement in Hypopituitarism in Adults: An Endocrine Society Clinical Practice Guideline.
Fleseriu, Maria; Hashim, Ibrahim A; Karavitaki, Niki; Melmed, Shlomo; Murad, M Hassan; Salvatori, Roberto; Samuels, Mary H
2016-11-01
To formulate clinical practice guidelines for hormonal replacement in hypopituitarism in adults. The participants include an Endocrine Society-appointed Task Force of six experts, a methodologist, and a medical writer. The American Association for Clinical Chemistry, the Pituitary Society, and the European Society of Endocrinology co-sponsored this guideline. The Task Force developed this evidence-based guideline using the Grading of Recommendations, Assessment, Development, and Evaluation system to describe the strength of recommendations and the quality of evidence. The Task Force commissioned two systematic reviews and used the best available evidence from other published systematic reviews and individual studies. One group meeting, several conference calls, and e-mail communications enabled consensus. Committees and members of the Endocrine Society, the American Association for Clinical Chemistry, the Pituitary Society, and the European Society of Endocrinology reviewed and commented on preliminary drafts of these guidelines. Using an evidence-based approach, this guideline addresses important clinical issues regarding the evaluation and management of hypopituitarism in adults, including appropriate biochemical assessments, specific therapeutic decisions to decrease the risk of co-morbidities due to hormonal over-replacement or under-replacement, and managing hypopituitarism during pregnancy, pituitary surgery, and other types of surgeries.
Task allocation model for minimization of completion time in distributed computer systems
NASA Astrophysics Data System (ADS)
Wang, Jai-Ping; Steidley, Carl W.
1993-08-01
A task in a distributed computing system consists of a set of related modules. Each of the modules will execute on one of the processors of the system and communicate with some other modules. In addition, precedence relationships may exist among the modules. Task allocation is an essential activity in distributed-software design. This activity is of importance to all phases of the development of a distributed system. This paper establishes task completion-time models and task allocation models for minimizing task completion time. Current work in this area is either at the experimental level or without the consideration of precedence relationships among modules. The development of mathematical models for the computation of task completion time and task allocation will benefit many real-time computer applications such as radar systems, navigation systems, industrial process control systems, image processing systems, and artificial intelligence oriented systems.
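A minimal version of such a model, assuming serial execution plus a fixed communication charge whenever two communicating modules land on different processors; the precedence relationships the paper adds are omitted here:

```python
from itertools import product

def completion_time(assign, exec_cost, comm_cost):
    """assign[m] = processor chosen for module m. Toy serial model:
    every module's execution cost on its processor, plus communication
    cost for each talking pair split across processors."""
    t = sum(exec_cost[m][assign[m]] for m in range(len(assign)))
    for (a, b), c in comm_cost.items():
        if assign[a] != assign[b]:
            t += c
    return t

exec_cost = [[4, 7], [5, 3], [6, 6]]  # rows: modules, cols: processors
comm_cost = {(0, 1): 2, (1, 2): 4}    # communicating module pairs

best = min(product(range(2), repeat=3),
           key=lambda a: completion_time(a, exec_cost, comm_cost))
print(best, completion_time(best, exec_cost, comm_cost))
```

Exhaustive search works only for tiny module sets; the point of developing mathematical allocation models is to make this minimization tractable for the real-time applications the paper lists.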
A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.
Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao
2018-05-23
The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.
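The "dynamically select suitable algorithms" idea can be sketched as a registry that dispatches each batch of computer tasks to a strategy matching its objective; the objective names and both toy policies below are assumptions for illustration:

```python
from typing import Callable, Dict, List

Scheduler = Callable[[List[float], int], Dict[int, int]]
SCHEDULERS: Dict[str, Scheduler] = {}

def register(objective: str):
    def wrap(fn: Scheduler) -> Scheduler:
        SCHEDULERS[objective] = fn
        return fn
    return wrap

@register("min_makespan")
def greedy_balance(costs, n_nodes):
    """Spread load: heaviest tasks first, always to the lightest node."""
    loads = [0.0] * n_nodes
    plan = {}
    for t, c in sorted(enumerate(costs), key=lambda x: -x[1]):
        n = loads.index(min(loads))
        plan[t] = n
        loads[n] += c
    return plan

@register("min_energy")
def consolidate(costs, n_nodes):
    """Pack everything onto one node so the others can idle (toy policy)."""
    return {t: 0 for t in range(len(costs))}

def schedule(objective: str, costs: List[float], n_nodes: int):
    return SCHEDULERS[objective](costs, n_nodes)

print(schedule("min_makespan", [3.0, 1.0, 2.0], 2))  # {0: 0, 2: 1, 1: 1}
```

The framework in the paper layers this kind of selection over a cross-layer resource model, so the same task can be steered to different layers of the cloud depending on its objective.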
Doherty, John U; Kort, Smadar; Mehran, Roxana; Schoenhagen, Paul; Soman, Prem; Dehmer, Greg J; Doherty, John U; Schoenhagen, Paul; Amin, Zahid; Bashore, Thomas M; Boyle, Andrew; Calnon, Dennis A; Carabello, Blase; Cerqueira, Manuel D; Conte, John; Desai, Milind; Edmundowicz, Daniel; Ferrari, Victor A; Ghoshhajra, Brian; Mehrotra, Praveen; Nazarian, Saman; Reece, T Brett; Tamarappoo, Balaji; Tzou, Wendy S; Wong, John B; Doherty, John U; Dehmer, Gregory J; Bailey, Steven R; Bhave, Nicole M; Brown, Alan S; Daugherty, Stacie L; Dean, Larry S; Desai, Milind Y; Duvernoy, Claire S; Gillam, Linda D; Hendel, Robert C; Kramer, Christopher M; Lindsay, Bruce D; Manning, Warren J; Mehrotra, Praveen; Patel, Manesh R; Sachdeva, Ritu; Wann, L Samuel; Winchester, David E; Wolk, Michael J; Allen, Joseph M
2018-04-01
This document is 1 of 2 companion appropriate use criteria (AUC) documents developed by the American College of Cardiology, American Association for Thoracic Surgery, American Heart Association, American Society of Echocardiography, American Society of Nuclear Cardiology, Heart Rhythm Society, Society for Cardiovascular Angiography and Interventions, Society of Cardiovascular Computed Tomography, Society for Cardiovascular Magnetic Resonance, and Society of Thoracic Surgeons. This document addresses the evaluation and use of multimodality imaging in the diagnosis and management of valvular heart disease, whereas the second, companion document addresses this topic with regard to structural heart disease. Although there is clinical overlap, the documents addressing valvular and structural heart disease are published separately, albeit with a common structure. The goal of the companion AUC documents is to provide a comprehensive resource for multimodality imaging in the context of valvular and structural heart disease, encompassing multiple imaging modalities. Using standardized methodology, the clinical scenarios (indications) were developed by a diverse writing group to represent patient presentations encountered in everyday practice and included common applications and anticipated uses. Where appropriate, the scenarios were developed on the basis of the most current American College of Cardiology/American Heart Association guidelines. A separate, independent rating panel scored the 92 clinical scenarios in this document on a scale of 1 to 9. Scores of 7 to 9 indicate that a modality is considered appropriate for the clinical scenario presented. Midrange scores of 4 to 6 indicate that a modality may be appropriate for the clinical scenario, and scores of 1 to 3 indicate that a modality is considered rarely appropriate for the clinical scenario. The primary objective of the AUC is to provide a framework for the assessment of these scenarios by practices that will improve and standardize physician decision making. AUC publications reflect an ongoing effort by the American College of Cardiology to critically and systematically create, review, and categorize clinical situations where diagnostic tests and procedures are utilized by physicians caring for patients with cardiovascular diseases. The process is based on the current understanding of the technical capabilities of the imaging modalities examined. Copyright © 2017. Published by Elsevier Inc.
Doherty, John U; Kort, Smadar; Mehran, Roxana; Schoenhagen, Paul; Soman, Prem
2017-12-01
This document is 1 of 2 companion appropriate use criteria (AUC) documents developed by the American College of Cardiology, American Association for Thoracic Surgery, American Heart Association, American Society of Echocardiography, American Society of Nuclear Cardiology, Heart Rhythm Society, Society for Cardiovascular Angiography and Interventions, Society of Cardiovascular Computed Tomography, Society for Cardiovascular Magnetic Resonance, and Society of Thoracic Surgeons. This document addresses the evaluation and use of multimodality imaging in the diagnosis and management of valvular heart disease, whereas the second, companion document addresses this topic with regard to structural heart disease. Although there is clinical overlap, the documents addressing valvular and structural heart disease are published separately, albeit with a common structure. The goal of the companion AUC documents is to provide a comprehensive resource for multimodality imaging in the context of valvular and structural heart disease, encompassing multiple imaging modalities. Using standardized methodology, the clinical scenarios (indications) were developed by a diverse writing group to represent patient presentations encountered in everyday practice and included common applications and anticipated uses. Where appropriate, the scenarios were developed on the basis of the most current American College of Cardiology/American Heart Association guidelines. A separate, independent rating panel scored the 92 clinical scenarios in this document on a scale of 1 to 9. Scores of 7 to 9 indicate that a modality is considered appropriate for the clinical scenario presented. Midrange scores of 4 to 6 indicate that a modality may be appropriate for the clinical scenario, and scores of 1 to 3 indicate that a modality is considered rarely appropriate for the clinical scenario. The primary objective of the AUC is to provide a framework for the assessment of these scenarios by practices that will improve and standardize physician decision making. AUC publications reflect an ongoing effort by the American College of Cardiology to critically and systematically create, review, and categorize clinical situations where diagnostic tests and procedures are utilized by physicians caring for patients with cardiovascular diseases. The process is based on the current understanding of the technical capabilities of the imaging modalities examined.
Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M
2015-09-01
The aim of this study was to develop and validate a computational model of the automation complacency effect as operators work on a robotic arm task supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those were formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without the need for human-in-the-loop (HITL) experimentation, the merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and the responses to automation failures after complacency developed. However, the scanning models do not account for all of the attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.
Accelerating EPI distortion correction by utilizing a modern GPU-based parallel computation.
Yang, Yao-Hao; Huang, Teng-Yi; Wang, Fu-Nien; Chuang, Tzu-Chao; Chen, Nan-Kuei
2013-04-01
The combination of phase demodulation and field mapping is a practical method to correct echo planar imaging (EPI) geometric distortion. However, since phase dispersion accumulates in each phase-encoding step, the calculation complexity of phase demodulation is Ny-fold higher than that of conventional image reconstruction. Thus, correcting EPI images via phase demodulation is generally a time-consuming task. Parallel computing employing general-purpose calculations on graphics processing units (GPU) can accelerate scientific computing if the algorithm is parallelized. This study proposes a method that incorporates the GPU-based technique into phase demodulation calculations to reduce computation time. The proposed parallel algorithm was applied to a PROPELLER-EPI diffusion tensor data set. The GPU-based phase demodulation method correctly reduced the EPI distortion and accelerated the computation. The total reconstruction time of the 16-slice PROPELLER-EPI diffusion tensor images with matrix size of 128 × 128 was reduced from 1,754 seconds to 101 seconds by utilizing the parallelized 4-GPU program. GPU computing is a promising method to accelerate EPI geometric correction. The resulting reduction in computation time of phase demodulation should accelerate postprocessing for studies performed with EPI, and should make the PROPELLER-EPI technique practical for clinical use. Copyright © 2011 by the American Society of Neuroimaging.
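The Ny-fold cost the abstract mentions comes from performing one demodulation per output line. A minimal sketch of such a conjugate-phase correction follows; the array layout, variable names, and the idea of swapping NumPy for CuPy to obtain GPU parallelism are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
# import cupy as np   # assumption: CuPy's drop-in API would run the same math on a GPU

def conjugate_phase_correct(hybrid_xky, fieldmap_hz, echo_spacing_s):
    """Toy conjugate-phase (phase demodulation) EPI correction.

    hybrid_xky  : complex (Nx, Ny) data, already Fourier-transformed along readout
    fieldmap_hz : (Nx, Ny) off-resonance field map in Hz
    """
    nx, ny = hybrid_xky.shape
    t = np.arange(ny) * echo_spacing_s   # acquisition time of each ky line
    ky = np.fft.fftfreq(ny)              # normalized phase-encode frequencies
    img = np.zeros((nx, ny), dtype=complex)
    for j in range(ny):                  # one demodulation per output row: the Ny-fold cost
        # undo both the encoding phase and the field-induced phase accrued by line time t
        phase = np.exp(1j * 2 * np.pi * (ky * j + fieldmap_hz[:, j:j + 1] * t))
        img[:, j] = (hybrid_xky * phase).sum(axis=1)
    return img
```

Each output column is independent of the others, which is why this loop maps cleanly onto GPU threads.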
Evaluation of Ground Vibrations Induced by Military Noise Sources
2006-08-01
Report contents (from the table of contents): Task 2—Determine the acoustic-to-seismic coupling coefficients C1 and C2; Task 3—Computational modeling of acoustically induced ground motion, including a simple model of blast sound interaction with the ground under the measured ground conditions.
Computer task performance by subjects with Duchenne muscular dystrophy.
Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira
2016-01-01
Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group throughout the study. DMD participants improved their performance after practicing a computational task; however, the difference in MT between the DMD and control groups persisted across all attempts. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not by their age or overall functionality.
Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda
2017-01-01
Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in the cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda
2017-01-01
Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in the cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
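For concreteness, here is a minimal sketch of one of the six heuristics compared above, Min-min, under the usual simplifying assumption that a task's execution time is its length divided by VM speed; the data layout is invented for illustration.

```python
def min_min(task_lengths, vm_speeds):
    """Min-min heuristic: repeatedly commit the task whose earliest possible
    completion time is smallest, on the VM that achieves it."""
    ready = [0.0] * len(vm_speeds)            # time at which each VM becomes free
    unscheduled = dict(enumerate(task_lengths))
    placement = {}
    while unscheduled:
        t, v, completion = min(
            ((t, v, ready[v] + length / vm_speeds[v])
             for t, length in unscheduled.items()
             for v in range(len(vm_speeds))),
            key=lambda x: x[2],
        )
        placement[t] = v
        ready[v] = completion
        del unscheduled[t]
    return placement

print(min_min([8, 4, 6, 2], vm_speeds=[1.0, 2.0]))  # -> {3: 1, 1: 1, 2: 1, 0: 0}
```

Max-min differs only in committing the task with the largest best completion time first, and FCFS simply takes tasks in arrival order.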
Treatment of Cushing's Syndrome: An Endocrine Society Clinical Practice Guideline
Nieman, Lynnette K.; Biller, Beverly M. K.; Findling, James W.; Murad, M. Hassan; Newell-Price, John; Savage, Martin O.; Tabarin, Antoine
2015-01-01
Objective: The objective is to formulate clinical practice guidelines for treating Cushing's syndrome. Participants: Participants include an Endocrine Society-appointed Task Force of experts, a methodologist, and a medical writer. The European Society for Endocrinology co-sponsored the guideline. Evidence: The Task Force used the Grading of Recommendations, Assessment, Development, and Evaluation system to describe the strength of recommendations and the quality of evidence. The Task Force commissioned three systematic reviews and used the best available evidence from other published systematic reviews and individual studies. Consensus Process: The Task Force achieved consensus through one group meeting, several conference calls, and numerous e-mail communications. Committees and members of The Endocrine Society and the European Society of Endocrinology reviewed and commented on preliminary drafts of these guidelines. Conclusions: Treatment of Cushing's syndrome is essential to reduce mortality and associated comorbidities. Effective treatment includes the normalization of cortisol levels or action. It also includes the normalization of comorbidities via directly treating the cause of Cushing's syndrome and by adjunctive treatments (eg, antihypertensives). Surgical resection of the causal lesion(s) is generally the first-line approach. The choice of second-line treatments, including medication, bilateral adrenalectomy, and radiation therapy (for corticotrope tumors), must be individualized to each patient. PMID:26222757
Abbara, Suhny; Blanke, Philipp; Maroules, Christopher D; Cheezum, Michael; Choi, Andrew D; Han, B Kelly; Marwan, Mohamed; Naoum, Chris; Norgaard, Bjarne L; Rubinshtein, Ronen; Schoenhagen, Paul; Villines, Todd; Leipsic, Jonathon
In response to recent technological advancements in acquisition techniques as well as a growing body of evidence regarding the optimal performance of coronary computed tomography angiography (coronary CTA), the Society of Cardiovascular Computed Tomography Guidelines Committee has produced this update to its previously established 2009 "Guidelines for the Performance of Coronary CTA" (1). The purpose of this document is to provide standards meant to ensure reliable practice methods and quality outcomes based on the best available data in order to improve the diagnostic care of patients. The Society of Cardiovascular Computed Tomography Guidelines for the Interpretation are published separately (2). The Society of Cardiovascular Computed Tomography Guidelines Committee ensures compliance with all existing standards for the declaration of conflict of interest by all authors and reviewers for the purpose of clarity and transparency. Copyright © 2016 Society of Cardiovascular Computed Tomography. All rights reserved.
1986-02-20
Fragment: discusses the event-related brain potential (ERP) and measures of the electromyogram, P300 amplitude, and whether manipulations of primary-task difficulty attenuate memory data limits by improving the memory representation of the task; cites the Joint EEG Society/Physiological Society Meeting, Bristol (England), 1983.
USDA-ARS?s Scientific Manuscript database
The International Osteoporosis Foundation (IOF) and the International Society for Clinical Densitometry (ISCD) appointed a joint Task Force to develop resource documents in order to make recommendations on how to improve FRAX and better inform clinicians who use FRAX. The Task Force met in November...
ERIC Educational Resources Information Center
Goldhammer, Frank; Naumann, Johannes; Stelter, Annette; Tóth, Krisztina; Rölke, Heiko; Klieme, Eckhard
2014-01-01
Computer-based assessment can provide new insights into behavioral processes of task completion that cannot be uncovered by paper-based instruments. Time presents a major characteristic of the task completion process. Psychologically, time on task has 2 different interpretations, suggesting opposing associations with task outcome: Spending more…
Do we always prioritize balance when walking? Towards an integrated model of task prioritization.
Yogev-Seligmann, Galit; Hausdorff, Jeffrey M; Giladi, Nir
2012-05-01
Previous studies suggest that strategies such as "posture first" are implicitly employed to regulate safety when healthy adults walk while simultaneously performing another task, whereas "posture second" may be inappropriately applied in the presence of neurological disease. However, recent understandings raise questions about the traditional resource allocation concept during walking while dual tasking. We propose a task prioritization model of walking while dual tasking that integrates motor and cognitive capabilities, focusing on postural reserve, hazard estimation, and other individual intrinsic factors. The proposed prioritization model provides a theoretical foundation for future studies and a framework for the development of interventions designed to reduce the profound negative impacts of dual tasking on gait and fall risk in patients with neurological diseases. Copyright © 2012 Movement Disorder Society.
The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment
ERIC Educational Resources Information Center
Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne
2013-01-01
The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…
Neural mechanisms underlying human consensus decision-making
Suzuki, Shinsuke; Adachi, Ryo; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P.
2015-01-01
Consensus building in a group is a hallmark of animal societies, yet little is known about its underlying computational and neural mechanisms. Here, we applied a novel computational framework to behavioral and fMRI data from human participants performing a consensus decision-making task with up to five other participants. We found that participants reached consensus decisions through integrating their own preferences with information about the majority of group members' prior choices, as well as inferences about how much each option was stuck to by the other people. These distinct decision variables were separately encoded in distinct brain areas: the ventromedial prefrontal cortex, posterior superior temporal sulcus/temporoparietal junction and intraparietal sulcus, and were integrated in the dorsal anterior cingulate cortex. Our findings provide support for a theoretical account in which collective decisions are made through integrating multiple types of inference about oneself, others and environments, processed in distinct brain modules. PMID:25864634
Neural mechanisms underlying human consensus decision-making.
Suzuki, Shinsuke; Adachi, Ryo; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P
2015-04-22
Consensus building in a group is a hallmark of animal societies, yet little is known about its underlying computational and neural mechanisms. Here, we applied a computational framework to behavioral and fMRI data from human participants performing a consensus decision-making task with up to five other participants. We found that participants reached consensus decisions through integrating their own preferences with information about the majority group members' prior choices, as well as inferences about how much each option was stuck to by the other people. These distinct decision variables were separately encoded in distinct brain areas-the ventromedial prefrontal cortex, posterior superior temporal sulcus/temporoparietal junction, and intraparietal sulcus-and were integrated in the dorsal anterior cingulate cortex. Our findings provide support for a theoretical account in which collective decisions are made through integrating multiple types of inference about oneself, others, and environments, processed in distinct brain modules. Copyright © 2015 Elsevier Inc. All rights reserved.
Can computational goals inform theories of vision?
Anderson, Barton L
2015-04-01
One of the most lasting contributions of Marr's posthumous book is his articulation of the different "levels of analysis" that are needed to understand vision. Although a variety of work has examined how these different levels are related, there is comparatively little examination of the assumptions on which his proposed levels rest, or the plausibility of the approach Marr articulated given those assumptions. Marr placed particular significance on computational level theory, which specifies the "goal" of a computation, its appropriateness for solving a particular problem, and the logic by which it can be carried out. The structure of computational level theory is inherently teleological: What the brain does is described in terms of its purpose. I argue that computational level theory, and the reverse-engineering approach it inspires, requires understanding the historical trajectory that gave rise to functional capacities that can be meaningfully attributed with some sense of purpose or goal, that is, a reconstruction of the fitness function on which natural selection acted in shaping our visual abilities. I argue that this reconstruction is required to distinguish abilities shaped by natural selection ("natural tasks") from evolutionary "by-products" (spandrels, co-optations, and exaptations), rather than merely demonstrating that computational goals can be embedded in a Bayesian model that renders a particular behavior or process rational. Copyright © 2015 Cognitive Science Society, Inc.
Method and system for benchmarking computers
Gustafson, John L.
1993-09-14
A testing system and method for benchmarking computer systems. The system includes a store containing a scalable set of tasks to be performed to produce a solution in ever-increasing degrees of resolution as a larger number of the tasks are performed. A timing and control module allots to each computer a fixed benchmarking interval in which to perform the stored tasks. Means are provided for determining, after completion of the benchmarking interval, the degree of progress through the scalable set of tasks and for producing a benchmarking rating relating to the degree of progress for each computer.
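The patent's fixed-time idea inverts the usual benchmark: the interval is held constant and the rating is how far through a scalable task set the machine progresses. A minimal sketch, with an invented quadrature task set standing in for the patent's stored tasks:

```python
import time

def benchmark(tasks, interval_s):
    """Run as many tasks from a scalable, ever-finer sequence as fit in a
    fixed interval; the rating is the degree of progress reached. The last
    task started may finish slightly past the deadline."""
    deadline = time.monotonic() + interval_s
    completed = 0
    for task in tasks:            # tasks yield ever-increasing resolution
        if time.monotonic() >= deadline:
            break
        task()
        completed += 1
    return completed              # benchmark rating: progress in fixed time

# Illustrative scalable task set: refine a quadrature estimate of pi at doubling resolution.
def make_pi_task(n):
    def task():
        h = 1.0 / n
        sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) * h for i in range(n))
    return task

rating = benchmark([make_pi_task(2 ** k) for k in range(8, 24)], interval_s=1.0)
```

A faster machine reaches a higher resolution within the same interval, which is exactly the degree-of-progress rating the abstract describes.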
Checkpointing for a hybrid computing node
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cher, Chen-Yong
2016-03-08
According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
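A toy rendering of the claimed sequence (checkpoint locally, resume at once, drain the checkpoint to the host concurrently); the class and the thread-based transfer are illustrative stand-ins, not the patented mechanism:

```python
import copy
import threading

class AcceleratorTask:
    """Stand-in for a task executing on the processing accelerator."""
    def __init__(self):
        self.state = {"iteration": 0}   # state data needed to restart the task

    def step(self):
        self.state["iteration"] += 1

def run_with_checkpoints(task, steps, checkpoint_every, host_memory):
    for i in range(steps):
        task.step()
        if (i + 1) % checkpoint_every == 0:
            # 1) create the checkpoint in the accelerator's local memory
            snapshot = copy.deepcopy(task.state)
            # 2) execution resumes immediately; 3) the transfer to the main
            #    processor overlaps further execution (thread stands in for DMA)
            threading.Thread(target=host_memory.append, args=(snapshot,)).start()

host_memory = []
run_with_checkpoints(AcceleratorTask(), steps=100, checkpoint_every=25,
                     host_memory=host_memory)
```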
The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment
NASA Astrophysics Data System (ADS)
Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne
2013-12-01
The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text and figures without any additional tasks. Participants were 196 ninth-grade students who learned with a self-developed multimedia program in a pretest-posttest control group design. Research results reveal that gap-fill and matching tasks were most effective in promoting knowledge acquisition, followed by multiple-choice tasks, and no tasks at all. The findings are in line with previous research on this topic. The effects can possibly be explained by the generation-recognition model, which predicts that gap-fill and matching tasks trigger more encompassing learning processes than multiple-choice tasks. It is concluded that instructional designers should incorporate more challenging study tasks for enhancing the effectiveness of computer-based learning environments.
Fardon, David F; Williams, Alan L; Dohring, Edward J; Murtagh, F Reed; Gabriel Rothman, Stephen L; Sze, Gordon K
2014-11-15
This article comprises a review of the literature pertaining to the normal and pathological lumbar disc and the compilation of a standardized nomenclature. To provide a resource that promotes a clear understanding of lumbar disc terminology among clinicians, radiologists, and researchers. The article "Nomenclature and Classification of Lumbar Disc Pathology. Recommendations of the Combined Task Forces of the North American Spine Society, American Society of Spine Radiology and American Society of Neuroradiology" was published in 2001 in Spine © Lippincott, Williams and Wilkins and formally endorsed by the 3 boards. Its purpose, which it served for well over a decade, was to promote greater clarity and consistency of usage of spine terminology. Since 2001, there has been sufficient evolution in our understanding of the lumbar disc to suggest the need for revision and updating. The document represents the consensus recommendations of the current combined task forces and reflects changes consistent with current concepts in radiological and clinical care. A PubMed search was performed for literature pertaining to the lumbar disc. The task force members individually and collectively reviewed the literature and revised the 2001 document. It was then reviewed by the governing boards of the American Society of Spine Radiology, the American Society of Neuroradiology, and the North American Spine Society. After further revision based on their feedback, the paper was approved for publication. The article provides a discussion of the recommended diagnostic categories and a glossary of terms pertaining to the lumbar disc, a detailed discussion of the terms and their recommended usage, as well as updated illustrations and literature references. We have revised and updated a document that, since 2001, has provided a widely accepted nomenclature that helps maintain consistency and accuracy in the description of the properties of the normal and abnormal lumbar discs and that serves as a system for classification and reporting built upon that nomenclature.
A History of School Design and Its Indoor Environmental Standards, 1900 to Today
ERIC Educational Resources Information Center
Baker, Lindsay
2012-01-01
Public education is one of the central tasks of a democratic society, and the buildings that house this important task not only shape the way one teaches, but provide icons and symbols for the values people hold in common as a society. Perhaps unsurprisingly, this context has placed school buildings squarely in a position of debate and innovation…
29 CFR 778.313 - Computing overtime pay under the Act for employees compensated on task basis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 3 2010-07-01 2010-07-01 false Computing overtime pay under the Act for employees compensated on task basis. 778.313 Section 778.313 Labor Regulations Relating to Labor (Continued) WAGE AND... TO REGULATIONS OVERTIME COMPENSATION Special Problems "Task" Basis of Payment § 778.313 Computing...
Reinforcement learning in computer vision
NASA Astrophysics Data System (ADS)
Bernstein, A. V.; Burnaev, E. V.
2018-04-01
Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving the corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes reinforcement learning and its use for solving computer vision problems.
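As a concrete instance of learning through interaction with the environment, a tabular Q-learning sketch follows; the env interface (reset, step, actions) is an assumed toy API, not from the paper.

```python
import random

def q_learning(env, episodes, alpha=0.1, gamma=0.99, epsilon=0.1):
    """Tabular Q-learning: estimate action values purely from interaction."""
    q = {}  # (state, action) -> estimated return
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            # epsilon-greedy: explore occasionally, otherwise act greedily
            if random.random() < epsilon:
                action = random.choice(env.actions)
            else:
                action = max(env.actions, key=lambda a: q.get((state, a), 0.0))
            next_state, reward, done = env.step(action)
            # bootstrap toward the one-step lookahead target
            best_next = 0.0 if done else max(
                q.get((next_state, a), 0.0) for a in env.actions)
            old = q.get((state, action), 0.0)
            q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
            state = next_state
    return q
```

In a vision setting, the state would be features extracted from the visual sensor, tying together the two halves of the abstract.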
Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Arnegard, Ruth J.; Comstock, J. R., Jr.
1991-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
The multi-attribute task battery for human operator workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Comstock, J. Raymond, Jr.; Arnegard, Ruth J.
1992-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.
Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge
2015-01-01
Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. High risk for MSDs was found in every subtask using REBA. Although biomechanical analysis found an acceptable mass center displacement during tasks, a hazardous level of compression on the lower back during transportation of the computer was detected. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.
The effects of respiratory sinus arrhythmia on anger reactivity and persistence in major depression.
Ellis, Alissa J; Shumake, Jason; Beevers, Christopher G
2016-10-01
The experience of anger during a depressive episode has recently been identified as a poor prognostic indicator of illness course. Given the clinical implications of anger in major depressive disorder (MDD), understanding the mechanisms involved in anger reactivity and persistence is critical for improved intervention. Biological processes involved in emotion regulation during stress, such as respiratory sinus arrhythmia (RSA), may play a role in maintaining negative moods. Clinically depressed (MDD; n = 49) and nondepressed (non-MDD; n = 50) individuals were challenged with a stressful computer task shown to increase anger, while RSA (high frequency range 0.15-0.4 Hz) was collected. RSA predicted future anger, but was unrelated to current anger. That is, across participants, low baseline RSA predicted anger reactivity during the task, and in depressed individuals, those with low RSA during the task had a greater likelihood of anger persistence during a recovery period. These results suggest that low RSA may be a psychophysiological process involved in anger regulation in depression. Low RSA may contribute to sustained illness course by diminishing the repair of angry moods. © 2016 Society for Psychophysiological Research.
Applied Computational Electromagnetics Society Journal. Volume 13, No. 1
1998-03-01
APPLIED COMPUTATIONAL ELECTROMAGNETICS SOCIETY JOURNAL, March 1998, Vol. 13, No. 1, ISSN 1054-4887. Distribution Statement A: Approved for public release; distribution unlimited. GENERAL PURPOSE AND SCOPE: The Applied Computational Electromagnetics Society Journal... The ACES Journal is abstracted in INSPEC, in Engineering Index, and in DTIC.
Porter, Stephen C; Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia
2011-01-26
Health care systems increasingly rely on patients' data entry efforts to organize and assist in care delivery through health information exchange. We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents' health literacy on the task burden. We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F(1,178) = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents' TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden.
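The primary outcome above is the standard NASA-TLX total weighted score, which can be computed as follows (the example ratings and weights are made up, not from the study):

```python
def tlx_weighted_score(ratings, weights):
    """NASA-TLX total weighted workload: each of the six subscale ratings
    (0-100) is weighted by how many of the 15 pairwise comparisons that
    subscale won, then averaged over the 15 comparisons."""
    assert len(ratings) == len(weights) == 6 and sum(weights) == 15
    return sum(r * w for r, w in zip(ratings, weights)) / 15.0

# subscales: mental, physical, temporal, performance, effort, frustration
print(tlx_weighted_score([70, 10, 55, 40, 65, 30], [5, 0, 3, 2, 4, 1]))  # -> 59.0
```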
Takamuku, Shinya; Forbes, Paul A G; Hamilton, Antonia F de C; Gomi, Hiroaki
2018-05-07
There is increasing evidence for motor difficulties in many people with autism spectrum condition (ASC). These difficulties could be linked to differences in the use of internal models which represent relations between motions and forces/efforts. The use of these internal models may be dependent on the cerebellum which has been shown to be abnormal in autism. Several studies have examined internal computations of forward dynamics (motion from force information) in autism, but few have tested the inverse dynamics computation, that is, the determination of force-related information from motion information. Here, we examined this ability in autistic adults by measuring two perceptual biases which depend on the inverse computation. First, we asked participants whether they experienced a feeling of resistance when moving a delayed cursor, which corresponds to the inertial force of the cursor implied by its motion-both typical and ASC participants reported similar feelings of resistance. Second, participants completed a psychophysical task in which they judged the velocity of a moving hand with or without a visual cue implying inertial force. Both typical and ASC participants perceived the hand moving with the inertial cue to be slower than the hand without it. In both cases, the magnitude of the effects did not differ between the two groups. Our results suggest that the neural systems engaged in the inverse dynamics computation are preserved in ASC, at least in the observed conditions. Autism Res 2018. © 2018 International Society for Autism Research, Wiley Periodicals, Inc. We tested the ability to estimate force information from motion information, which arises from a specific "inverse dynamics" computation. Autistic adults and a matched control group reported feeling a resistive sensation when moving a delayed cursor and also judged a moving hand to be slower when it was pulling a load. These findings both suggest that the ability to estimate force information from motion information is intact in autism. © 2018 International Society for Autism Research, Wiley Periodicals, Inc.
Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Nogueira, Helen Cristina; Oliveira, Ana Beatriz
2015-01-01
Postures and muscle activity in the upper body were recorded from 50 academic office workers during 2 hours of normal work, categorised by observation into computer work (CW) and three non-computer (NC) tasks (NC seated work, NC standing/walking work and breaks). NC tasks differed significantly in exposures from CW, with standing/walking NC tasks representing the largest contrasts for most of the exposure variables. For the majority of workers, exposure variability was larger in their present job than in CW alone, as measured by the job variance ratio (JVR), i.e. the ratio between min-min variabilities in the job and in CW. Calculations of JVRs for simulated jobs containing different proportions of CW showed that variability could, indeed, be increased by redistributing available tasks, but that substantial increases could only be achieved by introducing more vigorous tasks into the job, in this case illustrated by cleaning.
Amador-Vargas, Sabrina; Gronenberg, Wulfila; Wcislo, William T; Mueller, Ulrich
2015-02-22
Group size in both multicellular organisms and animal societies can correlate with the degree of division of labour. For ants, the task specialization hypothesis (TSH) proposes that increased behavioural specialization enabled by larger group size corresponds to anatomical specialization of worker brains. Alternatively, the social brain hypothesis proposes that increased levels of social stimuli in larger colonies lead to enlarged brain regions in all workers, regardless of their task specialization. We tested these hypotheses in acacia ants (Pseudomyrmex spinicola), which exhibit behavioural but not morphological task specialization. In wild colonies, we marked, followed and tested ant workers involved in foraging tasks on the leaves (leaf-ants) and in defensive tasks on the host tree trunk (trunk-ants). Task specialization increased with colony size, especially in defensive tasks. The relationship between colony size and brain region volume was task-dependent, supporting the TSH. Specifically, as colony size increased, the relative size of regions within the mushroom bodies of the brain decreased in trunk-ants but increased in leaf-ants; those regions play important roles in learning and memory. Our findings suggest that workers specialized in defence may have reduced learning abilities relative to leaf-ants; these inferences remain to be tested. In societies with monomorphic workers, brain polymorphism enhanced by group size could be a mechanism by which division of labour is achieved. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Enesco, Ileana; Lago, Oliva; Rodríguez, Purificación; Guerrero, Silvia
2011-09-01
The general purpose of this study was to analyse the developmental relations between the early forms of ethnic attitudes, and the classification abilities of the young child. We designed new cognitive tasks within a detection paradigm adapted to preschoolers and attitudinal tasks that were presented as games in a computer screen. Participants were 75 majority-group children of 3, 4, and 5 years of age. Children's preferences and positive/negative attitudes towards the in-group (Spaniards) and three out-groups (Latin-Americans, Africans, and Asians) were measured. The results showed a remarkable preference and positivity for the in-group, but not out-group derogation. Children's cognitive performance, to a greater extent than their age, was positively associated with in-group favouritism and positivity. On the other hand, we found some interesting differences and developmental changes in children's positive orientation to the out-groups that are discussed in the last section. ©2010 The Authors British Journal of Developmental Psychology © 2010 The British Psychological Society.
Human performance under two different command and control paradigms.
Walker, Guy H; Stanton, Neville A; Salmon, Paul M; Jenkins, Daniel P
2014-05-01
The paradoxical behaviour of a new command and control concept called Network Enabled Capability (NEC) provides the motivation for this paper. In it, a traditional hierarchical command and control organisation was pitted against a network centric alternative on a common task, played thirty times by two teams. Multiple regression was used to undertake a simple form of time series analysis. It revealed that whilst the NEC condition ended up being slightly slower than its hierarchical counterpart, it was able to balance and optimise all three of the performance variables measured (task time, enemies neutralised and attrition). From this it is argued that a useful conceptual response is not to consider NEC as an end product comprised of networked computers and standard operating procedures, nor to regard the human-system interaction as inherently stable, but rather to view it as a set of initial conditions from which the most adaptable component of all can be harnessed: the human. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
[Impacts of computer- and media usage on the personality development of children and young people].
Süss, D
2007-02-01
The psychosocial development of children and young people today is embedded in a media society. Socialization is understood as an interaction between the individual and its environment. Media are used to accomplish developmental tasks, and media literacy has become a developmental task in itself. The presence of media in all social subsystems of everyday life alters general socialization processes, such as integration into peer groups or detachment from parents. Media can act as resources or as risk factors for development. Empirical research shows that children's access to media is steadily expanding and that an increasing amount of time is spent with screen media. Media socialization of young people takes on the mode of self-education, but children depend on adults to prevent negative media effects such as Internet addiction. If media usage is part of an environment that is adequate for children's wellbeing, psychosocial development will not be affected negatively by the media.
Devi, D Chitra; Uthariaraj, V Rhymend
2016-01-01
Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint, and hence tasks have to be assigned to the most appropriate VMs at the initial placement itself. In practice, arriving jobs consist of multiple interdependent tasks, whose independent tasks may execute on multiple VMs or on multiple cores of the same VM. Also, jobs arrive during the run time of the server at varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make cloud computing more efficient and thus improve user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. Performance of the proposed algorithm is studied by comparing with the existing methods.
Devi, D. Chitra; Uthariaraj, V. Rhymend
2016-01-01
Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint, and hence tasks have to be assigned to the most appropriate VMs at the initial placement itself. In practice, arriving jobs consist of multiple interdependent tasks, whose independent tasks may execute on multiple VMs or on multiple cores of the same VM. Also, jobs arrive during the run time of the server at varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make cloud computing more efficient and thus improve user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. Performance of the proposed algorithm is studied by comparing with the existing methods. PMID:26955656
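The interdependency constraint above means a task can only be dispatched once its parents finish. A minimal list-scheduling sketch under that constraint (not the paper's algorithm; the names and the earliest-finish placement rule are illustrative):

```python
from collections import deque

def schedule_dag(deps, length, vm_speeds):
    """Toy list scheduler: dispatch tasks in dependency order, each to the VM
    with the earliest projected finish time.

    deps   : dict task -> set of prerequisite tasks
    length : dict task -> work units
    """
    indegree = {t: len(d) for t, d in deps.items()}
    children = {t: set() for t in deps}
    for t, d in deps.items():
        for p in d:
            children[p].add(t)
    ready_q = deque(t for t, n in indegree.items() if n == 0)
    vm_free = [0.0] * len(vm_speeds)     # when each VM becomes idle
    finish = {}
    while ready_q:
        t = ready_q.popleft()
        earliest = max([0.0] + [finish[p] for p in deps[t]])  # wait for parents
        end, v = min(
            (max(vm_free[v], earliest) + length[t] / vm_speeds[v], v)
            for v in range(len(vm_speeds))
        )
        vm_free[v], finish[t] = end, end
        for c in children[t]:
            indegree[c] -= 1
            if indegree[c] == 0:
                ready_q.append(c)
    return finish

deps = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
length = {"a": 4, "b": 2, "c": 6, "d": 3}
print(schedule_dag(deps, length, vm_speeds=[1.0, 2.0]))
```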
Women and Computers: Effects of Stereotype Threat on Attribution of Failure
ERIC Educational Resources Information Center
Koch, Sabine C.; Muller, Stephanie M.; Sieverding, Monika
2008-01-01
This study investigated whether stereotype threat can influence women's attributions of failure in a computer task. Male and female college-age students (n = 86, 16-21 years old) from Germany were asked to work on a computer task and were hinted beforehand that in this task, either (a) men usually perform better than women do (negative threat…
ERIC Educational Resources Information Center
Judd, Terry; Kennedy, Gregor
2011-01-01
Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…
ERIC Educational Resources Information Center
Sesen, Burcin Acar
2013-01-01
The purpose of this study was to investigate pre-service science teachers' understanding of surface tension, cohesion and adhesion forces by using computer-mediated predict-observe-explain tasks. 22 third-year pre-service science teachers participated in this study. Three computer-mediated predict-observe-explain tasks were developed and applied…
Report of the Task Force on Computer Charging.
ERIC Educational Resources Information Center
Computer Co-ordination Group, Ottawa (Ontario).
The objectives of the Task Force on Computer Charging as approved by the Committee of Presidents of Universities of Ontario were: (1) to identify alternative methods of costing computing services; (2) to identify alternative methods of pricing computing services; (3) to develop guidelines for the pricing of computing services; (4) to identify…
Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.
Liao, Wen-Hwa; Qiu, Wan-Li
2016-01-01
Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
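The study's ranking of factors rests on standard AHP arithmetic: derive priority weights from a reciprocal pairwise-comparison matrix. A sketch using the common geometric-mean approximation follows; the criteria and comparison values are illustrative, not the questionnaire's results.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights by the geometric-mean method.

    pairwise[i][j] holds how strongly criterion i is preferred to j on
    Saaty's 1-9 scale, with pairwise[j][i] = 1 / pairwise[i][j].
    """
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Illustrative 3-criterion comparison: cost effectiveness vs. software design
# vs. system architecture (values are made up).
matrix = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
print(ahp_weights(matrix))  # cost effectiveness receives the largest weight, ~0.65
```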
Ogawa, K
1992-01-01
This paper proposes a new evaluation and prediction method for computer usability. This method is based on our two previously proposed information transmission measures created from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device independent information measure (DI) and the computer independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined as DI/T and CI/T, where T is the task completion time: the device independent information transmission rate (RDI), and the computer independent information transmission rate (RCI). The method uses the RDI and RCI rates to comparatively evaluate the usability of software and device operations on different computer systems. Experiments using three different systems on a graphical information input task confirm that the method offers an efficient way of determining computer usability.
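The two rates reduce to simple ratios over task completion time T, as sketched below; the bit values are invented for illustration.

```python
def usability_rates(di_bits, ci_bits, task_time_s):
    """Information-transmission rates from the two measures.

    di_bits : device-independent information (software level), in bits
    ci_bits : computer-independent information (task-content level), in bits
    """
    rdi = di_bits / task_time_s   # device-independent transmission rate
    rci = ci_bits / task_time_s   # computer-independent transmission rate
    return rdi, rci

# Comparing two systems on the same graphical input task (made-up numbers):
# the system with the higher RCI completes more task content per second.
print(usability_rates(di_bits=120.0, ci_bits=80.0, task_time_s=30.0))
```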
Job Management and Task Bundling
NASA Astrophysics Data System (ADS)
Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André
2018-03-01
High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
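The bundling idea can be sketched as a greedy first-fit of small tasks into a machine-size allocation; this is an illustrative toy, not METAQ's or mpi_jm's actual scheduling logic.

```python
def bundle(tasks, partition_size):
    """Greedy first-fit-decreasing bundling of small tasks into jobs that
    fill a large partition (node counts are illustrative)."""
    bundles = []
    for nodes, cmd in sorted(tasks, reverse=True):  # largest tasks first
        for b in bundles:
            if b["free"] >= nodes:                  # fits in an existing bundle
                b["free"] -= nodes
                b["cmds"].append(cmd)
                break
        else:                                       # open a new machine-size job
            bundles.append({"free": partition_size - nodes, "cmds": [cmd]})
    return bundles

jobs = bundle([(16, "meas_a"), (64, "meas_b"), (48, "meas_c")], partition_size=128)
# -> one 128-node job running all three measurements, leaving no idle nodes
```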
Computational rationality: linking mechanism and behavior through bounded utility maximization.
Lewis, Richard L; Howes, Andrew; Singh, Satinder
2014-04-01
We propose a framework for including information-processing bounds in rational analyses. It is an application of bounded optimality (Russell & Subramanian, 1995) to the challenges of developing theories of mechanism and behavior. The framework is based on the idea that behaviors are generated by cognitive mechanisms that are adapted to the structure of not only the environment but also the mind and brain itself. We call the framework computational rationality to emphasize the incorporation of computational mechanism into the definition of rational action. Theories are specified as optimal program problems, defined by an adaptation environment, a bounded machine, and a utility function. Such theories yield different classes of explanation, depending on the extent to which they emphasize adaptation to bounds, and adaptation to some ecology that differs from the immediate local environment. We illustrate this variation with examples from three domains: visual attention in a linguistic task, manual response ordering, and reasoning. We explore the relation of this framework to existing "levels" approaches to explanation, and to other optimality-based modeling approaches. Copyright © 2014 Cognitive Science Society, Inc.
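In this framework an optimal program problem can be written compactly; the notation below is a sketch of that formalization, not lifted from the paper.

```latex
% \Pi_M : programs executable on the bounded machine M
% E     : adaptation environment,  U : utility over interaction histories h
\pi^{*} = \operatorname*{arg\,max}_{\pi \in \Pi_M}
          \; \mathbb{E}_{h \sim p(h \mid \pi, E)} \left[ U(h) \right]
```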
Straus, S G; McGrath, J E
1994-02-01
The authors investigated the hypothesis that as group tasks pose greater requirements for member interdependence, communication media that transmit more social context cues will foster group performance and satisfaction. Seventy-two 3-person groups of undergraduate students worked in either computer-mediated or face-to-face meetings on 3 tasks with increasing levels of interdependence: an idea-generation task, an intellective task, and a judgment task. Results showed few differences between computer-mediated and face-to-face groups in the quality of the work completed but large differences in productivity favoring face-to-face groups. Analysis of productivity and of members' reactions supported the predicted interaction of tasks and media, with greater discrepancies between media conditions for tasks requiring higher levels of coordination. Results are discussed in terms of the implications of using computer-mediated communications systems for group work.
Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan
2016-06-27
The popularity of mobile computing products is well known; thus, it is crucial to evaluate their contribution to musculoskeletal disorders during computer usage in both comfortable and stressful environments. This study explores the effect on muscle activity of using different computer products for different tasks designed to induce psychosocial stress. Fourteen male subjects performed computer tasks: sixteen combinations of four different computer products with four different tasks used to induce stress. Electromyography for four muscles in the forearm, shoulder and neck regions and task performance were recorded. The increment of trapezius muscle activity was dependent on the task used to induce the stress, with a higher level of stress producing a greater increment. However, this relationship was not found in the other three muscles. Besides that, compared to desktop and laptop use, the lowest activity for all muscles was obtained during the use of a tablet or smart phone. The best net performance was obtained in a comfortable environment. However, during stressful conditions, the best performance can be obtained using the device that a user is most comfortable with or has the most experience with. Different computer products and different levels of stress play a substantial role in muscle activity during computer work. Both of these factors must be taken into account in order to reduce the occurrence of musculoskeletal disorders or problems.
Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L
2015-02-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large-scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h at very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.
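The resource-estimation tools mentioned above boil down to simple scaling arithmetic; a back-of-envelope sketch (throughput and price figures are placeholders, not TPP-published numbers):

```python
def estimate_cost(n_files, minutes_per_file, instance_hourly_usd, n_instances):
    """Back-of-envelope cost/time estimate for a cloud search job,
    assuming files divide evenly across identical instances."""
    total_hours = n_files * minutes_per_file / 60.0   # aggregate compute hours
    wall_hours = total_hours / n_instances            # elapsed time with parallelism
    cost = wall_hours * n_instances * instance_hourly_usd
    return wall_hours, cost

# e.g. 1100 files at 2 min/file on 20 instances billed at $0.10/hour
print(estimate_cost(1100, 2.0, 0.10, 20))  # ~1.8 h wall time, ~$3.67
```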
Patel, Manesh R; Bailey, Steven R; Bonow, Robert O; Chambers, Charles E; Chan, Paul S; Dehmer, Gregory J; Kirtane, Ajay J; Wann, L Samuel; Ward, R Parker
2012-05-29
The American College of Cardiology Foundation, in collaboration with the Society for Cardiovascular Angiography and Interventions and key specialty and subspecialty societies, conducted a review of common clinical scenarios where diagnostic catheterization is frequently considered. The indications (clinical scenarios) were derived from common applications or anticipated uses, as well as from current clinical practice guidelines and results of studies examining the implementation of noninvasive imaging appropriate use criteria. The 166 indications in this document were developed by a diverse writing group and scored by a separate independent technical panel on a scale of 1 to 9, to designate appropriate use (median 7 to 9), uncertain use (median 4 to 6), and inappropriate use (median 1 to 3). Diagnostic catheterization may include several different procedure components. The indications developed focused primarily on 2 aspects of diagnostic catheterization. Many indications focused on the performance of coronary angiography for the detection of coronary artery disease with other procedure components (e.g., hemodynamic measurements, ventriculography) at the discretion of the operator. The majority of the remaining indications focused on hemodynamic measurements to evaluate valvular heart disease, pulmonary hypertension, cardiomyopathy, and other conditions, with the use of coronary angiography at the discretion of the operator. Seventy-five indications were rated as appropriate, 49 were rated as uncertain, and 42 were rated as inappropriate. The appropriate use criteria for diagnostic catheterization have the potential to impact physician decision making, healthcare delivery, and reimbursement policy. Furthermore, recognition of uncertain clinical scenarios facilitates identification of areas that would benefit from future research.
Categories of Computer Use and Their Relationships with Attitudes toward Computers.
ERIC Educational Resources Information Center
Mitra, Anandra
1998-01-01
Analysis of attitude and use questionnaires completed by undergraduates (N = 1,444) at Wake Forest University determined that computers were used most frequently for word processing. Other uses were e-mail for task and non-task activities and mathematical and statistical computation. Results suggest that the level of computer use was related to…
Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom
2015-10-30
Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in the literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected, while WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that, overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.
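As an illustration of the pair-wise accuracy ranking idea, the sketch below ranks candidate mental-task pairs by a hypothetical cross-validated classification accuracy and selects the best-separated pair; the paper's actual PWAR and WS-score computations may differ in detail.

    # Hypothetical cross-validated accuracies for each candidate task pair;
    # PWAR-style selection simply ranks pairs by how separable they are.
    pair_accuracy = {
        ("mental arithmetic", "word generation"): 0.71,
        ("mental arithmetic", "motor imagery"): 0.83,
        ("mental arithmetic", "music imagery"): 0.77,
        ("word generation", "motor imagery"): 0.69,
        ("word generation", "music imagery"): 0.74,
        ("motor imagery", "music imagery"): 0.66,
    }

    ranked = sorted(pair_accuracy.items(), key=lambda kv: kv[1], reverse=True)
    best_pair, best_acc = ranked[0]
    print(f"selected control pair: {best_pair} ({best_acc:.0%})")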
Talke, Pekka O; Sharma, Deepak; Heyer, Eric J; Bergese, Sergio D; Blackham, Kristine A; Stevens, Robert D
2014-08-01
Literature on the anesthetic management of endovascular treatment of acute ischemic stroke (AIS) is limited. Anesthetic management during these procedures is still mostly dependent on individual or institutional preferences. Thus, the Society of Neuroscience in Anesthesiology and Critical Care (SNACC) created a task force to provide expert consensus recommendations on the anesthetic management of endovascular treatment of AIS. The task force conducted a systematic literature review (up to August 2012). Because of the limited number of research articles relating to this subject, the task force solicited opinions from experts in this area. The task force created a draft consensus statement based on the available data. Classes of recommendations and levels of evidence were assigned to articles specifically addressing anesthetic management during endovascular treatment of stroke using the standard American Heart Association evidence rating scheme. The draft consensus statement was reviewed by the Task Force, the SNACC Executive Committee, and representatives of the Society of NeuroInterventional Surgery (SNIS) and the Neurocritical Care Society (NCS), reaching consensus on the final document. For this consensus statement, the anesthetic management of endovascular treatment of AIS was subdivided into 12 topics. Each topic includes a summary of available data followed by recommendations. This consensus statement is intended for use by individuals involved in the care of patients with acute ischemic stroke, such as anesthesiologists, interventional neuroradiologists, neurologists, neurointensivists and neurosurgeons. © 2014 American Heart Association, Inc.
ERIC Educational Resources Information Center
Paisley, William; Butler, Matilda
This study of the computer/user interface investigated the role of the computer in performing information tasks that users now perform without computer assistance. Users' perceptual/cognitive processes are to be accelerated or augmented by the computer; a long term goal is to delegate information tasks entirely to the computer. Cybernetic and…
Computer proficiency questionnaire: assessing low and high computer proficient seniors.
Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran
2015-06-01
Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Ihme, Jan Marten; Senkbeil, Martin; Goldhammer, Frank; Gerick, Julia
2017-01-01
The combination of different item formats is found quite often in large scale assessments, and analyses on the dimensionality often indicate multi-dimensionality of tests regarding the task format. In ICILS 2013, three different item types (information-based response tasks, simulation tasks, and authoring tasks) were used to measure computer and…
Belger, A; Banich, M T
1998-07-01
Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.
Wang, Zhaocai; Ji, Zuwen; Wang, Xiaoming; Wu, Tunhua; Huang, Wei
2017-12-01
DNA computing is a promising approach to computationally intractable problems and an emerging research area spanning mathematics, computer science and molecular biology. The task scheduling problem, a well-known NP-complete problem, assigns n jobs to m individuals and seeks the minimum completion time of the last-finished individual. In this paper, we use a biologically inspired computational model and describe a new parallel algorithm that solves the task scheduling problem with basic DNA molecular operations. We design flexible-length DNA strands to represent elements of the allocation matrix, apply appropriate biological experimental operations, and obtain solutions to the task scheduling problem within the proper length range in less than O(n^2) time complexity. Copyright © 2017. Published by Elsevier B.V.
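For readers unfamiliar with the underlying problem, a conventional (non-molecular) greedy heuristic for it fits in a few lines of Python; this is shown only to make the problem statement concrete and is unrelated to the DNA-based algorithm itself.

    # Greedy longest-processing-time (LPT) heuristic for the same scheduling
    # problem: assign n jobs to m workers so the last-finished worker ends as
    # early as possible (the makespan).
    import heapq

    def lpt_makespan(job_times, m):
        """Assign jobs to m workers greedily; return the makespan."""
        loads = [0] * m
        heapq.heapify(loads)                       # min-heap of worker loads
        for t in sorted(job_times, reverse=True):  # longest jobs first
            lightest = heapq.heappop(loads)        # give job to least-loaded worker
            heapq.heappush(loads, lightest + t)
        return max(loads)

    print(lpt_makespan([7, 5, 4, 3, 3, 2], m=3))   # -> 9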
NMRbox: A Resource for Biomolecular NMR Computation.
Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C
2017-04-25
Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.
Guo, Chao-Yu; Bacic, Janine; Chan, Eugenia
2011-01-01
Background: Health care systems increasingly rely on patients' data entry efforts to organize and assist in care delivery through health information exchange. Objectives: We sought to determine (1) the variation in burden imposed on parents by data entry efforts across paper-based and computer-based environments, and (2) the impact, if any, of parents' health literacy on the task burden. Methods: We completed a randomized controlled trial of parent-completed data entry tasks. Parents of children with attention deficit hyperactivity disorder (ADHD) were randomized based on the Test of Functional Health Literacy in Adults (TOFHLA) to either a paper-based or computer-based environment for entry of health information on their children. The primary outcome was the National Aeronautics and Space Administration Task Load Index (TLX) total weighted score. Results: We screened 271 parents: 194 (71.6%) were eligible, and 180 of these (92.8%) constituted the study cohort. We analyzed 90 participants from each arm. Parents who completed information tasks on paper reported a higher task burden than those who worked in the computer environment: mean (SD) TLX scores were 22.8 (20.6) for paper and 16.3 (16.1) for computer. Assignment to the paper environment conferred a significant risk of higher task burden (F(1,178) = 4.05, P = .046). Adequate literacy was associated with lower task burden (decrease in burden score of 1.15 SD, P = .003). After adjusting for relevant child and parent factors, parents' TOFHLA score (beta = -.02, P = .02) and task environment (beta = .31, P = .03) remained significantly associated with task burden. Conclusions: A tailored computer-based environment provided an improved task experience for data entry compared to the same tasks completed on paper. Health literacy was inversely related to task burden. Trial registration: Clinicaltrials.gov NCT00543257; http://www.clinicaltrials.gov/ct2/show/NCT00543257 (Archived by WebCite at http://www.webcitation.org/5vUVH2DYR) PMID:21269990
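The primary outcome above is the standard NASA-TLX total weighted score. For reference, a minimal sketch of that computation follows, with made-up ratings and weights rather than the study's data: six subscales are rated 0-100, weighted by the number of times each was chosen in 15 pairwise comparisons, and the weighted sum is divided by 15.

    # Standard NASA-TLX weighted-score computation (general formula, not the
    # study's data). Weights come from 15 pairwise comparisons and sum to 15,
    # so the total weighted score stays on a 0-100 scale.
    ratings = {"mental": 70, "physical": 20, "temporal": 55,   # hypothetical
               "performance": 40, "effort": 60, "frustration": 35}
    weights = {"mental": 4, "physical": 1, "temporal": 3,
               "performance": 2, "effort": 3, "frustration": 2}

    assert sum(weights.values()) == 15
    tlx = sum(ratings[k] * weights[k] for k in ratings) / 15.0
    print(f"TLX total weighted score: {tlx:.1f}")   # -> 53.0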
Murray, Kim; Johnston, Kate; Cunnane, Helen; Kerr, Charlotte; Spain, Debbie; Gillan, Nicola; Hammond, Neil; Murphy, Declan; Happé, Francesca
2017-06-01
Real-life social processing abilities of adults with autism spectrum disorders (ASD) can be hard to capture in lab-based experimental tasks. A novel measure of social cognition, the 'Strange Stories Film task' (SSFt), was designed to overcome limitations of available measures in the field. Brief films were made based on the scenarios from the Strange Stories task (Happé) and designed to capture the subtle social-cognitive difficulties observed in adults with ASD. Twenty neurotypical adults were recruited to pilot the new measure. A final test set was produced and administered to a group of 20 adults with ASD and 20 matched controls, alongside established social cognition tasks and questionnaire measures of empathy, alexithymia and ASD traits. The SSFt was more effective than existing measures at differentiating the ASD group from the control group. In the ASD group, the SSFt was associated with the Strange Stories task. The SSFt is a potentially useful tool to identify social cognitive dis/abilities in ASD, with preliminary evidence of adequate convergent validity. Future research directions are discussed. Autism Res 2017, 10: 1120-1132. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Bejczy, Antal K.
1995-01-01
This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.
Study to design and develop remote manipulator system. [computer simulation of human performance
NASA Technical Reports Server (NTRS)
Hill, J. W.; Mcgovern, D. E.; Sword, A. J.
1974-01-01
Human performance in remote manipulation tasks was modeled using automated procedures in which computers analyze and count motions during a manipulation task. Performance is monitored by an on-line computer capable of measuring the joint angles of both master and slave and, in some cases, the trajectory and velocity of the hand itself. In this way the operator's strategies with different transmission delays, displays, tasks, and manipulators can be analyzed in detail for comparison. Some progress is described in obtaining a set of standard tasks and difficulty measures for evaluating manipulator performance.
ERIC Educational Resources Information Center
Sins, Patrick H. M.; van Joolingen, Wouter R.; Savelsbergh, Elwin R.; van Hout-Wolters, Bernadette
2008-01-01
Purpose of the present study was to test a conceptual model of relations among achievement goal orientation, self-efficacy, cognitive processing, and achievement of students working within a particular collaborative task context. The task involved a collaborative computer-based modeling task. In order to test the model, group measures of…
ERIC Educational Resources Information Center
Council on Library and Information Resources, Washington, DC.
The American Council of Learned Societies and the Council on Library and Information Resources appointed 36 scholars, librarians, and leaders of various academic enterprises to five task forces "to consider changes in the process of scholarship and instruction that will result from the use of digital technology and to make recommendations to…
High-reliability computing for the smarter planet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul; Manuzzato, Andrea
2010-01-01
The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is ensuring that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, so does the need for greater radiation reliability; already, critical infrastructure is failing too frequently. In this paper, we introduce the Cross-Layer Reliability concept for designing more reliable computer systems.
Modeling Cognitive Strategies during Complex Task Performing Process
ERIC Educational Resources Information Center
Mazman, Sacide Guzin; Altun, Arif
2012-01-01
The purpose of this study is to examine individuals' computer-based complex task performing processes and strategies, in order to determine the reasons for failure, by means of the cognitive task analysis method and cued retrospective think-aloud with eye-movement data. The study group was five senior students from Computer Education and Instructional Technologies…
Automated Instructional Monitors for Complex Operational Tasks. Final Report.
ERIC Educational Resources Information Center
Feurzeig, Wallace
A computer-based instructional system is described which incorporates diagnosis of students' difficulties in acquiring complex concepts and skills. A computer automatically generated a simulated display. It then monitored and analyzed a student's work in the performance of assigned training tasks. Two major tasks were studied. The first,…
Multi-Functional UV-Visible-IR Nanosensors Devices and Structures
2015-04-29
Reported publications include contributions on a dual-gate MOSFET system and on Raman spectroscopy in the Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics, 198 and 216-217 (2013); ISBN 978-3-901578-26-7.
Simple, efficient allocation of modelling runs on heterogeneous clusters with MPI
Donato, David I.
2017-01-01
In scientific modelling and computation, the choice of an appropriate method for allocating tasks for parallel processing depends on the computational setting and on the nature of the computation. The allocation of independent but similar computational tasks, such as modelling runs or Monte Carlo trials, among the nodes of a heterogeneous computational cluster is a special case that has not been specifically evaluated previously. A simulation study shows that a method of on-demand (that is, worker-initiated) pulling from a bag of tasks in this case leads to reliably short makespans for computational jobs despite heterogeneity both within and between cluster nodes. A simple reference implementation in the C programming language with the Message Passing Interface (MPI) is provided.
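The paper's reference implementation is in C with MPI; an analogous minimal sketch of worker-initiated pulling from a bag of tasks, written here with Python's mpi4py, is shown below (run with, e.g., "mpiexec -n 8 python bag.py"; the task body is a stand-in for a model run).

    # Minimal worker-initiated "bag of tasks" with mpi4py (a sketch analogous
    # to, not a copy of, the paper's C reference implementation). Rank 0 hands
    # out one run per request; workers pull a new task as soon as they finish,
    # which keeps heterogeneous nodes busy.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    TASK, STOP = 0, 1                                  # message tags

    if rank == 0:                                      # master: owns the bag
        bag = list(range(100))                         # 100 hypothetical run IDs
        status = MPI.Status()
        stopped = 0
        while stopped < size - 1:
            comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
            worker = status.Get_source()
            if bag:
                comm.send(bag.pop(), dest=worker, tag=TASK)
            else:
                comm.send(None, dest=worker, tag=STOP)
                stopped += 1
    else:                                              # worker: pull until empty
        status = MPI.Status()
        while True:
            comm.send(None, dest=0)                    # request a task
            task = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
            if status.Get_tag() == STOP:
                break
            result = task * task                       # stand-in for a model run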
Parallel processing using an optical delay-based reservoir computer
NASA Astrophysics Data System (ADS)
Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy
2016-04-01
Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, the reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs), semiconductor lasers whose cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time series prediction and classification for the nonlinear channel equalization task. We take advantage of the two directional modes to process individual tasks: each directional mode processes one task, mitigating possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained even with noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].
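A generic software analogue of reservoir computing (a discrete-time echo-state network with a linear ridge-regression readout) illustrates the principle of training only the readout on reservoir transients; the paper's reservoir is of course a photonic ring laser, not this model.

    # Generic software analogue of reservoir computing: a random recurrent
    # "reservoir" is left untrained and only a linear readout is fitted
    # (illustrative only; unrelated to the photonic implementation above).
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 200, 1000                             # reservoir size, sequence length
    u = rng.uniform(-1, 1, T)                    # input signal
    y_target = np.roll(u, 1)                     # toy task: one-step memory

    W_in = rng.uniform(-0.5, 0.5, N)
    W = rng.normal(0, 1, (N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # keep spectral radius below 1

    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):
        x = np.tanh(W @ x + W_in * u[t])         # reservoir transient mixes inputs
        states[t] = x

    lam = 1e-6                                   # ridge-regression readout
    W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ y_target)
    print("train NMSE:", np.mean((states @ W_out - y_target) ** 2) / np.var(y_target))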
Asai, Tsuyoshi; Oshima, Kensuke; Fukumoto, Yoshihiro; Yonezawa, Yuri; Matsuo, Asuka; Misu, Shogo
2018-05-21
To investigate the associations between fall history and the Timed Up and Go (TUG) test (single-TUG test), the TUG test while counting aloud backwards from 100 (dual-TUG test) and the dual-task cost (DTC) among independent community-dwelling older adults. This cross-sectional study included 537 older adults who lived independently in the community. Data on fall history in the previous year were obtained by self-administered questionnaire. The single- and dual-TUG tests were carried out, and the DTC value was computed from these results. Associations between fall history and these TUG-related values were analyzed using multivariate logistic regression models. The participants were divided into fall-risk groups using the cut-off values of the measures significantly associated with falling, and odds ratios (OR) were computed. Slower single-TUG test scores and lower DTC values were significantly associated with fall history after adjusting for potential confounders (single-TUG test score: OR 1.133, 95% CI 1.029-1.249; DTC value: OR 0.984, 95% CI 0.968-0.998). Older adults with slower single-TUG test scores and lower DTC values reported a fall history more often than those in other categories (OR compared with the lower-risk single-TUG and lower-risk DTC groups: 3.474, 95% CI 1.881-6.570). Slower single-TUG test scores and lower DTC values are associated with fall history among independent community-dwelling older adults. To some extent, dual-task performance might provide added value for fall assessment, compared with administering the TUG test alone. Geriatr Gerontol Int 2018; ••: ••-••. © 2018 Japan Geriatrics Society.
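The DTC expresses how much performance changes from the single- to the dual-task condition. A common formulation is sketched below; conventions for the sign and the reference condition vary between studies, so this is illustrative rather than necessarily the exact formula used above.

    # One common dual-task-cost formulation (conventions vary across studies;
    # this is illustrative, not necessarily the study's exact formula).
    def dual_task_cost(single_tug_s, dual_tug_s):
        """Percent change in TUG time from single- to dual-task condition."""
        return (dual_tug_s - single_tug_s) / single_tug_s * 100.0

    print(dual_task_cost(9.5, 12.3))   # ~29.5% slower under the dual task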
Psychology of computer use: XXXII. Computer screen-savers as distractors.
Volk, F A; Halcomb, C G
1994-12-01
The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.
Forest biomass as an energy source
P.E. Laks; R.W. Hemingway; A. Conner
1979-01-01
The Task Force on Forest Biomass as an Energy Source was chartered by the Society of American Foresters on September 26, 1977, and took its present form following an amendment to the charter on October 5, 1977. It built upon the findings of two previous task forces, the Task Force on Energy and Forest Resources and the Task Force for Evaluation of the CORRIM Report (...
Evaluation and Treatment of Hypertriglyceridemia: An Endocrine Society Clinical Practice Guideline
Berglund, Lars; Brunzell, John D.; Goldberg, Anne C.; Goldberg, Ira J.; Sacks, Frank; Murad, Mohammad Hassan; Stalenhoef, Anton F. H.
2012-01-01
Objective: The aim was to develop clinical practice guidelines on hypertriglyceridemia. Participants: The Task Force included a chair selected by The Endocrine Society Clinical Guidelines Subcommittee (CGS), five additional experts in the field, and a methodologist. The authors received no corporate funding or remuneration. Consensus Process: Consensus was guided by systematic reviews of evidence, e-mail discussion, conference calls, and one in-person meeting. The guidelines were reviewed and approved sequentially by The Endocrine Society's CGS and Clinical Affairs Core Committee, members responding to a web posting, and The Endocrine Society Council. At each stage, the Task Force incorporated changes in response to written comments. Conclusions: The Task Force recommends that the diagnosis of hypertriglyceridemia be based on fasting levels, that mild and moderate hypertriglyceridemia (triglycerides of 150–999 mg/dl) be diagnosed to aid in the evaluation of cardiovascular risk, and that severe and very severe hypertriglyceridemia (triglycerides of > 1000 mg/dl) be considered a risk for pancreatitis. The Task Force also recommends that patients with hypertriglyceridemia be evaluated for secondary causes of hyperlipidemia and that subjects with primary hypertriglyceridemia be evaluated for family history of dyslipidemia and cardiovascular disease. The Task Force recommends that the treatment goal in patients with moderate hypertriglyceridemia be a non-high-density lipoprotein cholesterol level in agreement with National Cholesterol Education Program Adult Treatment Panel guidelines. The initial treatment should be lifestyle therapy; a combination of diet modification and drug therapy may also be considered. In patients with severe or very severe hypertriglyceridemia, a fibrate should be used as a first-line agent. PMID:22962670
Wang, Jiang-Ning; Chen, Xiao-Lin; Hou, Xin-Wen; Zhou, Li-Bing; Zhu, Chao-Dong; Ji, Li-Qiang
2017-07-01
Many species of Tephritidae damage fruit, which might negatively impact international fruit trade. Automatic or semi-automatic identification of fruit flies is greatly needed for diagnosing causes of damage and for quarantine protocols for economically relevant insects. A fruit fly image identification system named AFIS1.0 has been developed using 74 species belonging to six genera, which include the majority of pests in the Tephritidae. The system combines automated image identification with manual verification, balancing operability and accuracy. AFIS1.0 integrates image analysis and an expert system into a content-based image retrieval framework. In the automatic identification module, AFIS1.0 proposes candidate identifications; users can then verify them manually by comparing unidentified images with the subset of images corresponding to the automatic result. The system uses Gabor surface features in automated identification and yielded an overall classification success rate of 87% at the species level in an independent multi-part image automatic identification test. The system is useful for users with or without specific expertise on Tephritidae for rapid and effective identification of fruit flies, and it brings the application of computer vision technology to fruit fly recognition much closer to production level. © 2016 Society of Chemical Industry.
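A rough sketch of how Gabor-feature retrieval of this kind can work is shown below using scikit-image; it is illustrative only, as the system's actual "Gabor surface feature" and matching scheme are more elaborate, and the library dictionary is a hypothetical stand-in for its image collection.

    # Sketch of a Gabor-feature retrieval step (illustrative only; AFIS1.0's
    # exact feature and matching scheme differ). "library" is a hypothetical
    # {species: grayscale image} dictionary.
    import numpy as np
    from skimage.filters import gabor

    def gabor_feature(image, frequencies=(0.1, 0.2, 0.4)):
        """Mean/std of Gabor filter response magnitudes at several frequencies."""
        feats = []
        for f in frequencies:
            real, imag = gabor(image, frequency=f)
            mag = np.hypot(real, imag)
            feats += [mag.mean(), mag.std()]
        return np.array(feats)

    def rank_candidates(query_img, library):
        """Rank library species by feature-space distance to the query image."""
        q = gabor_feature(query_img)
        return sorted(library,
                      key=lambda s: np.linalg.norm(gabor_feature(library[s]) - q))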
Real-Time Non-Intrusive Assessment of Viewing Distance during Computer Use.
Argilés, Marc; Cardona, Genís; Pérez-Cabré, Elisabet; Pérez-Magrané, Ramon; Morcego, Bernardo; Gispets, Joan
2016-12-01
To develop and test the sensitivity of an ultrasound-based sensor to assess the viewing distance of visual display terminal operators in real-time conditions. A modified ultrasound sensor was attached to a computer display to assess viewing distance in real time. Sensor functionality was tested on a sample of 20 healthy participants while they conducted four 10-minute, randomly presented, typical computer tasks (a match-three puzzle game, a video documentary, a task requiring participants to complete a series of sentences, and a predefined internet search). The ultrasound sensor offered good measurement repeatability. The game, text completion, and web search tasks were conducted at shorter viewing distances (54.4 cm [95% CI 51.3-57.5 cm], 54.5 cm [95% CI 51.1-58.0 cm], and 54.5 cm [95% CI 51.4-57.7 cm], respectively) than the video task (62.3 cm [95% CI 58.9-65.7 cm]). Statistically significant differences were found between the video task and the other three tasks (all p < 0.05). The range of viewing distances (from 22 to 27 cm) was similar for all tasks (F = 0.996; p = 0.413). Real-time assessment of the viewing distance of computer users with a non-intrusive ultrasonic device disclosed a task-dependent pattern.
Szucs, Kimberly A; Molnar, Megan
2017-04-01
The aim of this study was to describe gender differences in the activation patterns of the four subdivisions of the trapezius (clavicular, upper, middle, lower) following a 60 min computer work task. Surface EMG was collected from these subdivisions in 21 healthy subjects during bilateral arm elevation pre- and post-task. Subjects completed a standardized 60 min computer work task at a standard ergonomic workstation. Normalized activation and activation ratios of each trapezius subdivision were compared between genders and conditions with repeated-measures ANOVAs. The Gender × Condition interaction effect for upper trapezius % activation approached significance (p = 0.051), with males demonstrating greater activation post-task. The main effect of Condition was statistically significant for % activation of the middle and lower trapezius (p < 0.05), with both muscles demonstrating increased activation post-task. There was a statistically significant Gender × Condition interaction effect for the middle trapezius/upper trapezius ratio and a main effect of Condition for the clavicular trapezius/upper trapezius ratio, with a decreased ratio post-typing. Gender differences exist following 60 min of a low-force computer typing task. Imbalances in muscle activation and activation ratios following computer work may affect total shoulder kinematics and should be further explored. Copyright © 2017 Elsevier B.V. All rights reserved.
On Trust Evaluation in Mobile Ad Hoc Networks
NASA Astrophysics Data System (ADS)
Nguyen, Dang Quan; Lamont, Louise; Mason, Peter C.
Trust has been considered a social relationship between two individuals in human society. But as computer science and networking have succeeded in using computers to automate many tasks, the concept of trust can be generalized to cover the reliability and relationships of non-human interaction, such as information gathering and data routing. This paper investigates the evaluation of trust in the context of ad hoc networks. Nodes evaluate each other's behaviour based on observables. A node then decides whether to trust another node to have certain innate abilities. We show how accurate such an evaluation can be. We also provide the minimum number of observations required to obtain an accurate evaluation, a result that indicates that observation-based trust in ad hoc networks will remain a challenging problem. The impact of making networking decisions using trust evaluation on network connectivity is also examined. In this manner, quantitative decisions can be made concerning trust-based routing with knowledge of the potential impact on connectivity.
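One way to make the "minimum number of observations" point concrete is a Hoeffding-bound estimate, sketched below; this is an illustrative calculation, not necessarily the paper's own derivation.

    # Illustrative estimate of how many behavioural observations are needed
    # before an observed trust value is accurate (Hoeffding bound; the paper's
    # derivation may differ). To know the true trust level within +/- eps with
    # confidence 1 - delta, roughly n >= ln(2/delta) / (2 * eps**2) samples.
    import math

    def min_observations(eps=0.1, delta=0.05):
        return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

    print(min_observations())   # -> 185 observations for eps=0.1, delta=0.05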
Attention Modulates Spatial Precision in Multiple-Object Tracking.
Srivastava, Nisheeth; Vul, Ed
2016-01-01
We present a computational model of multiple-object tracking that makes trial-level predictions about the allocation of visual attention and the effect of this allocation on observers' ability to track multiple objects simultaneously. This model follows the intuition that increased attention to a location increases the spatial resolution of its internal representation. Using a combination of empirical and computational experiments, we demonstrate the existence of a tight coupling between cognitive and perceptual resources in this task: Low-level tracking of objects generates bottom-up predictions of error likelihood, and high-level attention allocation selectively reduces error probabilities in attended locations while increasing it at non-attended locations. Whereas earlier models of multiple-object tracking have predicted the big picture relationship between stimulus complexity and response accuracy, our approach makes accurate predictions of both the macro-scale effect of target number and velocity on tracking difficulty and micro-scale variations in difficulty across individual trials and targets arising from the idiosyncratic within-trial interactions of targets and distractors. Copyright © 2016 Cognitive Science Society, Inc.
Schmidhuber, Jürgen
2013-01-01
Most of computer science focuses on automatically solving given computational problems. I focus on automatically inventing or discovering problems in a way inspired by the playful behavior of animals and humans, to train a more and more general problem solver from scratch in an unsupervised fashion. Consider the infinite set of all computable descriptions of tasks with possibly computable solutions. Given a general problem-solving architecture, at any given time, the novel algorithmic framework PowerPlay (Schmidhuber, 2011) searches the space of possible pairs of new tasks and modifications of the current problem solver, until it finds a more powerful problem solver that provably solves all previously learned tasks plus the new one, while the unmodified predecessor does not. Newly invented tasks may require achieving a wow-effect by making previously learned skills more efficient such that they require less time and space. New skills may (partially) re-use previously learned skills. The greedy search of typical PowerPlay variants uses time-optimal program search to order candidate pairs of tasks and solver modifications by their conditional computational (time and space) complexity, given the stored experience so far. The new task and its corresponding task-solving skill are those first found and validated. This biases the search toward pairs that can be described compactly and validated quickly. The computational costs of validating new tasks need not grow with task repertoire size. Standard problem-solver architectures of personal computers or neural networks tend to generalize by solving numerous tasks outside the self-invented training set; PowerPlay's ongoing search for novelty keeps breaking the generalization abilities of its present solver. This is related to Gödel's sequence of increasingly powerful formal theories based on adding formerly unprovable statements to the axioms without affecting previously provable theorems. The continually increasing repertoire of problem-solving procedures can be exploited by a parallel search for solutions to additional externally posed tasks. PowerPlay may be viewed as a greedy but practical implementation of basic principles of creativity (Schmidhuber, 2006a, 2010). A first experimental analysis can be found in separate papers (Srivastava et al., 2012a,b, 2013). PMID:23761771
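The greedy acceptance test at PowerPlay's core can be sketched schematically in a few lines; candidate_pairs and solves below are hypothetical stand-ins for the framework's ordered program search and bounded task validation.

    # Schematic sketch of one greedy PowerPlay step (illustrative pseudocode).
    # candidate_pairs(solver, repertoire) is assumed to yield (new_task,
    # modified_solver) pairs ordered by increasing conditional search cost;
    # solves(solver, task) is assumed to validate a task within time bounds.
    def powerplay_step(solver, repertoire, candidate_pairs, solves):
        for new_task, new_solver in candidate_pairs(solver, repertoire):
            # accept the first pair whose modified solver masters the new task
            # while still solving every previously learned task
            if solves(new_solver, new_task) and all(
                    solves(new_solver, t) for t in repertoire):
                return new_task, new_solver
        return None   # no acceptable pair found at this search depth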
The effect of augmented real-time image guidance on task workload during endoscopic sinus surgery.
Dixon, Benjamin J; Chan, Harley; Daly, Michael J; Vescan, Allan D; Witterick, Ian J; Irish, Jonathan C
2012-01-01
Due to proximity to critical structures, the need for spatial awareness during endoscopic sinus surgery (ESS) is essential. We have developed an augmented, real-time image-guided surgery (ART-IGS) system that provides live navigational data and proximity alerts to the operating surgeon during ablation. We wished to test the hypothesis that task workload would be reduced when using this technology. A trial involved 8 otolaryngology residents and fellows performing ESS on cadaveric specimens; 1 side in a conventional method (control) and 1 side with ART-IGS. After computed tomography scanning, anatomical contouring, and registration of the head, a three-dimensional (3D) virtual endoscopic view, ablative tool tracking, and proximity alerts were enabled. Each subject completed ESS tasks and rated their workload during and after the exercise using the National Aeronautics and Space Administration (NASA) Task Load Index (TLX). A questionnaire and open feedback interview were completed after the procedure. There was a significant reduction in mental demand, temporal demand, effort, and frustration when using the ART-IGS system in comparison to the control (p < 0.02). Perceived performance was increased (p = 0.02). Most subjects agreed that the system was sufficiently accurate, caused minimal interruption, and increased confidence. Optical tracking line-of-sight issues were frequently cited as the main limitation early in the study; however, this was largely resolved. ART-IGS reduces task workload for trainees performing ESS. Live navigation and alert zones may be a valuable intraoperative teaching aid. Copyright © 2012 American Rhinologic Society-American Academy of Otolaryngic Allergy, LLC.
Aviation Technician Training I and Task Analyses: Semester II. Field Review Copy.
ERIC Educational Resources Information Center
Upchurch, Richard
This guide for aviation technician training begins with a course description, resource information, and a course outline. Tasks/competencies are categorized into 16 concept/duty areas: understanding technical symbols and abbreviations; understanding mathematical terms, symbols, and formulas; computing decimals; computing fractions; computing ratio…
Azziz, Ricardo; Carmina, Enrico; Dewailly, Didier; Diamanti-Kandarakis, Evanthia; Escobar-Morreale, Héctor F; Futterweit, Walter; Janssen, Onno E; Legro, Richard S; Norman, Robert J; Taylor, Ann E; Witchel, Selma F
2009-02-01
To review all available data and recommend a definition for polycystic ovary syndrome (PCOS) based on published peer-reviewed data, whether already in use or not, to guide clinical diagnosis and future research. Design: literature review and expert consensus. Setting: professional society. Patient(s): none. Intervention(s): none. A systematic review of the published peer-reviewed medical literature, by querying MEDLINE databases, was conducted to identify studies evaluating the epidemiology or phenotypic aspects of PCOS. The Task Force drafted the initial report, following a consensus process via electronic communication, which was then reviewed and critiqued by the Androgen Excess and PCOS (AE-PCOS) Society Board of Directors. No section was finalized until all members were satisfied with the contents, and minority opinions were noted. Statements that were not supported by peer-reviewed evidence were not included. Based on the available data, it is the view of the AE-PCOS Society Task Force that PCOS should be defined by the presence of hyperandrogenism (clinical and/or biochemical), ovarian dysfunction (oligo-anovulation and/or polycystic ovaries), and the exclusion of related disorders. However, a minority considered the possibility that there may be forms of PCOS without overt evidence of hyperandrogenism, but recognized that more data are required before validating this supposition. Finally, the Task Force recognized and fully expects that the definition of this syndrome will evolve over time to incorporate new research findings.
A comparison of symptoms after viewing text on a computer screen and hardcopy.
Chu, Christina; Rosenfield, Mark; Portello, Joan K; Benzoni, Jaclyn A; Collier, Juanita D
2011-01-01
Computer vision syndrome (CVS) is a complex of eye and vision problems experienced during or related to computer use. Ocular symptoms may include asthenopia, accommodative and vergence difficulties, and dry eye. CVS occurs in up to 90% of computer workers, and given the almost universal use of these devices, it is important to identify whether these symptoms are specific to computer operation or are simply a manifestation of performing a sustained near-vision task. This study compared ocular symptoms immediately following a sustained near task performed either on a computer screen or on printed hardcopy. Thirty young, visually normal subjects read text aloud either from a desktop computer screen or a printed hardcopy page at a viewing distance of 50 cm for a continuous 20 min period. Identical text was used in the two sessions, matched for size and contrast. Target viewing angle and luminance were similar for the two conditions. Immediately following completion of the reading task, subjects completed a written questionnaire asking about their level of ocular discomfort during the task. When comparing the computer and hardcopy conditions, significant differences in median symptom scores were reported with regard to blurred vision during the task (t = 147.0; p = 0.03) and the mean symptom score (t = 102.5; p = 0.04). In both cases, symptoms were higher during computer use. Symptoms following sustained computer use were significantly worse than those reported after hardcopy fixation under similar viewing conditions. A better understanding of the physiology underlying CVS is critical to allow more accurate diagnosis and treatment, enabling practitioners to optimize visual comfort and efficiency during computer operation.
Pattern of Non-Task Interactions in Asynchronous Computer-Supported Collaborative Learning Courses
ERIC Educational Resources Information Center
Abedin, Babak; Daneshgar, Farhad; D'Ambra, John
2014-01-01
Despite the importance of the non-task interactions in computer-supported collaborative learning (CSCL) environments as emphasized in the literature, few studies have investigated online behavior of people in the CSCL environments. This paper studies the pattern of non-task interactions among postgraduate students in an Australian university. The…
ERIC Educational Resources Information Center
Collentine, Karina
2009-01-01
Second language acquisition (SLA) researchers strive to understand the language and exchanges that learners generate in synchronous computer-mediated communication (SCMC). Doughty and Long (2003) advocate replacing open-ended SCMC with task-based language teaching (TBLT) design principles. Since most task-based SCMC (TB-SCMC) research addresses an…
A Framework for Load Balancing of Tensor Contraction Expressions via Dynamic Task Partitioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Pai-Wei; Stock, Kevin; Rajbhandari, Samyam
In this paper, we introduce Dynamic Load-balanced Tensor Contractions (DLTC), a domain-specific library for efficient task-parallel execution of tensor contraction expressions, a class of computation encountered in quantum chemistry and physics. Our framework decomposes each contraction into smaller units of work, represented by an abstraction referred to as iterators. We exploit an extra level of parallelism by having tasks across independent contractions execute concurrently through a dynamic load-balancing runtime. We demonstrate the improved performance, scalability, and flexibility for the computation of tensor contraction expressions on parallel computers using examples from coupled cluster methods.
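A toy version of the dynamic load-balancing idea is sketched below, with idle workers pulling tile-sized units of a contraction from a shared queue; it is illustrative only, as DLTC's iterator abstraction and distributed runtime are considerably more sophisticated.

    # Minimal dynamic load balancing: a contraction is decomposed into
    # independent tile tasks that idle workers pull on demand from a shared
    # queue (illustrative only; unrelated to DLTC's actual runtime).
    import numpy as np, queue, threading

    A, B = np.random.rand(64, 256), np.random.rand(256, 64)
    C = np.zeros((64, 64))
    tasks = queue.Queue()
    for i0 in range(0, 64, 16):            # decompose C into 16-row tiles
        tasks.put(i0)

    def worker():
        while True:
            try:
                i0 = tasks.get_nowait()    # pull the next tile on demand
            except queue.Empty:
                return
            C[i0:i0+16] = A[i0:i0+16] @ B  # one unit of contraction work

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    assert np.allclose(C, A @ B)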
Andersen, Pia; Lindgaard, Anne-Mette; Prgomet, Mirela; Creswick, Nerida; Westbrook, Johanna I
2009-08-04
Selecting the right mix of stationary and mobile computing devices is a significant challenge for system planners and implementers. There is very limited research evidence upon which to base such decisions. We aimed to investigate the relationships between clinician role, clinical task, and selection of a computer hardware device in hospital wards. Twenty-seven nurses and eight doctors were observed for a total of 80 hours as they used a range of computing devices to access a computerized provider order entry system on two wards at a major Sydney teaching hospital. Observers used a checklist to record the clinical tasks completed, devices used, and location of the activities. Field notes were also documented during observations. Semi-structured interviews were conducted after observation sessions. Assessment of the physical attributes of three devices-stationary PCs, computers on wheels (COWs) and tablet PCs-was made. Two types of COWs were available on the wards: generic COWs (laptops mounted on trolleys) and ergonomic COWs (an integrated computer and cart device). Heuristic evaluation of the user interfaces was also carried out. The majority (93.1%) of observed nursing tasks were conducted using generic COWs. Most nursing tasks were performed in patients' rooms (57%) or in the corridors (36%), with a small percentage at a patient's bedside (5%). Most nursing tasks related to the preparation and administration of drugs. Doctors on ward rounds conducted 57.3% of observed clinical tasks on generic COWs and 35.9% on tablet PCs. On rounds, 56% of doctors' tasks were performed in the corridors, 29% in patients' rooms, and 3% at the bedside. Doctors not on a ward round conducted 93.6% of tasks using stationary PCs, most often within the doctors' office. Nurses and doctors were observed performing workarounds, such as transcribing medication orders from the computer to paper. The choice of device was related to clinical role, nature of the clinical task, degree of mobility required, including where task completion occurs, and device design. Nurses' work, and clinical tasks performed by doctors during ward rounds, require highly mobile computer devices. Nurses and doctors on ward rounds showed a strong preference for generic COWs over all other devices. Tablet PCs were selected by doctors for only a small proportion of clinical tasks. Even when using mobile devices clinicians completed a very low proportion of observed tasks at the bedside. The design of the devices and ward space configurations place limitations on how and where devices are used and on the mobility of clinical work. In such circumstances, clinicians will initiate workarounds to compensate. In selecting hardware devices, consideration should be given to who will be using the devices, the nature of their work, and the physical layout of the ward.
Frequency guided methods for demodulation of a single fringe pattern.
Wang, Haixia; Kemao, Qian
2009-08-17
Phase demodulation from a single fringe pattern is a challenging but important task. A frequency-guided regularized phase tracker and a frequency-guided sequential demodulation method with Levenberg-Marquardt optimization are proposed to demodulate a single fringe pattern. In both methods, the demodulation path is guided by the local frequency, from the highest to the lowest. Since critical points have low local frequency values, they are processed last, so that the spurious sign problem caused by these points is avoided. These two methods can be considered alternatives to the effective fringe-follower regularized phase tracker. Demodulation results from one computer-simulated and two experimental fringe patterns using the proposed methods are demonstrated. (c) 2009 Optical Society of America
ERIC Educational Resources Information Center
Amiryousefi, Mohammad
2016-01-01
Previous task repetition studies have primarily focused on how task repetition characteristics affect the complexity, accuracy, and fluency in L2 oral production with little attention to L2 written production. The main purpose of the study reported in this paper was to examine the effects of task repetition versus procedural repetition on the…
Computational thinking and thinking about computing
Wing, Jeannette M.
2008-01-01
Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462
Integration of active pauses and pattern of muscular activity during computer work.
St-Onge, Nancy; Samani, Afshin; Madeleine, Pascal
2017-09-01
Submaximal isometric muscle contractions have been reported to increase variability of muscle activation during computer work; however, other types of active contractions may be more beneficial. Our objective was to determine which type of active pause vs. rest is more efficient in changing muscle activity pattern during a computer task. Asymptomatic regular computer users performed a standardised 20-min computer task four times, integrating a different type of pause: sub-maximal isometric contraction, dynamic contraction, postural exercise and rest. Surface electromyographic (SEMG) activity was recorded bilaterally from five neck/shoulder muscles. Root-mean-square decreased with isometric pauses in the cervical paraspinals, upper trapezius and middle trapezius, whereas it increased with rest. Variability in the pattern of muscular activity was not affected by any type of pause. Overall, no detrimental effects on the level of SEMG during active pauses were found, suggesting that they could be implemented without a cost in activation level or variability. Practitioner Summary: We aimed to determine which type of active pause vs. rest is best in changing muscle activity pattern during a computer task. Asymptomatic computer users performed a standardised computer task integrating different types of pauses. Muscle activation decreased with isometric pauses in neck/shoulder muscles, suggesting their implementation during computer work.
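Root-mean-square amplitude, the activation measure reported above, is straightforward to compute over consecutive windows. A minimal sketch, assuming a raw SEMG trace and its sampling rate; the window length and names are illustrative:

```python
import numpy as np

def windowed_rms(emg, fs, win_s=0.1):
    """Root-mean-square amplitude of an SEMG signal over consecutive
    non-overlapping windows of win_s seconds -- a standard measure of
    muscle activation level."""
    n = int(fs * win_s)
    trimmed = emg[: len(emg) // n * n].reshape(-1, n)
    return np.sqrt((trimmed ** 2).mean(axis=1))

# Example: 10 s of synthetic 1 kHz SEMG-like noise
rms = windowed_rms(np.random.randn(10_000), fs=1000)
```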
Algorithm of Taxonomy: Method of Design and Implementation Mechanism
NASA Astrophysics Data System (ADS)
Shalanov, N. V.; Aletdinova, A. A.
2018-05-01
The authors propose that the method of design of the algorithm of taxonomy should be based on the calculation of integral indicators for estimating the level of an object according to a set of initial indicators (i.e., its potential). The values of these indicators are the projected lengths of the objects on a numeric axis taking values in [0, 100]. This approach reduces the task of multidimensional classification to a task of one-dimensional classification. The algorithm for solving the task of taxonomy contains 14 stages; its implementation is illustrated with data from 46 consumer societies of the Yakut Union of Consumer Societies of Russia.
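A minimal sketch of the projection idea, assuming min-max normalisation and an equal-weight aggregate; the paper's actual integral-indicator formula and 14-stage algorithm are not given in the abstract, so the weights and cut points below are illustrative:

```python
import numpy as np

def integral_indicator(X, weights=None):
    """Project objects described by several indicators onto a single
    [0, 100] axis: min-max normalise each indicator (assumes each one
    varies), take a weighted mean, and rescale -- reducing a
    multidimensional classification to a one-dimensional one."""
    X = np.asarray(X, dtype=float)
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    w = np.ones(X.shape[1]) if weights is None else np.asarray(weights, float)
    return 100.0 * Xn @ (w / w.sum())

# Example: 4 objects, 3 indicators; classify by simple cut points
scores = integral_indicator([[2, 30, 0.1], [5, 10, 0.4], [9, 50, 0.9], [1, 20, 0.2]])
classes = np.digitize(scores, bins=[33.3, 66.6])  # 3 one-dimensional classes
```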
Williams, Kent E; Voigt, Jeffrey R
2004-01-01
The research reported herein presents the results of an empirical evaluation of the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, each expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks using the CAT-HCI tool. Task models created by these task-domain experts with the tool were compared, in terms of accuracy and consistency, with task models created by a double expert. The findings indicated a high degree of consistency and accuracy across the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.
Informatics in dental education: a horizon of opportunity.
Abbey, L M
1989-11-01
Computers have presented society with the largest array of opportunities since the printing press. More specifically, in dental education they represent the path to freedom from the memory-based curriculum. Computers allow us to be constantly in touch with the entire scope of knowledge necessary for decision making in every aspect of the process of preparing young men and women to practice dentistry. No longer is it necessary to spend the energy or time previously used to memorize facts, test for retention of facts, or be concerned with remembering facts when dealing with our patients. Modern information management systems can assume that task, allowing dentists to concentrate on understanding, skill, judgement and wisdom while helping patients deal with their problems within a health care system that is simultaneously baffling in its complexity and overflowing with options. This paper presents a summary of the choices facing dental educators as computers continue to afford us the freedom to look differently at teaching, research and practice. The discussion elaborates some of the ways dental educators must think differently about the educational process in order to utilize fully the power of computers in curriculum development and tracking, integration of basic and clinical teaching, problem solving, patient management, record keeping and research. Some alternative strategies are discussed that may facilitate the transition from the memory-based to the computer-based curriculum and practice.
Building Cognition: The Construction of Computational Representations for Scientific Discovery.
Chandrasekharan, Sanjay; Nersessian, Nancy J
2015-11-01
Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account. Copyright © 2014 Cognitive Science Society, Inc.
Computer Literacy and Social Stratification. Interactive Technology Laboratory Report #9.
ERIC Educational Resources Information Center
Mehan, Hugh
As schools acquire and use computers for educational purposes, two major questions arise: (1) whether students from different strata of society will obtain equal access to computers, and (2) whether students from different strata of society will be taught similar or different uses of the computer. To explore the relationship between the…
Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction
ERIC Educational Resources Information Center
Zoanetti, Nathan
2010-01-01
This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…
Item Mass and Complexity and the Arithmetic Computation of Students with Learning Disabilities.
ERIC Educational Resources Information Center
Cawley, John F.; Shepard, Teri; Smith, Maureen; Parmar, Rene S.
1997-01-01
The performance of 76 students (ages 10 to 15) with learning disabilities on four tasks of arithmetic computation within each of the four basic operations was examined. Tasks varied in difficulty level and number of strokes needed to complete all items. Intercorrelations between task sets and operations were examined as was the use of…
Task Scheduling in Desktop Grids: Open Problems
NASA Astrophysics Data System (ADS)
Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny
2017-12-01
We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.
ERIC Educational Resources Information Center
Shamsudin, Sarimah; Nesi, Hilary
2006-01-01
This paper will describe an ESP approach to the design and implementation of computer-mediated communication (CMC) tasks for computer science students at Universiti Teknologi Malaysia, and discuss the effectiveness of the chat feature of Windows NetMeeting as a tool for developing specified language skills. CMC tasks were set within a programme of…
Simplified Distributed Computing
NASA Astrophysics Data System (ADS)
Li, G. G.
2006-05-01
Distributed computing ranges from high-performance parallel computing and GRID computing to environments where idle CPU cycles and storage space of numerous networked systems are harnessed to work together through the Internet. In this work we focus on building an easy and affordable solution for computationally intensive problems in scientific applications, based on existing technology and hardware resources. The system consists of a series of controllers. When a job request is detected by a monitor or initialized by an end user, the job manager launches the specific job handler for that job. The job handler pre-processes the job, partitions it into relatively independent tasks, and distributes the tasks into the processing queue. The task handler picks up the related tasks, processes them, and puts the results back into the processing queue. The job handler also monitors and examines the tasks and the results, and assembles the task results into the overall solution for the job request when all tasks are finished for each job. A resource manager configures and monitors all participating nodes. A distributed agent is deployed on all participating nodes to manage software download and report status. The processing queue is the key to the success of this distributed system. We use BEA's WebLogic JMS queue in our implementation. It guarantees message delivery and has message-priority and retry features so that tasks never get lost. The entire system is built on J2EE technology and can be deployed on heterogeneous platforms. It can handle algorithms and applications developed in any language on any platform. J2EE adaptors are provided to manage and communicate with existing applications so that applications and algorithms running on Unix, Linux and Windows can all work together. This system is easy and fast to develop because it is based on the industry's well-adopted technology. It is highly scalable and heterogeneous. It is an open system, and any number and type of machines can join to provide computational power. This asynchronous, message-based system can achieve response times on the order of seconds. For efficiency, communication between distributed tasks is often done at the start and end of the tasks, but intermediate task status can also be provided.
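The job handler / task handler / queue pattern described above can be sketched in a single process. The sketch below stands in Python's queue.Queue for the WebLogic JMS queue and uses a toy partitioning and computation; all names and the worker count are illustrative:

```python
import queue
import threading

task_q, result_q = queue.Queue(), queue.Queue()

def job_handler(job):
    """Partition a job into independent tasks, enqueue them, then
    assemble the results once all tasks are finished."""
    tasks = [job[i::4] for i in range(4)]         # naive partitioning
    for t in tasks:
        task_q.put(t)
    return sorted(result_q.get() for _ in tasks)  # assemble solution

def task_handler():
    """Pick up tasks, process them, put results back on the queue."""
    while True:
        t = task_q.get()
        result_q.put(sum(t))                      # stand-in computation
        task_q.task_done()

for _ in range(4):  # participating "nodes", here just worker threads
    threading.Thread(target=task_handler, daemon=True).start()

print(job_handler(list(range(100))))  # four partial sums of 0..99
```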
Strategic adaptation to performance objectives in a dual-task setting.
Janssen, Christian P; Brumby, Duncan P
2010-11-01
How do people interleave attention when multitasking? One dominant account is that the completion of a subtask serves as a cue to switch tasks. But what happens if switching solely at subtask boundaries led to poor performance? We report a study in which participants manually dialed a UK-style telephone number while driving a simulated vehicle. If the driver were to exclusively return his or her attention to driving after completing a subtask (i.e., using the single break in the xxxxx-xxxxxx representational structure of the number), then we would expect to see a relatively poor driving performance. In contrast, our results show that drivers choose to return attention to steering control before the natural subtask boundary. A computational modeling analysis shows that drivers had to adopt this strategy to meet the required performance objective of maintaining an acceptable lateral position in the road while dialing. Taken together these results support the idea that people can strategically control the allocation of attention in multitask settings to meet specific performance criteria. Copyright © 2010 Cognitive Science Society, Inc.
Dadashi, N; Stedmon, A W; Pridmore, T P
2013-09-01
Recent advances in computer vision technology have led to the development of various automatic surveillance systems; however, their effectiveness is adversely affected by many factors and they are not completely reliable. This study investigated the potential of a semi-automated surveillance system to reduce CCTV operator workload in both detection and tracking activities. A further focus of interest was the degree of user reliance on the automated system. A simulated prototype was developed which mimicked an automated system that provided different levels of system confidence information. Dependent variable measures were taken for secondary task performance, reliance and subjective workload. When the automatic component of a semi-automatic CCTV surveillance system provided reliable system confidence information to operators, workload significantly decreased and spare mental capacity significantly increased. Providing feedback about system confidence and accuracy appears to be one important way of making the status of the automated component of the surveillance system more 'visible' to users and hence more effective to use. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Empowering the impaired through the appropriate use of Information Technology and Internet.
Sanyal, Ishita
2006-01-01
Developments in science and technology have revolutionized human life at the material level. In actuality, however, this progress is only superficial: underneath, modern men and women live in conditions of great mental and emotional stress, even in developed and affluent countries. People all over the world, irrespective of culture and economic background, suffer from mental illness, and although much research has been carried out worldwide, it has not yet been possible to resolve the problem. Stress in today's world is increasing every day. An individualistic approach towards life and the nuclear family system have increased the burden further. Without an adequate support system of friends and relatives, people fall prey to mental illness. The insecurities and feelings of inferiority of these persons lead to a breakdown of communication between the sufferer and family members and friends. Sufferers often confine themselves within the four walls of their home and withdraw from the world, preferring their world of fantasy, far from the world of reality. Disability caused by some mental illnesses often remains invisible to society, leaving sufferers without support systems and facilities. These persons need not only medication and counseling but a thorough rehabilitation programme to bring them back into the mainstream of life; the task is not an easy one. Research suggests that these persons need work and income to improve their quality of life. In a scenario where society is hostile towards them, where stigma around mental illness prevails, and where help from friends and community is not available, training them in computing and forming groups through the computer was thought to be an ideal option: a solution to the problems of modern life through modern technology. It was seen that these insecure, disabled persons feel free to experiment with a machine more easily than with society and people. Computers provide the education and information needed for their further development, provide facilities to interact with others and form self-help groups, and enable them to earn a livelihood. Thus this modern gadget, sometimes believed to make people loners, has actually acted as a bridge between persons suffering from mental illness and society in general. Disabled persons also gain confidence and courage as they gain control over the machine, and gaining control over the machine helps them gain control over their lives. Science and technology have thus revolutionized human life not only at the material level but also at the personal level, helping the disabled to gain control over their lives.
Karunaratne, Asuntha S; Korenman, Stanley G; Thomas, Samantha L; Myles, Paul S; Komesaroff, Paul A
2010-04-05
To assess the efficacy, with respect to participant understanding of information, of a computer-based approach to communication about complex, technical issues that commonly arise when seeking informed consent for clinical research trials. An open, randomised controlled study of 60 patients with diabetes mellitus, aged 27-70 years, recruited between August 2006 and October 2007 from the Department of Diabetes and Endocrinology at the Alfred Hospital and Baker IDI Heart and Diabetes Institute, Melbourne. Participants were asked to read information about a mock study via a computer-based presentation (n = 30) or a conventional paper-based information statement (n = 30). The computer-based presentation contained visual aids, including diagrams, video, hyperlinks and quiz pages. Understanding of information as assessed by quantitative and qualitative means. Assessment scores used to measure level of understanding were significantly higher in the group that completed the computer-based task than the group that completed the paper-based task (82% v 73%; P = 0.005). More participants in the group that completed the computer-based task expressed interest in taking part in the mock study (23 v 17 participants; P = 0.01). Most participants from both groups preferred the idea of a computer-based presentation to the paper-based statement (21 in the computer-based task group, 18 in the paper-based task group). A computer-based method of providing information may help overcome existing deficiencies in communication about clinical research, and may reduce costs and improve efficiency in recruiting participants for clinical trials.
Inter-Association Task Force Report on Image.
ERIC Educational Resources Information Center
Special Libraries Association, Washington, DC.
In 1988, the Board of Directors of the Special Libraries Association provided funding to a task force to gather data which would determine how certain segments of society perceive librarians, how librarians view themselves and their colleagues, and to provide recommendations for addressing the issue of image. The task force project consisted of…
Advanced information processing system: Local system services
NASA Technical Reports Server (NTRS)
Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter
1989-01-01
The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.
A resource management architecture based on complex network theory in cloud computing federation
NASA Astrophysics Data System (ADS)
Zhang, Zehua; Zhang, Xuejie
2011-10-01
Cloud Computing Federation is a main trend of Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of Cloud Computing Federation. Cloud Computing Federation has the typical characteristics of a complex system; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers for the evolution of the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance, and adaptive ability. The results of a model experiment confirmed the advantages of RMABC in resource discovery performance.
NASA Technical Reports Server (NTRS)
Chu, Y. Y.
1978-01-01
A unified formulation of computer-aided, multi-task decision making is presented. A strategy for allocating decision-making responsibility between human and computer is developed. The plans of a flight management system are studied. A model based on queueing theory was implemented.
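The abstract states only that the model rests on queueing theory. As a hedged illustration (not the report's actual formulation), the standard M/M/1 single-server results give the quantities such a model would trade off when deciding whether the human or the computer should absorb an incoming decision task:

```latex
% Illustrative M/M/1 quantities (lambda = task arrival rate,
% mu = service rate of the decision maker):
\rho = \frac{\lambda}{\mu}, \qquad
L = \frac{\rho}{1 - \rho}, \qquad
W = \frac{1}{\mu - \lambda}
% rho: utilization; L: mean number of tasks in the system;
% W: mean time a task spends in the system. Allocating tasks to the
% computer lowers the human's effective lambda and hence W.
```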
Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.
ERIC Educational Resources Information Center
Knerr, Bruce W.; And Others
Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Cloud Compute (EC2) that promises improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
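As a back-of-envelope check (not stated in the abstract), the quoted 70% HPL efficiency implies a theoretical peak of roughly:

```latex
R_{\mathrm{peak}} \approx \frac{R_{\mathrm{HPL}}}{0.70}
             = \frac{2\,\mathrm{TFLOPS}}{0.70}
             \approx 2.9\,\mathrm{TFLOPS},
\qquad
\frac{2.9\,\mathrm{TFLOPS}}{240\ \text{cores}} \approx 12\,\mathrm{GFLOPS/core}
```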
Gerth, Sabrina; Dolk, Thomas; Klassert, Annegret; Fliesser, Michael; Fischer, Martin H; Nottbusch, Guido; Festman, Julia
2016-08-01
Our study addresses the following research questions: Are there differences between handwriting movements on paper and on a tablet computer? Can experienced writers, such as most adults, adapt their graphomotor execution during writing to a rather unfamiliar surface for instance a tablet computer? We examined the handwriting performance of adults in three tasks with different complexity: (a) graphomotor abilities, (b) visuomotor abilities and (c) handwriting. Each participant performed each task twice, once on paper and once on a tablet computer with a pen. We tested 25 participants by measuring their writing duration, in air time, number of pen lifts, writing velocity and number of inversions in velocity. The data were analyzed using linear mixed-effects modeling with repeated measures. Our results reveal differences between writing on paper and on a tablet computer which were partly task-dependent. Our findings also show that participants were able to adapt their graphomotor execution to the smoother surface of the tablet computer during the tasks. Copyright © 2016 Elsevier B.V. All rights reserved.
Slime mould processors, logic gates and sensors.
Adamatzky, A
2015-07-28
A heterotic, or hybrid, computation implies that two or more substrates of different physical nature are merged into a single device with indistinguishable parts. These hybrid devices then undertake coherent acts on programmable and sensible processing of information. We study the potential of heterotic computers using slime mould acting under the guidance of chemical, mechanical and optical stimuli. Plasmodium of acellular slime mould Physarum polycephalum is a gigantic single cell visible to the unaided eye. The cell shows a rich spectrum of behavioural morphological patterns in response to changing environmental conditions. Given data represented by chemical or physical stimuli, we can employ and modify the behaviour of the slime mould to make it solve a range of computing and sensing tasks. We overview results of laboratory experimental studies on prototyping of the slime mould morphological processors for approximation of Voronoi diagrams, planar shapes and solving mazes, and discuss logic gates implemented via collision of active growing zones and tactile responses of P. polycephalum. We also overview a range of electronic components (a memristor and chemical, tactile and colour sensors) made of the slime mould. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
MAT - MULTI-ATTRIBUTE TASK BATTERY FOR HUMAN OPERATOR WORKLOAD AND STRATEGIC BEHAVIOR RESEARCH
NASA Technical Reports Server (NTRS)
Comstock, J. R.
1994-01-01
MAT, a Multi-Attribute Task battery, gives the researcher the capability of performing multi-task workload and performance experiments. The battery provides a benchmark set of tasks for use in a wide range of laboratory studies of operator performance and workload. MAT incorporates tasks analogous to activities that aircraft crew members perform in flight, while providing a high degree of experiment control, performance data on each subtask, and freedom to use non-pilot test subjects. The MAT battery primary display is composed of four separate task windows which are as follows: a monitoring task window which includes gauges and warning lights, a tracking task window for the demands of manual control, a communication task window to simulate air traffic control communications, and a resource management task window which permits maintaining target levels on a fuel management task. In addition, a scheduling task window gives the researcher information about future task demands. The battery also provides the option of manual or automated control of tasks. The task generates performance data for each subtask. The task battery may be paused and onscreen workload rating scales presented to the subject. The MAT battery was designed to use a serially linked second computer to generate the voice messages for the Communications task. The MATREMX program and support files, which are included in the MAT package, were designed to work with the Heath Voice Card (Model HV-2000, available through the Heath Company, Benton Harbor, Michigan 49022); however, the MATREMX program and support files may easily be modified to work with other voice synthesizer or digitizer cards. The MAT battery task computer may also be used independent of the voice computer if no computer synthesized voice messages are desired or if some other method of presenting auditory messages is devised. MAT is written in QuickBasic and assembly language for IBM PC series and compatible computers running MS-DOS. The code in MAT is written for Microsoft QuickBasic 4.5 and Microsoft Macro Assembler 5.1. This package requires a joystick and EGA or VGA color graphics. An 80286, 386, or 486 processor machine is highly recommended. The standard distribution medium for MAT is a 5.25 inch 360K MS-DOS format diskette. The files are compressed using the PKZIP file compression utility. PKUNZIP is included on the distribution diskette. MAT was developed in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS, Microsoft QuickBasic, and Microsoft Macro Assembler are registered trademarks of Microsoft Corporation. PKZIP and PKUNZIP are registered trademarks of PKWare, Inc.
Checkpoint triggering in a computer system
Cher, Chen-Yong
2016-09-06
According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
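A minimal sketch of the control flow described in the abstract (periodically read a monitor, derive a threshold from the metric, checkpoint the task state when the threshold is crossed). The toy task, stand-in metric, and re-arming rule are illustrative, not the patented mechanism:

```python
import pickle

class Task:
    """Toy task whose state is a counter; stands in for real work."""
    def __init__(self): self.i = 0
    def done(self): return self.i >= 100
    def step(self): self.i += 1
    def snapshot(self): return pickle.dumps({"i": self.i})  # restartable state

def run_with_checkpoints(task, read_interval=10, threshold=0.5):
    """Poll a metric every `read_interval` steps; create a checkpoint
    whenever the metric crosses the current threshold."""
    checkpoints, steps = [], 0
    while not task.done():
        task.step(); steps += 1
        if steps % read_interval == 0:      # time to read the monitor
            metric = task.i / 100.0         # stand-in progress metric
            if metric >= threshold:         # threshold crossed?
                checkpoints.append(task.snapshot())
                threshold += 0.25           # re-arm for the next epoch
    return checkpoints

print(len(run_with_checkpoints(Task())))  # -> 3 checkpoints (steps 50, 80, 100)
```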
O'Grady, N P; Barie, P S; Bartlett, J G; Bleck, T; Garvey, G; Jacobi, J; Linden, P; Maki, D G; Nam, M; Pasculle, W; Pasquale, M D; Tribett, D L; Masur, H
1998-05-01
The development of practice guidelines for evaluating adult patients who develop new fever in the intensive care unit (ICU) for the purpose of guiding clinical practice. A task force of 13 experts in disciplines related to critical care medicine, infectious diseases, and surgery was convened from the membership of the Society of Critical Care Medicine and the Infectious Disease Society of America. The task force members provided personal experience and determined the published literature (articles retrieved with use of MEDLINE or textbooks) from which consensus would be sought. The published literature was reviewed and classified into one of four categories, according to study design and scientific value. The task force met several times in person and twice monthly by teleconference over a 1-year period to identify the pertinent literature and arrive at consensus recommendations. Consideration was given to the relationship between the weight of scientific evidence and the experts' opinions. Draft documents were composed and debated by the task force until consensus was reached by nominal group process. The panel concluded that because fever can have many infectious and noninfectious etiologies, a new fever in an adult patient in the ICU should trigger a careful clinical assessment rather than automatic orders for laboratory and radiological tests. A cost-conscious approach to obtaining diagnostic studies should be undertaken if they are indicated after a clinical evaluation. The goal of such an approach is to determine, in a directed manner, whether infection is present so that additional testing can be avoided and therapeutic options can be identified.
Early false-belief understanding in traditional non-Western societies
Barrett, H. Clark; Broesch, Tanya; Scott, Rose M.; He, Zijing; Baillargeon, Renée; Wu, Di; Bolz, Matthias; Henrich, Joseph; Setoh, Peipei; Wang, Jianxin; Laurence, Stephen
2013-01-01
The psychological capacity to recognize that others may hold and act on false beliefs has been proposed to reflect an evolved, species-typical adaptation for social reasoning in humans; however, controversy surrounds the developmental timing and universality of this trait. Cross-cultural studies using elicited-response tasks indicate that the age at which children begin to understand false beliefs ranges from 4 to 7 years across societies, whereas studies using spontaneous-response tasks with Western children indicate that false-belief understanding emerges much earlier, consistent with the hypothesis that false-belief understanding is a psychological adaptation that is universally present in early childhood. To evaluate this hypothesis, we used three spontaneous-response tasks that have revealed early false-belief understanding in the West to test young children in three traditional, non-Western societies: Salar (China), Shuar/Colono (Ecuador) and Yasawan (Fiji). Results were comparable with those from the West, supporting the hypothesis that false-belief understanding reflects an adaptation that is universally present early in development. PMID:23363628
Patel, Manesh R; Bailey, Steven R; Bonow, Robert O; Chambers, Charles E; Chan, Paul S; Dehmer, Gregory J; Kirtane, Ajay J; Wann, L Samuel; Ward, R Parker; Douglas, Pamela S; Patel, Manesh R; Bailey, Steven R; Altus, Philip; Barnard, Denise D; Blankenship, James C; Casey, Donald E; Dean, Larry S; Fazel, Reza; Gilchrist, Ian C; Kavinsky, Clifford J; Lakoski, Susan G; Le, D Elizabeth; Lesser, John R; Levine, Glenn N; Mehran, Roxana; Russo, Andrea M; Sorrentino, Matthew J; Williams, Mathew R; Wong, John B; Wolk, Michael J; Bailey, Steven R; Douglas, Pamela S; Hendel, Robert C; Kramer, Christopher M; Min, James K; Patel, Manesh R; Shaw, Leslee; Stainback, Raymond F; Allen, Joseph M
2012-09-01
The American College of Cardiology Foundation, in collaboration with the Society for Cardiovascular Angiography and Interventions and key specialty and subspecialty societies, conducted a review of common clinical scenarios where diagnostic catheterization is frequently considered. The indications (clinical scenarios) were derived from common applications or anticipated uses, as well as from current clinical practice guidelines and results of studies examining the implementation of noninvasive imaging appropriate use criteria. The 166 indications in this document were developed by a diverse writing group and scored by a separate independent technical panel on a scale of 1 to 9, to designate appropriate use (median 7 to 9), uncertain use (median 4 to 6), and inappropriate use (median 1 to 3). Diagnostic catheterization may include several different procedure components. The indications developed focused primarily on 2 aspects of diagnostic catheterization. Many indications focused on the performance of coronary angiography for the detection of coronary artery disease with other procedure components (e.g., hemodynamic measurements, ventriculography) at the discretion of the operator. The majority of the remaining indications focused on hemodynamic measurements to evaluate valvular heart disease, pulmonary hypertension, cardiomyopathy, and other conditions, with the use of coronary angiography at the discretion of the operator. Seventy-five indications were rated as appropriate, 49 were rated as uncertain, and 42 were rated as inappropriate. The appropriate use criteria for diagnostic catheterization have the potential to impact physician decision making, healthcare delivery, and reimbursement policy. Furthermore, recognition of uncertain clinical scenarios facilitates identification of areas that would benefit from future research. Copyright © 2012 Wiley Periodicals, Inc.
Acromegaly: an endocrine society clinical practice guideline.
Katznelson, Laurence; Laws, Edward R; Melmed, Shlomo; Molitch, Mark E; Murad, Mohammad Hassan; Utz, Andrea; Wass, John A H
2014-11-01
The aim was to formulate clinical practice guidelines for acromegaly. The Task Force included a chair selected by the Endocrine Society Clinical Guidelines Subcommittee (CGS), five experts in the field, and a methodologist. The authors received no corporate funding or remuneration. This guideline is cosponsored by the European Society of Endocrinology. This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) system to describe both the strength of recommendations and the quality of evidence. The Task Force reviewed primary evidence and commissioned two additional systematic reviews. One group meeting, several conference calls, and e-mail communications enabled consensus. Committees and members of the Endocrine Society and the European Society of Endocrinology reviewed drafts of the guidelines. Using an evidence-based approach, this acromegaly guideline addresses important clinical issues regarding the evaluation and management of acromegaly, including the appropriate biochemical assessment, a therapeutic algorithm, including use of medical monotherapy or combination therapy, and management during pregnancy.
Airborne Intelligent Display (AID) Phase I Software Description,
1983-10-24
[Extraction residue: table-of-contents and figure-list fragments from the original report. The recoverable design note states that the stated objectives were met by distributing the processing load among multiple Z80 single-board computers (SBCs).]
ERIC Educational Resources Information Center
Baumeister, Antonia E.; Engelmann, Tanja; Hesse, Friedrich W.
2017-01-01
This experimental study extends conflict elaboration theory (1) by revealing social influence dynamics for a knowledge-rich computer-supported socio-cognitive conflict task not investigated in the context of this theory before and (2) by showing the impact of individual differences in social comparison orientation. Students in two conditions…
Peters, Anne L; Ahmann, Andrew J; Battelino, Tadej; Evert, Alison; Hirsch, Irl B; Murad, M Hassan; Winter, William E; Wolpert, Howard
2016-11-01
To formulate clinical practice guidelines for the use of continuous glucose monitoring and continuous subcutaneous insulin infusion in adults with diabetes. The participants include an Endocrine Society-appointed Task Force of seven experts, a methodologist, and a medical writer. The American Association for Clinical Chemistry, the American Association of Diabetes Educators, and the European Society of Endocrinology co-sponsored this guideline. The Task Force developed this evidence-based guideline using the Grading of Recommendations, Assessment, Development, and Evaluation system to describe the strength of recommendations and the quality of evidence. The Task Force commissioned one systematic review and used the best available evidence from other published systematic reviews and individual studies. One group meeting, several conference calls, and e-mail communications enabled consensus. Committees and members of the Endocrine Society, the American Association for Clinical Chemistry, the American Association of Diabetes Educators, and the European Society of Endocrinology reviewed and commented on preliminary drafts of these guidelines. Continuous subcutaneous insulin infusion and continuous glucose monitoring have an important role in the treatment of diabetes. Data from randomized controlled trials are limited on the use of medical devices, but existing studies support the use of diabetes technology for a wide variety of indications. This guideline presents a review of the literature and practice recommendations for appropriate device use.
Peterse, Elisabeth F P; Meester, Reinier G S; Siegel, Rebecca L; Chen, Jennifer C; Dwyer, Andrea; Ahnen, Dennis J; Smith, Robert A; Zauber, Ann G; Lansdorp-Vogelaar, Iris
2018-05-30
In 2016, the Microsimulation Screening Analysis-Colon (MISCAN-Colon) model was used to inform the US Preventive Services Task Force colorectal cancer (CRC) screening guidelines. In this study, 1 of 2 microsimulation analyses to inform the update of the American Cancer Society CRC screening guideline, the authors re-evaluated the optimal screening strategies in light of the increase in CRC diagnosed in young adults. The authors adjusted the MISCAN-Colon model to reflect the higher CRC incidence in young adults, who were assumed to carry forward escalated disease risk as they age. Life-years gained (LYG; benefit), the number of colonoscopies (COL; burden) and the ratios of incremental burden to benefit (efficiency ratio [ER] = ΔCOL/ΔLYG) were projected for different screening strategies. Strategies differed with respect to test modality, ages to start (40 years, 45 years, and 50 years) and ages to stop (75 years, 80 years, and 85 years) screening, and screening intervals (depending on screening modality). The authors then determined the model-recommended strategies in a similar way as was done for the US Preventive Services Task Force, using ER thresholds in accordance with the previously accepted ER of 39. Because of the higher CRC incidence, model-predicted LYG from screening increased compared with the previous analyses. Consequently, the balance of burden to benefit of screening improved and now 10-yearly colonoscopy screening starting at age 45 years resulted in an ER of 32. Other recommended strategies included fecal immunochemical testing annually, flexible sigmoidoscopy screening every 5 years, and computed tomographic colonography every 5 years. This decision-analysis suggests that in light of the increase in CRC incidence among young adults, screening may be offered earlier than has previously been recommended. Cancer 2018. © 2018 The Authors. Cancer published by Wiley Periodicals, Inc. on behalf of American Cancer Society.
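For concreteness, the efficiency-ratio rule works as follows. The incremental numbers below are hypothetical; only the ratio of 32 and the threshold of 39 come from the study:

```latex
\mathrm{ER} = \frac{\Delta \mathrm{COL}}{\Delta \mathrm{LYG}}
% Hypothetical step between adjacent strategies: 320 extra
% colonoscopies and 10 extra life-years gained per 1000 persons:
\mathrm{ER} = \frac{320}{10} = 32 \;\le\; 39
% so the more intensive strategy is still acceptable under the
% previously accepted threshold of 39.
```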
ERIC Educational Resources Information Center
Barkaoui, Khaled
2016-01-01
This study contributes to the literature on second language (L2) learners' revision behavior by describing what, when, and how often L2 learners revise their texts when responding to timed writing tasks on the computer and by examining the effects of task type, L2 proficiency, and keyboarding skills on what and when L2 learners revise. Each of 54…
Computer Abuse: Vandalizing the Information Society.
ERIC Educational Resources Information Center
Furnell, Steven M.; Warren, Matthew J.
1997-01-01
Computing and telecommunications, key to an information-based society, are increasingly targets for criminals and mischief makers. This article examines the effects of malicious computer abuse: hacking and viruses, highlights the apparent increase in incidents, and examines their effect on public perceptions of technology. Presents broad…
Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task
NASA Astrophysics Data System (ADS)
Revechkis, Boris; Aflalo, Tyson NS; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A.
2014-12-01
Objective. To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. Approach. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like ‘Face in a Crowd’ task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the ‘Crowd’) using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a ‘Crowd Off’ condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Main results. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Significance. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet computers.
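The record does not specify the decoding algorithm. As a generic illustration of how binned firing rates from such arrays are commonly mapped to cursor kinematics, a ridge-regression velocity decoder can be sketched; this is purely illustrative and not the study's decoder:

```python
import numpy as np

def fit_velocity_decoder(rates, velocity, lam=1.0):
    """Ridge regression from binned firing rates (T x N) to 2-D cursor
    velocity (T x 2), with a bias column appended."""
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])
    W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ velocity)
    return W

def decode(rates_t, W):
    """Predict (vx, vy) for one time bin of firing rates."""
    return np.append(rates_t, 1.0) @ W

# Toy usage with fake data: 500 bins, 192 channels (2 x 96 arrays)
rng = np.random.default_rng(0)
rates = rng.poisson(5, (500, 192)).astype(float)
vel = rng.standard_normal((500, 2))
W = fit_velocity_decoder(rates, vel)
vx, vy = decode(rates[0], W)
```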
Archer, Charles J; Blocksome, Michael E; Ratterman, Joseph D; Smith, Brian E
2014-02-11
Endpoint-based parallel data processing in a parallel active messaging interface ('PAMI') of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.
Psychological Issues in Online Adaptive Task Allocation
NASA Technical Reports Server (NTRS)
Morris, N. M.; Rouse, W. B.; Ward, S. L.; Frey, P. R.
1984-01-01
Adaptive aiding is an idea that offers potential for improvement over many current approaches to aiding in human-computer systems. The expected return of tailoring the system to fit the user could be in the form of improved system performance and/or increased user satisfaction. Issues such as the manner in which information is shared between human and computer, the appropriate division of labor between them, and the level of autonomy of the aid are explored. A simulated visual search task was developed. Subjects are required to identify targets in a moving display while performing a compensatory sub-critical tracking task. By manipulating characteristics of the situation such as imposed task-related workload and effort required to communicate with the computer, it is possible to create conditions in which interaction with the computer would be more or less desirable. The results of preliminary research using this experimental scenario are presented, and future directions for this research effort are discussed.
Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.
2014-08-12
Endpoint-based parallel data processing in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI composed of data communications endpoints, each endpoint including a specification of data communications parameters for a thread of execution on a compute node, including specifications of a client, a context, and a task, the compute nodes coupled for data communications through the PAMI, including establishing a data communications geometry, the geometry specifying, for tasks representing processes of execution of the parallel application, a set of endpoints that are used in collective operations of the PAMI including a plurality of endpoints for one of the tasks; receiving in endpoints of the geometry an instruction for a collective operation; and executing the instruction for a collective operation through the endpoints in dependence upon the geometry, including dividing data communications operations among the plurality of endpoints for one of the tasks.
Short-Circuiting the Bureaucracy: Policy Origins in Education.
ERIC Educational Resources Information Center
Graham, Hugh Davis
The Great Society's secret task forces created by Lyndon Johnson, particularly in the case-study area of federal education policy, show the use and misuse of the task force device. Modern use of it began with John F. Kennedy. Although he used the task force device effectively sometimes, he did not use it effectively in his educational programs in…
ERIC Educational Resources Information Center
Gil, Laura; Braten, Ivar; Vidal-Abarca, Eduardo; Stromso, Helge I.
2010-01-01
One of the major challenges of a knowledge society is that students as well as other citizens must learn to understand and integrate information from multiple textual sources. Still, task and reader characteristics that may facilitate or constrain such intertextual processes are not well understood by researchers. In this study, we compare the…
ERIC Educational Resources Information Center
Djambong, Takam; Freiman, Viktor
2016-01-01
While today's schools in several countries, like Canada, are about to bring back programming to their curricula, a new conceptual angle, namely one of computational thinking, draws attention of researchers. In order to understand the articulation between computational thinking tasks in one side, student's targeted skills, and the types of problems…
Mediated Activity in the Primary Classroom: Girls, Boys and Computers.
ERIC Educational Resources Information Center
Fitzpatrick, Helen; Hardman, Margaret
2000-01-01
Studied the social interaction of 7- and 9-year-olds working in the same or mixed gender pairs on language-based computer and noncomputer tasks. At both ages, mixed gender pairs showed more assertive and less transactive (collaborative) interaction than same gender pairs on both tasks. Discusses the mediational role of the computer and the social…
Task-Relevant Sound and User Experience in Computer-Mediated Firefighter Training
ERIC Educational Resources Information Center
Houtkamp, Joske M.; Toet, Alexander; Bos, Frank A.
2012-01-01
The authors added task-relevant sounds to a computer-mediated instructor in-the-loop virtual training for firefighter commanders in an attempt to raise the engagement and arousal of the users. Computer-mediated training for crew commanders should provide a sensory experience that is sufficiently intense to make the training viable and effective.…
Distributed computation of graphics primitives on a transputer network
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1988-01-01
A method is developed for distributing the computation of graphics primitives on a parallel processing network. Off-the-shelf transputer boards are used to perform the graphics transformations and scan-conversion tasks that would normally be assigned to a single transputer based display processor. Each node in the network performs a single graphics primitive computation. Frequently requested tasks can be duplicated on several nodes. The results indicate that the current distribution of commands on the graphics network shows a performance degradation when compared to the graphics display board alone. A change to more computation per node for every communication (perform more complex tasks on each node) may cause the desired increase in throughput.
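A minimal sketch of the distribution scheme described above, with one primitive type per node and frequently requested primitives duplicated on several nodes. The node assignments and the round-robin choice among duplicates are illustrative, not the paper's implementation:

```python
import itertools

# Map each graphics primitive to the node(s) that compute it; hot
# primitives (here, line scan-conversion) are duplicated on several
# nodes, as described. Node ids and counts are illustrative.
PRIMITIVE_NODES = {
    "transform": [0],
    "clip":      [1],
    "line":      [2, 3, 4],   # frequently requested: duplicated
    "polygon":   [5],
}
_rr = {p: itertools.cycle(nodes) for p, nodes in PRIMITIVE_NODES.items()}

def dispatch(primitive, payload):
    """Send a primitive computation to one of its nodes, cycling
    round-robin among duplicates to spread the load."""
    node = next(_rr[primitive])
    return node, payload       # stand-in for an actual network send

print(dispatch("line", ((0, 0), (10, 5))))  # -> node 2, then 3, 4, 2, ...
```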
Brain-computer interface control along instructed paths
NASA Astrophysics Data System (ADS)
Sadtler, P. T.; Ryu, S. I.; Tyler-Kabara, E. C.; Yu, B. M.; Batista, A. P.
2015-02-01
Objective. Brain-computer interfaces (BCIs) are being developed to assist paralyzed people and amputees by translating neural activity into movements of a computer cursor or prosthetic limb. Here we introduce a novel BCI task paradigm, intended to help accelerate improvements to BCI systems. Through this task, we can push the performance limits of BCI systems, we can quantify more accurately how well a BCI system captures the user’s intent, and we can increase the richness of the BCI movement repertoire. Approach. We have implemented an instructed path task, wherein the user must drive a cursor along a visible path. The instructed path task provides a versatile framework to increase the difficulty of the task and thereby push the limits of performance. Relative to traditional point-to-point tasks, the instructed path task allows more thorough analysis of decoding performance and greater richness of movement kinematics. Main results. We demonstrate that monkeys are able to perform the instructed path task in a closed-loop BCI setting. We further investigate how the performance under BCI control compares to native arm control, whether users can decrease their movement variability in the face of a more demanding task, and how the kinematic richness is enhanced in this task. Significance. The use of the instructed path task has the potential to accelerate the development of BCI systems and their clinical translation.
NASA Astrophysics Data System (ADS)
Zhou, Weimin; Anastasio, Mark A.
2018-03-01
It has been advocated that task-based measures of image quality (IQ) should be employed to evaluate and optimize imaging systems. Task-based measures of IQ quantify the performance of an observer on a medically relevant task. The Bayesian Ideal Observer (IO), which employs complete statistical information of the object and noise, achieves the upper limit of the performance for a binary signal classification task. However, computing the IO performance is generally analytically intractable and can be computationally burdensome when Markov-chain Monte Carlo (MCMC) techniques are employed. In this paper, supervised learning with convolutional neural networks (CNNs) is employed to approximate the IO test statistics for a signal-known-exactly and background-known-exactly (SKE/BKE) binary detection task. The receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC) are compared to those produced by the analytically computed IO. The advantages of the proposed supervised learning approach for approximating the IO are demonstrated.
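For the Gaussian SKE/BKE setting described above, the Ideal Observer's test statistic is known to reduce to a prewhitened matched filter, which is what a trained CNN would be expected to approximate. The hedged sketch below computes that analytic statistic and estimates AUC with the Mann-Whitney pair-counting estimator; the signal, covariance, and trial counts are illustrative stand-ins, and a CNN's learned output would simply replace io_statistic().

```python
# Analytic IO for Gaussian SKE/BKE plus an empirical AUC estimate.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_trials = 64, 2000
signal = np.zeros(n_pix); signal[28:36] = 0.5    # known (illustrative) signal
cov = np.eye(n_pix)                               # known noise covariance
template = np.linalg.solve(cov, signal)           # prewhitened matched filter

def io_statistic(g):
    return g @ template                           # a CNN output would go here

absent  = rng.multivariate_normal(np.zeros(n_pix), cov, n_trials)
present = rng.multivariate_normal(signal, cov, n_trials)
t0, t1 = io_statistic(absent), io_statistic(present)

# Mann-Whitney estimator of the area under the ROC curve.
auc = ((t1[:, None] > t0[None, :]).mean()
       + 0.5 * (t1[:, None] == t0[None, :]).mean())
print(f"empirical AUC of the analytic IO: {auc:.3f}")
```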
Flodgren, G; Heiden, M; Lyskov, E; Crenshaw, A G
2007-03-01
In the present study, we assessed the wrist kinetics (range of motion, mean position, velocity and mean power frequency in radial/ulnar deviation, flexion/extension, and pronation/supination) associated with performing a mouse-operated computerized task involving painting rectangles on a computer screen. Furthermore, we evaluated the effects of the painting task on subjective perception of fatigue and wrist position sense. The results showed that the painting task required constrained wrist movements, and repetitive movements of about the same magnitude as those performed in mouse-operated design tasks. In addition, the painting task induced a perception of muscle fatigue in the upper extremity (Borg CR-scale: 3.5, p<0.001) and caused a reduction in the position sense accuracy of the wrist (error before: 4.6 degrees, error after: 5.6 degrees, p<0.05). This standardized painting task appears suitable for studying relevant risk factors, and therefore it offers a potential for investigating the pathophysiological mechanisms behind musculoskeletal disorders related to computer mouse use.
Functional Hypothalamic Amenorrhea: An Endocrine Society Clinical Practice Guideline.
Gordon, Catherine M; Ackerman, Kathryn E; Berga, Sarah L; Kaplan, Jay R; Mastorakos, George; Misra, Madhusmita; Murad, M Hassan; Santoro, Nanette F; Warren, Michelle P
2017-05-01
This guideline was cosponsored by the American Society for Reproductive Medicine, the European Society of Endocrinology, and the Pediatric Endocrine Society, and was funded by the Endocrine Society. Its objective was to formulate clinical practice guidelines for the diagnosis and treatment of functional hypothalamic amenorrhea (FHA). The participants included an Endocrine Society-appointed task force of eight experts, a methodologist, and a medical writer. This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation approach to describe the strength of recommendations and the quality of evidence. The task force commissioned two systematic reviews and used the best available evidence from other published systematic reviews and individual studies. One group meeting, several conference calls, and e-mail communications enabled consensus. Endocrine Society committees and members and cosponsoring organizations reviewed and commented on preliminary drafts of this guideline. FHA is a form of chronic anovulation, not due to identifiable organic causes, but often associated with stress, weight loss, excessive exercise, or a combination thereof. Investigations should include assessment of systemic and endocrinologic etiologies, as FHA is a diagnosis of exclusion. A multidisciplinary treatment approach is necessary, including medical, dietary, and mental health support. Medical complications include, among others, bone loss and infertility, and appropriate therapies are under debate and investigation. Copyright © 2017 Endocrine Society
A resource-sharing model based on a repeated game in fog computing.
Sun, Yan; Zhang, Nan
2017-03-01
With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing has been proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than those of a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters so that they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the service-level agreement (SLA) violation rate and accelerate the completion of tasks.
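The following toy loop illustrates the flavor of such an incentive mechanism (it is not the paper's algorithm): owners who complete assigned tasks within the SLA earn credit and reputation, violators lose both, and future tasks flow toward reliable owners. All parameters are invented.

```python
# Toy repeated-game incentive loop: reliable owners accumulate credit and
# attract more tasks; shirkers lose reputation and receive fewer tasks.
import random

random.seed(1)
owners = {name: {"rep": 1.0, "credit": 0.0} for name in "ABC"}
RELIABILITY = {"A": 0.95, "B": 0.6, "C": 0.8}  # chance an owner finishes a task

for _ in range(200):
    # Assign the task to an owner with probability proportional to reputation.
    names = list(owners)
    weights = [owners[n]["rep"] for n in names]
    n = random.choices(names, weights=weights)[0]
    if random.random() < RELIABILITY[n]:       # task completed within SLA
        owners[n]["credit"] += 1.0
        owners[n]["rep"] = min(2.0, owners[n]["rep"] + 0.05)
    else:                                      # SLA violation
        owners[n]["credit"] -= 0.5
        owners[n]["rep"] = max(0.1, owners[n]["rep"] - 0.2)

for n, s in owners.items():
    print(n, f"reputation={s['rep']:.2f} credit={s['credit']:.1f}")
```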
van Holst, Ruth J; Lemmens, Jeroen S; Valkenburg, Patti M; Peter, Jochen; Veltman, Dick J; Goudriaan, Anna E
2012-06-01
The aim of this study was to examine whether behavioral tendencies commonly related to addictive behaviors are also related to problematic computer and video game playing in adolescents. The study of attentional bias and response inhibition, characteristic for addictive disorders, is relevant to the ongoing discussion on whether problematic gaming should be classified as an addictive disorder. We tested the relation between self-reported levels of problem gaming and two behavioral domains: attentional bias and response inhibition. Ninety-two male adolescents performed two attentional bias tasks (addiction-Stroop, dot-probe) and a behavioral inhibition task (go/no-go). Self-reported problem gaming was measured by the game addiction scale, based on the Diagnostic and Statistical Manual of Mental Disorders-fourth edition criteria for pathological gambling and time spent on computer and/or video games. Male adolescents with higher levels of self-reported problem gaming displayed signs of error-related attentional bias to game cues. Higher levels of problem gaming were also related to more errors on response inhibition, but only when game cues were presented. These findings are in line with the findings of attentional bias reported in clinically recognized addictive disorders, such as substance dependence and pathological gambling, and contribute to the discussion on the proposed concept of "Addiction and Related Disorders" (which may include non-substance-related addictive behaviors) in the Diagnostic and Statistical Manual of Mental Disorders-fourth edition. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Amiryousefi, Mohammad
2017-01-01
The current study aimed at investigating the effects of three types of prewriting planning conditions, namely teacher-monitored collaborative planning (TMCP), student-led collaborative planning (SLCP), and individual planning (IP) on EFL learners' computer-mediated L2 written production and learning transfer from a pedagogic task to a new task of…
A queueing model of pilot decision making in a multi-task flight management situation
NASA Technical Reports Server (NTRS)
Walden, R. S.; Rouse, W. B.
1977-01-01
Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
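As a minimal illustration of the queueing-theoretic framing (not the paper's actual multi-task model), the sketch below simulates the pilot as a single M/M/1 server and checks the simulated mean time in system against the textbook value 1/(mu - lam).

```python
# Pilot-as-server skeleton: tasks arrive at rate lam, service at rate mu.
import random

random.seed(42)
lam, mu, n = 0.5, 1.0, 100_000
t_arrive = 0.0
server_free = 0.0
total_time = 0.0
for _ in range(n):
    t_arrive += random.expovariate(lam)        # Poisson arrivals
    start = max(t_arrive, server_free)         # wait if the pilot is busy
    server_free = start + random.expovariate(mu)
    total_time += server_free - t_arrive       # time in system for this task
print(f"simulated mean time in system: {total_time / n:.2f}")
print(f"M/M/1 prediction 1/(mu-lam):  {1 / (mu - lam):.2f}")
```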
Artificial Exo-Society Modeling: a New Tool for SETI Research
NASA Astrophysics Data System (ADS)
Gardner, James N.
2002-01-01
One of the newest fields of complexity research is artificial society modeling. Methodologically related to artificial life research, artificial society modeling utilizes agent-based computer simulation tools like SWARM and SUGARSCAPE developed by the Santa Fe Institute, Los Alamos National Laboratory and the Brookings Institution in an effort to introduce an unprecedented degree of rigor and quantitative sophistication into social science research. The broad aim of artificial society modeling is to begin the development of a more unified social science that embeds cultural evolutionary processes in a computational environment that simulates demographics, the transmission of culture, conflict, economics, disease, the emergence of groups and coadaptation with an environment in a bottom-up fashion. When an artificial society computer model is run, artificial societal patterns emerge from the interaction of autonomous software agents (the "inhabitants" of the artificial society). Artificial society modeling invites the interpretation of society as a distributed computational system and the interpretation of social dynamics as a specialized category of computation. Artificial society modeling techniques offer the potential of computational simulation of hypothetical alien societies in much the same way that artificial life modeling techniques offer the potential to model hypothetical exobiological phenomena. NASA recently announced its intention to begin exploring the possibility of including artificial life research within the broad portfolio of scientific fields encompassed by the interdisciplinary astrobiology research endeavor. It may be appropriate for SETI researchers to likewise commence an exploration of the possible inclusion of artificial exo-society modeling within the SETI research endeavor. Artificial exo-society modeling might be particularly useful in a post-detection environment by (1) coherently organizing the set of data points derived from a detected ETI signal, (2) mapping trends in the data points over time (assuming receipt of an extended ETI signal), and (3) projecting such trends forward to derive alternative cultural evolutionary scenarios for the exo-society under analysis. The latter exercise might be particularly useful to compensate for the inevitable time lag between generation of an ETI signal and receipt of an ETI signal on Earth. For this reason, such an exercise might be a helpful adjunct to the decisional process contemplated by Paragraph 9 of the Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence.
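A bare-bones example in the SUGARSCAPE spirit, with invented rules and constants, shows how societal patterns can emerge bottom-up from autonomous agents:

```python
# Agents move toward the richest neighboring cell, harvest it, pay a
# metabolism cost, and die when their stores run out.
import random

random.seed(0)
SIZE = 20
sugar = [[random.randint(0, 4) for _ in range(SIZE)] for _ in range(SIZE)]
agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE), "store": 5}
          for _ in range(30)]

for step in range(50):
    for a in list(agents):
        # Look at the four neighbors and move to the sweetest cell.
        moves = [((a["x"] + dx) % SIZE, (a["y"] + dy) % SIZE)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        a["x"], a["y"] = max(moves, key=lambda p: sugar[p[0]][p[1]])
        a["store"] += sugar[a["x"]][a["y"]] - 1   # harvest, pay metabolism
        sugar[a["x"]][a["y"]] = 0
        if a["store"] <= 0:
            agents.remove(a)                      # starvation
print(f"agents surviving after 50 steps: {len(agents)}")
```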
NASA Astrophysics Data System (ADS)
Peterson, Karl
Since the discovery in the late 1930s that air entrainment can improve the durability of concrete, it has been important for people to know the quantity, spatial distribution, and size distribution of the air-voids in their concrete mixes in order to ensure a durable final product. The task of air-void system characterization has fallen on the microscopist, who, according to a standard test method set forth by the American Society for Testing and Materials, must meticulously count or measure about a thousand air-voids per sample as exposed on a cut and polished cross-section of concrete. The equipment used to perform this task has traditionally included a stereomicroscope, a mechanical stage, and a tally counter. Over the past 30 years, with the availability of computers and digital imaging, automated methods have been introduced to perform the same task, but using the same basic equipment. The method described here replaces the microscope and mechanical stage with an ordinary flatbed desktop scanner, and replaces the microscopist and tally counter with a personal computer; two pieces of equipment much more readily available than a microscope with a mechanical stage, and certainly easier to find than a person willing to sit for extended periods of time counting air-voids. Most laboratories that perform air-void system characterization typically have cabinets full of prepared samples with corresponding results from manual operators. Proponents of automated methods often take advantage of this fact by analyzing the same samples and comparing the results. A similar iterative approach is described here, where scanned images collected from a significant number of samples are analyzed, the results compared to those of the manual operator, and the settings optimized to best approximate the results of the manual operator. The results of this calibration procedure are compared to an alternative calibration procedure based on the more rigorous digital image accuracy assessment methods employed primarily by the remote sensing/satellite imaging community.
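The automated counting step can be sketched as a threshold-and-label pass over a scanned image; the threshold constant below stands in for the calibration against manual operators that the abstract describes, and the synthetic image is purely illustrative.

```python
# Threshold a grayscale scan so (contrast-enhanced) voids become foreground,
# then label connected components and report counts/areas.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
image = rng.normal(60, 10, size=(200, 200))      # dark paste/aggregate
for _ in range(40):                              # paint bright circular "voids"
    cy, cx, r = rng.integers(10, 190), rng.integers(10, 190), rng.integers(2, 6)
    yy, xx = np.ogrid[:200, :200]
    image[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 220

THRESHOLD = 150                                  # would be tuned by calibration
labels, n_voids = ndimage.label(image > THRESHOLD)
areas = ndimage.sum(np.ones_like(image), labels, range(1, n_voids + 1))
print(f"{n_voids} voids, mean area {areas.mean():.1f} px")
```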
NASA Astrophysics Data System (ADS)
Baiotti, Luca; Takabe, Hideaki
2013-08-01
The PDF contains the speech by journalist Atsuko Tsuji (Asahi Shimbun), titled 'Requests and expectations for computational science', and the record of the discussion that followed: 'Will computational science be able to provide answers to important problems of human society?'
Charlton, Bruce G
2006-01-01
Although 'hard work' and 'busyness' are somewhat similar terms, there seem to be significant differences in the way that they are used. While hard work has always been a feature of complex societies, modern society can be seen as evolving toward being dominated by jobs characterized by busyness. Busyness refers to multi-tasking: having many sequential jobs to perform and switching frequently between them on an externally imposed schedule. Traditionally, the individual gifts of a successful scientist were mainly in terms of knowledge, theoretical or technical aptitude. But nowadays the successful scientist is often one who has been promoted from hard work to busyness: an expert in synthesizing a sufficient degree of scientific ability with a broad range of managerial and political skills. It is psychologically tough to be busy, because busyness is a consequence of human beings being constrained by the functioning of abstract social systems. In a complex modern organization, individual psychology is subordinated to inflexible programs of being in specific places at specific times doing specific things; this is both tricky to do well and demanding to do at all. Since people are paid (mainly) to do difficult but necessary things they would prefer not to do, busyness has become a major reason why people are paid a premium salary. In the long term, many straightforward jobs will be analyzed and routinized out of existence, with the narrowly skilled worker being replaced by teams, machines or computers. But busy jobs are hard to eliminate because they are those in which it is optimal for a variety of disparate and unpredictable tasks to be done by a single person. Consequently, those individuals who can cope with, even thrive upon, busyness are becoming indispensable. In future 'the busy shall inherit the earth' (or, at least, the most powerful and highest paid jobs), not just in science but in all major social domains.
Douglas, Pamela S; Garcia, Mario J; Haines, David E; Lai, Wyman W; Manning, Warren J; Patel, Ayan R; Picard, Michael H; Polk, Donna M; Ragosta, Michael; Parker Ward, R; Weiner, Rory B
2011-03-01
The American College of Cardiology Foundation (ACCF), in partnership with the American Society of Echocardiography (ASE) and along with key specialty and subspecialty societies, conducted a review of common clinical scenarios where echocardiography is frequently considered. This document combines and updates the original transthoracic and transesophageal echocardiography appropriateness criteria published in 2007 (1) and the original stress echocardiography appropriateness criteria published in 2008 (2). This revision reflects new clinical data, reflects changes in test utilization patterns, and clarifies echocardiography use where omissions or lack of clarity existed in the original criteria. The indications (clinical scenarios) were derived from common applications or anticipated uses, as well as from current clinical practice guidelines and results of studies examining the implementation of the original appropriate use criteria (AUC). The 202 indications in this document were developed by a diverse writing group and scored by a separate independent technical panel on a scale of 1 to 9, to designate appropriate use (median 7 to 9), uncertain use (median 4 to 6), and inappropriate use (median 1 to 3). Ninety-seven indications were rated as appropriate, 34 were rated as uncertain, and 71 were rated as inappropriate. In general, the use of echocardiography for initial diagnosis when there is a change in clinical status or when the results of the echocardiogram are anticipated to change patient management was rated appropriate. Routine testing when there was no change in clinical status or when results of testing were unlikely to modify management was more likely to be inappropriate than appropriate/uncertain. The AUC for echocardiography have the potential to impact physician decision making, healthcare delivery, and reimbursement policy. Furthermore, recognition of uncertain clinical scenarios facilitates identification of areas that would benefit from future research.
The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis
Rampp, Markus; Soddemann, Thomas; Lederer, Hermann
2006-01-01
We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980
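The chaining idea, each tool consuming the previous tool's output with the driver handling all plumbing, can be sketched as follows; the three "tools" here are invented stand-ins, not MIGenAS components.

```python
# Toy pipeline driver: the output of one step feeds the next, so the user
# never touches intermediate formats.
def similarity_search(seq):
    return [seq, seq[::-1]]               # pretend hits from a database search

def align(seqs):
    width = max(len(s) for s in seqs)
    return [s.ljust(width, "-") for s in seqs]   # pretend multiple alignment

def phylogeny(alignment):
    return f"({','.join(alignment)});"    # pretend Newick-style tree

def run_pipeline(seq, steps):
    data = seq
    for step in steps:
        data = step(data)
    return data

print(run_pipeline("ACGTAC", [similarity_search, align, phylogeny]))
```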
A simple tool for stereological assessment of digital images: the STEPanizer.
Tschanz, S A; Burri, P H; Weibel, E R
2011-07-01
STEPanizer is an easy-to-use computer-based software tool for the stereological assessment of digitally captured images from all kinds of microscopical (LM, TEM, LSM) and macroscopical (radiology, tomography) imaging modalities. The program design focuses on providing the user with a defined workflow adapted to most basic stereological tasks. The software is compact, that is, user-friendly without being bulky. STEPanizer comprises the creation of test systems, the appropriate display of digital images with superimposed test systems, a scaling facility, a counting module and an export function for the transfer of results to spreadsheet programs. Here we describe the major workflow of the tool, illustrating the application with two examples from transmission electron microscopy and light microscopy, respectively. © 2011 The Authors. Journal of Microscopy © 2011 Royal Microscopical Society.
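A classic piece of the bookkeeping such a tool automates is point counting: a regular grid of test points is superimposed on the image, and the fraction of points hitting the structure estimates its volume fraction (the Delesse relation, V_V = P_P). A hedged sketch on a synthetic binary image:

```python
# Point-counting estimate of volume fraction on a synthetic binary image.
import numpy as np

rng = np.random.default_rng(1)
image = rng.random((400, 400)) < 0.3      # True where the structure of interest is

GRID = 25                                  # test-point grid spacing in pixels
test_points = image[::GRID, ::GRID]        # sample on a regular lattice
p_hit = test_points.mean()
print(f"points hitting structure: {test_points.sum()} / {test_points.size}")
print(f"estimated volume fraction: {p_hit:.2f} (true: {image.mean():.2f})")
```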
Customization of user interfaces to reduce errors and enhance user acceptance.
Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram
2014-03-01
Customization is assumed to reduce error and increase user acceptance in the human-machine relation. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured user interface in the subsequent test, whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance for the RG compared with the CG, while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration seems to be promising and therefore warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
ERIC Educational Resources Information Center
Liew, Tze Wei; Tan, Su-Mae; Seydali, Rouzbeh
2014-01-01
In this article, the effects of personalized narration in multimedia learning on learners' computer perceptions and task-related attitudes were examined. Twenty-six field independent and 22 field dependent participants studied the computer-based multimedia lessons on C-Programming, either with personalized narration or non-personalized narration.…
The use of analytical models in human-computer interface design
NASA Technical Reports Server (NTRS)
Gugerty, Leo
1993-01-01
Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
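One concrete instance of such an analytical model is the keystroke-level variant of GOMS, which predicts expert task time by summing standard operator times. The sketch below uses commonly cited approximate operator values and an invented task breakdown:

```python
# Keystroke-level model (KLM) sketch: sum primitive operator times to
# predict expert execution time. Operator values are approximate defaults.
KLM_SECONDS = {
    "K": 0.28,   # keystroke (average typist)
    "P": 1.10,   # point with mouse
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def predict_time(operators):
    """Predict expert task time from a string of KLM operators."""
    return sum(KLM_SECONDS[op] for op in operators)

# e.g. "delete a file": think, point at icon, click, think, press Delete key
task = "MPKMHK"
print(f"predicted time: {predict_time(task):.2f} s")
```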
Characterizing quantum supremacy in near-term devices
NASA Astrophysics Data System (ADS)
Boixo, Sergio; Isakov, Sergei V.; Smelyanskiy, Vadim N.; Babbush, Ryan; Ding, Nan; Jiang, Zhang; Bremner, Michael J.; Martinis, John M.; Neven, Hartmut
2018-06-01
A critical question for quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of supercomputers. Such a demonstration of what is referred to as quantum supremacy requires a reliable evaluation of the resources required to solve tasks with classical approaches. Here, we propose the task of sampling from the output distribution of random quantum circuits as a demonstration of quantum supremacy. We extend previous results in computational complexity to argue that this sampling task must take exponential time in a classical computer. We introduce cross-entropy benchmarking to obtain the experimental fidelity of complex multiqubit dynamics. This can be estimated and extrapolated to give a success metric for a quantum supremacy demonstration. We study the computational cost of relevant classical algorithms and conclude that quantum supremacy can be achieved with circuits in a two-dimensional lattice of 7 × 7 qubits and around 40 clock cycles. This requires an error rate of around 0.5% for two-qubit gates (0.05% for one-qubit gates), and it would demonstrate the basic building blocks for a fault-tolerant quantum computer.
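The benchmarking idea can be illustrated with the linear cross-entropy estimator (one common variant, not necessarily the paper's exact statistic): F = 2^n * mean(p_ideal of the sampled bitstrings) - 1, which approaches 1 for a perfect sampler and 0 for uniform noise. The "ideal" distribution below is a Porter-Thomas-like stand-in rather than a simulated circuit.

```python
# Toy cross-entropy benchmarking with the linear-XEB estimator.
import numpy as np

rng = np.random.default_rng(7)
n_qubits = 10
dim = 2 ** n_qubits

# Stand-in for |<x|psi>|^2: exponentially distributed (Porter-Thomas-like).
p_ideal = rng.exponential(1.0, dim)
p_ideal /= p_ideal.sum()

def xeb(samples):
    return dim * p_ideal[samples].mean() - 1

good = rng.choice(dim, size=50_000, p=p_ideal)     # "quantum device" samples
noise = rng.integers(0, dim, size=50_000)          # depolarized/uniform samples
print(f"XEB, ideal sampler:   {xeb(good):.2f}")    # ~1
print(f"XEB, uniform sampler: {xeb(noise):.2f}")   # ~0
```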
Application of a fast skyline computation algorithm for serendipitous searching problems
NASA Astrophysics Data System (ADS)
Koizumi, Kenichi; Hiraki, Kei; Inaba, Mary
2018-02-01
Skyline computation is a method of extracting interesting entries from a large population with multiple attributes. These entries, called skyline or Pareto optimal entries, are known to have extreme characteristics that cannot be found by outlier detection methods. Skyline computation is an important task for characterizing large amounts of data and selecting interesting entries with extreme features. When the population changes dynamically, the task of calculating a sequence of skyline sets is called continuous skyline computation. This task is known to be difficult to perform for the following reasons: (1) information of non-skyline entries must be stored since they may join the skyline in the future; (2) the appearance or disappearance of even a single entry can change the skyline drastically; (3) it is difficult to adopt a geometric acceleration algorithm for skyline computation tasks with high-dimensional datasets. Our new algorithm, called jointed rooted-tree (JR-tree), manages entries using a rooted tree structure. JR-tree delays extending the tree to deep levels to accelerate tree construction and traversal. In this study, we presented the difficulties in extracting entries tagged with a rare label in high-dimensional space and the potential of fast skyline computation in low-latency cell identification technology.
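The definition at the core of all of this fits in a few lines; the brute-force O(n^2) scan below is exactly what structures like the JR-tree exist to avoid, and is shown only to pin down what "skyline" means (larger taken as better on every attribute):

```python
# Brute-force skyline: an entry survives if no other entry dominates it.
def dominates(a, b):
    """True if a dominates b: at least as good everywhere, better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def skyline(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

data = [(5, 3), (4, 4), (2, 5), (3, 3), (5, 1)]
print(skyline(data))   # -> [(5, 3), (4, 4), (2, 5)]
```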
Tomorrow. The Report of the Task Force for the Study of Chemistry Education in the United States.
ERIC Educational Resources Information Center
American Chemical Society, Washington, DC.
An American Chemical Society (ACS) task force was charged to examine the state of chemistry education in the United States and to make recommendations in light of its findings. This document presents the task force's report and 39 major (and also secondary) recommendations. These recommendations, with accompanying discussions, focus on: (1)…
NASA Astrophysics Data System (ADS)
Wang, Li-Qun; Saito, Masao
We used 1.5T functional magnetic resonance imaging (fMRI) to explore which brain areas contribute uniquely to numeric computation. The BOLD activation pattern of a mental arithmetic task (successive subtraction: an actual calculation task) was compared with the response to a multiplication-table repetition task (a rote verbal arithmetic memory task). The activation found in the right parietal lobule during the mental arithmetic task suggested that quantitative cognition or numeric computation may require the assistance of sensory conversion, such as spatial imagination and spatial sensory conversion. In addition, this mechanism may be an 'analog algorithm' in simple mental arithmetic processing.
Alsafi, Z; Hameed, Y; Amin, P; Shamsad, S; Raja, U; Alsafi, A; Hamady, M S
2017-09-01
To investigate the effect of playing computer games and manual dexterity on catheter-wire manipulation in a mechanical aortic model. Medical student volunteers filled in a preprocedure questionnaire assessing their exposure to computer games. Their manual dexterity was measured using a smartphone game. They were then shown a video clip demonstrating renal artery cannulation and were asked to reproduce this. All attempts were timed. Two-tailed Student's t-test was used to compare continuous data, while Fisher's exact test was used for categorical data. Fifty students aged 18-22 years took part in the study. Forty-six completed the task at an average of 168 seconds (range 103-301 seconds). There was no significant difference in the dexterity score or time to cannulate the renal artery between male and female students. Students who played computer games for >10 hours per week had better dexterity scores than those who did not play computer games: 9.1 versus 10.2 seconds (p=0.0237). Four of 19 students who did not play computer games failed to complete the task, while all of those who played computer games regularly completed the task (p=0.0168). Playing computer games is associated with better manual dexterity and ability to complete a basic interventional radiology task for novices. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
29 CFR 541.707 - Occasional tasks.
Code of Federal Regulations, 2013 CFR
2013-07-01
... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...
29 CFR 541.707 - Occasional tasks.
Code of Federal Regulations, 2014 CFR
2014-07-01
... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...
29 CFR 541.707 - Occasional tasks.
Code of Federal Regulations, 2012 CFR
2012-07-01
... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...
29 CFR 541.707 - Occasional tasks.
Code of Federal Regulations, 2011 CFR
2011-07-01
... DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE SALES EMPLOYEES Definitions and Miscellaneous Provisions § 541.707 Occasional tasks. Occasional, infrequently recurring tasks...
Dual-Arm Generalized Compliant Motion With Shared Control
NASA Technical Reports Server (NTRS)
Backes, Paul G.
1994-01-01
Dual-Arm Generalized Compliant Motion (DAGCM) primitive computer program implementing improved unified control scheme for two manipulator arms cooperating in task in which both grasp same object. Provides capabilities for autonomous, teleoperation, and shared control of two robot arms. Unifies cooperative dual-arm control with multi-sensor-based task control and makes complete task-control capability available to higher-level task-planning computer system via large set of input parameters used to describe desired force and position trajectories followed by manipulator arms. Some concepts discussed in "A Generalized-Compliant-Motion Primitive" (NPO-18134).
Numerical Study of Boundary-Layer in Aerodynamics
NASA Technical Reports Server (NTRS)
Shih, Tom I-P.
1997-01-01
The accomplishments made in the following three tasks are described: (1) The first task was to study shock-wave boundary-layer interactions with bleed - this study is relevant to boundary-layer control in external and mixed-compression inlets of supersonic aircraft; (2) The second task was to test RAAKE, a code developed for computing turbulence quantities; and (3) The third task was to compute flow around the Ames ER-2 aircraft that has been retrofitted with containers over its wings and fuselage. The appendices include two reports submitted to AIAA for publication.
Dynamically allocating sets of fine-grained processors to running computations
NASA Technical Reports Server (NTRS)
Middleton, David
1988-01-01
Researchers explore an approach to using general purpose parallel computers which involves mapping hardware resources onto computations instead of mapping computations onto hardware. Problems such as processor allocation, task scheduling and load balancing, which have traditionally proven to be challenging, change significantly under this approach and may become amenable to new attacks. Researchers describe the implementation of this approach used by the FFP Machine whose computation and communication resources are repeatedly partitioned into disjoint groups that match the needs of available tasks from moment to moment. Several consequences of this system are examined.
Image Processing and Computer Aided Diagnosis in Computed Tomography of the Breast
2007-03-01
Keywords: breast imaging, breast CT, scatter compensation, denoising, CAD, cone-beam CT. ... clinical projection images. The CAD tool based on the signal-known-exactly (SKE) scenario is under development. Task 6: Test and compare the performances of the CAD developed in Task 5 applied to processed projection data from Task 1 with the CAD performance on the projection data without Bayesian
ERIC Educational Resources Information Center
Li, Jinrong
2012-01-01
The dissertation examines how synchronous text-based computer-mediated communication (SCMC) tasks may affect English as a Second Language (ESL) learners' development of second language (L2) and academic literacy. The study is motivated by two issues concerning the use of SCMC tasks in L2 writing classes. First, although some of the alleged…
Design and Analysis of Self-Adapted Task Scheduling Strategies in Wireless Sensor Networks
Guo, Wenzhong; Xiong, Naixue; Chao, Han-Chieh; Hussain, Sajid; Chen, Guolong
2011-01-01
In a wireless sensor network (WSN), the usage of resources is usually highly related to the execution of tasks which consume a certain amount of computing and communication bandwidth. Parallel processing among sensors is a promising solution to provide the demanded computation capacity in WSNs. Task allocation and scheduling is a typical problem in the area of high performance computing. Although task allocation and scheduling in wired processor networks have been well studied in the past, their counterparts for WSNs remain largely unexplored. Existing traditional high performance computing solutions cannot be directly implemented in WSNs due to the limitations of WSNs such as limited resource availability and the shared communication medium. In this paper, a self-adapted task scheduling strategy for WSNs is presented. First, a multi-agent-based architecture for WSNs is proposed and a mathematical model of dynamic alliance is constructed for the task allocation problem. Then an effective discrete particle swarm optimization (PSO) algorithm for the dynamic alliance (DPSO-DA) with a well-designed particle position code and fitness function is proposed. A mutation operator which can effectively improve the algorithm's ability of global search and population diversity is also introduced in this algorithm. Finally, the simulation results show that the proposed solution can achieve significantly better performance than other algorithms. PMID:22163971
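A stripped-down discrete PSO for task-to-sensor assignment conveys the flavor of DPSO-DA (the paper's position coding, fitness function and mutation operator are more elaborate); fitness here is simply the heaviest sensor's load:

```python
# Discrete PSO sketch: positions are assignment vectors; "velocity" is a
# probabilistic drift toward personal/global bests, plus mutation.
import random

random.seed(5)
N_TASKS, N_SENSORS, SWARM, ITERS = 20, 5, 12, 80
cost = [random.randint(1, 9) for _ in range(N_TASKS)]

def fitness(pos):
    load = [0] * N_SENSORS
    for t, s in enumerate(pos):
        load[s] += cost[t]
    return max(load)                      # minimize the heaviest sensor's load

swarm = [[random.randrange(N_SENSORS) for _ in range(N_TASKS)] for _ in range(SWARM)]
pbest = [p[:] for p in swarm]
gbest = min(pbest, key=fitness)[:]

for _ in range(ITERS):
    for i, pos in enumerate(swarm):
        for t in range(N_TASKS):
            r = random.random()
            if r < 0.4:
                pos[t] = pbest[i][t]
            elif r < 0.8:
                pos[t] = gbest[t]
            elif r < 0.9:
                pos[t] = random.randrange(N_SENSORS)  # mutation keeps diversity
        if fitness(pos) < fitness(pbest[i]):
            pbest[i] = pos[:]
            if fitness(pos) < fitness(gbest):
                gbest = pos[:]
print(f"best max-load found: {fitness(gbest)} (total work {sum(cost)})")
```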
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Koga, Dennis (Technical Monitor)
2000-01-01
In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there can be no physical computer that, for any physical system external to it, takes the specification of that external system's state as input and then correctly predicts its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation ("physical computation") is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.
Amador-Vargas, Sabrina; Gronenberg, Wulfila; Wcislo, William T.; Mueller, Ulrich
2015-01-01
Group size in both multicellular organisms and animal societies can correlate with the degree of division of labour. For ants, the task specialization hypothesis (TSH) proposes that increased behavioural specialization enabled by larger group size corresponds to anatomical specialization of worker brains. Alternatively, the social brain hypothesis proposes that increased levels of social stimuli in larger colonies lead to enlarged brain regions in all workers, regardless of their task specialization. We tested these hypotheses in acacia ants (Pseudomyrmex spinicola), which exhibit behavioural but not morphological task specialization. In wild colonies, we marked, followed and tested ant workers involved in foraging tasks on the leaves (leaf-ants) and in defensive tasks on the host tree trunk (trunk-ants). Task specialization increased with colony size, especially in defensive tasks. The relationship between colony size and brain region volume was task-dependent, supporting the TSH. Specifically, as colony size increased, the relative size of regions within the mushroom bodies of the brain decreased in trunk-ants but increased in leaf-ants; those regions play important roles in learning and memory. Our findings suggest that workers specialized in defence may have reduced learning abilities relative to leaf-ants; these inferences remain to be tested. In societies with monomorphic workers, brain polymorphism enhanced by group size could be a mechanism by which division of labour is achieved. PMID:25567649
Computer control improves ethylene plant operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehead, B.D.; Parnis, M.
ICI Australia ordered a turnkey 250,000-tpy ethylene plant to be built at the Botany site, Sydney, Australia. Following a feasibility study, an additional order was placed for a process computer system for advanced process control and optimization. This article gives a broad outline of the process computer tasks, how the tasks were implemented, what problems were met, what lessons were learned and what results were achieved.
ERIC Educational Resources Information Center
Tipton, Charles M.
2013-01-01
Society members whose research publications during the past 125 years had an important impact on the discipline of physiology were featured at the American Physiological Society (APS)'s 125th Anniversary symposium. The daunting and challenging task of identifying and selecting significant publications was assumed by the Steering Committee of the…
1983-08-18
When planting transplants, holes dug by backhoe are superior to auger-bored holes (Ribera and Sue 1978). The combination of nonglazed roughened…
ERIC Educational Resources Information Center
Kim, Kyung Hi
Korean society is in the midst of a conflict between modern and postmodern condition. The concept of modernity is rooted in the Enlightenment, which valued reason and proposed the rational and progressive construction and transformation of society and reality. As a result of a rational differentiation between culture and society, modern phenomena…
ISCB: past-present perspective for the International Society for Computational Biology.
Rost, Burkhard
2014-01-01
Since its establishment in 1997, the International Society for Computational Biology (ISCB) has contributed importantly toward advancing the understanding of living systems through computation. The ISCB represents nearly 3000 members working in >70 countries. It has doubled the number of members since 2007. At the same time, the number of meetings organized by the ISCB has increased from two in 2007 to eight in 2013, and the society has cemented many lasting alliances with regional societies and specialist groups. The ISCB is ready to grow into a challenging and promising future. The progress over the past 7 years has resulted from the vision and, possibly more importantly, the passion and hard-working dedication of many individuals.
Abdullahi, Mohammed; Ngadi, Md Asri
2016-01-01
Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploration ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduced makespan and degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan. PMID:27348127
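The simulated-annealing ingredient can be shown in isolation: a worse neighboring schedule is still accepted with probability exp(-delta/T), which is the local-search ability grafted onto SOS. The sketch below applies that acceptance rule to a makespan objective over identical VMs; all constants are illustrative, and the full SASOS hybrid is not reproduced.

```python
# SA acceptance rule on a task-to-VM schedule, makespan as fitness.
import math
import random

random.seed(2)
tasks = [random.randint(1, 20) for _ in range(30)]
N_VMS = 4

def makespan(assign):
    load = [0] * N_VMS
    for t, v in enumerate(assign):
        load[v] += tasks[t]
    return max(load)

current = [random.randrange(N_VMS) for _ in tasks]
best, T = current[:], 50.0
while T > 0.1:
    cand = current[:]
    cand[random.randrange(len(tasks))] = random.randrange(N_VMS)  # move one task
    delta = makespan(cand) - makespan(current)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        current = cand                      # accept, possibly uphill
        if makespan(current) < makespan(best):
            best = current[:]
    T *= 0.99                               # geometric cooling
print(f"best makespan: {makespan(best)}, lower bound {sum(tasks) / N_VMS:.1f}")
```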
Egami, Chiyomi; Yamashita, Yushiro; Tada, Yasuhiro; Anai, Chiduru; Mukasa, Akiko; Yuge, Kotaro; Nagamitsu, Shinichiro; Matsuishi, Toyojiro
2015-10-01
The aim of this study was to investigate the developmental trajectories of attention, short-term memory, and working memory in school-aged children using a 10 min test battery of cognitive function. Participants comprised 144 typically developing children (TDC) aged 7-12 years and 24 healthy adults, divided according to age into seven groups (12 males and 12 females for each age group). Participants were assessed using CogHealth, a computer-based measure composed of five tasks. We measured attention, short-term memory, and working memory (WM) with visual stimulation. Each task was analyzed for age-related differences in reaction time and accuracy rate. Reaction times on the attention tasks became faster in stages between 7 and 10 years of age. Accuracy on the short-term memory task increased gradually up to 12 years of age and continued to increase until 22 years of age. Accuracy on the working memory task increased until 12 years of age. Correlations were found between age and reaction time, and between age and accuracy rate on the tasks. These results indicate that there were rapid improvements in attention, short-term memory, and WM performance between 7 and 10 years of age, followed by gradual improvement until 12 years of age. Improvement in short-term memory continued until 22 years of age. In our experience, CogHealth was an easy and useful measure for the evaluation of cognitive function in school-aged children. Copyright © 2015 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
Dixon, Benjamin J; Chan, Harley; Daly, Michael J; Qiu, Jimmy; Vescan, Allan; Witterick, Ian J; Irish, Jonathan C
2016-07-01
Providing image guidance in a 3-dimensional (3D) format, visually more in keeping with the operative field, could potentially reduce workload and lead to faster and more accurate navigation. We wished to assess a 3D virtual-view surgical navigation prototype in comparison to a traditional 2D system. Thirty-seven otolaryngology surgeons and trainees completed a randomized crossover navigation exercise on a cadaver model. Each subject identified three sinonasal landmarks with 3D virtual (3DV) image guidance and three landmarks with conventional cross-sectional computed tomography (CT) image guidance. Subjects were randomized with regard to which side and display type was tested initially. Accuracy, task completion time, and task workload were recorded. Display type did not influence accuracy (P > 0.2) or efficiency (P > 0.3) for any of the six landmarks investigated. Pooled landmark data revealed a trend of improved accuracy in the 3DV group by 0.44 millimeters (95% confidence interval [0.00-0.88]). High-volume surgeons were significantly faster (P < 0.01) and had reduced workload scores in all domains (P < 0.01), but they were no more accurate (P > 0.28). Real-time 3D image guidance did not influence accuracy, efficiency, or task workload when compared to conventional triplanar image guidance. The subtle pooled accuracy advantage for the 3DV view is unlikely to be of clinical significance. Experience level was strongly correlated to task completion time and workload but did not influence accuracy. N/A. Laryngoscope, 126:1510-1515, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
NASA Technical Reports Server (NTRS)
Hu, Chaumin
2007-01-01
IPG Execution Service is a framework that reliably executes complex jobs on a computational grid, and is part of the IPG service architecture designed to support location-independent computing. The new grid service enables users to describe the platform on which they need a job to run, which allows the service to locate the desired platform, configure it for the required application, and execute the job. After a job is submitted, users can monitor it through periodic notifications, or through queries. Each job consists of a set of tasks that performs actions such as executing applications and managing data. Each task is executed based on a starting condition that is an expression of the states of other tasks. This formulation allows tasks to be executed in parallel, and also allows a user to specify tasks to execute when other tasks succeed, fail, or are canceled. The two core components of the Execution Service are the Task Database, which stores tasks that have been submitted for execution, and the Task Manager, which executes tasks in the proper order, based on the user-specified starting conditions, and avoids overloading local and remote resources while executing tasks.
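The starting-condition mechanism can be sketched as a predicate over the states of other tasks, with the manager repeatedly starting whatever has become eligible; the states and condition API below are invented for illustration, not the IPG interfaces.

```python
# Tasks declare a starting condition over other tasks' states; the manager
# repeatedly runs whatever has become eligible (success, failure, or cleanup
# paths all expressible the same way).
SUCCEEDED, FAILED, PENDING = "succeeded", "failed", "pending"

tasks = {
    "stage_data": {"when": lambda s: True,                         "ok": True},
    "run_model":  {"when": lambda s: s["stage_data"] == SUCCEEDED, "ok": False},
    "archive":    {"when": lambda s: s["run_model"] == SUCCEEDED,  "ok": True},
    "cleanup":    {"when": lambda s: s["run_model"] == FAILED,     "ok": True},
}
state = {name: PENDING for name in tasks}

progressed = True
while progressed:
    progressed = False
    for name, t in tasks.items():
        if state[name] == PENDING and t["when"](state):
            state[name] = SUCCEEDED if t["ok"] else FAILED   # "execute" the task
            progressed = True
print(state)  # run_model fails, so cleanup runs and archive stays pending
```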
NASA Technical Reports Server (NTRS)
Johannes, J. D.
1974-01-01
Techniques, methods, and system requirements are reported for an onboard computerized communications system that provides on-line computing capability during manned space exploration. Communications between man and computer take place by sequential execution of each discrete step of a procedure, by interactive progression through a tree-type structure to initiate tasks or by interactive optimization of a task requiring man to furnish a set of parameters. Effective communication between astronaut and computer utilizes structured vocabulary techniques and a word recognition system.
Goldfarb, Charles A; Lee, W P Andrew; Briskey, Dawn; Higgins, James P
2014-02-01
A task force for the American Society for Surgery of the Hand (ASSH) recently investigated the practice patterns, board certification, subspecialty certification status, and ASSH membership of hand surgeons after completion of fellowship training. A total of 37% of the fellowship graduates from 2000 to 2006 had not attained subspecialty certification for a variety of reasons. A smaller group of fellowship graduates obtained the subspecialty certification but had not become Active Members of the ASSH. Efforts to strengthen the hand surgeon community and best serve our patients should focus on evolving patterns in post fellowship choices that reflect practice type choices and generational changes. Copyright © 2014 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Task transfer: another pressure for evolution of the medical profession.
Van Der Weyden, Martin B
2006-07-03
Since the 1960s, Australian society and the medical profession have undergone enormous change. Our society has moved from a relatively homogeneous and conservative community, supported by limited government services, to one that is multicultural, focused on the individual and consumerism, and supported by extensive government programs, with health care a top public and political priority. A defining feature of contemporary society is its mistrust of institutions, professionals, public servants and politicians. The medical profession has changed from a cohesive entity, valuing generalism and with limited specialisation, to one splintered by ultra-specialisation and competing professional agendas. The medical workforce shortage and efforts to maintain the safety and quality of health services are putting acute pressure on the profession. Task transfer or role substitution of medical services is mooted as a potential solution to this pressure. This has the potential to drastically transform the profession. How task transfer will evolve and change medicine depends on the vision and leadership of the profession and a flexible pragmatism that safeguards quality and safety and places patient priorities above those of the profession.
CREASE 6.0 Catalog of Resources for Education in Ada and Software Engineering
1992-02-01
Concepts: Programming, Software Engineering, Strong Typing, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J. Programming in Ada, 3rd ed. Addison-Wesley. ... Ada. Concepts: Abstract Data Types, Management Overview, Package, Real-Time Programming, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J.
Bukowski, Henryk; Hietanen, Jari K; Samson, Dana
2015-09-14
Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations.
Climate@Home: Crowdsourcing Climate Change Research
NASA Astrophysics Data System (ADS)
Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.
2011-12-01
Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive amounts of personal computers to perform atomized data processing tasks, investments in new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science aspect, technology aspect, and educational outreach aspect. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
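The crowdsourced loop (atomize a model run into work units, hand them to whichever volunteer client asks next, aggregate results on the server) can be sketched as follows; redundancy, credit accounting and the real climate model are omitted, and all names are invented.

```python
# Toy server-side work-unit queue for volunteer computing.
from collections import deque

work_units = deque({"unit": i, "params": {"cell": i}} for i in range(6))
results = {}

def client_requests_work():
    return work_units.popleft() if work_units else None

def client_returns(unit, value):
    results[unit] = value

# Three volunteer PCs taking turns until the queue drains.
volunteers = ["alice-desktop", "bob-laptop", "cafe-pc"]
turn = 0
while work_units:
    wu = client_requests_work()
    client_returns(wu["unit"], f"computed by {volunteers[turn % 3]}")
    turn += 1
print(f"{len(results)} work units aggregated on the server")
```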
Perdue, Bonnie M; Evans, Theodore A; Washburn, David A; Rumbaugh, Duane M; Beran, Michael J
2014-06-01
Both empirical and anecdotal evidence supports the idea that choice is preferred by humans. Previous research has demonstrated that this preference extends to nonhuman animals, but it remains largely unknown whether animals will actively seek out or prefer opportunities to choose. Here we explored the issue of whether capuchin and rhesus monkeys choose to choose. We used a modified version of the SELECT task, a computer program in which monkeys can choose the order of completion of various psychomotor and cognitive tasks. In the present experiments, each trial began with a choice between two icons, one of which allowed the monkey to select the order of task completion, and the other of which led to the assignment of a task order by the computer. In either case, subjects still had to complete the same number of tasks and the same number of task trials. The tasks were relatively easy, and the monkeys responded correctly on most trials. Thus, global reinforcement rates were approximately equated across conditions. The only difference was whether the monkey chose the task order or it was assigned, thus isolating the act of choosing. Given sufficient experience with the task icons, all monkeys showed a significant preference for choice when the alternative was a randomly assigned order of tasks. To a lesser extent, some of the monkeys maintained a preference for choice over a preferred, but computer-assigned, task order that was yoked to their own previous choice selection. The results indicated that monkeys prefer to choose when all other aspects of the task are equated.
Lexical Predictability During Natural Reading: Effects of Surprisal and Entropy Reduction.
Lowder, Matthew W; Choi, Wonil; Ferreira, Fernanda; Henderson, John M
2018-06-01
What are the effects of word-by-word predictability on sentence processing times during the natural reading of a text? Although information complexity metrics such as surprisal and entropy reduction have been useful in addressing this question, these metrics tend to be estimated using computational language models, which require some degree of commitment to a particular theory of language processing. Taking a different approach, this study implemented a large-scale cumulative cloze task to collect word-by-word predictability data for 40 passages and compute surprisal and entropy reduction values in a theory-neutral manner. A separate group of participants read the same texts while their eye movements were recorded. Results showed that increases in surprisal and entropy reduction were both associated with increases in reading times. Furthermore, these effects did not depend on the global difficulty of the text. The findings suggest that surprisal and entropy reduction independently contribute to variation in reading times, as these metrics seem to capture different aspects of lexical predictability. Copyright © 2018 Cognitive Science Society, Inc.
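For readers who want the two metrics pinned down: surprisal is the negative log probability of the attested word, and entropy reduction is the decrease (if any) in the uncertainty of the predictability distribution from one word to the next. A minimal sketch of a theory-neutral computation from cloze data follows; the input format is illustrative and this is one common operationalization, not the authors' code:

```python
import math
from collections import Counter

def surprisal_and_entropy_reduction(cloze_data):
    """cloze_data: list of (responses, actual_word) pairs, one per word
    position; `responses` is the list of words cloze participants produced
    there, and `actual_word` is the word the text actually contains.
    Returns a (surprisal, entropy_reduction) pair per position."""
    results = []
    prev_entropy = None
    for responses, actual_word in cloze_data:
        counts = Counter(responses)
        total = sum(counts.values())
        probs = {w: c / total for w, c in counts.items()}
        p = probs.get(actual_word, 1 / (total + 1))  # crude smoothing for unseen words
        surprisal = -math.log2(p)
        entropy = -sum(q * math.log2(q) for q in probs.values())
        # entropy reduction: drop in cloze-distribution uncertainty, floored at 0
        reduction = max(0.0, prev_entropy - entropy) if prev_entropy is not None else 0.0
        results.append((surprisal, reduction))
        prev_entropy = entropy
    return results
```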
The operation of large computer-controlled manufacturing systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upton, D.M.
1988-01-01
This work examines methods for operation of large computer-controlled manufacturing systems, with more than 50 or so disparate CNC machines in congregation. The central theme is the development of a distributed control system, which requires minimal central supervision, and allows manufacturing system re-configuration without extensive control software re-writes. Provision is made for machines to learn from their experience and provide estimates of the time necessary to effect various tasks. Routing is opportunistic, with varying degrees of myopia depending on the prevailing situation. Necessary curtailments of opportunism are built into the system, in order to provide a society of machines that operate in unison rather than in chaos. Negotiation and contention resolution are carried out using a UHF radio communications network, along with processing capability on both pallets and tools. Graceful and robust error recovery is facilitated by ensuring adequate pessimistic consideration of failure modes at each stage in the scheme. Theoretical models are developed and an examination is made of fundamental characteristics of auction-based scheduling methods.
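The negotiation scheme lends itself to a contract-net style auction: each machine bids its learned completion-time estimate, and the task goes to the best bidder. A toy sketch of one auction round follows; the class, task, and machine names are illustrative, not the thesis's implementation:

```python
class Machine:
    """Toy bidder: keeps learned per-task time estimates and a work queue."""
    def __init__(self, name, time_estimates):
        self.name = name
        self.time_estimates = time_estimates  # learned from experience
        self.queue = []

    def bid(self, task):
        if task not in self.time_estimates:
            return None                       # machine cannot perform this task
        backlog = sum(self.time_estimates[t] for t in self.queue)
        return backlog + self.time_estimates[task]

def auction(task, machines):
    """One contract-net round: lowest estimated completion time wins."""
    bids = [(m.bid(task), m) for m in machines if m.bid(task) is not None]
    est, winner = min(bids, key=lambda b: b[0])
    winner.queue.append(task)
    return winner, est

cells = [Machine("cnc1", {"drill": 4.0, "mill": 9.0}),
         Machine("cnc2", {"drill": 6.0})]
winner, est = auction("drill", cells)   # cnc1 wins with an estimate of 4.0
```

Because every bid folds in the bidder's current backlog, load balancing emerges from the auction itself with no central supervisor, which is the property the distributed control scheme relies on.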
Active Nodal Task Seeking for High-Performance, Ultra-Dependable Computing
1994-07-01
implementation. Figure 1 shows a hardware organization of ANTS: stand-alone computing nodes interconnected by buses. 2.1 Run Time Partitioning The...nodes in 14 respond to changing loads [27] or system reconfiguration [26]. Existing techniques are all source-initiated or server-initiated [27]. 5.1...short-running task segments. The task segments must be short-running in order that processors will become available often enough to satisfy changing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne F. Boyer; Gurdeep S. Hura
2005-09-01
The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be NP-hard. In a DHC system, task execution time depends on the machine to which the task is assigned, and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of inter-dependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, requires less memory, and has fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
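The randomized task-ordering stage is effectively Kahn's topological sort with the ready task drawn at random, which makes every precedence-respecting order reachable. A minimal sketch (illustrative, not the authors' code):

```python
import random
from collections import defaultdict

def random_topological_order(tasks, edges):
    """Return a random task order consistent with the precedence DAG,
    i.e. a randomized topological sort as in the first stage of RS."""
    indegree = {t: 0 for t in tasks}
    succ = defaultdict(list)
    for u, v in edges:              # edge u -> v: u must precede v
        succ[u].append(v)
        indegree[v] += 1
    ready = [t for t in tasks if indegree[t] == 0]
    order = []
    while ready:
        t = random.choice(ready)    # random pick makes every valid order reachable
        ready.remove(t)
        order.append(t)
        for v in succ[t]:
            indegree[v] -= 1
            if indegree[v] == 0:
                ready.append(v)
    return order

# Example: "c" must follow both "a" and "b"
print(random_topological_order(["a", "b", "c"], [("a", "c"), ("b", "c")]))
```

The heuristic mapping from such an order to a schedule then assigns each task in turn to whichever machine yields the earliest finish time, which is where the heterogeneity of the DHC system enters.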
A Decentralized Eigenvalue Computation Method for Spectrum Sensing Based on Average Consensus
NASA Astrophysics Data System (ADS)
Mohammadi, Jafar; Limmer, Steffen; Stańczak, Sławomir
2016-07-01
This paper considers eigenvalue estimation for the decentralized inference problem for spectrum sensing. We propose a decentralized eigenvalue computation algorithm based on the power method, which is referred to as the generalized power method (GPM); it is capable of estimating the eigenvalues of a given covariance matrix under certain conditions. Furthermore, we have developed a decentralized implementation of GPM by splitting the iterative operations into local and global computation tasks. The global tasks require data exchange to be performed among the nodes. For this task, we apply an average consensus algorithm to efficiently perform the global computations. As a special case, we consider a structured graph that is a tree with clusters of nodes at its leaves. For an accelerated distributed implementation, we propose to use computation over multiple access channel (CoMAC) as a building block of the algorithm. Numerical simulations are provided to illustrate the performance of the two algorithms.
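A rough software analogue of the local/global split, under simplifying assumptions: the consensus step is idealized as repeated mixing with a doubly stochastic matrix, and each node is assumed to hold one row of the covariance matrix. None of the names below come from the paper:

```python
import numpy as np

def consensus_mean(values, W, rounds=60):
    """Idealized average consensus: nodes repeatedly mix scalar values with
    their neighbors'; W is an assumed doubly stochastic mixing matrix."""
    x = np.asarray(values, dtype=float)
    for _ in range(rounds):
        x = W @ x
    return x[0]  # every entry converges to the network-wide mean

def decentralized_power_method(rows, W, iters=200):
    """Dominant-eigenvalue estimate in the spirit of GPM: node k holds row k
    of the covariance matrix C; matrix-vector products are local work, and
    the two global quantities (the norm and the Rayleigh quotient) are
    obtained through average consensus."""
    n = len(rows)
    v = np.ones(n)
    for _ in range(iters):
        y = np.array([rows[k] @ v for k in range(n)])  # local computation
        norm = np.sqrt(n * consensus_mean(y ** 2, W))  # global: ||y||
        v = y / norm
    y = np.array([rows[k] @ v for k in range(n)])
    return n * consensus_mean(v * y, W)                # global: v^T C v

C = np.array([[2.0, 0.5, 0.1],
              [0.5, 1.5, 0.2],
              [0.1, 0.2, 1.0]])
W = np.full((3, 3), 1 / 3)  # complete three-node network, uniform weights
print(decentralized_power_method(list(C), W))  # ~ largest eigenvalue of C
```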
An opportunity cost model of subjective effort and task performance
Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus
2013-01-01
Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
Caro, J Jaime; Briggs, Andrew H; Siebert, Uwe; Kuntz, Karen M
2012-01-01
Models--mathematical frameworks that facilitate estimation of the consequences of health care decisions--have become essential tools for health technology assessment. Evolution of the methods since the first ISPOR Modeling Task Force reported in 2003 has led to a new Task Force, jointly convened with the Society for Medical Decision Making, and this series of seven articles presents the updated recommendations for best practices in conceptualizing models; implementing state-transition approaches, discrete event simulations, or dynamic transmission models; and dealing with uncertainty and validating and reporting models transparently. This overview article introduces the work of the Task Force, provides all the recommendations, and discusses some quandaries that require further elucidation. The audience for these articles includes those who build models, stakeholders who utilize their results, and, indeed, anyone concerned with the use of models to support decision making. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Computer Anxiety: How to Measure It?
ERIC Educational Resources Information Center
McPherson, Bill
1997-01-01
Provides an overview of five scales that are used to measure computer anxiety: Computer Anxiety Index, Computer Anxiety Scale, Computer Attitude Scale, Attitudes toward Computers, and Blombert-Erickson-Lowrey Computer Attitude Task. Includes background information and scale specifics. (JOW)
NASA Technical Reports Server (NTRS)
1982-01-01
A summary of tasks performed on an integrated command, control, communication, and computation system design study is given. The Tracking and Data Relay Satellite System command and control system study, an automated real-time operations study, and image processing work are discussed.
ERIC Educational Resources Information Center
Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.
2000-01-01
Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)
Cognitive Support for Learning Computer-Based Tasks Using Animated Demonstration
ERIC Educational Resources Information Center
Chen, Chun-Ying
2016-01-01
This study investigated the influence of cognitive support for learning computer-based tasks using animated demonstration (AD) on instructional efficiency. Cognitive support included (1) segmentation and learner control introducing interactive devices that allow content sequencing through a navigational menu, and content pacing through stop and…
Embodiment of Learning in Electro-Optical Signal Processors
NASA Astrophysics Data System (ADS)
Hermans, Michiel; Antonik, Piotr; Haelterman, Marc; Massar, Serge
2016-09-01
Delay-coupled electro-optical systems have received much attention for their dynamical properties and their potential use in signal processing. In particular, it has recently been demonstrated, using the artificial intelligence algorithm known as reservoir computing, that photonic implementations of such systems solve complex tasks such as speech recognition. Here, we show how the backpropagation algorithm can be physically implemented on the same electro-optical delay-coupled architecture used for computation with only minor changes to the original design. We find that, compared to when the backpropagation algorithm is not used, the error rate of the resulting computing device, evaluated on three benchmark tasks, decreases considerably. This demonstrates that electro-optical analog computers can embody a large part of their own training process, allowing them to be applied to new, more difficult tasks.
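The appeal of reservoir computing here is that only a linear readout is trained, while the fixed random dynamics can be delegated to a physical system. A minimal software analogue, an echo state network in NumPy, is sketched below; the sizes and constants are illustrative, and the paper's delay-coupled electro-optical loop differs in its physical realization:

```python
import numpy as np

rng = np.random.default_rng(0)

class EchoStateNetwork:
    """Minimal reservoir computer: fixed random recurrent dynamics
    plus a readout trained by ridge regression."""
    def __init__(self, n_in, n_res=200, spectral_radius=0.9, leak=0.5):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.standard_normal((n_res, n_res))
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # echo-state scaling
        self.W, self.leak = W, leak

    def states(self, U):
        x = np.zeros(self.W.shape[0])
        X = []
        for u in U:  # drive the fixed reservoir with the input sequence
            x = (1 - self.leak) * x + self.leak * np.tanh(self.W @ x + self.W_in @ u)
            X.append(x.copy())
        return np.array(X)

    def fit(self, U, Y, ridge=1e-6):
        X = self.states(U)
        # only the linear readout is trained (ridge regression)
        self.W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

    def predict(self, U):
        return self.states(U) @ self.W_out
```

Training only the readout is what makes a physical reservoir practical; the paper's contribution is going further and physically implementing backpropagation through the same hardware, so that the input weights can be optimized as well.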
Job-shop scheduling applied to computer vision
NASA Astrophysics Data System (ADS)
Sebastian y Zuniga, Jose M.; Torres-Medina, Fernando; Aracil, Rafael; Reinoso, Oscar; Jimenez, Luis M.; Garcia, David
1997-09-01
This paper presents a method for minimizing the total elapsed time spent by n tasks running on m different processors working in parallel. The developed algorithm not only minimizes the total elapsed time but also reduces the idle time and the waiting time of in-process tasks. This condition is very important in some applications of computer vision in which the time to finish the total process is particularly critical -- quality control in industrial inspection, real-time computer vision, guided robots. The scheduling algorithm is based on two matrices derived from the precedence relationships between tasks, together with the data computed from them. The developed scheduling algorithm has been tested in an application of quality control using computer vision. The results obtained have been satisfactory in the application of different image processing algorithms.
NASA Astrophysics Data System (ADS)
Devaraj, Rajesh; Sarkar, Arnab; Biswas, Santosh
2015-11-01
In the article 'Supervisory control for fault-tolerant scheduling of real-time multiprocessor systems with aperiodic tasks', Park and Cho presented a systematic way of computing a largest fault-tolerant and schedulable language that provides information on whether the scheduler (i.e., supervisor) should accept or reject a newly arrived aperiodic task. The computation of such a language depends mainly on the task execution model presented in their paper. However, that task execution model cannot capture the situation in which a processor fault occurs even before a task has arrived. Consequently, under a model that does not capture this fact, a task may be assigned for execution on a faulty processor. This problem is illustrated with an appropriate example, and the task execution model of Park and Cho is then modified to enforce the requirement that no task is assigned for execution on a faulty processor.
IGT-Open: An open-source, computerized version of the Iowa Gambling Task.
Dancy, Christopher L; Ritter, Frank E
2017-06-01
The Iowa Gambling Task (IGT) is commonly used to understand the processes involved in decision-making. Though the task was originally run without a computer, using a computerized version of the task has become typical. These computerized versions of the IGT are useful, because they can make the task more standardized across studies and allow for the task to be used in environments where a physical version of the task may be difficult or impossible to use (e.g., while collecting brain imaging data). Though these computerized versions of the IGT have been useful for experimentation, having multiple software implementations of the task could present reliability issues. We present an open-source software version of the Iowa Gambling Task (called IGT-Open) that allows for millisecond visual presentation accuracy and is freely available to be used and modified. This software has been used to collect data from human subjects and also has been used to run model-based simulations with computational process models developed to run in the ACT-R architecture.
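Underneath the presentation layer, a computerized IGT amounts to four decks with fixed reward and probabilistic loss schedules, plus a loop that logs each choice and its net outcome. A minimal sketch with illustrative payoff values in the spirit of the classic task follows (IGT-Open itself additionally controls millisecond-accurate visual presentation, which this sketch omits):

```python
import random

# Illustrative payoff structure in the spirit of the classic IGT:
# decks A and B pay large rewards but carry larger expected losses,
# decks C and D pay small rewards with small losses (net winning).
DECKS = {
    "A": {"reward": 100, "loss": 250,  "p_loss": 0.5},
    "B": {"reward": 100, "loss": 1250, "p_loss": 0.1},
    "C": {"reward": 50,  "loss": 50,   "p_loss": 0.5},
    "D": {"reward": 50,  "loss": 250,  "p_loss": 0.1},
}

def play_igt(policy, n_trials=100, seed=0):
    """Run one simulated IGT session; `policy` maps the running history
    to a deck name. Returns the trial-by-trial (deck, net outcome) log."""
    rng = random.Random(seed)
    history = []
    for _ in range(n_trials):
        deck = policy(history)
        d = DECKS[deck]
        net = d["reward"] - (d["loss"] if rng.random() < d["p_loss"] else 0)
        history.append((deck, net))
    return history

# Example: a random-choice baseline "participant"
outcomes = play_igt(lambda history: random.choice("ABCD"))
```

The same choice-and-outcome interface is what lets cognitive process models (such as those built in ACT-R) be run through the identical task code as human participants.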
NASA Astrophysics Data System (ADS)
Hwang, Han-Jeong; Lim, Jeong-Hwan; Kim, Do-Won; Im, Chang-Hwan
2014-07-01
A number of recent studies have demonstrated that near-infrared spectroscopy (NIRS) is a promising neuroimaging modality for brain-computer interfaces (BCIs). So far, most NIRS-based BCI studies have focused on enhancing the accuracy of the classification of different mental tasks. In the present study, we evaluated the performances of a variety of mental task combinations in order to determine the mental task pairs that are best suited for customized NIRS-based BCIs. To this end, we recorded event-related hemodynamic responses while seven participants performed eight different mental tasks. Classification accuracies were then estimated for all possible pairs of the eight mental tasks (8 choose 2 = 28 pairs). Based on this analysis, mental task combinations with relatively high classification accuracies frequently included the following three mental tasks: "mental multiplication," "mental rotation," and "right-hand motor imagery." Specifically, mental task combinations consisting of two of these three mental tasks showed the highest mean classification accuracies. It is expected that our results will be a useful reference to reduce the time needed for preliminary tests when discovering individual-specific mental task combinations.
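The exhaustive pair evaluation is straightforward to express. A hedged sketch using scikit-learn follows, assuming trials have already been reduced to feature vectors; the feature extraction and the classifier choice are illustrative, not necessarily the study's:

```python
from itertools import combinations
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def best_task_pair(features):
    """features: dict mapping task name -> (trials x features) array of
    hemodynamic-response features (an assumed preprocessing step).
    Scores every task pair by cross-validated binary classification
    accuracy and returns the best pair plus the full score table."""
    scores = {}
    for a, b in combinations(features, 2):   # 8 tasks -> 28 pairs
        X = np.vstack([features[a], features[b]])
        y = np.array([0] * len(features[a]) + [1] * len(features[b]))
        scores[(a, b)] = cross_val_score(
            LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    return max(scores, key=scores.get), scores
```

Run per participant, the score table directly supports the paper's point: the optimal pair is subject-dependent, so a short calibration like this can replace longer preliminary testing.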
Transcultural Nursing Society position statement on human rights.
Miller, June E; Leininger, Madeleine; Leuning, Cheryl; Pacquiao, Dula; Andrews, Margaret; Ludwid-Beymer, Patti; Papadopoulos, Irena
2008-01-01
In 2006, the Transcultural Nursing Society created a business plan with a firm commitment to social change and the support of human rights. One of the primary goals of the plan was to seek recognition from the United Nations as a Human Rights Organization. As a first step in articulating this goal, the board of trustees of TCNS tasked a small group of Transcultural Nursing Scholars to develop a position statement. This article is the culmination of the collaborative task force's efforts to define how TCNS seeks the fulfillment of human rights for people of all cultures worldwide.
2008-02-27
between the PHY layer and, for example, a host PC computer. The PC wants to generate and receive a sequence of data packets. The PC may also want to send...the testbed is quite similar. Given the intense computational requirements of SVD and other matrix-mode operations needed to support eigen spreading a...platform for real time operation. This task is probably the major challenge in the development of the testbed. All compute-intensive tasks will be
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Youngblood, John N.; Saha, Aindam
1987-01-01
Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost parallel systems that increase system performance. Research conducted in the development of specialized computer architecture for the algorithmic execution of an avionics system guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid
NASA Astrophysics Data System (ADS)
Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration
2014-06-01
The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.
Study to design and develop remote manipulator system
NASA Technical Reports Server (NTRS)
Hill, J. W.; Sword, A. J.
1973-01-01
Human performance measurement techniques for remote manipulation tasks and remote sensing techniques for manipulators are described. For common manipulation tasks, performance is monitored by means of an on-line computer capable of measuring the joint angles of both master and slave arms as a function of time. The computer programs allow measurements of the operator's strategy and of physical quantities such as task time and power consumed. The results are printed out after a test run to compare different experimental conditions. For tracking tasks, we describe a method of displaying errors in three dimensions and measuring the end-effector position in three dimensions.
Virtual reality computer simulation.
Grantcharov, T P; Rosenberg, J; Pahle, E; Funch-Jensen, P
2001-03-01
Objective assessment of psychomotor skills should be an essential component of a modern surgical training program. There are computer systems that can be used for this purpose, but their wide application is not yet generally accepted. The aim of this study was to validate the role of virtual reality computer simulation as a method for evaluating surgical laparoscopic skills. The study included 14 surgical residents. On day 1, they performed two runs of all six tasks on the Minimally Invasive Surgical Trainer, Virtual Reality (MIST VR). On day 2, they performed a laparoscopic cholecystectomy on living pigs; afterward, they were tested again on the MIST VR. A group of experienced surgeons evaluated the trainees' performance on the animal operation, giving scores for total performance error and economy of motion. During the tasks on the MIST VR, errors and non-economy of movements for the left and right hand were also recorded. There were significant correlations between error scores in vivo and three of the six in vitro tasks (p < 0.05). In vivo economy scores correlated significantly with non-economy right-hand scores for five of the six tasks and with non-economy left-hand scores for one of the six tasks (p < 0.05). In this study, laparoscopic performance in the animal model correlated significantly with performance on the computer simulator. Thus, the computer model seems to be a promising objective method for the assessment of laparoscopic psychomotor skills.
Fault recovery for real-time, multi-tasking computer system
NASA Technical Reports Server (NTRS)
Hess, Richard (Inventor); Kelly, Gerald B. (Inventor); Rogers, Randy (Inventor); Stange, Kent A. (Inventor)
2011-01-01
System and methods for providing a recoverable real time multi-tasking computer system are disclosed. In one embodiment, a system comprises a real time computing environment, wherein the real time computing environment is adapted to execute one or more applications and wherein each application is time and space partitioned. The system further comprises a fault detection system adapted to detect one or more faults affecting the real time computing environment and a fault recovery system, wherein upon the detection of a fault the fault recovery system is adapted to restore a backup set of state variables.
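The recovery idea reduces to checkpointing a known-good copy of each partition's state variables and rolling back when the fault detector fires. A minimal sketch follows; the names are illustrative, and the patent describes a hardware-supported mechanism rather than this code:

```python
import copy

class RecoverableTask:
    """Sketch of the recovery scheme: keep a backup set of the task's state
    variables and restore it when a fault is detected."""
    def __init__(self, state):
        self.state = state
        self.backup = copy.deepcopy(state)

    def checkpoint(self):
        # taken at known-good points in each time/space partition's cycle
        self.backup = copy.deepcopy(self.state)

    def recover(self):
        # invoked by the fault recovery system after fault detection
        self.state = copy.deepcopy(self.backup)
```

Time and space partitioning matters here because it bounds the damage a fault can do: only the affected partition's state needs to be rolled back, not the whole system's.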
Physical Medicine and Rehabilitation Resident Use of iPad Mini Mobile Devices.
Niehaus, William; Boimbo, Sandra; Akuthota, Venu
2015-05-01
Previous research on the use of tablet devices in residency programs has been undertaken in radiology and medicine or with standard-sized tablet devices. With new, smaller tablet devices, there is an opportunity to assess their effect on resident behavior. This prospective study attempts to evaluate resident behavior after receiving a smaller tablet device. To evaluate whether smaller tablet computers facilitate residents' daily tasks. Prospective study that administered surveys to evaluate tablet computer use. Residency program. Thirteen physical medicine and rehabilitation residents. Residents were provided 16-GB iPad Minis and surveyed with REDCap to collect usage information at baseline, 3, and 6 months. Survey analysis was conducted using SAS (SAS Institute, Cary, NC) for descriptive analysis. To evaluate multiple areas of resident education, the following tasks were selected: accessing e-mail, logging duty hours, logging procedures, researching clinical information, accessing medical journals, reviewing didactic presentations, and completing evaluations. Measurements were then taken of: (1) residents' responses to how tablet computers made it easier to accomplish the aforementioned tasks; and (2) residents' responses to how tablet computers affected the frequency with which they performed the aforementioned tasks. After being provided tablet computers, our physical medicine and rehabilitation residents reported significantly greater access to e-mail, medical journals, and didactic material. Also, receiving tablet computers was reported to increase the frequency with which residents accessed e-mail, researched clinical information, accessed medical journals, reviewed didactic presentations, and completed evaluations. After receiving a tablet computer, residents reported an increase in the use of calendar programs, note-taking programs, PDF readers, online storage programs, and file organization programs. These physical medicine and rehabilitation residents reported that tablet computers increased access to e-mail, presentation material, and medical journals. Tablet computers also were reported to increase the frequency with which residents were able to complete tasks associated with residency training. Copyright © 2015 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
Quantitative analysis of task selection for brain-computer interfaces
NASA Astrophysics Data System (ADS)
Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.
2014-10-01
Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
Semiautomatic tumor segmentation with multimodal images in a conditional random field framework.
Hu, Yu-Chi; Grossberg, Michael; Mageras, Gikas
2016-04-01
Volumetric medical images of a single subject can be acquired using different imaging modalities, such as computed tomography, magnetic resonance imaging (MRI), and positron emission tomography. In this work, we present a semiautomatic segmentation algorithm that can leverage the synergies between different image modalities while integrating interactive human guidance. The algorithm provides a statistical segmentation framework partly automating the segmentation task while still maintaining critical human oversight. The statistical models presented are trained interactively using simple brush strokes to indicate tumor and nontumor tissues and using intermediate results within a patient's image study. To accomplish the segmentation, we construct the energy function in the conditional random field (CRF) framework. For each slice, the energy function is set using the estimated probabilities from both user brush stroke data and prior approved segmented slices within a patient study. The progressive segmentation is obtained using a graph-cut-based minimization. Although no similar semiautomated algorithm is currently available, we evaluated our method with an MRI data set from the Medical Image Computing and Computer Assisted Intervention Society multimodal brain segmentation challenge (BRATS 2012 and 2013) against a similar fully automatic method based on CRF and a semiautomatic method based on grow-cut; our method shows superior performance.
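The energy function described takes the standard CRF form, with unary terms from the estimated probabilities (brush strokes plus approved prior slices) and pairwise contrast-sensitive smoothness terms; lambda and sigma below are generic weighting parameters, not values from the paper:

```latex
E(\mathbf{x}) \;=\;
  \sum_{i} -\log P\!\left(x_i \mid \text{strokes, approved slices}\right)
  \;+\;
  \lambda \sum_{(i,j) \in \mathcal{N}} [\,x_i \neq x_j\,]\,
    \exp\!\left(-\frac{(I_i - I_j)^2}{2\sigma^2}\right)
```

With binary tumor/non-tumor labels this energy is submodular, which is why a graph-cut minimization can recover its global minimum exactly rather than approximately.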
Harmonious University Construction Demands Internal and External Endeavors
ERIC Educational Resources Information Center
Lou, Xiang-yang; Zhi, Xi-zhe; Lu, Jin
2008-01-01
Universities play an irreplaceable role in the process of harmonious society construction. Constructing a harmonious university has become a critical task because of strained relations among universities, governments, and society, and because of internal imbalances within universities. Constructing a harmonious university demands internal and external endeavors:…
Human-computer dialogue: Interaction tasks and techniques. Survey and categorization
NASA Technical Reports Server (NTRS)
Foley, J. D.
1983-01-01
Interaction techniques are described. Six basic interaction tasks, requirements for each task, requirements related to interaction techniques, and a technique's hardware prerequisites affecting device selection are discussed.
Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?
Wenzel, Markus A; Almeida, Inês; Blankertz, Benjamin
2016-01-01
Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could be also beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed if the detectable neural activity is specific for silent counting, or if it can be evoked also by other tasks that direct the attention to certain stimuli. Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). The neural activity detected by the classifiers is not strictly task specific but can be generalized over tasks and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.
Modeling User Behavior in Computer Learning Tasks.
ERIC Educational Resources Information Center
Mantei, Marilyn M.
Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…
ERIC Educational Resources Information Center
Phillips, Lawrence; Pearl, Lisa
2015-01-01
The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…
ERIC Educational Resources Information Center
Amara, Sofiane; Macedo, Joaquim; Bendella, Fatima; Santos, Alexandre
2016-01-01
Learners are becoming increasingly diverse: they may differ along many personal, social, cultural, psychological, and cognitive dimensions. Forming suitable learning groups is therefore a hard and time-consuming task. In Mobile Computer Supported Collaborative Learning (MCSCL) environments, this task is even more difficult. Instructors need to consider…
DOT National Transportation Integrated Search
2007-08-01
This research was conducted to develop and test a personal computer-based study procedure (PCSP) with secondary task loading for use in human factors laboratory experiments in lieu of a driving simulator to test reading time and understanding of traf...
Task Assignment Heuristics for Parallel and Distributed CFD Applications
NASA Technical Reports Server (NTRS)
Lopez-Benitez, Noe; Djomehri, M. Jahed; Biswas, Rupak
2003-01-01
This paper proposes a task graph (TG) model to represent a single discrete step of multi-block overset grid computational fluid dynamics (CFD) applications. The TG model is then used not only to balance the computational workload across the overset grids but also to reduce inter-grid communication costs. We have developed a set of task assignment heuristics based on the constraints inherent in this class of CFD problems. Two basic assignments, the smallest task first (STF) and the largest task first (LTF), are presented first; they are then systematically refined to account for inter-grid communication costs. To predict the performance of the proposed task assignment heuristics, extensive performance evaluations are conducted on a synthetic TG with tasks defined in terms of the number of grid points in predetermined overlapping grids. A TG derived from a realistic problem with eight million grid points is also used as a test case.
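A minimal sketch of the two basic assignments, treating task size as the number of grid points and ignoring the communication costs that the refined heuristics then account for; this is illustrative, not the authors' code:

```python
import heapq

def assign_tasks(task_sizes, n_procs, largest_first=True):
    """LTF/STF list assignment: order tasks by size (grid points), then
    greedily place each on the currently least-loaded processor."""
    order = sorted(task_sizes, reverse=largest_first)
    loads = [(0.0, p) for p in range(n_procs)]   # (load, processor) min-heap
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_procs)}
    for size in order:
        load, p = heapq.heappop(loads)           # least-loaded processor
        assignment[p].append(size)
        heapq.heappush(loads, (load + size, p))
    return assignment

# LTF tends to balance better: big grids placed first, small ones fill gaps
print(assign_tasks([8e6, 3e6, 2e6, 1e6, 1e6], n_procs=2))
```

Placing the largest tasks first generally leaves the small tasks to even out the residual imbalance, which is why LTF is the usual starting point before communication costs are folded in.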
GANGA: A tool for computational-task management and easy access to Grid resources
NASA Astrophysics Data System (ADS)
Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.
2009-11-01
In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:
Ko, Emily M; Havrilesky, Laura J; Alvarez, Ronald D; Zivanovic, Oliver; Boyd, Leslie R; Jewell, Elizabeth L; Timmins, Patrick F; Gibb, Randall S; Jhingran, Anuja; Cohn, David E; Dowdy, Sean C; Powell, Matthew A; Chalas, Eva; Huang, Yongmei; Rathbun, Jill; Wright, Jason D
2018-05-01
Health care in the United States is in the midst of a significant transformation from a "fee for service" to a "fee for value" based model. The Medicare Access and CHIP Reauthorization Act of 2015 has only accelerated this transition. Anticipating these reforms, the Society of Gynecologic Oncology developed the Future of Physician Payment Reform Task Force (PPRTF) in 2015 to develop strategies to ensure fair value-based reimbursement policies for gynecologic cancer care. The PPRTF elected as a first task to develop an Alternative Payment Model for the surgical management of low-risk endometrial cancer. The history, rationale, and conceptual framework for the development of an Endometrial Cancer Alternative Payment Model are described in this white paper, as well as directions for future efforts. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.
1993-01-01
The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.
NASA Astrophysics Data System (ADS)
Heglund, Brian
Educators recognize the importance of reasoning ability for development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation, however, argumentation is not explicitly taught outside logic and philosophy---subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argument Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argumentation evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as self-reported sense of improvement in argument evaluation and low perceived value of the learning task, respectively. The discussion presents implications for practice and research, such as introducing motivation scaffolds to support appreciation of task value, and addressing major differences between the design of this study and similar published studies, respectively. This work provides contributions in that it tested the effect of Computer-Assisted Argumentation Mapping on the critical thinking skills of twelfth-grade students within the context of evaluating physics arguments, a previously unexplored age group and domain.
ERIC Educational Resources Information Center
Association for the Advancement of Computing in Education. Asia-Pacific Chapter.
This conference addressed pedagogical, social, and technological issues related to computers in education. The conference theme, "Learning Societies in the New Millennium: Creativity, Caring & Commitments," focused on creative learning, caring for diverse cultures and global issues, and committing oneself to a new way of…
ERIC Educational Resources Information Center
Brownlow, David; And Others
Arranged in three sections, this resource for secondary school students provides an introduction to the computer's impact on society. The first section surveys historical methods of recording and storing information: clay tablets, papyrus, and books. The second section describes how computers work and ways they can be used. Also considered are the…
XXV IUPAP Conference on Computational Physics (CCP2013): Preface
NASA Astrophysics Data System (ADS)
2014-05-01
XXV IUPAP Conference on Computational Physics (CCP2013) was held from 20-24 August 2013 at the Russian Academy of Sciences in Moscow, Russia. The annual Conferences on Computational Physics (CCP) present an overview of the most recent developments and opportunities in computational physics across a broad range of topical areas. The CCP series aims to draw computational scientists from around the world and to stimulate interdisciplinary discussion and collaboration by putting together researchers interested in various fields of computational science. It is organized under the auspices of the International Union of Pure and Applied Physics and has been in existence since 1989. The CCP series alternates between Europe, America and Asia-Pacific. The conferences are traditionally supported by European Physical Society and American Physical Society. This year the Conference host was Landau Institute for Theoretical Physics. The Conference contained 142 presentations, and, in particular, 11 plenary talks with comprehensive reviews from airbursts to many-electron systems. We would like to take this opportunity to thank our sponsors: International Union of Pure and Applied Physics (IUPAP), European Physical Society (EPS), Division of Computational Physics of American Physical Society (DCOMP/APS), Russian Foundation for Basic Research, Department of Physical Sciences of Russian Academy of Sciences, RSC Group company. Further conference information and images from the conference are available in the pdf.
Opportunistic Computing with Lobster: Lessons Learned from Scaling up to 25k Non-Dedicated Cores
NASA Astrophysics Data System (ADS)
Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Yannakopoulos, Anna; Tovar, Benjamin; Donnelly, Patrick; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas
2017-10-01
We previously described Lobster, a workflow management tool for exploiting volatile opportunistic computing resources for computation in HEP. We will discuss the various challenges that have been encountered while scaling up the simultaneous CPU core utilization and the software improvements required to overcome these challenges. Categories: Workflows can now be divided into categories based on their required system resources. This allows the batch queueing system to optimize assignment of tasks to nodes with the appropriate capabilities. Within each category, limits can be specified for the number of running jobs to regulate the utilization of communication bandwidth. System resource specifications for a task category can now be modified while a project is running, avoiding the need to restart the project if resource requirements differ from the initial estimates. Lobster now implements time limits on each task category to voluntarily terminate tasks. This allows partially completed work to be recovered. Workflow dependency specification: One workflow often requires data from other workflows as input. Rather than waiting for earlier workflows to be completed before beginning later ones, Lobster now allows dependent tasks to begin as soon as sufficient input data has accumulated. Resource monitoring: Lobster utilizes a new capability in Work Queue to monitor the system resources each task requires in order to identify bottlenecks and optimally assign tasks. The capability of the Lobster opportunistic workflow management system for HEP computation has been significantly increased. We have demonstrated efficient utilization of 25 000 non-dedicated cores and achieved a data input rate of 30 Gb/s and an output rate of 500GB/h. This has required new capabilities in task categorization, workflow dependency specification, and resource monitoring.
Task-Induced Development of Hinting Behaviors in Online Task-Oriented L2 Interaction
ERIC Educational Resources Information Center
Balaman, Ufuk
2018-01-01
Technology-mediated task settings are rich interactional domains in which second language (L2) learners manage a multitude of interactional resources for task accomplishment. The affordances of these settings have been repeatedly addressed in computer-assisted language learning (CALL) literature mainly based on theory-informed task design…
Has computational creativity successfully made it "Beyond the Fence" in musical theatre?
NASA Astrophysics Data System (ADS)
Jordanous, Anna
2017-10-01
A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.
Ren, Li-Hong; Ding, Yong-Sheng; Shen, Yi-Zhen; Zhang, Xiang-Feng
2008-10-01
Recently, a collective effort from multiple research areas has been made to understand biological systems at the system level. This research requires the ability to simulate particular biological systems such as cells, organs, organisms, and communities. In this paper, a novel bio-network simulation platform is proposed for systems biology studies by combining agent approaches. We consider a biological system as a set of active computational components interacting with each other and with an external environment. Then, we propose a bio-network platform for simulating the behaviors of biological systems and modelling them in terms of bio-entities and society-entities. As a demonstration, we discuss how a protein-protein interaction (PPI) network can be seen as a society of autonomous interactive components. From interactions among small PPI networks, a large PPI network can emerge that has a remarkable ability to accomplish a complex function or task. We also simulate the evolution of the PPI networks by using the bio-operators of the bio-entities. Based on the proposed approach, various simulators with different functions can be embedded in the simulation platform, and further research can be done from design to development, including complexity validation of the biological system.
Watson, Karriem S; Blok, Amanda C; Buscemi, Joanna; Molina, Yamile; Fitzgibbon, Marian; Simon, Melissa A; Williams, Lance; Matthews, Kameron; Studts, Jamie L; Lillie, Sarah E; Ostroff, Jamie S; Carter-Harris, Lisa; Winn, Robert A
2016-12-01
The Society of Behavioral Medicine (SBM) supports the United States Preventive Services Task Force (USPSTF) recommendation of low-dose computed tomography (LDCT) screening of the chest for eligible populations to reduce lung cancer mortality. Consistent with efforts to translate research findings into real-world settings, SBM encourages health-care providers and health-care systems to (1) integrate evidence-based tobacco treatment as an essential component of LDCT-based lung cancer screening, (2) examine the structural barriers that may impact screening uptake, and (3) incorporate shared decision-making as a clinical platform to facilitate consultations and engagement with individuals at high risk for lung cancer about the potential benefits and harms associated with participation in a lung cancer screening program. We advise policy makers and legislators to support screening in high-risk populations by continuing to (1) expand access to high quality LDCT-based screening among underserved high-risk populations, (2) enhance cost-effectiveness by integrating evidence-based tobacco treatments into screening in high-risk populations, and (3) increase funding for research that explores implementation science and increased public awareness and access of diverse populations to participate in clinical and translational research.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1977-01-01
Models, measures, and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability, and worth. Specifically, a model hierarchy was developed in detail at the mission, functional task, and computational task levels. An appropriate class of stochastic models was investigated to serve as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.
Emergent Leadership and Team Effectiveness on a Team Resource Allocation Task
1987-10-01
equivalent training and experience on this task, but they had different levels of experience with computers and video games. This differential experience...typed: that is, it is sex-typed to the extent that males spend more time on related instruments like computers and video games. However, the sex...perform better or worse than less talkative teams? Did teams with much computer and/or video game experience perform better than inexperienced teams
Automating tasks in protein structure determination with the clipper python module.
McNicholas, Stuart; Croll, Tristan; Burnley, Tom; Palmer, Colin M; Hoh, Soon Wen; Jenkins, Huw T; Dodson, Eleanor; Cowtan, Kevin; Agirre, Jon
2018-01-01
Scripting programming languages provide the fastest means of prototyping complex functionality. Those with a syntax and grammar resembling human language also greatly enhance the maintainability of the produced source code. Furthermore, the combination of a powerful, machine-independent scripting language with binary libraries tailored for each computer architecture allows programs to break free from the tight boundaries of efficiency traditionally associated with scripts. In the present work, we describe how an efficient C++ crystallographic library such as Clipper can be wrapped, adapted and generalized for use in both crystallographic and electron cryo-microscopy applications, scripted with the Python language. We shall also place an emphasis on best practices in automation, illustrating how this can be achieved with this new Python module. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.
Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L
2013-08-01
One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.
NASA Astrophysics Data System (ADS)
Schoitsch, Erwin
1988-07-01
Our society depends more and more on the reliability of embedded (real-time) computer systems, even in everyday life. Considering the complexity of the real world, this dependence might become a severe threat. Real-time programming is a discipline important not only in process control and data acquisition systems, but also in fields like communication, office automation, interactive databases, interactive graphics, and operating systems development. General concepts of concurrent programming and constructs for process synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions, or real-time languages like Concurrent PASCAL, MODULA, CHILL, and ADA are explained and compared with each other and with respect to their potential contribution to quality and safety.
Lewiecki, E Michael; Compston, Juliet E; Miller, Paul D; Adachi, Jonathan D; Adams, Judith E; Leslie, William D; Kanis, John A
2011-01-01
FRAX(®) is a fracture risk assessment algorithm developed by the World Health Organization in cooperation with other medical organizations and societies. Using easily available clinical information and femoral neck bone mineral density (BMD) measured by dual-energy X-ray absorptiometry (DXA), when available, FRAX(®) is used to predict the 10-year probability of hip fracture and major osteoporotic fracture. These values may be included in country-specific guidelines to aid clinicians in determining when fracture risk is sufficiently high that the patient is likely to benefit from pharmacological therapy to reduce that risk. Since the introduction of FRAX(®) into clinical practice, many practical clinical questions have arisen regarding its use. To address such questions, the International Society for Clinical Densitometry (ISCD) and the International Osteoporosis Foundation (IOF) assigned task forces to review the best available medical evidence and make recommendations for optimal use of FRAX(®) in clinical practice. Questions were identified and divided into three general categories. A task force was assigned to investigate the medical evidence in each category and develop clinically useful recommendations. The BMD Task Force addressed issues that included the potential use of skeletal sites other than the femoral neck, the use of technologies other than DXA, and the deletion or addition of clinical data for FRAX(®) input. The evidence and recommendations were presented to a panel of experts at the ISCD-IOF FRAX(®) Position Development Conference, resulting in the development of ISCD-IOF Official Positions addressing FRAX(®)-related issues. Copyright © 2011 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
Social justice and the politics of recognition.
Arfken, Michael
2013-09-01
Comments on the original article, "Psychology and social justice: Why we do what we do" by M. J. T. Vasquez (see record 2012-18676-002). Vasquez pointed to numerous initiatives and task forces that the American Psychological Association (APA) has established to address the marginalization and subordination of various groups. There is little doubt that the concerns addressed by these initiatives and task forces are important and play a central role in the development of a just society. Although Vasquez noted that "social realities are important determinants of distress" she failed to appreciate the extent to which our social relations emerge against the background of specific political and economic structures. The cost of this oversight is the perpetuation of a politics of recognition that does little to address the economic inequalities that are a defining feature of unjust societies. Were APA to restrict its attention to psychological distress or access to resources, it would place APA in the service of maintaining rather than transforming the existing structure of society. APA should consider developing initiatives and task forces to investigate the role that capitalism plays in the perpetuation of inequality and exploitation. It may also be time to reflect on why an institution that claims to be dedicated to social justice has had so little to say about one of the dominant features of modern society. © 2013 APA, all rights reserved.
After-effects of human-computer interaction indicated by P300 of the event-related brain potential.
Trimmel, M; Huber, R
1998-05-01
After-effects of human-computer interaction (HCI) were investigated by using the P300 component of the event-related brain potential (ERP). Forty-nine subjects (naive non-users, beginners, experienced users, programmers) completed three paper/pencil tasks (text editing, solving intelligence test items, filling out a questionnaire on sensation seeking) and three HCI tasks (text editing, executing a tutor program or programming, playing Tetris). The sequence of 7-min tasks was randomized between subjects and balanced between groups. After each experimental condition ERPs were recorded during an acoustic discrimination task at F3, F4, Cz, P3 and P4. Data indicate that: (1) mental after-effects of HCI can be detected by P300 of the ERP; (2) HCI showed in general a reduced amplitude; (3) P300 amplitude varied also with type of task, mainly at F4 where it was smaller after cognitive tasks (intelligence test/programming) and larger after emotion-based tasks (sensation seeking/Tetris); (4) cognitive tasks showed shorter latencies; (5) latencies were widely location-independent (within the range of 356-358 ms at F3, F4, P3 and P4) after executing the tutor program or programming; and (6) all observed after-effects were independent of the user's experience in operating computers and may therefore reflect short-term after-effects only and no structural changes of information processing caused by HCI.
Mental workload during brain-computer interface training.
Felton, Elizabeth A; Williams, Justin C; Vanderheiden, Gregg C; Radwin, Robert G
2012-01-01
It is not well understood how people perceive the difficulty of performing brain-computer interface (BCI) tasks, which specific aspects of mental workload contribute the most, and whether there is a difference in perceived workload between participants who are able-bodied and disabled. This study evaluated mental workload using the NASA Task Load Index (TLX), a multi-dimensional rating procedure with six subscales: Mental Demands, Physical Demands, Temporal Demands, Performance, Effort, and Frustration. Able-bodied and motor disabled participants completed the survey after performing EEG-based BCI Fitts' law target acquisition and phrase spelling tasks. The NASA-TLX scores were similar for able-bodied and disabled participants. For example, overall workload scores (range 0-100) for 1D horizontal tasks were 48.5 (SD = 17.7) and 46.6 (SD = 10.3), respectively. The TLX is thus an effective tool for comparing subjective workload between BCI tasks, participant groups (able-bodied and disabled), and control modalities, and the data can inform the design of BCIs with greater usability.
Magnetic Tunnel Junction Mimics Stochastic Cortical Spiking Neurons
NASA Astrophysics Data System (ADS)
Sengupta, Abhronil; Panda, Priyadarshini; Wijesinghe, Parami; Kim, Yusung; Roy, Kaushik
2016-07-01
Brain-inspired computing architectures attempt to mimic the computations performed in the neurons and the synapses in the human brain in order to achieve its efficiency in learning and cognitive tasks. In this work, we demonstrate the mapping of the probabilistic spiking nature of pyramidal neurons in the cortex to the stochastic switching behavior of a Magnetic Tunnel Junction in the presence of thermal noise. We present results to illustrate the efficiency of neuromorphic systems based on such probabilistic neurons for pattern recognition tasks in the presence of lateral inhibition and homeostasis. Such stochastic MTJ neurons can also potentially provide a direct mapping to the probabilistic computing elements in Belief Networks for performing regenerative tasks.
Caffeine dosing strategies to optimize alertness during sleep loss.
Vital-Lopez, Francisco G; Ramakrishnan, Sridhar; Doty, Tracy J; Balkin, Thomas J; Reifman, Jaques
2018-05-28
Sleep loss, which affects about one-third of the US population, can severely impair physical and neurobehavioural performance. Although caffeine, the most widely used stimulant in the world, can mitigate these effects, currently there are no tools to guide the timing and amount of caffeine consumption to optimize its benefits. In this work, we provide an optimization algorithm, suited for mobile computing platforms, to determine when and how much caffeine to consume, so as to safely maximize neurobehavioural performance at the desired time of the day, under any sleep-loss condition. The algorithm is based on our previously validated Unified Model of Performance, which predicts the effect of caffeine consumption on a psychomotor vigilance task. We assessed the algorithm by comparing the caffeine-dosing strategies (timing and amount) it identified with the dosing strategies used in four experimental studies, involving total and partial sleep loss. Through computer simulations, we showed that the algorithm yielded caffeine-dosing strategies that enhanced performance of the predicted psychomotor vigilance task by up to 64% while using the same total amount of caffeine as in the original studies. In addition, the algorithm identified strategies that resulted in equivalent performance to that in the experimental studies while reducing caffeine consumption by up to 65%. Our work provides the first quantitative caffeine optimization tool for designing effective strategies to maximize neurobehavioural performance and to avoid excessive caffeine consumption during any arbitrary sleep-loss condition. © 2018 The Authors. Journal of Sleep Research published by John Wiley & Sons Ltd on behalf of European Sleep Research Society.
Desktop Computing Integration Project
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1992-01-01
The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.
Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment
NASA Astrophysics Data System (ADS)
Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.
2013-12-01
Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity, and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust), and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. A simulation of a single dust storm event may normally take hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in parallel, computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. The task allocation method is therefore the key factor that may determine the feasibility of parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost of each computing node to minimize total execution time and reduce the overall communication cost for the entire system. This presentation introduces two such allocation algorithms and compares them with an evenly distributed allocation method. Specifically: (1) to obtain optimized solutions, a quadratic programming based modeling method is proposed; this algorithm performs well with a small number of computing tasks, but its efficiency decreases significantly as the number of subdomains and computing nodes increases. (2) To compensate for this performance degradation on large-scale tasks, a K-Means clustering based algorithm is introduced; instead of seeking optimal solutions, this method obtains reasonably good feasible solutions within acceptable time, but it may introduce imbalanced communication loads or node-isolated subdomains. This research shows that both algorithms have their own strengths and weaknesses for task allocation. A combination of the two algorithms is under study to obtain better performance. Keywords: Scheduling; Parallel Computing; Load Balance; Optimization; Cost Model
Jastremski, M; Jastremski, C; Shepherd, M; Friedman, V; Porembka, D; Smith, R; Gonzales, E; Swedlow, D; Belzberg, H; Crass, R
1995-10-01
To test a model for the assessment of critical care technology on closed loop infusion control, a technology that is in its early stages of development and testing on human subjects. A computer-assisted search of the English language literature and reviews of the gathered data by experts in the field of closed loop infusion control systems. Studies relating to closed loop infusion control that addressed one or more of the questions contained in our technology assessment template were analyzed. Study design was not a factor in article selection. However, the lack of well-designed clinical outcome studies was an important factor in determining our conclusions. A focus person summarized the data from the selected studies that related to each of the assessment questions. The preliminary data summary developed by the focus person was further analyzed and refined by the task force. Experts in closed loop systems were then added to the group to review the summary provided by the task force. These experts' comments were considered by the task force and this final consensus report was developed. Closed loop system control is a technological concept that may be applicable to several aspects of critical care practice. This is a technology in the early stages of evolution and much more research and data are needed before its introduction into usual clinical practice. Furthermore, each specific application and each device for each application (e.g., nitroprusside infusion, ventilator adjustment), although based on the same technological concept, are sufficiently different in terms of hardware and computer algorithms to require independent validation studies. Closed loop infusion systems may have a role in critical care practice. However, for most applications, further development is required to move this technology from the innovation phase to the point where it can be evaluated so that its role in critical care practice can be defined. Each application of closed loop infusion systems must be independently validated by appropriately designed research studies. Users should be provided with the clinical parameters driving each closed loop system so that they can ensure that it agrees with their opinion of acceptable medical practice. Clinical researchers and leaders in industry should collaborate to perform the scientifically valid, outcome-based research that is necessary to evaluate the effect of this new technology. The original model we developed for technology assessment required the addition of several more questions to produce a complete analysis of an emerging technology. An emerging technology should be systematically assessed (using a model such as the model developed by the Society of Critical Care Medicine), before its introduction into clinical practice in order to provide a focus for human outcome validation trials and to minimize the possibility of widespread use of an unproven technology.
VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds.
Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi
2016-01-01
Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources dynamically changes over time, provisioning resources in a way that efficiently utilizes a given budget while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction of up to 70% in the overall task slowdown rate by the proposed algorithms.
Myburgh, John; Abillama, Fayez; Chiumello, Davide; Dobb, Geoff; Jacobe, Stephen; Kleinpell, Ruth; Koh, Younsuk; Martin, Claudio; Michalsen, Andej; Pelosi, Paolo; Torra, Lluis Blanch; Vincent, Jean-Louis; Yeager, Susan; Zimmerman, Janice
2016-08-01
End-of-life care in the intensive care unit (ICU) was identified as an objective in a series of Task Forces developed by the World Federation of Societies of Intensive and Critical Care Medicine Council in 2014. The objective was to develop a generic statement about current knowledge and to identify challenges relevant to the global community that may inform regional and local initiatives. An updated summary of published statements on end-of-life care in the ICU from national Societies is presented, highlighting commonalities and differences within and between international regions. The complexity of end-of-life care in the ICU, particularly relating to withholding and withdrawing life-sustaining treatment while ensuring the alleviation of suffering, within different ethical and cultural environments is recognized. Although no single statement can therefore be regarded as a criterion standard applicable to all countries and societies, the World Federation of Societies of Intensive and Critical Care Medicine endorses and encourages the role of Member Societies to lead the debate regarding end-of-life care in the ICU within each country and to take a leading role in developing national guidelines and recommendations within each country. Copyright © 2016 Elsevier Inc. All rights reserved.
Pusic, Martin V.; LeBlanc, Vicki; Patel, Vimla L.
2001-01-01
Traditional task analysis for instructional design has emphasized the importance of precisely defining behavioral educational objectives and working back to select objective-appropriate instructional strategies. However, this approach may miss effective strategies. Cognitive task analysis, on the other hand, breaks a process down into its component knowledge representations. Selection of instructional strategies based on all such representations in a domain is likely to lead to optimal instructional design. In this demonstration, using the interpretation of cervical spine x-rays as an educational example, we show how a detailed cognitive task analysis can guide the development of computer-aided instruction.
Convolutional neural networks and face recognition task
NASA Astrophysics Data System (ADS)
Sochenkova, A.; Sochenkov, I.; Makovetskii, A.; Vokhmintsev, A.; Melnikov, A.
2017-09-01
Computer vision tasks have remained highly important over the last several years. One of the most complicated problems in computer vision is face recognition, which can be used in security systems to provide safety and to identify a person among others. There is a variety of approaches to this task, but there is still no universal solution that gives adequate results in all cases. The current paper presents the following approach: first, we extract the area containing the face; then we apply the Canny edge detector; in the final stage, we use convolutional neural networks (CNNs) to solve the face recognition and person identification task.
Mission-based Scenario Research: Experimental Design And Analysis
2012-01-01
Subject terms: neuroimaging, EEG, task loading, neurotechnologies, ground... Imagine a system that can identify operator fatigue during a long-term...Brain-Computer Interaction Technologies (BCIT), a class of neurotechnologies that aim to improve task performance by incorporating measures of brain activity to optimize the interactions
An Interaction of Screen Colour and Lesson Task in CAL
ERIC Educational Resources Information Center
Clariana, Roy B.
2004-01-01
Colour is a common feature in computer-aided learning (CAL), though the instructional effects of screen colour are not well understood. This investigation considers the effects of different CAL study tasks with feedback on posttest performance and on posttest memory of the lesson colour scheme. Graduate students (n=68) completed a computer-based…
Nonoccurrence of Negotiation of Meaning in Task-Based Synchronous Computer-Mediated Communication
ERIC Educational Resources Information Center
Van Der Zwaard, Rose; Bannink, Anne
2016-01-01
This empirical study investigated the occurrence of meaning negotiation in an interactive synchronous computer-mediated second language (L2) environment. Sixteen dyads (N = 32) consisting of nonnative speakers (NNSs) and native speakers (NSs) of English performed 2 different tasks using videoconferencing and written chat. The data were coded and…
Studying Parental Decision Making with Micro-Computers: The CPSI Technique.
ERIC Educational Resources Information Center
Holden, George W.
A technique for studying how parents think, make decisions, and solve childrearing problems, Computer-Presented Social Interactions (CPSI), is described. Two studies involving CPSI are presented. The first study concerns a common parental cognitive task: causal analysis of an undesired behavior. The task was to diagnose the cause of non-contingent…
Tangential Floor in a Classroom Setting
ERIC Educational Resources Information Center
Marti, Leyla
2012-01-01
This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…
From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task
ERIC Educational Resources Information Center
Öman, Anne
2017-01-01
Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…
An Undergraduate Course on Operating Systems Principles.
ERIC Educational Resources Information Center
National Academy of Engineering, Washington, DC. Commission on Education.
This report is from Task Force VIII of the COSINE Committee of the Commission on Education of the National Academy of Engineering. The task force was established to formulate subject matter for an elective undergraduate subject on computer operating systems principles for students whose major interest is in the engineering of computer systems and…
Negotiation of Meaning in Synchronous Computer-Mediated Communication in Relation to Task Types
ERIC Educational Resources Information Center
Cho, Hye-jin
2011-01-01
The present study explored how negotiation of meaning occurred in task-based synchronous computer-mediated communication (SCMC) environment among college English learners. Based on the theoretical framework of the interaction hypothesis and negotiation of meaning, four research questions arose: (1) how negotiation of meaning occur in non-native…
ERIC Educational Resources Information Center
Hanks, Walter A.; Barnes, Michael D.; Merrill, Ray M.; Neiger, Brad L.
2000-01-01
Investigated how health educators currently used computers and how they expected to use them in the future. Surveys of practicing health educators at many types of sites indicated that important current abilities included Internet, word processing, and electronic presentation skills. Important future tasks and skills included developing computer…
2015-01-27
placed on the user by the required tasks. Design areas of concern include seating, input and output device location and design, ambient...software, hardware, and workspace design for the test function of operability that influence operator performance in a computer-based system. Appendices include sample design checklists and sample task checklists.
BASIC, Logo, and Pilot: A Comparison of Three Computer Languages.
ERIC Educational Resources Information Center
Maddux, Cleborne D.; Cummings, Rhoda E.
1985-01-01
Following a brief history of Logo, BASIC, and Pilot programing languages, common educational programing tasks (input from keyboard, evaluation of keyboard input, and computation) are presented in each language to illustrate how each can be used to perform the same tasks and to demonstrate each language's strengths and weaknesses. (MBR)
ESL Students' Interaction in Second Life: Task-Based Synchronous Computer-Mediated Communication
ERIC Educational Resources Information Center
Jee, Min Jung
2010-01-01
The purpose of the present study was to explore ESL students' interactions in task-based synchronous computer-mediated communication (SCMC) in Second Life, a virtual environment by which users can interact through representational figures. I investigated Low-Intermediate and High-Intermediate ESL students' interaction patterns before, during, and…
Oral Computer-Mediated Interaction between L2 Learners: It's about Time!
ERIC Educational Resources Information Center
Yanguas, Inigo
2010-01-01
This study explores task-based, synchronous oral computer-mediated communication (CMC) among intermediate-level learners of Spanish. In particular, this paper examines (a) how learners in video and audio CMC groups negotiate for meaning during task-based interaction, (b) possible differences between both oral CMC modes and traditional face-to-face…
Evolution of Self-Organized Task Specialization in Robot Swarms.
Ferrante, Eliseo; Turgut, Ali Emre; Duéñez-Guzmán, Edgar; Dorigo, Marco; Wenseleers, Tom
2015-08-01
Division of labor is ubiquitous in biological systems, as evidenced by various forms of complex task specialization observed in both animal societies and multicellular organisms. Although clearly adaptive, the way in which division of labor first evolved remains enigmatic, as it requires the simultaneous co-occurrence of several complex traits to achieve the required degree of coordination. Recently, evolutionary swarm robotics has emerged as an excellent test bed to study the evolution of coordinated group-level behavior. Here we use this framework for the first time to study the evolutionary origin of behavioral task specialization among groups of identical robots. The scenario we study involves an advanced form of division of labor, common in insect societies and known as "task partitioning", whereby two sets of tasks have to be carried out in sequence by different individuals. Our results show that task partitioning is favored whenever the environment has features that, when exploited, reduce switching costs and increase the net efficiency of the group, and that an optimal mix of task specialists is achieved most readily when the behavioral repertoires aimed at carrying out the different subtasks are available as pre-adapted building blocks. Nevertheless, we also show for the first time that self-organized task specialization could be evolved entirely from scratch, starting only from basic, low-level behavioral primitives, using a nature-inspired evolutionary method known as Grammatical Evolution. Remarkably, division of labor was achieved merely by selecting on overall group performance, and without providing any prior information on how the global object retrieval task was best divided into smaller subtasks. We discuss the potential of our method for engineering adaptively behaving robot swarms and interpret our results in relation to the likely path that nature took to evolve complex sociality and task specialization.
Control-display mapping in brain-computer interfaces.
Thurlings, Marieke E; van Erp, Jan B F; Brouwer, Anne-Marie; Blankertz, Benjamin; Werkhoven, Peter
2012-01-01
Event-related potential (ERP) based brain-computer interfaces (BCIs) employ differences in brain responses to attended and ignored stimuli. When using a tactile ERP-BCI for navigation, mapping is required between navigation directions on a visual display and unambiguously corresponding tactile stimuli (tactors) from a tactile control device: control-display mapping (CDM). We investigated the effect of congruent (both display and control horizontal or both vertical) and incongruent (vertical display, horizontal control) CDMs on task performance, the ERP and potential BCI performance. Ten participants attended to a target (determined via CDM), in a stream of sequentially vibrating tactors. We show that congruent CDM yields best task performance, enhanced the P300 and results in increased estimated BCI performance. This suggests a reduced availability of attentional resources when operating an ERP-BCI with incongruent CDM. Additionally, we found an enhanced N2 for incongruent CDM, which indicates a conflict between visual display and tactile control orientations. Incongruency in control-display mapping reduces task performance. In this study, brain responses, task and system performance are related to (in)congruent mapping of command options and the corresponding stimuli in a brain-computer interface (BCI). Directional congruency reduces task errors, increases available attentional resources, improves BCI performance and thus facilitates human-computer interaction.
Population-based learning of load balancing policies for a distributed computer system
NASA Technical Reports Server (NTRS)
Mehra, Pankaj; Wah, Benjamin W.
1993-01-01
Effective load-balancing policies use dynamic resource information to schedule tasks in a distributed computer system. We present a novel method for automatically learning such policies. At each site in our system, we use a comparator neural network to predict the relative speedup of an incoming task using only the resource-utilization patterns obtained prior to the task's arrival. Outputs of these comparator networks are broadcast periodically over the distributed system, and the resource schedulers at each site use these values to determine the best site for executing an incoming task. The delays incurred in propagating workload information and tasks from one site to another, as well as the dynamic and unpredictable nature of workloads in multiprogrammed multiprocessors, may cause the workload pattern at the time of execution to differ from patterns prevailing at the times of load-index computation and decision making. Our load-balancing policy accommodates this uncertainty by using certain tunable parameters. We present a population-based machine-learning algorithm that adjusts these parameters in order to achieve high average speedups with respect to local execution. Our results show that our load-balancing policy, when combined with the comparator neural network for workload characterization, is effective in exploiting idle resources in a distributed computer system.
Coiera, E
2016-11-10
Anyone with knowledge of information systems has experienced frustration when it comes to system implementation or use. Unanticipated challenges arise frequently and unanticipated consequences may follow. This article works from first principles to understand why information technology (IT) is often challenging, to identify which IT endeavors are more likely to succeed, and to predict the best role that technology can play in different tasks and settings. The fundamental purpose of IT is to enhance our ability to undertake tasks, supplying new information that changes what we decide and ultimately what occurs in the world. The value of this information (VOI) can be calculated at different stages of the decision-making process and will vary depending on how technology is used. We can imagine a task space that describes the relative benefits of task completion by humans or computers and that contains specific areas where humans or computers are superior. There is a third area where neither is strong and a final joint workspace where humans and computers working in partnership produce the best results. By understanding that information has value and that VOI can be quantified, we can make decisions about how best to support the work we do. Evaluation of the expected utility of task completion by humans or computers should allow us to decide whether solutions should depend on technology, humans, or a partnership between the two.
Applied Computational Electromagnetics Society Journal, Volume 9, Number 2
1994-07-01
input/output standardization; code or technique optimization and error minimization; innovations in solution technique or in data input/output...The Applied Computational Electromagnetics Society Journal editors: W. Perry Wheless (Editor-in-Chief/ACES, Editor-in-Chief/Journal Managing Editor)...Adalbert Konrad and Paul P. Biringer, Department of Electrical and Computer Engineering, University of Toronto, Toronto, Ontario, Canada M5S 1A4
Francescatto, Margherita; Hermans, Susanne M A; Babaei, Sepideh; Vicedo, Esmeralda; Borrel, Alexandre; Meysman, Pieter
2015-01-01
In this meeting report, we give an overview of the talks, presentations and posters presented at the third European Symposium of the International Society for Computational Biology (ISCB) Student Council. The event was organized as a satellite meeting of the 13th European Conference for Computational Biology (ECCB) and took place in Strasbourg, France on September 6th, 2014.
Computational toxicity in 21st century safety sciences (China ...
Presentation at the Joint Meeting of the Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China.
Farias Zuniga, Amanda M; Côté, Julie N
2017-06-01
The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computer sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain (p < .01) and right upper trapezius RMS (p = .03) increased significantly over time regardless of workstation. Right cervical erector spinae RMS and cervical NMI were smaller, while degrees of overshoot (mean = 4.15°) and end position error (mean = 1.26°) were larger in DualMon regardless of time. Effects on muscle activity were more pronounced in males, whereas effects on proprioception were more pronounced in females. Results suggest that compared to laptop, DualMon work is effective in reducing cervical muscle activity, dissociating cervical connectivity, and maintaining more typical neck repositioning patterns, suggesting some health-protective effects. This evidence could be considered when deciding on computer workstation designs.
Resource Guide to Careers in Toxicology, 3rd Edition.
ERIC Educational Resources Information Center
Society of Toxicology, Reston, VA.
This resource guide was prepared by the Tox 90's Educational Issues Task Force of the Society of Toxicology. The introduction provides information on the Society of Toxicology and financial support for graduate students in toxicology. Other sections include career opportunities in toxicology, academic and postdoctoral programs in toxicology, and…
ERIC Educational Resources Information Center
Griswold, Wendy
2017-01-01
Future professionals will bear the brunt of creating sustainable societies. Equipping them for the task is the challenge of current educators. Educational experiences facilitating the development of sustainable habits of mind are needed. This research reports on the experiences of developing scientists and engineers engaged in a sustainable energy…
IOM committee members respond to Endocrine Society vitamin D guideline
USDA-ARS?s Scientific Manuscript database
In early 2011, a committee convened by the Institute of Medicine issued a report on the Dietary Reference Intakes for calcium and vitamin D. The Endocrine Society Task Force in July 2011 published a guideline for the evaluation, treatment, and prevention of vitamin D deficiency. Although these repor...
Who's in Charge Here: The What and How of Leadership.
ERIC Educational Resources Information Center
Osborne, W. Larry
Our rapidly changing society will need individuals trained and skilled in leadership. Leadership is a process designed to maximize individual contributions to organizations and society. Research has shown that there is no one way of exerting leadership. Leadership behavior can be grouped into two categories: task behavior and relationship…
ERIC Educational Resources Information Center
Meyer, David E.; Kieras, David E.
Perceptual-motor and cognitive processes whereby people perform multiple concurrent tasks have been studied through an overlapping-tasks procedure in which two successive choice-reaction tasks are performed with a variable interval (stimulus onset asynchrony, or SOA) between the beginning of the first and second tasks. The increase in subjects'…
Improving multi-tasking ability through action videogames.
Chiappe, Dan; Conger, Mark; Liao, Janet; Caldwell, J Lynn; Vu, Kim-Phuong L
2013-03-01
The present study examined whether action videogames can improve multi-tasking in high workload environments. Two groups with no action videogame experience were pre-tested using the Multi-Attribute Task Battery (MATB). It consists of two primary tasks, tracking and fuel management, and two secondary tasks, systems monitoring and communication. One group served as a control group, while a second played action videogames a minimum of 5 h a week for 10 weeks. Both groups returned for a post-assessment on the MATB. We found the videogame treatment enhanced performance on the secondary tasks without interfering with the primary tasks. Our results demonstrate that action videogames can increase people's ability to take on additional tasks by increasing attentional capacity. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Baptiste, Caitlin D; Buckley de Meritens, Alexandre; Jones, Nathaniel L; Chatterjee Paer, Sudeshna; Tergas, Ana I; Hou, June Y; Wright, Jason D; Burke, William M
Laparoscopic port site metastases (PSMs) have an incidence of 0.5% to 2%. The management of an isolated PSM (iPSM), without evidence of recurrence elsewhere, remains unclear. The aim of this study was to elucidate practices regarding iPSMs. A 23-item survey was created using commercially available survey software. Over the course of January 2016, the survey was e-mailed to the members of the Society of Gynecologic Oncology with 2 follow-up reminder e-mails. (Canadian Task Force classification III.) Setting: online survey. Of the 709 surveys sent, 132 were returned. Providers practicing for <5 years saw fewer PSMs, and those who performed more minimally invasive surgeries (MISs) saw more PSMs. Comparing providers who have or have not seen PSMs, no differences in pneumoinsufflation pressure, the mode of delivery of the specimen, the use of local anesthesia at port site incisions, or the method of deflation were seen. If an iPSM was suspected, most providers indicated they would obtain imaging (computed tomography, 51%, or positron emission tomography/computed tomography, 43%) followed by an interventional radiology-guided biopsy (29%) or resection of the mass. The tendency is to treat by surgically resecting the lesion, followed by adjuvant therapy. After controlling for time in practice, we did not find a strong risk factor for iPSMs other than performing >75% of oncologic surgeries by MIS. Most respondents performed imaging when suspecting iPSMs and used systemic adjuvant therapy after confirming iPSMs. Copyright © 2017 AAGL. Published by Elsevier Inc. All rights reserved.
The employment of a spoken language computer applied to an air traffic control task.
NASA Technical Reports Server (NTRS)
Laveson, J. I.; Silver, C. A.
1972-01-01
Assessment of the merits of a limited spoken language (56 words) computer in a simulated air traffic control (ATC) task. An airport zone approximately 60 miles in diameter with a traffic flow simulation ranging from single-engine to commercial jet aircraft provided the workload for the controllers. This research determined that, under the circumstances of the experiments carried out, the use of a spoken-language computer would not improve the controller performance.
Productivity associated with visual status of computer users.
Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W
2004-01-01
The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
Refueling Strategies for a Team of Cooperating AUVs
2011-01-01
manager, and thus the constraint a centrally managed underwater network imposes on the mission. Task management utilizing Robust Decentralized Task...the computational complexity. A bid-based approach to task management has also been studied as a possible means of decentralization of group task...currently performing another task. In [18], ground robots perform distributed task allocation using the ASyMTRy-D algorithm, which is based on CNP
Fitness costs of worker specialization for ant societies
Jongepier, Evelien; Foitzik, Susanne
2016-01-01
Division of labour is of fundamental importance for the success of societies, yet little is known about how individual specialization affects the fitness of the group as a whole. While specialized workers may be more efficient in the tasks they perform than generalists, they may also lack the flexibility to respond to rapid shifts in task needs. Such rigidity could impose fitness costs when societies face dynamic and unpredictable events, such as an attack by socially parasitic slavemakers. Here, we experimentally assess the colony-level fitness consequences of behavioural specialization in Temnothorax longispinosus ants that are attacked by the slavemaker ant T. americanus. We manipulated the social organization of 102 T. longispinosus colonies, based on the behavioural responses of all 3842 workers. We find that strict specialization is disadvantageous for a colony's annual reproduction and growth during slave raids. These fitness costs may favour generalist strategies in dynamic environments, as we also demonstrate that societies exposed to slavemakers in the field show a lower degree of specialization than those originating from slavemaker-free populations. Our findings provide an explanation for the ubiquity of generalists and highlight their importance for the flexibility and functional robustness of entire societies.
Bridging the Gap Between Surveyors and the Geo-Spatial Society
NASA Astrophysics Data System (ADS)
Müller, H.
2016-06-01
For many years FIG, the International Association of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, with the geospatial industries in particular. Traditionally the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. The list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.
Efficiency of the human observer detecting random signals in random backgrounds
Park, Subok; Clarkson, Eric; Kupinski, Matthew A.; Barrett, Harrison H.
2008-01-01
The efficiencies of the human observer and the channelized-Hotelling observer relative to the ideal observer for signal-detection tasks are discussed. Both signal-known-exactly (SKE) tasks and signal-known-statistically (SKS) tasks are considered. Signal location is uncertain for the SKS tasks, and lumpy backgrounds are used for background uncertainty in both cases. Markov chain Monte Carlo methods are employed to determine ideal-observer performance on the detection tasks. Psychophysical studies are conducted to compute human-observer performance on the same tasks. Efficiency is computed as the squared ratio of the detectabilities of the observer of interest to the ideal observer. Human efficiencies are approximately 2.1% and 24%, respectively, for the SKE and SKS tasks. The results imply that human observers are not affected as much as the ideal observer by signal-location uncertainty even though the ideal observer outperforms the human observer for both tasks. Three different simplified pinhole imaging systems are simulated, and the humans and the model observers rank the systems in the same order for both the SKE and the SKS tasks. PMID:15669610
Akkas, Oguz; Lee, Cheng Hsien; Hu, Yu Hen; Harris Adamson, Carisa; Rempel, David; Radwin, Robert G
2017-12-01
Two computer vision algorithms were developed to automatically estimate exertion time, duty cycle (DC) and hand activity level (HAL) from videos of workers performing 50 industrial tasks. The average DC difference between manual frame-by-frame analysis and the computer vision DC was -5.8% for the Decision Tree (DT) algorithm, and 1.4% for the Feature Vector Training (FVT) algorithm. The average HAL difference was 0.5 for the DT algorithm and 0.3 for the FVT algorithm. A sensitivity analysis, conducted to examine the influence that deviations in DC have on HAL, found it remained unaffected when DC error was less than 5%. Thus, a DC error less than 10% will impact HAL less than 0.5 HAL, which is negligible. Automatic computer vision HAL estimates were therefore comparable to manual frame-by-frame estimates. Practitioner Summary: Computer vision was used to automatically estimate exertion time, duty cycle and hand activity level from videos of workers performing industrial tasks.
MAX - An advanced parallel computer for space applications
NASA Technical Reports Server (NTRS)
Lewis, Blair F.; Bunker, Robert L.
1991-01-01
MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous, event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic location of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test, and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
Tusche, Anita; Böckler, Anne; Kanske, Philipp; Trautwein, Fynn-Mathis; Singer, Tania
2016-04-27
Altruistic behavior varies considerably across people and decision contexts. The relevant computational and motivational mechanisms that underlie its heterogeneity, however, are poorly understood. Using a charitable giving task together with multivariate decoding techniques, we identified three distinct psychological mechanisms underlying altruistic decision-making (empathy, perspective taking, and attentional reorienting) and linked them to dissociable neural computations. Neural responses in the anterior insula (AI) (but not temporoparietal junction [TPJ]) encoded trial-wise empathy for beneficiaries, whereas the TPJ (but not AI) predicted the degree of perspective taking. Importantly, the relative influence of both socio-cognitive processes differed across individuals: participants whose donation behavior was heavily influenced by affective empathy exhibited higher predictive accuracies for generosity in AI, whereas those who strongly relied on cognitive perspective taking showed improved predictions of generous donations in TPJ. Furthermore, subject-specific contributions of both processes for donations were reflected in participants' empathy and perspective taking responses in a separate fMRI task (EmpaToM), suggesting that process-specific inputs into altruistic choices may reflect participants' general propensity to either empathize or mentalize. Finally, using independent attention task data, we identified shared neural codes for attentional reorienting and generous donations in the posterior superior temporal sulcus, suggesting that domain-general attention shifts also contribute to generous behavior (but not in TPJ or AI). Overall, our findings demonstrate highly specific roles of AI for affective empathy and TPJ for cognitive perspective taking as precursors of prosocial behavior and suggest that these discrete routes of social cognition differentially drive intraindividual and interindividual differences in altruistic behavior. Human societies depend on the altruistic behavior of their members, but teasing apart its underlying motivations and neural mechanisms poses a serious challenge. Using multivariate decoding techniques, we delineated three distinct processes for altruistic decision-making (affective empathy, cognitive perspective taking, and domain-general attention shifts), linked them to dissociable neural computations, and identified their relative influence across individuals. Distinguishing process-specific computations both behaviorally and neurally is crucial for developing complete theoretical and neuroscientific accounts of altruistic behavior and more effective means of increasing it. Moreover, information on the relative influence of subprocesses across individuals and its link to people's more general propensity to engage empathy or perspective taking can inform training programs to increase prosociality, considering their "fit" with different individuals. Copyright © 2016 the authors 0270-6474/16/364719-14$15.00/0.
Differences in muscle load between computer and non-computer work among office workers.
Richter, J M; Mathiassen, S E; Slijper, H P; Over, E A B; Frens, M A
2009-12-01
Introduction of more non-computer tasks has been suggested to increase exposure variation and thus reduce musculoskeletal complaints (MSC) in computer-intensive office work. This study investigated whether muscle activity did, indeed, differ between computer and non-computer activities. Whole-day logs of input device use in 30 office workers were used to identify computer and non-computer work, using a range of classification thresholds (non-computer thresholds (NCTs)). Exposure during these activities was assessed by bilateral electromyography recordings from the upper trapezius and lower arm. Contrasts in muscle activity between computer and non-computer work were distinct but small, even at the individualised, optimal NCT. Using an average group-based NCT resulted in less contrast, even in smaller subgroups defined by job function or MSC. Thus, computer activity logs should be used cautiously as proxies of biomechanical exposure. Conventional non-computer tasks may have a limited potential to increase variation in muscle activity during computer-intensive office work.
Visser, Bart; De Looze, Michiel; De Graaff, Matthijs; Van Dieën, Jaap
2004-02-05
The objective of the present study was to gain insight into the effects of precision demands and mental pressure on the load of the upper extremity. Two computer mouse tasks were used: an aiming and a tracking task. Upper extremity loading was operationalized as the myo-electric activity of the wrist flexor and extensor and of the trapezius descendens muscles and the applied grip- and click-forces on the computer mouse. Performance measures, reflecting the accuracy in both tasks and the clicking rate in the aiming task, indicated that the levels of the independent variables resulted in distinguishable levels of accuracy and work pace. Precision demands had a small effect on upper extremity loading with a significant increase in the EMG-amplitudes (21%) of the wrist flexors during the aiming tasks. Precision had large effects on performance. Mental pressure had substantial effects on EMG-amplitudes with an increase of 22% in the trapezius when tracking and increases of 41% in the trapezius and 45% and 140% in the wrist extensors and flexors, respectively, when aiming. During aiming, grip- and click-forces increased by 51% and 40% respectively. Mental pressure had small effects on accuracy but large effects on tempo during aiming. Precision demands and mental pressure in aiming and tracking tasks with a computer mouse were found to coincide with increased muscle activity in some upper extremity muscles and increased force exertion on the computer mouse. Mental pressure caused significant effects on these parameters more often than precision demands. Precision and mental pressure were found to have effects on performance, with precision effects being significant for all performance measures studied and mental pressure effects for some of them. The results of this study suggest that precision demands and mental pressure increase upper extremity load, with mental pressure effects being larger than precision effects. The possible role of precision demands as an indirect mental stressor in working conditions is discussed.
Evaluating a Computerized Aid for Conducting a Cognitive Task Analysis
2000-01-01
…in conducting a cognitive task analysis. The conduct of a cognitive task analysis is costly and labor intensive. As a result, a few computerized aids… evaluation of a computerized aid, specifically CAT-HCI (Cognitive Analysis Tool - Human Computer Interface), for the conduct of a detailed cognitive task analysis.
Effects on Training Using Illumination in Virtual Environments
NASA Technical Reports Server (NTRS)
Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian
1999-01-01
Camera-based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high-contrast shadowing and glare, are a factor in performance. Computer-based training using virtual environments is a common tool used to make and keep crew members proficient. If computer-based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer-based training increases proficiency if one trains for a camera-based task using computer-generated virtual environments with enhanced lighting conditions such as shadows and glare, rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system. Test subjects had to align a boresight camera using a hand controller with one axis of translation and one axis of rotation. Two sets of subjects were trained on two computer simulations using computer-generated virtual environments, one with lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.
An Execution Service for Grid Computing
NASA Technical Reports Server (NTRS)
Smith, Warren; Hu, Chaumin
2004-01-01
This paper describes the design and implementation of the IPG Execution Service that reliably executes complex jobs on a computational grid. Our Execution Service is part of the IPG service architecture, whose goal is to support location-independent computing. In such an environment, once a user ports an application to one or more hardware/software platforms, the user can describe this environment to the grid; the grid can locate instances of this platform, configure the platform as required for the application, and then execute the application. Our Execution Service runs jobs that set up such environments for applications and executes them. These jobs consist of a set of tasks for executing applications and managing data. The tasks have user-defined starting conditions that allow users to specify complex dependencies, including tasks to execute when tasks fail (a frequent occurrence in a large distributed system) or are cancelled. The execution task provided by our service also configures the application environment exactly as specified by the user and captures the exit code of the application, features that many grid execution services do not support due to difficulties interfacing to local scheduling systems.
NASA Technical Reports Server (NTRS)
Juang, Hann-Ming Henry; Tao, Wei-Kuo; Zeng, Xi-Ping; Shie, Chung-Lin; Simpson, Joanne; Lang, Steve
2004-01-01
The capability for massively parallel programming (MPP) using a message passing interface (MPI) has been implemented into a three-dimensional version of the Goddard Cumulus Ensemble (GCE) model. The design for the MPP with MPI uses the concept of maintaining similar code structure between the whole domain and the portions after decomposition. Hence the model follows the same integration for single and multiple tasks (CPUs). It also requires minimal changes to the original code, so it is easily modified and/or managed by model developers and users who have little knowledge of MPP. The entire model domain can be sliced into a one- or two-dimensional decomposition with a halo regime, which is overlaid on partial domains. The halo regime requires that no data be fetched across tasks during the computational stage, but it must be updated before the next computational stage through data exchange via MPI. For reproducibility, transposing data among tasks is required for the spectral transform (Fast Fourier Transform, FFT), which is used in the anelastic version of the model for solving the pressure equation. The performance of the MPI-implemented codes (i.e., the compressible and anelastic versions) was tested on three different computing platforms. The major results are: 1) both versions have speedups of about 99% up to 256 tasks but not for 512 tasks; 2) the anelastic version has better speedup and efficiency because it requires more computation than the compressible version; 3) equal or approximately equal numbers of slices between the x- and y-directions provide the fastest integration due to fewer data exchanges; and 4) one-dimensional slices in the x-direction result in the slowest integration due to the need for more memory relocation for computation.
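A halo update of the kind described, for a one-dimensional decomposition, can be sketched with mpi4py (an assumed stand-in; the GCE code itself is not shown in the abstract). Each task exchanges its boundary cells with its neighbours before the next computational stage.

    # Minimal 1-D halo-exchange sketch; run with e.g. "mpiexec -n 4 python halo.py"
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    left, right = (rank - 1) % size, (rank + 1) % size  # periodic neighbours

    n = 8                        # interior points owned by this task
    u = np.zeros(n + 2)          # one halo cell on each side
    u[1:-1] = rank               # dummy interior data

    # Update halos before the next computational stage:
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # A stencil can now be applied to u[1:-1] using fresh halo values.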
NASA Technical Reports Server (NTRS)
Tencati, Ron
1991-01-01
An overview is presented of the NASA Science Internet (NSI) security task. The task includes the following: policies and security documentation; risk analysis and management; computer emergency response team; incident handling; toolkit development; user consulting; and working groups, conferences, and committees.
NASA Astrophysics Data System (ADS)
Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.
2009-10-01
A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.
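The Fitts'-law quantities involved are straightforward to compute. A minimal sketch using the common formulation ID = log2(2D/W), with throughput as ID/MT (the trial numbers below are hypothetical, not the study's data):

    import numpy as np

    def index_of_difficulty(distance, width):
        # Fitts' index of difficulty in bits: ID = log2(2D / W)
        return np.log2(2.0 * distance / width)

    def throughput(distance, width, movement_time):
        # Information transfer rate in bits/s: ID / MT
        return index_of_difficulty(distance, width) / movement_time

    # Hypothetical trial: cursor travels 200 px to a 50 px target in 1.5 s
    print(index_of_difficulty(200, 50))   # 3.0 bits
    print(throughput(200, 50, 1.5))       # 2.0 bits/s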
High-Performance Data Analysis Tools for Sun-Earth Connection Missions
NASA Technical Reports Server (NTRS)
Messmer, Peter
2011-01-01
The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
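The task-farm pattern that TaskDL manages, independent tasks with little data communication, can be illustrated generically with Python's standard library (this is the pattern only, not TaskDL's or mpiDL's actual API):

    from multiprocessing import Pool

    def analyze(frame_id):
        # Stand-in for an independent data-analysis task (e.g., one image)
        return frame_id, frame_id ** 2   # dummy result

    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            # Independent tasks, processed in parallel: a task farm
            for frame, result in pool.imap_unordered(analyze, range(16)):
                print(frame, result)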
A neuronal model of a global workspace in effortful cognitive tasks.
Dehaene, S; Kerszberg, M; Changeux, J P
1998-11-24
A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.
Characterizing and Mitigating Work Time Inflation in Task Parallel Programs
Olivier, Stephen L.; de Supinski, Bronis R.; Schulz, Martin; ...
2013-01-01
Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation – additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.
NASA Technical Reports Server (NTRS)
Sword, A. J.; Park, W. T.
1975-01-01
A teleoperator system with a computer for manipulator control to combine the capabilities of both man and computer to accomplish a task is described. This system allows objects in unpredictable locations to be successfully located and acquired. By using a method of characterizing the work-space together with man's ability to plan a strategy and coarsely locate an object, the computer is provided with enough information to complete the tedious part of the task. In addition, the use of voice control is shown to be a useful component of the man/machine interface.
2015-01-01
In this meeting report, we give an overview of the talks, presentations and posters presented at the third European Symposium of the International Society for Computational Biology (ISCB) Student Council. The event was organized as a satellite meeting of the 13th European Conference for Computational Biology (ECCB) and took place in Strasbourg, France on September 6th, 2014. PMID:25708611
Report of the Task Force on Human Rights.
ERIC Educational Resources Information Center
National Education Association, Washington, DC.
The NEA Task Force was instructed to "recommend to the Executive Committee a structure and program for the coordination and expansion of the human rights activities of the NEA and of the departments, divisions, commissions, and committees." Their recommendations and a discussion of the forces in American society that make them necessary comprise…
Optical Characterization of Wide Field-of-View Night Vision Devices
1999-01-01
This paper has been cleared by ASC 99-2354. Optical Characterization of Wide Field-of-View Night Vision Devices. Peter L. Marasco and H. Lee Task, Air… the SAFE Society's 36th Annual Symposium. Task, H.L., Hartman, R., Marasco, P.L., Zobel, A. (1993) Methods for measuring characteristics of night…
Persistent Poverty in Rural America. Rural Studies Series.
ERIC Educational Resources Information Center
Rural Sociological Society, Bozeman, MT.
In this volume, the Rural Sociological Society Task Force on Persistent Rural Poverty analyzes the leading explanations of persistent rural poverty and points out new directions in theory that should provide a firmer foundation for antipoverty policies and programs. Written by over 50 leading social scientists, the Task Force report explains that…
A Reawakening: Character Education and the Role of the School Board Member.
ERIC Educational Resources Information Center
California School Boards Association, Sacramento.
The California School Boards Association (CSBA) established a task force to define the term "character education" and to clarify the needs of the public schools for curricula and instructional materials supporting character education. This report synthesizes the results of the task force's efforts. The failure of U.S. society's formal…
Response to the Task Force on School Governance.
ERIC Educational Resources Information Center
Denoyer, Richard A.
Although the Task Force on School Governance report claims that restructuring of school boards is essential to save the nation's failing schools, the real failure is society itself. Societal problems such as the nation's $4 trillion debt, air and water pollution, crime, drug abuse, and special interest lobbies abound, and legislators'…
Developmental Tasks of Older Female Students in Undergraduate Education.
ERIC Educational Resources Information Center
Eckard, Pamela J.
The purpose of this study was to determine if there are developmental tasks unique to the older female student returning to undergraduate school. These students are attempting to meet obligations to family, society, and self, while engaging in educational pursuits often experienced by others before assuming family or income-producing obligations;…
Task-Based Oral Computer-Mediated Communication and L2 Vocabulary Acquisition
ERIC Educational Resources Information Center
Yanguas, Inigo
2012-01-01
The present study adds to the computer-mediated communication (CMC) literature by exploring oral learner-to-learner interaction using Skype, a free and widely used Internet software program. In particular, this task-based study has a two-fold goal. Firstly, it explores possible differences between two modes of oral CMC (audio and video) and…
ERIC Educational Resources Information Center
Flannery, Kathleen A.; Malita, Mihaela
2014-01-01
We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…
Energy and Power Aware Computing Through Management of Computational Entropy
2008-01-01
…2.4.1 ACIP living framework forum task… This research focused on two sub-tasks: (1) assessing the need and planning for a potential "Living Framework Forum" (LFF) software architecture… probabilistic switching with plausible device realizations to save energy in our patent application [35]. In [35], we showed an introverted switch in…
Computer-Mediated Training Tools to Enhance Joint Task Force Cognitive Leadership Skills
2007-04-01
ERIC Educational Resources Information Center
Hsiao, Janet H.; Lam, Sze Man
2013-01-01
Through computational modeling, here we examine whether visual and task characteristics of writing systems alone can account for lateralization differences in visual word recognition between different languages without assuming influence from left hemisphere (LH) lateralized language processes. We apply a hemispheric processing model of face…
Web-Based Seamless Migration for Task-Oriented Mobile Distance Learning
ERIC Educational Resources Information Center
Zhang, Degan; Li, Yuan-chao; Zhang, Huaiyu; Zhang, Xinshang; Zeng, Guangping
2006-01-01
As a new kind of computing paradigm, pervasive computing will meet the requirement that anyone can obtain services anywhere and at any time; task-oriented seamless migration is one of its applications. Apparently, the function of seamless mobility is suitable for mobile services, such as mobile Web-based learning. In this…
Using Higher Order Computer Tasks with Disadvantaged Students.
ERIC Educational Resources Information Center
Anderson, Neil
A pilot program initially designed for a 12-year-old girl with mild to moderate intellectual disabilities in higher order computer tasks was developed for a larger group of students with similar disabilities enrolled in fifth and sixth grades (ages 9-12) at three different schools. An examination of the original pilot study was undertaken to…
ERIC Educational Resources Information Center
Shintani, Natsuko
2016-01-01
This case study investigated the characteristics of computer-mediated synchronous corrective feedback (SCF, provided while students wrote) and asynchronous corrective feedback (ACF, provided after students had finished writing) in an EFL writing task. The task, designed to elicit the use of the hypothetical conditional, was completed by two…
Task-specific image partitioning.
Kim, Sungwoong; Nowozin, Sebastian; Kohli, Pushmeet; Yoo, Chang D
2013-02-01
Image partitioning is an important preprocessing step for many of the state-of-the-art algorithms used for performing high-level computer vision tasks. Typically, partitioning is conducted without regard to the task at hand. We propose a task-specific image partitioning framework to produce a region-based image representation that will lead to higher task performance than that reached using any task-oblivious partitioning framework or the existing supervised partitioning frameworks, which are few in number. The proposed method partitions the image by means of correlation clustering, maximizing a linear discriminant function defined over a superpixel graph. The parameters of the discriminant function that define task-specific similarity/dissimilarity among superpixels are estimated based on a structured support vector machine (S-SVM) using task-specific training data. The S-SVM learning leads to a better generalization ability, while the construction of the superpixel graph used to define the discriminant function allows a rich set of features to be incorporated to improve discriminability and robustness. We evaluate the learned task-aware partitioning algorithms on three benchmark datasets. Results show that task-aware partitioning leads to better labeling performance than the partitioning computed by the state-of-the-art general-purpose and supervised partitioning algorithms. We believe that the task-specific image partitioning paradigm is widely applicable to improving performance in high-level image understanding tasks.
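Correlation clustering itself is NP-hard; a common greedy heuristic merges superpixels joined by positive-affinity edges, strongest first. The sketch below assumes the edge weights are already given (in the paper they come from the S-SVM-learned discriminant, which is not reproduced here):

    class DisjointSet:
        def __init__(self, n):
            self.parent = list(range(n))
        def find(self, i):
            while self.parent[i] != i:
                self.parent[i] = self.parent[self.parent[i]]
                i = self.parent[i]
            return i
        def union(self, i, j):
            self.parent[self.find(i)] = self.find(j)

    def greedy_correlation_clustering(n_superpixels, weighted_edges):
        # weighted_edges: (weight, u, v); weight > 0 means "same region",
        # weight < 0 means "different regions". Greedy merge, strongest first.
        ds = DisjointSet(n_superpixels)
        for weight, u, v in sorted(weighted_edges, reverse=True):
            if weight > 0:
                ds.union(u, v)
        return [ds.find(i) for i in range(n_superpixels)]

    edges = [(0.9, 0, 1), (0.4, 1, 2), (-0.7, 2, 3), (0.6, 3, 4)]
    print(greedy_correlation_clustering(5, edges))   # e.g. [2, 2, 2, 4, 4]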
Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors
2012-05-02
…function of Legendre-type on int(dom S) [29]. From (7) the following properties of dφ(x, y) are apparent: strict convexity in x; asymmetry; non… tensor imaging. An important task in all of these applications is to compute the distance between covariance matrices using a (dis)similarity function, for which the natural…
Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach
NASA Technical Reports Server (NTRS)
Mak, Victor W. K.
1986-01-01
Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
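The hierarchical-decomposition idea can be illustrated on the deterministic skeleton of a series-parallel task system: serial stages add, parallel branches take the slowest branch. The paper's procedure computes queueing-network performance measures with contention; the sketch below shows only the reduction structure, with invented task durations.

    def reduce_sp(node):
        # node is ("task", duration), ("series", [children]), or
        # ("parallel", [children]). Serial stages add; parallel branches
        # take the slowest branch. (Queueing effects are not modeled here.)
        kind, payload = node
        if kind == "task":
            return payload
        times = [reduce_sp(child) for child in payload]
        return sum(times) if kind == "series" else max(times)

    graph = ("series", [("task", 2.0),
                        ("parallel", [("task", 3.0), ("task", 5.0)]),
                        ("task", 1.0)])
    print(reduce_sp(graph))   # 2 + 5 + 1 = 8.0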
Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.
Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda
2011-03-15
We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
ERIC Educational Resources Information Center
Tanaka, Ayumi; Takehara, Takuma; Yamauchi, Hirotsugu
2006-01-01
The aims of the study were to test the linkages between achievement goals and task performance, as mediated by state anxiety arousal. Performance expectancy was also examined as an antecedent of achievement goals. A presentation task in a computer practice class was used as the achievement task. Fifty-three undergraduates (37 females and 16 males) were…
Sort-Mid tasks scheduling algorithm in grid computing.
Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M
2015-11-01
Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. The main aim of several researchers is to develop variant scheduling algorithms for achieving optimality, and they have shown good performance for task scheduling regarding resource selection. However, using the full power of resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to get the average value via the sorted list of completion times of each task. Then, the maximum average is obtained. Finally, the task that has the maximum average is allocated to the machine that has the minimum completion time. The allocated task is removed, and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms most other algorithms in terms of resource utilization and makespan.
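A literal reading of those steps can be sketched as follows. The expected-time-to-compute matrix is a hypothetical input, and details such as whether machine ready times enter the completion-time lists are assumptions, not taken from the paper.

    import numpy as np

    def sort_mid(etc):
        # etc[i, j] = estimated time to compute task i on machine j
        # (hypothetical input). Returns a task -> machine mapping and makespan.
        n_tasks, n_machines = etc.shape
        ready = np.zeros(n_machines)          # machine available times
        unassigned, mapping = set(range(n_tasks)), {}
        while unassigned:
            tasks = sorted(unassigned)
            completion = etc[tasks] + ready   # completion time on each machine
            # Average of each task's completion-time list (sorting it first,
            # as the text describes, leaves the mean unchanged).
            averages = completion.mean(axis=1)
            pick = tasks[int(np.argmax(averages))]        # max average
            machine = int(np.argmin(etc[pick] + ready))   # min completion
            mapping[pick] = machine
            ready[machine] += etc[pick, machine]
            unassigned.remove(pick)
        return mapping, ready.max()

    etc = np.array([[4.0, 7.0], [2.0, 3.0], [8.0, 5.0]])
    print(sort_mid(etc))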
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-08-30
Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., Internet of Things, Cyber-Physical Systems, Big Data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data of each task have been analyzed, with a detailed analysis report being made. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
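A generic two-phase (piecewise-linear) regression of task progress against time, which is one plausible reading of TPR rather than the paper's exact method, can be sketched as follows:

    import numpy as np

    def two_phase_fit(t, progress):
        # Fit progress(t) with two line segments; pick the breakpoint
        # that minimizes the total squared error.
        best = None
        for k in range(2, len(t) - 2):          # candidate breakpoints
            p1 = np.polyfit(t[:k], progress[:k], 1)
            p2 = np.polyfit(t[k:], progress[k:], 1)
            err = (np.sum((np.polyval(p1, t[:k]) - progress[:k]) ** 2) +
                   np.sum((np.polyval(p2, t[k:]) - progress[k:]) ** 2))
            if best is None or err < best[0]:
                best = (err, p1, p2, k)
        return best

    def predict_finish(p2):
        # Extrapolate the second phase to progress == 1.0 (task complete)
        slope, intercept = p2
        return (1.0 - intercept) / slope

    t = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8.0])
    prog = np.array([0, .05, .1, .15, .2, .35, .5, .65, .8])  # toy trace
    err, p1, p2, k = two_phase_fit(t, prog)
    print(predict_finish(p2))   # estimated completion time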
Summary of synfuel characterization and combustion studies
NASA Technical Reports Server (NTRS)
Schultz, D. F.
1983-01-01
Combustion component research studies aimed at evolving environmentally acceptable approaches to burning coal-derived fuels for ground power applications were performed at the NASA Lewis Research Center under a program titled the "Critical Research and Support Technology Program" (CRT). The work was funded by the Department of Energy and was performed in four tasks. This report summarizes these tasks, which have all been previously reported. In addition, some previously unreported data from Task 4 are also presented. Task 1 consisted of a literature survey aimed at determining the properties of synthetic fuels. This was followed by a computer modeling effort, Task 2, to predict the exhaust emissions resulting from burning coal liquids by various combustion techniques such as lean and rich-lean combustion. The computer predictions were then compared to the results of a flame tube rig, Task 3, in which the fuel properties were varied to simulate coal liquids. Two actual SRC 2 coal liquids were tested in this flame tube task.
Academic physicians' assessment of the effects of computers on health care.
Detmer, W. M.; Friedman, C. P.
1994-01-01
We assessed the attitudes of academic physicians towards computers in health care at two academic medical centers that are in the early stages of clinical information-system deployment. We distributed a 4-page questionnaire to 470 subjects, and a total of 272 physicians (58%) responded. Our results show that respondents use computers frequently, primarily to perform academic-oriented tasks as opposed to clinical tasks. Overall, respondents viewed computers as being slightly beneficial to health care. They perceive self-education and access to up-to-date information as the most beneficial aspects of computers and are most concerned about privacy issues and the effect of computers on the doctor-patient relationship. Physicians with prior computer training and greater knowledge of informatics concepts had more favorable attitudes towards computers in health care. We suggest that negative attitudes towards computers can be addressed by careful system design as well as targeted educational activities. PMID:7949990
Ensemble perception in autism spectrum disorder: Member-identification versus mean-discrimination.
Van der Hallen, Ruth; Lemmens, Lisa; Steyaert, Jean; Noens, Ilse; Wagemans, Johan
2017-07-01
To efficiently represent the outside world our brain compresses sets of similar items into a summarized representation, a phenomenon known as ensemble perception. While most studies on ensemble perception investigate this perceptual mechanism in typically developing (TD) adults, more recently, researchers studying perceptual organization in individuals with autism spectrum disorder (ASD) have turned their attention toward ensemble perception. The current study is the first to investigate the use of ensemble perception for size in children with and without ASD (N = 42, 8-16 years). We administered a pair of tasks pioneered by Ariely [2001] evaluating both member-identification and mean-discrimination. In addition, we varied the distribution types of our sets to allow a more detailed evaluation of task performance. Results show that, overall, both groups performed similarly in the member-identification task, a test of "local perception," and similarly in the mean-discrimination task, a test of "gist perception." However, in both tasks performance of the TD group was affected more strongly by the degree of stimulus variability in the set than performance of the ASD group. These findings indicate that both TD children and children with ASD use ensemble statistics to represent a set of similar items, illustrating the fundamental nature of ensemble coding in visual perception. Differences in sensitivity to stimulus variability between both groups are discussed in relation to recent theories of information processing in ASD (e.g., increased sampling, decreased priors, increased precision). Autism Res 2017, 10: 1291-1299. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.
Computer vision camera with embedded FPGA processing
NASA Astrophysics Data System (ADS)
Lecerf, Antoine; Ouellet, Denis; Arias-Estrada, Miguel
2000-03-01
Traditional computer vision is based on a camera-computer system in which the image understanding algorithms are embedded in the computer. To circumvent the computational load of vision algorithms, low-level processing and imaging hardware can be integrated in a single compact module where a dedicated architecture is implemented. This paper presents a Computer Vision Camera based on an open architecture implemented in an FPGA. The system is targeted to real-time computer vision tasks where low level processing and feature extraction tasks can be implemented in the FPGA device. The camera integrates a CMOS image sensor, an FPGA device, two memory banks, and an embedded PC for communication and control tasks. The FPGA device is a medium size one equivalent to 25,000 logic gates. The device is connected to two high speed memory banks, an IS interface, and an imager interface. The camera can be accessed for architecture programming, data transfer, and control through an Ethernet link from a remote computer. A hardware architecture can be defined in a Hardware Description Language (like VHDL), simulated and synthesized into digital structures that can be programmed into the FPGA and tested on the camera. The architecture of a classical multi-scale edge detection algorithm based on a Laplacian of Gaussian convolution has been developed to show the capabilities of the system.
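The multi-scale Laplacian-of-Gaussian step is straightforward to prototype in software before committing it to FPGA logic. A minimal sketch with SciPy (illustrative only; the paper implements the convolution in hardware):

    import numpy as np
    from scipy import ndimage

    def multiscale_log_edges(image, sigmas=(1.0, 2.0, 4.0)):
        # LoG filtering at several scales; edges are marked at the
        # zero-crossings of each filter response.
        edge_maps = []
        for sigma in sigmas:
            log = ndimage.gaussian_laplace(image.astype(float), sigma)
            # Sign changes between horizontal and vertical neighbours
            zc = ((np.signbit(log[:, :-1]) != np.signbit(log[:, 1:]))[:-1, :] |
                  (np.signbit(log[:-1, :]) != np.signbit(log[1:, :]))[:, :-1])
            edge_maps.append(zc)
        return edge_maps

    frame = np.random.rand(64, 64)        # stand-in for a sensor frame
    for sigma, edges in zip((1, 2, 4), multiscale_log_edges(frame)):
        print(sigma, int(edges.sum()), "edge pixels")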
Task-induced frequency modulation features for brain-computer interfacing.
Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz
2017-10-01
Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects' intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects' intents with an accuracy comparable to task-induced amplitude modulation. We compare cross-validated classification accuracies using the amplitude and frequency modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results with a motor imagery task, a cognitive task, and also preliminary results in patients with amyotrophic lateral sclerosis (ALS), as well as using common spatial patterns and Laplacian filtering. The frequency features alone do not significantly outperform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing in healthy subjects, the joint space significantly outperforms either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.
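Frequency- and amplitude-modulation features of the kind described can be derived from the analytic signal of a band-passed channel. A minimal sketch with SciPy (the band, sampling rate, and synthetic test signal are invented; this is not the authors' pipeline):

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def amp_freq_features(x, fs, band=(8.0, 13.0)):
        # Band-pass one channel, then use the analytic signal to obtain the
        # amplitude envelope and instantaneous frequency (a joint feature pair).
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], "bandpass")
        analytic = hilbert(filtfilt(b, a, x))
        amplitude = np.abs(analytic)
        inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
        return amplitude.mean(), inst_freq.mean()

    fs = 250.0
    t = np.arange(0, 2, 1 / fs)
    x = np.sin(2 * np.pi * 10.5 * t) + 0.1 * np.random.randn(t.size)
    print(amp_freq_features(x, fs))   # frequency feature should be near 10.5 Hz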
NASA Technical Reports Server (NTRS)
Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric
2004-01-01
Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes resources constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping in component transport, precision instrument placement, and construction tasks.
Programming distributed medical applications with XWCH2.
Ben Belgacem, Mohamed; Niinimaki, Marko; Abdennadher, Nabil
2010-01-01
Many medical applications utilise distributed/parallel computing in order to cope with the demands of large data or computing power requirements. In this paper, we present a new version of the XtremWeb-CH (XWCH) platform and demonstrate two medical applications that run on XWCH. The platform is versatile in that it supports direct communication between tasks. When tasks cannot communicate directly, warehouses are used as intermediary nodes between "producer" and "consumer" tasks. New features have been developed to provide improved support for writing powerful distributed applications using an easy API.
1980-01-01
INDUSTRY THRUSTS IN THE 70s — DRIVING FORCE: IMPROVE PRODUCT QUALITY; EASE MAINTENANCE, MODIFICATION; IMPROVE PERFORMANCE… put together a task force to make recommendations on what we should be doing about computer security. Other members of the task force came from both our… of the marketing task force mostly echoed and endorsed the user's report. Both reports were issued in March of 1973. Notice that DoD 5200.28 had just…
Cloud computing task scheduling strategy based on differential evolution and ant colony optimization
NASA Astrophysics Data System (ADS)
Ge, Junwei; Cai, Yu; Fang, Yiqiu
2018-05-01
This paper proposes a task scheduling strategy, DEACO, based on a combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). To address the limitation that cloud computing task scheduling typically optimizes a single objective, DEACO jointly considers the shortest task completion time, cost, and load balancing. DEACO uses the solution found by DE to initialize the pheromone of ACO, reducing the time ACO spends accumulating pheromone in its early iterations, and improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on CloudSim and compared with min-min and ACO. The experimental results show that DEACO is superior in terms of time, cost, and load.
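The DE-seeds-ACO idea can be sketched compactly: run a coarse differential evolution over task-to-VM assignments, then deposit extra pheromone along the best assignment before the ant colony starts. The sketch below optimizes makespan only and omits the ACO loop; the ETC matrix and parameters are invented.

    import numpy as np
    from scipy.optimize import differential_evolution

    ETC = np.array([[4, 7, 3], [2, 3, 6], [8, 5, 4], [3, 6, 2.0]])  # task x VM
    n_tasks, n_vms = ETC.shape

    def makespan(x):
        # Round the continuous DE vector to a task -> VM assignment
        assign = np.clip(np.round(x).astype(int), 0, n_vms - 1)
        loads = np.zeros(n_vms)
        for task, vm in enumerate(assign):
            loads[vm] += ETC[task, vm]
        return loads.max()

    # Phase 1: differential evolution finds a good coarse assignment.
    result = differential_evolution(makespan, [(0, n_vms - 1)] * n_tasks,
                                    seed=0, maxiter=50)
    best = np.clip(np.round(result.x).astype(int), 0, n_vms - 1)

    # Phase 2 (seeding): deposit extra pheromone along the DE solution so
    # the ant colony starts from an informed trail; the ACO loop is omitted.
    tau = np.ones((n_tasks, n_vms))
    tau[np.arange(n_tasks), best] += 1.0
    print(best, result.fun)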
Greene, Runyu L; Azari, David P; Hu, Yu Hen; Radwin, Robert G
2017-11-01
Patterns of physical stress exposure are often difficult to measure, and the metrics of variation and the techniques for identifying them are underdeveloped in the practice of occupational ergonomics. Computer vision has previously been used for evaluating repetitive motion tasks for hand activity level (HAL) utilizing conventional 2D videos. The approach was made practical by relaxing the need for high precision and by adopting a semi-automatic approach for measuring spatiotemporal characteristics of the repetitive task. In this paper, a new method for visualizing task factors, using this computer vision approach, is demonstrated. After videos are made, the analyst selects a region of interest on the hand to track, and the hand location and its associated kinematics are measured for every frame. The visualization method spatially deconstructs and displays the frequency, speed, and duty cycle components of tasks that are part of the threshold limit value for hand activity, for the purpose of identifying patterns of exposure associated with specific job factors, as well as for suggesting task improvements. The localized variables are plotted as a heat map superimposed over the video and displayed in the context of the task being performed. Based on the intensity of the specific variables used to calculate HAL, we can determine which task factors contribute most to HAL and readily identify those work elements in the task that contribute more to increased risk for an injury. Work simulations and actual industrial examples are described. This method should help practitioners more readily measure and interpret temporal exposure patterns and identify potential task improvements. Copyright © 2017. Published by Elsevier Ltd.
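The speed component of such a heat map reduces to accumulating per-frame hand speed onto a spatial grid. A minimal sketch, assuming tracked (x, y) hand positions per frame (the tracking itself is not shown; the positions below are synthetic):

    import numpy as np

    def speed_heatmap(xy, fps, frame_shape, bins=32):
        # xy: (n_frames, 2) tracked hand positions in pixels.
        # Returns a (bins, bins) grid of mean hand speed per cell.
        speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fps  # px/s
        h, w = frame_shape
        gx = np.clip((xy[:-1, 0] * bins / w).astype(int), 0, bins - 1)
        gy = np.clip((xy[:-1, 1] * bins / h).astype(int), 0, bins - 1)
        total, count = np.zeros((bins, bins)), np.zeros((bins, bins))
        np.add.at(total, (gy, gx), speed)
        np.add.at(count, (gy, gx), 1)
        return np.where(count > 0, total / np.maximum(count, 1), 0.0)

    xy = np.cumsum(np.random.randn(300, 2) * 3, axis=0) + 240
    print(speed_heatmap(xy, fps=30, frame_shape=(480, 640)).max())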
Engelman, Richard; Baker, Robert A; Likosky, Donald S; Grigore, Alina; Dickinson, Timothy A; Shore-Lesserson, Linda; Hammon, John W
2015-09-01
To improve our understanding of the evidence-based literature supporting temperature management during adult cardiopulmonary bypass, The Society of Thoracic Surgeons, the Society of Cardiovascular Anesthesiologists, and the American Society of ExtraCorporeal Technology tasked the authors to conduct a review of the peer-reviewed literature, including 1) the optimal site for temperature monitoring, 2) avoidance of hyperthermia, 3) peak cooling temperature gradient and cooling rate, and 4) peak warming temperature gradient and rewarming rate. The authors adopted the American College of Cardiology/American Heart Association method for the development of clinical practice guidelines and arrived at the following recommendation.
Global Design as the Integral Person Formation Strategy
ERIC Educational Resources Information Center
Stepanov, Alexander V.; Fedorov, Vladimir A.; Vorobyeva, Julia A.; Marakulina, Ulyana ?.; Ovchinnikov, Vladislav I.
2016-01-01
The relevance of the problem under study is based on society's need to educate an integral person who is able to solve ecumenical project tasks. Currently this problem (as a natural demand of society) is emerging in the educational system and social practices but has yet to obtain substantial scientific and theoretical justification. The…
Too Much Too Fast: The Dangers of Technological Momentum.
ERIC Educational Resources Information Center
Dyer, Dean
This paper discusses the dangers of technological momentum. Technological momentum is defined as the increase in the rate of the evolution of technology, its infusion into societal tasks and recreations, society's dependence on technology, and the impact of technology on society. Topics of discussion include changes in response to user needs,…
A Sketch of Politically Liberal Principles of Social Justice in Higher Education
ERIC Educational Resources Information Center
Bull, Barry L.
2012-01-01
In light of the importance and the potential danger of education during childhood for politically liberal societies, the author has devoted much of his professional career to thinking about and formulating the moral principles that should govern such a society's educational institutions. However, this task cannot be accomplished for all such…
Applied Computational Electromagnetics Society Journal, Volume 9, Number 1, March 1994
1994-03-01
2012-01-01
The report summarizes the scientific content of the annual symposium organized by the Student Council of the International Society for Computational Biology (ISCB) held in conjunction with the Intelligent Systems for Molecular Biology (ISMB) conference in Long Beach, California on July 13, 2012.
Segment Fixed Priority Scheduling for Self Suspending Real Time Tasks
2016-08-11
Segment-Fixed Priority Scheduling for Self-Suspending Real-Time Tasks. Junsung Kim, Department of Electrical and Computer Engineering, Carnegie… 2.1 Application of a Multi-Segment Self-Suspending Real-Time Task Model… 3 Fixed Priority Scheduling… Figure 2: A multi-segment self-suspending real-time task model.
Utility functions and resource management in an oversubscribed heterogeneous computing environment
Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; ...
2014-09-26
We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.
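The drop-low-utility heuristic can be illustrated with a time-varying utility function. A minimal sketch, assuming a linear decay between a soft and a hard deadline (the paper's utility shapes and thresholds are not specified here):

    def linear_decay_utility(u_max, soft, hard):
        # Utility earned if the task completes at time t: full value until
        # the soft deadline, linear decay to zero at the hard deadline.
        def u(t):
            if t <= soft:
                return u_max
            if t >= hard:
                return 0.0
            return u_max * (hard - t) / (hard - soft)
        return u

    def drop_low_utility(queue, now, threshold=0.1):
        # Discard tasks whose best achievable remaining utility is below a
        # threshold, freeing machines for higher-earning work.
        return [(u, eta) for (u, eta) in queue if u(now + eta) >= threshold]

    u1 = linear_decay_utility(1.0, soft=10, hard=20)   # hypothetical tasks
    u2 = linear_decay_utility(0.5, soft=5, hard=8)
    queue = [(u1, 4.0), (u2, 6.0)]   # (utility fn, estimated time to finish)
    print(len(drop_low_utility(queue, now=3.0)))   # second task dropped -> 1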
On-the-fly scheduling as a manifestation of partial-order planning and dynamic task values.
Hannah, Samuel D; Neal, Andrew
2014-09-01
The aim of this study was to develop a computational account of the spontaneous task ordering that occurs within jobs as work unfolds ("on-the-fly task scheduling"). Air traffic control is an example of work in which operators have to schedule their tasks as a partially predictable work flow emerges. To date, little attention has been paid to such on-the-fly scheduling situations. We present a series of discrete-event models fit to conflict resolution decision data collected from experienced controllers operating in a high-fidelity simulation. Our simulations reveal air traffic controllers' scheduling decisions as examples of the partial-order planning approach of Hayes-Roth and Hayes-Roth. The most successful model uses opportunistic first-come-first-served scheduling to select tasks from a queue. Tasks with short deadlines are executed immediately. Tasks with long deadlines are evaluated to assess whether they need to be executed immediately or deferred. On-the-fly task scheduling is computationally tractable despite its surface complexity and understandable as an example of both the partial-order planning strategy and the dynamic-value approach to prioritization.
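The described policy can be sketched as a simple queue discipline: first-come-first-served, with an immediate-versus-defer decision based on deadline proximity. The urgency window and task tuples below are invented for illustration.

    import heapq

    def schedule(tasks, now=0.0, urgent_window=60.0):
        # Opportunistic first-come-first-served with deferral.
        # tasks: (arrival_order, deadline) pairs, in arrival order.
        deferred, executed = [], []
        for order, deadline in tasks:
            if deadline - now <= urgent_window:
                executed.append(order)               # act on it immediately
            else:
                heapq.heappush(deferred, (deadline, order))  # revisit later
        # Deferred tasks are re-evaluated as their deadlines approach
        while deferred:
            executed.append(heapq.heappop(deferred)[1])
        return executed

    print(schedule([(1, 30.0), (2, 300.0), (3, 45.0)]))   # [1, 3, 2]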
Knowledge acquisition and interface design for learning on demand systems
NASA Technical Reports Server (NTRS)
Nelson, Wayne A.
1993-01-01
The rapid changes in our world precipitated by technology have created new problems and new challenges for education and training. A knowledge 'explosion' is occurring as our society moves toward a service oriented economy that relies on information as the major resource. Complex computer systems are beginning to dominate the workplace, causing alarming growth and change in many fields. The rapidly changing nature of the workplace, especially in fields related to information technology, requires that our knowledge be updated constantly. This characteristic of modern society poses seemingly unsolvable instructional problems involving coverage and obsolescence. The sheer amount of information to be learned is rapidly increasing, while at the same time some information becomes obsolete in light of new information. Education, therefore, must become a lifelong process that features learning of new material and skills as needed in relation to the job to be done. Because of the problems cited above, the current model of learning in advance may no longer be feasible in our high-technology world. In many cases, learning in advance is impossible because there are simply too many things to learn. In addition, learning in advance can be time consuming, and often results in decontextualized knowledge that does not readily transfer to the work environment. The large and growing discrepancy between the amount of potentially relevant knowledge available and the amount a person can know and remember makes learning on demand an important alternative to current instructional practices. Learning on demand takes place whenever an individual must learn something new in order to perform a task or make a decision. Learning on demand is a promising approach for addressing the problems of coverage and obsolescence because learning is contextualized and integrated into the task environment rather than being relegated to a separate phase that precedes work. Learning on demand allows learners to see for themselves the usefulness of new knowledge for actual problem situations, thereby increasing the motivation for learning new information. Finally, learning on demand makes new information relevant to the task at hand, leading to more informed decision making, better quality products, and improved performance.
Halpern, Scott D; Becker, Deborah; Curtis, J Randall; Fowler, Robert; Hyzy, Robert; Kaplan, Lewis J; Rawat, Nishi; Sessler, Curtis N; Wunsch, Hannah; Kahn, Jeremy M
2014-10-01
The high costs of health care in the United States and other developed nations are attributable, in part, to overuse of tests, treatments, and procedures that provide little to no benefit for patients. To improve the quality of care while also combating this problem of cost, the American Board of Internal Medicine Foundation developed the Choosing Wisely Campaign, tasking professional societies to develop lists of the top five medical services that patients and physicians should question. To present the Critical Care Societies Collaborative's Top 5 list in Critical Care Medicine and describe its development. Each professional society in the Collaborative nominated members to the Choosing Wisely task force, which established explicit criteria for evaluating candidate items, generated lists of items, performed literature reviews on each, and sought external input from content experts. Task force members narrowed the list to the Top 5 items using a standardized scoring system based on each item's likely overall impact and merits on the five explicit criteria. From an initial list of 58 unique recommendations, the task force proposed a Top 5 list that was ultimately endorsed by each Society within the Collaborative. The five recommendations are: (1) do not order diagnostic tests at regular intervals (such as every day), but rather in response to specific clinical questions; (2) do not transfuse red blood cells in hemodynamically stable, nonbleeding ICU patients with an Hb concentration greater than 7 g/dl; (3) do not use parenteral nutrition in adequately nourished critically ill patients within the first 7 days of an ICU stay; (4) do not deeply sedate mechanically ventilated patients without a specific indication and without daily attempts to lighten sedation; and (5) do not continue life support for patients at high risk for death or severely impaired functional recovery without offering patients and their families the alternative of care focused entirely on comfort. These five recommendations provide a starting point for clinicians and patients to make decisions leading to higher-quality, lower-cost care. Future work is needed to promote adherence to these recommendations and to develop additional ways for intensive care clinicians to take leadership in reining in health-care costs.
Cognitive decline and slower reaction time in elderly individuals with mild cognitive impairment.
Chen, Ko-Chia; Weng, Chia-Ying; Hsiao, Sigmund; Tsao, Wen-Long; Koo, Malcolm
2017-11-01
The relationship between declining performance, as measured by changes in reaction time, and declining cognitive function has not been critically studied. The aim of the present study was to investigate the association between reaction time during a task and cognitive ability in elderly Taiwanese individuals. Patients aged 65 years or older with mild cognitive impairment (MCI) (n = 33) and Alzheimer's disease (n = 26) were recruited from the neurology clinic of a regional hospital in southern Taiwan. In addition, 28 healthy controls aged 65 years or older were recruited from the community. The cognitive performance of the study participants was assessed using the Cognitive Abilities Screening Instrument (CASI). A computer-administered simple reaction time (SRT) task and a flanker reaction time (FRT) task were administered to assess participants' cognitive function. A non-parametric Kruskal-Wallis test was performed to compare CASI scores, SRT, and FRT among the three groups. ANOVA was also used to compare CASI scores, inverse-transformed SRT, and inverse-transformed FRT among the three groups, with adjustment for age and years of education. Additionally, Pearson's partial correlation coefficients were used to assess the association of CASI scores with inverse-transformed SRT and inverse-transformed FRT within each of the three groups. Significant differences in CASI scores, SRT, and FRT were found between the Alzheimer's disease group and the other two groups, with or without adjustment for age and education. The reaction time of patients with Alzheimer's disease was significantly slower than that of the other two groups. Moreover, a significant correlation between CASI and FRT was found in patients with MCI. Altered performance in a speed task was observed in patients with MCI. The FRT task should be explored further for its role as a marker for cognitive decline in elderly individuals, particularly in those with MCI. © 2017 Japanese Psychogeriatric Society.
Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H
Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with a goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics were introduced in our clinical workflow. We compared the delinquency rates for 4 selected metrics before implementation of QMAP with the corresponding rates in 2016. A one-tailed Student t test was used for statistical analysis. With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completion of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rushed completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
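The auditing mechanism described above (capture completion timestamps for critical tasks, then alert a triage team while there is still time to act) can be sketched in a few lines. The sketch below is a minimal illustration with hypothetical task names, time windows, and alert thresholds; it is not the authors' implementation.

```python
from datetime import datetime, timedelta

# Hypothetical due-time windows for a few critical clinical tasks (illustrative).
TASK_WINDOWS = {
    "treatment_plan_completion": timedelta(days=3),
    "weekly_physics_check": timedelta(days=7),
    "on_treatment_physician_visit": timedelta(days=7),
}

def audit(open_tasks, now, warn_fraction=0.8):
    """Flag tasks whose elapsed time exceeds warn_fraction of their window.

    open_tasks: list of dicts with 'patient', 'task', and 'started' keys.
    Returns alerts for a triage team before a task becomes delinquent.
    """
    alerts = []
    for t in open_tasks:
        window = TASK_WINDOWS[t["task"]]
        elapsed = now - t["started"]
        if elapsed >= window:
            alerts.append((t["patient"], t["task"], "DELINQUENT"))
        elif elapsed >= warn_fraction * window:
            alerts.append((t["patient"], t["task"], "AT RISK"))
    return alerts

now = datetime(2016, 6, 1, 9, 0)
tasks = [
    {"patient": "A", "task": "weekly_physics_check", "started": now - timedelta(days=6)},
    {"patient": "B", "task": "treatment_plan_completion", "started": now - timedelta(days=4)},
]
for alert in audit(tasks, now):
    print(alert)  # ('A', 'weekly_physics_check', 'AT RISK') and a DELINQUENT entry for B
```

Flagging at a fraction of the window, rather than only at expiry, is what gives a triage team the window "to mitigate the potential for late completion."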
Joch, Michael; Hegele, Mathias; Maurer, Heiko; Müller, Hermann; Maurer, Lisa Katharina
2017-07-01
The error (related) negativity (Ne/ERN) is an event-related potential in the electroencephalogram (EEG) that correlates with error processing. Because it appears before terminal external error information is available, the Ne/ERN is thought to reflect predictive processes in the evaluation of errors. The aim of the present study was to examine the Ne/ERN in a complex motor task and, in particular, to rule out explanations of the Ne/ERN other than error prediction processes. To this end, we focused on the dependency of the Ne/ERN on visual monitoring of the action outcome after movement termination but before result feedback (action effect monitoring). Participants performed a semi-virtual throwing task, using a manipulandum to throw a virtual ball displayed on a computer screen at a target object. Visual feedback of the ball flying to the target was masked to prevent action effect monitoring. Participants received static feedback about the action outcome (850 ms) after each trial. We found a significant negative deflection in the average EEG curves of the error trials peaking at ~250 ms after ball release, i.e., before error feedback. Furthermore, this Ne/ERN signal did not depend on visual ball-flight monitoring after release. We conclude that the Ne/ERN has the potential to indicate error prediction in motor tasks and that it exists even in the absence of action effect monitoring. NEW & NOTEWORTHY In this study, we separate different kinds of possible contributors to an electroencephalogram (EEG) error correlate (Ne/ERN) in a throwing task. We tested the influence of action effect monitoring on the Ne/ERN amplitude in the EEG. We used a task that allows us to restrict movement correction and action effect monitoring and to control the onset of result feedback. We ascribe the Ne/ERN to predictive error processing in which a conscious feeling of failure is not a prerequisite. Copyright © 2017 the American Physiological Society.
ERIC Educational Resources Information Center
Brown, John Seely; Goldstein, Ira
A revolution that will transform learning in our society, altering both the methods and the content of education, has been made possible by harnessing tomorrow's powerful computer technology to serve as intelligent instructional systems. The unique quality of the computer that makes a revolution possible is that it can serve not only as a…
Computer-generated reminders and quality of pediatric HIV care in a resource-limited setting.
Were, Martin C; Nyandiko, Winstone M; Huang, Kristin T L; Slaven, James E; Shen, Changyu; Tierney, William M; Vreeman, Rachel C
2013-03-01
To evaluate the impact of clinician-targeted computer-generated reminders on compliance with HIV care guidelines in a resource-limited setting. We conducted this randomized, controlled trial in an HIV referral clinic in Kenya caring for HIV-infected and HIV-exposed children (<14 years of age). For children randomly assigned to the intervention group, printed patient summaries containing computer-generated patient-specific reminders for overdue care recommendations were provided to the clinician at the time of the child's clinic visit. For children in the control group, clinicians received the summaries, but no computer-generated reminders. We compared differences between the intervention and control groups in completion of overdue tasks, including HIV testing, laboratory monitoring, initiating antiretroviral therapy, and making referrals. During the 5-month study period, 1611 patients (49% female, 70% HIV-infected) were eligible to receive at least 1 computer-generated reminder (i.e., had an overdue clinical task). We observed a fourfold increase in the completion of overdue clinical tasks when reminders were made available to providers over the course of the study (68% intervention vs 18% control, P < .001). Orders also occurred earlier for the intervention group (77 days, SD 2.4 days) compared with the control group (104 days, SD 1.2 days) (P < .001). Response rates to reminders varied significantly by type of reminder and between clinicians. Clinician-targeted, computer-generated clinical reminders are associated with a significant increase in completion of overdue clinical tasks for HIV-infected and HIV-exposed children in a resource-limited setting.
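A computer-generated reminder of the kind evaluated here is, at its core, a set of guideline rules applied to a patient record at visit time. The following is a minimal sketch under assumed, simplified rules; the study's actual rule base, field names, and thresholds are not specified in this abstract.

```python
from datetime import date, timedelta

def overdue_reminders(record, today):
    """Apply illustrative guideline rules (hypothetical, not the study's
    rule base) to a child's record and return overdue-care reminders."""
    reminders = []
    if record["hiv_status"] == "exposed" and record.get("last_hiv_test") is None:
        reminders.append("Order HIV test: no test on file for HIV-exposed child")
    last_cd4 = record.get("last_cd4_date")
    if record["hiv_status"] == "infected" and (
        last_cd4 is None or today - last_cd4 > timedelta(days=180)
    ):
        reminders.append("Order CD4 count: last result older than 6 months")
    return reminders

record = {"hiv_status": "infected", "last_cd4_date": date(2012, 5, 1)}
for line in overdue_reminders(record, today=date(2013, 3, 1)):
    print("REMINDER:", line)  # printed on the visit summary for the clinician
```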
A computational framework for the study of confidence in humans and animals
Kepecs, Adam; Mainen, Zachary F.
2012-01-01
Confidence judgements, self-assessments about the quality of a subject's knowledge, are considered a central example of metacognition. Prima facie, introspection and self-report appear to be the only way to access the subjective sense of confidence or uncertainty. Contrary to this notion, overt behavioural measures can be used to study confidence judgements by animals trained in decision-making tasks with perceptual or mnemonic uncertainty. Here, we suggest that a computational approach can clarify the issues involved in interpreting these tasks and provide a much needed springboard for advancing the scientific understanding of confidence. We first review relevant theories of probabilistic inference and decision-making. We then critically discuss behavioural tasks employed to measure confidence in animals and show how quantitative models can help to constrain the computational strategies underlying confidence-reporting behaviours. In our view, post-decision wagering tasks with continuous measures of confidence appear to offer the best available metrics of confidence. Since behavioural reports alone provide a limited window into mechanism, we argue that progress calls for measuring the neural representations and identifying the computations underlying confidence reports. We present a case study using such a computational approach to study the neural correlates of decision confidence in rats. This work shows that confidence assessments may be considered higher order, but can be generated using elementary neural computations that are available to a wide range of species. Finally, we discuss the relationship of confidence judgements to the wider behavioural uses of confidence and uncertainty. PMID:22492750
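One standard computational treatment in this literature models confidence as the posterior probability that a choice is correct given noisy evidence. The sketch below simulates that account for a two-alternative task; it is an illustrative model of this class, not necessarily the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def trial(m=0.5, sigma=1.0):
    """One two-alternative trial under a signal-detection account of
    confidence. The true category mean is +m or -m with equal probability;
    the observer sees the mean plus Gaussian noise and reports, as its
    confidence, the posterior probability that its choice is correct."""
    category = rng.choice([-1, 1])
    percept = category * m + rng.normal(0.0, sigma)
    # Posterior P(category = +1 | percept) for equal priors:
    p_positive = 1.0 / (1.0 + np.exp(-2.0 * m * percept / sigma**2))
    choice = 1 if p_positive > 0.5 else -1
    confidence = max(p_positive, 1.0 - p_positive)
    return choice == category, confidence

# Signature prediction of this model class: accuracy rises with confidence.
results = [trial() for _ in range(20000)]
for lo, hi in [(0.5, 0.7), (0.7, 0.9), (0.9, 1.0)]:
    bucket = [ok for ok, c in results if lo <= c < hi]
    print(f"confidence in [{lo},{hi}): accuracy = {np.mean(bucket):.2f}")
```

Binning trials by reported confidence reproduces the calibration pattern that such models predict and that behavioural confidence reports can be tested against.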
Kappel, David; Legenstein, Robert; Habenschuss, Stefan; Hsieh, Michael; Maass, Wolfgang
2018-01-01
Synaptic connections between neurons in the brain are dynamic because of continuously ongoing spine dynamics, axonal sprouting, and other processes. In fact, it was recently shown that the spontaneous synapse-autonomous component of spine dynamics is at least as large as the component that depends on the history of pre- and postsynaptic neural activity. These data are inconsistent with common models for network plasticity and raise the following questions: how can neural circuits maintain a stable computational function in spite of these continuously ongoing processes, and what could be functional uses of these ongoing processes? Here, we present a rigorous theoretical framework for these seemingly stochastic spine dynamics and rewiring processes in the context of reward-based learning tasks. We show that spontaneous synapse-autonomous processes, in combination with reward signals such as dopamine, can explain the capability of networks of neurons in the brain to configure themselves for specific computational tasks, and to compensate automatically for later changes in the network or task. Furthermore, we show theoretically and through computer simulations that stable computational performance is compatible with continuously ongoing synapse-autonomous changes. After good computational performance has been reached, these processes cause primarily a slow drift of network architecture and dynamics in task-irrelevant dimensions, as observed for neural activity in motor cortex and other areas. On the more abstract level of reinforcement learning, the resulting model gives rise to an understanding of reward-driven network plasticity as continuous sampling of network configurations.
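The paper's central claim, stable function despite continuous stochastic parameter change, can be illustrated with a toy reward landscape in which only one parameter matters. Noisy reward-driven updates hold the reward near its optimum while the task-irrelevant parameter drifts freely. This is a deliberately minimal caricature, not the paper's synaptic sampling model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy illustration: reward depends only on w[0], so w[1] is a
# task-irrelevant dimension of the parameter space.
def reward(w):
    return -(w[0] - 2.0) ** 2  # maximized at w[0] = 2, independent of w[1]

def grad_reward(w):
    return np.array([-2.0 * (w[0] - 2.0), 0.0])

w = np.array([0.0, 0.0])
eta, noise = 0.05, 0.15  # learning rate and synapse-autonomous noise level
for step in range(5001):
    # Reward-driven update plus continuously ongoing stochastic change:
    w = w + eta * grad_reward(w) + noise * np.sqrt(eta) * rng.normal(size=2)
    if step % 1000 == 0:
        print(f"step {step:5d}  reward {reward(w):7.3f}  w = {w.round(2)}")
# Reward stays near its optimum while w[1] performs a random walk:
# stable function despite continuously ongoing parameter change.
```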
Allen, Michael Todd; Jameson, Molly M; Myers, Catherine E
2017-01-01
Personality factors such as behavioral inhibition (BI), a temperamental tendency toward avoidance in the face of unfamiliar situations, have been identified as risk factors for anxiety disorders. Personality factors are generally identified through self-report inventories. However, this tendency to avoid may affect the accuracy of these self-report inventories. Previously, a computer-based task was developed in which the participant guides an on-screen "avatar" through a series of on-screen events; performance on the task could accurately predict participants' BI, measured by a standard paper-and-pencil questionnaire (Adult Measure of Behavioral Inhibition, or AMBI). Here, we sought to replicate this finding as well as compare performance on the avatar task to another measure related to BI, the harm avoidance (HA) scale of the Tridimensional Personality Questionnaire (TPQ). The TPQ includes HA scales as well as scales assessing reward dependence (RD), novelty seeking (NS), and persistence. One hundred and one undergraduates voluntarily completed the avatar task and the paper-and-pencil inventories in a counterbalanced order. Scores on the avatar task were strongly correlated with BI assessed via the AMBI questionnaire, which replicates prior findings. Females exhibited higher HA scores than males but did not differ in scores on the avatar task. There was a strong positive relationship between scores on the avatar task and HA scores. One aspect of HA, fear of uncertainty, was found to moderately mediate the relationship between AMBI scores and avatar scores. NS had a strong negative relationship with scores on the avatar task, but there was no significant relationship between RD and scores on the avatar task. These findings indicate the effectiveness of the avatar task as a behavioral alternative to self-report measures for assessing avoidance. In addition, computer-based behavioral tasks are a viable alternative to paper-and-pencil self-report inventories, particularly when assessing anxiety and avoidance.
ERIC Educational Resources Information Center
Wood, Milton E.
The purpose of the effort was to determine the benefits to be derived from the adaptive training technique of automatically adjusting task difficulty as a function of student skill during early learning of a complex perceptual motor task. A digital computer provided the task dynamics, scoring, and adaptive control of a second-order, two-axis,…
Why Don't All Professors Use Computers?
ERIC Educational Resources Information Center
Drew, David Eli
1989-01-01
Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…
Asymptotically Optimal Motion Planning for Learned Tasks Using Time-Dependent Cost Maps
Bowen, Chris; Ye, Gu; Alterovitz, Ron
2015-01-01
In unstructured environments in people’s homes and workspaces, robots executing a task may need to avoid obstacles while satisfying task motion constraints, e.g., keeping a plate of food level to avoid spills or properly orienting a finger to push a button. We introduce a sampling-based method for computing motion plans that are collision-free and minimize a cost metric that encodes task motion constraints. Our time-dependent cost metric, learned from a set of demonstrations, encodes features of a task’s motion that are consistent across the demonstrations and, hence, are likely required to successfully execute the task. Our sampling-based motion planner uses the learned cost metric to compute plans that simultaneously avoid obstacles and satisfy task constraints. The motion planner is asymptotically optimal and minimizes the Mahalanobis distance between the planned trajectory and the distribution of demonstrations in a feature space parameterized by the locations of task-relevant objects. The motion planner also leverages the distribution of the demonstrations to significantly reduce plan computation time. We demonstrate the method’s effectiveness and speed using a small humanoid robot performing tasks requiring both obstacle avoidance and satisfaction of learned task constraints. Note to Practitioners Motivated by the desire to enable robots to autonomously operate in cluttered home and workplace environments, this paper presents an approach for intuitively training a robot in a manner that enables it to repeat the task in novel scenarios and in the presence of unforeseen obstacles in the environment. Based on user-provided demonstrations of the task, our method learns features of the task that are consistent across the demonstrations and that we expect should be repeated by the robot when performing the task. We next present an efficient algorithm for planning robot motions to perform the task based on the learned features while avoiding obstacles. We demonstrate the effectiveness of our motion planner for scenarios requiring transferring a powder and pushing a button in environments with obstacles, and we plan to extend our results to more complex tasks in the future. PMID:26279642
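The cost metric at the heart of the method is a Mahalanobis distance between a candidate trajectory's features and the distribution of demonstrated features. A minimal sketch follows, with a hypothetical two-dimensional feature space; in the paper the features are time-dependent and parameterized by the locations of task-relevant objects, which this sketch collapses to a single time slice.

```python
import numpy as np

def mahalanobis_cost(features, demo_features, eps=1e-6):
    """Cost of a candidate trajectory's feature vector under the
    distribution of demonstrations (a sketch of the paper's idea; the
    feature parameterization here is hypothetical).

    features:      (d,) feature vector of the candidate trajectory,
                   e.g. plate tilt and height relative to a task object.
    demo_features: (n, d) features of the n demonstrations.
    """
    mu = demo_features.mean(axis=0)
    cov = np.cov(demo_features, rowvar=False) + eps * np.eye(demo_features.shape[1])
    diff = features - mu
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

# Four demonstrations: near-zero tilt, height close to 1.0.
demos = np.array([[0.02, 0.98], [0.01, 1.01], [-0.01, 0.99], [0.0, 1.02]])
print(mahalanobis_cost(np.array([0.0, 1.0]), demos))  # consistent with demos: low cost
print(mahalanobis_cost(np.array([0.3, 0.6]), demos))  # tilted, low plate: high cost
```

A sampling-based planner can then prefer collision-free trajectories whose accumulated cost of this form is smallest, which is how obstacle avoidance and learned task constraints are satisfied simultaneously.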
ERIC Educational Resources Information Center
Hedayati, Mohsen; Foomani, Elham Mohammadi
2015-01-01
The study reported here explores whether English as a foreign Language (EFL) learners' preferred ways of learning (i.e., learning styles) affect their task performance in computer-mediated communication (CMC). As Ellis (2010) points out, while the increasing use of different sorts of technology is witnessed in language learning contexts, it is…
ERIC Educational Resources Information Center
Ramsey, Gregory W.
2010-01-01
This dissertation proposes and tests a theory explaining how people make decisions to achieve a goal in a specific task environment. The theory is represented as a computational model and implemented as a computer program. The task studied was primary care physicians treating patients with type 2 diabetes. Some physicians succeed in achieving…
ERIC Educational Resources Information Center
Pierson, Susan Jacques
2015-01-01
One way to provide high quality instruction for underserved English Language Learners around the world is to combine Task-Based English Language Learning with Computer- Assisted Instruction. As part of an ongoing project, "Bridges to Swaziland," these approaches have been implemented in a determined effort to improve the ESL program for…
ERIC Educational Resources Information Center
Dagnino, Francesca Maria; Ballauri, Margherita; Benigno, Vincenza; Caponetto, Ilaria; Pesenti, Elia
2013-01-01
This paper presents the results of preliminary research on the assessment of reasoning abilities in primary school poor achievers vs. normal achievers using computer game tasks. Subjects were evaluated by means of cognitive assessment on logical abilities and academic skills. The aim of this study is to better understand the relationship between…
Quantum robots plus environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benioff, P.
1998-07-23
A quantum robot is a mobile quantum system, including an on board quantum computer and needed ancillary systems, that interacts with an environment of quantum systems. Quantum robots carry out tasks whose goals include making specified changes in the state of the environment or carrying out measurements on the environment. The environments considered so far, oracles, data bases, and quantum registers, are seen to be special cases of environments considered here. It is also seen that a quantum robot should include a quantum computer and cannot be simply a multistate head. A model of quantum robots and their interactions is discussed in which each task, as a sequence of alternating computation and action phases, is described by a unitary single time step operator T ≈ T_a + T_c (discrete space and time are assumed). The overall system dynamics is described as a sum over paths of completed computation (T_c) and action (T_a) phases. A simple example of a task, measuring the distance between the quantum robot and a particle on a 1D lattice with quantum phase path dispersion present, is analyzed. A decision diagram for the task is presented and analyzed.
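The "sum over paths of completed computation and action phases" is, mathematically, the expansion of the n-step evolution operator (T_a + T_c)^n into all 2^n ordered products of phase operators. The numerical check below uses random matrices as stand-ins for the two phase operators; the actual model's operators are of course not random.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins for the action and computation phase operators (random here;
# in the model they are the two parts of the single-step operator T).
dim, n_steps = 4, 3
T_a = rng.normal(size=(dim, dim))
T_c = rng.normal(size=(dim, dim))
T = T_a + T_c

# n-step evolution from direct powers of T ...
direct = np.linalg.matrix_power(T, n_steps)

# ... equals the sum over all 2^n "paths" of phase choices.
path_sum = np.zeros((dim, dim))
for path in itertools.product([T_a, T_c], repeat=n_steps):
    term = np.eye(dim)
    for phase in path:
        term = phase @ term  # apply each phase of the path in sequence
    path_sum += term

print(np.allclose(direct, path_sum))  # True: dynamics as a sum over paths
```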
Grid workflow job execution service 'Pilot'
NASA Astrophysics Data System (ADS)
Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav
2011-12-01
'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
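The service's core behavior, submitting each task of a directed acyclic graph once its prerequisites finish, can be sketched as follows. The `submit` callback stands in for the WS-GRAM submission and REST calls the real service performs, and the task names are hypothetical.

```python
from collections import deque

def run_workflow(tasks, deps, submit):
    """Dispatch a DAG job in topological order (a sketch of the idea;
    'submit' stands in for submission to a grid computing element).

    tasks: iterable of task names.
    deps:  dict mapping a task to the set of its prerequisite tasks.
    """
    remaining = {t: set(deps.get(t, ())) for t in tasks}
    ready = deque(t for t, d in remaining.items() if not d)
    done = set()
    while ready:
        task = ready.popleft()
        submit(task)                      # e.g. POST to the service's REST API
        done.add(task)
        for t, d in remaining.items():    # unlock tasks whose deps are now met
            if task in d:
                d.discard(task)
                if not d and t not in done and t not in ready:
                    ready.append(t)
    if len(done) != len(remaining):
        raise RuntimeError("cycle or unsatisfiable dependency in job graph")

deps = {"merge": {"sim_a", "sim_b"}, "publish": {"merge"}}
run_workflow(["sim_a", "sim_b", "merge", "publish"], deps, submit=print)
# sim_a, sim_b, merge, publish
```

Matching each ready task to a concrete computing element, as the Pilot service does against task and job requirements, would slot in where `submit` is called.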
Rosenthal, L E
1986-10-01
Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.
Shrager, Jeff; Billman, Dorrit; Convertino, Gregorio; Massar, J P; Pirolli, Peter
2010-01-01
Science is a form of distributed analysis involving both individual work that produces new knowledge and collaborative work to exchange information with the larger community. There are many particular ways in which individual and community can interact in science, and it is difficult to assess how efficient these are, and what the best way might be to support them. This paper reports on a series of experiments in this area and a prototype implementation using a research platform called CACHE. CACHE both supports experimentation with different structures of interaction between individual and community cognition and serves as a prototype for computational support for those structures. We particularly focus on CACHE-BC, the Bayes community version of CACHE, within which the community can break up analytical tasks into "mind-sized" units and use provenance tracking to keep track of the relationship between these units. Copyright © 2009 Cognitive Science Society, Inc.
Accomplishments and challenges of surgical simulation.
Satava, R M
2001-03-01
For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.
Cultural differences in room size perception
Bülthoff, Heinrich H.; de la Rosa, Stephan; Dodds, Trevor J.
2017-01-01
Cultural differences in spatial perception have been little investigated, which gives rise to the impression that spatial cognitive processes might be universal. Contrary to this idea, we demonstrate cultural differences in spatial volume perception of computer-generated rooms between Germans and South Koreans. We used a psychophysical task in which participants had to judge whether a rectangular room was larger or smaller than a square room of reference. We systematically varied the room rectangularity (depth-to-width aspect ratio) and the viewpoint (middle of the short wall vs. long wall) from which the room was viewed. South Koreans were significantly less biased by room rectangularity and viewpoint than their German counterparts. These results are in line with previous notions of general cognitive processing strategies being more context-dependent in East Asian societies than in Western ones. We point to the necessity of considering culturally specific cognitive processing strategies in visual spatial cognition research. PMID:28426729
Hattori, Masasi; Oaksford, Mike
2007-09-10
In this article, 41 models of covariation detection from 2 × 2 contingency tables were evaluated against past data in the literature and against data from new experiments. A new model was also included based on a limiting case of the normative phi-coefficient under an extreme rarity assumption, which has been shown to be an important factor in covariation detection (McKenzie & Mikkelsen, 2007) and data selection (Hattori, 2002; Oaksford & Chater, 1994, 2003). The results were supportive of the new model. To investigate its explanatory adequacy, a rational analysis using two computer simulations was conducted. These simulations revealed the environmental conditions and the memory restrictions under which the new model best approximates the normative model of covariation detection in these tasks. They thus demonstrated the adaptive rationality of the new model. 2007 Cognitive Science Society, Inc.
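For a 2 × 2 contingency table with cells a, b, c, d, the phi coefficient and its extreme-rarity limit (let d, the cause-absent/effect-absent cell, grow large) can be computed directly. The limiting form a/√((a+b)(a+c)) follows algebraically from the phi formula; that it matches the article's proposed model exactly is an assumption here.

```python
from math import sqrt

def phi(a, b, c, d):
    """Normative phi coefficient for the 2x2 contingency table
    [[a, b], [c, d]] (cause present/absent x effect present/absent)."""
    return (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))

def phi_rarity_limit(a, b, c):
    """Limit of phi as d -> infinity (extreme rarity of cause and effect):
    phi -> a / sqrt((a + b) * (a + c)). Algebraic limit; assumed here to
    correspond to the article's proposed model."""
    return a / sqrt((a + b) * (a + c))

a, b, c = 12, 3, 4
for d in (10, 1000, 100000):
    print(d, round(phi(a, b, c, d), 4))      # converges as d grows ...
print("limit", round(phi_rarity_limit(a, b, c), 4))  # ... to the rarity limit
```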
Civil mini-RPA's for the 1980's: Avionics design considerations. [remotely piloted vehicles
NASA Technical Reports Server (NTRS)
Karmarkar, J. S.
1975-01-01
A number of remote sensing or surveillance tasks (e.g., fire fighting, crop monitoring) in the civilian sector of our society may be performed in a cost-effective manner by use of small remotely piloted aircraft (RPA). This study was conducted to determine equipment (and the associated technology) that is available and that could be applied to the mini-RPA, and to examine the potential applications of the mini-RPA with special emphasis on the wild fire surveillance mission. The operational considerations of using the mini-RPA as affected by government regulatory agencies were investigated. These led to equipment requirements (e.g., infra-red sensors) over and above those for the performance of the mission. A computer technology survey and forecast was performed. Key subsystems were identified, and a distributed microcomputer configuration that was functionally modular was recommended. Areas for further NASA research and development activity were also identified.
Troiano, Luigi; Birtolo, Cosimo; Armenise, Roberto
2016-01-01
In many circumstances, concepts, ideas, and emotions are mainly conveyed by colors. Color vision disorders can heavily limit the user experience in accessing the Information Society. Therefore, color vision impairments should be taken into account in order to make information and services accessible to a broader audience. The task is not easy for designers, who generally are not affected by any color vision disorder. In any case, the design of accessible user interfaces should not lead to boring color schemes. The selection of appealing and harmonic color combinations should be preserved. In past research, we investigated a generative approach based on evolutionary computing for supporting interface designers in making colors accessible to impaired users. This approach has also been followed by other authors. The contribution of this paper is to provide an experimental validation of the claim that this approach is actually beneficial to designers and users.
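One concrete way a fitness function for such an evolutionary search can score palettes is the WCAG contrast ratio, taking the worst pairwise contrast of a candidate color scheme as the quantity to maximize. The sketch below uses the standard WCAG 2.x formulas; the fitness design itself is illustrative, not the one used in the paper.

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color (0-255 channels)."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(c1, c2):
    """WCAG contrast ratio, from 1:1 (identical) to 21:1 (black vs. white)."""
    l1, l2 = sorted((relative_luminance(c1), relative_luminance(c2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def fitness(palette):
    """Worst pairwise contrast in a palette: one plausible objective an
    evolutionary search over color schemes could maximize (illustrative)."""
    pairs = [(palette[i], palette[j])
             for i in range(len(palette)) for j in range(i + 1, len(palette))]
    return min(contrast_ratio(p, q) for p, q in pairs)

print(fitness([(255, 255, 255), (0, 0, 0), (200, 30, 30)]))
```

An evolutionary loop would then mutate and recombine palettes, keeping the aesthetic constraints as additional objectives so that accessibility does not collapse into monotone schemes.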
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Koga, Dennis (Technical Monitor)
2000-01-01
In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out every computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation analogue' of algorithmic information complexity. It is proven in that second paper that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.
Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report
NASA Technical Reports Server (NTRS)
1989-01-01
The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on board the Freedom Space Station. The further analysis performed on the SCS study as part of task 2 (Perform Studies and Parametric Analysis) of the SCS study contract is summarized. These analyses were performed to resolve open issues remaining after the completion of task 1 and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS task 3 (Develop and Present SCS Requirements) and SCS task 4 (Develop SCS Conceptual Designs). The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.
Using Apex To Construct CPM-GOMS Models
NASA Technical Reports Server (NTRS)
John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger
2006-01-01
A process for automatically generating computational models of human/computer interactions, as well as graphical and textual representations of the models, has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of the goals, operators, methods, and selection rules (GOMS) method of analysis with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about behaviors of skilled computer users in routine tasks, but heretofore, such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to be used to model human behavior in complex, dynamic tasks. An inherent capability of Apex for scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model with measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM). For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement of the performance of) the user via a mouse. The times predicted by the automatically generated model turned out to approximate the measured times fairly well. While these results are promising, there is need for further development of the process. Moreover, it will also be necessary to test other, more complex models: the actions required of the user in the ATM task are too sequential to involve substantial parallelism and interleaving and, hence, do not serve as an adequate test of the unique strength of CPM-GOMS models to accommodate parallelism and interleaving.
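The "CPM" half of CPM-GOMS is a critical-path computation: the predicted task time is the longest path through a network of cognitive, perceptual, and motor operators, which is what lets overlapping (interleaved) operators shorten the total. A minimal sketch with hypothetical operator durations, not taken from the ATM model above:

```python
# Operator network: name -> (duration in ms, prerequisite operators).
operators = {
    "perceive-target": (290, []),
    "move-cursor":     (300, ["perceive-target"]),
    "verify-position": (100, ["perceive-target"]),
    "click-mouse":     (100, ["move-cursor", "verify-position"]),
}

def earliest_finish(ops):
    """Longest-path (critical-path) finish time for each operator;
    the predicted task time is the maximum over all operators."""
    memo = {}
    def finish(name):
        if name not in memo:
            dur, preds = ops[name]
            memo[name] = dur + max((finish(p) for p in preds), default=0)
        return memo[name]
    return {name: finish(name) for name in ops}

times = earliest_finish(operators)
print(times)                                          # per-operator finish times
print("predicted task time:", max(times.values()))   # critical path: 690 ms
```

Note that "verify-position" runs in parallel with "move-cursor" and adds nothing to the total; only the critical path determines the prediction, which is the property a PERT chart visualizes.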
Some foundational aspects of quantum computers and quantum robots.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benioff, P.; Physics
1998-01-01
This paper addresses foundational issues related to quantum computing. The need for a universally valid theory such as quantum mechanics to describe to some extent its own validation is noted. This includes quantum mechanical descriptions of systems that do theoretical calculations (i.e. quantum computers) and systems that perform experiments. Quantum robots interacting with an environment are a small first step in this direction. Quantum robots are described here as mobile quantum systems with on-board quantum computers that interact with environments. Included are discussions on the carrying out of tasks and the division of tasks into computation and action phases. Specific models based on quantum Turing machines are described. Differences and similarities between quantum robots plus environments and quantum computers are discussed.
Dual tasking and stuttering: from the laboratory to the clinic.
Metten, Christine; Bosshardt, Hans-Georg; Jones, Mark; Eisenhuth, John; Block, Susan; Carey, Brenda; O'Brian, Sue; Packman, Ann; Onslow, Mark; Menzies, Ross
2011-01-01
The aim of the three studies in this article was to develop a way to include dual tasking in speech restructuring treatment for persons who stutter (PWS). It is thought that this may help clients maintain the benefits of treatment in the real world, where attentional resources are frequently diverted away from controlling fluency by the demands of other tasks. In Part 1, 17 PWS performed a story-telling task and a computer semantic task simultaneously. Part 2 reports the incorporation of the Part 1 protocol into a handy device for use in a clinical setting (the Dual Task and Stuttering Device, DAS-D). Part 3 is a proof of concept study in which three PWS reported on their experiences of using the device during treatment. In Part 1, stuttering frequency and errors on the computer task both increased under dual task conditions, indicating that the protocol would be appropriate for use in a clinical setting. All three participants in Part 3 reported positively on their experiences using the DAS-D. Dual tasking during treatment using the DAS-D appears to be a viable clinical procedure. Further research is required to establish effectiveness.
Multimodal Word Meaning Induction From Minimal Exposure to Natural Text.
Lazaridou, Angeliki; Marelli, Marco; Baroni, Marco
2017-04-01
By the time they reach early adulthood, English speakers are familiar with the meaning of thousands of words. In the last decades, computational simulations known as distributional semantic models (DSMs) have demonstrated that it is possible to induce word meaning representations solely from word co-occurrence statistics extracted from a large amount of text. However, while these models learn in batch mode from large corpora, human word learning proceeds incrementally after minimal exposure to new words. In this study, we run a set of experiments investigating whether minimal distributional evidence from very short passages suffices to trigger successful word learning in subjects, testing their linguistic and visual intuitions about the concepts associated with new words. After confirming that subjects are indeed very efficient distributional learners even from small amounts of evidence, we test a DSM on the same multimodal task, finding that it behaves in a remarkable human-like way. We conclude that DSMs provide a convincing computational account of word learning even at the early stages in which a word is first encountered, and the way they build meaning representations can offer new insights into human language acquisition. Copyright © 2017 Cognitive Science Society, Inc.
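The core mechanism of a count-based DSM, inducing word vectors from co-occurrence statistics and comparing them by cosine similarity, fits in a few lines. The toy corpus and window size below are illustrative; the models tested in the study are trained on large corpora and with more sophisticated architectures.

```python
from collections import Counter, defaultdict
from math import sqrt

corpus = ("the cat chased the mouse . the dog chased the cat . "
          "the mouse ate the cheese .").split()

# Count co-occurrences within a symmetric window of 2 words (toy DSM).
window = 2
vectors = defaultdict(Counter)
for i, word in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            vectors[word][corpus[j]] += 1

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in u)
    norm = lambda w: sqrt(sum(x * x for x in w.values()))
    return dot / (norm(u) * norm(v))

# Words appearing in similar contexts get similar vectors:
print(cosine(vectors["cat"], vectors["mouse"]))   # ~0.93
print(cosine(vectors["cat"], vectors["cheese"]))  # ~0.63
```

The study's question is whether such statistics suffice after a single short passage rather than a large corpus; the mechanism, counting and comparing contexts, is the same.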
Task-specific performance effects with different numeric keypad layouts.
Armand, Jenny T; Redick, Thomas S; Poulsen, Joan R
2014-07-01
Two commonly used keypad arrangements are the telephone and calculator layouts. The purpose of this study was to determine whether entering different types of numeric information was quicker and more accurate with the telephone or the calculator layout on a computer keyboard numeric keypad. Fifty-seven participants saw a 10-digit numeric stimulus to type with a computer number keypad as quickly and as accurately as possible. Stimuli were presented in either a numerical [1,234,567,890] or phone [(123) 456-7890] format. The results indicated that participants' memory of the arrangement of keys on a telephone was significantly better than their memory of the calculator layout. In addition, the results showed that participants were more accurate when entering stimuli using the calculator keypad layout. Critically, participants' response times showed an interaction of stimulus format and keypad layout: participants were specifically slowed when entering numeric stimuli using a telephone keypad layout. Responses made using the middle row of keys were faster and more accurate than responses using the top and bottom rows of keys. Implications for keypad design and cell phone usage are discussed. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Study of low speed flow cytometry for diffraction imaging with different chamber and nozzle designs.
Sa, Yu; Feng, Yuanming; Jacobs, Kenneth M; Yang, Jun; Pan, Ran; Gkigkitzis, Ioannis; Lu, Jun Q; Hu, Xin-Hua
2013-11-01
Achieving effective hydrodynamic focusing and flow stability at low speed presents a challenging design task in flow cytometry for studying phenomena such as cell adhesion and diffraction imaging of cells with low-cost cameras. We have developed different designs of flow chamber and sheath nozzle to accomplish the above goal. A 3D computational model of the chambers has been established to simulate the fluid dynamics in different chamber designs and measurements have been performed to determine the velocity and size distributions of the core fluid from the nozzle. Comparison of the simulation data with experimental results shows good agreement. With the computational model significant insights were gained for optimization of the chamber design and improvement of the cell positioning accuracy for study of slow moving cells. The benefit of low flow speed has been demonstrated also by reduced blurring in the diffraction images of single cells. Based on these results, we concluded that the new designs of chamber and sheath nozzle produce stable hydrodynamic focusing of the core fluid at low speed and allow detailed study of cellular morphology under various rheological conditions using the diffraction imaging method. © 2013 International Society for Advancement of Cytometry.
Take Russia to 'task' on bioweapons transparency.
Zilinskas, Raymond A
2012-06-06
In the run-up to his reelection, Russian president Vladimir Putin outlined 28 tasks to be undertaken by his administration, including one that commanded the development of weapons based on “genetic principles.” Political pressure must be applied by governments and professional societies to ensure that there is not a modern reincarnation of the Soviet biological warfare program.
Expanding the Range, Dividing the Task: Educating the Human Brain in an Electronic Society.
ERIC Educational Resources Information Center
Sylwester, Robert
1990-01-01
Reviews five properties of the brain that are central to dividing educational tasks between minds and machines and creating curricula to help students understand the complementary relationships between the brain and supportive machinery. The curriculum should focus on knowledge, skills, and values that most characterize and enhance our brain's…
APA (American Psychological Association) Task Force on Privacy and Confidentiality. Final Report.
ERIC Educational Resources Information Center
American Psychological Association, Washington, DC.
This Task Force on Privacy and Confidentiality is intended to call attention to the central role of the right to privacy in the maintenance and enrichment of a free society. The psychological implications of the changing views of privacy as reflected in political, social, and technological developments are discussed, and recommendations relating…
Evolutionary Paths to Corrupt Societies of Artificial Agents
NASA Astrophysics Data System (ADS)
Nasrallah, Walid
Virtual corrupt societies can be defined as groups of interacting computer-generated agents who predominantly choose behavior that gives short-term personal gain at the expense of a higher aggregate cost to others. Unlike published models in which cooperation must evolve for the society to survive, this paper focuses on corrupt societies that do not naturally die out as the corrupt class siphons off resources. For example, a very computationally simple strategy of avoiding confrontation can allow a majority of "unethical" individuals to survive off the efforts of an "ethical" but productive minority. Analogies are drawn to actual human societies in which similar conditions gave rise to behavior traditionally defined as economic or political corruption.
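A minimal caricature of such a society follows (this is an illustration, not the paper's model): a productive "ethical" minority generates resources, an "unethical" majority avoids confrontation and skims a little from random producers, and the society persists as long as total skim stays below total production.

```python
import random

random.seed(0)

class Agent:
    def __init__(self, ethical):
        self.ethical = ethical
        self.wealth = 10.0

def step(agents, produce=3.0, skim=1.0, cost_of_living=1.0):
    producers = [a for a in agents if a.ethical]
    skimmers = [a for a in agents if not a.ethical]
    for a in producers:
        a.wealth += produce
    if producers:
        for s in skimmers:                     # avoid confrontation: take a
            victim = random.choice(producers)  # little from a random producer
            take = min(skim, victim.wealth)
            victim.wealth -= take
            s.wealth += take
    for a in agents:
        a.wealth -= cost_of_living
    return [a for a in agents if a.wealth > 0]  # the broke drop out

agents = ([Agent(ethical=True) for _ in range(10)] +
          [Agent(ethical=False) for _ in range(15)])
for t in range(200):
    agents = step(agents)
print(sum(a.ethical for a in agents), "ethical and",
      sum(not a.ethical for a in agents), "unethical agents remain")
```

With these parameters each producer nets a surplus even after expected skimming, so the corrupt majority persists indefinitely; raising the skim rate past the production surplus makes the society collapse, the boundary the paper is concerned with.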
ERIC Educational Resources Information Center
Rozell, E. J.; Gardner, W. L., III
1999-01-01
A model of the intrapersonal processes impacting computer-related performance was tested using data from 75 manufacturing employees in a computer training course. Gender, computer experience, and attributional style were predictive of computer attitudes, which were in turn related to computer efficacy, task-specific performance expectations, and…
Establishing a group of endpoints in a parallel computer
Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.; Xue, Hanhong
2016-02-02
A parallel computer executes a number of tasks; each task includes a number of endpoints, and the endpoints are configured to support collective operations. In such a parallel computer, establishing a group of endpoints includes receiving a user specification of a set of endpoints included in a global collection of endpoints, where the user specification defines the set in accordance with a predefined virtual representation of the endpoints. The predefined virtual representation is a data structure setting forth an organization of tasks and endpoints included in the global collection, and the user specification defines the set of endpoints without specifying any particular endpoint. A group of endpoints is then defined in dependence upon the predefined virtual representation of the endpoints and the user specification.
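The idea, defining a group of endpoints by a rule over the global collection rather than by enumerating individual endpoints, is loosely analogous to group and communicator creation in MPI. A sketch using mpi4py (an analogy only, not the patented mechanism; run under e.g. `mpiexec -n 8`):

```python
from mpi4py import MPI

world = MPI.COMM_WORLD
world_group = world.Get_group()

# "User specification": every even-ranked task, given as a rule over the
# global collection rather than a hand-written list of endpoints.
even_ranks = [r for r in range(world.Get_size()) if r % 2 == 0]
even_group = world_group.Incl(even_ranks)
even_comm = world.Create(even_group)   # collective over the world communicator

if even_comm != MPI.COMM_NULL:
    # Collective operations can now run over just this group.
    total = even_comm.allreduce(world.Get_rank(), op=MPI.SUM)
    print(f"rank {world.Get_rank()}: sum over even ranks = {total}")
```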
Flow of a Gas Turbine Engine Low-Pressure Subsystem Simulated
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1997-01-01
The NASA Lewis Research Center is managing a task to numerically simulate overnight, on a parallel computing testbed, the aerodynamic flow in the complete low-pressure subsystem (LPS) of a gas turbine engine. The model solves the three-dimensional Navier- Stokes flow equations through all the components within the LPS, as well as the external flow around the engine nacelle. The LPS modeling task is being performed by Allison Engine Company under the Small Engine Technology contract. The large computer simulation was evaluated on networked computer systems using 8, 16, and 32 processors, with the parallel computing efficiency reaching 75 percent when 16 processors were used.
Software designs of image processing tasks with incremental refinement of computation.
Anastasia, Davide; Andreopoulos, Yiannis
2010-08-01
Software realizations of computationally demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities, since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution, and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
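The bitplane idea can be sketched on the kernel common to transforms, convolution, and matching: a dot product refined one input bitplane at a time, most significant first, so that computation can stop whenever the cycle budget runs out. This is a minimal sketch of the principle, not the paper's released designs.

```python
import numpy as np

def incremental_dot(x, w, n_bits=8):
    """Refine <x, w> one bitplane at a time, most significant first.

    x: nonnegative integer input samples; w: integer coefficients.
    Yields successive approximations; each is exact for the bits seen
    so far, so the caller may stop early under a tight deadline."""
    x = np.asarray(x, dtype=np.int64)
    w = np.asarray(w, dtype=np.int64)
    acc = 0
    for bit in range(n_bits - 1, -1, -1):
        plane = (x >> bit) & 1           # one bitplane of the input
        acc += int(plane @ w) << bit     # its exact contribution
        yield acc                        # stop here if the budget runs out

x = [200, 17, 99, 250]
w = [1, -2, 3, 1]
for step, approx in enumerate(incremental_dot(x, w)):
    print(f"after bitplane {7 - step}: {approx}")
print("exact:", int(np.dot(x, w)))  # final refinement matches the exact result
```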
Rose, Michael; Rubal, Bernard; Hulten, Edward; Slim, Jennifer N; Steel, Kevin; Furgerson, James L; Villines, Todd C
2014-01-01
Background: The correlation between normal cardiac chamber linear dimensions measured during retrospective coronary computed tomographic angiography as compared to transthoracic echocardiography using the American Society of Echocardiography guidelines is not well established. Methods: We performed a review from January 2005 to July 2011 to identify subjects with retrospective electrocardiogram-gated coronary computed tomographic angiography scans for chest pain and transthoracic echocardiography with normal cardiac structures performed within 90 days. Dimensions were manually calculated in both imaging modalities in accordance with the American Society of Echocardiography published guidelines. Left ventricular ejection fraction was calculated on echocardiography manually using the Simpson's formula and by coronary computed tomographic angiography using the end-systolic and end-diastolic volumes. Results: We reviewed 532 studies, rejected 412, and had 120 cases for review, with a median time between studies of 7 days (interquartile range, 0-22 days), and found no correlation between the measurements made by coronary computed tomographic angiography and transthoracic echocardiography using Bland-Altman analysis. We generated coronary computed tomographic angiography cardiac dimension reference ranges for both genders for our population. Conclusion: Our findings represent a step towards generating cardiac chamber dimension reference ranges for coronary computed tomographic angiography as compared to transthoracic echocardiography in patients with normal cardiac morphology and function, using the American Society of Echocardiography guideline measurements that are commonly used by cardiologists. PMID:26770706
Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?
Chan, Micaela Y; Haber, Sara; Drew, Linda M; Park, Denise C
2016-06-01
Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesized that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning, and a Social group that had regular social interaction but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America.
Chemical computing with reaction-diffusion processes.
Gorecki, J; Gizynski, K; Guzowski, J; Gorecka, J N; Garstecki, P; Gruenert, G; Dittrich, P
2015-07-28
Chemical reactions are responsible for information processing in living organisms. It is believed that the basic features of biological computing activity are reflected by a reaction-diffusion medium. We illustrate the ideas of chemical information processing considering the Belousov-Zhabotinsky (BZ) reaction and its photosensitive variant. The computational universality of information processing is demonstrated. For different methods of information coding, constructions of the simplest signal processing devices are described. The function performed by a particular device is determined by the geometrical structure of oscillatory (or of excitable) and non-excitable regions of the medium. In a living organism, the brain is created as a self-grown structure of interacting nonlinear elements and reaches its functionality as the result of learning. We discuss whether such a strategy can be adopted for the generation of chemical information processing devices. Recent studies have shown that lipid-covered droplets containing a solution of reagents of the BZ reaction can be transported by flowing oil. Therefore, structures of droplets can be spontaneously formed under specific non-equilibrium conditions, for example forced by flows in a microfluidic reactor. We describe how to introduce information to a droplet structure, track the information flow inside it, and optimize medium evolution to achieve maximum reliability. Applications of droplet structures for classification tasks are discussed. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Classification and Feature Selection Algorithms for Modeling Ice Storm Climatology
NASA Astrophysics Data System (ADS)
Swaminathan, R.; Sridharan, M.; Hayhoe, K.; Dobbie, G.
2015-12-01
Ice storms account for billions of dollars of winter storm loss across the continental US and Canada. In the future, increasing concentration of human populations in areas vulnerable to ice storms, such as the northeastern US, will only exacerbate the impacts of these extreme events on infrastructure and society. Quantifying the potential impacts of global climate change on ice storm prevalence and frequency is challenging, as ice storm climatology is driven by complex and incompletely defined atmospheric processes, processes that are in turn influenced by a changing climate. This makes the underlying atmospheric and computational modeling of ice storm climatology a formidable task. We propose a novel computational framework that uses sophisticated stochastic classification and feature selection algorithms to model ice storm climatology and quantify storm occurrences from both reanalysis and global climate model outputs. The framework is based on an objective identification of ice storm events by key variables derived from vertical profiles of temperature, humidity, and geopotential height. Historical ice storm records are used to identify days with synoptic-scale upper-air and surface conditions associated with ice storms. Evaluation using NARR reanalysis and historical ice storm records corresponding to the northeastern US demonstrates that an objective computational model can, by standard performance measures, identify ice storm events from upper-air circulation patterns with a relatively high degree of accuracy and can provide insights into the relationships between key climate variables associated with ice storms.
Applicability of computational systems biology in toxicology.
Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie
2014-07-01
Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about the individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets, tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more than a high-throughput literature search; it may form the basis for establishing hypotheses on potential links between environmental chemicals and human diseases that would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied to designing more intelligent animal and cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
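As a toy illustration of the workflow described above (not the authors' pipeline), the sketch below feeds a hypothetical chemical target into a hand-made protein-protein interaction graph and collects the diseases linked to proteins in the target's network neighbourhood; a real analysis would draw on curated databases such as STRING or DisGeNET.

```python
import networkx as nx

# Hand-made toy PPI graph; a real study would load a curated database.
ppi = nx.Graph()
ppi.add_edges_from([
    ("AHR", "ARNT"), ("ARNT", "HIF1A"), ("HIF1A", "VEGFA"),
    ("AHR", "ESR1"), ("ESR1", "BRCA1"),
])
# Hypothetical protein-disease associations:
disease = {"VEGFA": "retinopathy", "BRCA1": "breast cancer"}

chemical_targets = ["AHR"]  # experimentally determined targets of the chemical

# Expand each target to its first- and second-degree PPI neighbours and
# collect the diseases linked to any protein in that neighbourhood.
hypotheses = set()
for t in chemical_targets:
    neighbourhood = nx.single_source_shortest_path_length(ppi, t, cutoff=2)
    for protein in neighbourhood:
        if protein in disease:
            hypotheses.add((t, protein, disease[protein]))

print(hypotheses)  # candidate chemical-disease links to test experimentally
```

The output is exactly the kind of hypothesis the abstract describes: a candidate chemical-disease link that can then guide targeted animal or cell experiments.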
49 CFR 236.923 - Task analysis and basic requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...
An intelligent multi-media human-computer dialogue system
NASA Technical Reports Server (NTRS)
Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.
1988-01-01
Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce the user's cognitive load. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner, modelled after the way humans converse in simultaneous, coordinated, multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist), is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.
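CUBRICON's internals are not detailed in this abstract; the following hypothetical sketch shows one core mechanism any such multimodal interface needs: resolving a deictic spoken reference ("this one") by pairing it with the pointing gesture closest in time.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    t: float          # timestamp (s) of the pointing gesture
    target: str       # object under the pointer at that moment

@dataclass
class DeicticRef:
    t: float          # timestamp (s) of the spoken "this"/"that"
    phrase: str

def resolve(ref: DeicticRef, gestures: list[Gesture]) -> str:
    # Pick the gesture whose timestamp is nearest the spoken reference.
    nearest = min(gestures, key=lambda g: abs(g.t - ref.t))
    return nearest.target

gestures = [Gesture(1.2, "airbase_A"), Gesture(4.7, "radar_site_3")]
ref = DeicticRef(4.5, "this site")
print(resolve(ref, gestures))  # -> radar_site_3
```

Time alignment is only the simplest fusion cue; a full system like the one described would also use dialogue context and domain knowledge to disambiguate.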
Engineering and Computing Portal to Solve Environmental Problems
NASA Astrophysics Data System (ADS)
Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.
2018-01-01
This paper describes the architecture and services of the Engineering and Computing Portal, a complex solution that provides access to high-performance computing resources, enables users to carry out computational experiments, supports the teaching of parallel technologies, and solves computing tasks, including those related to technogenic safety.
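The portal's own interfaces are not documented in this abstract; as a stand-in, the sketch below shows how a portal service might wrap batch submission to an HPC cluster, assuming a SLURM backend (sbatch), which the paper does not specify.

```python
import subprocess
import tempfile
import textwrap

def submit_mpi_job(command: str, ntasks: int = 16) -> str:
    """Write a SLURM batch script for `command` and submit it with sbatch."""
    script = textwrap.dedent(f"""\
        #!/bin/bash
        #SBATCH --ntasks={ntasks}
        #SBATCH --time=01:00:00
        srun {command}
        """)
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(script)
        path = f.name
    # sbatch prints "Submitted batch job <id>" on success.
    out = subprocess.run(["sbatch", path], capture_output=True, text=True)
    return out.stdout.strip()

# Hypothetical usage on a cluster login node:
# print(submit_mpi_job("./diffusion_model --input plume.cfg"))
```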
ERIC Educational Resources Information Center
Wilson, Kimberly; Narayan, Anupama
2016-01-01
This study investigates relationships between self-efficacy, self-regulated learning strategy use and academic performance. Participants were 96 undergraduate students working on projects with three subtasks (idea generation task, methodical task and data collection) in a blended learning environment. Task self-efficacy was measured with…
Rivard, Justin D; Vergis, Ashley S; Unger, Bertram J; Hardy, Krista M; Andrew, Chris G; Gillman, Lawrence M; Park, Jason
2014-06-01
Computer-based surgical simulators capture a multitude of metrics based on different aspects of performance, such as speed, accuracy, and movement efficiency. However, without rigorous assessment, it may be unclear whether all, some, or none of these metrics actually reflect technical skill, which can compromise educational efforts on these simulators. We assessed the construct validity of individual performance metrics on the LapVR simulator (Immersion Medical, San Jose, CA, USA) and used these data to create task-specific summary metrics. Medical students with no prior laparoscopic experience (novices, N = 12), junior surgical residents with some laparoscopic experience (intermediates, N = 12), and experienced surgeons (experts, N = 11) all completed three repetitions of four LapVR simulator tasks. The tasks included three basic skills (peg transfer, cutting, clipping) and one procedural skill (adhesiolysis). We selected 36 individual metrics on the four tasks that assessed six different aspects of performance, including speed, motion path length, respect for tissue, accuracy, task-specific errors, and successful task completion. Four of seven individual metrics assessed for peg transfer, six of ten metrics for cutting, four of nine metrics for clipping, and three of ten metrics for adhesiolysis discriminated between experience levels. Time and motion path length were significant on all four tasks. We used the validated individual metrics to create summary equations for each task, which successfully distinguished between the different experience levels. Educators should maintain some skepticism when reviewing the plethora of metrics captured by computer-based simulators, as some but not all are valid. We showed the construct validity of a limited number of individual metrics and developed summary metrics for the LapVR. The summary metrics provide a succinct way of assessing skill with a single metric for each task, but require further validation.
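The paper's summary equations are not reproduced in the abstract; one common construction, assumed here purely for illustration, is to z-score each validated metric against the novice group, flip the sign of "lower is better" metrics, and average them into a single per-task score.

```python
import numpy as np

def summary_score(metrics: dict[str, float],
                  novice_mean: dict[str, float],
                  novice_sd: dict[str, float],
                  lower_is_better: set[str]) -> float:
    """Average of z-scores vs. novices; higher always means better."""
    zs = []
    for name, value in metrics.items():
        z = (value - novice_mean[name]) / novice_sd[name]
        zs.append(-z if name in lower_is_better else z)
    return float(np.mean(zs))

# Hypothetical peg-transfer metrics (time and path length: lower is better)
score = summary_score(
    {"time_s": 48.0, "path_cm": 310.0, "pegs_dropped": 0.0},
    novice_mean={"time_s": 90.0, "path_cm": 520.0, "pegs_dropped": 1.5},
    novice_sd={"time_s": 20.0, "path_cm": 110.0, "pegs_dropped": 1.0},
    lower_is_better={"time_s", "path_cm", "pegs_dropped"},
)
print(round(score, 2))  # higher = closer to expert performance
```

Whatever the exact weighting, the key point of the study stands: only metrics that individually discriminate between experience levels should enter such a summary.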
Task-induced frequency modulation features for brain-computer interfacing
NASA Astrophysics Data System (ADS)
Jayaram, Vinay; Hohmann, Matthias; Just, Jennifer; Schölkopf, Bernhard; Grosse-Wentrup, Moritz
2017-10-01
Objective. Task-induced amplitude modulation of neural oscillations is routinely used in brain-computer interfaces (BCIs) for decoding subjects’ intents, and underlies some of the most robust and common methods in the field, such as common spatial patterns and Riemannian geometry. While there has been some interest in phase-related features for classification, both techniques usually presuppose that the frequencies of neural oscillations remain stable across various tasks. We investigate here whether features based on task-induced modulation of the frequency of neural oscillations enable decoding of subjects’ intents with an accuracy comparable to task-induced amplitude modulation. Approach. We compare cross-validated classification accuracies using the amplitude-modulated and frequency-modulated features, as well as a joint feature space, across subjects in various paradigms and pre-processing conditions. We show results for a motor imagery task and a cognitive task, along with preliminary results for patients with amyotrophic lateral sclerosis (ALS), using both common spatial patterns and Laplacian filtering. Main results. The frequency features alone do not significantly outperform traditional amplitude modulation features, and in some cases perform significantly worse. However, across both tasks and pre-processing conditions in healthy subjects, the joint space significantly outperforms either the frequency or amplitude features alone. The only exception is the ALS patients, for whom the dataset is of insufficient size to draw any statistically significant conclusions. Significance. Task-induced frequency modulation is robust and straightforward to compute, and increases performance when added to standard amplitude modulation features across paradigms. This allows more information to be extracted from the EEG signal cheaply and can be used throughout the field of BCIs.
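The paper's exact pipeline is not given in the abstract; the sketch below illustrates the two feature families under assumed choices (Welch power spectral density, an 8-13 Hz mu band): log band power as the amplitude feature and within-band peak frequency as the frequency feature, concatenated into a joint feature vector.

```python
import numpy as np
from scipy.signal import welch

def band_features(eeg: np.ndarray, fs: float, band=(8.0, 13.0)):
    """eeg: (n_channels, n_samples). Returns (log band power, peak frequency) per channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    log_power = np.log(psd[:, mask].sum(axis=-1))          # amplitude feature
    peak_freq = freqs[mask][psd[:, mask].argmax(axis=-1)]  # frequency feature
    return log_power, peak_freq

# Synthetic two-channel "EEG": oscillations at 10 Hz and 11.5 Hz plus noise.
fs = 250.0
t = np.arange(int(fs * 4)) / fs
eeg = np.vstack([np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 11.5 * t)])
eeg += 0.1 * np.random.default_rng(0).standard_normal(eeg.shape)

amp, freq = band_features(eeg, fs)
# The joint feature space simply concatenates the two families:
print(np.concatenate([amp, freq]))
```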
Cognitive Approaches for Medicine in Cloud Computing.
Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia
2018-03-03
This paper presents the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibilities of using the meaning-based approach to data description and analysis are proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing aim to support the processes of protecting data against unauthorised takeover and serve to enhance data management processes. The outcome of the proposed tasks is the definition of algorithms for executing meaning-based data interpretation processes in secure Cloud Computing. • We propose cognitive methods for data description. • We propose techniques for securing data in Cloud Computing. • We describe the application of cognitive approaches to medicine.
Coffee intake and development of pain during computer work.
Strøm, Vegard; Røe, Cecilie; Knardahl, Stein
2012-09-03
The present study sought to determine whether subjects who had consumed coffee before performing a simulated computer office-work task, previously found to provoke pain in the neck, shoulders, forearms, and wrists, exhibited a different time course of pain development than subjects who had abstained from coffee. Forty-eight subjects, all working full time, 22 with chronic shoulder and neck pain and 26 healthy pain-free subjects, were recruited to perform a computer-based office-work task for 90 min. Nineteen (40%) of the subjects had consumed coffee (1/2-1 cup) on average 1 h 18 min before the start. Pain intensity in the shoulders, neck, forearms, and wrists was rated on a visual analogue scale every 15 min throughout the work task. During the work task, the coffee consumers exhibited a significantly lower pain increase than those who abstained from coffee. Thus, subjects who had consumed coffee before starting a pain-provoking office-work task exhibited attenuated pain development compared with subjects who had abstained from coffee. These results suggest a potentially interesting pain-modulating effect of caffeine in an everyday setting. However, studies with a double-blind, placebo-controlled, randomized design are needed.
ERIC Educational Resources Information Center
Kimura, Tadamasa
2010-01-01
The objective of this dissertation is to explore the socio-cultural contextualization of the digital divide in Japanese society. I undertake this task by developing a theoretical and methodological framework based on the notion of "culture as models," while explicating the cultural dimensions of the digital divide and the dynamics of…
Leadership Education Priorities for a Democratic Society
ERIC Educational Resources Information Center
Jenlink, Patrick M.
2010-01-01
Determining the priorities for leadership education in a democratic society is a complex, challenging responsibility, not a task to be taken lightly. It is complex on one level in that to be a leader in schools "today is to understand a profoundly human as well as a professional responsibility." It is challenging on another level in that preparing…
Democratic Miseducation: Preparing Students for Democracies That Do Not Exist.
ERIC Educational Resources Information Center
Vaughan, Geoffrey M.
The political educator takes the perspective that, in Thomas Hobbes's phrase, "man is not born fit for society." To make him so fit, contemporary political educators seek to develop individual autonomy and democratic affect, which would have the added task of reforming all of society in the future. The current consensus holds that the…
Education and a Progressive Orientation towards a Cosmopolitan Society
ERIC Educational Resources Information Center
Roth, Klas
2012-01-01
Robin Barrow claims in his "Moral education's modest agenda" that "the task of moral education is to develop understanding, at the lowest level, of the expectations of society and, at the highest level, of the nature of morality...[that is, that moral education] should go on to develop understanding, not of a particular social code, but of the…
Primary School Teachers as a Tool of Secularisation of Society in Communist Czechoslovakia
ERIC Educational Resources Information Center
Zounek, Jirí; Šimáne, Michal; Knotová, Dana
2017-01-01
This study focuses on the secularisation of society in communist Czechoslovakia (1948-1989) as a process in which primary school teachers played an important role. It aims to describe and explain typical everyday situations in which teachers were forced to fulfil tasks in connection with the Communist Party's politics of secularisation. The text…
The Writing Skill in the Contemporary Society: The Kenyan Perspective
ERIC Educational Resources Information Center
Okari, Florence Mokeira
2016-01-01
This paper is an overview of the writing skill in the lower levels of learning in the contemporary society. The following areas of writing are highlighted: the writing programme and its goals, the basic methodology for writing tasks, broad groups of writing skills, the teaching of the writing skills in pre-primary and primary schools where…
Ecological Democracy: An Environmental Approach to Citizenship Education
ERIC Educational Resources Information Center
Houser, Neil O.
2009-01-01
Civic educators strive to develop the kinds of citizens who can identify and address the significant challenges of life in society. A case can be made that we have failed in this fundamental task. In spite of our efforts, contemporary societies seem ill-equipped to cope with the enormous social and environmental issues of our age. The problem is…
"Twenty Percent Free!" So How Much Does the Original Bar Weigh?
ERIC Educational Resources Information Center
Hawera, Ngarewa; Taylor, Merilyn
2011-01-01
Developing critical numeracy is important in a society where mathematics plays a particular and significant role. One way of helping learners to develop the level of numeracy required to participate in society is by exploring ideas embedded in rich, accessible tasks. These can be linked to contexts that have relevance in their lives. One…
Explicit and implicit anti-fat attitudes in children and their relationships with their body images.
Solbes, Irene; Enesco, Ileana
2010-02-01
This study aimed to explore the prevalence of negative attitudes toward overweight peers among children using different explicit and implicit measures, and to analyze the relationships of these attitudes with some aspects of the children's body image. A total of 120 children aged 6-11 years were interviewed using a computer program that simulated a game containing several tasks. Specifically, we applied multiple measures of explicit attitudes toward average-weight/overweight peers, several questions about personal body attitudes, and a child-oriented version of the Implicit Association Test. Our participants showed substantial prejudice and stereotyping against overweight children, at both the explicit and implicit levels. However, we found important differences in the intensity of prejudice and its developmental course as a function of the tasks and the type of measurement used to assess it. Children who grow up in Western societies idealize thinness from an early age and denigrate overweight peers, with whom they explicitly and implicitly associate a series of negative traits that have nothing to do with weight. As they grow older, they seem to reduce their levels of explicit prejudice, but not the intensity of implicit bias. More research is needed to study prejudice and discrimination toward overweight children in depth from a developmental point of view. Copyright 2010 S. Karger AG, Basel.
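The abstract does not state the scoring algorithm for the child-oriented IAT; for orientation, the sketch below computes a simplified version of the standard IAT D score: the mean latency difference between incompatible and compatible blocks divided by the pooled standard deviation (a simplification of Greenwald et al.'s improved algorithm).

```python
import numpy as np

def iat_d_score(compatible_rts: np.ndarray, incompatible_rts: np.ndarray) -> float:
    """Reaction times in ms; here a positive D indicates implicit anti-fat bias."""
    pooled_sd = np.concatenate([compatible_rts, incompatible_rts]).std(ddof=1)
    return float((incompatible_rts.mean() - compatible_rts.mean()) / pooled_sd)

# Simulated latencies for one child (hypothetical values):
rng = np.random.default_rng(1)
compatible = rng.normal(650, 90, 40)     # e.g. thin+good / fat+bad pairing
incompatible = rng.normal(780, 110, 40)  # reversed pairing
print(round(iat_d_score(compatible, incompatible), 2))
```

Because the D score is latency-based rather than self-reported, it can remain elevated with age even as explicit prejudice declines, which is exactly the dissociation the study reports.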