ERIC Educational Resources Information Center
Hull, Daniel M.; Lovett, James E.
This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…
Integrating Cognitive Task Analysis into Instructional Systems Development.
ERIC Educational Resources Information Center
Ryder, Joan M.; Redding, Richard E.
1993-01-01
Discussion of instructional systems development (ISD) focuses on recent developments in cognitive task analysis and describes the Integrated Task Analysis Model, a framework for integrating cognitive and behavioral task analysis methods within the ISD model. Three components of expertise are analyzed: skills, knowledge, and mental models. (96…
Naturalistic Decision Making: Implications for Design
1993-04-01
Keywords: cognitive task analysis; decision making; design engineering; design systems; human-computer interfaces; system development. …people use to select a course of action. The SOAR model explains how stress affects the decision making of both individuals and teams. COGNITIVE TASK ANALYSIS: This… procedures for Cognitive Task Analysis, contrasting the strengths and weaknesses of each, and showing how a Cognitive Task Analysis
NASA Astrophysics Data System (ADS)
Afrahamiryano, A.; Ariani, D.
2018-04-01
Student task analysis is one part of the define stage in development research using the 4-D development model. This analysis is useful for determining students' level of understanding of the lecture materials they have been given. Its results serve as a measure of the success of learning and as a basis for developing the lecture system. The analysis was carried out through observation and a documentation study of the tasks undertaken by students. The results were described, and triangulation was then performed to draw conclusions. The analysis indicates that students' understanding is high for theoretical material and low for computational material. Based on these results, it can be concluded that the e-learning lecture system under development should aim to increase students' understanding of basic chemistry topics that involve calculation.
SYFSA: A Framework for Systematic Yet Flexible Systems Analysis
Johnson, Todd R.; Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge R.; Thimbleby, Harold
2013-01-01
Although technological or organizational systems that enforce systematic procedures and best practices can lead to improvements in quality, these systems must also be designed to allow users to adapt to the inherent uncertainty, complexity, and variations in healthcare. We present a framework, called Systematic Yet Flexible Systems Analysis (SYFSA) that supports the design and analysis of Systematic Yet Flexible (SYF) systems (whether organizational or technical) by formally considering the tradeoffs between systematicity and flexibility. SYFSA is based on analyzing a task using three related problem spaces: the idealized space, the natural space, and the system space. The idealized space represents the best practice—how the task is to be accomplished under ideal conditions. The natural space captures the task actions and constraints on how the task is currently done. The system space specifies how the task is done in a redesigned system, including how it may deviate from the idealized space, and how the system supports or enforces task constraints. The goal of the framework is to support the design of systems that allow graceful degradation from the idealized space to the natural space. We demonstrate the application of SYFSA for the analysis of a simplified central line insertion task. We also describe several information-theoretic measures of flexibility that can be used to compare alternative designs, and to measure how efficiently a system supports a given task, the relative cognitive workload, and learnability. PMID:23727053
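The information-theoretic flavor of SYFSA's flexibility measures can be illustrated with a toy sketch: model each problem space as the set of action sequences it permits for a task, and compare their entropies. The formula and the task steps below are assumptions for illustration, not the paper's actual definitions.

```python
import math

# Hypothetical sketch: each SYFSA problem space is modeled as the set of
# action sequences it permits. Flexibility is measured here as log2 of the
# number of permitted sequences; this specific measure is an assumption,
# not the authors' definition. Task steps are invented for illustration.

def entropy_of_space(sequences):
    """log2 of the number of distinct permitted action sequences."""
    return math.log2(len(set(sequences)))

# Idealized space: exactly one best-practice ordering of the steps.
idealized = {("scrub", "drape", "insert", "verify")}

# Natural space: how the task is actually done, including a shortcut.
natural = {
    ("scrub", "drape", "insert", "verify"),
    ("drape", "scrub", "insert", "verify"),
    ("scrub", "insert", "verify"),  # draping skipped under time pressure
}

# System space: order relaxed, but the system enforces that no step is dropped.
system = {
    ("scrub", "drape", "insert", "verify"),
    ("drape", "scrub", "insert", "verify"),
}

# The system space sits between idealized (0 bits) and natural (log2 3 bits),
# illustrating "graceful degradation" from the idealized space.
for name, space in [("idealized", idealized), ("natural", natural), ("system", system)]:
    print(name, round(entropy_of_space(space), 3))
```

Under this toy measure, a design alternative with lower entropy is more systematic and one with higher entropy is more flexible, which is the tradeoff the framework asks designers to weigh.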
NASA Astrophysics Data System (ADS)
Panfil, Wawrzyniec; Moczulski, Wojciech
2017-10-01
The paper presents a control system for a group of mobile robots intended to carry out inspection missions. The main research problem was to define a control system that facilitates cooperation among the robots in accomplishing the committed inspection tasks. Many well-known control systems use auctions for task allocation, where the subject of an auction is a task to be allocated. For missions with far more tasks than robots, however, it may be better for the robots, rather than the tasks, to be the subjects of the auctions. The second identified problem concerns one-sided robot-to-task fitness evaluation: simultaneously assessing both the robot's fitness for a task and the task's attractiveness to the robot should improve the overall effectiveness of the multi-robot system. The elaborated system assigns tasks to robots using various methods for evaluating fitness between robots and tasks, together with several task-allocation methods. A multi-criteria analysis method is proposed that combines two assessments: the robot's competitive position for a task among the other robots, and the task's attractiveness to the robot among the other tasks. Task-allocation methods applying this multi-criteria analysis are also proposed. Both the elaborated system and the proposed task-allocation methods were verified in simulated experiments; the object under test was a group of inspection mobile robots serving as a virtual counterpart of a real mobile-robot group.
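The two-sided assessment idea above can be sketched as follows: each robot-task pairing is scored both by the robot's fitness for the task and by the task's attractiveness to the robot, and pairs are then allocated greedily. The scoring functions, the multiplicative combination, and the data are illustrative assumptions, not the authors' method.

```python
# Toy sketch of two-sided robot/task assessment (an assumption for
# illustration, not the paper's algorithm): score every pairing from both
# sides, then allocate greedily by combined score.

def robot_fitness(robot, task):
    """How well the robot suits the task; here, inverse of 1-D distance."""
    return 1.0 / (1.0 + abs(robot["pos"] - task["pos"]))

def task_attractiveness(task, robot):
    """How attractive the task is to the robot; here, priority per distance."""
    return task["priority"] / (1.0 + abs(robot["pos"] - task["pos"]))

def allocate(robots, tasks):
    """Greedy one-to-one allocation: best combined score assigned first."""
    pairs = [
        (robot_fitness(r, t) * task_attractiveness(t, r), r["id"], t["id"])
        for r in robots for t in tasks
    ]
    assignment, used_r, used_t = {}, set(), set()
    for score, rid, tid in sorted(pairs, reverse=True):
        if rid not in used_r and tid not in used_t:
            assignment[rid] = tid
            used_r.add(rid)
            used_t.add(tid)
    return assignment

robots = [{"id": "r1", "pos": 0.0}, {"id": "r2", "pos": 10.0}]
tasks = [{"id": "t1", "pos": 1.0, "priority": 2.0},
         {"id": "t2", "pos": 9.0, "priority": 1.0}]
print(allocate(robots, tasks))  # each robot wins the nearby task
```

In an auction framing, the combined score would serve as the bid; auctioning robots instead of tasks simply swaps which side of the score initiates the bidding.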
Designing to Support Command and Control in Urban Firefighting
2008-06-01
…complex human-machine systems. Keywords: command and control, firefighting, cognitive systems engineering, cognitive task analysis. References include: Potter, S.S., Roth, E.M., Woods, D.D., & Elm, W. (2000). Bootstrapping multiple converging cognitive task analysis techniques for system design. In J.M.C. Schraagen, S.F. Chipman, & V.L. Shalin (Eds.), Cognitive Task Analysis (pp. 317-340). Mahwah, NJ: Lawrence Erlbaum. Rasmussen, J., Pejtersen, A., & Goodman, L. (1994). Cognitive
2003-10-01
Among the procedures developed to identify cognitive processes are Cognitive Task Analysis (CTA) and Cognitive Work Analysis (CWA)… of Cognitive Task Design. [11] Potter, S.S., Roth, E.M., Woods, D.D., & Elm, W.C. (2000). Cognitive task analysis as bootstrapping multiple converging techniques. In Schraagen, Chipman, & Shalin (Eds.), Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum Associates. [12] Roth, E.M…
A Method for Cognitive Task Analysis
1992-07-01
A method for cognitive task analysis is described, based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the… Implications of the model for applied areas, such as the development of knowledge-based systems and training, are discussed. Keywords: problem solving, cognitive task analysis, knowledge, strategies.
ERIC Educational Resources Information Center
Skinner, Anna; Diller, David; Kumar, Rohit; Cannon-Bowers, Jan; Smith, Roger; Tanaka, Alyssa; Julian, Danielle; Perez, Ray
2018-01-01
Background: Contemporary work in the design and development of intelligent training systems employs task analysis (TA) methods for gathering knowledge that is subsequently encoded into task models. These task models form the basis of intelligent interpretation of student performance within education and training systems. Also referred to as expert…
49 CFR 236.1043 - Task analysis and basic requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Positive Train Control Systems § 236.1043 Task analysis and basic requirements. (a) Training structure and... installation, maintenance, repair, modification, inspection, testing, and operating tasks that must be...
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
A Common Foundation of Information and Analytical Capability for AFSPC Decision Making
2005-06-23
[Flattened planning-diagram residue; recoverable terms: System Strategic Master Plan, MAPs/MSP, CRRAAF, Task Force CONOPS, MUA task weights, engagement analysis, ASIIS optimization, ACEIT cost analysis, engagement architecture analysis, architecture MUA, AFSPC POM, S&T planning, military utility analysis, Joint Capability Integration and Development System.]
49 CFR 236.923 - Task analysis and basic requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Standards for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements..., inspection, testing, and operating tasks that must be performed on a railroad's products. This includes the...
Using task analysis to understand the Data System Operations Team
NASA Technical Reports Server (NTRS)
Holder, Barbara E.
1994-01-01
The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.
Development of a Methodology for Assessing Aircrew Workloads.
1981-11-01
[Contents residue: Workload Feasibility Study; Subjects; Equipment; Data Analysis.] Keywords: analysis; simulation; standard time systems; switching; synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling. …standard data systems, information content analysis, work sampling, and job evaluation. Conventional methods were found to be deficient in accounting
A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis
1992-09-01
conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during
A Standard Procedure for Conducting Cognitive Task Analysis.
ERIC Educational Resources Information Center
Redding, Richard E.
Traditional methods for task analysis have been largely based on the Instructional Systems Development (ISD) model, which is widely used throughout industry and the military. The first part of this document gives an overview of cognitive task analysis, which is conducted within the first phase of ISD. The following steps of cognitive task analysis…
Evidential Reasoning in Expert Systems for Image Analysis.
1985-02-01
…techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis, particularly in the… (2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis, and (4) to identify… approaches to three important tasks for expert systems in the domain of image analysis. This segment concluded with an assessment of the strengths
DOT National Transportation Integrated Search
2012-05-16
This Communications Data Delivery System Analysis Task 2 report describes and analyzes options for Vehicle to Vehicle (V2V) and Vehicle to Infrastructure (V2I) communications data delivery systems using various communication media (Dedicated Short Ra...
Participatory Design Methods for C2 Systems (Proceedings/Presentation)
2006-01-01
Cognitive Task Analysis (CTA)… systems to support cognitive work such as is accomplished in a network-centric environment. Cognitive task analysis (CTA) methods are used to… A number of cognitive task analysis methodologies exist (Schraagen et al., 2000). However, many of these methods are viewed skeptically by a domain's
DOT National Transportation Integrated Search
1996-11-01
This working paper documents Task E of the present project, Task Analyses for Advanced Traveler Information Systems (ATIS) and Commercial Vehicle Operations (CVO) systems. The goal of Task E is to conduct detailed analyses of the influence of using A...
Pilot-model analysis and simulation study of effect of control task desired control response
NASA Technical Reports Server (NTRS)
Adams, J. J.; Gera, J.; Jaudon, J. B.
1978-01-01
A pilot model analysis was performed that relates pilot control compensation, pilot aircraft system response, and aircraft response characteristics for longitudinal control. The results show that a higher aircraft short period frequency is required to achieve superior pilot aircraft system response in an altitude control task than is required in an attitude control task. These results were confirmed by a simulation study of target tracking. It was concluded that the pilot model analysis provides a theoretical basis for determining the effect of control task on pilot opinions.
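The short-period effect described above can be illustrated with a minimal second-order sketch (not the paper's pilot model): a higher natural frequency yields a faster-settling response to a step command. The damping ratio and parameter values are illustrative assumptions.

```python
# Minimal second-order short-period sketch (an illustrative assumption,
# not the paper's pilot-aircraft model): integrate
#   x'' + 2*zeta*omega_n*x' + omega_n**2 * x = omega_n**2   (unit step input)
# with forward Euler and compare 2% settling times at two natural frequencies.

def settling_time(omega_n, zeta=0.7, tol=0.02, dt=0.001, t_end=20.0):
    """Last time the response is outside +/- tol of the final value 1.0."""
    x, v, t = 0.0, 0.0, 0.0
    last_outside = 0.0
    while t < t_end:
        a = omega_n**2 * (1.0 - x) - 2.0 * zeta * omega_n * v
        v += a * dt
        x += v * dt
        t += dt
        if abs(x - 1.0) > tol:
            last_outside = t
    return last_outside

# Higher short-period frequency -> shorter settling time, consistent with the
# finding that altitude control (a tighter task) benefits from higher frequency.
print(settling_time(2.0), settling_time(4.0))
```

The comparison is only qualitative: it shows why a stiffer short-period mode helps in a demanding outer-loop task, not the specific pilot-compensation results of the study.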
Nott, Melissa T; Chapparo, Christine
2008-09-01
Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance is assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. 
The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.
Cognitive Task Analysis of the HALIFAX-Class Operations Room Officer
1999-03-10
[Cover-sheet residue.] Classification: UNCLASSIFIED (system number 510918); dates covered 00-00-1999. Cognitive Task Analysis of the HALIFAX-Class Operations Room Officer. Human Systems Incorporated, Guelph, Ontario; PWGSC Contract No. W7711-7-7404/001/SV.
49 CFR 236.923 - Task analysis and basic requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...
1987-11-01
…differential qualitative (DQ) analysis, which solves the task, providing explanations suitable for use by design systems, automated diagnosis, intelligent tutoring systems, and explanation-based… comparative analysis as an important component; the explanation is used in many different ways. One method of automated design is the principled…
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 2
NASA Technical Reports Server (NTRS)
1985-01-01
Results of a Space Station Data System Analysis/Architecture Study for the Goddard Space Flight Center are presented. This study, which emphasized a system engineering design for a complete, end-to-end data system, was divided into six tasks: (1) functional requirements definition; (2) options development; (3) trade studies; (4) system definitions; (5) program plan; and (6) study maintenance. The task interrelationships and documentation flow are described. Volume 2 is devoted to Task 3, Trade Studies. Trade studies were carried out in the following areas: (1) software development, test, and integration capability; (2) fault-tolerant computing; (3) space-qualified computers; (4) distributed database management systems; (5) system integration, test, and verification; (6) crew workstations; (7) mass storage; (8) command and resource management; and (9) space communications. Results are presented for each area.
ERIC Educational Resources Information Center
Foley, John P., Jr.
A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…
Cognitive task analysis: Techniques applied to airborne weapons training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terranova, M.; Seamster, T.L.; Snyder, C.E.
1989-01-01
This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.
2001-08-01
This report presents the results of a preliminary Cognitive Task Analysis (CTA) of the deployed Network Operations Support Center (NOSC-D), and the… conducted Cognitive Task Analysis interviews with four (4) NOSC-D personnel. Because of the preliminary nature of the findings, the analysis is
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
ERIC Educational Resources Information Center
Jackson, James; Dixon, Mark R.
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection…
Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.
1995-05-01
A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.
Business and Marketing Cluster. Task Analyses.
ERIC Educational Resources Information Center
Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum and Resource Center.
Developed in Virginia, this publication contains task analysis guides to support selected tech prep programs that prepare students for careers in the business and marketing cluster. Guides are included for accounting systems, legal systems administration, office systems technology, and retail marketing. Each task analysis guide has the following…
Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie
2008-01-01
Background Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable management and analysis of spatial data, but have limitations in analyzing numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for CHA data analysis. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included task completion time, success in answering the tasks, and system satisfaction. Results Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (α = .01) from SPSS-GIS in satisfaction and time (p < .002). Descriptive results indicated that participants had greater success in answering the tasks when using SOVAT than when using SPSS-GIS. 
Conclusion Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than the combined use of SPSS and GIS. The results from this study indicate a potential for OLAP-GIS decision support systems as a valuable tool for CHA data analysis. PMID:18541037
Cognitive Task Analysis of the HALIFAX-Class Operations Room Officer: Data Sheets. Annexes
1999-03-10
[Cover-sheet residue.] Classification: UNCLASSIFIED (system number 510920); dates covered 00-00-1999. Annexes to: Cognitive Task Analysis of the HALIFAX-Class Operations Room Officer. Human Systems Incorporated, Guelph, Ontario.
Development of a task analysis tool to facilitate user interface design
NASA Technical Reports Server (NTRS)
Scholtz, Jean C.
1992-01-01
A good user interface facilitates the user in carrying out his or her task. Such interfaces are difficult and costly to produce. The most important aspect of producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of user interface requirements. This tool is intended to serve as a guide to the development of initial prototypes for user feedback.
NASA Technical Reports Server (NTRS)
1980-01-01
The results of the Large Space Systems Technology special emphasis task are presented. The task was an analysis of structural requirements deriving from the initial Phase A Operational Geostationary Platform study.
From scenarios to domain models: processes and representations
NASA Astrophysics Data System (ADS)
Haddock, Gail; Harbison, Karan
1994-03-01
The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.
Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.
Sugino, T; Kawahira, H; Nakamura, R
2014-09-01
Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. The analysis comprised division of the surgical procedure, task progress analysis, and task efficiency analysis. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate task efficiency during each stage. These analysis methods were evaluated experimentally with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters accounted for incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
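Motion metrics like the mean velocity and acceleration mentioned above can be derived from a navigation system's sampled instrument positions. The sketch below is a hypothetical illustration of that kind of computation, not the authors' implementation; `mean_speed` and `mean_accel_magnitude` are assumed names:

```python
import math

def mean_speed(positions, dt):
    """Mean tip speed from 3-D positions sampled every dt seconds --
    the kind of motion metric used to compare skilled and novice surgeons."""
    dists = [
        math.dist(a, b)  # Euclidean distance between consecutive samples
        for a, b in zip(positions, positions[1:])
    ]
    return sum(dists) / (dt * len(dists))

def mean_accel_magnitude(positions, dt):
    """Mean magnitude of the change in speed between consecutive steps;
    a rough proxy for how 'gently' the instrument is manipulated."""
    speeds = [math.dist(a, b) / dt for a, b in zip(positions, positions[1:])]
    diffs = [abs(v2 - v1) / dt for v1, v2 in zip(speeds, speeds[1:])]
    return sum(diffs) / len(diffs)
```

On such metrics, a smoother trajectory (smaller acceleration magnitude) would be consistent with the skilled-surgeon profile reported above.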
Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis
NASA Technical Reports Server (NTRS)
Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige
2005-01-01
We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to users. The objective was to evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a Mental (Internal) or Physical (External) operator. This analysis was performed by two analysts independently, and inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform a given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users, and show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, users would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on the more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large average number of steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks.
2010 Elsevier Ireland Ltd. All rights reserved.
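The KLM technique referenced above estimates execution time by summing standard per-operator times over the operator sequence for a task. A minimal sketch, using commonly cited operator values from the KLM literature (exact values vary by source) and hypothetical helper names:

```python
# Standard KLM operator times in seconds (Card, Moran & Newell tradition);
# exact values vary by source -- these are commonly cited ones.
KLM_TIMES = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
    "B": 0.10,  # mouse button press or release
}

def klm_estimate(sequence):
    """Estimated execution time for an operator sequence like 'MPBMK'."""
    return sum(KLM_TIMES[op] for op in sequence)

def mental_fraction(sequence):
    """Fraction of estimated time spent on mental (M) operators --
    the kind of figure the AHLTA analysis reports (11 of 22 minutes)."""
    return sequence.count("M") * KLM_TIMES["M"] / klm_estimate(sequence)
```

For example, a task modeled as "MPBK" would be estimated at roughly 2.8 s, with about half of it mental preparation.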
NASA Technical Reports Server (NTRS)
Kirlik, Alex; Kossack, Merrick Frank
1993-01-01
This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for display interface design or enhancement has run the risk of failing to improve user performance because the analysis results in only a sequential listing of user tasks. Adopting an ecological approach to task analysis, however, may yield the modeling of an unpredictable and variable task domain that is required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework designed for this purpose. The purpose of this research is to measure the framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare the new version of Star Cruiser to the original. By measuring user performance on different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance than that achieved with the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided since the analysis was performed on only this one particular system.
Reducing Response Time Bounds for DAG-Based Task Systems on Heterogeneous Multicore Platforms
2016-01-01
Presents response-time analysis for DAG-based real-time task systems implemented on heterogeneous multicore platforms, reducing response-time bounds for such systems. Builds on prior work including scheduling of synchronous parallel tasks on multicore platforms (25th ECRTS, 2013), U. Devi's PhD thesis on soft real-time scheduling on multiprocessors, and C. Liu and J. Anderson's work on supporting soft real-time DAG-based systems on multiprocessors.
Memory systems, processes, and tasks: taxonomic clarification via factor analysis.
Bruss, Peter J; Mitchell, David B
2009-01-01
The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.
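Exploratory factor analysis of task scores, as used above, can be approximated by eigendecomposition of the inter-task correlation matrix. The following is a principal-factor style sketch (no rotation, no uniqueness estimation; the function name is illustrative and this is not the authors' analysis pipeline):

```python
import numpy as np

def principal_factors(scores, n_factors):
    """Sketch of exploratory factor analysis: loadings from the top
    eigenvectors of the correlation matrix of task scores.
    scores: (n_subjects, n_tasks) array; returns (n_tasks, n_factors)."""
    corr = np.corrcoef(scores, rowvar=False)
    vals, vecs = np.linalg.eigh(corr)            # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_factors]   # take the largest
    return vecs[:, order] * np.sqrt(vals[order]) # scale to loadings
```

Tasks that load heavily on the same factor would be interpreted, as in the study, as tapping a common memory system or process; a "factor pure" task loads on one factor only.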
Product Support Manager Guidebook
2011-04-01
The support package is being developed using supportability analysis concepts such as Failure Mode, Effects, and Criticality Analysis (FMECA), Fault Tree Analysis (FTA), Level of Repair Analysis (LORA), Condition Based Maintenance Plus (CBM+), Maintenance Task Analysis (MTA), and Failure Reporting and Corrective Action System (FRACAS).
2006-06-01
Addresses a distinction within the CSE community concerning the differences between Cognitive Task Analysis (CTA) and Cognitive Work Analysis (CWA). Cited work includes Pirolli and Card (2005) on the sensemaking process and leverage points for analyst technology as identified through cognitive task analysis, and work characterizing cognitive task analysis as bootstrapping multiple converging techniques (in Schraagen, Chipman, and Shalin, Eds., 2000).
A reliability analysis tool for SpaceWire network
NASA Astrophysics Data System (ADS)
Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou
2017-04-01
SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power, and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of SpaceWire networks. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields the system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in that matrix. Using this method, we developed a reliability analysis tool for SpaceWire networks based on VC, which also implements the computation schemes for the reliability matrix and for multi-path-task reliability. With this tool, we analyzed several cases on typical architectures, and the analytic results indicate that redundant architectures have better reliability performance than basic ones. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool should have a direct influence on both task division and topology selection in the SpaceWire network system design phase.
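The series/parallel reliability arithmetic underlying such a task-based analysis can be sketched as follows. This is a generic illustration of independent-component reliability algebra, not the tool's actual computation scheme:

```python
def series(rels):
    """Reliability of components a task needs in series: all must work,
    so reliabilities multiply."""
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel(rels):
    """Reliability of redundant (parallel) components: the task fails
    only if every redundant copy fails."""
    q = 1.0
    for x in rels:
        q *= (1.0 - x)
    return 1.0 - q
```

For a task routed through two 0.99-reliability nodes and a dual-redundant link pair of 0.95-reliability links, `series([0.99, parallel([0.95, 0.95]), 0.99])` gives about 0.978, versus about 0.931 with a single link, which is the kind of comparison that favors the redundant architecture above.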
Thread concept for automatic task parallelization in image analysis
NASA Astrophysics Data System (ADS)
Lueckenhaus, Maximilian; Eckstein, Wolfgang
1998-09-01
Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. For this we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context but may follow different threads of execution and work on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs while taking the available hardware into account. Tests with our system prototype show that the thread concept combined with the agent paradigm is suitable for speeding up image processing by automatic parallelization of image analysis tasks.
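The data-parallel case such a thread concept automates, running the same analysis operator over independent pieces of an image, can be sketched with a thread pool. The tile operator below is a hypothetical stand-in, and this is an illustration of the pattern, not the paper's agent-based system:

```python
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    """Stand-in image-analysis operator: binary threshold at 128."""
    return [[1 if px > 128 else 0 for px in row] for row in tile]

def parallel_apply(tiles, workers=4):
    """Run the same operator on independent tiles in parallel; results
    come back in tile order, so the image can be reassembled directly."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_tile, tiles))
```

In CPython, threads pay off here mainly when the per-tile operator releases the GIL (as C-backed image libraries typically do); a process pool is the usual alternative for pure-Python operators.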
ERIC Educational Resources Information Center
Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.
Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…
DOT National Transportation Integrated Search
1998-05-01
This report, conducted by Parsons Bricknerhoff International, was funded by the U.S. Trade and Development Agency. The report examines the potential for developing electronic toll collection systems in Brazil. This is Volume II and it contains "Task ...
Case Studies in Job Analysis and Training Evaluation.
ERIC Educational Resources Information Center
McKillip, Jack
2001-01-01
An information technology certification program was evaluated by 1,671 systems engineers using job analysis that rated task importance. Professional librarians (n=527) rated importance of their tasks in similar fashion. Results of scatter diagrams provided evidence to enhance training effectiveness by focusing on job tasks significantly related to…
A Task Analysis of Selected Nuclear Technician Occupations.
ERIC Educational Resources Information Center
Braden, Paul V.; Paul, Krishan K.
A task analysis of nuclear technician occupations in selected organizations in the Southern Interstate Nuclear Board Region was conducted as part of a research and development project leading to a nuclear technician manpower information system for these 17 states. In order to answer 11 questions focusing on task performance frequency and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leonard, S.L.; Munjal, P.K.; Rattin, E.J.
1976-06-01
The main emphasis of the activity during the second quarter of this project continued to be on Task 1, Analysis of Near-Term Missions, and on Task 2, Analysis of Major Mid-Term Missions. In addition, considerable progress was also made on Task 6, Comparison of the True Societal Costs of Conventional and Photovoltaic Power Production, and starts were made on Task 3, Review and Updating of the ERDA Technology Implementation Plan, and Task 4, Critical External Issues. As was planned, work on Task 5, Impact of Incentives, was deferred to the second half of the program. Progress is reported. (WHK)
Usability assessment of an electronic health record in a comprehensive dental clinic.
Suebnukarn, Siriwan; Rittipakorn, Pawornwan; Thongyoi, Budsara; Boonpitak, Kwanwong; Wongsapai, Mansuang; Pakdeesan, Panu
2013-12-01
In this paper we present the development and usability assessment of an electronic health record (EHR) system in a comprehensive dental clinic. The graphic user interface of the system was designed with the concept of cognitive ergonomics in mind. Cognitive task analysis was used to evaluate the user interface of the EHR by identifying all sub-tasks, classifying them as mental or physical operators, and predicting the execution time required to perform a given task. We randomly selected 30 cases that had oral examinations for routine clinical care in a comprehensive dental clinic. The results were based on the analysis of 4 prototypical tasks performed by ten EHR users. The results showed that on average a user needed to go through 27 steps to complete all tasks for one case. To perform all 4 tasks of the 30 cases, users spent about 91 min (independent of system response time) on data entry, of which 51.8 min were spent on the more effortful mental operators. In conclusion, the user interface can be improved by reducing the percentage of mental effort required for the tasks.
Task-based modeling and optimization of a cone-beam CT scanner for musculoskeletal imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prakash, P.; Zbijewski, W.; Gang, G. J.
2011-10-15
Purpose: This work applies a cascaded systems model for cone-beam CT imaging performance to the design and optimization of a system for musculoskeletal extremity imaging. The model provides a quantitative guide to the selection of system geometry, source and detector components, acquisition techniques, and reconstruction parameters. Methods: The model is based on cascaded systems analysis of the 3D noise-power spectrum (NPS) and noise-equivalent quanta (NEQ) combined with factors of system geometry (magnification, focal spot size, and scatter-to-primary ratio) and anatomical background clutter. The model was extended to task-based analysis of detectability index (d') for tasks ranging in contrast and frequency content, and d' was computed as a function of system magnification, detector pixel size, focal spot size, kVp, dose, electronic noise, voxel size, and reconstruction filter to examine trade-offs and optima among such factors in multivariate analysis. The model was tested quantitatively versus the measured NPS and qualitatively in cadaver images as a function of kVp, dose, pixel size, and reconstruction filter under conditions corresponding to the proposed scanner. Results: The analysis quantified trade-offs among factors of spatial resolution, noise, and dose. System magnification (M) was a critical design parameter with strong effect on spatial resolution, dose, and x-ray scatter, and a fairly robust optimum was identified at M ≈ 1.3 for the imaging tasks considered. The results suggested kVp selection in the range of ≈65-90 kVp, the lower end (65 kVp) maximizing subject contrast and the upper end maximizing NEQ (90 kVp). The analysis quantified fairly intuitive results, e.g., ≈0.1-0.2 mm pixel size (and a sharp reconstruction filter) optimal for high-frequency tasks (bone detail) compared to ≈0.4 mm pixel size (and a smooth reconstruction filter) for low-frequency (soft-tissue) tasks.
This result suggests a specific protocol for 1 x 1 (full-resolution) projection data acquisition followed by full-resolution reconstruction with a sharp filter for high-frequency tasks, along with 2 x 2 binning reconstruction with a smooth filter for low-frequency tasks. The analysis guided selection of specific source and detector components implemented on the proposed scanner. The analysis also quantified the potential benefits and points of diminishing return in focal spot size, reduced electronic noise, finer detector pixels, and low-dose limits of detectability. Theoretical results agreed quantitatively with the measured NPS and qualitatively with evaluation of cadaver images by a musculoskeletal radiologist. Conclusions: A fairly comprehensive model for 3D imaging performance in cone-beam CT combines factors of quantum noise, system geometry, anatomical background, and imaging task. The analysis provided a valuable, quantitative guide to design, optimization, and technique selection for a musculoskeletal extremities imaging system under development.
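The detectability index at the heart of such task-based analysis is commonly written, for the prewhitening observer, as an integral of the NEQ weighted by the task function. The notation below is assumed from the cascaded-systems literature, not quoted from this paper:

```latex
{d'}^{2} \;=\; \int \mathrm{NEQ}(\mathbf{f})\,\bigl|W_{\mathrm{Task}}(\mathbf{f})\bigr|^{2}\, d\mathbf{f}
```

Here \(\mathbf{f}\) is spatial frequency and \(W_{\mathrm{Task}}\) encodes the contrast and frequency content of the imaging task, which is why sharp-filter, fine-pixel settings favor high-frequency (bone detail) tasks while smooth-filter, binned settings favor low-frequency (soft-tissue) tasks.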
Task-Based Information Searching.
ERIC Educational Resources Information Center
Vakkari, Pertti
2003-01-01
Reviews studies on the relationship between task performance and information searching by end-users, focusing on information searching in electronic environments and information retrieval systems. Topics include task analysis; task characteristics; search goals; modeling information searching; modeling search goals; information seeking behavior;…
Overview of the ID, EPI and REL tasks of BioNLP Shared Task 2011.
Pyysalo, Sampo; Ohta, Tomoko; Rak, Rafal; Sullivan, Dan; Mao, Chunhong; Wang, Chunxia; Sobral, Bruno; Tsujii, Jun'ichi; Ananiadou, Sophia
2012-06-26
We present the preparation, resources, results and analysis of three tasks of the BioNLP Shared Task 2011: the main tasks on Infectious Diseases (ID) and Epigenetics and Post-translational Modifications (EPI), and the supporting task on Entity Relations (REL). The two main tasks represent extensions of the event extraction model introduced in the BioNLP Shared Task 2009 (ST'09) to two new areas of biomedical scientific literature, each motivated by the needs of specific biocuration tasks. The ID task concerns the molecular mechanisms of infection, virulence and resistance, focusing in particular on the functions of a class of signaling systems that are ubiquitous in bacteria. The EPI task is dedicated to the extraction of statements regarding chemical modifications of DNA and proteins, with particular emphasis on changes relating to the epigenetic control of gene expression. By contrast to these two application-oriented main tasks, the REL task seeks to support extraction in general by separating challenges relating to part-of relations into a subproblem that can be addressed by independent systems. Seven groups participated in each of the two main tasks and four groups in the supporting task. The participating systems indicated advances in the capability of event extraction methods and demonstrated generalization in many aspects: from abstracts to full texts, from previously considered subdomains to new ones, and from the ST'09 extraction targets to other entities and events. The highest performance achieved in the supporting task REL, 58% F-score, is broadly comparable with levels reported for other relation extraction tasks. For the ID task, the highest-performing system achieved 56% F-score, comparable to the state-of-the-art performance at the established ST'09 task. 
In the EPI task, the best result was 53% F-score for the full set of extraction targets and 69% F-score for a reduced set of core extraction targets, approaching a level of performance sufficient for user-facing applications. In this study, we extend on previously reported results and perform further analyses of the outputs of the participating systems. We place specific emphasis on aspects of system performance relating to real-world applicability, considering alternate evaluation metrics and performing additional manual analysis of system outputs. We further demonstrate that the strengths of extraction systems can be combined to improve on the performance achieved by any system in isolation. The manually annotated corpora, supporting resources, and evaluation tools for all tasks are available from http://www.bionlp-st.org and the tasks continue as open challenges for all interested parties.
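The F-scores quoted above are the harmonic mean of precision and recall. A minimal sketch in terms of true positives, false positives, and false negatives (a generic definition, not the shared task's evaluation tooling):

```python
def f_score(tp, fp, fn):
    """Balanced F1 score: harmonic mean of precision and recall."""
    precision = tp / (tp + fp)  # fraction of predicted events that are correct
    recall = tp / (tp + fn)     # fraction of gold events that are found
    return 2 * precision * recall / (precision + recall)
```

For example, a system that finds half of the gold events with half of its predictions correct scores F1 = 0.5; the task overviews report these figures as percentages (e.g., 58%).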
The approach to engineering tasks composition on knowledge portals
NASA Astrophysics Data System (ADS)
Novogrudska, Rina; Globa, Larysa; Schill, Alexsander; Romaniuk, Ryszard; Wójcik, Waldemar; Karnakova, Gaini; Kalizhanova, Aliya
2017-08-01
The paper presents an approach to engineering task composition on engineering knowledge portals. The specific features of engineering tasks are highlighted, and their analysis forms the basis for the integration of partial engineering tasks. A formal algebraic system for engineering task composition is proposed, which allows context-independent formal structures to be defined for the elements of engineering tasks. A method of engineering task composition is developed that integrates partial calculation tasks into general calculation tasks on engineering portals, executed on user demand. The real-world scenario "Calculation of the strength for the power components of magnetic systems" is presented, demonstrating the applicability and efficiency of the proposed approach.
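The composition of partial calculation tasks into a general task can be sketched as function composition, where each partial task maps an input record to an output record. This is a generic illustration of the idea, not the paper's formal algebra:

```python
def compose(*tasks):
    """Compose partial calculation tasks into one general task: the
    output of each task feeds the input of the next, in order."""
    def general_task(data):
        for task in tasks:
            data = task(data)
        return data
    return general_task
```

For instance, `compose(load_geometry, compute_stress, check_limits)` (hypothetical task names) would behave as a single on-demand calculation, which mirrors the user-request-driven composition described above.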
SAINT: A combined simulation language for modeling man-machine systems
NASA Technical Reports Server (NTRS)
Seifert, D. J.
1979-01-01
SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
Jackson, James; Dixon, Mark R
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078
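The two collection modes the program offers, frequency versus interval recording, can be sketched in a few lines. This is shown in Python rather than the paper's Visual Basic, and the function names and the partial-interval variant are illustrative:

```python
def frequency_count(events):
    """Frequency recording: total occurrences of the target behavior
    (events is a list of occurrence timestamps in seconds)."""
    return len(events)

def partial_interval(events, session_len, interval):
    """Partial-interval recording: fraction of equal intervals in which
    the behavior occurred at least once."""
    n = session_len // interval
    hit_intervals = {min(int(t // interval), n - 1) for t in events}
    return len(hit_intervals) / n
```

A session summary would pair these with per-condition labels (attention, demand, alone, play) so the functional analysis data can be graphed by condition.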
Part-task vs. whole-task training on a supervisory control task
NASA Technical Reports Server (NTRS)
Battiste, Vernol
1987-01-01
The efficacy of part-task training for the psychomotor portion of a supervisory control simulation was compared with that of whole-task training, using six subjects in each group who were asked to perform the task as quickly as possible. The part-task group trained with the cursor-control device before transitioning to the whole task. Analysis of both the training and experimental trials demonstrated a significant performance advantage for the part-task group: tasks were performed better and at higher speed. Although the subjects ultimately reached the same level of performance in terms of score, the part-task method was preferable on economic grounds, since simple pretraining systems are significantly less expensive than whole-task training systems.
Tsujii, Takeo; Watanabe, Shigeru
2009-09-01
Recent dual-process reasoning theories have explained the belief-bias effect, the tendency for human reasoning to be erroneously biased when logical conclusions are incongruent with beliefs about the world, by proposing a belief-based automatic heuristic system and logic-based demanding analytic system. Although these claims are supported by the behavioral finding that high-load secondary tasks enhance the belief-bias effect, the neural correlates of dual-task reasoning remain unknown. The present study therefore examined the relationship between dual-task effect and activity in the inferior frontal cortex (IFC) during belief-bias reasoning by near-infrared spectroscopy (NIRS). Forty-eight subjects participated in this study (MA=23.46 years). They were required to perform congruent and incongruent reasoning trials while responding to high- and low-load secondary tasks. Behavioral analysis showed that the high-load secondary task impaired only incongruent reasoning performance. NIRS analysis found that the high-load secondary task decreased right IFC activity during incongruent trials. Correlation analysis showed that subjects with enhanced right IFC activity could perform better in the incongruent reasoning trials, though subjects for whom right IFC activity was impaired by the secondary task could not maintain better reasoning performance. These findings suggest that the right IFC may be responsible for the dual-task effect in conflicting reasoning processes. When secondary tasks impair right IFC activity, subjects may rely on the automatic heuristic system, which results in belief-bias responses. We therefore offer the first demonstration of neural correlates of dual-task effect on IFC activity in belief-bias reasoning.
Team Training for Command and Control Systems: Status.
1982-04-01
in order to develop this set of C2 systems, including a project listing for the Electronic Systems Division (ESD) of the Air... do the job. Task analysis results in a detailed description of tasks and task steps, and associated environmental and equipment conditions and... simulation exercises match the projected threat either in terms of numbers or capabilities. Live exercises are even less satisfactory because...
TADS--A CFD-Based Turbomachinery Analysis and Design System with GUI: User's Manual. 2.0
NASA Technical Reports Server (NTRS)
Koiro, M. J.; Myers, R. A.; Delaney, R. A.
1999-01-01
The primary objective of this study was the development of a Computational Fluid Dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a Graphical User Interface (GUI). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is intended to serve as a User's Manual for the computer programs which comprise the TADS system, developed under Task 18 of NASA Contract NAS3-27350, ADPAC System Coupling to Blade Analysis & Design System GUI, and Task 10 of NASA Contract NAS3-27394, ADPAC System Coupling to Blade Analysis & Design System GUI, Phase II-Loss, Design, and Multi-stage Analysis. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis and design capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low speed turbine blade and a transonic turbine vane.
Using link analysis to explore the impact of the physical environment on pharmacist tasks.
Lester, Corey A; Chui, Michelle A
2016-01-01
National community pharmacy organizations have been redesigning pharmacies to better facilitate direct patient care. However, evidence suggests that changing the physical layout of a pharmacy prior to understanding how the environment impacts pharmacists' work may not achieve the desired benefits. This study describes an objective method for understanding how the physical layout of the pharmacy may affect how pharmacists perform tasks. Link analysis is a systems engineering method used to describe the influence of the physical environment on task completion. This study used a secondary data set of field notes collected from 9 h of direct observation in one mass-merchandise community pharmacy in Wisconsin, U.S. A node is an individual location in the environment. A link is the movement between two nodes. Tasks were inventoried and task themes identified. The mean, minimum, and maximum number of links needed to complete each task were then determined and used to construct a link table. A link diagram is a graphical display showing the links in conjunction with the physical layout of the pharmacy. A total of 92 unique tasks were identified, resulting in 221 links. Tasks were sorted into five themes: patient care activities, insurance issues, verifying prescriptions, filling prescriptions, and other. Insurance issues required the greatest number of links, with a mean of 4.75. Verifying prescriptions and performing patient care were the most commonly performed tasks, with 36 and 30 unique task occurrences, respectively. Link analysis provides an objective method for identifying how a pharmacist interacts with the physical environment to complete tasks. This method provides designers with useful information to target interventions to improve the effectiveness of pharmacist work. Analysis beyond link analysis should be considered for large-scale system redesign.
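The link-table computation the abstract describes is straightforward to sketch: each observed task is a sequence of nodes (physical locations), a link is a movement between two consecutive nodes, and the table summarizes links per task theme. The node names and observations below are hypothetical, not the study's data.

```python
# Build a link table from observed node sequences: links per observed
# task = number of movements between consecutive locations, summarized
# as (mean, min, max) per task theme.

def link_table(observations):
    """observations: {theme: [node_sequence, ...]} ->
    {theme: (mean_links, min_links, max_links)}."""
    table = {}
    for theme, sequences in observations.items():
        counts = [len(seq) - 1 for seq in sequences]  # links per observed task
        table[theme] = (sum(counts) / len(counts), min(counts), max(counts))
    return table

obs = {
    "insurance issue": [
        ["drop-off", "computer", "phone", "computer", "drop-off"],
        ["computer", "phone", "computer"],
    ],
    "verify prescription": [["fill station", "computer"]],
}
print(link_table(obs))
# {'insurance issue': (3.0, 2, 4), 'verify prescription': (1.0, 1, 1)}
```

Overlaying these counts on a floor plan gives the link diagram; themes with high mean links (here, insurance issues) point to layouts worth redesigning.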
ATDRS payload technology R & D
NASA Technical Reports Server (NTRS)
Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.
1990-01-01
Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple-Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.
Earth resources data analysis program, phase 3
NASA Technical Reports Server (NTRS)
1975-01-01
Tasks were performed in two areas: (1) systems analysis and (2) algorithmic development. The major effort in the systems analysis task was the development of a recommended approach to the monitoring of resource utilization data for the Large Area Crop Inventory Experiment (LACIE). Other efforts included participation in various studies concerning the LACIE Project Plan, the utility of the GE Image 100, and the specifications for a special purpose processor to be used in the LACIE. In the second task, the major effort was the development of improved algorithms for estimating proportions of unclassified remotely sensed data. Also, work was performed on optimal feature extraction and optimal feature extraction for proportion estimation.
2010-09-01
application of existing assessment tools that may be applicable to Marine Air Ground Task Force (MAGTF) Command, Control, Communications and Computers (C4)... assessment tools and analysis concepts that may be extended to the Marine Corps' C4 System of Systems assessment methodology as a means to obtain a...
Overlap in the functional neural systems involved in semantic and episodic memory retrieval.
Rajah, M N; McIntosh, A R
2005-03-01
Neuroimaging and neuropsychological data suggest that episodic and semantic memory may be mediated by distinct neural systems. However, an alternative perspective is that episodic and semantic memory represent different modes of processing within a single declarative memory system. To examine whether the multiple or the unitary system view better represents the data, we conducted a network analysis using multivariate partial least squares (PLS) activation analysis followed by covariance structural equation modeling (SEM) of positron emission tomography data obtained while healthy adults performed episodic and semantic verbal retrieval tasks. It is argued that if performance of episodic and semantic retrieval tasks is mediated by different memory systems, then there should be differences in both regional activations and interregional correlations related to each type of retrieval task. The PLS results identified brain regions that were differentially active during episodic retrieval versus semantic retrieval. Regions that showed maximal differences in regional activity between episodic retrieval tasks were used to construct separate functional models for episodic and semantic retrieval. Omnibus tests of these functional models failed to find a significant difference across tasks for both functional models. The pattern of path coefficients for the episodic retrieval model was not different across tasks, nor were the path coefficients for the semantic retrieval model. The SEM results suggest that the same memory network/system was engaged across tasks, given the similarities in path coefficients. Therefore, activation differences between episodic and semantic retrieval may reflect variation along a continuum of processing during task performance within the context of a single memory system.
NASA Technical Reports Server (NTRS)
Ballard, Richard O.
2007-01-01
In 2005-06, the Prometheus program funded a number of tasks at the NASA-Marshall Space Flight Center (MSFC) to support development of a Nuclear Thermal Propulsion (NTP) system for future manned exploration missions. These tasks include the following: 1. NTP Design Develop Test & Evaluate (DDT&E) Planning 2. NTP Mission & Systems Analysis / Stage Concepts & Engine Requirements 3. NTP Engine System Trade Space Analysis and Studies 4. NTP Engine Ground Test Facility Assessment 5. Non-Nuclear Environmental Simulator (NTREES) 6. Non-Nuclear Materials Fabrication & Evaluation 7. Multi-Physics TCA Modeling. This presentation is an overview of these tasks and their accomplishments.
Comparing capacity coefficient and dual task assessment of visual multitasking workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie M.
Capacity coefficient analysis could offer a theoretically grounded alternative approach to subjective measures and dual task assessment of cognitive workload. Workload capacity or workload efficiency is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system in a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision-making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in the capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-attribute Task Battery.
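The log-transform machinery the abstract mentions can be sketched directly: the integrated hazard is H(t) = -log S(t), with S(t) the empirical survivor function of the response times, and a standard form of the capacity coefficient compares the dual-task hazard to the summed single-task hazards, C(t) = H_dual(t) / (H_A(t) + H_B(t)). The RT samples below are hypothetical.

```python
# Empirical integrated hazard and capacity coefficient from RT samples.
import math

def integrated_hazard(rts, t):
    """H(t) = -log S(t), with S(t) the empirical survivor function."""
    surviving = sum(1 for rt in rts if rt > t) / len(rts)
    if surviving == 0:
        return float("inf")
    return -math.log(surviving)

def capacity_coefficient(rts_dual, rts_a, rts_b, t):
    denom = integrated_hazard(rts_a, t) + integrated_hazard(rts_b, t)
    if denom == 0:
        return float("nan")
    return integrated_hazard(rts_dual, t) / denom

rts_a    = [420, 450, 480, 510, 540]   # single-task A RTs, ms (hypothetical)
rts_b    = [430, 460, 500, 530, 560]   # single-task B RTs, ms
rts_dual = [440, 470, 500, 550, 600]   # dual-task RTs, ms
print(capacity_coefficient(rts_dual, rts_a, rts_b, 500))  # 0.5
```

C(t) = 1 is the unlimited-capacity baseline; values below 1, as in this toy sample, indicate that adding the second task reduced processing efficiency.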
Pilot Task Profiles, Human Factors, And Image Realism
NASA Astrophysics Data System (ADS)
McCormick, Dennis
1982-06-01
Computer Image Generation (CIG) visual systems provide real time scenes for state-of-the-art flight training simulators. The visual system requires a greater understanding of training tasks, human factors, and the concept of image realism to produce an effective and efficient training scene than is required by other types of visual systems. Image realism must be defined in terms of pilot visual information requirements. Human factors analysis of training and perception is necessary to determine the pilot's information requirements. System analysis then determines how the CIG and display device can best provide essential information to the pilot. This analysis procedure ensures optimum training effectiveness and system performance.
ERIC Educational Resources Information Center
Salonen, Pekka; Lepola, Janne; Vauras, Marja
2007-01-01
In this exploratory study we conceptualized and explored socio-cognitive, emotional and motivational regulatory processes displayed in scaffolding interaction between parents and their non-task and task-oriented children. Based on the dynamic systems view and findings from developmental research, we assumed that parents with non-task oriented and…
Task-level control for autonomous robots
NASA Technical Reports Server (NTRS)
Simmons, Reid
1994-01-01
Task-level control refers to the integration and coordination of planning, perception, and real-time control to achieve given high-level goals. Autonomous mobile robots need task-level control to effectively achieve complex tasks in uncertain, dynamic environments. This paper describes the Task Control Architecture (TCA), an implemented system that provides commonly needed constructs for task-level control. Facilities provided by TCA include distributed communication, task decomposition and sequencing, resource management, monitoring and exception handling. TCA supports a design methodology in which robot systems are developed incrementally, starting first with deliberative plans that work in nominal situations, and then layering them with reactive behaviors that monitor plan execution and handle exceptions. To further support this approach, design and analysis tools are under development to provide ways of graphically viewing the system and validating its behavior.
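The layering idea TCA supports, a nominal deliberative plan wrapped by reactive monitoring that handles exceptions, can be sketched as follows. The task names, world model, and recovery behavior are hypothetical illustrations, not TCA's actual API.

```python
# Sketch of deliberative-plan execution layered with reactive
# exception handling: a monitor catches task failures and invokes a
# recovery behavior, then the plan continues.

class TaskFailure(Exception):
    pass

def execute(step, world):
    if world.get(step) == "blocked":
        raise TaskFailure(step)
    return f"done:{step}"

def run_plan(plan, world, recover):
    trace = []
    for step in plan:
        try:
            trace.append(execute(step, world))
        except TaskFailure as exc:        # reactive layer takes over
            trace.append(recover(str(exc), world))
    return trace

def clear_and_retry(step, world):
    world[step] = "clear"                 # hypothetical recovery behavior
    return execute(step, world)

world = {"traverse-corridor": "blocked"}
plan = ["leave-dock", "traverse-corridor", "reach-goal"]
print(run_plan(plan, world, clear_and_retry))
# ['done:leave-dock', 'done:traverse-corridor', 'done:reach-goal']
```

The point of the layering is incremental development: the nominal plan is written first, and recovery behaviors are added later without restructuring it.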
Ozone measurement systems improvements studies
NASA Technical Reports Server (NTRS)
Thomas, R. W.; Guard, K.; Holland, A. C.; Spurling, J. F.
1974-01-01
Results are summarized of an initial study of techniques for measuring atmospheric ozone, carried out as the first phase of a program to improve ozone measurement techniques. The study concentrated on two measurement systems, the electrochemical cell (ECC) ozonesonde and the Dobson ozone spectrophotometer, and consisted of two tasks. The first task consisted of error modeling and system error analysis of the two measurement systems. Under the second task a Monte-Carlo model of the Dobson ozone measurement technique was developed and programmed for computer operation.
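The Monte-Carlo error-analysis idea in the second task can be sketched generically: propagate random instrument errors through a measurement equation many times and examine the spread of the result. The measurement model and error magnitudes below are hypothetical stand-ins, not the Dobson instrument's actual equations.

```python
# Monte-Carlo error propagation sketch: repeat a simulated measurement
# with random multiplicative (gain) and additive (offset) errors, then
# summarize the distribution of results.
import random
import statistics

random.seed(42)  # reproducible draws

def measure_once(true_value, gain_sigma, offset_sigma):
    """One simulated measurement with gain and offset error."""
    gain = random.gauss(1.0, gain_sigma)
    offset = random.gauss(0.0, offset_sigma)
    return true_value * gain + offset

samples = [measure_once(300.0, 0.02, 2.0) for _ in range(10_000)]
print(round(statistics.mean(samples), 1))   # close to the true value, 300
print(round(statistics.stdev(samples), 1))  # combined measurement uncertainty
```

The sample standard deviation approaches the analytic combination sqrt((300 * 0.02)^2 + 2^2) ≈ 6.3, which is exactly the cross-check a Monte-Carlo model provides against the first task's closed-form error analysis.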
Rail Inspection Systems Analysis and Technology Survey
DOT National Transportation Integrated Search
1977-09-01
The study was undertaken to identify existing rail inspection system capabilities and methods which might be used to improve these capabilities. Task I was a study to quantify existing inspection parameters and Task II was a cost effectiveness study ...
Structured analysis and modeling of complex systems
NASA Technical Reports Server (NTRS)
Strome, David R.; Dalrymple, Mathieu A.
1992-01-01
The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.
Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid
NASA Astrophysics Data System (ADS)
Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration
2014-06-01
The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.
Atmospheric lidar multi-user instrument system definition study
NASA Technical Reports Server (NTRS)
Greco, R. V. (Editor)
1980-01-01
A spaceborne lidar system for atmospheric studies was defined. The primary input was the Science Objectives Experiment Description and Evolutionary Flow Document (SEED). The first task of the study was to perform an experiment evolutionary analysis of the SEED. The second task was the system definition effort for the instrument system. The third task was the generation of a program plan for the hardware phase. The fourth task comprised supporting studies, which included a Shuttle deficiency analysis, a preliminary safety hazard analysis, the identification of long lead items, and required development studies. As a result of the study an evolutionary Lidar Multi-User Instrument System (MUIS) was defined. The MUIS occupies a full Spacelab pallet and has a weight of 1300 kg. The Lidar MUIS laser provides a 2 joule frequency-doubled Nd:YAG laser that can also pump a tunable dye laser over a wide frequency range and bandwidth. The MUIS includes a 1.25 meter diameter aperture Cassegrain receiver, with a moveable secondary mirror to provide precise alignment with the laser. The receiver can transmit the return signal to three single and multiple photomultiplier tube detectors by use of a rotating fold mirror. It is concluded that the Lidar MUIS should proceed to program implementation.
Task Analysis Assessment on Intrastate Bus Traffic Controllers
NASA Astrophysics Data System (ADS)
Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad
2016-11-01
Public transportation supports social mobility and caters to the daily needs of passengers travelling from one place to another. This is true for a country like Malaysia, where international trade has been growing significantly over the past few decades. Task analysis assessment was conducted with consideration of a cognitive ergonomics view of problems related to human factors. Conducting research on the task analysis of bus traffic controllers allowed a better understanding of the nature of their work and the overall monitoring activities of the bus services. This paper studies task analysis assessment of intrastate bus traffic controllers, developed via Hierarchical Task Analysis (HTA). There are a total of five subsidiary tasks on level one, of which only two could be further broken down at level two. Development of the HTA allowed a better understanding of the work and could further ease the evaluation of the tasks conducted by the bus traffic controllers. Thus, human error could be reduced for the safety of all passengers, and the overall efficiency of the system increased. Besides, it could assist in improving the operation of the bus traffic controllers by modelling or synthesizing the existing tasks if necessary.
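An HTA of the shape described, five level-one subsidiary tasks with two decomposing further, is naturally represented as nested data. All task names below are hypothetical placeholders, since the abstract does not list the actual tasks.

```python
# Two-level Hierarchical Task Analysis as nested data: a top-level
# goal, level-one subsidiary tasks, and optional level-two subtasks.

hta = {
    "0. Monitor intrastate bus operations": [
        ("1. Log on to monitoring console", []),
        ("2. Track bus positions", [
            "2.1 Check schedule adherence",
            "2.2 Flag late departures",
        ]),
        ("3. Communicate with drivers", [
            "3.1 Acknowledge driver reports",
            "3.2 Issue rerouting instructions",
        ]),
        ("4. Record incidents", []),
        ("5. Hand over at shift end", []),
    ]
}

def print_hta(tree):
    """Print the hierarchy with indentation per level."""
    for goal, subtasks in tree.items():
        print(goal)
        for name, subsub in subtasks:
            print("  " + name)
            for s in subsub:
                print("    " + s)

print_hta(hta)
```

Keeping the analysis in a structured form like this makes it easy to count, renumber, or extend tasks when the controllers' work is remodelled.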
NASA Technical Reports Server (NTRS)
Searcy, Brittani
2017-01-01
Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance to human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine if a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame by frame basis, while virtual reality gives the actor (person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.
NASA Technical Reports Server (NTRS)
Gopher, D.; Wickens, C. D.
1975-01-01
A one dimensional compensatory tracking task and a digit processing reaction time task were combined in a three phase experiment designed to investigate tracking performance in time sharing. Adaptive techniques, elaborate feedback devices, and on line standardization procedures were used to adjust task difficulty to the ability of each individual subject and manipulate time sharing demands. Feedback control analysis techniques were employed in the description of tracking performance. The experimental results show that when the dynamics of a system are constrained in such a manner that man-machine system stability is no longer a major concern of the operator, the operator tends to adopt a first order control describing function, even with tracking systems of higher order. Attention diversion to a concurrent task leads to an increase in remnant level, or nonlinear power. This decrease in linearity is reflected both in the output magnitude spectra of the subjects and in the linear fit of the amplitude ratio functions.
ERIC Educational Resources Information Center
Mochizuki, Naoko
2017-01-01
Needs analysis (NA) plays a significant role in developing tasks that create opportunities for natural language use in classrooms. Preemptive NA, however, does not necessarily predict the contingently emerging interpersonal and social variables which influence learners and teachers' behaviours. These unpredictable variables often lead to a gap…
Effects of System Timing Parameters on Operator Performance in a Personnel Records Task
1981-03-01
work sampling, embedded performance measures, and operator satisfaction ratings) are needed to provide a complete analysis of the effects of the four... HFL-81-1/NPRDC-81-1, March 1981. Robert C. Williges, Beverly H...
Mission control of multiple unmanned aerial vehicles: a workload analysis.
Dixon, Stephen R; Wickens, Christopher D; Chang, Dervon
2005-01-01
With unmanned aerial vehicles (UAVs), 36 licensed pilots flew both single-UAV and dual-UAV simulated military missions. Pilots were required to navigate each UAV through a series of mission legs in one of the following three conditions: a baseline condition, an auditory autoalert condition, and an autopilot condition. Pilots were responsible for (a) mission completion, (b) target search, and (c) systems monitoring. Results revealed that both the autoalert and the autopilot automation improved overall performance by reducing task interference and alleviating workload. The autoalert system benefited performance both in the automated task and mission completion task, whereas the autopilot system benefited performance in the automated task, the mission completion task, and the target search task. Practical implications for the study include the suggestion that reliable automation can help alleviate task interference and reduce workload, thereby allowing pilots to better handle concurrent tasks during single- and multiple-UAV flight control.
Conceptual Replaceability Analysis for Order and Standard Loan Tasks.
ERIC Educational Resources Information Center
California Univ., Santa Barbara. Library Systems Development Program.
Very preliminary systems concepts are presented for the Order and Standard Loan Subsystems. Each of the tasks defined for the current manual operations in (Library Systems Development) LSD 70-60 are evaluated against these concepts to determine how existing work will change when mechanized systems are installed. Then, utilizing this qualitative…
Development of task network models of human performance in microgravity
NASA Technical Reports Server (NTRS)
Diaz, Manuel F.; Adam, Susan
1992-01-01
This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.
Total systems design analysis of high performance structures
NASA Technical Reports Server (NTRS)
Verderaime, V.
1993-01-01
Designer-controlled parameters were identified at interdiscipline interfaces to optimize structural system performance and downstream development and operations for reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance discipline integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.
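The first-order reliability calculation the abstract contrasts with deterministic safety factors can be sketched for the simplest case: with normally distributed strength R and load S, the safety margin M = R - S gives a reliability index beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2) and failure probability P_f = Phi(-beta). The numbers below are hypothetical.

```python
# First-order reliability index and failure probability for normal
# strength R and load S (independent, hypothetical values).
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    """P_f = Phi(-beta), via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2))

beta = reliability_index(mu_r=60.0, sigma_r=4.0, mu_s=40.0, sigma_s=3.0)
print(round(beta, 2))                      # 4.0
print(f"{failure_probability(beta):.2e}")  # about 3.2e-05
```

A deterministic safety factor of 60/40 = 1.5 says nothing about scatter; the index beta folds the scatter of both strength and load into a single number, which is the benefit the proposed combined technique tries to retain.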
Automated power management and control
NASA Technical Reports Server (NTRS)
Dolce, James L.
1991-01-01
A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatches to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.
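The failure detection and diagnosis task assigned to expert systems above can be sketched in its simplest rule-based form: telemetry samples are matched against declarative rules, each encoding one failure signature. The rule names, telemetry fields, and limits below are hypothetical, not the Space Station Freedom design.

```python
# Minimal rule-based diagnosis sketch: each rule pairs a failure name
# with a predicate over a telemetry sample; diagnosis returns every
# rule the sample fires.

RULES = [
    ("bus undervoltage", lambda t: t["bus_voltage"] < 118.0),
    ("overcurrent",      lambda t: t["load_current"] > 25.0),
    ("battery overtemp", lambda t: t["battery_temp_c"] > 45.0),
]

def diagnose(telemetry):
    """Return the names of all rules the telemetry sample fires."""
    return [name for name, fires in RULES if fires(telemetry)]

sample = {"bus_voltage": 112.5, "load_current": 18.0, "battery_temp_c": 47.0}
print(diagnose(sample))  # ['bus undervoltage', 'battery overtemp']
```

Keeping the rules as data rather than code is what makes the expert-system approach attractive for on-line dispatch support: limits can be revised without restructuring the monitoring loop.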
Interfaces for End-User Information Seeking.
ERIC Educational Resources Information Center
Marchionini, Gary
1992-01-01
Discusses essential features of interfaces to support end-user information seeking. Highlights include cognitive engineering; task models and task analysis; the problem-solving nature of information seeking; examples of systems for end-users, including online public access catalogs (OPACs), hypertext, and help systems; and suggested research…
Evaluating office lighting environments: Second-level analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, B.L.; Fisher, W.S.; Gillette, G.L.
1989-04-01
Data from a post-occupancy evaluation (POE) of 912 work stations with lighting power density (LPD), photometric, and occupant-response measures were examined in a detailed, second-level analysis. Seven types of lighting systems were identified with different combinations of direct and indirect ambient lighting, task lighting, and daylight. The mean illuminances at the primary task location were within the IES target values for office tasks, with mean illuminances ranging from 32 to 75 fc depending on the lighting system. The median LPD was about 2.36 watts/sq ft, with about one-third of the work stations having LPDs at or below 2.0 watts/sq ft. Although a majority of the occupants (69%) were satisfied with their lighting, the highest percentage of those expressing dissatisfaction with lighting (37%) had an indirect fluorescent furniture-mounted (IFFM) system. The negative reaction of so many people to the IFFM system suggests that the combination of task lighting with an indirect ambient system had an important influence on lighting satisfaction, even though task illuminances tended to be higher with the IFFM system. Concepts of lighting quality, visual health, and control were explored, as well as average luminance, to explain the negative reactions to the combination of indirect lighting with furniture-mounted lighting.
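The two quantities this record compares can be computed directly. The sketch below is illustrative only (the workstation wattage and area are invented, not study data); it uses the units reported in the abstract: watts per square foot for LPD and footcandles for task illuminance.

```python
# Illustrative sketch, not from the study: computing lighting power density
# (LPD) and checking a task illuminance against the 32-75 fc range the
# abstract reports as the span of mean illuminances.

def lighting_power_density(total_lamp_watts: float, floor_area_sqft: float) -> float:
    """LPD = installed lighting wattage divided by floor area (W/sq ft)."""
    return total_lamp_watts / floor_area_sqft

def within_target(illuminance_fc: float, low: float = 32.0, high: float = 75.0) -> bool:
    """True if a measured task illuminance falls inside the target band."""
    return low <= illuminance_fc <= high

# A hypothetical 64 sq ft work station with 150 W of installed lighting:
lpd = lighting_power_density(150.0, 64.0)
print(round(lpd, 2))        # 2.34 W/sq ft, near the study's median of 2.36
print(within_target(50.0))  # True
```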
Militello, L G; Hutton, R J
1998-11-01
Cognitive task analysis (CTA) is a set of methods for identifying cognitive skills, or mental demands, needed to perform a task proficiently. The product of the task analysis can be used to inform the design of interfaces and training systems. However, CTA is resource intensive and has previously been of limited use to design practitioners. A streamlined method of CTA, Applied Cognitive Task Analysis (ACTA), is presented in this paper. ACTA consists of three interview methods that help the practitioner to extract information about the cognitive demands and skills required for a task. ACTA also allows the practitioner to represent this information in a format that will translate more directly into applied products, such as improved training scenarios or interface recommendations. This paper will describe the three methods, an evaluation study conducted to assess the usability and usefulness of the methods, and some directions for future research for making cognitive task analysis accessible to practitioners. ACTA techniques were found to be easy to use, flexible, and to provide clear output. The information and training materials developed based on ACTA interviews were found to be accurate and important for training purposes.
Effects-based strategy development through center of gravity and target system analysis
NASA Astrophysics Data System (ADS)
White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen
2003-09-01
This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.
Study on user interface of pathology picture archiving and communication system.
Kim, Dasueran; Kang, Peter; Yun, Jungmin; Park, Sung-Hye; Seo, Jeong-Wook; Park, Peom
2014-01-01
Improving the pathology workflow is necessary. A workflow task analysis was performed using a pathology picture archiving and communication system (pathology PACS) in order to propose a user interface for the Pathology PACS that considers user experience. An interface analysis of the Pathology PACS in Seoul National University Hospital and a task analysis of the pathology workflow were performed by observing recorded video. Based on the results obtained, a user interface for the Pathology PACS was proposed. Hierarchical task analysis of the Pathology PACS was classified into 17 tasks: 1) pre-operation, 2) text, 3) images, 4) medical record viewer, 5) screen transition, 6) pathology identification number input, 7) admission date input, 8) diagnosis doctor, 9) diagnosis code, 10) diagnosis, 11) pathology identification number check box, 12) presence or absence of images, 13) search, 14) clear, 15) Excel save, 16) search results, and 17) re-search. Frequently used menu items were identified and schematized. A user interface for the Pathology PACS considering user experience could be proposed as a preliminary step, and this study may contribute to the development of medical information systems based on user experience and usability.
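A hierarchical task analysis of this kind is just a tree of tasks plus usage counts over observed behavior. The sketch below is a minimal illustration with an invented sub-hierarchy and observation log; the paper does not publish its tree structure or counts.

```python
# Minimal sketch of a hierarchical task analysis (HTA): tasks nested under a
# root, plus a count of observed uses so frequently used items can be
# identified, as the study did for the Pathology PACS menus. The nesting and
# the log below are hypothetical.

from collections import Counter

hta = {
    "Pathology PACS": {
        "pre-operation": {}, "text": {}, "images": {},
        "search": {"pathology ID input": {}, "admission date input": {}},
    }
}

def leaf_tasks(node: dict) -> list[str]:
    """Flatten the hierarchy into the list of leaf-level tasks."""
    leaves = []
    for name, children in node.items():
        leaves.extend(leaf_tasks(children) if children else [name])
    return leaves

# Hypothetical observation log of which tasks users performed:
log = ["search", "images", "images", "text", "images", "search"]
usage = Counter(log)
print(leaf_tasks(hta["Pathology PACS"]))
print(usage.most_common(1))  # [('images', 3)]
```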
Usability Evaluation of an Unstructured Clinical Document Query Tool for Researchers.
Hultman, Gretchen; McEwan, Reed; Pakhomov, Serguei; Lindemann, Elizabeth; Skube, Steven; Melton, Genevieve B
2018-01-01
Natural Language Processing - Patient Information Extraction for Researchers (NLP-PIER) was developed to let clinical researchers run self-service natural language processing (NLP) queries over clinical notes. The aim of this study was to conduct a user-centered analysis with clinical researchers to gain insight into NLP-PIER's usability and an understanding of clinical researchers' needs when using an application for searching clinical notes. Clinical researcher participants (n=11) completed tasks using the system's two existing search interfaces and completed a set of surveys and an exit interview. Quantitative data, including time on task, task completion rate, and survey responses, were collected. Interviews were analyzed qualitatively. Survey scores, time on task, and task completion proportions varied widely. Qualitative analysis indicated that participants found the system to be useful and usable for specific projects. This study identified several usability challenges, and our findings will guide the improvement of NLP-PIER's interfaces.
Study of roles of remote manipulator systems and EVA for shuttle mission support, volume 1
NASA Technical Reports Server (NTRS)
Malone, T. B.; Micocci, A. J.
1974-01-01
Alternate extravehicular activity (EVA) and remote manipulator system (RMS) configurations were examined for their relative effectiveness in performing an array of representative shuttle and payload support tasks. Initially, a comprehensive analysis was performed of payload and shuttle support missions required to be conducted exterior to a pressurized enclosure. A set of task selection criteria was established, and study tasks were identified. The EVA and RMS modes were evaluated according to their applicability for each task and task condition. The results are summarized in tabular form, showing the modes chosen as most effective or as feasible for each task/condition. Conclusions concerning the requirements and recommendations for each mode are presented.
The safer clinical systems project in renal care.
Weale, Andy R
2013-09-01
Current systems in place in healthcare are designed to detect harm after it has happened (e.g. critical incident reports) and make recommendations based on an assessment of that event. Safer Clinical Systems, a Health Foundation funded project, is designed to proactively search for risk within systems, rather than reacting to harm. The aim of the Safer Clinical Systems project in Renal Care was to reduce the risks associated with shared care for patients who are undergoing surgery but are looked after peri-operatively by nephrology teams on nephrology wards. This report details our findings from the diagnostic phase of Safer Clinical Systems: the proactive search for risk. We evaluated the current system of care using a set of risk evaluation and process mapping tools: Failure Modes and Effects Analysis (FMEA) and Hierarchical Task Analysis (HTA). We engaged staff with the process mapping and risk assessment tools. We now understand our system and where the highest-risk tasks are undertaken during a renal in-patient stay in which a patient has an operation. These key tasks occur across the perioperative period and are not confined to one aspect of care. A measurement strategy and intervention plan have been designed around these tasks. Safer Clinical Systems has identified high-risk, low-reliability tasks in our system. We look forward to fully reporting these data in 2014. © 2013 European Dialysis and Transplant Nurses Association/European Renal Care Association.
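FMEA-style risk ranking of the kind this project applied is conventionally done with a risk priority number (RPN = severity x occurrence x detection). The sketch below illustrates that standard scheme only; the task names and scores are hypothetical, since the report does not publish its scoring.

```python
# Hedged sketch of FMEA risk ranking by risk priority number. Each task gets
# (severity, occurrence, detection) scores, typically on 1-10 scales;
# RPN = S * O * D. All names and numbers below are invented examples.

tasks = {
    "medication reconciliation on transfer":     (9, 6, 7),
    "perioperative fluid prescribing":           (8, 5, 4),
    "handover between surgical and renal teams": (7, 7, 6),
}

rpn = {name: s * o * d for name, (s, o, d) in tasks.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
for name, score in ranked:
    print(f"{score:4d}  {name}")
# The highest-RPN tasks become the candidates for intervention and measurement.
```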
Weir, Charlene R; Nebeker, Jonathan J R; Hicken, Bret L; Campo, Rebecca; Drews, Frank; Lebar, Beth
2007-01-01
Computerized Provider Order Entry (CPOE), with electronic documentation and computerized decision support, dramatically changes the information environment of the practicing clinician. Prior work patterns based on paper, verbal exchange, and manual methods are replaced with automated, computerized, and potentially less flexible systems. The objective of this study is to explore the information management strategies that clinicians use in the process of adapting to a CPOE system, using cognitive task analysis techniques. Observation and semi-structured interviews were conducted with 88 primary-care clinicians at 10 Veterans Administration Medical Centers. Interviews were taped, transcribed, and extensively analyzed to identify key information management goals, strategies, and tasks. Tasks were aggregated into groups, common components across tasks were clarified, and underlying goals and strategies identified. Nearly half of the identified tasks were not fully supported by the available technology. Six core components of tasks were identified. Four meta-cognitive information management goals emerged: 1) Relevance Screening; 2) Ensuring Accuracy; 3) Minimizing Memory Load; and 4) Negotiating Responsibility. Strategies used to support these goals are presented. Users develop a wide array of information management strategies that allow them to successfully adapt to new technology. Supporting the ability of users to develop adaptive strategies to support meta-cognitive goals is a key component of a successful system.
ERIC Educational Resources Information Center
Fisher, Harold S.; And Others
This is the second volume of a four-volume report of a research project designed to (1) identify job needs for agricultural occupations which will result from the Muskegon County Wastewater Management System and perform a task analysis on each occupation, (2) develop instructional modules and determine their place in either high school or 2-year…
DOT National Transportation Integrated Search
1974-08-01
Volume 2 contains the analysis and description of air traffic management activities at three levels of detail - functions, subfunctions, and tasks. A total of 265 tasks are identified and described, and the flow of information inputs and outputs amon...
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.
1994-01-01
The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. This user's manual describes how to use the ADPAC code as developed in Task 5, NAS3-25270, including the modifications made to date in Tasks 7 and 8, NAS3-25270.
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of Task 3 is to provide the additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating systems, software configuration management, and the software development environment facility.
NASA Technical Reports Server (NTRS)
Olsen, R.; Schaefer, O.; Hussey, J.
1992-01-01
Potential space missions of the nineties and the next century require that we look at the broad category of remote systems as an important means to achieve cost-effective operations, exploration, and colonization objectives. This paper addresses such missions, using remote systems technology as the basis for identifying the capabilities that must be provided. The relationship of space-based tasks to similar tasks required for terrestrial applications is discussed. The development status of the required technology is assessed, and major issues that must be addressed to meet future requirements are identified. These include the proper mix of humans and machines, from pure teleoperation to full autonomy; the degree of worksite compatibility for a robotic system; and the required design parameters, such as degrees of freedom. Methods for resolution are discussed, including analysis, graphical simulation, and the use of laboratory test beds. Grumman experience in the application of these techniques to a variety of design issues is presented, utilizing the Telerobotics Development Laboratory, which includes a 17-DOF robot system, a variety of sensing elements, Deneb/IRIS graphics workstations, and control stations. The use of task/worksite mockups, remote system development test beds, and graphical analysis is discussed, with examples of typical results such as estimates of task times, task feasibility, and resulting recommendations for design changes. The relationship of this experience and lessons learned to future development of remote systems is also discussed.
NASA Technical Reports Server (NTRS)
Mohl, C.
1978-01-01
Several tasks of JPL related to geothermal energy are discussed. The major task is the procurement and test and evaluation of a helical screw drive (wellhead unit). A general review of geothermal energy systems is given. The presentation focuses attention on geothermal reservoirs in California, with graphs and charts to support the discussion. Included are discussions on cost analysis, systems maintenance, and a comparison of geothermal and conventional heating and cooling systems.
Mirel, Barbara
2009-02-13
Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and show that the tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition during exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making the transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and reduce current mismatches with several of scientists' higher-order analytical practices. The implications of the results for tool designs are discussed.
Task Management in the New ATLAS Production System
NASA Astrophysics Data System (ADS)
De, K.; Golubkov, D.; Klimentov, A.; Potekhin, M.; Vaniachine, A.; Atlas Collaboration
2014-06-01
This document describes the design of the new Production System of the ATLAS experiment at the LHC [1]. The Production System is the top-level workflow manager which translates physicists' needs for production-level processing and analysis into actual workflows executed across over a hundred Grid sites used globally by ATLAS. As the production workload has increased in volume and complexity in recent years (the ATLAS production task count is above one million, with each task containing hundreds or thousands of jobs), there is a need to upgrade the Production System to meet the challenging requirements of the next LHC run while minimizing the operating costs. In the new design, the main subsystems are the Database Engine for Tasks (DEFT) and the Job Execution and Definition Interface (JEDI). Based on users' requests, DEFT manages inter-dependent groups of tasks (Meta-Tasks) and generates corresponding data processing workflows. The JEDI component then dynamically translates the task definitions from DEFT into actual workload jobs executed in the PanDA Workload Management System [2]. We present the requirements, design parameters, basics of the object model and concrete solutions utilized in building the new Production System and its components.
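The task-to-jobs translation this record describes can be sketched in miniature: a task defined over a dataset is split into jobs, each covering a slice of the input files. All APIs and field names below are invented for illustration; this is not ATLAS code.

```python
# Illustrative sketch of the DEFT-to-JEDI idea: one task definition is
# dynamically translated into concrete workload jobs, each assigned a slice
# of the task's input files. Names and structure are hypothetical.

from dataclasses import dataclass

@dataclass
class Task:
    task_id: int
    input_files: list[str]
    files_per_job: int

def define_jobs(task: Task) -> list[dict]:
    """Translate one task definition into a list of job descriptions."""
    step = task.files_per_job
    return [
        {"task_id": task.task_id, "job_index": i // step,
         "files": task.input_files[i:i + step]}
        for i in range(0, len(task.input_files), step)
    ]

t = Task(task_id=101,
         input_files=[f"data.{n:04d}.root" for n in range(10)],
         files_per_job=4)
jobs = define_jobs(t)
print(len(jobs))          # 3 jobs: 4 + 4 + 2 files
print(jobs[-1]["files"])  # ['data.0008.root', 'data.0009.root']
```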
Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan
2015-10-29
This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. It summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. The report covers Tasks 1, 3, and 4 of the project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model in Task 4, the analysis code was updated and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency; and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.
NASA Technical Reports Server (NTRS)
1985-01-01
Task 2 in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make design/programmatic decisions. This volume identifies the preferred options in the programmatic category and characterizes these options with respect to performance attributes, constraints, costs, and risks. The programmatic category includes methods used to administrate/manage the development, operation and maintenance of the SSDS. The specific areas discussed include standardization/commonality; systems management; and systems development, including hardware procurement, software development and system integration, test and verification.
TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0
NASA Technical Reports Server (NTRS)
Ortiz, C. J.
1994-01-01
The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels, or minimal steps, to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine.
The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.
A Preliminary Study on Gender Differences in Studying Systems Analysis and Design
ERIC Educational Resources Information Center
Lee, Fion S. L.; Wong, Kelvin C. K.
2017-01-01
Systems analysis and design is a crucial task in system development and is included in a typical information systems programme as a core course. This paper presented a preliminary study on gender differences in studying a systems analysis and design course of an undergraduate programme. Results indicated that male students outperformed female…
Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report
NASA Technical Reports Server (NTRS)
1989-01-01
The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on board the Freedom Space Station. This volume summarizes the further analysis performed as part of Task 2 (Perform Studies and Parametric Analysis) of the SCS study contract. These analyses were performed to resolve open issues remaining after the completion of Task 1 and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS Task 3 (Develop and Present SCS Requirements) and SCS Task 4 (Develop SCS Conceptual Designs). The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.
Human factors in the Naval Air Systems Command: Computer based training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seamster, T.L.; Snyder, C.E.; Terranova, M.
1988-01-01
Military standards applied to private sector contracts have a substantial effect on the quality of Computer Based Training (CBT) systems procured for the Naval Air Systems Command. This study evaluated standards regulating the following areas in CBT development and procurement: interactive training systems, cognitive task analysis, and CBT hardware. The objective was to develop high-level recommendations for evolving standards that will govern the next generation of CBT systems. One of the key recommendations is that the instructional systems development, human factors engineering, and software development standards be integrated. Recommendations were also made for task analysis and CBT hardware standards. (9 refs., 3 figs.)
Liquid rocket booster study. Volume 2, book 6, appendix 10: Vehicle systems effects
NASA Technical Reports Server (NTRS)
1989-01-01
Three tasks were undertaken by Eagle Engineering as part of the Liquid Rocket Booster (LRB) study. Task 1 required Eagle to supply current data on the Space Shuttle vehicle and systems affected by an LRB substitution; tables listing the data provided are presented. Task 2 was to evaluate and compare shuttle impacts of candidate LRB configurations in concert with the overall trades and analysis activity; three selected configurations are presented, with emphasis on flight loads, separation dynamics, and cost comparison. Task 3 required the development of design guidelines and requirements to minimize impacts to the Space Shuttle system from an LRB substitution; results are presented for progress to date.
F-16 Task Analysis Criterion-Referenced Objective and Objectives Hierarchy Report. Volume 4
1981-03-01
[Table extraction residue. The recoverable fragments describe criterion-referenced objectives such as Task 1.9.4, "Perform short field landing", each listing initiation cues (e.g., engine flameout; on short final), systems presenting cues (aircraft fuel, engine), and standards citing an authority (TACR 60-2) with performance precision (e.g., +/- 0.5 AOA; touchdown zone 150-1000 ft).]
Effects of VR system fidelity on analyzing isosurface visualization of volume datasets.
Laha, Bireswar; Bowman, Doug A; Socha, John J
2014-04-01
Volume visualization is an important technique for analyzing datasets from a variety of different scientific domains. Volume data analysis is inherently difficult because volumes are three-dimensional, dense, and unfamiliar, requiring scientists to precisely control the viewpoint and to make precise spatial judgments. Researchers have proposed that more immersive (higher fidelity) VR systems might improve task performance with volume datasets, and significant results tied to different components of display fidelity have been reported. However, more information is needed to generalize these results to different task types, domains, and rendering styles. We visualized isosurfaces extracted from synchrotron microscopic computed tomography (SR-μCT) scans of beetles, in a CAVE-like display. We ran a controlled experiment evaluating the effects of three components of system fidelity (field of regard, stereoscopy, and head tracking) on a variety of abstract task categories that are applicable to various scientific domains, and also compared our results with those from our prior experiment using 3D texture-based rendering. We report many significant findings. For example, for search and spatial judgment tasks with isosurface visualization, a stereoscopic display provides better performance, but for tasks with 3D texture-based rendering, displays with higher field of regard were more effective, independent of the levels of the other display components. We also found that systems with high field of regard and head tracking improve performance in spatial judgment tasks. Our results extend existing knowledge and produce new guidelines for designing VR systems to improve the effectiveness of volume data analysis.
NASA Technical Reports Server (NTRS)
Estes, Samantha; Parker, Nelson C. (Technical Monitor)
2001-01-01
Virtual reality and simulation applications are becoming widespread in human task analysis. These programs have many benefits for the Human Factors Engineering field. Not only do creating and using virtual environments for human engineering analyses save money and time, this approach also promotes user experimentation and provides increased quality of analyses. This paper explains the human engineering task analysis performed on the Environmental Control and Life Support System (ECLSS) space station rack and its Distillation Assembly (DA) subsystem using EAI's human modeling simulation software, Jack. When installed on the International Space Station (ISS), ECLSS will provide the life and environment support needed to adequately sustain crew life. The DA is an Orbital Replaceable Unit (ORU) that provides means of wastewater (primarily urine from flight crew and experimental animals) reclamation. Jack was used to create a model of the weightless environment of the ISS Node 3, where the ECLSS is housed. Computer aided drawings of the ECLSS rack and DA system were also brought into the environment. Anthropometric models of a 95th percentile male and 5th percentile female were used to examine the human interfaces encountered during various ECLSS and DA tasks. The results of the task analyses were used in suggesting modifications to hardware and crew task procedures to improve accessibility, conserve crew time, and add convenience for the crew. This paper will address some of those suggested modifications and the method of presenting final analyses for requirements verification.
Failure modes and effects analysis automation
NASA Technical Reports Server (NTRS)
Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron
1988-01-01
A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge-based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single-user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man-years the time required to perform the analysis.
Auvinet, Bernard; Touzard, Claude; Montestruc, François; Delafond, Arnaud; Goeb, Vincent
2017-01-31
Gait disorders and gait analysis under single- and dual-task conditions are topics of great interest, but very few studies have examined the relevance of gait analysis under dual-task conditions in elderly people on the basis of a clinical approach. An observational study including 103 patients (mean age 76.3 ± 7.2, women 56%) suffering from gait disorders or memory impairment was conducted. Gait analysis under dual-task conditions was carried out for all patients. Brain MRI was performed in the absence of contra-indications. Three main gait variables were measured: walking speed, stride frequency, and stride regularity. For each gait variable, the dual task cost was computed and a quartile analysis was obtained. Nonparametric tests were used for all comparisons (Wilcoxon, Kruskal-Wallis, Fisher, or Chi-squared tests). Four clinical subgroups were identified: gait instability (45%), recurrent falls (29%), memory impairment (18%), and cautious gait (8%). The biomechanical severity of these subgroups was ordered according to walking speed and stride regularity under both conditions, from least to most serious, as follows: memory impairment, gait instability, recurrent falls, cautious gait (p < 0.01 for walking speed, p = 0.05 for stride regularity). According to the established diagnoses of gait disorders, 5 main pathological subgroups were identified: musculoskeletal diseases (n = 11), vestibular diseases (n = 6), mild cognitive impairment (n = 24), central nervous system pathologies (n = 51), and without diagnosis (n = 8). The dual task costs for walking speed, stride frequency, and stride regularity differed among these subgroups (p < 0.01). The mild cognitive impairment and central nervous system pathologies subgroups together showed a higher dual task cost for each variable than the other subgroups combined (p = 0.01).
The quartile analysis of dual task cost for stride frequency and stride regularity allowed the identification of 3 motor phenotypes (p < 0.01), without any difference for white matter hyperintensities, but with an increased Scheltens score from the first to the third motor phenotype (p = 0.05). Gait analysis under dual-task conditions in elderly people suffering from gait disorders or memory impairment is of great value in assessing the severity of gait disorders, differentiating between peripheral pathologies and central nervous system pathologies, and identifying motor phenotypes. Correlations between motor phenotypes and brain imaging require further studies.
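The dual task cost used in this study is conventionally defined as the relative performance change from single- to dual-task walking. A minimal sketch of that computation (the abstract does not state the study's exact formula, so the sign convention and normalization below are assumptions):

```python
def dual_task_cost(single_task_value, dual_task_value):
    """Relative performance change from single- to dual-task walking (%).

    A positive cost means performance degraded under the dual task.
    Note: this is the conventional formula; the study's exact variant
    (sign convention, normalization) is not given in the abstract.
    """
    return 100.0 * (single_task_value - dual_task_value) / single_task_value

# Example: walking speed drops from 1.20 m/s to 0.96 m/s under dual task
cost = dual_task_cost(1.20, 0.96)  # -> 20.0 (% cost)
```

Computing this cost per patient for each gait variable, and then splitting the distribution into quartiles, would yield the quartile analysis the abstract describes.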
Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study
NASA Astrophysics Data System (ADS)
Saliceti, Jose A.
The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs and Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was administered to participants to build a descriptive and robust account. Analysis of responses revealed themes related to each research question. Findings revealed operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission areas. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized; thus, test plan preparation and reporting differ among participants. A standard method should be adopted for preparing and reporting on UAS technology tests.
Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for system requirements traceability matrix of mission tasks and subtasks while using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.
Weir, Charlene R.; Nebeker, Jonathan J.R.; Hicken, Bret L.; Campo, Rebecca; Drews, Frank; LeBar, Beth
2007-01-01
Objective: Computerized Provider Order Entry (CPOE) with electronic documentation and computerized decision support dramatically changes the information environment of the practicing clinician. Prior work patterns based on paper, verbal exchange, and manual methods are replaced with automated, computerized, and potentially less flexible systems. The objective of this study is to explore the information management strategies that clinicians use in the process of adapting to a CPOE system, using cognitive task analysis techniques. Design: Observation and semi-structured interviews were conducted with 88 primary-care clinicians at 10 Veterans Administration Medical Centers. Measurements: Interviews were taped, transcribed, and extensively analyzed to identify key information management goals, strategies, and tasks. Tasks were aggregated into groups, common components across tasks were clarified, and underlying goals and strategies identified. Results: Nearly half of the identified tasks were not fully supported by the available technology. Six core components of tasks were identified. Four meta-cognitive information management goals emerged: 1) Relevance Screening; 2) Ensuring Accuracy; 3) Minimizing Memory Load; and 4) Negotiating Responsibility. Strategies used to support these goals are presented. Conclusion: Users develop a wide array of information management strategies that allow them to successfully adapt to new technology. Supporting the ability of users to develop adaptive strategies to support meta-cognitive goals is a key component of a successful system. PMID:17068345
Interactive Scene Analysis Module - A sensor-database fusion system for telerobotic environments
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; Vazquez, Sixto L.; Goode, Plesent W.
1992-01-01
Accomplishing a task with telerobotics typically involves a combination of operator control/supervision and a 'script' of preprogrammed commands. These commands usually assume that the locations of various objects in the task space conform to some internal representation (database) of that task space. The ability to quickly and accurately verify the task environment against the internal database would improve the robustness of these preprogrammed commands. In addition, the on-line initialization and maintenance of a task-space database is difficult for operators using Cartesian coordinates alone. This paper describes the Interactive Scene Analysis Module (ISAM), developed to provide task-space database initialization and verification utilizing 3-D graphic overlay modelling, video imaging, and laser-radar-based range imaging. Through the fusion of task-space database information and image sensor data, a verifiable task-space model is generated, providing location and orientation data for objects in a task space. This paper also describes applications of the ISAM in the Intelligent Systems Research Laboratory (ISRL) at NASA Langley Research Center, and discusses its performance relative to representation accuracy and operator interface efficiency.
A System Approach to Navy Medical Education and Training. Appendix 12. General Duty Corpsman.
1974-08-31
survey and analysis was conducted relative to all factors affecting education and training programs. Subsequently, a job-analysis sub-system was defined...to be completed for this survey: Part I, Career Background Information (answers to be recorded in this TASK BOOKLET); Part II, A List of Tasks (answers... shampoo/comb hair, give nail care, shave beard, issue hospital comforts to patients, e.g., Kleenex, soap, toothpaste, Red Cross supplies
Feature-Oriented Domain Analysis (FODA) Feasibility Study
1990-11-01
controlling the synchronous behavior of the task. A task may wait for one or more synchronizing or message queue events. Each task is designed using the...
Li, Hui-Jie; Hou, Xiao-Hui; Liu, Han-Hui; Yue, Chun-Lin; He, Yong; Zuo, Xi-Nian
2015-03-01
Most previous task functional magnetic resonance imaging (fMRI) studies found abnormalities in distributed brain regions in mild cognitive impairment (MCI) and Alzheimer's disease (AD), and few studies investigated brain network dysfunction at the system level. In this meta-analysis, we aimed to examine brain network dysfunction in MCI and AD. We systematically searched task-based fMRI studies in MCI and AD published between January 1990 and January 2014. Activation likelihood estimation meta-analyses were conducted to compare the significant group differences in brain activation; the significant voxels were overlaid onto seven reference neuronal cortical networks derived from the resting-state fMRI data of 1,000 healthy participants. Thirty-nine task-based fMRI studies (697 MCI patients and 628 healthy controls) were included in the MCI-related meta-analysis, while 36 task-based fMRI studies (421 AD patients and 512 healthy controls) were included in the AD-related meta-analysis. The meta-analytic results revealed that MCI and AD showed abnormal regional brain activation as well as abnormal large-scale brain networks. MCI patients showed hypoactivation in default, frontoparietal, and visual networks relative to healthy controls, whereas AD-related hypoactivation was mainly located in visual, default, and ventral attention networks relative to healthy controls. Both MCI-related and AD-related hyperactivation fell in frontoparietal, ventral attention, default, and somatomotor networks relative to healthy controls. MCI and AD presented different pathological networks while sharing similar compensatory large-scale networks in fulfilling the cognitive tasks. These system-level findings are helpful to link the fundamental declines in cognitive tasks to brain networks in MCI and AD. © 2014 Wiley Periodicals, Inc.
Automation of scour analysis at Louisiana bridge sites : final report.
DOT National Transportation Integrated Search
1988-12-01
The computerized system for the organization, analysis, and display of field collected scour data is described. This system will enhance the current manual procedure of accomplishing these tasks. The system accepts input from the user, and based on u...
Brain-computer interface control along instructed paths
NASA Astrophysics Data System (ADS)
Sadtler, P. T.; Ryu, S. I.; Tyler-Kabara, E. C.; Yu, B. M.; Batista, A. P.
2015-02-01
Objective. Brain-computer interfaces (BCIs) are being developed to assist paralyzed people and amputees by translating neural activity into movements of a computer cursor or prosthetic limb. Here we introduce a novel BCI task paradigm, intended to help accelerate improvements to BCI systems. Through this task, we can push the performance limits of BCI systems, we can quantify more accurately how well a BCI system captures the user’s intent, and we can increase the richness of the BCI movement repertoire. Approach. We have implemented an instructed path task, wherein the user must drive a cursor along a visible path. The instructed path task provides a versatile framework to increase the difficulty of the task and thereby push the limits of performance. Relative to traditional point-to-point tasks, the instructed path task allows more thorough analysis of decoding performance and greater richness of movement kinematics. Main results. We demonstrate that monkeys are able to perform the instructed path task in a closed-loop BCI setting. We further investigate how the performance under BCI control compares to native arm control, whether users can decrease their movement variability in the face of a more demanding task, and how the kinematic richness is enhanced in this task. Significance. The use of the instructed path task has the potential to accelerate the development of BCI systems and their clinical translation.
System and method for seamless task-directed autonomy for robots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nielsen, Curtis; Bruemmer, David; Few, Douglas
Systems, methods, and user interfaces are used for controlling a robot. An environment map and a robot designator are presented to a user. The user may place, move, and modify task designators on the environment map. The task designators indicate a position in the environment map and indicate a task for the robot to achieve. A control intermediary links task designators with robot instructions issued to the robot. The control intermediary analyzes a relative position between the task designators and the robot. The control intermediary uses the analysis to determine a task-oriented autonomy level for the robot and communicates target achievement information to the robot. The target achievement information may include instructions for directly guiding the robot if the task-oriented autonomy level indicates low robot initiative and may include instructions for directing the robot to determine a robot plan for achieving the task if the task-oriented autonomy level indicates high robot initiative.
SOAR: An Architecture for General Intelligence
1987-12-01
these tasks, and (3) learn about all aspects of the tasks and its performance on them. Soar has existed since mid-1982 as an experimental software system...intelligence. Soar's behavior has already been studied over a range of tasks and methods (Figure 1), which sample its intended range, though...in multiple small tasks: generate and test, AND/OR search, hill climbing (simple and steepest-ascent), means-ends analysis, operator subgoaling
Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter
2014-07-01
Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.
Lee, Hyunyoung; Cheon, Byungsik; Hwang, Minho; Kang, Donghoon; Kwon, Dong-Soo
2018-02-01
In robotic surgical systems, commercial master devices have limitations owing to insufficient workspace and lack of intuitiveness. To overcome these limitations, a remote-center-of-motion (RCM) master manipulator was proposed. The feasibility of the proposed RCM structure was evaluated through kinematic analysis using a conventional serial structure. Two performance comparison experiments (peg transfer task and objective transfer task) were conducted for the developed master and Phantom Omni. The kinematic analysis results showed that compared with the serial structure, the proposed RCM structure has better performance in terms of design efficiency (19%) and workspace quality (59.08%). Further, in comparison with Phantom Omni, the developed master significantly increased task efficiency and significantly decreased workload in both experiments. The comparatively better performance in terms of intuitiveness, design efficiency, and operability of the proposed master for a robotic system for minimally invasive surgery was confirmed through kinematic and experimental analysis. Copyright © 2017 John Wiley & Sons, Ltd.
Detection and categorization of bacteria habitats using shallow linguistic analysis
2015-01-01
Background: Information regarding bacteria biotopes is important for several research areas including health sciences, microbiology, and food processing and preservation. One of the challenges for scientists in these domains is the huge amount of information buried in the text of electronic resources. Developing methods to automatically extract bacteria habitat relations from the text of these electronic resources is crucial for facilitating research in these areas. Methods: We introduce a linguistically motivated rule-based approach for recognizing and normalizing names of bacteria habitats in biomedical text by using an ontology. Our approach is based on shallow syntactic analysis of the text, which includes sentence segmentation, part-of-speech (POS) tagging, partial parsing, and lemmatization. In addition, we propose two methods for identifying bacteria habitat localization relations. The underlying assumption of the first method is that discourse changes with a new paragraph; therefore, it operates on a paragraph basis. The second method performs a more fine-grained analysis of the text and operates on a sentence basis. We also develop a novel anaphora resolution method for bacteria coreferences and incorporate it into the sentence-based relation extraction approach. Results: We participated in the Bacteria Biotope (BB) Task of the BioNLP Shared Task 2013. Our system (Boun) achieved the second-best performance, with a 68% Slot Error Rate (SER), in Sub-task 1 (Entity Detection and Categorization), and ranked third, with an F-score of 27%, in Sub-task 2 (Localization Event Extraction). This paper reports the system that was implemented for the shared task, including the novel methods developed and the improvements obtained after the official evaluation. The extensions include the expansion of the OntoBiotope ontology using the training set for Sub-task 1, and the novel sentence-based relation extraction method incorporated with anaphora resolution for Sub-task 2. These extensions resulted in promising results for Sub-task 1, with a SER of 68%, and state-of-the-art performance for Sub-task 2, with an F-score of 53%. Conclusions: Our results show that a linguistically oriented approach based on shallow syntactic analysis of the text is as effective as machine learning approaches for the detection and ontology-based normalization of habitat entities. Furthermore, the newly developed sentence-based relation extraction system with the anaphora resolution module significantly outperforms the paragraph-based one, as well as the other systems that participated in the BB Shared Task 2013. PMID:26201262
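The Slot Error Rate reported above is the standard error measure for slot-filling evaluations. A minimal sketch of the conventional formula (the shared task's exact scoring details, such as partial-match handling, are not given in the abstract and are assumptions here):

```python
def slot_error_rate(substitutions, deletions, insertions, reference_slots):
    """Slot Error Rate (SER) as conventionally defined for slot filling.

    SER = (S + D + I) / N, where N is the number of slots in the
    reference annotation. Lower is better. Illustrative only; the
    BioNLP shared task's official scorer may weight errors differently.
    """
    return (substitutions + deletions + insertions) / reference_slots

# e.g. 30 substitutions, 25 deletions, 13 insertions over 100 reference slots
ser = slot_error_rate(30, 25, 13, 100)  # -> 0.68, i.e. a 68% SER
```

Unlike an F-score, SER can exceed 1.0 when a system produces many spurious slots, which is why shared tasks often report both measures.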
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miron, M.S.; Christopher, C.; Hirshfield, S.
1978-05-01
Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.
Cognitive analysis of physicians' medication ordering activity.
Pelayo, Sylvia; Leroy, Nicolas; Guerlinger, Sandra; Degoulet, Patrice; Meaux, Jean-Jacques; Beuscart-Zéphir, Marie-Catherine
2005-01-01
Computerized Physician Order Entry (CPOE) addresses critical functions in healthcare systems. As the name clearly indicates, these systems focus on order entry. With regard to medication orders, such systems generally force physicians to enter exhaustively documented orders. But a cognitive analysis of the physician's medication ordering task shows that order entry is the last (and least) important step of the entire cognitive therapeutic decision-making task. We performed a comparative analysis of these complex cognitive tasks in two working environments, computer-based and paper-based. The results showed that information gathering, selection, and interpretation are critical cognitive functions supporting therapeutic decision making. Thus, the most important requirement from the physician's perspective would be an efficient display of relevant information, provided first as a summarized view of the patient's current treatment and followed by a more detailed, focused display of those items pertinent to the current situation. The CPOE system examined obviously failed to provide physicians with this critical summarized view. Following these results, consistent with users' complaints, the company decided to engage in a significant re-engineering of its application.
NASA Technical Reports Server (NTRS)
LaPointe, Michael
2006-01-01
The Solar Electric Propulsion (SEP) technology area is tasked to develop near and mid-term SEP technology to improve or enable science mission capture while minimizing risk and cost to the end user. The solar electric propulsion investments are primarily driven by SMD cost-capped mission needs. The technology needs are determined partially through systems analysis tasks including the recent "Re-focus Studies" and "Standard Architecture Study." These systems analysis tasks transitioned the technology development to address the near term propulsion needs suitable for cost-capped open solicited missions such as Discovery and New Frontiers Class missions. Major SEP activities include NASA's Evolutionary Xenon Thruster (NEXT), implementing a Standard Architecture for NSTAR and NEXT EP systems, and developing a long life High Voltage Hall Accelerator (HiVHAC). Lower level investments include advanced feed system development and xenon recovery testing. Future plans include completion of ongoing ISP development activities and evaluating potential use of commercial electric propulsion systems for SMD applications. Examples of enhanced mission capability and technology readiness dates shall be discussed.
Banks, Victoria A; Stanton, Neville A
2016-11-01
To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and quantitative network metrics, this agent-based modelling paper shows how the driver remains an integral part of the driving system, implying that designers must ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
Teaching Case: Analysis of an Electronic Voting System
ERIC Educational Resources Information Center
Thompson, Nik; Toohey, Danny
2014-01-01
This teaching case discusses the analysis of an electronic voting system. The development of the case was motivated by research into information security and management, but as it includes procedural aspects, organizational structure and personnel, it is a suitable basis for all aspects of systems analysis, planning and design tasks. The material…
Uncovering the requirements of cognitive work.
Roth, Emilie M
2008-06-01
In this article, the author provides an overview of cognitive analysis methods and how they can be used to inform system analysis and design. Human factors has seen a shift toward modeling and support of cognitively intensive work (e.g., military command and control, medical planning and decision making, supervisory control of automated systems). Cognitive task analysis and cognitive work analysis methods extend traditional task analysis techniques to uncover the knowledge and thought processes that underlie performance in cognitively complex settings. The author reviews the multidisciplinary roots of cognitive analysis and the variety of cognitive task analysis and cognitive work analysis methods that have emerged. Cognitive analysis methods have been used successfully to guide system design, as well as development of function allocation, team structure, and training, so as to enhance performance and reduce the potential for error. A comprehensive characterization of cognitive work requires two mutually informing analyses: (a) examination of domain characteristics and constraints that define cognitive requirements and challenges and (b) examination of practitioner knowledge and strategies that underlie both expert and error-vulnerable performance. A variety of specific methods can be adapted to achieve these aims within the pragmatic constraints of particular projects. Cognitive analysis methods can be used effectively to anticipate cognitive performance problems and specify ways to improve individual and team cognitive performance (be it through new forms of training, user interfaces, or decision aids).
Hidden Markov model analysis of force/torque information in telemanipulation
NASA Technical Reports Server (NTRS)
Hannaford, Blake; Lee, Paul
1991-01-01
A model for the prediction and analysis of sensor information recorded during robotic performance of telemanipulation tasks is presented. The model uses the hidden Markov model to describe the task structure, the operator's or intelligent controller's goal structure, and the sensor signals. A methodology for constructing the model parameters based on engineering knowledge of the task is described. It is concluded that the model and its optimal state estimation algorithm, the Viterbi algorithm, are very successful at the task of segmenting the data record into phases corresponding to subgoals of the task. The model provides a rich modeling structure within a statistical framework, which enables it to represent complex systems and be robust to real-world sensory signals.
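The Viterbi algorithm mentioned above recovers the most likely sequence of hidden states (here, task phases) given a sequence of observations. A minimal log-domain sketch, assuming the per-timestep emission log likelihoods have already been computed from the force/torque readings (the paper's specific parameterization of those emission models is not reproduced here):

```python
import numpy as np

def viterbi(log_A, log_B, log_pi):
    """Most likely hidden state path (task phases) given observations.

    log_A:  (S, S) log transition matrix between task phases
    log_B:  (T, S) per-timestep log likelihood of each phase given the
            recorded sensor reading (assumed precomputed upstream)
    log_pi: (S,)   log initial phase probabilities
    """
    T, S = log_B.shape
    delta = np.empty((T, S))          # best log score ending in each state
    back = np.empty((T, S), dtype=int)  # backpointers to the previous state
    delta[0] = log_pi + log_B[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # (S, S): prev state -> next state
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[t]
    # Backtrack from the best final state to recover the full path
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1][path[t + 1]]
    return path
```

Runs of identical states in the returned path correspond to the task phases the paper segments, with state changes marking subgoal boundaries.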
Major Events Leading to Establishment of The National Task Bank.
ERIC Educational Resources Information Center
Upjohn (W.E.) Inst. for Employment Research, Washington, DC.
This document describes how the plan for a National Task Bank evolved as part of an effort to encourage State and local public welfare agencies to adopt new approaches to staff planning and utilization. The task bank is an outgrowth of the application of systems approach and functional job analysis to agency management. Individualized data banks…
NASA Technical Reports Server (NTRS)
1976-01-01
The following areas related to the final definition and preliminary design study of the initial atmospheric cloud physics laboratory (ACPL) were covered: (1) proposal organization, personnel, schedule, and project management; (2) proposed configurations; (3) study objectives; (4) ACPL experiment program listing and description; (5) mission/flight flexibility and modularity/commonality; (6) study plan; and (7) description of the following tasks: requirements analysis and definition task flow, systems analysis and trade studies, subsystem analysis and trade studies, specifications and interface control documents, preliminary design task flow, work breakdown structure, programmatic analysis and planning, and project costs. Finally, an overview of the scientific requirements was presented.
Analysis of remote operating systems for space-based servicing operations. Volume 2: Study results
NASA Technical Reports Server (NTRS)
1985-01-01
The developments in automation and robotics have increased the importance of applications for space-based servicing using remotely operated systems. A study of three basic remote operating systems (teleoperation, telepresence, and robotics) was performed in two phases. In phase one, requirements development, which consisted of one three-month task, a group of ten missions was selected. These included the servicing of user equipment on the station and the servicing of the station itself. In phase two, concepts development, which consisted of three tasks, overall system concepts were developed for the selected missions. These concepts, which include worksite servicing equipment, a carrier system, and payload handling equipment, were evaluated relative to the configurations of the overall worksite. It is found that robotic/teleoperator concepts are appropriate for relatively simple, structured tasks, while telepresence/teleoperator concepts are applicable to missions involving complex, unstructured tasks.
Altitude deviations: Breakdowns of an error-tolerant system
NASA Technical Reports Server (NTRS)
Palmer, Everett A.; Hutchins, Edwin L.; Ritter, Richard D.; Vancleemput, Inge
1993-01-01
Pilot reports of aviation incidents to the Aviation Safety Reporting System (ASRS) provide a window on the problems occurring in today's airline cockpits. The narratives of 10 pilot reports of errors made in the automation-assisted altitude-change task are used to illustrate some of the issues of pilots interacting with automatic systems. These narratives are then used to construct a description of the cockpit as an information processing system. The analysis concentrates on the error-tolerant properties of the system and on how breakdowns can occasionally occur. An error-tolerant system can detect and correct its internal processing errors. The cockpit system consists of two or three pilots supported by autoflight, flight-management, and alerting systems. These humans and machines have distributed access to clearance information and perform redundant processing of information. Errors can be detected either as deviations from expected behavior or as deviations from expected information. Breakdowns in this system can occur when the checking and cross-checking tasks that give the system its error-tolerant properties are not performed because of distractions or other task demands. Recommendations based on the analysis for improving the error tolerance of the cockpit system are given.
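The redundant cross-checking that gives the cockpit system its error tolerance can be illustrated with a minimal sketch (the agent names and altitude values below are invented, not drawn from the ASRS reports):

```python
# Hypothetical sketch of the error-tolerance idea described above: two
# agents (pilot and autoflight system) each hold their own copy of the
# cleared altitude, and a cross-check flags any agent whose expectation
# deviates from the target actually set on the mode control panel.

def cross_check(expectations, observed_target):
    """Return the agents whose expected altitude deviates from the
    observed target (empty list = no error detected)."""
    return [agent for agent, expected in expectations.items()
            if expected != observed_target]

# Both agents heard the clearance "descend to 11,000 ft" correctly:
ok = cross_check({"pilot": 11000, "autoflight": 11000}, 11000)

# The target was mis-set to 10,000 ft; redundant processing catches it:
caught = cross_check({"pilot": 11000, "autoflight": 11000}, 10000)

print(ok)      # []
print(caught)  # ['pilot', 'autoflight']
```

A breakdown in the paper's sense corresponds to this check simply never being run because of distraction or competing task demands.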
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks-cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
2013-12-04
Table 3. Monthly PMCS Actions and Labor Required. Task / Hours: Inspect cab and hood, 1.0; Inspect turbocharger, 0.5; Inspect fuel tank, 0.1; Inspect... Annual PMCS Actions and Labor Required. Task / Hours: Inspect cab and hood, 1.0; Air intake system, 0.2; Inspect fuel tank, 0.1; Inspect turbocharger...
POPCORN: a Supervisory Control Simulation for Workload and Performance Research
NASA Technical Reports Server (NTRS)
Hart, S. G.; Battiste, V.; Lester, P. T.
1984-01-01
A multi-task simulation of a semi-automatic supervisory control system was developed to provide an environment in which training, operator strategy development, failure detection and resolution, levels of automation, and operator workload can be investigated. The goal was to develop a well-defined, but realistically complex, task that would lend itself to model-based analysis. The name of the task (POPCORN) reflects the visual display, which depicts different task elements milling around, waiting to be released and "pop" out to be performed. The operator's task was to complete each of 100 task elements, which were represented by different symbols, by selecting a target task and entering the desired command. The simulated automatic system then completed the selected function automatically. Highly significant differences in performance, strategy, and rated workload were found as a function of all experimental manipulations (except reward/penalty).
Sprinkler System Installer. Occupational Analyses Series.
ERIC Educational Resources Information Center
Chinien, Chris; Boutin, France
This analysis covers tasks performed by a sprinkler system installer, an occupational title some provinces and territories of Canada have also identified as pipefitter--fire protection mechanic specialty; sprinkler and fire protection installer; sprinkler and fire protection systems installer; and sprinkler fitter. A guide to analysis discusses…
Explanation production by expert planners
NASA Technical Reports Server (NTRS)
Bridges, Susan; Johannes, James D.
1988-01-01
Although the explanation capability of expert systems is usually listed as one of the distinguishing characteristics of these systems, the explanation facilities of most existing systems are quite primitive. Computer generated explanations are typically produced from canned text or by direct translation of the knowledge structures. Explanations produced in this manner bear little resemblance to those produced by humans for similar tasks. The focus of our research in explanation is the production of justifications for decisions by expert planning systems. An analysis of justifications written by people for planning tasks has been taken as the starting point. The purpose of this analysis is two-fold. First, analysis of the information content of the justifications will provide a basis for deciding what knowledge must be represented if human-like justifications are to be produced. Second, an analysis of the textual organization of the justifications will be used in the development of a mechanism for selecting and organizing the knowledge to be included in a computer-produced explanation. This paper describes a preliminary analysis done of justifications written by people for a planning task. It is clear that these justifications differ significantly from those that would be produced by an expert system by tracing the firing of production rules. The results from the text analysis have been used to develop an augmented phrase structured grammar (APSG) describing the organization of the justifications. The grammar was designed to provide a computationally feasible method for determining textual organization that will allow the necessary information to be communicated in a cohesive manner.
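The grammar-driven organization of justification text described above can be illustrated with a toy phrase-structure grammar; the rules and the sample justification below are invented for illustration and are not the APSG from the paper:

```python
# A toy phrase-structure grammar in the spirit of the APSG mentioned
# above: a justification is generated as a claim followed by supporting
# grounds. Nonterminals map to lists of productions; anything not in the
# grammar is treated as literal (terminal) text.
GRAMMAR = {
    "JUSTIFICATION": [["CLAIM", "GROUNDS"]],
    "CLAIM": [["Task A was scheduled first"]],
    "GROUNDS": [["because", "REASON"]],
    "REASON": [["it has the earliest deadline"]],
}

def expand(symbol):
    if symbol not in GRAMMAR:          # terminal: literal text
        return symbol
    production = GRAMMAR[symbol][0]    # deterministic: take first rule
    return " ".join(expand(s) for s in production)

print(expand("JUSTIFICATION"))
# Task A was scheduled first because it has the earliest deadline
```

The point of such a grammar in the paper's setting is that textual organization is chosen by rule, rather than by tracing production-rule firings.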
NASA Astrophysics Data System (ADS)
Barreiro, F. H.; Borodin, M.; De, K.; Golubkov, D.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Padolski, S.; Wenaus, T.; ATLAS Collaboration
2017-10-01
The second generation of the ATLAS Production System, called ProdSys2, is a distributed workload manager that runs hundreds of thousands of jobs daily, from dozens of different ATLAS-specific workflows, across more than a hundred heterogeneous sites. It achieves high utilization by combining dynamic job definition based on many criteria, such as input and output size, memory requirements and CPU consumption, with manageable scheduling policies, and by supporting different kinds of computational resources, such as grids, clouds, supercomputers and volunteer computers. The system dynamically assigns a group of jobs (a task) to a group of geographically distributed computing resources. Dynamic assignment and resource utilization is one of the major features of the system; it did not exist in the earliest versions of the production system, where the Grid resources topology was predefined using national and/or geographical patterns. The Production System has a sophisticated job fault-recovery mechanism, which efficiently allows multi-Terabyte tasks to run without human intervention. We have implemented a "train" model and open-ended production, which allow tasks to be submitted automatically as soon as a new set of data is available, and allow physics-group data processing and analysis to be chained with central production by the experiment. We present an overview of the ATLAS Production System and the features and architecture of its major components: task definition, web user interface and monitoring. We describe the important design decisions and lessons learned from operational experience during the first year of LHC Run 2. We also report the performance of the designed system and how various workflows, such as data (re)processing, Monte Carlo and physics-group production, and user analysis, are scheduled and executed within one production system on heterogeneous computing resources.
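The dynamic assignment of jobs to heterogeneous resource classes might be sketched as a simple capability match; the resource names, limits and job parameters below are invented stand-ins, not ProdSys2 internals:

```python
# Hypothetical sketch of criteria-based job routing: each job carries
# resource requirements, and the scheduler routes it to the first
# (least capable) resource class that satisfies them.

RESOURCES = [                       # ordered from least to most capable
    {"name": "grid",          "max_memory_gb": 4,   "cores": 1},
    {"name": "cloud",         "max_memory_gb": 16,  "cores": 8},
    {"name": "supercomputer", "max_memory_gb": 128, "cores": 64},
]

def assign(job):
    """Return the least-capable resource class that can run the job."""
    for res in RESOURCES:
        if job["memory_gb"] <= res["max_memory_gb"] and job["cores"] <= res["cores"]:
            return res["name"]
    raise ValueError("no resource class can satisfy this job")

print(assign({"memory_gb": 2,  "cores": 1}))   # grid
print(assign({"memory_gb": 32, "cores": 16}))  # supercomputer
```

Routing to the least capable adequate class is one plausible way to keep scarce resources (supercomputers) free for the jobs that genuinely need them.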
Geruschat, Duane R; Richards, Thomas P; Arditi, Aries; da Cruz, Lyndon; Dagnelie, Gislin; Dorn, Jessy D; Duncan, Jacque L; Ho, Allen C; Olmos de Koo, Lisa C; Sahel, José-Alain; Stanga, Paulo E; Thumann, Gabriele; Wang, Vizhong; Greenberg, Robert J
2016-05-01
The purpose of this analysis was to compare observer-rated tasks in patients implanted with the Argus II Retinal Prosthesis System, when the device is ON versus OFF. The Functional Low-Vision Observer Rated Assessment (FLORA) instrument was administered to 26 blind patients implanted with the Argus II Retinal Prosthesis System at a mean follow-up of 36 months. FLORA is a multi-component instrument that consists in part of observer-rated assessment of 35 tasks completed with the device ON versus OFF. The ease with which a patient completes a task is scored using a four-point scale, ranging from easy (score of 1) to impossible (score of 4). The tasks are evaluated individually and organised into four discrete domains: 'Visual orientation', 'Visual mobility', 'Daily life' and 'Interaction with others'. Twenty-six patients completed each of the 35 tasks. Overall, 24 out of 35 tasks (69 per cent) were statistically significantly easier to achieve with the device ON versus OFF. In each of the four domains, patients' performances were significantly better (p < 0.05) with the device ON versus OFF, ranging from 19 to 38 per cent improvement. Patients with an Argus II Retinal Prosthesis implanted for 18 to 44 months (mean 36 months) demonstrated significantly improved completion of vision-related tasks with the device ON versus OFF. © 2016 The Authors Clinical and Experimental Optometry published by John Wiley & Sons Australia, Ltd on behalf of Optometry Australia.
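The four-point ease scale lends itself to a simple ON-versus-OFF aggregation; the sketch below uses invented scores, not data from the study, to show how a per-domain percentage improvement could be computed:

```python
# Illustrative FLORA-style ease scores: 1 = easy ... 4 = impossible.
# Lower scores are better, so improvement is the reduction in the mean
# ease score relative to the device-OFF condition. All numbers invented.

def mean_score(scores):
    return sum(scores) / len(scores)

def percent_improvement(on, off):
    """Percentage reduction in mean ease score, device ON vs OFF."""
    return 100.0 * (mean_score(off) - mean_score(on)) / mean_score(off)

visual_mobility_on  = [2, 2, 3, 1]   # hypothetical domain scores, ON
visual_mobility_off = [4, 3, 3, 2]   # hypothetical domain scores, OFF

print(round(percent_improvement(visual_mobility_on, visual_mobility_off), 1))
# 33.3
```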
Development of an improved coating for polybenzimidazole foam. [for space shuttle heat shields
NASA Technical Reports Server (NTRS)
Neuner, G. J.; Delano, C. B.
1976-01-01
An improved coating system was developed for polybenzimidazole (PBI) foam to provide coating stability, ruggedness, moisture resistance, and to satisfy optical property requirements (α_s/ε ≤ 0.4 and ε ≥ 0.8) for the space shuttle. The effort was performed in five tasks: Task 1 to establish material and process specifications for the PBI foam, and material specifications for the coatings; Task 2 to identify and evaluate promising coatings; Task 3 to establish mechanical and thermophysical properties of the tile components; Task 4 to determine by systems analysis the potential weight trade-offs associated with a coated PBI TPS; and Task 5 to establish a preliminary quality assurance program. The coated PBI tile was, through screening tests, determined to satisfy the design objectives with a reduced system weight over the baseline shuttle silica LRSI TPS. The developed tile provides a thermally stable, extremely rugged, low thermal conductivity insulator with a well characterized optical coating.
Systematic review automation technologies
2014-01-01
Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends toward the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
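One of the tasks the envisaged workflow would automate, the meta-analysis calculation, can be sketched as a fixed-effect (inverse-variance) pooling step; the effect sizes and standard errors below are invented for illustration:

```python
import math

# Fixed-effect (inverse-variance) pooling: each study's effect estimate
# is weighted by the reciprocal of its variance, and the pooled standard
# error is the square root of the reciprocal of the total weight.

def fixed_effect_pool(effects, std_errors):
    """Pool per-study effects weighted by inverse variance."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

effects = [0.30, 0.10, 0.20]      # e.g. log risk ratios from three trials
std_errors = [0.10, 0.10, 0.20]   # invented standard errors

pooled, se = fixed_effect_pool(effects, std_errors)
print(round(pooled, 3), round(se, 3))  # 0.2 0.067
```

In the fully automated pipeline the review describes, this step would consume effect sizes extracted upstream and feed directly into report generation.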
ERIC Educational Resources Information Center
Katz, Sandra N.; Hall, Ellen; Lesgold, Alan
This paper describes some results of a collaborative effort between the University of Pittsburgh and the Air Force to develop advanced troubleshooting training for F-15 maintenance technicians. The focus is on the cognitive task methodology used in the development of three intelligent tutoring systems to inform their instructional content and…
Banks, Victoria A; Stanton, Neville A; Harvey, Catherine
2014-01-01
Although task analysis of pedestrian detection can provide us with useful insights into how a driver may behave in emergency situations, the cognitive elements of driver decision-making are less well understood. To assist in the design of future Advanced Driver Assistance Systems, such as Autonomous Emergency Brake systems, it is essential that the cognitive elements of the driving task are better understood. This paper uses verbal protocol analysis in an exploratory fashion to uncover the thought processes underlying behavioural outcomes represented by hard data collected using the Southampton University Driving Simulator.
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Anderson, M. R.; Schmidt, D. K.
1986-01-01
In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R. E. Smith are also presented.
NASA Technical Reports Server (NTRS)
Merino, F.; Wakabayashi, I.; Pleasant, R. L.; Hill, M.
1982-01-01
Preferred techniques for providing abort pressurization and engine feed system net positive suction pressure (NPSP) for low thrust chemical propulsion systems (LTPS) were determined. A representative LTPS vehicle configuration is presented. Analysis tasks include: propellant heating analysis; pressurant requirements for abort propellant dump; and comparative analysis of pressurization techniques and thermal subcoolers.
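The NPSP requirement mentioned above follows the standard definition, total pressure at the pump inlet minus the propellant vapor pressure; a minimal sketch with invented numbers:

```python
# Net positive suction pressure at the engine feed inlet, per the
# standard definition (inlet total pressure minus vapor pressure).
# The kPa values below are illustrative, not taken from the study.

def npsp(tank_pressure_kpa, hydrostatic_head_kpa, vapor_pressure_kpa):
    """NPSP = tank ullage pressure + hydrostatic head - vapor pressure."""
    return tank_pressure_kpa + hydrostatic_head_kpa - vapor_pressure_kpa

print(npsp(150.0, 12.0, 110.0))  # 52.0
```

Pressurization-system trades of the kind the study describes amount to choosing how to keep this quantity above the engine's minimum requirement at the lowest system mass.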
Lather (Interior Systems Mechanic). Occupational Analyses Series.
ERIC Educational Resources Information Center
Chapman, Mike; Chapman, Carol; MacLean, Margaret
This analysis covers tasks performed by a lather, an occupational title some provinces and territories of Canada have also identified as drywall and acoustical mechanic; interior systems installer; and interior systems mechanic. A guide to analysis discusses development, structure, and validation method; scope of the occupation; trends; and…
Data Analysis for Ocean Thermal Energy Conversion (otec)
1979-11-01
The OTEC system consisted of copper heater cylinders which were press-fitted to the outside of the heat exchanger tubes. Voltage to the heaters was... The Heat Exchanger Heating task was sponsored by the Department of Energy under Interagency Agreement ET-78-I-O1-3218, Task Number 13218, Work... Panama City, Florida. Test site characterization, cleaning systems, and the physical structure of the OTEC system are discussed briefly. Data sampling...
No psychological effect of color context in a low level vision task
Pedley, Adam; Wade, Alex R
2013-01-01
Background: A remarkable series of recent papers have shown that colour can influence performance in cognitive tasks. In particular, they suggest that viewing a participant number printed in red ink or other red ancillary stimulus elements improves performance in tasks requiring local processing and impedes performance in tasks requiring global processing whilst the reverse is true for the colour blue. The tasks in these experiments require high level cognitive processing such as analogy solving or remote association tests and the chromatic effect on local vs. global processing is presumed to involve widespread activation of the autonomic nervous system. If this is the case, we might expect to see similar effects on all local vs. global task comparisons. To test this hypothesis, we asked whether chromatic cues also influence performance in tasks involving low level visual feature integration. Methods: Subjects performed either local (contrast detection) or global (form detection) tasks on achromatic dynamic Glass pattern stimuli. Coloured instructions, target frames and fixation points were used to attempt to bias performance to different task types. Based on previous literature, we hypothesised that red cues would improve performance in the (local) contrast detection task but would impede performance in the (global) form detection task. Results: A two-way, repeated measures, analysis of covariance (2×2 ANCOVA) with gender as a covariate, revealed no influence of colour on either task, F(1,29) = 0.289, p = 0.595, partial η² = 0.002. Additional analysis revealed no significant differences in only the first attempts of the tasks or in the improvement in performance between trials. Discussion: We conclude that motivational processes elicited by colour perception do not influence neuronal signal processing in the early visual system, in stark contrast to their putative effects on processing in higher areas. PMID:25075280
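The 2×2 factorial logic of the analysis can be illustrated with a small pure-Python two-way ANOVA. Note the study itself used a repeated-measures ANCOVA with gender as a covariate, so this between-subjects stand-in with invented data is only a sketch of the null-colour-effect pattern reported:

```python
# Balanced two-way ANOVA (colour x task type) from first principles.
# The data are invented so that task type matters but colour does not,
# mirroring the qualitative result described above.

def two_way_anova_F(data):
    """data: dict {(a_level, b_level): [values]}, balanced design.
    Returns (F_A, F_B, F_interaction)."""
    all_vals = [v for vs in data.values() for v in vs]
    grand = sum(all_vals) / len(all_vals)
    a_levels = sorted({a for a, _ in data})
    b_levels = sorted({b for _, b in data})
    n = len(next(iter(data.values())))              # per-cell sample size

    def factor_mean(axis, level):
        vals = [v for key, vs in data.items() if key[axis] == level for v in vs]
        return sum(vals) / len(vals)

    cell_mean = {k: sum(vs) / n for k, vs in data.items()}
    ss_a = n * len(b_levels) * sum((factor_mean(0, a) - grand) ** 2 for a in a_levels)
    ss_b = n * len(a_levels) * sum((factor_mean(1, b) - grand) ** 2 for b in b_levels)
    ss_cells = n * sum((m - grand) ** 2 for m in cell_mean.values())
    ss_ab = ss_cells - ss_a - ss_b
    ss_within = sum((v - cell_mean[k]) ** 2 for k, vs in data.items() for v in vs)
    ms_within = ss_within / (len(all_vals) - len(data))
    df_a, df_b = len(a_levels) - 1, len(b_levels) - 1
    return (ss_a / df_a / ms_within,
            ss_b / df_b / ms_within,
            ss_ab / (df_a * df_b) / ms_within)

data = {                       # colour has no effect; task type does
    ("red",  "local"):  [10, 11, 12], ("red",  "global"): [20, 21, 22],
    ("blue", "local"):  [10, 11, 12], ("blue", "global"): [20, 21, 22],
}
f_colour, f_task, f_interaction = two_way_anova_F(data)
print(f_colour, f_task, f_interaction)  # 0.0 300.0 0.0
```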
2005-01-01
Surface Tasks ... 250. GOALI: Creep and Microstructural... SURFACE TASKS: Morris Driels, Professor, Department of Mechanical Engineering; Sponsor: U.S. Army Materiel Systems Analysis Activity. GOALI: CREEP AND... Professor, Department of Mechanical Engineering; Sponsor: National Science Foundation. SUMMARY: This GOALI (Grant Opportunities for Academic Liaison...
Studying the HIT-Complexity Interchange.
Kuziemsky, Craig E; Borycki, Elizabeth M; Kushniruk, Andre W
2016-01-01
The design and implementation of health information technology (HIT) is challenging, particularly when it is being introduced into complex settings. While complex adaptive systems (CASs) can be a valuable means of understanding relationships between users, HIT and tasks, much of the existing work using CASs is descriptive in nature. This paper addresses that issue by integrating a model for analyzing task complexity with approaches for HIT evaluation and systems analysis. The resulting framework classifies HIT-user tasks and issues as simple, complicated or complex, and provides insight into how to study them.
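A hypothetical operationalization of the simple/complicated/complex triage might look as follows; the thresholds and predicate names are invented, since the abstract does not specify a scoring rule:

```python
# Invented triage rule illustrating the three-way classification above:
# tasks with unpredictable, emergent outcomes are "complex"; short tasks
# with no interdependencies are "simple"; everything else is merely
# "complicated" (many parts, but predictable interactions).

def classify_task(n_steps, n_interdependencies, outcomes_predictable):
    if not outcomes_predictable:
        return "complex"
    if n_steps <= 3 and n_interdependencies == 0:
        return "simple"
    return "complicated"

print(classify_task(2, 0, True))    # simple
print(classify_task(12, 5, True))   # complicated
print(classify_task(6, 3, False))   # complex
```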
2011-08-15
The system must, at a minimum, include a design and configuration framework supporting: Part 1, Net Ready. The system must support net-centric operations... Analyze, evaluate and incorporate the relevant DoD Architecture Framework. 5) Document standards for each task/condition combination. 6) Prepare final FAA... task: analyze, evaluate and incorporate the relevant Army Architecture Framework; document standards for each task/condition combination forming...
TADS: A CFD-based turbomachinery analysis and design system with GUI. Volume 1: Method and results
NASA Technical Reports Server (NTRS)
Topp, D. A.; Myers, R. A.; Delaney, R. A.
1995-01-01
The primary objective of this study was the development of a CFD (Computational Fluid Dynamics) based turbomachinery airfoil analysis and design system, controlled by a GUI (Graphical User Interface). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is the Final Report describing the theoretical basis and analytical results from the TADS system, developed under Task 18 of NASA Contract NAS3-25950, ADPAC System Coupling to Blade Analysis & Design System GUI. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low speed turbine blade and a transonic turbine vane.
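The coupling of the throughflow and blade-to-blade solvers can be caricatured as a fixed-point iteration that exchanges interface data until convergence; the two linear stand-in models below are invented and bear no relation to the actual ADPAC or RVCQ3D solvers:

```python
# Schematic coupled-solver loop: two "solvers" exchange interface
# quantities until successive iterates stop changing. The linear
# stand-in models are invented and chosen to be contractive so the
# iteration converges.

def throughflow(blade_loading):
    return 0.5 * blade_loading + 1.0      # stand-in for the meridional solve

def blade_to_blade(meridional_flow):
    return 0.4 * meridional_flow + 0.2    # stand-in for the cascade solve

flow, loading = 1.0, 1.0                  # initial guesses
for _ in range(50):
    new_flow = throughflow(loading)
    new_loading = blade_to_blade(new_flow)
    if abs(new_flow - flow) < 1e-10 and abs(new_loading - loading) < 1e-10:
        break
    flow, loading = new_flow, new_loading

print(round(flow, 4), round(loading, 4))  # ≈ 1.375 0.75
```

The same loop structure is why TADS could swap in alternative solvers or grid generators: each side only needs the exchanged interface data, not the other solver's internals.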
Inertial Upper Stage (IUS) software analysis
NASA Technical Reports Server (NTRS)
Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.
1979-01-01
The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.
A Scheduling Algorithm for Replicated Real-Time Tasks
NASA Technical Reports Server (NTRS)
Yu, Albert C.; Lin, Kwei-Jay
1991-01-01
We present an algorithm for scheduling real-time periodic tasks on a multiprocessor system under a fault-tolerance requirement. Our approach incorporates both the redundancy and masking technique and the imprecise computation model. Since the tasks in hard real-time systems have stringent timing constraints, the redundancy and masking technique is more appropriate than rollback techniques, which usually require extra time for error recovery. The imprecise computation model provides flexible functionality by trading off the quality of the result produced by a task with the amount of processing time required to produce it. It therefore permits the performance of a real-time system to degrade gracefully. We evaluate the algorithm by stochastic analysis and Monte Carlo simulations. The results show that the algorithm is resilient under hardware failures.
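The graceful-degradation idea of the imprecise computation model can be sketched as follows; this greedy sketch (invented task parameters, no deadlines or replication) only illustrates mandatory-first scheduling, not the paper's full algorithm:

```python
# Imprecise-computation sketch: every task has a mandatory part that
# must run and an optional part that refines the result. Under load,
# the scheduler drops optional parts rather than missing mandatory
# work, so quality degrades gracefully instead of tasks failing.

def schedule(tasks, capacity):
    """tasks: list of (name, mandatory_cost, optional_cost).
    Returns (executed parts, remaining capacity)."""
    executed = []
    for name, mandatory, _ in tasks:          # mandatory parts first
        capacity -= mandatory
        executed.append((name, "mandatory"))
    if capacity < 0:
        raise RuntimeError("infeasible: mandatory load exceeds capacity")
    for name, _, optional in sorted(tasks, key=lambda t: t[2]):
        if optional <= capacity:              # cheapest optional parts first
            capacity -= optional
            executed.append((name, "optional"))
    return executed, capacity

tasks = [("T1", 2, 3), ("T2", 1, 1), ("T3", 2, 4)]
plan, slack = schedule(tasks, capacity=8)
print(plan, slack)
```

With a budget of 8 units, all three mandatory parts run and only T2's cheap optional part fits, leaving 2 units of slack.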
2015-01-01
Background Modern methods for mining biomolecular interactions from literature typically make predictions based solely on the immediate textual context, in effect a single sentence. No prior work has been published on extending this context to the information automatically gathered from the whole biomedical literature. Thus, our motivation for this study is to explore whether mutually supporting evidence, aggregated across several documents can be utilized to improve the performance of the state-of-the-art event extraction systems. In this paper, we describe our participation in the latest BioNLP Shared Task using the large-scale text mining resource EVEX. We participated in the Genia Event Extraction (GE) and Gene Regulation Network (GRN) tasks with two separate systems. In the GE task, we implemented a re-ranking approach to improve the precision of an existing event extraction system, incorporating features from the EVEX resource. In the GRN task, our system relied solely on the EVEX resource and utilized a rule-based conversion algorithm between the EVEX and GRN formats. Results In the GE task, our re-ranking approach led to a modest performance increase and resulted in the first rank of the official Shared Task results with 50.97% F-score. Additionally, in this paper we explore and evaluate the usage of distributed vector representations for this challenge. In the GRN task, we ranked fifth in the official results with a strict/relaxed SER score of 0.92/0.81 respectively. To try and improve upon these results, we have implemented a novel machine learning based conversion system and benchmarked its performance against the original rule-based system. Conclusions For the GRN task, we were able to produce a gene regulatory network from the EVEX data, warranting the use of such generic large-scale text mining data in network biology settings. A detailed performance and error analysis provides more insight into the relatively low recall rates. 
In the GE task we demonstrate that both the re-ranking approach and the word vectors can provide slight performance improvement. A manual evaluation of the re-ranking results pinpoints some of the challenges faced in applying large-scale text mining knowledge to event extraction. PMID:26551766
NASA Technical Reports Server (NTRS)
Johnson, Kathy A.; Shek, Molly
2003-01-01
Astronauts in a space station are to some extent like patients in an intensive care unit (ICU). Medical support of a mission crew will require acquisition, transmission, distribution, integration, and archiving of significant amounts of data. These data are acquired by disparate systems and will require timely, reliable, and secure distribution to different communities for the execution of various tasks of space missions. The goal of the Comprehensive Medical Information System (CMIS) Project at Johnson Space Center Flight Medical Clinic is to integrate data from all Medical Operations sources, including the reference information sources and the electronic medical records of astronauts. A first step toward the full CMIS implementation is to integrate and organize the reference information sources and the electronic medical record with the Flight Surgeon's console. In order to investigate this integration, we need to understand the usability problems of the Flight Surgeon's console in particular and medical information systems in general. One way to achieve this understanding is through the use of user and task analyses, whose general purpose is to ensure that only the necessary and sufficient task features that match users' capacities will be included in system implementations. The goal of this summer project was to conduct user and task analyses employing cognitive engineering techniques to analyze the task of the Flight Surgeons and Biomedical Engineers (BMEs) while they worked on console. The techniques employed were user interviews, observations and a questionnaire to collect data, for which a hierarchical task analysis and an information resource assessment were performed. They are described in more detail below. Finally, based on our analyses, we make recommendations for improvements to the support structure.
Ordering Design Tasks Based on Coupling Strengths
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Bloebaum, C. L.
1994-01-01
The design process associated with large engineering systems requires an initial decomposition of the complex system into modules of design tasks which are coupled through the transference of output data. In analyzing or optimizing such a coupled system, it is essential to be able to determine which interactions figure prominently enough to significantly affect the accuracy of the system solution. Many decomposition approaches assume the capability is available to determine what design tasks and interactions exist and what order of execution will be imposed during the analysis process. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature for DeMAID (Design Manager's Aid for Intelligent Decomposition) will allow the design manager to use coupling strength information to find a proper sequence for ordering the design tasks. In addition, these coupling strengths aid in deciding if certain tasks or couplings could be removed (or temporarily suspended) from consideration to achieve computational savings without a significant loss of system accuracy. New rules are presented and two small test cases are used to show the effects of using coupling strengths in this manner.
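The ordering idea described above lends itself to a short illustration. The sketch below is not DeMAID itself (its rules are not reproduced in the abstract); it is a minimal, assumed greedy heuristic: schedule next the task that receives the weakest feedback coupling from tasks not yet ordered, and flag couplings below a strength threshold as candidates for temporary suspension.

```python
def order_tasks(coupling, threshold=0.1):
    """coupling[i][j] = strength of output from task i feeding task j.

    Illustrative sketch only: greedily pick the task with the weakest
    incoming coupling from unscheduled tasks, so strong couplings tend
    to flow forward in the sequence.
    """
    n = len(coupling)
    remaining = set(range(n))
    order = []
    while remaining:
        # Weakest feedback from still-unscheduled tasks; ties by index.
        best = min(sorted(remaining),
                   key=lambda j: sum(coupling[i][j]
                                     for i in remaining if i != j))
        order.append(best)
        remaining.remove(best)
    # Couplings weaker than the threshold are candidates for suspension,
    # trading a little accuracy for computational savings.
    suspended = [(i, j) for i in range(n) for j in range(n)
                 if i != j and 0 < coupling[i][j] < threshold]
    return order, suspended
```

For example, with three tasks where task 0 feeds task 1 strongly (0.5), task 1 feeds task 2 strongly (0.9), and task 0 feeds task 2 weakly (0.05), the heuristic returns the forward order 0, 1, 2 and flags the weak coupling for suspension.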
TEES 2.2: Biomedical Event Extraction for Diverse Corpora
2015-01-01
Background: The Turku Event Extraction System (TEES) is a text mining program developed for the extraction of events, complex biomedical relationships, from scientific literature. Based on a graph-generation approach, the system detects events with the use of a rich feature set built via dependency parsing. The TEES system has achieved record performance in several of the shared tasks of its domain, and continues to be used in a variety of biomedical text mining tasks. Results: The TEES system was quickly adapted to the BioNLP'13 Shared Task in order to provide a public baseline for derived systems. An automated approach was developed for learning the underlying annotation rules of event type, allowing immediate adaptation to the various subtasks, and leading to a first place in four out of eight tasks. The system for the automated learning of annotation rules is further enhanced in this paper to the point of requiring no manual adaptation to any of the BioNLP'13 tasks. Further, the scikit-learn machine learning library is integrated into the system, bringing a wide variety of machine learning methods usable with TEES in addition to the default SVM. A scikit-learn ensemble method is also used to analyze the importance of the features in the TEES feature sets. Conclusions: The TEES system was introduced for the BioNLP'09 Shared Task and has since then demonstrated good performance in several other shared tasks. By applying the current TEES 2.2 system to multiple corpora from these past shared tasks, an overarching analysis of the most promising methods and possible pitfalls in the evolving field of biomedical event extraction is presented. PMID:26551925
TEES 2.2: Biomedical Event Extraction for Diverse Corpora.
Björne, Jari; Salakoski, Tapio
2015-01-01
The Turku Event Extraction System (TEES) is a text mining program developed for the extraction of events, complex biomedical relationships, from scientific literature. Based on a graph-generation approach, the system detects events with the use of a rich feature set built via dependency parsing. The TEES system has achieved record performance in several of the shared tasks of its domain, and continues to be used in a variety of biomedical text mining tasks. The TEES system was quickly adapted to the BioNLP'13 Shared Task in order to provide a public baseline for derived systems. An automated approach was developed for learning the underlying annotation rules of event type, allowing immediate adaptation to the various subtasks, and leading to a first place in four out of eight tasks. The system for the automated learning of annotation rules is further enhanced in this paper to the point of requiring no manual adaptation to any of the BioNLP'13 tasks. Further, the scikit-learn machine learning library is integrated into the system, bringing a wide variety of machine learning methods usable with TEES in addition to the default SVM. A scikit-learn ensemble method is also used to analyze the importance of the features in the TEES feature sets. The TEES system was introduced for the BioNLP'09 Shared Task and has since then demonstrated good performance in several other shared tasks. By applying the current TEES 2.2 system to multiple corpora from these past shared tasks, an overarching analysis of the most promising methods and possible pitfalls in the evolving field of biomedical event extraction is presented.
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.
Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.
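The superposition of forecast errors and ramps described above can be illustrated with a small Monte Carlo sketch. The error distributions, magnitudes, and reserve limit below are assumptions for illustration, not BPA data: a tail event is taken to be a combined imbalance beyond a fixed reserve.

```python
import random

def tail_event_probability(n=100_000, reserve=400.0, seed=1):
    """Estimate the probability that superposed load/wind forecast
    errors and a ramp shortfall exceed a reserve limit (all values in
    MW; the distributions here are hypothetical, for illustration)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        load_err = rng.gauss(0.0, 100.0)   # load forecast error
        wind_err = rng.gauss(0.0, 120.0)   # wind forecast error
        ramp = rng.gauss(0.0, 80.0)        # generation ramp shortfall
        if abs(load_err + wind_err + ramp) > reserve:
            hits += 1
    return hits / n
```

With these assumed parameters the combined standard deviation is about 175 MW, so a 400 MW reserve corresponds to a roughly 2% tail probability; the point of the sketch is only that independently tolerable errors become a tail event when superposed.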
Accessing FMS Functionality: The Impact of Design on Learning
NASA Technical Reports Server (NTRS)
Fennell, Karl; Sherry, Lance; Roberts, Ralph, Jr.
2004-01-01
In modern commercial and military aircraft, the Flight Management System (FMS) lies at the heart of the functionality of the airplane. The nature of the FMS has also caused great difficulties in learning and accessing this functionality. This study examines actual Air Force pilots who were qualified on the newly introduced advanced FMS and shows that the design of the system itself is a primary source of difficulty in learning the system. Twenty representative tasks were selected which the pilots could be expected to accomplish on an actual flight. These tasks were analyzed using the RAFIV stage model (Sherry, Polson, et al. 2002). This analysis demonstrates that a great burden is placed on remembering complex reformulations of the task-to-function mapping. 65% of the tasks required retaining one access step in memory to accomplish the task, 20% required two memorized access steps, and 15% required zero memorized access steps. The probability that a participant would make an access error on the tasks was: two memorized access steps - 74%, one memorized access step - 13%, and zero memorized access steps - 6%. Other factors were analyzed as well, including experience with the system and frequency of use. This completes the picture of a system in which many memorized steps cause difficulty, especially when pilots are trying to find where to access the correct function.
TEJAS - TELEROBOTICS/EVA JOINT ANALYSIS SYSTEM VERSION 1.0
NASA Technical Reports Server (NTRS)
Drews, M. L.
1994-01-01
The primary objective of space telerobotics as a research discipline is the augmentation and/or support of extravehicular activity (EVA) with telerobotic activity; this allows increased emplacement of on-orbit assets while providing for their "in situ" management. Development of the requisite telerobot work system requires a well-understood correspondence between EVA and telerobotics that to date has been only partially established. The Telerobotics/EVA Joint Analysis System (TEJAS) hypermedia information system uses object-oriented programming to bridge the gap between crew-EVA and telerobotics activities. TEJAS Version 1.0 contains twenty HyperCard stacks that use a visual, customizable interface of icon buttons, pop-up menus, and relational commands to store, link, and standardize related information about the primitives, technologies, tasks, assumptions, and open issues involved in space telerobot or crew EVA tasks. These stacks are meant to be interactive and can be used with any database system running on a Macintosh, including spreadsheets, relational databases, word-processed documents, and hypermedia utilities. The software provides a means for managing volumes of data and for communicating complex ideas, relationships, and processes inherent to task planning. The stack system contains 3MB of data and utilities to aid referencing, discussion, communication, and analysis within the EVA and telerobotics communities. The six baseline analysis stacks (EVATasks, EVAAssume, EVAIssues, TeleTasks, TeleAssume, and TeleIssues) work interactively to manage and relate basic information which you enter about the crew-EVA and telerobot tasks you wish to analyze in depth. Analysis stacks draw on information in the Reference stacks as part of a rapid point-and-click utility for building scripts of specific task primitives for any EVA or telerobotics task.
Any or all of these stacks can be completely incorporated within other hypermedia applications, or they can be referenced as is, without requiring data to be transferred into any other database. TEJAS is simple to use and requires no formal training. Some knowledge of HyperCard is helpful, but not essential. All Help cards printed in the TEJAS User's Guide are part of the TEJAS Help Stack and are available from a pop-up menu any time you are using TEJAS. Specific stacks created in TEJAS can be exchanged between groups, divisions, companies, or centers for complete communication of fundamental information that forms the basis for further analyses. TEJAS runs on any Apple Macintosh personal computer with at least one megabyte of RAM, a hard disk, and HyperCard 1.21, or later version. TEJAS is a copyrighted work with all copyright vested in NASA. HyperCard and Macintosh are registered trademarks of Apple Computer, Inc.
Beuscart-Zéphir, Marie-Catherine; Pelayo, Sylvia; Bernonville, Stéphanie
2010-04-01
In the approach described in this paper, the implementation of such a complex IT solution is considered a major redesign of the work system. The paper describes the Human Factors (HF) tasks embedded in the project lifecycle: (1) analysis and modelling of the current work system and usability assessment of the medication CPOE solution; (2) HF recommendations for work re-design and usability recommendations for IT system re-engineering, both aiming at a safer and more efficient work situation. Standard ethnographic methods were used to support the analysis of the current work system and work situations, coupled with cognitive task analysis methods and documents review. Usability inspection (heuristic evaluation) and both in-lab (simulated tasks) and on-site (real tasks) usability tests were performed for the evaluation of the CPOE candidate. Adapted software engineering models were used in combination with usual textual descriptions, task models and mock-ups to support the recommendations for work and product re-design. The analysis of the work situations identified different work organisations and procedures across the hospital's departments. The most important differences concerned the doctor-nurse communications and cooperation modes and the procedures for preparing and administering the medications. The assessment of the medication CPOE functions uncovered a number of usability problems, including severe ones leading to errors that were impossible to detect or to catch. Models of the actual and possible distribution of tasks and roles were used to support decision making in the work design process. The results of the usability assessment were translated into requirements to support the necessary re-engineering of the IT application.
The HFE approach to medication CPOE efficiently identifies and distinguishes currently unsafe or uncomfortable work situations that could obviously benefit from an IT solution from other work situations incorporating efficient work procedures that might be impaired by the implementation of the CPOE. In this context, a careful redesign of the work situation and of the entire work system is necessary to actually benefit from the installation of the product in terms of patient safety and human performances. In parallel, a usability assessment of the product to be implemented is mandatory to identify potentially dangerous usability flaws and to fix them before the installation. (c) 2009 Elsevier Ireland Ltd. All rights reserved.
Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis
Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E.
2016-01-01
Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant, auditory distractors are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity of simultaneous tones, we used a similar task (n = 28) to determine if high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distractors. However, because the meta-analysis was based on small studies and because of the risk for publication biases, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently from their effects on the MMN. PMID:26741815
Militello, L G
1998-01-01
The growing role of information technology in our society has changed the very nature of many of the tasks that workers are called on to perform. Technology has resulted in a dramatic reduction in the number of proceduralized, rote tasks that workers must face. The impact of technology on many tasks and functions has been to greatly increase demands on the cognitive skills of workers. More procedural or predictable tasks are now handled by smart machines, while workers are responsible for tasks that require inference, diagnosis, judgment, and decision making. The increase in the cognitive demands placed on workers and the redistribution of tasks have created a need for a better understanding of the cognitive components of many tasks. This need has been recognized by many in the health care domain, including the U.S. Food and Drug Administration (FDA). Recent FDA regulations encourage the use of human factors in the development of medical devices, instruments, and systems. One promising set of methods for filling this need is cognitive task analysis.
Passive wireless sensor systems can recognize activities of daily living.
Urwyler, Prabitha; Stucki, Reto; Muri, Rene; Mosimann, Urs P; Nef, Tobias
2015-08-01
The ability to determine what activity of daily living a person performs is of interest in many application domains. It is possible to determine the physical and cognitive capabilities of the elderly by inferring what activities they perform in their houses. Our primary aim was to establish a proof of concept that a wireless sensor system can monitor and record physical activity and that these data can be modeled to predict activities of daily living. The secondary aim was to determine the optimal placement of the sensor boxes for detecting activities in a room. A wireless sensor system was set up in a laboratory kitchen. The ten healthy participants were requested to make tea following a defined sequence of tasks. Data were collected from the eight wireless sensor boxes placed in specific places in the test kitchen and analyzed to detect the sequences of tasks performed by the participants. These sequences of tasks were trained and tested using a Markov model. Data analysis focused on the reliability of the system and the integrity of the collected data. The sequences of tasks were successfully recognized for all subjects, and the averaged data patterns of task sequences between the subjects had a high correlation. Analysis of the data collected indicates that sensors placed in different locations are capable of recognizing activities, with the movement detection sensor contributing the most to detection of tasks. The central top of the room with no obstruction of view was considered to be the best location to record data for activity detection. Wireless sensor systems show much promise as easily deployable tools to monitor and recognize activities of daily living.
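The Markov-model training and testing described above can be sketched in a few lines: estimate transition probabilities between tasks from observed sequences, then score a new sequence against them. The task names are hypothetical, and a real system would add smoothing and sensor-emission modeling.

```python
from collections import defaultdict

def train_transitions(sequences):
    """Estimate first-order Markov transition probabilities from
    observed task sequences (e.g., steps of making tea)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def sequence_probability(seq, trans):
    """Probability of a task sequence under the trained chain;
    unseen transitions score zero (no smoothing in this sketch)."""
    p = 1.0
    for a, b in zip(seq, seq[1:]):
        p *= trans.get(a, {}).get(b, 0.0)
    return p
```

Training on repeated tea-making sequences and scoring a candidate sequence then gives a simple recognizer: the defined sequence scores high, while out-of-order task sequences score zero.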
A Graphical User Interface for the Low Cost Combat Direction System
1991-09-16
the same tasks. These shipboard tasks, which include contact management, moving geometry calculations, intelligence compilation, area plotting and...Display Defaults Analysis This category covers a wide range of required data input and system configuration issues. To keep the screen display manageable ...parts or dialog boxes. The implementation of an Ada application using STARS is quite straightforward, although knowledge of X Protocol primitives is
Strayer, David L; Cooper, Joel M; Turrill, Jonna; Coleman, James R; Hopman, Rachel J
2017-06-01
The goal of this research was to examine the impact of voice-based interactions using 3 different intelligent personal assistants (Apple's Siri, Google's Google Now for Android phones, and Microsoft's Cortana) on the cognitive workload of the driver. In 2 experiments using an instrumented vehicle on suburban roadways, we measured the cognitive workload of drivers when they used the voice-based features of each smartphone to place a call, select music, or send text messages. Cognitive workload was derived from primary task performance through video analysis, secondary-task performance using the Detection Response Task (DRT), and subjective mental workload. We found that workload was significantly higher than that measured in the single-task drive. There were also systematic differences between the smartphones: The Google system placed lower cognitive demands on the driver than the Apple and Microsoft systems, which did not differ. Video analysis revealed that the difference in mental workload between the smartphones was associated with the number of system errors, the time to complete an action, and the complexity and intuitiveness of the devices. Finally, surprisingly high levels of cognitive workload were observed when drivers were interacting with the devices: "on-task" workload measures did not systematically differ from that associated with a mentally demanding Operation Span (OSPAN) task. The analysis also found residual costs associated with using each of the smartphones that took a significant time to dissipate. The data suggest that caution is warranted in the use of smartphone voice-based technology in the vehicle because of the high levels of cognitive workload associated with these interactions. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Task planning with uncertainty for robotic systems. Thesis
NASA Technical Reports Server (NTRS)
Cao, Tiehua
1993-01-01
In a practical robotic system, it is important to represent and plan sequences of operations and to be able to choose an efficient sequence from them for a specific task. During the generation and execution of task plans, different kinds of uncertainty may occur, and erroneous states need to be handled to ensure the efficiency and reliability of the system. An approach to task representation, planning, and error recovery for robotic systems is demonstrated. Our approach to task planning is based on an AND/OR net representation, which is then mapped to a Petri net representation of all feasible geometric states and associated feasibility criteria for net transitions. Task decomposition of robotic assembly plans based on this representation is performed on the Petri net for robotic assembly tasks, and the inheritance of the properties of liveness, safeness, and reversibility at all levels of decomposition is explored. This approach provides a framework for robust execution of tasks through the properties of traceability and viability. Uncertainty in robotic systems is modeled by local fuzzy variables, fuzzy marking variables, and global fuzzy variables, which are incorporated in fuzzy Petri nets. Analysis of properties and reasoning about uncertainty are investigated using fuzzy reasoning structures built into the net. Two applications of fuzzy Petri nets, robot task sequence planning and sensor-based error recovery, are explored. In the first application, the search space for feasible and complete task sequences with correct precedence relationships is reduced via the use of global fuzzy variables in reasoning about subgoals. In the second application, sensory verification operations are modeled by mutually exclusive transitions to reason about local and global fuzzy variables on-line and automatically select a retry or an alternative error recovery sequence when errors occur.
Task sequencing and task execution with error recovery capability for one and multiple soft components in robotic systems are investigated.
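A minimal sketch of the fuzzy-marking idea follows. It assumes a simple min/threshold semantics (a common fuzzy AND) chosen for illustration; the thesis's actual fuzzy Petri net formulation, with its local and global fuzzy variables, is richer than this.

```python
def fire(marking, transition, threshold=0.5):
    """Fire one fuzzy Petri net transition, if enabled.

    marking: dict mapping place name -> fuzzy marking value in [0, 1].
    transition: (input_places, output_places) pair.
    A transition is enabled when every input place's marking exceeds
    the threshold; the value propagated to outputs is the minimum of
    the input markings (fuzzy AND), and inputs are consumed.
    """
    inputs, outputs = transition
    if all(marking.get(p, 0.0) > threshold for p in inputs):
        value = min(marking[p] for p in inputs)
        new = dict(marking)
        for p in inputs:
            new[p] = 0.0          # tokens consumed
        for p in outputs:
            new[p] = max(new.get(p, 0.0), value)  # fuzzy OR on merge
        return new
    return marking  # not enabled; marking unchanged
```

In an error-recovery setting, mutually exclusive transitions sharing the same input places would model the choice between a retry and an alternative recovery sequence: whichever transition's condition holds fires, and the other stays disabled.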
Space station integrated propulsion and fluid systems study
NASA Technical Reports Server (NTRS)
Bicknell, B.; Wilson, S.; Dennis, M.; Shepard, D.; Rossier, R.
1988-01-01
The program study was performed in two tasks: Task 1 addressed propulsion systems, and Task 2 addressed all fluid systems associated with the Space Station elements, which also included propulsion and pressurant systems. Program results indicated a substantial reduction in life cycle costs through integrating the oxygen/hydrogen propulsion system with the environmental control and life support system, and through supplying nitrogen in a cryogenic gaseous supercritical or subcritical liquid state. A water sensitivity analysis showed that increasing the food water content would substantially increase the amount of water available for propulsion use, and that in all cases the implementation of the BOSCH CO2 reduction process would reduce overall life cycle costs to the station and minimize risk. An investigation of fluid systems and associated requirements revealed a delicate balance between the individual propulsion and fluid systems across work packages and a strong interdependence between all other fluid systems.
NASA Technical Reports Server (NTRS)
Staveland, Lowell
1994-01-01
This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands that designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and as part of a complete human operator simulation, and a brief introduction to the TLM software design.
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
Analysis of Feedback in after Action Reviews
1987-06-01
Contents: Introduction; A Perspective on Feedback; Overview of Current Research...part of their training program. The AAR is in marked contrast to the critique method of feedback which is often used in military training. The AAR...feedback is task-inherent feedback. Task-inherent feedback refers to human-machine interacting systems, e.g., computers, where in a visual tracking task
Task analysis of autonomous on-road driving
NASA Astrophysics Data System (ADS)
Barbera, Anthony J.; Horst, John A.; Schlenoff, Craig I.; Aha, David W.
2004-12-01
The Real-time Control System (RCS) Methodology has evolved over a number of years as a technique to capture task knowledge and organize it into a framework conducive to implementation in computer control systems. The fundamental premise of this methodology is that the present state of the task activities sets the context that identifies the requirements for all of the support processing. In particular, the task context at any time determines what is to be sensed in the world, what world model states are to be evaluated, which situations are to be analyzed, what plans should be invoked, and which behavior generation knowledge is to be accessed. This methodology concentrates on the task behaviors explored through scenario examples to define a task decomposition tree that clearly represents the branching of tasks into layers of simpler and simpler subtask activities. There is a named branching condition/situation identified for every fork of this task tree. These become the input conditions of the if-then rules of the knowledge set that define how the task is to respond to input state changes. Detailed analysis of each branching condition/situation is used to identify antecedent world states and these, in turn, are further analyzed to identify all of the entities, objects, and attributes that have to be sensed to determine if any of these world states exist. This paper explores the use of this 4D/RCS methodology in some detail for the particular task of autonomous on-road driving, which work was funded under the Defense Advanced Research Project Agency (DARPA) Mobile Autonomous Robot Software (MARS) effort (Doug Gage, Program Manager).
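The if-then rules on branching conditions described above can be sketched compactly: the current task names a rule set, and the first condition found true in the sensed world state selects the next subtask. The task and condition names below are invented for illustration and are not taken from the 4D/RCS specification.

```python
# Hypothetical rule table for one node of a task decomposition tree:
# each rule maps a named branching condition to the subtask to invoke.
RULES = {
    "FollowLane": [
        ("obstacle_ahead", "PassVehicle"),
        ("intersection_near", "NegotiateIntersection"),
        ("clear_road", "FollowLane"),
    ],
}

def next_subtask(task, world_state):
    """Evaluate the current task's if-then rules against the sensed
    world state; the first matching condition selects the subtask.
    If nothing matches, continue the current task."""
    for condition, subtask in RULES.get(task, []):
        if world_state.get(condition):
            return subtask
    return task
```

The methodology's key point survives even in this toy form: the conditions appearing in the rules determine exactly which world states must be evaluated, and hence which entities and attributes must be sensed.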
NASA Technical Reports Server (NTRS)
Wolfgang, R.; Natarajan, T.; Day, J.
1987-01-01
A feedback control system, called an auxiliary array switch, was designed to connect or disconnect auxiliary solar panel segments from a spacecraft electrical bus to meet fluctuating demand for power. A simulation of the control system was used to carry out a number of design and analysis tasks that could not economically be performed with a breadboard of the hardware. These tasks included: (1) the diagnosis of a stability problem, (2) identification of parameters to which the performance of the control system was particularly sensitive, (3) verification that the response of the control system to anticipated fluctuations in the electrical load of the spacecraft was satisfactory, and (4) specification of limitations on the frequency and amplitude of the load fluctuations.
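A rough sketch of the connect/disconnect behavior described above follows. The deadband around the demanded power and the per-segment power values are hypothetical; the actual control law and parameters are not given in the abstract.

```python
def switch_segments(demand, segment_power, connected, deadband=0.5):
    """Connect or disconnect auxiliary solar-array segments so that
    supply tracks demand within a deadband (all values hypothetical).

    demand: bus power demand; segment_power: power of each segment;
    connected: set of indices of currently connected segments.
    The deadband prevents chattering around the demand level.
    """
    connected = set(connected)
    supply = sum(segment_power[i] for i in connected)
    if supply < demand - deadband:
        # Under-supplied: connect segments until within the deadband.
        for i, p in enumerate(segment_power):
            if i not in connected:
                connected.add(i)
                supply += p
                if supply >= demand - deadband:
                    break
    elif supply > demand + deadband:
        # Over-supplied: shed segments while staying above the deadband.
        for i in sorted(connected):
            p = segment_power[i]
            if supply - p >= demand - deadband:
                connected.remove(i)
                supply -= p
    return connected
```

A simulation built around a rule like this one could exercise exactly the tasks listed in the abstract: probing stability (e.g., chattering when the deadband is too small) and sensitivity to load-fluctuation frequency and amplitude.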
Dzyubachyk, Oleh; Essers, Jeroen; van Cappellen, Wiggert A; Baldeyron, Céline; Inagaki, Akiko; Niessen, Wiro J; Meijering, Erik
2010-10-01
Complete, accurate and reproducible analysis of intracellular foci from fluorescence microscopy image sequences of live cells requires full automation of all processing steps involved: cell segmentation and tracking followed by foci segmentation and pattern analysis. Integrated systems for this purpose are lacking. Extending our previous work in cell segmentation and tracking, we developed a new system for performing fully automated analysis of fluorescent foci in single cells. The system was validated by applying it to two common tasks: intracellular foci counting (in DNA damage repair experiments) and cell-phase identification based on foci pattern analysis (in DNA replication experiments). Experimental results show that the system performs comparably to expert human observers. Thus, it may replace tedious manual analyses for the considered tasks, and enables high-content screening. The described system was implemented in MATLAB (The MathWorks, Inc., USA) and compiled to run within the MATLAB environment. The routines together with four sample datasets are available at http://celmia.bigr.nl/. The software is planned for public release, free of charge for non-commercial use, after publication of this article.
Composable Analytic Systems for next-generation intelligence analysis
NASA Astrophysics Data System (ADS)
DiBona, Phil; Llinas, James; Barry, Kevin
2015-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and the capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-08-30
Distributed computing has developed tremendously since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates the allocation, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data of each task have also drawn interest, and a detailed analysis report has been produced. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
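The abstract does not spell out the TPR equations. As a rough sketch, a two-phase regression can be read as fitting two linear segments to a task's progress history and extrapolating the second segment to completion. The breakpoint search and the progress-versus-time feature choice below are assumptions for illustration, not the authors' exact method:

```python
import numpy as np

def two_phase_fit(t, progress):
    """Fit progress(t) with two linear segments joined at the breakpoint
    that minimizes total squared error. Returns (breakpoint_index,
    first-phase coefficients, second-phase coefficients)."""
    best = None
    for k in range(2, len(t) - 2):  # candidate breakpoints, >= 2 points per side
        p1 = np.polyfit(t[:k], progress[:k], 1)
        p2 = np.polyfit(t[k:], progress[k:], 1)
        sse = (np.sum((np.polyval(p1, t[:k]) - progress[:k]) ** 2)
               + np.sum((np.polyval(p2, t[k:]) - progress[k:]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, k, p1, p2)
    _, k, p1, p2 = best
    return k, p1, p2

def predict_finish_time(t, progress):
    """Extrapolate the second-phase line to progress == 1.0."""
    _, _, (slope, intercept) = two_phase_fit(t, progress)
    return (1.0 - intercept) / slope

# Toy task that starts slowly (setup phase), then speeds up (steady phase)
t = np.arange(10, dtype=float)
progress = np.concatenate([0.02 * t[:5], 0.1 + 0.08 * (t[5:] - 5)])
eta = predict_finish_time(t, progress)  # extrapolated finishing time
```

A speculative-execution scheduler could compare such per-task finish estimates against the job's other tasks to flag stragglers early.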
An overview of the BioCreative 2012 Workshop Track III: interactive text mining task.
Arighi, Cecilia N; Carterette, Ben; Cohen, K Bretonnel; Krallinger, Martin; Wilbur, W John; Fey, Petra; Dodson, Robert; Cooper, Laurel; Van Slyke, Ceri E; Dahdul, Wasila; Mabee, Paula; Li, Donghui; Harris, Bethany; Gillespie, Marc; Jimenez, Silvia; Roberts, Phoebe; Matthews, Lisa; Becker, Kevin; Drabkin, Harold; Bello, Susan; Licata, Luana; Chatr-aryamontri, Andrew; Schaeffer, Mary L; Park, Julie; Haendel, Melissa; Van Auken, Kimberly; Li, Yuling; Chan, Juancarlos; Muller, Hans-Michael; Cui, Hong; Balhoff, James P; Chi-Yang Wu, Johnny; Lu, Zhiyong; Wei, Chih-Hsuan; Tudor, Catalina O; Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar; Cejuela, Juan Miguel; Dubey, Pratibha; Wu, Cathy
2013-01-01
In many databases, biocuration primarily involves literature curation: retrieving relevant articles, extracting information that will translate into annotations, and identifying new incoming literature. As the volume of biological literature increases, the use of text mining to assist in biocuration becomes increasingly relevant. A number of groups have developed tools for text mining from a computer science/linguistics perspective, and there are many initiatives to curate some aspect of biology from the literature. Some biocuration efforts already make use of a text mining tool, but there have not been many broad-based systematic efforts to study which aspects of a text mining tool contribute to its usefulness for a curation task. Here, we report on an effort to bring together text mining tool developers and database biocurators to test the utility and usability of tools. Six text mining systems presenting diverse biocuration tasks participated in a formal evaluation, and appropriate biocurators were recruited for testing. The performance results from this evaluation indicate that some of the systems were able to improve efficiency of curation by speeding up the curation task significantly (∼1.7- to 2.5-fold) over manual curation. In addition, some of the systems were able to improve annotation accuracy when compared with the performance on the manually curated set. In terms of inter-annotator agreement, the factors that contributed to significant differences for some of the systems included the expertise of the biocurator on the given curation task, the inherent difficulty of the curation task, and attention to annotation guidelines. After the task, annotators were asked to complete a survey to help identify strengths and weaknesses of the various systems.
The analysis of this survey highlights how important task completion is to the biocurators' overall experience of a system, regardless of the system's high score on design, learnability and usability. In addition, strategies to refine the annotation guidelines and systems documentation, to adapt the tools to the needs and query types the end user might have and to evaluate performance in terms of efficiency, user interface, result export and traditional evaluation metrics have been analyzed during this task. This analysis will help to plan for a more intense study in BioCreative IV.
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1985-01-01
Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on optimal control frequency-domain techniques. These techniques stem from an optimal control approach to a Neal-Smith-like analysis of aircraft attitude dynamics, extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in-depth analysis of the effect of the experimental variables, such as the prefilter, is conducted to gain further insight into the flared landing task for this class of vehicle dynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This Feasibility Analysis covers a wide range of studies and evaluations. The Report is divided into five parts. Section 1 contains all material relating to the Institutional Assessment, including consideration of the requirements and position of the Potomac Electric Co. as they relate to cogeneration at Georgetown in parallel with the utility (Task 1). Sections 2 through 7 contain all technical information relating to the Alternative Subsystems Analysis (Task 4). This includes the energy demand profiles upon which the evaluations were based (Task 3). It further includes the results of the Life-Cycle-Cost Analyses (Task 5), which are developed in detail in the Appendix for evaluation in the Technical Report. Also included is the material relating to Incremental Savings and Optimization (Task 6) and the Conceptual Design for candidate alternate subsystems (Task 7). Section 8 contains all material relating to the Environmental Impact Assessment (Task 2). The Appendix contains supplementary material including the budget cost estimates used in the life-cycle-cost analyses, the basic assumptions upon which the life-cycle analyses were developed, and the detailed life-cycle-cost analysis for each subsystem considered in detail.
Matsui, Takemi; Shinba, Toshikazu; Sun, Guanghao
2018-02-01
Although 12.6% of major depressive disorder (MDD) patients have suicidal intent, it has been reported that 43% of patients do not consult their doctors about MDD; automated MDD screening is therefore eagerly anticipated. Recently, to achieve automated screening of MDD, biomarkers such as multiplex DNA methylation profiles and physiological methods using near-infrared spectroscopy (NIRS) have been studied; however, they require inspection with a 96-well DNA ELISA kit after blood sampling, or incur significant cost. Using a single-lead electrocardiograph (ECG), we developed a novel high-precision MDD screening system based on transient autonomic responses induced by dual mental tasks. The system is composed of a single-lead ECG monitor, an analogue-to-digital (AD) converter, and a personal computer running a measurement and analysis program written in the LabVIEW programming language. The system discriminates MDD patients from normal subjects using heart rate variability (HRV)-derived transient autonomic responses induced by dual mental tasks, i.e., a verbal fluency task and a random number generation task, via linear discriminant analysis (LDA) adopting HRV-related predictor variables (heart rate (HR), high frequency (HF), low frequency (LF)/HF). The proposed system was tested on 12 MDD patients (32 ± 15 years) under antidepressant treatment from the Shizuoka Saiseikai General Hospital outpatient unit and 30 normal volunteers (37 ± 17 years) from Tokyo Metropolitan University. The proposed system achieved 100% sensitivity and 100% specificity in classifying the 42 examinees into 12 MDD patients and 30 normal subjects. The proposed system appears promising for future HRV-based high-precision, low-cost screening for MDD using only a single-lead ECG.
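The abstract names the classifier (LDA) and the predictors (HR, HF, LF/HF) but not the implementation. A minimal two-class Fisher discriminant on synthetic task-response features can be sketched as follows; the feature values, group means, and group sizes below are invented for illustration and do not reflect the study's data:

```python
import numpy as np

def fit_lda(X0, X1):
    """Two-class Fisher discriminant: project onto w = Sw^-1 (mu1 - mu0)
    and threshold at the midpoint of the projected class means."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
    w = np.linalg.solve(Sw, mu1 - mu0)
    c = w @ (mu0 + mu1) / 2.0
    return w, c

def classify(w, c, X):
    """1 = class-1-like (here: 'MDD-like'), 0 = control-like."""
    return (X @ w > c).astype(int)

rng = np.random.default_rng(0)
# Hypothetical feature vectors per subject: [delta HR, delta HF, delta LF/HF]
# measured during the dual mental tasks (means/spreads are made up)
controls = rng.normal([8.0, -0.2, 0.5], 1.0, size=(30, 3))
patients = rng.normal([2.0, 0.1, 1.5], 1.0, size=(12, 3))
w, c = fit_lda(controls, patients)
acc = (np.mean(classify(w, c, patients) == 1)
       + np.mean(classify(w, c, controls) == 0)) / 2  # balanced accuracy
```

With well-separated synthetic groups the discriminant classifies nearly all subjects correctly; real HRV features would of course overlap far more than this toy data.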
Research on schedulers for astronomical observatories
NASA Astrophysics Data System (ADS)
Colome, Josep; Colomer, Pau; Guàrdia, Josep; Ribas, Ignasi; Campreciós, Jordi; Coiffard, Thierry; Gesa, Lluis; Martínez, Francesc; Rodler, Florian
2012-09-01
The main task of a scheduler applied to astronomical observatories is the time optimization of the facility and the maximization of the scientific return. Scheduling of astronomical observations is an example of the classical task allocation problem known as the job-shop problem (JSP), where N ideal tasks are assigned to M identical resources while minimizing the total execution time. A problem of higher complexity, called the Flexible JSP (FJSP), arises when the tasks can be executed by different resources, i.e., by different telescopes; it focuses on determining a routing policy (i.e., which machine to assign to each operation) in addition to the traditional scheduling decisions (i.e., determining the starting time of each operation). In most cases there is no single best approach to the planning problem and, therefore, various mathematical algorithms (Genetic Algorithms, Ant Colony Optimization algorithms, Multi-Objective Evolutionary algorithms, etc.) are usually considered to adapt the application to the system configuration and task execution constraints. The scheduling time-cycle is also an important ingredient in determining the best approach. A short-term scheduler, for instance, has to find a good solution with minimum computation time, providing the system with the capability to adapt the selected task to varying execution constraints (i.e., environmental conditions). We present in this contribution an analysis of the task allocation problem and the solutions currently in use at different astronomical facilities. We also describe the schedulers for three different projects (CTA, CARMENES and TJO), where the conclusions of this analysis are applied to develop a suitable routine.
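As a concrete illustration of the routing decision that distinguishes FJSP from plain JSP, a minimal greedy heuristic can assign each observation task to the allowed telescope that frees up earliest. This is a toy dispatch rule for illustration only, not any facility's actual scheduler:

```python
def greedy_schedule(tasks, n_telescopes):
    """Assign each task (duration, allowed_telescopes) to the allowed
    telescope that becomes free earliest (routing), starting it as soon
    as that telescope is idle (sequencing). Returns (makespan, plan)."""
    free_at = [0.0] * n_telescopes
    plan = []
    for duration, allowed in tasks:
        tel = min(allowed, key=lambda i: free_at[i])  # FJSP routing decision
        start = free_at[tel]                          # JSP sequencing decision
        free_at[tel] = start + duration
        plan.append((tel, start))
    return max(free_at), plan

# Toy instance: (duration in hours, telescopes able to execute the task)
tasks = [(3.0, [0, 1]), (2.0, [0]), (4.0, [1]), (1.0, [0, 1])]
makespan, plan = greedy_schedule(tasks, n_telescopes=2)
```

A real short-term scheduler would additionally re-evaluate this routing as environmental conditions change, which is where the metaheuristics cited above come in.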
Center for Efficiency in Sustainable Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham, Martin
The main goal of the Center for Efficiency in Sustainable Energy Systems is to produce a methodology that evaluates a variety of energy systems. Task I. Improved Energy Efficiency for Industrial Processes: This task, completed in partnership with area manufacturers, consists of: (1) analyzing the operation of complex manufacturing facilities to provide flexibilities that allow them to improve active-mode power efficiency, lower standby-mode power consumption, and use low-cost energy resources to control energy costs in meeting their economic incentives; (2) identifying devices for the efficient transformation of instantaneous or continuous power to different devices and sections of industrial plants; and (3) using these manufacturing sites to demonstrate and validate general principles of power management. Task II. Analysis of a solid oxide fuel cell operating on landfill gas: This task consists of: (1) analysis of a typical landfill gas; (2) establishment of a comprehensive design of the fuel cell system (including the SOFC stack and BOP), including durability analysis; (3) development of suitable reforming methods and catalysts that are tailored to the specific SOFC system concept; and (4) SOFC stack fabrication with testing to demonstrate the salient operational characteristics of the stack, including an analysis of the overall energy conversion efficiency of the system. Task III.
Demonstration of an urban wind turbine system: This task consists of (1) design and construction of two side-by-side wind turbine systems on the YSU campus, integrated through power control systems with grid power; (2) preliminary testing of aerodynamic control effectors (provided by a small business partner) to demonstrate improved power control, and evaluation of the system performance, including economic estimates of viability in an urban environment; and (3) computational analysis of the wind turbine system as an enabling activity for development of smart rotor blades that contain integrated sensor/actuator/controller modules to enhance energy capture and reduce aerodynamic loading and noise by way of virtual aerodynamic shaping. Accomplishments: Task I. Improved Energy Efficiency for Industrial Processes: We organized an energy management training session held on February 22, 2011, which was advertised through a regional manufacturing association to provide wide-ranging notification. Over two dozen companies were represented at the seminar, ranging from heavy manufacturing businesses with $5,000,000 per year energy expenses to small, light manufacturing facilities. Task II. Landfill Fuel Cell Power Generation: Solid Oxide Fuel Cells (SOFCs) were constructed and evaluated as a means of obtaining electrical energy from landfill gas. Analysis of landfill gas: Attempts at collecting gas samples at the landfill and evaluating them on campus were still unsuccessful; even a Teflon® sample bag would lose its H2S content. Evaluation of gas clean-up: We consider this a confirmation of the CO2 effect on the solubility of H2S in water, making much less sulfide available for the photocatalyst. It also means that another method should be employed to clean up landfill gas. Nonetheless, the level of impurities in landfill gas was reduced sufficiently to allow successful operation of the test fuel cell. Comparison to a PEM fuel cell system:
If a PEMFC were to be operated with landfill gas as the fuel, the gas would have to be treated for sulfur removal and then processed in a reformer large enough to drive the equilibrium far toward the products, so that negligible CO would flow into the fuel cell. Analysis of a fuel cell running on landfill gas: Using a Gow-Mac gas chromatograph with a thermal conductivity detector, unambiguous determination of CO can be made, at least as a primary constituent. Task III. Plasma-Controlled Turbine Blades: Wind Turbine Selection: After carefully reviewing the various models available on the market, the team selected the ARE 110 (2.5 kW). The ARE 110 provides a very long life with little maintenance due to its relatively low rotational speed (low RPM). The turbine's large swept area (10.2 m²/110 sq ft), high-efficiency blades, purpose-built alternator, and optimized power electronics ensure maximum energy capture from a wide range of wind speeds. Two wind turbines were installed side-by-side at the Melnick Hall site to compare their performance. Evaluate and Optimize Aerodynamically Enhanced Turbine Blades: Due to delays in the installation of the wind turbines, no actual data were obtained within the contract period. At this time, the turbines are installed and operational at YSU with standard blades. We are in contact with Orbital Research and in discussion as to how best the required data can be obtained.
On-line data analysis and monitoring for H1 drift chambers
NASA Astrophysics Data System (ADS)
Düllmann, Dirk
1992-05-01
The on-line monitoring, slow control and calibration of the H1 central jet chamber uses a VME multiprocessor system to perform the analysis and a connected Macintosh computer as a graphical interface for the operator on shift. The tasks of this system are: - analysis of event data, including on-line track search; - on-line calibration from normal events and test-pulse events; - control of the high voltage and monitoring of settings and currents; - monitoring of the temperature, pressure and mixture of the chamber gas. A program package is described which controls the dataflow between data acquisition, the different VME CPUs and the Macintosh. It allows off-line-style programs to be run for the different tasks.
Custers, Eugène J F M
2013-08-01
Recently, human reasoning, problem solving, and decision making have been viewed as products of two separate systems: "System 1," the unconscious, intuitive, or nonanalytic system, and "System 2," the conscious, analytic, or reflective system. This view has penetrated the medical education literature, yet the idea of two independent dichotomous cognitive systems is not entirely without problems. This article outlines the difficulties of this "two-system view" and presents an alternative, developed by K.R. Hammond and colleagues, called cognitive continuum theory (CCT). CCT rests on three key assumptions. First, human reasoning, problem solving, and decision making can be arranged on a cognitive continuum, with pure intuition at one end, pure analysis at the other, and a large middle ground called "quasirationality." Second, the nature and requirements of the cognitive task, as perceived by the person performing the task, determine to a large extent whether a task will be approached more intuitively or more analytically. Third, for optimal task performance, this approach needs to match the cognitive properties and requirements of the task. Finally, the author makes a case that CCT is better able than a two-system view to describe medical problem solving and clinical reasoning and that it provides clear clues for how to organize training in clinical reasoning.
Toward a process-level view of distributed healthcare tasks: Medication management as a case study.
Werner, Nicole E; Malkana, Seema; Gurses, Ayse P; Leff, Bruce; Arbaje, Alicia I
2017-11-01
We aim to highlight the importance of using a process-level view in analyzing distributed healthcare tasks through a case study analysis of medication management (MM). MM during older adults' hospital-to-skilled-home-healthcare (SHHC) transitions is a healthcare process with tasks distributed across people, organizations, and time. MM has typically been studied at the task level, but a process-level view is needed to fully understand and improve MM during transitions. A process-level view allows for a broader investigation of how tasks are distributed throughout the work system through an investigation of interactions and the resultant emergent properties. We studied MM during older adults' hospital-to-SHHC transitions through interviews and observations with 60 older adults, their 33 caregivers, and 79 SHHC providers at 5 sites associated with 3 SHHC agencies. Study findings identified key cross-system characteristics not observable at the task level: (1) identification of emergent properties (e.g., role ambiguity, loosely-coupled teams performing MM) and associated barriers; and (2) examination of barrier propagation across system boundaries. Findings highlight the importance of a process-level view of healthcare delivery occurring across system boundaries. Copyright © 2017 Elsevier Ltd. All rights reserved.
PROS: An IRAF based system for analysis of x ray data
NASA Technical Reports Server (NTRS)
Conroy, M. A.; Deponte, J.; Moran, J. F.; Orszak, J. S.; Roberts, W. P.; Schmidt, D.
1992-01-01
PROS is an IRAF based software package for the reduction and analysis of x-ray data. The use of a standard, portable, integrated environment provides for both multi-frequency and multi-mission analysis. The analysis of x-ray data differs from optical analysis due to the nature of the x-ray data and its acquisition during constantly varying conditions. The scarcity of data, the low signal-to-noise ratio and the large gaps in exposure time make data screening and masking an important part of the analysis. PROS was developed to support the analysis of data from the ROSAT and Einstein missions but many of the tasks have been used on data from other missions. IRAF/PROS provides a complete end-to-end system for x-ray data analysis: (1) a set of tools for importing and exporting data via FITS format -- in particular, IRAF provides a specialized event-list format, QPOE, that is compatible with its IMAGE (2-D array) format; (2) a powerful set of IRAF system capabilities for both temporal and spatial event filtering; (3) full set of imaging and graphics tasks; (4) specialized packages for scientific analysis such as spatial, spectral and timing analysis -- these consist of both general and mission specific tasks; and (5) complete system support including ftp and magnetic tape releases, electronic and conventional mail hotline support, electronic mail distribution of solutions to frequently asked questions and current known bugs. We will discuss the philosophy, architecture and development environment used by PROS to generate a portable, multimission software environment. PROS is available on all platforms that support IRAF, including Sun/Unix, VAX/VMS, HP, and Decstations. It is available on request at no charge.
Dual Arm Work Package performance estimates and telerobot task network simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Blair, L.M.
1997-02-01
This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.
Survey of Human Systems Integration (HSI) Tools for USCG Acquisitions
2009-04-01
an IMPRINT HPM. IMPRINT uses task network modeling to represent human performance. As the name implies, task networks use a flowchart-type format... tools; and built-in tutoring support for beginners. A perceptual/motor layer extending ACT-R's theory of cognition to perception and action is also... chisystems.com B.8 Information and Functional Flow Analysis. Description: In information flow analysis, a flowchart of the information and decisions
Social Insects: A Model System for Network Dynamics
NASA Astrophysics Data System (ADS)
Charbonneau, Daniel; Blonder, Benjamin; Dornhaus, Anna
Social insect colonies (ants, bees, wasps, and termites) show sophisticated collective problem-solving in the face of variable constraints. Individuals exchange information and materials such as food. The resulting network structure and dynamics can inform us about the mechanisms by which the insects achieve particular collective behaviors, and these can be transposed to man-made and social networks. We discuss how network analysis can answer important questions about social insects, such as how effective task allocation or information flow is realized. We put forward the idea that network analysis methods are under-utilized in social insect research, and that they can provide novel ways to view the complexity of collective behavior, particularly if network dynamics are taken into account. To illustrate this, we present an example of a network of tasks performed by ant workers, linked by instances of workers switching from one task to another. We show how temporal network analysis can propose and test new hypotheses on mechanisms of task allocation, and how adding temporal elements to static networks can drastically change results. We discuss the benefits of using social insects as models for complex systems in general. There are multiple opportunities for emergent technologies and analysis methods to facilitate research on social insect networks. The potential for interdisciplinary work could significantly advance diverse fields such as behavioral ecology, computer science, and engineering.
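The task network described above can be sketched concretely: given time-ordered observations of which worker performs which task, directed edges count consecutive task switches, and discarding the time ordering would merge these into undirected co-occurrence counts. The observation format and task names below are assumptions for illustration:

```python
from collections import Counter, defaultdict

def task_switch_network(observations):
    """Build a directed task-switching network from (time, worker, task)
    observations: edge (a, b) counts instances of a worker observed doing
    task a and then, consecutively, task b."""
    by_worker = defaultdict(list)
    for time, worker, task in sorted(observations):  # enforce time order
        seq = by_worker[worker]
        if not seq or seq[-1] != task:  # record only actual switches
            seq.append(task)
    edges = Counter()
    for seq in by_worker.values():
        for a, b in zip(seq, seq[1:]):
            edges[(a, b)] += 1
    return edges

obs = [
    (0, "w1", "forage"), (1, "w1", "brood"), (2, "w1", "forage"),
    (0, "w2", "brood"),  (1, "w2", "forage"),
]
edges = task_switch_network(obs)
```

Here the directed counts are asymmetric (brood-to-forage switches outnumber the reverse), a distinction a static undirected network would erase.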
ROBOSIGHT: Robotic Vision System For Inspection And Manipulation
NASA Astrophysics Data System (ADS)
Trivedi, Mohan M.; Chen, ChuXin; Marapane, Suresh
1989-02-01
Vision is an important sensory modality that can be used for deriving information critical to the proper, efficient, flexible, and safe operation of an intelligent robot. Vision systems are utilized for developing higher level interpretation of the nature of a robotic workspace using images acquired by cameras mounted on a robot. Such information can be useful for tasks such as object recognition, object location, object inspection, obstacle avoidance and navigation. In this paper we describe efforts directed towards developing a vision system useful for performing various robotic inspection and manipulation tasks. The system utilizes gray scale images and can be viewed as a model-based system. It includes general purpose image analysis modules as well as special purpose, task dependent object status recognition modules. Experiments are described to verify the robust performance of the integrated system using a robotic testbed.
NASA Technical Reports Server (NTRS)
Rowell, Lawrence F.; Davis, John S.
1989-01-01
The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.
Crew Office Evaluation of a Precision Lunar Landing System
NASA Technical Reports Server (NTRS)
Major, Laura M.; Duda, Kevin R.; Hirsh, Robert L.
2011-01-01
A representative Human System Interface for a precision lunar landing system, ALHAT, has been developed as a platform for prototype visualization and interaction concepts. This facilitates analysis of crew interaction with advanced sensors and AGNC systems. Human-in-the-loop evaluations with representatives from the Crew Office (i.e. astronauts) and Mission Operations Directorate (MOD) were performed to refine the crew role and information requirements during the final phases of landing. The results include a number of lessons learned from Shuttle that are applicable to the design of a human supervisory landing system and cockpit. Overall, the results provide a first order analysis of the tasks the crew will perform during lunar landing, an architecture for the Human System Interface based on these tasks, as well as details on the information needs to land safely.
Two-dimensional systolic-array architecture for pixel-level vision tasks
NASA Astrophysics Data System (ADS)
Vijverberg, Julien A.; de With, Peter H. N.
2010-05-01
This paper presents ongoing work on the design of a two-dimensional (2D) systolic array for image processing. This component is designed to operate on a multi-processor system-on-chip. In contrast with other 2D systolic-array architectures and many other hardware accelerators, we investigate the applicability of executing multiple tasks in a time-interleaved fashion on the Systolic Array (SA). This leads to a lower external memory bandwidth and better load balancing of the tasks on the different processing tiles. To enable the interleaving of tasks, we add a shadow-state register for fast task switching. To reduce the number of accesses to the external memory, we propose to share the communication assist between consecutive tasks. A preliminary, non-functional version of the SA has been synthesized for an XV4S25 FPGA device and yields a maximum clock frequency of 150 MHz requiring 1,447 slices and 5 memory blocks. Mapping tasks from video content-analysis applications from literature on the SA yields reductions in the execution time of 1-2 orders of magnitude compared to the software implementation. We conclude that the choice for an SA architecture is useful, but a scaled version of the SA featuring less logic with fewer processing and pipeline stages yielding a lower clock frequency, would be sufficient for a video analysis system-on-chip.
Cognitive performance modeling based on general systems performance theory.
Kondraske, George V
2010-01-01
General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).
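The full NCRA procedure is not given in the abstract. Its core idea, estimating the demand a task places on a basic performance resource from the boundary of a capacity-versus-performance scatter, can be sketched as a lower envelope: for each level of complex-task performance, the smallest resource capacity observed to support it. The binning scheme and the toy data below are simplifying assumptions, not the method's actual details:

```python
import numpy as np

def ncra_demand_curve(capacity, performance, n_bins=5):
    """Estimate a resource demand curve as the lower envelope of a
    capacity-vs-performance scatter: within each performance band, the
    smallest capacity observed to support that performance level.
    Returns a list of (mean band performance, minimum capacity) pairs."""
    order = np.argsort(performance)
    cap, perf = capacity[order], performance[order]
    bands = np.array_split(np.arange(len(perf)), n_bins)
    return [(perf[idx].mean(), cap[idx].min()) for idx in bands]

# Hypothetical subjects: complex-task performance scales with a single
# cognitive resource capacity (e.g., information processing speed)
capacity = np.arange(1.0, 11.0)
performance = 2.0 * capacity
demand = ncra_demand_curve(capacity, performance)
```

The resulting curve is a demand profile; summing or integrating such profiles over the resources a task draws on is one way to arrive at a single-number workload measure like the CW described above.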
Particulate and Gaseous Emissions Measurement System (PAGEMS) Project
NASA Technical Reports Server (NTRS)
Kostic, Milivoje
2003-01-01
Professor Kostic will work on the current UEET program's Aerosol and Particulate task. This task will focus on: how to acquire experimental data through LabVIEW software; how to make the data acquisition system more efficient; troubleshooting existing problems in the LabVIEW software; recommending a better system; and improving the existing system to make it more accurate and user friendly. Three different assignments in this project included: Particle-Size Distribution Data Presentation; Error or Uncertainty Analysis of Measurement Results; and Enhancement of the LabVIEW Data Acquisition Program for the GRC PAGEMS Project.
ERIC Educational Resources Information Center
Michigan State Univ., East Lansing. Coll. of Agriculture and Natural Resources Education Inst.
This curriculum guide, developed for use in dental assistant education programs in Michigan, describes a task-based curriculum that can help a teacher to develop a classroom management system where students learn by doing. It is based on task analysis and reflects the skills, knowledge, and attitudes that employers expect entry-level dental…
CMS distributed data analysis with CRAB3
NASA Astrophysics Data System (ADS)
Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.
2015-12-01
The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
ERIC Educational Resources Information Center
Peterson, Craig M.
A system of task analysis and positive reinforcement was used in the vocational training of a 19-year-old trainable retarded youth (MA=6 years). The task of polishing shoe skates was analyzed and programmed into 29 steps and was reinforced with praise and money. The trainee learned the task in 13 sessions (approximately 1 month) and was employed…
A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.
Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao
2018-05-23
The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.
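The framework's central idea, dynamically selecting a scheduling algorithm per task objective, can be caricatured as a dispatch table. The algorithm names and the dispatch rule are illustrative assumptions, not the paper's code:

```python
# Toy sketch: map each IoT task's stated objective to a scheduling policy,
# falling back to a default when the objective is unrecognized.
# Policy names are invented for illustration.

def pick_algorithm(task):
    """Select a scheduling policy from a task's objective field."""
    table = {
        "deadline": "earliest-deadline-first",
        "energy":   "consolidate-on-fewest-hosts",
        "cost":     "cheapest-layer-first",
    }
    return table.get(task.get("objective"), "round-robin")

plan = [pick_algorithm(t) for t in [
    {"objective": "deadline"}, {"objective": "energy"}, {"objective": "other"}]]
```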
Dynamic Task Performance, Cohesion, and Communications in Human Groups.
Giraldo, Luis Felipe; Passino, Kevin M
2016-10-01
In the study of the behavior of human groups, it has been observed that there is a strong interaction between the cohesiveness of the group, its performance when the group has to solve a task, and the patterns of communication between the members of the group. Developing mathematical and computational tools for the analysis and design of task-solving groups that are not only cohesive but also perform well is of importance in social sciences, organizational management, and engineering. In this paper, we model a human group as a dynamical system whose behavior is driven by a task optimization process and the interaction between subsystems that represent the members of the group interconnected according to a given communication network. These interactions are described as attractions and repulsions among members. We show that the dynamics characterized by the proposed mathematical model are qualitatively consistent with those observed in real human groups, where the key aspect is that the attraction patterns in the group and the commitment to solve the task are not static but change over time. Through a theoretical analysis of the system, we provide conditions on the parameters that allow the group to have cohesive behaviors, and Monte Carlo simulations are used to study group dynamics for different sets of parameters, communication topologies, and tasks to solve.
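A drastically simplified one-dimensional version of such a model makes the cohesion claim concrete: each member moves under attraction toward the group mean plus gradient descent on a shared task cost f(x) = (x - target)^2. The gains and the cost function are invented for illustration; the paper's model also includes repulsion terms and a communication network.

```python
# Toy attraction-plus-task dynamics in 1-D. With both gains positive and
# small, member positions contract toward each other (cohesion) and
# toward the task optimum.

def simulate(positions, target, k_attract=0.3, k_task=0.1, steps=50):
    """Iterate attraction-to-mean plus gradient descent on (x - target)^2."""
    x = list(positions)
    for _ in range(steps):
        mean = sum(x) / len(x)
        x = [xi + k_attract * (mean - xi) - k_task * 2 * (xi - target)
             for xi in x]
    return x

start = [0.0, 4.0, 10.0]
end = simulate(start, target=5.0)
spread = max(end) - min(end)       # shrinks toward zero as the group coheres
```

With these gains, deviations from the mean shrink by a factor (1 - k_attract - 2*k_task) = 0.5 per step, so the group coheres geometrically while the mean converges to the task optimum.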
Incremental Upgrade of Legacy Systems (IULS)
2001-04-01
analysis task employed SEI's Feature-Oriented Domain Analysis methodology (see FODA reference) and included several phases: Context Analysis; Establish… Legacy, new Host and upgrade system and software. The Feature-Oriented Domain Analysis approach (FODA, see SUM References) was used for this step… Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ESD-90-TR-222); Software Engineering Institute, Carnegie Mellon University
The effects of motion and g-seat cues on pilot simulator performance of three piloting tasks
NASA Technical Reports Server (NTRS)
Showalter, T. W.; Parris, B. L.
1980-01-01
Data are presented that show the effects of motion system cues, g-seat cues, and pilot experience on pilot performance during takeoffs with engine failures, during in-flight precision turns, and during landings with wind shear. Eight groups of USAF pilots flew a simulated KC-135 using four different cueing systems. The basic cueing system was a fixed-base type (no-motion cueing) with visual cueing. The other three systems were produced by the presence of either a motion system or a g-seat, or both. Extensive statistical analysis of the data was performed and representative performance means were examined. These data show that the addition of motion system cueing results in significant improvement in pilot performance for all three tasks; however, the use of g-seat cueing, either alone or in conjunction with the motion system, provides little if any performance improvement for these tasks and for this aircraft type.
NASA Technical Reports Server (NTRS)
Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.
2009-01-01
As part of the Constellation Program's FDIR (Fault Detection, Isolation, and Recovery) Project, a task called the Legacy Benchmarking Task was designed to document as accurately as possible the FDIR processes and resources used by the Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.
Bueno, Mercedes; Fabrigoule, Colette; Deleurence, Philippe; Ndiaye, Daniel; Fort, Alexandra
2012-08-27
Driver distraction has been identified as the most important contributing factor in rear-end collisions. In this context, Forward Collision Warning Systems (FCWS) have been developed specifically to warn drivers of potential rear-end collisions. The main objective of this work is to evaluate the impact of a surrogate FCWS and of its reliability according to the driver's attentional state by recording both behavioral and electrophysiological data. Participants drove following a lead motorcycle in a simplified simulator with or without a warning system which gave forewarning of the preceding vehicle braking. Participants had to perform this driving task either alone (simple task) or simultaneously with a secondary cognitive task (dual task). Behavioral and electrophysiological data contributed to revealing a positive effect of the warning system. Participants were faster in detecting the brake light when the system was perfect or imperfect, and the time and attentional resources allocation required for processing the target at higher cognitive level were reduced when the system was completely reliable. When both tasks were performed simultaneously, warning effectiveness was considerably affected at both performance and neural levels; however, the analysis of the brain activity revealed fewer differences between distracted and undistracted drivers when using the warning system. These results show that electrophysiological data could be a valuable tool to complement behavioral data and to have a better understanding of how these systems impact the driver. Copyright © 2012 Elsevier B.V. All rights reserved.
An overview of the BioCreative 2012 Workshop Track III: interactive text mining task
Arighi, Cecilia N.; Carterette, Ben; Cohen, K. Bretonnel; Krallinger, Martin; Wilbur, W. John; Fey, Petra; Dodson, Robert; Cooper, Laurel; Van Slyke, Ceri E.; Dahdul, Wasila; Mabee, Paula; Li, Donghui; Harris, Bethany; Gillespie, Marc; Jimenez, Silvia; Roberts, Phoebe; Matthews, Lisa; Becker, Kevin; Drabkin, Harold; Bello, Susan; Licata, Luana; Chatr-aryamontri, Andrew; Schaeffer, Mary L.; Park, Julie; Haendel, Melissa; Van Auken, Kimberly; Li, Yuling; Chan, Juancarlos; Muller, Hans-Michael; Cui, Hong; Balhoff, James P.; Chi-Yang Wu, Johnny; Lu, Zhiyong; Wei, Chih-Hsuan; Tudor, Catalina O.; Raja, Kalpana; Subramani, Suresh; Natarajan, Jeyakumar; Cejuela, Juan Miguel; Dubey, Pratibha; Wu, Cathy
2013-01-01
In many databases, biocuration primarily involves literature curation, which usually involves retrieving relevant articles, extracting information that will translate into annotations and identifying new incoming literature. As the volume of biological literature increases, the use of text mining to assist in biocuration becomes increasingly relevant. A number of groups have developed tools for text mining from a computer science/linguistics perspective, and there are many initiatives to curate some aspect of biology from the literature. Some biocuration efforts already make use of a text mining tool, but there have not been many broad-based systematic efforts to study which aspects of a text mining tool contribute to its usefulness for a curation task. Here, we report on an effort to bring together text mining tool developers and database biocurators to test the utility and usability of tools. Six text mining systems presenting diverse biocuration tasks participated in a formal evaluation, and appropriate biocurators were recruited for testing. The performance results from this evaluation indicate that some of the systems were able to improve efficiency of curation by speeding up the curation task significantly (∼1.7- to 2.5-fold) over manual curation. In addition, some of the systems were able to improve annotation accuracy when compared with the performance on the manually curated set. In terms of inter-annotator agreement, the factors that contributed to significant differences for some of the systems included the expertise of the biocurator on the given curation task, the inherent difficulty of the curation and attention to annotation guidelines. After the task, annotators were asked to complete a survey to help identify strengths and weaknesses of the various systems. 
The analysis of this survey highlights how important task completion is to the biocurators’ overall experience of a system, regardless of the system’s high score on design, learnability and usability. In addition, strategies to refine the annotation guidelines and systems documentation, to adapt the tools to the needs and query types the end user might have and to evaluate performance in terms of efficiency, user interface, result export and traditional evaluation metrics have been analyzed during this task. This analysis will help to plan for a more intense study in BioCreative IV. PMID:23327936
Network analysis of exploratory behaviors of mice in a spatial learning and memory task
Suzuki, Yusuke
2017-01-01
The Barnes maze is one of the main behavioral tasks used to study spatial learning and memory. The Barnes maze is a task conducted on “dry land” in which animals try to escape from a brightly lit exposed circular open arena to a small dark escape box located under one of several holes at the periphery of the arena. In comparison with another classical spatial learning and memory task, the Morris water maze, the negative reinforcements that motivate animals in the Barnes maze are less severe and less stressful. Furthermore, the Barnes maze is more compatible with recently developed cutting-edge techniques in neural circuit research, such as the miniature brain endoscope or optogenetics. For this study, we developed a lift-type task start system and equipped the Barnes maze with it. The subject mouse is raised up by the lift and released into the maze automatically so that it can start navigating the maze smoothly from exactly the same start position across repeated trials. We believe that a Barnes maze test with a lift-type task start system may be useful for behavioral experiments when combined with head-mounted or wire-connected devices for online imaging and intervention in neural circuits. Furthermore, we introduced a network analysis method for the analysis of the Barnes maze data. Each animal’s exploratory behavior in the maze was visualized as a network of nodes and their links, and spatial learning in the maze is described by systematic changes in network structures of search behavior. Network analysis was capable of visualizing and quantitatively analyzing subtle but significant differences in an animal’s exploratory behavior in the maze. PMID:28700627
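The network view of exploration described above can be sketched directly: holes the mouse visits become nodes and successive visits become directed, count-weighted edges. The visit sequences below are made up for illustration.

```python
# Build a transition network from an ordered sequence of visited holes.
# Nodes are holes; edge weights count consecutive visits (i -> j).
from collections import Counter

def transition_network(visits):
    """Return (node set, Counter of directed edges) for a visit sequence."""
    edges = Counter(zip(visits, visits[1:]))
    nodes = set(visits)
    return nodes, edges

# Hypothetical data: early training is wandering; late training is a
# direct path to the target hole (12), with repeated checks there.
early = [1, 5, 9, 3, 7, 1, 11, 5, 12]
late = [1, 12, 12, 12]
n_early, e_early = transition_network(early)
n_late, e_late = transition_network(late)
```

Shrinking node sets and increasingly concentrated edge weights are exactly the kind of "systematic changes in network structures" the abstract describes quantifying.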
User Needs, Benefits, and Integration of Robotic Systems in a Space Station Laboratory
NASA Technical Reports Server (NTRS)
Dodd, W. R.; Badgley, M. B.; Konkel, C. R.
1989-01-01
The methodology, results and conclusions of all tasks of the User Needs, Benefits, and Integration Study (UNBIS) of Robotic Systems in a Space Station Laboratory are summarized. Study goals included the determination of user requirements for robotics within the Space Station, United States Laboratory. In Task 1, three experiments were selected to determine user needs and to allow detailed investigation of microgravity requirements. In Task 2, a NASTRAN analysis of Space Station response to robotic disturbances, and acceleration measurement of a standard industrial robot (Intelledex Model 660), resulted in selection of two ranges of microgravity manipulation: Level 1 (10^-3 to 10^-5 g at greater than 1 Hz) and Level 2 (less than or equal to 10^-6 g at 0.1 Hz). This task included an evaluation of microstepping methods for controlling stepper motors and concluded that an industrial robot actuator can perform milli-g motion without modification. Relative merits of end-effectors and manipulators were studied in Task 3 in order to determine their ability to perform a range of tasks related to the three microgravity experiments. An Effectivity Rating was established for evaluating these robotic system capabilities. Preliminary interface requirements for an orbital flight demonstration were determined in Task 4. Task 5 assessed the impact of robotics.
The enactment of tasks in a fifth grade classroom
NASA Astrophysics Data System (ADS)
Schwartz, Jonathan L.
2007-12-01
This study looked at one classroom's manifestation of inquiry. Looking at tasks as part of the Full Option Science System (FOSS) shed light on the way in which inquiry took shape in the classroom. To do this, detailed descriptions and analysis of the enactment of inquiry-based tasks were conducted in one fifth-grade elementary school classroom during an 8-week period of instruction. A central finding was that the intended tasks differed from the actual tasks. This incongruence occurred primarily due to the actions of individuals in the classroom. These actions shaped tasks and transformed inquiry-based tasks from highly ambiguous, high-risk tasks to a routine set of steps and procedures. Teacher's actions included establishing a classroom culture, creating a flow to classroom events, and making instructional decisions. These actions resulted in implicit structures in the classroom that determined the pace and sequence of events, as well as how the requirements and value of work were understood by students. Implicit structures reflected shared understandings between the teacher and students about work and the overall system of accountability in the classroom.
NASA Technical Reports Server (NTRS)
Crane, D. F.
1984-01-01
When human operators are performing precision tracking tasks, their dynamic response can often be modeled by quasilinear describing functions. That fact permits analysis of the effects of delay in certain man-machine control systems using linear control system analysis techniques. The analysis indicates that a reduction in system stability is the immediate effect of additional control system delay, and that system characteristics moderate or exaggerate the importance of the delay. A selection of data (simulator and flight test) consistent with the analysis is reviewed. Flight simulator visual-display delay compensation, designed to restore pilot-aircraft system stability, was evaluated in several studies which are reviewed here. The studies range from single-axis, tracking-task experiments (with sufficient subjects and trials to establish the statistical significance of the results) to a brief evaluation of compensation of a computer generated imagery (CGI) visual display system in a full six-degree-of-freedom simulation. The compensation was effective; improvements in pilot performance, workload, or aircraft handling qualities ratings (HQR) were observed. Results from recent aircraft handling qualities research literature, which support the compensation design approach, are also reviewed.
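The stability effect in the abstract above has a standard linear-analysis form: a pure transport delay of tau seconds adds a phase lag of omega times tau at each frequency, so it erodes the phase margin at the loop crossover frequency by roughly that amount. The numbers below are assumed for illustration:

```python
# Phase margin consumed by a pure time delay at a given crossover
# frequency: loss (rad) = omega_c * tau, converted here to degrees.
import math

def phase_margin_loss_deg(delay_s, crossover_rad_s):
    """Degrees of phase margin a pure delay removes at crossover."""
    return math.degrees(crossover_rad_s * delay_s)

# Assumed numbers: 100 ms of visual-display delay in a piloting loop
# crossing over at 2 rad/s.
loss = phase_margin_loss_deg(0.1, 2.0)
```

A loss of around 11 degrees of margin is substantial for a piloted loop, which is why display-delay compensation of the kind evaluated in these studies can visibly improve handling qualities ratings.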
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes the work done by InterTechnology/Solar Corporation and its consultants, Mobil Tyco Solar Energy Corporation and the University of Delaware Institute for Energy Conversion, during the marketing analysis of near- and intermediate-term photovoltaic power applications. Two approaches were used to obtain estimates of the domestic and foreign market potential for photovoltaically powered devices. First, the study identified and then screened all possible photovoltaic power supply applications. This approach encompassed the first two tasks of the study: (1) a survey of the current uses of photovoltaic systems, both domestic and international, and a projection of the usage of those systems into the future; and (2) a new-idea generation task which attempted to come up with new ways of using photovoltaic power. Second, the study required in-depth analysis of key near-term and intermediate-term photovoltaic applications identified during the first phase to obtain reasonable estimates of photovoltaic market potential. This process encompassed the third and fourth tasks of the analysis: (3) refinement of ideas generated in Task 2 so that certain products/applications could be identified, the product defined, and a market survey carried out; and (4) development of a detailed product scenario which forecasts sales, barriers to market acceptance, and technical innovations required for proper introduction of the products. The work performed and findings of each task are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krahn, John; Reed, Claude; Loewen, Eric
Final Technical Report: Electromagnetic Pump Insulation Materials Development and Testing (Report # DOEGEHB00613) summarizes the information gathered from the analysis of the 160 m3/min EM Pump insulation that was tested in 2000-2002, plus additional evaluations of a new resilient, engineered insulation system evaluated and tested at both GRC and ANL. This report provides information on Tasks 1 and 2 of the entire project. It also provides information in three broad areas: historical and current data; conclusions based on test data; and insulation specifications for use in EM Pumps. The research for Task 2 builds upon Task 1: Update EM Pump Databank, which is summarized within this report. Where research for Tasks 3 and 4 (Next-Generation EM Pump Analysis Tools) identified parameters or analysis models that benefit Task 2 research, those items are noted within this report. The important design variables for the manufacture and operation of an EM Pump that the insulation research can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency; and manufacturability. The development summary of the Electromagnetic Pump Insulation Materials Development and Testing was completed to include: historical and current data; conclusions based on test data; and insulation specifications for use in EM Pumps.
Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant
Jahangiri, Mehdi; Hoboubi, Naser; Rostamabadi, Akbar; Keshavarzi, Sareh; Hosseini, Ali Akbar
2015-01-01
Background A permit to work (PTW) is a formal written system to control certain types of work which are identified as potentially hazardous. However, human error in PTW processes can lead to an accident. Methods This cross-sectional, descriptive study was conducted to estimate the probability of human errors in PTW processes in a chemical plant in Iran. In the first stage, through interviewing the personnel and studying the procedure in the plant, the PTW process was analyzed using the hierarchical task analysis technique. In doing so, PTW was considered as a goal and detailed tasks to achieve the goal were analyzed. In the next step, the standardized plant analysis risk-human (SPAR-H) reliability analysis method was applied for estimation of human error probability. Results The mean probability of human error in the PTW system was estimated to be 0.11. The highest probability of human error in the PTW process was related to flammable gas testing (50.7%). Conclusion The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the required measures for reducing the error probabilities in PTW system. Some suggestions to reduce the likelihood of errors, especially in the field of modifying the performance shaping factors and dependencies among tasks are provided. PMID:27014485
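In a common form of the SPAR-H calculation, a nominal human error probability (HEP) is multiplied by a composite of performance shaping factor (PSF) multipliers, with an adjustment that keeps the result bounded by 1 when the composite is large. The sketch below uses that general form with made-up PSF values; it is an illustration of the method's arithmetic, not the study's actual worksheet:

```python
# SPAR-H style HEP adjustment: HEP = NHEP*PSF / (NHEP*(PSF - 1) + 1).
# The nominal HEP and PSF multipliers below are assumptions for
# illustration, not values from the chemical-plant case study.

def spar_h_hep(nominal_hep, psf_multipliers):
    """Adjusted human error probability from a nominal HEP and PSFs."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    # The denominator keeps the result in [0, 1] for large composites.
    return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)

# Assumed: a nominal action HEP of 0.001 degraded by stress (x2) and
# poor procedures (x5).
hep = spar_h_hep(0.001, [2.0, 5.0])
```

Note that without the denominator adjustment, a large composite (say x1000) could push the product past 1, which is why the bounded form is used when several negative PSFs apply.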
Design Evaluation for Personnel, Training and Human Factors (DEPTH) Final Report.
1998-01-17
human activity was primarily intended to facilitate man-machine design analyses of complex systems. By importing computer aided design (CAD) data, the human figure models and analysis algorithms can help to ensure components can be seen, reached, lifted and removed by most maintainers. These simulations are also useful for logistics data capture, training, and task analysis. DEPTH was also found to be useful in obtaining task descriptions for technical
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1991-01-01
Advances in computer and control technology offer the opportunity for task-offload aiding in human-machine systems. A task-offload aid (e.g., an autopilot, an intelligent assistant) can be selectively engaged by the human operator to dynamically delegate tasks to an automated system. Successful design and performance prediction in such systems requires knowledge of the factors influencing the strategy the operator develops and uses for managing interaction with the task-offload aid. A model is presented that shows how such strategies can be predicted as a function of three task context properties (frequency and duration of secondary tasks and costs of delaying secondary tasks) and three aid design properties (aid engagement and disengagement times, and aid performance relative to human performance). Sensitivity analysis indicates how each of these contextual and design factors affect the optimal aid usage strategy and attainable system performance. The model is applied to understanding human-automation interaction in laboratory experiments on human supervisory control behavior. The laboratory task allowed subjects freedom to determine strategies for using an autopilot in a dynamic, multi-task environment. Modeling results suggested that many subjects may indeed have been acting appropriately by not using the autopilot in the way its designers intended. Although autopilot function was technically sound, this aid was not designed with due regard to the overall task context in which it was placed. These results demonstrate the need for additional research on how people may strategically manage their own resources, as well as those provided by automation, in an effort to keep workload and performance at acceptable levels.
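The trade-off the model captures can be reduced to a toy decision rule: engaging the aid pays off only when the value of the secondary-task time it frees exceeds the engage/disengage overhead. Parameter names and numbers below are illustrative, not the paper's model:

```python
# Toy engage/ignore rule for a task-offload aid: compare the cost of
# delaying a secondary task against the switching overhead of the aid.

def should_engage(secondary_duration, engage_time, disengage_time,
                  delay_cost_rate, overhead_cost_rate=1.0):
    """Engage when delaying the secondary task costs more than switching."""
    benefit = delay_cost_rate * secondary_duration
    cost = overhead_cost_rate * (engage_time + disengage_time)
    return benefit > cost

decisions = [should_engage(10.0, 1.0, 1.0, 0.5),   # long secondary task
             should_engage(2.0, 1.0, 1.0, 0.5)]    # short secondary task
```

Even this caricature reproduces the paper's headline observation: for short or cheap-to-delay secondary tasks, ignoring a technically sound aid can be the rational strategy.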
La, Christian; Garcia-Ramos, Camille; Nair, Veena A.; Meier, Timothy B.; Farrar-Edwards, Dorothy; Birn, Rasmus; Meyerand, Mary E.; Prabhakaran, Vivek
2016-01-01
Healthy aging is associated with decline of cognitive functions. However, even before those declines become noticeable, the neural architecture underlying those mechanisms has undergone considerable restructuring and reorganization. During performance of a cognitive task, not only have the task-relevant networks demonstrated reorganization with aging, which occurs primarily by recruitment of additional areas to preserve performance, but the task-irrelevant network of the “default-mode” network (DMN), which is normally deactivated during task performance, has also consistently shown reduction of this deactivation with aging. Here, we revisited those age-related changes in task-relevant (i.e., language system) and task-irrelevant (i.e., DMN) systems with a language production paradigm in terms of task-induced activation/deactivation, functional connectivity, and context-dependent correlations between the two systems. Our task fMRI data demonstrated a late increase in cortical recruitment in terms of extent of activation, only observable in our older healthy adult group, when compared to the younger healthy adult group, with recruitment of the contralateral hemisphere, but also other regions from the network previously underutilized. Our middle-aged individuals, when compared to the younger healthy adult group, presented lower levels of activation intensity and connectivity strength, with no recruitment of additional regions, possibly reflecting an initial, uncompensated, network decline. In contrast, the DMN presented a gradual decrease in deactivation intensity and deactivation extent (i.e., low in the middle-aged, and lower in the old) and similar gradual reduction of functional connectivity within the network, with no compensation. The patterns of age-related changes in the task-relevant system and DMN are incongruent with the previously suggested notion of anti-correlation of the two systems. 
The context-dependent correlation by psycho-physiological interaction (PPI) analysis demonstrated an independence of these two systems, with the onset of task not influencing the correlation between the two systems. Our results suggest that the language network and the DMN may be non-dependent systems, potentially correlated through the re-allocation of cortical resources, and that aging may affect those two systems differently. PMID:27242519
Power consumption analysis of operating systems for wireless sensor networks.
Lajara, Rafael; Pelegrí-Sebastiá, José; Perez Solano, Juan J
2010-01-01
In this paper four wireless sensor network operating systems are compared in terms of power consumption. The analysis takes into account the most common operating systems (TinyOS v1.0, TinyOS v2.0, Mantis and Contiki) running on Tmote Sky and MICAz devices. With the objective of ensuring a fair evaluation, a benchmark composed of four applications has been developed, covering the most typical tasks that a wireless sensor network performs. The results show the instant and average current consumption of the devices during the execution of these applications. The experimental measurements provide a good insight into the power mode in which the device components are running at every moment, and they can be used to compare the performance of different operating systems executing the same tasks.
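The reduction from instantaneous current traces to the averages (and implied energy) reported in such benchmarks is simple arithmetic; a sketch with made-up sample values and supply voltage:

```python
# Reduce a trace of instantaneous current samples to an average, and
# estimate energy as average current x voltage x time. The trace values
# (mA) and the 3 V supply are assumptions for illustration.

def average_current_ma(samples_ma):
    return sum(samples_ma) / len(samples_ma)

def energy_mj(samples_ma, volts, duration_s):
    """Energy in millijoules from mA samples over a measurement window."""
    return average_current_ma(samples_ma) * volts * duration_s

trace = [22.0, 19.0, 0.4, 0.4, 21.0]   # mA: radio/MCU bursts vs. sleep
avg = average_current_ma(trace)
```

The bimodal shape of the trace (tens of mA active, sub-mA asleep) is why average current alone can mask how well an OS schedules sleep states, which is what per-component power-mode inspection adds.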
NASA Astrophysics Data System (ADS)
Jensen, Winnie; Rousche, Patrick J.
2006-03-01
The success of a cortical motor neuroprosthetic system will rely on the system's ability to effectively execute complex motor tasks in a changing environment. Invasive, intra-cortical electrodes have been successfully used to predict joint movement and grip force of a robotic arm/hand with a non-human primate (Chapin J K, Moxon K A, Markowitz R S and Nicolelis M A L 1999 Real-time control of a robotic arm using simultaneously recorded neurons in the motor cortex Nat. Neurosci. 2 664-70). It is well known that cortical encoding occurs with a high degree of cortical plasticity and depends on both the functional and behavioral context. Questions on the expected robustness of future motor prosthesis systems therefore still remain. The objective of the present work was to study the effect of minor changes in functional movement strategies on the M1 encoding. We compared the M1 encoding in freely moving, non-constrained animals that performed two similar behavioral tasks with the same end-goal, and investigated whether these behavioral tasks could be discriminated based on the M1 recordings. The rats depressed a response paddle either with a set of restrictive bars ('WB') or without the bars ('WOB') placed in front of the paddle. The WB task required changes in the motor strategy to complete the paddle press and resulted in highly stereotyped movements, whereas in the WOB task the movement strategy was not restricted. Neural population activity was recorded from 16-channel micro-wire arrays and data up to 200 ms before a paddle hit were analyzed off-line. The analysis showed a significant neural firing difference between the two similar WB and WOB tasks, and using principal component analysis it was possible to distinguish between the two tasks with a best classification accuracy of 76.6%. 
While the results are dependent upon a small, randomly sampled neural population, they indicate that information about similar behavioral tasks may be extracted from M1 based on relatively few channels of neural signal for possible use in a cortical neuroprosthetic system.
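The discrimination step can be caricatured with a deliberately simpler stand-in for the paper's PCA-based method: a nearest-centroid classifier over per-trial firing-rate vectors. The two-channel rates below are synthetic, chosen only to show the mechanics:

```python
# Nearest-centroid classification of firing-rate vectors as a simplified
# stand-in for PCA-based task discrimination. All data are synthetic.

def centroid(vectors):
    """Per-dimension mean of a list of equal-length vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def classify(v, centroids):
    """Label of the nearest centroid by squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(v, centroids[label]))

wb = [[5.0, 1.0], [6.0, 1.2]]    # synthetic rates, 'with bars' trials
wob = [[1.0, 4.0], [1.2, 5.0]]   # synthetic rates, 'without bars' trials
cents = {"WB": centroid(wb), "WOB": centroid(wob)}
label = classify([5.5, 1.1], cents)
```

On real multi-channel data the class clouds overlap, which is why the paper reports a best accuracy of about 77% rather than clean separation like this toy example.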
Efficient parallel architecture for highly coupled real-time linear system applications
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Homaifar, Abdollah; Barua, Soumavo
1988-01-01
A systematic procedure is developed for exploiting the parallel constructs of computation in a highly coupled, linear system application. An overall top-down design approach is adopted. Differential equations governing the application under consideration are partitioned into subtasks on the basis of a data flow analysis. The interconnected task units constitute a task graph which has to be computed in every update interval. Multiprocessing concepts utilizing parallel integration algorithms are then applied for efficient task graph execution. A simple scheduling routine is developed to handle task allocation while in the multiprocessor mode. Results of simulation and scheduling are compared on the basis of standard performance indices. Processor timing diagrams are developed on the basis of program output accruing to an optimal set of processors. Basic architectural attributes for implementing the system are discussed together with suggestions for processing element design. Emphasis is placed on flexible architectures capable of accommodating widely varying application specifics.
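The partition-and-schedule idea above can be illustrated with a greedy list scheduler that assigns each ready task-graph node to the earliest-available processor. This is a minimal sketch of the general technique, not the report's actual scheduling routine; the example graph and durations are hypothetical:

```python
from collections import deque

def list_schedule(durations, deps, n_procs):
    """Greedy list scheduling of a task graph.
    durations: {task: execution time}; deps: {task: set of predecessors}."""
    indeg = {t: len(deps.get(t, ())) for t in durations}
    succ = {t: [] for t in durations}
    for t, ps in deps.items():
        for p in ps:
            succ[p].append(t)
    finish = {}                        # task -> completion time
    proc_free = [0.0] * n_procs       # next free time of each processor
    ready = deque(t for t in durations if indeg[t] == 0)
    while ready:
        t = ready.popleft()
        # Earliest start: all predecessors finished
        est = max((finish[p] for p in deps.get(t, ())), default=0.0)
        i = min(range(n_procs), key=lambda k: max(proc_free[k], est))
        start = max(proc_free[i], est)
        finish[t] = start + durations[t]
        proc_free[i] = finish[t]
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return finish

# Hypothetical 4-task graph: A and B independent, C needs A, D needs B and C.
durations = {"A": 2, "B": 3, "C": 1, "D": 2}
deps = {"C": {"A"}, "D": {"B", "C"}}
finish = list_schedule(durations, deps, n_procs=2)
makespan = max(finish.values())
```

On two processors, A and B run in parallel, C follows A, and D starts once both B and C complete, giving a makespan of 5 time units for this toy graph.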
ATTDES: An Expert System for Satellite Attitude Determination and Control. 2
NASA Technical Reports Server (NTRS)
Mackison, Donald L.; Gifford, Kevin
1996-01-01
The design, analysis, and flight operations of satellite attitude determination and attitude control systems require extensive mathematical formulations, optimization studies, and computer simulation. This is best done by an analyst with extensive education and experience. The development of programs such as ATTDES permits the use of advanced techniques by those with less experience. Typical tasks include mission analysis to select stabilization and damping schemes, attitude determination sensors and algorithms, and control system designs to meet program requirements. ATTDES is a system that supports all of these activities, including high fidelity orbit environment models that can be used for preliminary analysis, parameter selection, stabilization schemes, the development of estimators, covariance analyses, and optimization, and it can support ongoing orbit activities. The modification of existing simulations to model new configurations for these purposes can be an expensive, time consuming activity that becomes a pacing item in the development and operation of such new systems. The use of an integrated tool such as ATTDES significantly reduces the effort and time required for these tasks.
Effect of partition board color on mood and autonomic nervous function.
Sakuragi, Sokichi; Sugiyama, Yoshiki
2011-12-01
The purpose of this study was to evaluate the effects of the presence or absence (control) of a partition board and its color (red, yellow, blue) on subjective mood ratings and changes in autonomic nervous system indicators induced by a video game task. The increase in the mean Profile of Mood States (POMS) Fatigue score and mean Oppressive feeling rating after the task was lowest with the blue partition board. Multiple-regression analysis identified oppressive feeling and error scores on the second half of the task as statistically significant contributors to Fatigue. When explanatory variables were limited to the physiological indices, multiple-regression analysis identified a significant contribution of autonomic reactivity (assessed by heart rate variability) to Fatigue. These results suggest that a blue partition board would reduce task-induced subjective fatigue, in part by lowering the oppressive feeling of being enclosed during the task, possibly by increasing autonomic reactivity.
Instrumentation for Mars Environments
NASA Technical Reports Server (NTRS)
Landis, Geoffrey A.
1997-01-01
The main portion of the project was to support the "MAE" experiment on the Mars Pathfinder mission and to design instrumentation for future space missions to measure dust deposition on Mars and to characterize the properties of the dust. A second task was to analyze applications for photovoltaics in new space environments, and a final task was analysis of advanced applications for solar power, including planetary probes, photovoltaic system operation on Mars, and satellite solar power systems.
2013-12-01
of power from sunlight or a wind turbine (same solar panel tarps used in NEST Raptor Solar Light Trailer) • Global Positioning System (GPS) devices...satellite-enabled rapid wireless communications to the most critical areas and functions, working with Joint Task Forces. The first priority after the...a rapid response wireless communications system from military, civilian government, and non-government organizations. The tasks performed by HFN
Materials experiment carrier concepts definition study. Volume 2: Technical report, part 2
NASA Technical Reports Server (NTRS)
1981-01-01
A materials experiment carrier (MEC) that provides effective accommodation of the given baseline materials processing in space (MPS) payloads and demonstration of the MPS platform concept for high priority materials processing science, multidiscipline MPS investigations, host carrier for commercial MPS payloads, and system economy of orbital operations is defined. The study flow of task work is shown. Study tasks featured analysis and trades to identify the MEC system concept options.
What Makes Patient Navigation Most Effective: Defining Useful Tasks and Networks.
Gunn, Christine; Battaglia, Tracy A; Parker, Victoria A; Clark, Jack A; Paskett, Electra D; Calhoun, Elizabeth; Snyder, Frederick R; Bergling, Emily; Freund, Karen M
2017-01-01
Given the momentum in adopting patient navigation into cancer care, there is a need to understand the contribution of specific navigator activities to improved clinical outcomes. A mixed-methods study combined direct observations of patient navigators within the Patient Navigation Research Program and outcome data from the trial. We correlated the frequency of navigator tasks with the rate of diagnostic resolution within 365 days among patients who received the intervention relative to controls. A focused content analysis examined the tasks with the strongest correlations between navigator tasks and patient outcomes. Navigating directly with specific patients (r = 0.679), working with clinical providers to facilitate patient care (r = 0.643), and performing tasks for patients not directly related to their diagnostic evaluation (r = 0.714) were positively associated with more timely diagnosis. Using medical records for non-navigation tasks had a negative association (r = -0.643). Content analysis revealed that service provision directed at specific patients improved care while systems-focused activities did not.
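The correlations quoted above are Pearson coefficients between task frequency and outcome. A minimal sketch of that computation, using hypothetical per-site data (the study's actual data are not reproduced here):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-site data: frequency of one navigator task vs. the
# rate of diagnostic resolution within 365 days.
task_freq = [10, 14, 7, 20, 16, 9]
resolution = [0.62, 0.71, 0.55, 0.83, 0.76, 0.60]
r = pearson_r(task_freq, resolution)
```

A strongly positive r, as in this toy example, is what the study reads as a task contributing to more timely diagnosis.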
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1993-01-01
Task-offload aids (e.g., an autopilot, an 'intelligent' assistant) can be selectively engaged by the human operator to dynamically delegate tasks to automation. Introducing such aids eliminates some task demands but creates new ones associated with programming, engaging, and disengaging the aiding device via an interface. The burdens associated with managing automation can sometimes outweigh the potential benefits of automation to improved system performance. Aid design parameters and features of the overall multitask context combine to determine whether or not a task-offload aid will effectively support the operator. A modeling and sensitivity analysis approach is presented that identifies effective strategies for human-automation interaction as a function of three task-context parameters and three aid design parameters. The analysis and modeling approaches provide resources for predicting how a well-adapted operator will use a given task-offload aid, and for specifying aid design features that ensure that automation will provide effective operator support in a multitask environment.
NASA Technical Reports Server (NTRS)
Wiley, Lowell F.
1985-01-01
A work breakdown structure for the Space Station Life Sciences Research Facility (LSRF) is presented up to level 5. The purpose is to provide the framework for task planning and control and to serve as a basis for budgeting, task assignment, cost collection and report, and contractual performance measurement and tracking of the Full Scale Development Phase tasks.
Flexible data-management system
NASA Technical Reports Server (NTRS)
Pelouch, J. J., Jr.
1977-01-01
The Combined ASRDI Data-Management and Analysis Technique (CADMAT) is a system of computer programs and procedures that can be used to conduct data-management tasks. The system was developed specifically for use by scientists and engineers who are confronted with the management and analysis of large quantities of data organized into records of events and parametric fields. CADMAT is particularly useful when data are continually accumulated and the need for retrieval and analysis is ongoing.
Deployable antenna phase A study
NASA Technical Reports Server (NTRS)
Schultz, J.; Bernstein, J.; Fischer, G.; Jacobson, G.; Kadar, I.; Marshall, R.; Pflugel, G.; Valentine, J.
1979-01-01
Applications for large deployable antennas were re-examined, flight demonstration objectives were defined, the flight article (antenna) was preliminarily designed, and the flight program and ground development program, including the support equipment, were defined for a proposed space transportation system flight experiment to demonstrate a large (50 to 200 meter) deployable antenna system. Tasks described include: (1) performance requirements analysis; (2) system design and definition; (3) orbital operations analysis; and (4) programmatic analysis.
Study of Plasma Motor Generator (PMG) tether system for orbit reboost
NASA Technical Reports Server (NTRS)
1988-01-01
A progress report is given on a system study by TRW begun in January 1987 of a 2 kW Plasma Motor Generator Tether to be used for orbit reboost. Following the completion of the initial phase in September 1987, additional tasks were agreed on and work on them was begun in March 1988. These tasks fell into three categories: tests on the prototype tether fabricated during the first phase, simulations of the spacecraft and tether system after deployment using GTOSS, and a brief investigation of the impact and feasibility of increasing the system to 20 kW and hosting it on the Orbital Maneuvering Vehicle. The subcontractor, Energy Sciences Laboratory, was assigned the responsibility of performing the simulations and some mechanical tests on the prototype tether to supplement those done at TRW. A summary of the significant findings and issues from each task follows. Recommendations for future work constitute the third section. A copy of the final briefing is in Appendix A, plus additional reports for each task and additional analysis.
TADS: A CFD-Based Turbomachinery Analysis and Design System with GUI: Methods and Results. 2.0
NASA Technical Reports Server (NTRS)
Koiro, M. J.; Myers, R. A.; Delaney, R. A.
1999-01-01
The primary objective of this study was the development of a Computational Fluid Dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a Graphical User Interface (GUI). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is the Final Report describing the theoretical basis and analytical results from the TADS system developed under Task 10 of NASA Contract NAS3-27394, ADPAC System Coupling to Blade Analysis & Design System GUI, Phase II - Loss, Design and Multi-stage Analysis. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) or a 3-D solver with slip condition on the end walls (B2BADPAC) in an interactive package. Throughflow analysis and design capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a multistage compressor, a multistage turbine, two highly loaded fans, and several single stage compressor and turbine example cases.
Orbit transfer rocket engine technology program
NASA Technical Reports Server (NTRS)
Gustafson, N. B.; Harmon, T. J.
1993-01-01
An advanced near term (1990's) space-based Orbit Transfer Vehicle Engine (OTVE) system was designed, and the technologies applicable to its construction, maintenance, and operations were developed under Tasks A through F of the Orbit Transfer Rocket Engine Technology Program. Task A was a reporting task. In Task B, promising OTV turbomachinery technologies were explored: two stage partial admission turbines, high velocity ratio diffusing crossovers, soft wear ring seals, advanced bearing concepts, and a rotordynamic analysis. In Task C, a ribbed combustor design was developed. Possible rib and channel geometries were chosen analytically. Rib candidates were hot air tested and laser velocimeter boundary layer analyses were conducted. A channel geometry was also chosen on the basis of laser velocimeter data. To verify the predicted heat enhancement effects, a ribbed calorimeter spool was hot fire tested. Under Task D, the optimum expander cycle engine thrust, performance and envelope were established for a set of OTV missions. Optimal nozzle contours and quick disconnects for modularity were developed. Failure Modes and Effects Analyses, maintenance and reliability studies and component study results were incorporated into the engine system. Parametric trades on engine thrust, mixture ratio, and area ratio were also generated. A control system and the health monitoring and maintenance operations necessary for a space-based engine were outlined in Task E. In addition, combustor wall thickness measuring devices and a fiberoptic shaft monitor were developed. These monitoring devices were incorporated into preflight engine readiness checkout procedures. In Task F, the Integrated Component Evaluator (I.C.E.) was used to demonstrate performance and operational characteristics of an advanced expander cycle engine system and its component technologies. Sub-system checkouts and a system blowdown were performed. 
Short transitions were then made into main combustor ignition and main stage operation.
Diverse task scheduling for individualized requirements in cloud manufacturing
NASA Astrophysics Data System (ADS)
Zhou, Longfei; Zhang, Lin; Zhao, Chun; Laili, Yuanjun; Xu, Lida
2018-03-01
Cloud manufacturing (CMfg) has emerged as a new manufacturing paradigm that provides ubiquitous, on-demand manufacturing services to customers through network and CMfg platforms. In CMfg system, task scheduling as an important means of finding suitable services for specific manufacturing tasks plays a key role in enhancing the system performance. Customers' requirements in CMfg are highly individualized, which leads to diverse manufacturing tasks in terms of execution flows and users' preferences. We focus on diverse manufacturing tasks and aim to address their scheduling issue in CMfg. First of all, a mathematical model of task scheduling is built based on analysis of the scheduling process in CMfg. To solve this scheduling problem, we propose a scheduling method aiming for diverse tasks, which enables each service demander to obtain desired manufacturing services. The candidate service sets are generated according to subtask directed graphs. An improved genetic algorithm is applied to searching for optimal task scheduling solutions. The effectiveness of the scheduling method proposed is verified by a case study with individualized customers' requirements. The results indicate that the proposed task scheduling method is able to achieve better performance than some usual algorithms such as simulated annealing and pattern search.
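The service-selection step can be illustrated with a small genetic algorithm that picks one candidate service per subtask to minimize total cost. This is a generic GA sketch under hypothetical cost data, not the paper's improved algorithm (which also handles execution flows and user preferences):

```python
import random

def ga_schedule(costs, pop_size=30, gens=60, pmut=0.2, seed=1):
    """Minimal GA for CMfg-style task scheduling.
    costs[i][j] = cost of running subtask i on candidate service j.
    A chromosome assigns one candidate service to each subtask."""
    rng = random.Random(seed)
    n = len(costs)

    def fitness(ch):                       # lower total cost is better
        return sum(costs[i][ch[i]] for i in range(n))

    pop = [[rng.randrange(len(costs[i])) for i in range(n)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]       # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < pmut:        # random-reset mutation
                i = rng.randrange(n)
                child[i] = rng.randrange(len(costs[i]))
            children.append(child)
        pop = elite + children
    best = min(pop, key=fitness)
    return best, fitness(best)

# Hypothetical costs: 4 subtasks, 3 candidate services each.
costs = [[4, 2, 7], [5, 3, 1], [6, 2, 2], [3, 8, 4]]
best, cost = ga_schedule(costs)
```

For this separable toy instance the optimum is the cheapest service per subtask (total 8); real CMfg scheduling adds precedence and preference constraints that make the search space far less forgiving.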
Robert-Lachaine, Xavier; Mecheri, Hakim; Larue, Christian; Plamondon, André
2017-04-01
The potential of inertial measurement units (IMUs) for ergonomics applications appears promising. However, previous IMU validation studies have been incomplete regarding the joints analysed, the complexity of movements, and the duration of trials. The objective was to determine the technological error and biomechanical model differences between IMUs and an optoelectronic system and to evaluate the effect of task complexity and duration. Whole-body kinematics from 12 participants was recorded simultaneously with a full-body Xsens system where an Optotrak cluster was fixed on every IMU. Short functional movements and long manual material handling tasks were performed and joint angles were compared between the two systems. The differences attributed to the biomechanical model showed significantly greater (P ≤ .001) RMSE than the technological error. RMSE was systematically higher (P ≤ .001) for the long complex task, with a mean over all joints of 2.8° compared to 1.2° during short functional movements. Definition of local coordinate systems based on anatomical landmarks or a single posture was the most influential difference between the two systems. Additionally, IMU accuracy was affected by the complexity and duration of the tasks. Nevertheless, technological error remained under 5° RMSE during handling tasks, which shows potential to track workers during their daily labour.
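The RMSE values quoted above come from comparing joint-angle time series between the two systems. A minimal sketch of that comparison, with hypothetical elbow-flexion traces standing in for the study's recordings:

```python
import math

def rmse(a, b):
    """Root-mean-square error between two joint-angle time series (degrees)."""
    assert len(a) == len(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Hypothetical traces: optoelectronic reference vs. IMU estimate (degrees).
opto = [10.0, 25.0, 40.0, 30.0, 15.0]
imu = [11.0, 24.0, 42.0, 29.0, 16.0]
err = rmse(opto, imu)
```

In the study this metric is computed per joint and averaged, which is how the 1.2° (short tasks) and 2.8° (long complex task) means arise.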
Artificial Neural Network Analysis System
2001-02-27
Contract No. DASG60-00-M-0201; purchase request: Foot in the Door-01. Title: Artificial Neural Network Analysis System. Company: Atlantic... Author: Powell, Bruce C. Report period covered: 28-10-2000 to 27-02-2001.
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. An open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on the relationship between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation in the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable to that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.
NASA Technical Reports Server (NTRS)
1985-01-01
The initial task in the Space Station Data System (SSDS) Analysis/Architecture Study is the definition of the functional and key performance requirements for the SSDS. The SSDS is the set of hardware and software, both on the ground and in space, that provides the basic data management services for Space Station customers and systems. The primary purpose of the requirements development activity was to provide a coordinated, documented requirements set as a basis for the system definition of the SSDS and for other subsequent study activities. These requirements should also prove useful to other Space Station activities in that they provide an indication of the scope of the information services and systems that will be needed in the Space Station program. The major results of the requirements development task are as follows: (1) identification of a conceptual topology and architecture for the end-to-end Space Station Information Systems (SSIS); (2) development of a complete set of functional requirements and design drivers for the SSIS; (3) development of functional requirements and key performance requirements for the Space Station Data System (SSDS); and (4) definition of an operating concept for the SSIS. The operating concept was developed both from a Space Station payload customer and operator perspective in order to allow a requirements practicality assessment.
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Clinical quality needs complex adaptive systems and machine learning.
Marsland, Stephen; Buchan, Iain
2004-01-01
The vast increase in clinical data has the potential to bring about large improvements in clinical quality and other aspects of healthcare delivery. However, such benefits do not come without cost. The analysis of such large datasets, particularly where the data may have to be merged from several sources and may be noisy and incomplete, is a challenging task. Furthermore, the introduction of clinical changes is a cyclical task, meaning that the processes under examination operate in an environment that is not static. We suggest that traditional methods of analysis are unsuitable for the task, and identify complexity theory and machine learning as areas that have the potential to facilitate the examination of clinical quality. By its nature the field of complex adaptive systems deals with environments that change because of the interactions that have occurred in the past. We draw parallels between health informatics and bioinformatics, which has already started to successfully use machine learning methods.
1987-10-01
19 treated in interaction with each other and the hardware and software design. The authors point out some of the inadequacies in HP technologies and...life cycle costs recognition performance on secondary tasks effort/efficiency number of wins ( gaming tasks) number of instructors needed amount of...student interacts with this material in real time via a terminal and display system. The computer performs many functions, such as diagnose student
Guidance simulation and test support for differential GPS flight experiment
NASA Technical Reports Server (NTRS)
Geier, G. J.; Loomis, P. V. W.; Cabak, A.
1987-01-01
Three separate tasks which supported the test preparation, test operations, and post-test analysis of the NASA Ames flight test evaluation of the differential Global Positioning System (GPS) are presented. Task 1 consisted of a navigation filter design, coding, and testing to make optimal use of GPS in a differential mode. The filter can be configured to accept inputs from external sensors such as an accelerometer and a barometric or radar altimeter. The filter runs in real time onboard a NASA helicopter. It processes raw pseudorange and delta-range measurements from a single-channel sequential GPS receiver. The Kalman filter software interfaces are described in detail, followed by a description of the filter algorithm, including the basic propagation and measurement update equations. The performance during flight tests is reviewed and discussed. Task 2 describes a refinement performed on the lateral and vertical steering algorithms developed on a previous contract. The refinements include modification of the internal logic to allow more diverse in-flight initialization procedures, further data smoothing, and compensation for system-induced time delays. Task 3 describes the TAU Corp participation in the analysis of the real-time Kalman navigation filter. The performance was compared to that of the Z-set filter in flight and to the laser tracker position data during post-test analysis. This analysis allowed a more optimal selection of the filter parameters.
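The propagation and measurement-update equations mentioned above have the standard Kalman form. A scalar (one-state) sketch of one propagate/update cycle, with hypothetical noise parameters and measurements rather than the flight filter's actual values:

```python
def kalman_step(x, P, z, F=1.0, Q=0.01, H=1.0, R=1.0):
    """One propagate/update cycle of a scalar Kalman filter.
    x, P: prior state estimate and variance; z: new measurement."""
    # Propagation (time update)
    x_pred = F * x
    P_pred = F * P * F + Q
    # Measurement update
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
    x_new = x_pred + K * (z - H * x_pred)   # blend prediction and measurement
    P_new = (1.0 - K * H) * P_pred          # reduced uncertainty
    return x_new, P_new

# Hypothetical noisy measurements of a quantity whose true value is 1.0.
x, P = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, P = kalman_step(x, P, z)
```

Each cycle pulls the estimate toward the measurements while shrinking the variance, which is the behavior the real multi-state filter exhibits on pseudorange and delta-range inputs.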
Calabria, Marco; Hernández, Mireia; Branzi, Francesca M.; Costa, Albert
2012-01-01
Previous research has shown that highly proficient bilinguals have comparable switch costs in both directions when they switch between languages (L1 and L2), the so-called “symmetrical switch cost” effect. Interestingly, the same symmetry is also present when they switch between L1 and a much weaker L3. These findings suggest that highly proficient bilinguals develop a language control system that seems to be insensitive to language proficiency. In the present study, we explore whether the pattern of symmetrical switch costs in language switching tasks generalizes to a non-linguistic switching task in the same group of highly proficient bilinguals. The end goal is to assess whether bilingual language control (bLC) can be considered subsidiary to domain-general executive control (EC). We tested highly proficient Catalan–Spanish bilinguals both in a linguistic switching task and in a non-linguistic switching task. In the linguistic task, participants named pictures in L1 and L2 (Experiment 1) or L3 (Experiment 2) depending on a cue presented with the picture (a flag). In the non-linguistic task, the same participants had to switch between two card sorting rule-sets (color and shape). Overall, participants showed symmetrical switch costs in the linguistic switching task, but not in the non-linguistic switching task. In a further analysis, we observed that in the linguistic switching task the asymmetry of the switch costs changed across blocks, while in the non-linguistic switching task an asymmetrical switch cost was observed throughout the task. The observation of different patterns of switch costs in the linguistic and the non-linguistic switching tasks suggests that the bLC system is not completely subsidiary to the domain-general EC system. PMID:22275905
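A switch cost is simply the mean reaction-time penalty on switch trials relative to repeat trials; symmetry means the cost is similar in both switching directions. A minimal sketch with hypothetical naming latencies (not data from the study):

```python
def switch_cost(rts, conditions):
    """Mean switch cost in ms: RT(switch trials) - RT(repeat trials)."""
    sw = [rt for rt, c in zip(rts, conditions) if c == "switch"]
    rp = [rt for rt, c in zip(rts, conditions) if c == "repeat"]
    return sum(sw) / len(sw) - sum(rp) / len(rp)

# Hypothetical naming latencies (ms) for one switching direction.
rts = [820, 760, 900, 780, 860, 750]
conditions = ["switch", "repeat", "switch", "repeat", "switch", "repeat"]
cost = switch_cost(rts, conditions)
# A symmetry analysis would compare this cost for L1->L2 vs. L2->L1 switches.
```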
Mission-based Scenario Research: Experimental Design And Analysis
2012-01-01
neurotechnologies called Brain-Computer Interaction Technologies. 15. SUBJECT TERMS neuroimaging, EEG, task loading, neurotechnologies , ground... neurotechnologies called Brain-Computer Interaction Technologies. INTRODUCTION Imagine a system that can identify operator fatigue during a long-term...BCIT), a class of neurotechnologies , that aim to improve task performance by incorporating measures of brain activity to optimize the interactions
DOT National Transportation Integrated Search
2017-10-25
The Task 8 D2X Hub Proof-of-Concept Test Evaluation Report provides results of the experimental data analysis performed in accordance with the experimental plan for the proof-of-concept version of the prototype system. The data set analyzed includes ...
Effects of and preference for pay for performance: an analogue analysis.
Long, Robert D; Wilder, David A; Betz, Alison; Dutta, Ami
2012-01-01
We examined the effects of 2 payment systems on the rate of check processing and time spent on task by participants in a simulated work setting. Three participants experienced individual pay-for-performance (PFP) without base pay and pay-for-time (PFT) conditions. In the last phase, we asked participants to choose which system they preferred. For all participants, the PFP condition produced higher rates of check processing and more time spent on task than did the PFT condition, but choice of payment system varied both within and across participants.
ERIC Educational Resources Information Center
Lavy, Ilana; Yadin, Aharon
2010-01-01
The present study was carried out within a systems analysis and design workshop. In addition to the standard analysis and design tasks, this workshop included practices designed to enhance student capabilities related to non-technical knowledge areas, such as critical thinking, interpersonal and team skills, and business understanding. Each task…
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
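The reachability-tree idea underlying the technique can be illustrated with an untimed sketch: breadth-first enumeration of the markings of a small Petri net. This deliberately omits the timing intervals and clock stamps that the CS-class construction adds; the example net is hypothetical:

```python
from collections import deque

def reachable_markings(m0, transitions):
    """Breadth-first reachability for a small Petri net (untimed sketch).
    m0: initial marking (tuple of token counts per place).
    transitions: list of (consume, produce) token vectors."""
    seen = {m0}
    frontier = deque([m0])
    while frontier:
        m = frontier.popleft()
        for consume, produce in transitions:
            # A transition is enabled if every input place has enough tokens
            if all(m[i] >= consume[i] for i in range(len(m))):
                m2 = tuple(m[i] - consume[i] + produce[i] for i in range(len(m)))
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen

# Hypothetical 3-place net: t1 moves a token p0 -> p1, t2 moves p1 -> p2.
m0 = (1, 0, 0)
transitions = [((1, 0, 0), (0, 1, 0)), ((0, 1, 0), (0, 0, 1))]
marks = reachable_markings(m0, transitions)
```

A TPN analysis attaches firing-interval constraints to each transition, so each node of the tree becomes a state class (marking plus clock inequalities); the CS-class variant additionally stamps clocks so end-to-end delays can be read off directly.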
NOBLE - Flexible concept recognition for large-scale biomedical natural language processing.
Tseytlin, Eugene; Mitchell, Kevin; Legowski, Elizabeth; Corrigan, Julia; Chavan, Girish; Jacobson, Rebecca S
2016-01-14
Natural language processing (NLP) applications are increasingly important in biomedical data analysis, knowledge engineering, and decision support. Concept recognition is an important component task for NLP pipelines, and can be either general-purpose or domain-specific. We describe a novel, flexible, and general-purpose concept recognition component for NLP pipelines, and compare its speed and accuracy against five commonly used alternatives on both a biological and clinical corpus. NOBLE Coder implements a general algorithm for matching terms to concepts from an arbitrary vocabulary set. The system's matching options can be configured individually or in combination to yield specific system behavior for a variety of NLP tasks. The software is open source, freely available, and easily integrated into UIMA or GATE. We benchmarked speed and accuracy of the system against the CRAFT and ShARe corpora as reference standards and compared it to MMTx, MGrep, Concept Mapper, cTAKES Dictionary Lookup Annotator, and cTAKES Fast Dictionary Lookup Annotator. We describe key advantages of the NOBLE Coder system and associated tools, including its greedy algorithm, configurable matching strategies, and multiple terminology input formats. These features provide unique functionality when compared with existing alternatives, including state-of-the-art systems. On two benchmarking tasks, NOBLE's performance exceeded commonly used alternatives, performing almost as well as the most advanced systems. Error analysis revealed differences in error profiles among systems. NOBLE Coder is comparable to other widely used concept recognition systems in terms of accuracy and speed. Advantages of NOBLE Coder include its interactive terminology builder tool, ease of configuration, and adaptability to various domains and tasks. NOBLE provides a term-to-concept matching system suitable for general concept recognition in biomedical NLP pipelines.
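A greedy term-to-concept matcher of the general kind described above can be sketched as follows. This is an illustrative longest-match-first sketch in the spirit of a greedy algorithm, not NOBLE Coder's actual implementation; the tiny vocabulary is hypothetical:

```python
def greedy_concept_match(tokens, vocab):
    """Greedy longest-match concept recognition.
    vocab maps lower-cased term tuples to concept IDs.
    Returns (start, end, concept_id) spans over the token list."""
    max_len = max(len(t) for t in vocab)
    matches, i = [], 0
    while i < len(tokens):
        for n in range(min(max_len, len(tokens) - i), 0, -1):  # longest first
            span = tuple(w.lower() for w in tokens[i:i + n])
            if span in vocab:
                matches.append((i, i + n, vocab[span]))
                i += n                      # consume the matched span
                break
        else:
            i += 1                          # no match: advance one token
    return matches

# Hypothetical two-entry vocabulary (UMLS-style concept IDs).
vocab = {("breast", "cancer"): "C0006142", ("cancer",): "C0006826"}
tokens = "screening for breast cancer and lung cancer".split()
found = greedy_concept_match(tokens, vocab)
```

Because the longer term "breast cancer" is tried before "cancer", the matcher maps the two-token span to the more specific concept, which is the key property of greedy matching that real systems tune via configurable matching strategies.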
Human factors evaluation of teletherapy: Function and task analysis. Volume 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaye, R.D.; Henriksen, K.; Jones, R.
1995-07-01
As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.
Advanced technology cogeneration system conceptual design study: Closed cycle gas turbines
NASA Technical Reports Server (NTRS)
Mock, E. A. T.; Daudet, H. C.
1983-01-01
The results of a three task study performed for the Department of Energy under the direction of the NASA Lewis Research Center are documented. The thermal and electrical energy requirements of three specific industrial plants were surveyed and cost records for the energies consumed were compiled. Preliminary coal fired atmospheric fluidized bed heated closed cycle gas turbine and steam turbine cogeneration system designs were developed for each industrial plant. Preliminary cost and return-on-equity values were calculated and the results compared. The best of the three sites was selected for more detailed design and evaluation of both closed cycle gas turbine and steam turbine cogeneration systems during Task II. Task III involved characterizing the industrial sector electrical and thermal loads for the 48 contiguous states, applying a family of closed cycle gas turbine and steam turbine cogeneration systems to these loads, and conducting a market penetration analysis of the closed cycle gas turbine cogeneration system.
Benefits of Matching Domain Structure for Planning Software: The Right Stuff
NASA Technical Reports Server (NTRS)
Billman, Dorrit Owen; Arsintescu, Lucica; Feary, Michael S.; Lee, Jessica Chia-Rong; Smith, Asha Halima; Tiwary, Rachna
2011-01-01
We investigated the role of domain structure in software design. We compared two planning applications for a Mission Control group (International Space Station) and measured users' speed and accuracy. Based on our needs analysis, we identified domain structure and used this to develop new prototype software that matched domain structure better than the legacy system. We took a high-fidelity analog of the natural task into the laboratory and found (large) performance differences, favoring the system that matched domain structure. Our task design enabled us to attribute better performance to better match of domain structure. We ran through the whole development cycle, in miniature, from needs analysis through design, development, and evaluation. Doing so enabled inferences not just about the particular systems compared, but also provided evidence for the viability of the design process (particularly needs analysis) that we are exploring.
Robust Resilience of the Frontotemporal Syntax System to Aging
Samu, Dávid; Davis, Simon W.; Geerligs, Linda; Mustafa, Abdur; Tyler, Lorraine K.
2016-01-01
Brain function is thought to become less specialized with age. However, this view is largely based on findings of increased activation during tasks that fail to separate task-related processes (e.g., attention, decision making) from the cognitive process under examination. Here we take a systems-level approach to separate processes specific to language comprehension from those related to general task demands and to examine age differences in functional connectivity both within and between those systems. A large population-based sample (N = 111; 22–87 years) from the Cambridge Centre for Aging and Neuroscience (Cam-CAN) was scanned using functional MRI during two versions of an experiment: a natural listening version in which participants simply listened to spoken sentences and an explicit task version in which they rated the acceptability of the same sentences. Independent components analysis across the combined data from both versions showed that although task-free language comprehension activates only the auditory and frontotemporal (FTN) syntax networks, performing a simple task with the same sentences recruits several additional networks. Remarkably, functionality of the critical FTN is maintained across age groups, showing no difference in within-network connectivity or responsivity to syntactic processing demands despite gray matter loss and reduced connectivity to task-related networks. We found no evidence for reduced specialization or compensation with age. Overt task performance was maintained across the lifespan and performance in older, but not younger, adults related to crystallized knowledge, suggesting that decreased between-network connectivity may be compensated for by older adults' richer knowledge base. SIGNIFICANCE STATEMENT Understanding spoken language requires the rapid integration of information at many different levels of analysis. Given the complexity and speed of this process, it is remarkably well preserved with age. 
Although previous work claims that this preserved functionality is due to compensatory activation of regions outside the frontotemporal language network, we use a novel systems-level approach to show that these “compensatory” activations simply reflect age differences in response to experimental task demands. Natural, task-free language comprehension solely recruits auditory and frontotemporal networks, the latter of which is similarly responsive to language-processing demands across the lifespan. These findings challenge the conventional approach to neurocognitive aging by showing that the neural underpinnings of a given cognitive function depend on how you test it. PMID:27170120
NASA Astrophysics Data System (ADS)
Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.
2016-08-01
Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons pertaining to the topic 'blood and circulatory system'. Two fundamental characteristics used to analyze tasks include: (1) required cognitive level of processing (e.g. low level information processing: repetition, summary, define, classify; and high level information processing: interpret-analyze data, formulate hypothesis, etc.) and (2) complexity of task content (e.g. if tasks require use of factual, linking or concept level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. Presented findings encourage the use of high-level cognitive processing tasks in biology instruction.
Early Training Estimation System (ETES). Appendix F. User’s Guide
1984-06-01
When comparable tasks are available, only the training program elements must be estimated. The guide thus covers adding comparability analysis procedures to the data base management capabilities of the SDT and conducting trade-off studies of proposed solutions to identified training problems.
STINGRAY: system for integrated genomic resources and analysis.
Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R
2014-03-07
The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system can be faster with Sanger data, since large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.
Doytchev, Doytchin E; Szwillus, Gerd
2009-11-01
Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation.
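Fault Tree Analysis combines basic-event probabilities through AND/OR gates to estimate the likelihood of a top event such as the basement flooding. The toy gate structure and probabilities below are assumptions for illustration, not the paper's actual incident model.

```python
# Toy fault-tree evaluation; gate structure and probabilities are invented.

def or_gate(*probs):
    """P(at least one event) assuming independent basic events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """P(all events) assuming independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical events: valve left open (human error) AND alarm missed,
# OR a pump seal failure on its own.
p_flood = or_gate(and_gate(0.05, 0.2), 0.001)
```

The value of combining FTA with TA/HEIST is visible even in this sketch: the dominant cut set is the human-error branch, which the task-analysis methods then examine in detail.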
Dynamic analysis methods for detecting anomalies in asynchronously interacting systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Akshat; Solis, John Hector; Matschke, Benjamin
2014-01-01
Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We develop upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.
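The first approach amounts to a run-time monitor that samples trace information, checks a design-level property, and triggers a fail-safe on violation. A minimal sketch, assuming a trace is simply a sequence of sampled states and the property a predicate over states (both invented here):

```python
# Minimal run-time property monitor; property and trace format are illustrative.

def make_monitor(safety_property, on_violation):
    """Return a checker that scans a sampled trace against a safety predicate."""
    def check(trace):
        for step, state in enumerate(trace):
            if not safety_property(state):
                on_violation(step, state)  # fail-safe / notification hook
                return False
        return True
    return check

violations = []
# Assumed design-level property: a sampled queue length never exceeds 8.
monitor = make_monitor(lambda s: s <= 8,
                       lambda step, s: violations.append((step, s)))
ok = monitor([1, 3, 5, 9, 2])  # state 9 at step 3 violates the property
```

A real deployment would query the implementation for traces at random, as the paper describes, rather than receive them as a list.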
Feng, Chuan; Rozenblit, Jerzy W; Hamilton, Allan J
2010-11-01
Surgeons performing laparoscopic surgery have strong biases regarding the quality and nature of the laparoscopic video monitor display. In a comparative study, we used a unique computerized sensing and analysis system to evaluate the various types of monitors employed in laparoscopic surgery. We compared the impact of different types of monitor displays on an individual's performance of a laparoscopic training task which required the subject to move the instrument to a set of targets. Participants (varying from no laparoscopic experience to board-certified surgeons) were asked to perform the assigned task while using all three display systems, which were randomly assigned: a conventional laparoscopic monitor system (2D), a high-definition monitor system (HD), and a stereoscopic display (3D). The effects of monitor system on various performance parameters (total time consumed to finish the task, average speed, and movement economy) were analyzed by computer. Each of the subjects filled out a subjective questionnaire at the end of their training session. A total of 27 participants completed our study. Performance with the HD monitor was significantly slower than with either the 3D or 2D monitor (p < 0.0001). Movement economy with the HD monitor was significantly reduced compared with the 3D (p < 0.0004) or 2D (p < 0.0001) monitor. In terms of average time required to complete the task, performance with the 3D monitor was significantly faster than with the HD (p < 0.0001) or 2D (p < 0.0086) monitor. However, the HD system was the overwhelming favorite according to subjective evaluation. Computerized sensing and analysis is capable of quantitatively assessing the seemingly minor effect of monitor display on surgical training performance. 
The study demonstrates that, while users expressed a decided preference for HD systems, actual quantitative analysis indicates that HD monitors offer no statistically significant advantage and may even worsen performance compared with standard 2D or 3D laparoscopic monitors.
Neural architecture underlying classification of face perception paradigms.
Laird, Angela R; Riedel, Michael C; Sutherland, Matthew T; Eickhoff, Simon B; Ray, Kimberly L; Uecker, Angela M; Fox, P Mickle; Turner, Jessica A; Fox, Peter T
2015-10-01
We present a novel strategy for deriving a classification system of functional neuroimaging paradigms that relies on hierarchical clustering of experiments archived in the BrainMap database. The goal of our proof-of-concept application was to examine the underlying neural architecture of the face perception literature from a meta-analytic perspective, as these studies include a wide range of tasks. Task-based results exhibiting similar activation patterns were grouped as similar, while tasks activating different brain networks were classified as functionally distinct. We identified four sub-classes of face tasks: (1) Visuospatial Attention and Visuomotor Coordination to Faces, (2) Perception and Recognition of Faces, (3) Social Processing and Episodic Recall of Faces, and (4) Face Naming and Lexical Retrieval. Interpretation of these sub-classes supports an extension of a well-known model of face perception to include a core system for visual analysis and extended systems for personal information, emotion, and salience processing. Overall, these results demonstrate that a large-scale data mining approach can inform the evolution of theoretical cognitive models by probing the range of behavioral manipulations across experimental tasks. Copyright © 2015 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
T. Nakamura; C.L. Senior
Most of the anthropogenic emissions of carbon dioxide result from the combustion of fossil fuels for energy production. Photosynthesis has long been recognized as a means, at least in theory, to sequester anthropogenic carbon dioxide. Aquatic microalgae have been identified as fast growing species whose carbon fixing rates are higher than those of land-based plants by one order of magnitude. Physical Sciences Inc. (PSI), Aquasearch, and the Hawaii Natural Energy Institute at the University of Hawaii are jointly developing technologies for recovery and sequestration of CO{sub 2} from stationary combustion systems by photosynthesis of microalgae. The research is aimed primarily at demonstrating the ability of selected species of microalgae to effectively fix carbon from typical power plant exhaust gases. This report covers the reporting period 1 October 2000 to 31 March 2005 in which PSI, Aquasearch and University of Hawaii conducted their tasks. This report discusses results of the work pertaining to five tasks: Task 1--Supply of CO2 from Power Plant Flue Gas to Photobioreactor; Task 2--Selection of Microalgae; Task 3--Optimization and Demonstration of Industrial Scale Photobioreactor; Task 4--Carbon Sequestration System Design; and Task 5--Economic Analysis. A summary conclusion is presented based on the work conducted in each task.
Analysis of Air Traffic Track Data with the AutoBayes Synthesis System
NASA Technical Reports Server (NTRS)
Schumann, Johann Martin Philip; Cate, Karen; Lee, Alan G.
2010-01-01
The Next Generation Air Traffic System (NGATS) aims to provide substantial computer support for air traffic controllers. Algorithms for the accurate prediction of aircraft movements are of central importance for such software systems, but trajectory prediction has to work reliably in the presence of unknown parameters and uncertainties. We are using the AutoBayes program synthesis system to generate customized data analysis algorithms that process large sets of aircraft radar track data in order to estimate parameters and uncertainties. In this paper, we present how the tasks of finding structure in track data, estimation of important parameters in climb trajectories, and the detection of continuous descent approaches can be accomplished with compact task-specific AutoBayes specifications. We present an overview of the AutoBayes architecture and describe how its schema-based approach generates customized analysis algorithms, documented C/C++ code, and detailed mathematical derivations. Results of experiments with actual air traffic control data are discussed.
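AutoBayes synthesizes estimators like these automatically from declarative statistical specifications; the hand-written stand-in below shows only the simplest instance of the estimation tasks involved, closed-form maximum-likelihood fitting of a Gaussian to climb-rate samples. The data values are invented.

```python
# Hand-written stand-in for an AutoBayes-style estimation task (illustrative).

def mle_gaussian(samples):
    """Closed-form maximum-likelihood estimates for a univariate Gaussian."""
    n = len(samples)
    mean = sum(samples) / n
    # ML variance divides by n (not n-1, which would be the unbiased estimator).
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, var

# Invented climb-rate samples in ft/min.
climb_rates = [2000.0, 2100.0, 1900.0, 2050.0, 1950.0]
mu, sigma2 = mle_gaussian(climb_rates)
```

The real system goes well beyond this, e.g. synthesizing mixture-model clustering code to find structure in track data, together with its mathematical derivation.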
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-06-01
The conceptual design of an advanced central receiver power system using liquid sodium as a heat transport medium has been completed by a team consisting of the Energy Systems Group (prime contractor), McDonnell Douglas, Stearns-Roger, The University of Houston, and Salt River Project. The purpose of this study was to determine the technical and economic advantages of this concept for commercial-scale power plants. This final report covers all tasks of the project. These tasks were as follows: (1) review and analysis of preliminary specification; (2) parametric analysis; (3) select commercial configuration; (4) commercial plant conceptual design; (5) assessment of commercial plant; (6) advanced central receiver power system development plan; (7) program plan; (8) reports and data; (9) program management; and (10) safety analysis. A programmatic overview of the accomplishments of this program is given. The 100-MW conceptual commercial plant, the 281-MW optimum plant, and the 10-MW pilot plant are described.
On the acquisition and representation of procedural knowledge
NASA Technical Reports Server (NTRS)
Saito, T.; Ortiz, C.; Loftin, R. B.
1992-01-01
Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, with a focus on knowledge acquisition for procedural tasks and special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge - TARGET (Task Analysis and Rule Generation Tool) - is described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulties, and costs of developing knowledge-based systems for the performance of procedural tasks.
NASA Technical Reports Server (NTRS)
Vigeant-Langlois, Laurence; Hansman, R. John, Jr.
2003-01-01
The objective of this project was to propose a means to improve aviation weather information and training procedures based on a human-centered systems approach. The methodology comprised a cognitive analysis of pilots' tasks, a trajectory-based approach to weather information, contingency planning support, and implications for improving weather information.
Multilevel semantic analysis and problem-solving in the flight domain
NASA Technical Reports Server (NTRS)
Chien, R. T.; Chen, D. C.; Ho, W. P. C.; Pan, Y. C.
1982-01-01
A computer based cockpit system which is capable of assisting the pilot in such important tasks as monitoring, diagnosis, and trend analysis was developed. The system is properly organized and is endowed with a knowledge base so that it enhances the pilot's control over the aircraft while simultaneously reducing his workload.
CMS distributed data analysis with CRAB3
Mascheroni, M.; Balcas, J.; Belforte, S.; ...
2015-12-23
The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
NASA Technical Reports Server (NTRS)
1983-01-01
Mission scenario analysis and architectural concepts, alternative systems concepts, mission operations and architectural development, architectural analysis trades, evolution, configuration, and technology development are assessed.
Pressure control and analysis report: Hydrogen Thermal Test Article (HTTA)
NASA Technical Reports Server (NTRS)
1971-01-01
Tasks accomplished during the HTTA Program study period included: (1) performance of a literature review to provide system guidelines; (2) development of analytical procedures needed to predict system performance; (3) design and analysis of the HTTA pressurization system considering (a) future utilization of results in the design of a spacecraft maneuvering system propellant package, (b) ease of control and operation, (c) system safety, and (d) hardware cost; and (4) making conclusions and recommendations for systems design.
Parametric analysis of closed cycle magnetohydrodynamic (MHD) power plants
NASA Technical Reports Server (NTRS)
Owens, W.; Berg, R.; Murthy, R.; Patten, J.
1981-01-01
A parametric analysis of closed cycle MHD power plants was performed which studied the technical feasibility, associated capital cost, and cost of electricity for the direct combustion of coal or coal derived fuel. Three reference plants, differing primarily in the method of coal conversion utilized, were defined. Reference Plant 1 used direct coal fired combustion, while Reference Plants 2 and 3 employed on-site integrated gasifiers. Reference Plant 2 used a pressurized gasifier while Reference Plant 3 used a "state-of-the-art" atmospheric gasifier. Thirty plant configurations were considered by using parametric variations from the Reference Plants. Parametric variations include the type of coal (Montana Rosebud or Illinois No. 6), cleanup systems (hot or cold gas cleanup), one- or two-stage atmospheric or pressurized direct fired coal combustors, and six different gasifier systems. Plant sizes ranged from 100 to 1000 MWe. Overall plant performance was calculated using two methodologies. In one task, the channel performance was assumed and the MHD topping cycle efficiencies were based on the assumed values. A second task involved rigorous calculations of channel performance (enthalpy extraction, isentropic efficiency, and generator output) that verified the original (task one) assumptions. Closed cycle MHD capital costs were estimated for the task one plants; task two cost estimates were made for the channel and magnet only.
Decaf: Decoupled Dataflows for In Situ High-Performance Workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreher, M.; Peterka, T.
Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
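The message-driven rule, that a task executes only once a message has arrived on every one of its inputs, can be sketched in a few lines. This is a toy illustration with an invented graph and payloads; Decaf's real runtime uses parallel MPI message passing and a different Python API.

```python
# Toy message-driven task firing rule (illustrative; not Decaf's API).

class MessageDrivenTask:
    def __init__(self, name, inputs, action):
        self.name, self.action = name, action
        self.pending = set(inputs)  # input ports still awaiting a message
        self.mailbox = {}

    def deliver(self, port, payload):
        """Record a message; fire the task once all inputs have arrived."""
        self.mailbox[port] = payload
        self.pending.discard(port)
        if not self.pending:        # all messages received: execute the task
            return self.action(self.mailbox)
        return None                 # still waiting on other inputs

# Hypothetical coupling: an analysis task needs both simulation output
# and a parameter set before it can run.
analysis = MessageDrivenTask("density", {"sim", "params"},
                             lambda mb: mb["sim"] * mb["params"])
first = analysis.deliver("sim", 10)     # not ready yet
result = analysis.deliver("params", 4)  # both inputs present: task fires
```

Because firing depends only on received messages rather than on graph order, the same rule accommodates the cyclic dependencies the abstract mentions, e.g. a steering message feeding back into an upstream task.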
Design, engineering and evaluation of refractory liners for slagging gasifiers. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
deTineo, B J; Booth, G; Firestone, R F
1982-08-01
The contract for this program was awarded at the end of September 1978. Work was started on 1 October 1978 on Tasks A, B, and E. Task A, Conceptual Liner Designs, and Task B, Test System Design and Construction, were completed. Task C, Liner Tests, and Task D, Liner Design Evaluation, were to begin upon completion of Task B. Task E, Liner Model Development, is inactive after an initial data compilation and theoretical model development effort. It was to be activated as soon as data were available from Task D. Task F, Liner Design Handbook, was active along with Task A, since the reports of both tasks were to use the same format. At this time, Tasks C, D, and F are not to be completed, since funding of this project was phased out by DOE directive. The refractory test facility, which was constructed, was tested and found to perform satisfactorily. It is described in detail, including a hazard analysis which was performed.
An fMRI study of sex differences in regional activation to a verbal and a spatial task.
Gur, R C; Alsop, D; Glahn, D; Petty, R; Swanson, C L; Maldjian, J A; Turetsky, B I; Detre, J A; Gee, J; Gur, R E
2000-09-01
Sex differences in cognitive performance have been documented, women performing better on some phonological tasks and men on spatial tasks. An earlier fMRI study suggested sex differences in distributed brain activation during phonological processing, with bilateral activation seen in women while men showed primarily left-lateralized activation. This blood oxygen level-dependent fMRI study examined sex differences (14 men, 13 women) in activation for a spatial task (judgment of line orientation) compared to a verbal-reasoning task (analogies) that does not typically show sex differences. Task difficulty was manipulated. Hypothesized ROI-based analysis documented the expected left-lateralized changes for the verbal task in the inferior parietal and planum temporale regions in both men and women, but only men showed right-lateralized increase for the spatial task in these regions. Image-based analysis revealed a distributed network of cortical regions activated by the tasks, which consisted of the lateral frontal, medial frontal, mid-temporal, occipitoparietal, and occipital regions. The activation was more left-lateralized for the verbal task and more right-lateralized for the spatial task, but men also showed some left activation for the spatial task, which was not seen in women. Increased task difficulty produced more distributed activation for the verbal and more circumscribed activation for the spatial task. The results suggest that failure to activate the appropriate hemisphere in regions directly involved in task performance may explain certain sex differences in performance. They also extend, for a spatial task, the principle that bilateral activation in a distributed cognitive system underlies sex differences in performance. Copyright 2000 Academic Press.
Power Consumption Analysis of Operating Systems for Wireless Sensor Networks
Lajara, Rafael; Pelegrí-Sebastiá, José; Perez Solano, Juan J.
2010-01-01
In this paper four wireless sensor network operating systems are compared in terms of power consumption. The analysis takes into account the most common operating systems—TinyOS v1.0, TinyOS v2.0, Mantis and Contiki—running on Tmote Sky and MICAz devices. With the objective of ensuring a fair evaluation, a benchmark composed of four applications has been developed, covering the most typical tasks that a Wireless Sensor Network performs. The results show the instant and average current consumption of the devices during the execution of these applications. The experimental measurements provide a good insight into the power mode in which the device components are running at every moment, and they can be used to compare the performance of different operating systems executing the same tasks. PMID:22219688
Lee, E C; Rafiq, A; Merrell, R; Ackerman, R; Dennerlein, J T
2005-08-01
Minimally invasive surgical techniques expose surgeons to a variety of occupational hazards that may promote musculoskeletal disorders. Telerobotic systems for minimally invasive surgery may help to reduce these stressors. The objective of this study was to compare manual and telerobotic endoscopic surgery in terms of postural and mental stress. Thirteen participants with no experience as primary surgeons in endoscopic surgery performed a set of simulated surgical tasks using two different techniques: a telerobotic master-slave system and a manual endoscopic surgery system. The tasks consisted of passing a soft spherical object through a series of parallel rings, suturing along a line 5-cm long, running a 32-in ribbon, and cannulation. The Job Strain Index (JSI) and Rapid Upper Limb Assessment (RULA) were used to quantify upper extremity exposure to postural and force risk factors. Task duration was quantified in seconds. A questionnaire provided measures of the participants' intuitiveness and mental stress. The JSI and RULA scores for all four tasks were significantly lower for the telerobotic technique than for the manual one. Task duration was significantly longer for telerobotic than for manual tasks. Participants reported that the telerobotic technique was as intuitive as, and no more stressful than, the manual technique. Given identical tasks, the time to completion is longer using the telerobotic technique than its manual counterpart. For the given simulated tasks in the laboratory setting, the better scores for the upper extremity postural analysis indicate that telerobotic surgery provides a more comfortable environment for the surgeon without any additional mental stress.
Vocational Teacher Stress and the Educational System.
ERIC Educational Resources Information Center
Adams, Elaine; Heath-Camp, Betty; Camp, William G.
1999-01-01
A multiple regression analysis of data from 235 secondary vocational teachers in Virginia found that educational system-related variables explained most teacher stress. The most important explanatory variables were task stress and role overload. (SK)
Space station data system analysis/architecture study. Task 4: System definition report. Appendix
NASA Technical Reports Server (NTRS)
1985-01-01
Appendices to the systems definition study for the space station Data System are compiled. Supplemental information on external interface specification, simulation and modeling, and function design characteristics is presented along with data flow diagrams, a data dictionary, and function allocation matrices.
Application of Shuttle EVA Systems to Payloads. Volume 2: Payload EVA Task Completion Plans
NASA Technical Reports Server (NTRS)
1976-01-01
Candidate payload tasks for EVA application were identified and selected, based on an analysis of four representative space shuttle payloads, and typical EVA scenarios with supporting crew timelines and procedures were developed. The EVA preparations and post EVA operations, as well as the timelines emphasizing concurrent payload support functions, were also summarized.
THE DEVELOPMENT OF TRAINING OBJECTIVES.
ERIC Educational Resources Information Center
SMITH, ROBERT G., JR.
A six-step process is described for defining job-relevant objectives for the training of military personnel. (1) A form of system analysis is outlined to provide the context for the study of a particular military occupation specialty. (2) A task inventory is made of the major duties in the job and the more specific job tasks associated with each…
MANPRINT Methods Monograph: Aiding the Development of Manpower-Based System Evaluation
1989-06-01
[Abstract garbled in extraction. Recoverable fragments describe a helicopter mission scenario (popping up from below tree level in a zone with known threats, VFR weather conditions), tabulated maintenance task times (inspect/replace hours for engine components such as the camshaft and connecting bearing), and the use of a matrix-based cluster analysis whose output is a hierarchical cluster tree (taxonomy) of system tasks.]
Training Impact Analysis for Land Warrior Block II
2006-01-01
[Abstract garbled in extraction. Recoverable fragments reference instructor requirements, Infantry OSUT training company sizes, Land Warrior (LW) systems to equip trainees, and tactical employment tasks/skills (including leader tasks and hand grenade training) taught at low, medium, and high expertise levels.]
Conversion of Questionnaire Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log normal scale typically used in human error analysis (see A.D. Swain and H.E. Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk of failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data is absent.
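The adjectival-to-numeric conversion described in the abstract above can be sketched as a lookup onto an order-of-magnitude probability ladder. The specific numeric values below are illustrative assumptions in the spirit of the log-normal HEP scales of Swain and Guttmann, not figures taken from the report:

```python
# Hypothetical mapping from questionnaire ratings to basic-event failure
# probabilities. The order-of-magnitude ladder is an assumption for
# illustration; the report derives its values from NUREG/CR-1278 scales.
RATING_TO_FAILURE_PROB = {
    "well": 1e-3,               # near-perfect performance: near-zero failure risk
    "adequate": 1e-2,           # minor deficiency
    "needs improvement": 1e-1,  # significant deficiency
    "not performed": 1.0,       # task in a state of failure
}

def basic_event_probability(rating: str) -> float:
    """Convert an adjectival questionnaire response to a numeric
    failure probability for use in a PRA fault tree."""
    return RATING_TO_FAILURE_PROB[rating.strip().lower()]
```

Each basic event in the fault tree would then carry the probability produced by this conversion rather than the raw adjectival response.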
FY17 Status Report on the Computing Systems for the Yucca Mountain Project TSPA-LA Models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Appel, Gordon John
Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and knowledge capability for total system performance assessment (TSPA) type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014), Hadgu et al. (2015) and Hadgu and Appel (2016). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5, 11.1 and 12.0 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest Version 12.0 and addressed DLL-related issues observed in the FY16 work. The model upgrade task successfully converted the Nominal Modeling case to GoldSim Versions 11.1/12.0. Conversions of the rest of the TSPA models were also attempted, but program and operational difficulties precluded this. Upgrade of the remaining modeling cases and distributed processing tasks is expected to continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.
Examples of finite element mesh generation using SDRC IDEAS
NASA Technical Reports Server (NTRS)
Zapp, John; Volakis, John L.
1990-01-01
IDEAS (Integrated Design Engineering Analysis Software) offers a comprehensive package for mechanical design engineers. Due to its multifaceted capabilities, however, it can also be adapted to serve the needs of electrical engineers. IDEAS can be used to perform the following tasks: system modeling, system assembly, kinematics, finite element pre/post processing, finite element solution, system dynamics, drafting, test data analysis, and project relational database.
Mirel, Barbara; Eichinger, Felix; Keller, Benjamin J; Kretzler, Matthias
2011-03-21
Bioinformatics visualization tools are often not robust enough to support biomedical specialists’ complex exploratory analyses. Tools need to accommodate the workflows that scientists actually perform for specific translational research questions. To understand and model one of these workflows, we conducted a case-based, cognitive task analysis of a biomedical specialist’s exploratory workflow for the question: What functional interactions among gene products of high throughput expression data suggest previously unknown mechanisms of a disease? From our cognitive task analysis four complementary representations of the targeted workflow were developed. They include: usage scenarios, flow diagrams, a cognitive task taxonomy, and a mapping between cognitive tasks and user-centered visualization requirements. The representations capture the flows of cognitive tasks that led a biomedical specialist to inferences critical to hypothesizing. We created representations at levels of detail that could strategically guide visualization development, and we confirmed this by making a trial prototype based on user requirements for a small portion of the workflow. Our results imply that visualizations should make available to scientific users “bundles of features” consonant with the compositional cognitive tasks purposefully enacted at specific points in the workflow. We also highlight certain aspects of visualizations that: (a) need more built-in flexibility; (b) are critical for negotiating meaning; and (c) are necessary for essential metacognitive support.
WISP information display system user's manual
NASA Technical Reports Server (NTRS)
Alley, P. L.; Smith, G. R.
1978-01-01
The wind shears program (WISP) supports the collection of data on magnetic tape for permanent storage or analysis. The document structure provides: (1) the hardware and software configuration required to execute the WISP system and start up procedure from a power down condition; (2) data collection task, calculations performed on the incoming data, and a description of the magnetic tape format; (3) the data display task and examples of displays obtained from execution of the real time simulation program; and (4) the raw data dump task and examples of operator actions required to obtained the desired format. The procedures outlines herein will allow continuous data collection at the expense of real time visual displays.
Meta-T: Tetris® as an experimental paradigm for cognitive skills research.
Lindstedt, John K; Gray, Wayne D
2015-12-01
Studies of human performance in complex tasks using video games are an attractive prospect, but many existing games lack a comprehensive way to modify the game and track performance beyond basic levels of analysis. Meta-T provides experimenters a tool to study behavior in a dynamic task environment with time-stressed decision-making and strong perceptual-motor elements, offering a host of experimental manipulations with a robust and detailed logging system for all user events, system events, and screen objects. Its experimenter-friendly interface provides control over detailed parameters of the task environment without need for programming expertise. Support for eye-tracking and computational cognitive modeling extend the paradigm's scope.
Longo, Alessia; Federolf, Peter; Haid, Thomas; Meulenbroek, Ruud
2018-06-01
In many daily jobs, repetitive arm movements are performed for extended periods of time under continuous cognitive demands. Even highly monotonous tasks exhibit an inherent motor variability and subtle fluctuations in movement stability. Variability and stability are different aspects of system dynamics, whose magnitude may be further affected by a cognitive load. Thus, the aim of the study was to explore and compare the effects of a cognitive dual task on the variability and local dynamic stability in a repetitive bimanual task. Thirteen healthy volunteers performed the repetitive motor task with and without a concurrent cognitive task of counting aloud backwards in multiples of three. Upper-body 3D kinematics were collected and postural reconfigurations-the variability related to the volunteer's postural change-were determined through a principal component analysis-based procedure. Subsequently, the most salient component was selected for the analysis of (1) cycle-to-cycle spatial and temporal variability, and (2) local dynamic stability as reflected by the largest Lyapunov exponent. Finally, end-point variability was evaluated as a control measure. The dual cognitive task proved to increase the temporal variability and reduce the local dynamic stability, marginally decrease endpoint variability, and substantially lower the incidence of postural reconfigurations. Particularly, the latter effect is considered to be relevant for the prevention of work-related musculoskeletal disorders since reduced variability in sustained repetitive tasks might increase the risk of overuse injuries.
NASA Technical Reports Server (NTRS)
Horst, Richard L.; Mahaffey, David L.; Munson, Robert C.
1989-01-01
The present Phase 2 small business innovation research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general purpose, computerized, workload assessment system that can function in simulators such as the ACFS.
Icon and user interface design for emergency medical information systems: a case study.
Salman, Y Batu; Cheng, Hong-In; Patterson, Patrick E
2012-01-01
A usable medical information system should allow for reliable and accurate interaction between users and the system in emergencies. A participatory design approach was used to develop a medical information system in two Turkish hospitals. The process consisted of task and user analysis, an icon design survey, initial icon design, final icon design and evaluation, and installation of the iconic medical information system with the icons. We observed work sites to note working processes and tasks related to the information system and interviewed medical personnel. Emergency personnel then participated in the design process to develop a usable graphical user interface, by drawing icon sketches for 23 selected tasks. Similar sketches were requested for specific tasks such as family medical history, contact information, translation, addiction, required inspections, requests and applications, and nurse observations. The sketches were analyzed and redesigned into computer icons by professional designers and the research team. A second group of physicians and nurses then tested the understandability of the icons. The user interface layout was examined and evaluated by system users, followed by the system's installation. Medical personnel reported the participatory design process was interesting and believed the resulting designs would be more familiar and friendlier. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Reliability of drivers in urban intersections.
Gstalter, Herbert; Fastenmeier, Wolfgang
2010-01-01
The concept of human reliability has been widely used in industrial settings by human factors experts to optimise the person-task fit. Reliability is estimated by the probability that a task will successfully be completed by personnel in a given stage of system operation. Human Reliability Analysis (HRA) is a technique used to calculate human error probabilities as the ratio of errors committed to the number of opportunities for that error. To transfer this notion to the measurement of car driver reliability the following components are necessary: a taxonomy of driving tasks, a definition of correct behaviour in each of these tasks, a list of errors as deviations from the correct actions and an adequate observation method to register errors and opportunities for these errors. Use of the SAFE-task analysis procedure recently made it possible to derive driver errors directly from the normative analysis of behavioural requirements. Driver reliability estimates could be used to compare groups of tasks (e.g. different types of intersections with their respective regulations) as well as groups of drivers' or individual drivers' aptitudes. This approach was tested in a field study with 62 drivers of different age groups. The subjects drove an instrumented car and had to complete an urban test route, the main features of which were 18 intersections representing six different driving tasks. The subjects were accompanied by two trained observers who recorded driver errors using standardized observation sheets. Results indicate that error indices often vary between both the age group of drivers and the type of driving task. The highest error indices occurred in the non-signalised intersection tasks and the roundabout, which exactly equals the corresponding ratings of task complexity from the SAFE analysis. A comparison of age groups clearly shows the disadvantage of older drivers, whose error indices in nearly all tasks are significantly higher than those of the other groups. 
The vast majority of these errors could be explained by high task load in the intersections, as they represent difficult tasks. The discussion shows how reliability estimates can be used in a constructive way to propose changes in car design, intersection layout and regulation as well as driver training.
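The error-index computation defined in the abstract above (human error probability as the ratio of errors committed to opportunities for error) can be sketched in a few lines. The example counts are hypothetical, not data from the study:

```python
def error_index(errors: int, opportunities: int) -> float:
    """Human error probability as the ratio of errors committed to the
    number of opportunities for that error (the HRA definition above)."""
    if opportunities <= 0:
        raise ValueError("opportunities must be positive")
    return errors / opportunities

# Hypothetical example: a driver group committing 27 errors across
# 180 opportunities at non-signalised intersections.
group_index = error_index(27, 180)
```

Comparing such indices across task groups (intersection types) or driver groups (age bands) is exactly the use the abstract describes.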
Probabilistic fault tree analysis of a radiation treatment system.
Ekaette, Edidiong; Lee, Robert C; Cooke, David L; Iftody, Sandra; Craighead, Peter
2007-12-01
Inappropriate administration of radiation for cancer treatment can result in severe consequences such as premature death or appreciably impaired quality of life. There has been little study of vulnerable treatment process components and their contribution to the risk of radiation treatment (RT). In this article, we describe the application of probabilistic fault tree methods to assess the probability of radiation misadministration to patients at a large cancer treatment center. We conducted a systematic analysis of the RT process that identified four process domains: Assessment, Preparation, Treatment, and Follow-up. For the Preparation domain, we analyzed possible incident scenarios via fault trees. For each task, we also identified existing quality control measures. To populate the fault trees we used subjective probabilities from experts and compared results with incident report data. Both the fault tree and the incident report analysis revealed simulation tasks to be most prone to incidents, and the treatment prescription task to be least prone to incidents. The probability of a Preparation domain incident was estimated to be in the range of 0.1-0.7% based on incident reports, which is comparable to the mean value of 0.4% from the fault tree analysis using probabilities from the expert elicitation exercise. In conclusion, an analysis of part of the RT system using a fault tree populated with subjective probabilities from experts was useful in identifying vulnerable components of the system, and provided quantitative data for risk management.
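A fault tree like the one described above combines basic-event probabilities through logic gates; for an OR gate over independent basic events, the top-event probability is one minus the product of the complements. The event probabilities below are made up for illustration and are not the study's elicited values:

```python
from math import prod

def or_gate(probs):
    """Probability that at least one independent basic event occurs:
    a standard OR-gate evaluation in fault tree analysis."""
    return 1.0 - prod(1.0 - p for p in probs)

# Illustrative (hypothetical) basic-event failure probabilities for
# three Preparation-domain tasks; for small probabilities the OR-gate
# result is close to their simple sum.
top_event = or_gate([0.001, 0.002, 0.001])
```

With subjective probabilities from expert elicitation plugged in at the leaves, this kind of calculation yields the domain-level incident probability (the 0.4% Preparation-domain mean reported above).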
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-01-01
Distributed Computing has achieved tremendous development since cloud computing was proposed in 2006, and played a vital role promoting rapid growth of data collecting and analysis models, e.g., Internet of things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation on execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data of each task have drawn interests with detailed analysis report being made. According to the results, the prediction accuracy of concurrent tasks’ execution time can be improved, in particular for some regular jobs. PMID:27589753
Straus, S G; McGrath, J E
1994-02-01
The authors investigated the hypothesis that as group tasks pose greater requirements for member interdependence, communication media that transmit more social context cues will foster group performance and satisfaction. Seventy-two 3-person groups of undergraduate students worked in either computer-mediated or face-to-face meetings on 3 tasks with increasing levels of interdependence: an idea-generation task, an intellective task, and a judgment task. Results showed few differences between computer-mediated and face-to-face groups in the quality of the work completed but large differences in productivity favoring face-to-face groups. Analysis of productivity and of members' reactions supported the predicted interaction of tasks and media, with greater discrepancies between media conditions for tasks requiring higher levels of coordination. Results are discussed in terms of the implications of using computer-mediated communications systems for group work.
Computational models of the Posner simple and choice reaction time tasks
Feher da Silva, Carolina; Baldo, Marcus V. C.
2015-01-01
The landmark experiments by Posner in the late 1970s have shown that reaction time (RT) is faster when the stimulus appears in an expected location, as indicated by a cue; since then, the so-called Posner task has been considered a “gold standard” test of spatial attention. It is thus fundamental to understand the neural mechanisms involved in performing it. To this end, we have developed a Bayesian detection system and small integrate-and-fire neural networks, which modeled sensory and motor circuits, respectively, and optimized them to perform the Posner task under different cue type proportions and noise levels. In doing so, main findings of experimental research on RT were replicated: the relative frequency effect, suboptimal RTs and significant error rates due to noise and invalid cues, slower RT for choice RT tasks than for simple RT tasks, fastest RTs for valid cues and slowest RTs for invalid cues. Analysis of the optimized systems revealed that the employed mechanisms were consistent with related findings in neurophysiology. Our models predict that (1) the results of a Posner task may be affected by the relative frequency of valid and neutral trials, (2) in simple RT tasks, inputs from multiple locations are added together to compose a stronger signal, and (3) the cue affects motor circuits more strongly in choice RT tasks than in simple RT tasks. In discussing the computational demands of the Posner task, attention has often been described as a filter that protects the nervous system, whose capacity is limited, from information overload. Our models, however, reveal that the main problems that must be overcome to perform the Posner task effectively are distinguishing signal from external noise and selecting the appropriate response in the presence of internal noise. PMID:26190997
The person's conception of the structures of developing intellect: early adolescence to middle age.
Demetriou, A; Efklides, A
1989-08-01
According to experiential structuralism, thought abilities have six capacity spheres: experimental, propositional, quantitative, imaginal, qualitative, and metacognitive. The first five are applied to the environment. The metacognitive capacity is applied to the others, serving as the interface between reality and the cognitive system or between any of the other capacities. To test this postulate, 648 subjects aged 12 to 40 years solved eight tasks that were addressed, in pairs, to the first four capacity spheres. One of the tasks in each pair tapped the first and the other the third formal level of the sphere. Having solved the tasks, the subjects were required to rate each pair of tasks in terms of similarity of operations, difficulty, and success of solution. Factor analysis of difficulty and success evaluation scores revealed the same capacity-specific factors as the analysis of performance scores. Factor analysis of similarity scores differentiated between same- and different-sphere pairs. Analysis of variance showed that difficulty and success evaluation scores preserved performance differences between the first and the third formal tasks. Cognitive level, age, socioeconomic status, and sex were related to the metacognitive measures in ways similar to their relations to performance measures. These findings were integrated into a model aimed at capturing real-time metacognitive functioning.
Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.
1997-01-01
This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data including video recording of health care workers as they interact with systems, such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced from work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively-based usability analyses are detailed and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620
Medication Reconciliation: Work Domain Ontology, prototype development, and a predictive model.
Markowitz, Eliz; Bernstam, Elmer V; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R
2011-01-01
Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System's and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load.
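The Keystroke-Level Model (KLM) analysis used to compare the MR tools predicts expert task time by summing fixed operator times over an encoded action sequence. A minimal sketch, using the commonly cited Card-Moran-Newell operator estimates rather than figures from this study, and a hypothetical encoding:

```python
# Keystroke-Level Model time estimate: sum canonical operator times over an
# encoded action sequence. Operator values are the commonly cited
# Card-Moran-Newell estimates (seconds); the example encoding is illustrative.
KLM_TIMES = {
    "K": 0.20,  # keystroke or button press (average skilled typist)
    "P": 1.10,  # point with mouse to a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Total predicted time for a string of operators, e.g. 'MPK'."""
    return sum(KLM_TIMES[op] for op in sequence)

# e.g. select a medication from a list: think, point, click
print(klm_time("MPK"))  # 1.35 + 1.10 + 0.20 = 2.65 s
```

Comparing tools then reduces to encoding each tool's step sequence for the same task and comparing the totals and the count of "M" (mental) operators.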
Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques
2012-01-01
Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices that support activities in environments where work is complex, characterized by interdependence among a large number of variables, understanding how work is done not only takes on even greater importance but also becomes more difficult. Therefore, this study aims to present a method for modeling work in complex systems, one that improves knowledge about the way activities are performed where those activities do not simply happen by executing procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, this work seeks to provide a method that yields a detailed and accurate view of how people perform their tasks, in order to apply information systems for supporting work in organizations.
Introducing a modular activity monitoring system.
Reiss, Attila; Stricker, Didier
2011-01-01
In this paper, the idea of a modular activity monitoring system is introduced. By using different combinations of the system's three modules, different functionality becomes available: 1) a coarse intensity estimation of physical activities, 2) different features based on HR-data, and 3) the recognition of basic activities and postures. 3D-accelerometers--placed on lower arm, chest and foot--and a heart rate monitor were used as sensors. A dataset with 8 subjects and 14 different activities was recorded to evaluate the performance of the system. The overall performance on the intensity estimation task, relying on the chest-worn accelerometer and the HR-monitor, was 94.37%. The overall performance on the activity recognition task, using all three accelerometer placements and the HR-monitor, was 90.65%. This paper also gives an analysis of the importance of different accelerometer placements and the importance of a HR-monitor for both tasks.
Development and evaluation of an instantaneous atmospheric corrosion rate monitor
NASA Astrophysics Data System (ADS)
Mansfeld, F.; Jeanjaquet, S. L.; Kendig, M. W.; Roe, D. K.
1985-06-01
A research program was carried out in which a new instantaneous atmospheric corrosion rate monitor (ACRM) was developed and evaluated, and equipment was constructed which will allow the use of many sensors in an economical way in outdoor exposures. In the first task, the ACRM was developed and tested in flow chambers in which relative humidity and gaseous and particulate pollutant levels can be controlled. Diurnal cycles and periods of rain were simulated. The effects of aerosols were studied. A computerized system was used for collection, storage, and analysis of the electrochemical data. In the second task, a relatively inexpensive electronics system for control of the ACRM and measurement of atmospheric corrosion rates was designed and built. In the third task, calibration of deterioration rates of various metallic and nonmetallic materials with the response of the ACRMs attached to these materials was carried out under controlled environmental conditions using the system developed in the second task. A Quality Assurance project plan was prepared with inputs from the Rockwell International Environmental Monitoring and Service Center and Quality Assurance System audits were performed.
HIGH-PRECISION BIOLOGICAL EVENT EXTRACTION: EFFECTS OF SYSTEM AND OF DATA
Cohen, K. Bretonnel; Verspoor, Karin; Johnson, Helen L.; Roeder, Chris; Ogren, Philip V.; Baumgartner, William A.; White, Elizabeth; Tipney, Hannah; Hunter, Lawrence
2013-01-01
We approached the problems of event detection, argument identification, and negation and speculation detection in the BioNLP’09 information extraction challenge through concept recognition and analysis. Our methodology involved using the OpenDMAP semantic parser with manually written rules. The original OpenDMAP system was updated for this challenge with a broad ontology defined for the events of interest, new linguistic patterns for those events, and specialized coordination handling. We achieved state-of-the-art precision for two of the three tasks, scoring the highest of 24 teams at precision of 71.81 on Task 1 and the highest of 6 teams at precision of 70.97 on Task 2. We provide a detailed analysis of the training data and show that a number of trigger words were ambiguous as to event type, even when their arguments are constrained by semantic class. The data is also shown to have a number of missing annotations. Analysis of a sampling of the comparatively small number of false positives returned by our system shows that major causes of this type of error were failing to recognize second themes in two-theme events, failing to recognize events when they were the arguments to other events, failure to recognize nontheme arguments, and sentence segmentation errors. We show that specifically handling coordination had a small but important impact on the overall performance of the system. The OpenDMAP system and the rule set are available at http://bionlp.sourceforge.net. PMID:25937701
Stability of multifinger action in different state spaces
Reschechtko, Sasha; Zatsiorsky, Vladimir M.; Latash, Mark L.
2014-01-01
We investigated stability of action by a multifinger system with three methods: analysis of intertrial variance, application of transient perturbations, and analysis of the system's motion in different state spaces. The “inverse piano” device was used to apply transient (lifting-and-lowering) perturbations to individual fingers during single- and two-finger accurate force production tasks. In each trial, the perturbation was applied either to a finger explicitly involved in the task or one that was not. We hypothesized that, in one-finger tasks, task-specific stability would be observed in the redundant space of finger forces but not in the nonredundant space of finger modes (commands to explicitly involved fingers). In two-finger tasks, we expected that perturbations applied to a nontask finger would not contribute to task-specific stability in mode space. In contrast to our expectations, analyses in both force and mode spaces showed lower stability in directions that did not change total force output compared with directions that did cause changes in total force. In addition, the transient perturbations led to a significant increase in the enslaving index. We consider these results within a theoretical scheme of control with referent body configurations organized hierarchically, using multiple few-to-many mappings organized in a synergic way. The observed volatility of enslaving, greater equifinality of total force compared with elemental variables, and large magnitude of motor equivalent motion in both force and mode spaces provide support for the concept of task-specific stability of performance and the existence of multiple neural loops, which ensure this stability. PMID:25253478
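The intertrial variance analysis described here typically partitions finger-force variance into a subspace that leaves total force unchanged and its orthogonal complement. A minimal sketch for a two-finger constant-total-force task, as an illustration of the general approach rather than the authors' exact pipeline:

```python
import numpy as np

def ucm_partition(forces):
    """Split intertrial variance of two finger forces into a component that
    leaves total force unchanged (v_ucm) and one that changes it (v_ort).
    forces: (n_trials, 2) array of finger forces across repeated trials."""
    forces = np.asarray(forces, dtype=float)
    dev = forces - forces.mean(axis=0)        # per-trial deviations
    ort = np.array([1.0, 1.0]) / np.sqrt(2)   # direction that changes total force
    proj_ort = dev @ ort                      # projection onto that direction
    v_ort = np.mean(proj_ort ** 2)            # variance in the 1-D orthogonal space
    v_tot = np.mean(np.sum(dev ** 2, axis=1)) # total variance
    v_ucm = v_tot - v_ort                     # remainder lies in the task-irrelevant space
    return v_ucm, v_ort
```

Task-specific stability corresponds to v_ucm exceeding v_ort: trial-to-trial force configurations wander freely in directions that do not affect total force.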
The flight telerobotic servicer Tinman concept: System design drivers and task analysis
NASA Technical Reports Server (NTRS)
Andary, J. F.; Hewitt, D. R.; Hinkal, S. W.
1989-01-01
A study was conducted to develop a preliminary definition of the Flight Telerobotic Servicer (FTS) that could be used to understand the operational concepts and scenarios for the FTS. Called the Tinman, this design concept was also used to begin the process of establishing resources and interfaces for the FTS on Space Station Freedom, the National Space Transportation System shuttle orbiter, and the Orbital Maneuvering Vehicle. Starting with an analysis of the requirements and task capabilities as stated in the Phase B study requirements document, the study identified eight major design drivers for the FTS. Each of these design drivers and their impacts on the Tinman design concept are described. Next, the planning that is currently underway for providing resources for the FTS on Space Station Freedom is discussed, including up to 2000 W of peak power, up to four color video channels, and command and data rates up to 500 kbps between the telerobot and the control station. Finally, an example is presented to show how the Tinman design concept was used to analyze task scenarios and explore the operational capabilities of the FTS. A structured methodology using a standard terminology consistent with the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) was developed for this analysis.
Verification and validation of a Work Domain Analysis with turing machine task analysis.
Rechard, J; Bignon, A; Berruet, P; Morineau, T
2015-03-01
While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism, named "Turing Machine Task Analysis," to verify and validate work domain models. The application of this method to two work domain analyses, one of car driving, which is an "intentional" domain, and the other of a ship water system, which is a "causal" domain, showed the possibility of highlighting improvements needed by these models. More precisely, the step-by-step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, like overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
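The abstract invokes the Turing machine formalism without reproducing it. As a generic reminder of the machinery involved (not the authors' notation), a step-by-step machine that walks a task scenario through a model can be sketched as:

```python
def run_tm(tape, rules, state="q0", head=0, blank="_", halt="halt", max_steps=1000):
    """Minimal Turing machine: rules map (state, symbol) to
    (next_state, write_symbol, move) with move in {-1, 0, +1}."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == halt:
            break
        sym = cells.get(head, blank)
        state, write, move = rules[(state, sym)]
        cells[head] = write
        head += move
    return state, "".join(cells[i] for i in sorted(cells))

# illustrative rule set: unary increment (scan right over 1s, append a 1)
rules = {
    ("q0", "1"): ("q0", "1", +1),
    ("q0", "_"): ("halt", "1", 0),
}
```

In the article's spirit, each state transition plays the role of one step of a degraded task scenario; a scenario that cannot reach a halting state exposes a gap in the work domain model.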
NASA Technical Reports Server (NTRS)
Brown, David B.
1990-01-01
The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.
Front-End Analysis Methods for the Noncommissioned Officer Education System
2013-02-01
The Noncommissioned Officer Education System plays a crucial role in Soldier development by providing both institutional training and structured-self...created challenges with maintaining currency of institutional training. Questions have arisen regarding the optimal placement of tasks as their...relevance changes, especially considering the resources required to update institutional training. An analysis was conducted to identify the
Kirchner, Elsa A; Kim, Su Kyoung
2018-01-01
Event-related potentials (ERPs) are often used in brain-computer interfaces (BCIs) for communication or system control for enhancing or regaining control for motor-disabled persons. Especially results from single-trial EEG classification approaches for BCIs support correlations between single-trial ERP detection performance and ERP expression. Hence, BCIs can be considered as a paradigm shift contributing to new methods with strong influence on both neuroscience and clinical applications. Here, we investigate the relevance of the choice of training data and classifier transfer for the interpretability of results from single-trial ERP detection. In our experiments, subjects performed a visual-motor oddball task with motor-task relevant infrequent (targets), motor-task irrelevant infrequent (deviants), and motor-task irrelevant frequent (standards) stimuli. Under dual-task condition, a secondary senso-motor task was performed, compared to the simple-task condition. For evaluation, average ERP analysis and single-trial detection analysis with different numbers of electrodes were performed. Further, classifier transfer was investigated between simple and dual task. Parietal positive ERPs evoked by target stimuli (but not by deviants) were expressed stronger under dual-task condition, which is discussed as an increase of task emphasis and brain processes involved in task coordination and change of task set. Highest classification performance was found for targets irrespective of whether all 62, 6, or 2 parietal electrodes were used. Further, higher detection performance of targets compared to standards was achieved under dual-task compared to simple-task condition when training took place on data from 2 parietal electrodes, corresponding to results of ERP average analysis. Classifier transfer between tasks improves classification performance when training took place on more varying examples (from dual task).
In summary, we showed that P300 and overlaying parietal positive ERPs can successfully be detected while subjects are performing additional ongoing motor activity. This supports single-trial detection of ERPs evoked by target events to, e.g., infer a patient's attentional state during therapeutic intervention.
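Single-trial ERP detection of the kind described is commonly done with a linear classifier over epoch features. The study's actual pipeline is not specified in this abstract, so the following Fisher-LDA sketch over synthetic feature vectors is purely illustrative:

```python
import numpy as np

def lda_fit(X, y):
    """Two-class Fisher LDA: w = pooled_cov^-1 (mu1 - mu0), threshold at midpoint.
    X: (n_epochs, n_features) single-trial features; y: 0/1 labels."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    pooled = (np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)) / (len(X) - 2)
    pooled += 1e-6 * np.eye(pooled.shape[0])  # small ridge for numerical stability
    w = np.linalg.solve(pooled, mu1 - mu0)
    b = -w @ (mu0 + mu1) / 2
    return w, b

def lda_predict(X, w, b):
    """Label 1 (e.g. 'target ERP present') when the projection crosses threshold."""
    return (X @ w + b > 0).astype(int)
```

Classifier transfer as studied above would correspond to fitting (w, b) on epochs from one condition (e.g. dual task) and calling lda_predict on epochs from the other.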
Assessment of Joystick control during the performance of powered wheelchair driving tasks
2011-01-01
Background Powered wheelchairs are essential for many individuals who have mobility impairments. Nevertheless, if operated improperly, the powered wheelchair poses dangers to both the user and to those in its vicinity. Thus, operating a powered wheelchair with some degree of proficiency is important for safety, and measuring driving skills becomes an important issue to address. The objective of this study was to explore the discriminant validity of outcome measures of driving skills based on joystick control strategies and performance recorded using a data logging system. Methods We compared joystick control strategies and performance during standardized driving tasks between a group of 10 expert and 13 novice powered wheelchair users. Driving tasks were drawn from the Wheelchair Skills Test (v. 4.1). Data from the joystick controller were collected on a data logging system. Joystick control strategies and performance outcome measures included the mean number of joystick movements, time required to complete tasks, as well as variability of joystick direction. Results In simpler tasks, the expert group's driving skills were comparable to those of the novice group. Yet, in more difficult and spatially confined tasks, the expert group required fewer joystick movements for task completion. In some cases, experts also completed tasks in approximately half the time of the novice group. Conclusions The analysis of joystick control made it possible to discriminate between novice and expert powered wheelchair users in a variety of driving tasks. These results imply that in spatially confined areas, a greater powered wheelchair driving skill level is required to complete tasks efficiently. Based on these findings, it would appear that the use of joystick signal analysis constitutes an objective tool for the measurement of powered wheelchair driving skills. This tool may be useful for the clinical assessment and training of powered wheelchair skills. PMID:21609435
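One of the outcome measures above, variability of joystick direction, is naturally quantified with circular statistics. A minimal sketch of one plausible operationalization (the paper's exact formula is not given in this abstract):

```python
import numpy as np

def direction_variability(vx, vy):
    """Circular variance of joystick heading samples.
    vx, vy: per-sample deflection components from the logged joystick signal.
    Returns a value in [0, 1]: 0 = perfectly consistent heading,
    1 = headings uniformly scattered around the circle."""
    ang = np.arctan2(vy, vx)
    # mean resultant length of the unit heading vectors
    r = np.hypot(np.mean(np.cos(ang)), np.mean(np.sin(ang)))
    return 1.0 - r
```

Under this measure, an expert holding a steady heading through a confined passage scores near 0, while many small corrective deflections in varying directions push the score toward 1.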
NASA Technical Reports Server (NTRS)
Jeng, Frank F.
2007-01-01
Development of analysis guidelines for Exploration Life Support (ELS) technology tests was completed. The guidelines were developed based on analysis experiences gained from supporting Environmental Control and Life Support System (ECLSS) technology development in air revitalization systems and water recovery systems. Analyses are vital during all three phases of the ELS technology test: pre-test, during test and post test. Pre-test analyses of a test system help define hardware components, predict system and component performances, required test duration, sampling frequencies of operation parameters, etc. Analyses conducted during tests could verify the consistency of all the measurements and the performance of the test system. Post test analyses are an essential part of the test task. Results of post test analyses are an important factor in judging whether the technology development is a successful one. In addition, development of a rigorous model for a test system is an important objective of any new technology development. Test data analyses, especially post test data analyses, serve to verify the model. Test analyses have supported development of many ECLSS technologies. Some test analysis tasks in ECLSS technology development are listed in the Appendix. To have effective analysis support for ECLSS technology tests, analysis guidelines would be a useful tool. These test guidelines were developed based on experiences gained through previous analysis support of various ECLSS technology tests. A comment on analysis from an experienced NASA ECLSS manager (1) follows: "Bad analysis was one that bent the test to prove that the analysis was right to begin with. Good analysis was one that directed where the testing should go and also bridged the gap between the reality of the test facility and what was expected on orbit."
Ivanov, Iliyan; Liu, Xun; Clerkin, Suzanne; Schulz, Kurt; Fan, Jin; Friston, Karl; London, Edythe D; Schwartz, Jeffrey; Newcorn, Jeffrey H
2014-06-01
Psychostimulants, such as methylphenidate, are thought to improve information processing in motivation-reward and attention-activation networks by enhancing the effects of more relevant signals and suppressing those of less relevant ones; however the nature of such reciprocal influences remains poorly understood. To explore this question, we tested the effect of methylphenidate on performance and associated brain activity in the Anticipation, Conflict, Reward (ACR) task. Sixteen healthy adult volunteers, ages 21-45, were scanned twice using functional magnetic resonance imaging (fMRI) as they performed the ACR task under placebo and methylphenidate conditions. A three-way repeated measures analysis of variance, with cue (reward vs. non-reward), target (congruent vs. incongruent) and medication condition (methylphenidate vs. placebo) as the factors, was used to analyze behaviors on the task. Blood oxygen level dependent (BOLD) signals, reflecting task-related neural activity, were evaluated using linear contrasts. Participants exhibited significantly greater accuracy in the methylphenidate condition than the placebo condition. Compared with placebo, the methylphenidate condition also was associated with lesser task-related activity in components of attention-activation systems irrespective of the reward cue, and less task-related activity in components of the reward-motivation system, particularly the insula, during reward trials irrespective of target difficulty. These results suggest that methylphenidate enhances task performance by improving efficiency of information processing in both reward-motivation and in attention-activation systems. Published by Elsevier B.V.
Complexity quantification of dense array EEG using sample entropy analysis.
Ramanand, Pravitha; Nampoori, V P N; Sreenivasan, R
2004-09-01
In this paper, a time series complexity analysis of dense array electroencephalogram signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to the purely stochastic realm. The present analysis is conducted with the objective of gaining insight into complexity variations related to changing brain dynamics for EEG recorded in three conditions: a passive, eyes-closed condition, a mental arithmetic task, and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited for short physiological signals such as the EEG, and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue-inducing exercise period. This enhances its utility in detecting subtle changes in the brain state that can find wider scope for applications in EEG based brain studies.
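Sample Entropy as used above is defined as SampEn(m, r, N) = -ln(A/B), where B counts pairs of m-point templates matching within tolerance r (Chebyshev distance, self-matches excluded) and A counts matches at length m+1. A compact numpy sketch; the defaults m=2 and r = 0.2*SD are conventional choices, not necessarily those of the study:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn = -ln(A/B) for a 1-D series; lower values = more regular signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # conventional tolerance: 20% of the signal SD
    n = len(x)

    def count_matches(mm):
        # all templates of length mm
        templ = np.array([x[i:i + mm] for i in range(n - mm)])
        count = 0
        for i in range(len(templ)):
            # Chebyshev distance to all later templates (excludes self-matches)
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)      # matches at length m
    a = count_matches(m + 1)  # matches at length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A regular signal (e.g. a sampled sine wave) yields a markedly lower SampEn than white noise of the same length, which is the contrast the paper exploits between relaxed and task states.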
MSIX - A general and user-friendly platform for RAM analysis
NASA Astrophysics Data System (ADS)
Pan, Z. J.; Blemel, Peter
The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.
International Space Station ECLSS Technical Task Agreement Summary Report
NASA Technical Reports Server (NTRS)
Ray, C. D. (Compiler); Salyer, B. H. (Compiler)
1999-01-01
This Technical Memorandum provides a summary of current work accomplished under Technical Task Agreement (TTA) by the Marshall Space Flight Center (MSFC) regarding the International Space Station (ISS) Environmental Control and Life Support System (ECLSS). Current activities include ECLSS component design and development, computer model development, subsystem/integrated system testing, life testing, and general test support provided to the ISS program. Under ECLSS design, MSFC was responsible for the six major ECLSS functions, specifications and standards, and component design and development, and was the architectural control agent for the ISS ECLSS. MSFC was responsible for ECLSS analytical model development. In-house subsystem and system-level analysis and testing were conducted in support of the design process, including testing air revitalization, water reclamation and management hardware, and certain nonregenerative systems. The activities described herein were approved in task agreements between MSFC and NASA Headquarters Space Station Program Management Office and their prime contractor for the ISS, Boeing. These MSFC activities are in line with the design, development, testing, and flight of ECLSS equipment planned by Boeing. MSFC's unique capabilities for performing integrated systems testing and analyses, and its ability to perform some tasks cheaper and faster to support ISS program needs, are the basis for the TTA activities.
1975-06-01
AUTOKON-71 System. Her major tasks include processing Analysis Requests, releasing new system versions, and coordinating program modifications. Some past...for deformed meshes; improvement of crane boom analysis test data and a post-processing program. I. BACKGROUND 1. What is the AUTOKON System n? a...updates to either enhance or correct the capabilities has been generated. Two major sources contribute these updates: the REAPS Analysis Request (AR
1988-10-01
Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to... Design uses tools, especially graphic ones, to render systems readily understandable. Structured Design offers a set of strategies for...in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.
Application of GIS Technology for Town Planning Tasks Solving
NASA Astrophysics Data System (ADS)
Kiyashko, G. A.
2017-11-01
For developing territories, one of the most pressing town-planning tasks is to find suitable sites for building projects. A geographic information system (GIS) allows one to model complex spatial processes and can provide effective tools to solve these tasks. We propose several GIS analysis models which can define suitable settlement allocations and select appropriate parcels for construction objects. We implement our models in the ArcGIS Desktop package and verify them by application to existing objects in Primorsky Region (Primorye Territory). These suitability models use several combinations of analysis methods and include various ways to resolve the suitability task using vector data and a raster data set. The suitability models created in this study can be combined, and one model can be integrated into another as a component. Our models can be extended with other suitability models for further detailed planning.
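Raster suitability models of this kind are commonly realized as a weighted overlay: each criterion layer is reclassified to a common score scale and the layers are combined as a weighted sum. A minimal numpy sketch with illustrative scores and weights, not the study's actual model or the ArcGIS toolchain:

```python
import numpy as np

def weighted_overlay(criteria, weights):
    """Raster suitability as a weighted sum of reclassified criterion layers.
    criteria: list of 2-D arrays scored on a common scale (e.g. 1-10);
    weights: importance weights, conventionally summing to 1. NaN cells
    (restricted areas) propagate so they drop out of candidate selection."""
    stack = np.stack([np.asarray(c, dtype=float) for c in criteria])
    w = np.asarray(weights, dtype=float)[:, None, None]
    return np.sum(stack * w, axis=0)

# hypothetical layers for a 2x2 study area: slope score and road-access score
slope = np.array([[9.0, 3.0], [7.0, 1.0]])
roads = np.array([[8.0, 6.0], [2.0, 4.0]])
suit = weighted_overlay([slope, roads], [0.6, 0.4])  # slope weighted heavier
```

Cells with the highest combined score are the candidate parcels; integrating one model into another, as the abstract describes, amounts to feeding one overlay's output raster in as a criterion layer of the next.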
Towards systems neuroscience of ADHD: A meta-analysis of 55 fMRI studies
Cortese, Samuele; Kelly, Clare; Chabernaud, Camille; Proal, Erika; Di Martino, Adriana; Milham, Michael P.; Castellanos, F. Xavier
2013-01-01
Objective To perform a comprehensive meta-analysis of task-based functional MRI studies of Attention-Deficit/Hyperactivity Disorder (ADHD). Method PubMed, Ovid, EMBASE, Web of Science, ERIC, CINHAL, and NeuroSynth were searched for studies published through 06/30/2011. Significant differences in activation of brain regions between individuals with ADHD and comparisons were detected using activation likelihood estimation meta-analysis (p<0.05, corrected). Dysfunctional regions in ADHD were related to seven reference neuronal systems. We performed a set of meta-analyses focused on age groups (children; adults), clinical characteristics (history of stimulant treatment; presence of psychiatric comorbidities), and specific neuropsychological tasks (inhibition; working memory; vigilance/attention). Results Fifty-five studies were included (39 in children, 16 in adults). In children, hypoactivation in ADHD vs. comparisons was found mostly in systems involved in executive functions (frontoparietal network) and attention (ventral attentional network). Significant hyperactivation in ADHD vs. comparisons was observed predominantly within the default, ventral attention, and somatomotor networks. In adults, ADHD-related hypoactivation was predominant in the frontoparietal system, while ADHD-related hyperactivation was present in the visual, dorsal attention, and default networks. Significant ADHD-related dysfunction largely reflected task features and was detected even in the absence of comorbid mental disorders or history of stimulant treatment. Conclusions A growing literature provides evidence of ADHD-related dysfunction within multiple neuronal systems involved in higher-level cognitive functions but also in sensorimotor processes, including the visual system, and in the default network. This meta-analytic evidence extends early models of ADHD pathophysiology focused on prefrontal-striatal circuits. PMID:22983386
Wind Sensing, Analysis, and Modeling
NASA Technical Reports Server (NTRS)
Corvin, Michael A.
1995-01-01
The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed in and demonstrate the use of prototype tools implementing an environment in which to realize a unified system. At the convenience of the customer, due to resource limitations, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.
Information processing and dynamics in minimally cognitive agents.
Beer, Randall D; Williams, Paul L
2015-01-01
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.
Applying Behavior-Based Robotics Concepts to Telerobotic Use of Power Tooling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noakes, Mark W; Hamel, Dr. William R.
While it has long been recognized that telerobotics has potential advantages to reduce operator fatigue, to permit lower skilled operators to function as if they had higher skill levels, and to protect tools and manipulators from excessive forces during operation, relatively little laboratory research in telerobotics has actually been implemented in fielded systems. Much of this has to do with the complexity of the implementation and its lack of ability to operate in complex unstructured remote systems environments. One possible solution is to approach the tooling task using an adaptation of behavior-based techniques to facilitate task decomposition to a simpler perspective and to provide sensor registration to the task target object in the field. An approach derived from behavior-based concepts has been implemented to provide automated tool operation for a teleoperated manipulator system. The generic approach is adaptable to a wide range of typical remote tools used in hot-cell and decontamination and dismantlement-type operations. Two tasks are used in this work to test the validity of the concept. First, a reciprocating saw is used to cut a pipe. The second task is bolt removal from mockup process equipment. This paper explains the technique, its implementation, and covers experimental data, analysis of results, and suggestions for implementation on fielded systems.
ERIC Educational Resources Information Center
Gosman, Minna L.
Following an analysis of the task of transcribing as practiced in a health facility, this study guide was designed to teach the knowledge and skills required of a medical transcriber. The medical record department was identified as a major occupational area, and a task inventory for medical records was developed and used as a basis for…
NASA Technical Reports Server (NTRS)
Gott, Charles; Galicki, Peter; Shores, David
1990-01-01
The Helmet Mounted Display system and Part Task Trainer are two projects currently underway that are closely related to the in-flight crew training concept. The first project is a training simulator and an engineering analysis tool. The simulator's unique helmet mounted display actually projects the wearer into the simulated environment of 3-D space. Miniature monitors are mounted in front of the wearer's eyes. The Part Task Trainer is a kinematic simulator for the Shuttle Remote Manipulator System. The simulator consists of a high end graphics workstation with a high resolution color screen and a number of input peripherals that create a functional equivalent of the RMS control panel in the back of the Orbiter. It is being used in the training cycle for Shuttle crew members. Activities are underway to expand the capability of the Helmet Mounted Display system and the Part Task Trainer.
ERIC Educational Resources Information Center
Hull, Daniel M.; Lovett, James E.
The six new robotics and automated systems specialty courses developed by the Robotics/Automated Systems Technician (RAST) project are described in this publication. Course titles are Fundamentals of Robotics and Automated Systems, Automated Systems and Support Components, Controllers for Robots and Automated Systems, Robotics and Automated…
Solar thermal repowering utility value analysis. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, R.; Day, J.; Reed, B.
The retrofit of solar central receiver energy supply systems to existing steam-electric generating stations (repowering) is being considered as a major programmatic thrust by DOE. The determination of a government response appropriate to the opportunities of repowering is an important policy question, and is the major reason for the analysis. The study objective is to define a government role in repowering that constitutes an efficient program investment in pursuit of viable private markets for heliostat-based energy systems. In support of that objective, the study is designed to identify the scope and nature of the repowering opportunity within the larger context of its contributions to central receiver technology development and commercialization. The Supply and Integration Tasks are documented elsewhere. This report documents the Demand Task, determining and quantifying the sources of the value of repowering and of central receiver technology in general to electric utilities. The modeling tools and assumptions used in the Demand Task are described and the results are presented and interpreted. (MCW)
Research on the use of space resources
NASA Technical Reports Server (NTRS)
Carroll, W. F. (Editor)
1983-01-01
The second year of a multiyear research program on the processing and use of extraterrestrial resources is covered. The research tasks included: (1) silicate processing, (2) magma electrolysis, (3) vapor phase reduction, and (4) metals separation. Concomitant studies included: (1) energy systems, (2) transportation systems, (3) utilization analysis, and (4) resource exploration missions. Emphasis in fiscal year 1982 was placed on the magma electrolysis and vapor phase reduction processes (both analytical and experimental) for separation of oxygen and metals from lunar regolith. The early experimental work on magma electrolysis resulted in gram quantities of iron (mixed metals) and the identification of significant anode, cathode, and container problems. In the vapor phase reduction tasks a detailed analysis of various process concepts led to the selection of two specific processes designated as "Vapor Separation" and "Selective Ionization." Experimental work was deferred to fiscal year 1983. In the silicate processing task a thermophysical model of the casting process was developed and used to study the effect of variations in material properties on the cooling behavior of lunar basalt.
Project #8, Task 3 - Travel Information Services (TIS), Requirements Analysis Report
DOT National Transportation Integrated Search
1995-03-24
The I-95 Corridor Coalition's Traveler Information Services (TIS) project is intended to implement an advanced traveler information system tailored to the unique needs of the Northeast Corridor. The system will acquire and disseminate information on ...
Conceptual design study for a teleoperator visual system, phase 1
NASA Technical Reports Server (NTRS)
Adams, D.; Grant, C.; Johnson, C.; Meirick, R.; Polhemus, C.; Ray, A.; Rittenhouse, D.; Skidmore, R.
1972-01-01
Results are reported for work performed during the first phase of the conceptual design study for a teleoperator visual system. This phase consists of four tasks: general requirements, concept development, subsystem requirements and analysis, and concept evaluation.
Human Factors Research in Anesthesia Patient Safety
Weinger, Matthew B.; Slagle, Jason
2002-01-01
Patient safety has become a major public concern. Human factors research in other high-risk fields has demonstrated how rigorous study of factors that affect job performance can lead to improved outcome and reduced errors after evidence-based redesign of tasks or systems. These techniques have increasingly been applied to the anesthesia work environment. This paper describes data obtained recently using task analysis and workload assessment during actual patient care and the use of cognitive task analysis to study clinical decision making. A novel concept of “non-routine events” is introduced and pilot data are presented. The results support the assertion that human factors research can make important contributions to patient safety. Information technologies play a key role in these efforts.
NASA Technical Reports Server (NTRS)
Pavel, M.
1993-01-01
This presentation outlines in viewgraph format a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of the significant properties of the display. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. Thus, for example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics. The opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation. The range of evaluation approaches includes visual inspection, rapid evaluation, part-task simulation, and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system. A model of the human visual system is used to predict the performance of the selected tasks. The model-based evaluation approach permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks. These are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.
Skill components of task analysis
Rogers, Wendy A.; Fisk, Arthur D.
2017-01-01
Some task analysis methods break down a task into a hierarchy of subgoals. Although an important tool of many fields of study, learning to create such a hierarchy (redescription) is not trivial. To further the understanding of what makes task analysis a skill, the present research examined novices’ problems with learning Hierarchical Task Analysis and captured practitioners’ performance. All participants received a task description and analyzed three cooking and three communication tasks by drawing on their knowledge of those tasks. Thirty-six younger adults (18–28 years) in Study 1 analyzed one task before training and five afterwards. Training consisted of a general handout that all participants received and an additional handout that differed between three conditions: a list of steps, a flow-diagram, and a concept map. In Study 2, eight experienced task analysts received the same task descriptions as in Study 1 and demonstrated their understanding of task analysis while thinking aloud. Novices’ initial task analyses scored low on all coding criteria. Performance improved on some criteria but was well below 100% on others. Practitioners’ task analyses were 2–3 levels deep but also scored low on some criteria. A task analyst’s purpose of analysis may be the reason for higher specificity of analysis. This research furthers the understanding of Hierarchical Task Analysis and provides insights into the varying nature of task analyses as a function of experience. The derived skill components can inform training objectives. PMID:29075044
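The hierarchy-of-subgoals redescription that the study examines can be represented directly as a nested data structure. Below is a minimal sketch; the tea-making task and its subgoals are invented examples, not the cooking or communication materials used in the studies.

```python
# A Hierarchical Task Analysis redescription as nested dictionaries.
# The goal names here are hypothetical illustrations.

def depth(task):
    """Depth of a task hierarchy: a leaf subgoal counts as level 1."""
    subs = task.get("subtasks", [])
    if not subs:
        return 1
    return 1 + max(depth(s) for s in subs)

make_tea = {
    "goal": "make a cup of tea",
    "subtasks": [
        {"goal": "boil water",
         "subtasks": [{"goal": "fill kettle"}, {"goal": "switch kettle on"}]},
        {"goal": "prepare cup",
         "subtasks": [{"goal": "add tea bag"}, {"goal": "pour water"}]},
    ],
}

print(depth(make_tea))  # 3
```

A depth of 3 here matches the 2-3 levels of redescription the practitioners in Study 2 typically produced.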
Pupillary transient responses to within-task cognitive load variation.
Wong, Hoe Kin; Epps, Julien
2016-12-01
Changes in physiological signals due to task evoked cognitive load have been reported extensively. However, pupil size based approaches for estimating cognitive load on a moment-to-moment basis are not as well understood as estimating cognitive load on a task-to-task basis, despite the appeal these approaches have for continuous load estimation. In particular, the pupillary transient response to instantaneous changes in induced load has not been experimentally quantified, and the within-task changes in pupil dilation have not been investigated in a manner that allows their consistency to be quantified with a view to biomedical system design. In this paper, a variation of the digit span task is developed which reliably induces rapid changes of cognitive load to generate task-evoked pupillary responses (TEPRs) associated with large, within-task load changes. Linear modelling and one-way ANOVA reveals that increasing the rate of cognitive loading, while keeping task demands constant, results in a steeper pupillary response. Instantaneous drops in cognitive load are shown to produce statistically significantly different transient pupillary responses relative to sustained load, and when characterised using an exponential decay response, the task-evoked pupillary response time constant is in the order of 1-5 s. Within-task test-retest analysis confirms the reliability of the moment-to-moment measurements. Based on these results, estimates of pupil diameter can be employed with considerably more confidence in moment-to-moment cognitive load estimation systems. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
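The reported transient can be characterised by an exponential decay with a time constant on the order of 1-5 s. The sketch below recovers such a time constant from synthetic data by log-linear least squares; the baseline, amplitude, and true time constant are made-up values, not measurements from the paper.

```python
import numpy as np

# Model a task-evoked pupillary response after a load drop as
#   d(t) = baseline + A * exp(-t / tau)
# and recover tau from (synthetic, noise-free) samples.

fs_t = np.linspace(0.0, 10.0, 200)        # time, seconds
baseline, A, tau_true = 3.0, 1.2, 2.5     # mm, mm, s (hypothetical values)
d = baseline + A * np.exp(-fs_t / tau_true)

# log(d - baseline) = log(A) - t / tau, so a straight-line fit gives tau.
slope, intercept = np.polyfit(fs_t, np.log(d - baseline), 1)
tau_est = -1.0 / slope

print(round(tau_est, 2))  # ≈ 2.5, inside the 1-5 s range reported
```

With real pupil data the baseline would itself have to be estimated (e.g. by nonlinear least squares) rather than assumed known.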
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadgu, Teklu; Appel, Gordon John
Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and knowledge capability for total system performance assessment (TSPA) type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest version 11.1. All the TSPA-LA uncertainty and sensitivity analyses modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling cases output generated in FY15 based on GoldSim Version 9.60.300 documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.
Aerodynamic preliminary analysis system 2. Part 1: Theory
NASA Technical Reports Server (NTRS)
Bonner, E.; Clever, W.; Dunn, K.
1981-01-01
A subsonic/supersonic/hypersonic aerodynamic analysis was developed by integrating the Aerodynamic Preliminary Analysis System (APAS) and the inviscid force calculation modules of the Hypersonic Arbitrary Body Program. APAS analysis was extended for nonlinear vortex forces using a generalization of the Polhamus analogy. The interactive system provides appropriate aerodynamic models for a single input geometry data base and has a run/output format similar to a wind tunnel test program. The user's manual was organized to cover the principal system activities of a typical application: geometric input/editing, aerodynamic evaluation, and post analysis review/display. Sample sessions are included to illustrate the specific tasks involved and are followed by a comprehensive command/subcommand dictionary used to operate the system.
NASA Astrophysics Data System (ADS)
Durden, D.; Muraoka, H.; Scholes, R. J.; Kim, D. G.; Loescher, H. W.; Bombelli, A.
2017-12-01
The development of an integrated global carbon cycle observation system to monitor changes in the carbon cycle, and ultimately the climate system, across the globe is of crucial importance in the 21st century. This system should comprise space and ground-based observations, in concert with modelling and analysis, to produce more robust budgets of carbon and other greenhouse gases (GHGs). A global initiative, the GEO Carbon and GHG Initiative, is working within the framework of the Group on Earth Observations (GEO) to promote interoperability and provide integration across different parts of the system, particularly at domain interfaces, thereby optimizing the efforts of existing networks and initiatives to reduce uncertainties in budgets of carbon and other GHGs. This is a very ambitious undertaking; therefore, the initiative is separated into tasks to provide actionable objectives. Task 3 focuses on the optimization of in-situ observational networks. The main objective of Task 3 is to develop and implement a procedure for enhancing and refining the observation system for identified essential carbon cycle variables (ECVs) that meets user-defined specifications at minimum total cost. This work focuses on the outline of the implementation plan, which includes a review of essential carbon cycle variables and observation technologies, mapping the ECVs' performance, and analyzing gaps and opportunities in order to design an improved observing system. As a subsequent step to the landscape mapping of existing observational networks, a gap analysis of in-situ observations will be outlined; it will begin in the terrestrial domain, to address missing coordination and large spatial gaps, and will later extend to ocean and atmospheric observations.
Forecasting the impact of virtual environment technology on maintenance training
NASA Technical Reports Server (NTRS)
Schlager, Mark S.; Boman, Duane; Piantanida, Tom; Stephenson, Robert
1993-01-01
To assist NASA and the Air Force in determining how and when to invest in virtual environment (VE) technology for maintenance training, we identified possible roles for VE technology in such training, assessed its cost-effectiveness relative to existing technologies, and formulated recommendations for a research agenda that would address instructional and system development issues involved in fielding a VE training system. In the first phase of the study, we surveyed VE developers to forecast capabilities, maturity, and estimated costs for VE component technologies. We then identified maintenance tasks and their training costs through interviews with maintenance technicians, instructors, and training developers. Ten candidate tasks were selected from two classes of maintenance tasks (seven aircraft maintenance and three space maintenance) using five criteria developed to identify types of tasks most likely to benefit from VE training. Three tasks were used as specific cases for cost-benefit analysis. In formulating research recommendations, we considered three aspects of feasibility: technological considerations, cost-effectiveness, and anticipated R&D efforts. In this paper, we describe the major findings in each of these areas and suggest research efforts that we believe will help achieve the goal of a cost-effective VE maintenance training system by the next decade.
Modal analysis for Liapunov stability of rotating elastic bodies. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Colin, A. D.
1973-01-01
This study consisted of four parallel efforts: (1) modal analyses of elastic continua for Liapunov stability analysis of flexible spacecraft; (2) development of general purpose simulation equations for arbitrary spacecraft; (3) evaluation of alternative mathematical models for elastic components of spacecraft; and (4) examination of the influence of vehicle flexibility on spacecraft attitude control system performance. A complete record is given of achievements under tasks (1) and (3), in the form of technical appendices, and a summary description of progress under tasks (2) and (4).
An intelligent crowdsourcing system for forensic analysis of surveillance video
NASA Astrophysics Data System (ADS)
Tahboub, Khalid; Gadgil, Neeraj; Ribera, Javier; Delgado, Blanca; Delp, Edward J.
2015-03-01
Video surveillance systems are of a great value for public safety. With an exponential increase in the number of cameras, videos obtained from surveillance systems are often archived for forensic purposes. Many automatic methods have been proposed to do video analytics such as anomaly detection and human activity recognition. However, such methods face significant challenges due to object occlusions, shadows and scene illumination changes. In recent years, crowdsourcing has become an effective tool that utilizes human intelligence to perform tasks that are challenging for machines. In this paper, we present an intelligent crowdsourcing system for forensic analysis of surveillance video that includes the video recorded as a part of search and rescue missions and large-scale investigation tasks. We describe a method to enhance crowdsourcing by incorporating human detection, re-identification and tracking. At the core of our system, we use a hierarchical pyramid model to distinguish the crowd members based on their ability, experience and performance record. Our proposed system operates in an autonomous fashion and produces a final output of the crowdsourcing analysis consisting of a set of video segments detailing the events of interest as one storyline.
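One simple way to realise the ability-based weighting of crowd members that the pyramid model implies is a weighted vote over their labels. The sketch below is an illustrative assumption about how such aggregation could work; the worker weights, labels, and votes are invented, not taken from the paper's system.

```python
# Ability-weighted label aggregation for a crowdsourced video-analysis task.
# Weights and labels are hypothetical examples.

def weighted_vote(votes):
    """votes: list of (label, weight) pairs. Returns the label with the
    largest total weight."""
    totals = {}
    for label, weight in votes:
        totals[label] = totals.get(label, 0.0) + weight
    return max(totals, key=totals.get)

# Two experienced workers outweigh three novices on a video-segment label.
votes = [("person", 3.0), ("person", 2.5),
         ("shadow", 1.0), ("shadow", 1.0), ("shadow", 1.0)]
print(weighted_vote(votes))  # person
```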
Supervisory manipulation based on the concepts of absolute vs relative and fixed vs moving tasks
NASA Technical Reports Server (NTRS)
Brooks, T. L.
1980-01-01
If a machine is to perform a given subtask autonomously, it will require an internal model which, combined with operator and environmental inputs, can be used to generate the manipulator functions necessary to complete the task. This paper will advance a technique based on linear transformations by which short, supervised periods of manipulation can be accomplished. To achieve this end a distinction will be made between tasks which can be completely defined during the training period, and tasks which can be only partially defined prior to the moment of execution. A further distinction will be made between tasks which have a fixed relationship to the manipulator base throughout the execution period, and tasks which have a continuously changing task/base relationship during execution. Finally, through a rudimentary analysis of the methods developed in this paper, some of the practical aspects of implementing a supervisory system will be illustrated.
Choi, Younggeun; Gordon, James; Park, Hyeshin; Schweighofer, Nicolas
2011-08-03
Current guidelines for rehabilitation of arm and hand function after stroke recommend that motor training focus on realistic tasks that require reaching and manipulation and engage the patient intensively, actively, and adaptively. Here, we investigated the feasibility of a novel robotic task-practice system, ADAPT, designed in accordance with such guidelines. At each trial, ADAPT selects a functional task according to a training schedule and with difficulty based on previous performance. Once the task is selected, the robot picks up and presents the corresponding tool, simulates the dynamics of the task, and the patient interacts with the tool to perform the task. Five participants with chronic stroke with mild to moderate impairments (> 9 months post-stroke; Fugl-Meyer arm score 49.2 ± 5.6) practiced four functional tasks (selected out of six in a pre-test) with ADAPT for about one and a half hours and 144 trials in a pseudo-random schedule of 3-trial blocks per task. No adverse events occurred and ADAPT successfully presented the six functional tasks without human intervention for a total of 900 trials. Qualitative analysis of trajectories showed that ADAPT simulated the desired task dynamics adequately, and participants reported good, although not excellent, task fidelity. During training, the adaptive difficulty algorithm progressively increased task difficulty leading towards an optimal challenge point based on performance; difficulty was then continuously adjusted to keep performance around the challenge point. Furthermore, the time to complete all trained tasks decreased significantly from pretest to one-hour post-test. Finally, post-training questionnaires demonstrated positive patient acceptance of ADAPT. ADAPT successfully provided adaptive progressive training for multiple functional tasks based on participants' performance. Our encouraging results establish the feasibility of ADAPT; its efficacy will next be tested in a clinical trial.
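The challenge-point behaviour described above, where difficulty rises with success and then hovers around the point where performance breaks down, can be sketched with a simple staircase rule. The update rule, step size, and simulated participant below are illustrative assumptions, not the published ADAPT controller.

```python
# A performance-based staircase in the spirit of a challenge-point
# difficulty controller. Integer difficulty levels 0..10 (hypothetical).

def update_level(level, success, lo=0, hi=10):
    """Raise difficulty one level after a successful trial, lower it one
    level after a failure, clamped to [lo, hi]."""
    return min(hi, max(lo, level + (1 if success else -1)))

level = 2
history = []
for _ in range(20):
    success = level <= 5       # simulated participant succeeds up to level 5
    level = update_level(level, success)
    history.append(level)

print(history[-4:])  # [5, 6, 5, 6]: oscillates around the challenge point
```

After an initial ramp from the easy starting level, the rule settles into oscillation around the simulated participant's limit, which is the qualitative behaviour the abstract reports.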
Backward assembly planning with DFA analysis
NASA Technical Reports Server (NTRS)
Lee, Sukhan (Inventor)
1995-01-01
An assembly planning system that operates based on a recursive decomposition of assembly into subassemblies, and analyzes assembly cost in terms of stability, directionality, and manipulability to guide the generation of preferred assembly plans, is presented. The planning in this system incorporates the special processes, such as cleaning, testing, labeling, etc., that must occur during the assembly, and handles nonreversible as well as reversible assembly tasks through backward assembly planning. In order to increase the planning efficiency, the system avoids the analysis of decompositions that do not correspond to feasible assembly tasks. This is achieved by grouping and merging those parts that cannot be decomposed at the current stage of backward assembly planning due to the requirement of special processes and the constraint of interconnection feasibility. The invention includes methods of evaluating assembly cost in terms of the number of fixtures (or holding devices) and reorientations required for assembly, through the analysis of stability, directionality, and manipulability. All these factors are used in defining cost and heuristic functions for an AO* search for an optimal plan.
NASA Astrophysics Data System (ADS)
Fredouille, Corinne; Pouchoulin, Gilles; Ghio, Alain; Revis, Joana; Bonastre, Jean-François; Giovanni, Antoine
2009-12-01
This paper addresses voice disorder assessment. It proposes an original back-and-forth methodology involving an automatic classification system as well as knowledge of the human experts (machine learning experts, phoneticians, and pathologists). The goal of this methodology is to bring a better understanding of acoustic phenomena related to dysphonia. The automatic system was validated on a dysphonic corpus (80 female voices), rated according to the GRBAS perceptual scale by an expert jury. Firstly, focused on the frequency domain, the classification system showed the relevance of the 0-3000 Hz frequency band for the classification task based on the GRBAS scale. Later, an automatic phonemic analysis underlined the significance of consonants and, more surprisingly, of unvoiced consonants for the same classification task. Submitted to the human experts, these observations led to a manual analysis of unvoiced plosives, which highlighted a lengthening of VOT with dysphonia severity, validated by a preliminary statistical analysis.
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
Analysis of methods of processing of expert information by optimization of administrative decisions
NASA Astrophysics Data System (ADS)
Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.
2018-03-01
A methodology is proposed for defining measures in the expert estimation of the quality and reliability of application-oriented software products. Methods for aggregating expert estimates are described, using as an example the collective selection of instrumental control projects in the development of special-purpose software for institutional needs. Results from the operation of a dialogue-based decision support system are presented, together with an algorithm for solving the selection task based on the analytic hierarchy process. The developed algorithm can be applied in building expert systems for a wide class of tasks involving multicriteria choice.
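The analytic hierarchy process reduces a multicriteria choice to pairwise comparisons between alternatives. As a rough sketch of the idea (using the row geometric-mean approximation, a common AHP shortcut; the paper's exact aggregation scheme is not specified in the abstract), a priority vector can be derived from a pairwise-comparison matrix like this:

```python
import math

def ahp_weights(matrix):
    """Priority vector from a pairwise-comparison matrix via the row
    geometric-mean approximation (illustrative; not necessarily the
    authors' formulation, which may use Saaty's eigenvector method)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Example: alternative A judged 3x as preferable as alternative B.
weights = ahp_weights([[1.0, 3.0], [1.0 / 3.0, 1.0]])
```

For this 2x2 example the weights come out to 0.75 and 0.25. A full AHP implementation would also check the consistency ratio of the comparison matrix, which is omitted here.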
Wireless and wearable EEG system for evaluating driver vigilance.
Lin, Chin-Teng; Chuang, Chun-Hsiang; Huang, Chih-Sheng; Tsai, Shu-Fang; Lu, Shao-Wei; Chen, Yen-Hsuan; Ko, Li-Wei
2014-04-01
Brain activity associated with sustained attention on the task of safe driving has received considerable attention recently in many neurophysiological studies. Those investigations have also accurately estimated shifts in drivers' levels of arousal, fatigue, and vigilance, as evidenced by variations in their task performance, by evaluating electroencephalographic (EEG) changes. However, monitoring the neurophysiological activities of automobile drivers poses a major measurement challenge when using laboratory-oriented biosensor technology. This work presents a novel dry-sensor-based mobile wireless EEG system (referred to herein as Mindo) that monitors a driver's vigilance status in real time, in order to link fluctuations in driving performance with changes in brain activity. The proposed Mindo system incorporates a wireless and wearable EEG device to conveniently record EEG signals from hairy regions of the driver's scalp. Additionally, the proposed system can process EEG recordings and translate them into a vigilance level. The study compares system performance across different regression models. Moreover, the proposed system is implemented in the Java programming language as a mobile application for online analysis. A case study involving 15 participants assigned a 90 min sustained-attention driving task in an immersive virtual driving environment demonstrates the reliability of the proposed system. Consistent with previous studies, power spectral analysis results confirm that the EEG activities correlate well with variations in vigilance. Furthermore, the proposed system demonstrated the feasibility of predicting the driver's vigilance in real time.
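Translating EEG recordings into a vigilance level typically begins with band-power features that feed a regression model. A minimal sketch of the feature-extraction step, using a direct DFT over an assumed frequency band (the Mindo system's actual bands and regressors are not detailed in the abstract):

```python
import math

def band_power(x, rate_hz, f_lo, f_hi):
    """Signal power within [f_lo, f_hi] Hz via a direct DFT.
    Alpha and theta bands are common vigilance correlates; the exact
    bounds here are illustrative assumptions."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        f = k * rate_hz / n
        if f_lo <= f <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / (n * n)
    return total

# A pure 10 Hz tone concentrates its power in the 8-12 Hz (alpha) band.
tone = [math.sin(2 * math.pi * 10 * t / 128.0) for t in range(128)]
```

In practice the per-band powers from successive EEG windows would become the inputs of whichever regression model maps spectral features to the vigilance level.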
A Management Information Systems Needs Analysis for the University of Nevada Reno.
ERIC Educational Resources Information Center
Nevada Univ., Reno.
Results of a needs assessment for administrative computing at the University of Nevada, Reno, are presented. The objectives of the Management Information Systems Task Force are identified, along with 17 problems in existing operational and management data systems, and institutional goals for future planning and management systems. In addition to…
NASA Technical Reports Server (NTRS)
1973-01-01
An analytical comparison is made of space communication accomplished at six different wavelengths. In the radio band, 2.25, 7.5, and 14.5 GHz systems are analyzed, while at optical wavelengths, 0.53, 1.06 and 10.6 micron systems are examined. The purpose of the comparison is to determine which of these systems will require the least hardware weight to perform a given communication task. The problem is solved by requiring each communication system to meet a given performance while selecting combinations of transmitted power and antenna diameter to obtain the least overall system weight. This performance is provided while maintaining practical values for parameters other than antenna diameter and power, which also affect system performance. The results of the analysis indicate that for future data links over ranges of 42,000 to 84,000 km and with data bandwidths of 100 to 1000 MHz, the CO2 laser system will provide the required performance with the least total system weight impact on a spacecraft.
Integrated exhaust gas analysis system for aircraft turbine engine component testing
NASA Technical Reports Server (NTRS)
Summers, R. L.; Anderson, R. C.
1985-01-01
An integrated exhaust gas analysis system was designed and installed in the hot-section facility at the Lewis Research Center. The system is designed to operate either manually or automatically and also to be operated from a remote station. The system measures oxygen, water vapor, total hydrocarbons, carbon monoxide, carbon dioxide, and oxides of nitrogen. Two microprocessors control the system and the analyzers, collect data and process them into engineering units, and present the data to the facility computers and the system operator. Within the design of this system there are innovative concepts and procedures that are of general interest and application to other gas analysis tasks.
Tutorial Workshop on Robotics and Robot Control.
1982-10-26
US Army Tank-Automotive Command, Warren, Michigan; US Army Materiel Systems Analysis Activity, Aberdeen Proving Ground, Maryland. ... California Institute of Technology, Pasadena, California 91103. M. Vukovic, Senior Research Associate, Institute for Technoeconomic Systems, Department of Industrial ... Further investigation of the action precedence graphs, together with their application to more complex manipulator tasks and analysis of their ...
NASA Technical Reports Server (NTRS)
1983-01-01
Space station systems characteristics and architecture are described. A manned space station operational analysis is performed to determine crew size, crew task complexity and time tables, and crew equipment to support the definition of systems and subsystems concepts. This analysis is used to select and evaluate the architectural options for development.
Approach to recognition of flexible form for credit card expiration date recognition as example
NASA Astrophysics Data System (ADS)
Sheshkus, Alexander; Nikolaev, Dmitry P.; Ingacheva, Anastasia; Skoryukina, Natalya
2015-12-01
In this paper we consider the task of finding information fields within a document with a flexible form, using the credit card expiration date field as an example. We discuss the main difficulties and suggest possible solutions. In our case this task must be solved on mobile devices, so computational complexity has to be as low as possible. We provide results of an analysis of the suggested algorithm. The error distribution of the recognition system shows that the suggested algorithm solves the task with the required accuracy.
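Once the expiration-date field has been located and recognized, the result can be validated with a cheap syntactic check. A hypothetical post-processing helper (not part of the paper's vision pipeline; the MM/YY layout and 20xx-century assumption are illustrative):

```python
import re

def parse_expiry(text):
    """Extract an MM/YY expiration date from recognized text.
    Returns (month, year) or None. Assumes months 01-12 and years
    in the 2000s (hypothetical convention for this sketch)."""
    m = re.search(r"\b(0[1-9]|1[0-2])/(\d{2})\b", text)
    if not m:
        return None
    return int(m.group(1)), 2000 + int(m.group(2))
```

A validation step like this lets the recognizer reject low-confidence candidates cheaply before any heavier processing, which matters on mobile devices.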
[Computers in biomedical research: I. Analysis of bioelectrical signals].
Vivaldi, E A; Maldonado, P
2001-08-01
A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing conveys syntactic aspects (data transduction, conditioning and reduction); and semantic aspects (feature extraction to describe and characterize the signal and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
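The two principles named above, iterating simple fundamental functions and subdividing tasks into modules, can be illustrated with a sketch that separates a syntactic stage (sampling) from a semantic stage (event detection). The signal and threshold here are illustrative, not taken from the article:

```python
import math

def sample(f, rate_hz, duration_s):
    """Syntactic stage: digitize an analog signal f(t) at a fixed rate."""
    n = int(rate_hz * duration_s)
    return [f(i / rate_hz) for i in range(n)]

def detect_events(samples, threshold):
    """Semantic stage: a simple iterated rule reporting rising threshold
    crossings, the kind of primitive a polysomnogram analyzer builds on."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] < threshold <= samples[i]]

# A 2 Hz sine sampled at 100 Hz rises through 0.5 twice in one second.
sig = sample(lambda t: math.sin(2 * math.pi * 2.0 * t), 100, 1.0)
```

Chaining such small modules (conditioning, reduction, feature extraction, classification) is exactly the decomposition the article describes for automated electrocardiographic and polysomnographic analysis.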
Mindtagger: A Demonstration of Data Labeling in Knowledge Base Construction.
Shin, Jaeho; Ré, Christopher; Cafarella, Michael
2015-08-01
End-to-end knowledge base construction systems using statistical inference are enabling more people to automatically extract high-quality domain-specific information from unstructured data. As a result of deploying the DeepDive framework across several domains, we found new challenges in debugging and improving such end-to-end systems to construct high-quality knowledge bases. DeepDive has an iterative development cycle in which users improve the data. To help our users, we needed to develop principles for analyzing the system's errors as well as provide tooling for inspecting and labeling the various data products of the system. We created guidelines for error analysis modeled after our colleagues' best practices, in which data labeling plays a critical role in every step of the analysis. To enable more productive and systematic data labeling, we created Mindtagger, a versatile tool that can be configured to support a wide range of tasks. In this demonstration, we show in detail which data labeling tasks are modeled in our error analysis guidelines and how each of them is performed using Mindtagger.
Backward assembly planning with DFA analysis
NASA Technical Reports Server (NTRS)
Lee, Sukhan (Inventor)
1992-01-01
An assembly planning system that operates based on a recursive decomposition of an assembly into subassemblies is presented. The planning system analyzes assembly cost in terms of stability, directionality, and manipulability to guide the generation of preferred assembly plans. The planning in this system incorporates the special processes, such as cleaning, testing, and labeling, that must occur during the assembly. Additionally, the planning handles nonreversible, as well as reversible, assembly tasks through backward assembly planning. In order to increase planning efficiency, the system avoids the analysis of decompositions that do not correspond to feasible assembly tasks. This is achieved by grouping and merging those parts that cannot be decomposed at the current stage of backward assembly planning due to the requirement of special processes and the constraint of interconnection feasibility. The invention includes methods of evaluating assembly cost in terms of the number of fixtures (or holding devices) and reorientations required for assembly, through the analysis of stability, directionality, and manipulability. All these factors are used in defining cost and heuristic functions for an AO* search for an optimal plan.
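The recursive-decomposition idea can be sketched as a toy recursion: split an assembly into two subassemblies, score each split, and keep the cheapest decomposition tree. The split-cost callback stands in for the patent's stability/directionality/manipulability analysis, and plain exhaustive recursion replaces its AO* search, so this is only a structural illustration:

```python
from itertools import combinations

def splits(assembly):
    """Enumerate two-way decompositions of an assembly (frozenset of parts)."""
    parts = sorted(assembly)
    for r in range(1, len(parts) // 2 + 1):
        for subset in combinations(parts, r):
            a = frozenset(subset)
            yield a, assembly - a

def plan(assembly, split_cost):
    """Backward planning: return (cost, plan tree) for the cheapest
    recursive decomposition under the given split-cost function."""
    assembly = frozenset(assembly)
    if len(assembly) == 1:
        return 0.0, next(iter(assembly))
    best = None
    for a, b in splits(assembly):
        cost_a, tree_a = plan(a, split_cost)
        cost_b, tree_b = plan(b, split_cost)
        total = split_cost(a, b) + cost_a + cost_b
        if best is None or total < best[0]:
            best = (total, (tree_a, tree_b))
    return best

# Illustrative cost: penalize unbalanced splits.
cost, tree = plan({"x", "y", "z"}, lambda a, b: abs(len(a) - len(b)))
```

In the patented system the search would instead prune infeasible decompositions up front and use cost and heuristic functions within AO* to reach an optimal plan without full enumeration.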
Applications of flight control system methods to an advanced combat rotorcraft
NASA Technical Reports Server (NTRS)
Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.
1989-01-01
Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 msec in all axes). Piloted handling qualities are found to be desirable or adequate for all low, medium, and high pilot-gain tasks; but handling qualities are inadequate for ultra-high-gain tasks such as slope and running landings.
Using cognitive task analysis to inform issues in human systems integration in railroad operations
DOT National Transportation Integrated Search
2013-05-23
U.S. Railroad operations are undergoing rapid changes involving the introduction of new technologies such as positive train control (PTC), energy management systems (EMS), and electronically controlled pneumatic (ECP) brakes in the locomotive cab. To...
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data base management and analysis system is presented. The MASS system was implemented on a research computer system that provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.
Lessons Learned from Deploying an Analytical Task Management Database
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen
2007-01-01
Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
NASA Technical Reports Server (NTRS)
1979-01-01
A plan for the production of two PEP flight systems is defined. The task's milestones are described. Provisions for the development and assembly of new ground support equipment required for both testing and launch operations are included.
DOT National Transportation Integrated Search
1978-06-01
A plow-resistant recessed reflective marker (RRM) delineation system having a linear tapered profile and which uses a reflector base/reflector unit is proposed. A single-operator mechanized vehicle to install the RRM delineation system is described. ...
1985-06-01
Naval Postgraduate School, Monterey, California 93943. ... determine the socioeconomic representativeness of the Army's enlistees in that particular year. In addition, the socioeconomic overview of Republic of ... accomplished with the use of the Statistical Analysis System (SAS), an integrated computer system for data analysis.
Archaeological Inventory and Evaluation at Milford, Melvern and Pomona Lakes, Eastern Kansas
1988-01-01
Milford, Melvern and Pomona Lakes, Kansas. US Army Corps of Engineers, Kansas City District. Environmental Systems Analysis, Inc., Cultural Resources Division, Kansas City, Kansas. Larry J. Schits, Editor and Principal Investigator, 1988.
ERIC Educational Resources Information Center
Marcum, Deanna; Boss, Richard
1983-01-01
Relates office automation to its application in libraries, discussing computer software packages for microcomputers performing tasks involved in word processing, accounting, statistical analysis, electronic filing cabinets, and electronic mail systems. (EJS)
Fault tolerant architectures for integrated aircraft electronics systems, task 2
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Melliar-Smith, P. M.; Schwartz, R. L.
1984-01-01
The architectural basis for an advanced fault tolerant on-board computer to succeed the current generation of fault tolerant computers is examined. The network error tolerant system architecture is studied, with particular attention to intercluster configurations and communication protocols, and to refined reliability estimates. The diagnosis of faults, so that appropriate choices for reconfiguration can be made, is discussed. The analysis relates particularly to the recognition of transient faults in a system with tasks at many levels of priority. The demand-driven data-flow architecture, which appears to have possible application in fault tolerant systems, is described, and work investigating the feasibility of automatic generation of aircraft flight control programs from abstract specifications is reported.
Anatomical background and generalized detectability in tomosynthesis and cone-beam CT.
Gang, G J; Tward, D J; Lee, J; Siewerdsen, J H
2010-05-01
Anatomical background presents a major impediment to detectability in 2D radiography as well as 3D tomosynthesis and cone-beam CT (CBCT). This article incorporates theoretical and experimental analysis of anatomical background "noise" in cascaded systems analysis of 2D and 3D imaging performance to yield "generalized" metrics of noise-equivalent quanta (NEQ) and detectability index as a function of the orbital extent of the (circular arc) source-detector orbit. A physical phantom was designed based on principles of fractal self-similarity to exhibit power-law spectral density (κ/f^β) comparable to various anatomical sites (e.g., breast and lung). Background power spectra [S_B(f)] were computed as a function of source-detector orbital extent, including tomosynthesis (approximately 10-180 degrees) and CBCT (180 degrees + fan to 360 degrees) under two acquisition schemes: (1) constant angular separation between projections (variable dose) and (2) constant total number of projections (constant dose). The resulting S_B was incorporated in the generalized NEQ, and detectability index was computed from 3D cascaded systems analysis for a variety of imaging tasks. The phantom yielded power-law spectra within the expected spatial frequency range, quantifying the dependence of clutter magnitude (κ) and correlation (β) on increasing tomosynthesis angle. Incorporation of S_B in the 3D NEQ provided a useful framework for analyzing the tradeoffs among anatomical, quantum, and electronic noise with dose and orbital extent. Distinct implications are posed for breast and chest tomosynthesis imaging system design, since these applications vary significantly in κ, β, and imaging task and, therefore, in the optimal selection of orbital extent, number of projections, and dose.
For example, low-frequency tasks (e.g., soft-tissue masses or nodules) tend to benefit from larger orbital extent and more fully 3D tomographic imaging, whereas high-frequency tasks (e.g., microcalcifications) require careful, application-specific selection of orbital extent and number of projections to minimize negative effects of quantum and electronic noise. The complex tradeoffs among anatomical background, quantum noise, and electronic noise in projection imaging, tomosynthesis, and CBCT can be described by generalized cascaded systems analysis, providing a useful framework for system design and optimization.
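A power-law background of the kind the phantom mimics can be synthesized by assigning each frequency bin an amplitude proportional to the square root of kappa/f^beta with a random phase, then inverse-transforming. A 1-D pure-Python sketch (the article's phantom is a physical 3-D object; the parameter values here are illustrative):

```python
import cmath
import random

def power_law_noise(n, beta, kappa=1.0, seed=0):
    """1-D noise whose power spectrum follows kappa / f**beta.
    Hermitian symmetry keeps the inverse DFT real; the DC and Nyquist
    bins are zeroed for simplicity."""
    rng = random.Random(seed)
    spec = [0j] * n
    for k in range(1, n // 2):
        amp = (kappa / k ** beta) ** 0.5
        phase = rng.uniform(0.0, 2.0 * cmath.pi)
        spec[k] = amp * cmath.exp(1j * phase)
        spec[n - k] = spec[k].conjugate()
    # Inverse DFT (O(n^2); fine for a demonstration).
    return [sum(spec[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

clutter = power_law_noise(64, beta=3.0)
```

Larger beta concentrates power at low frequencies, producing the smooth, strongly correlated clutter characteristic of anatomical backgrounds, while smaller beta approaches uncorrelated noise.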
NASA Astrophysics Data System (ADS)
Platisa, Ljiljana; Vansteenkiste, Ewout; Goossens, Bart; Marchessoux, Cédric; Kimpe, Tom; Philips, Wilfried
2009-02-01
Medical-imaging systems are designed to aid medical specialists in a specific task. Therefore, the physical parameters of a system need to be chosen to optimize the task performance of a human observer. This requires measurements of human performance in a given task during system optimization. Typically, psychophysical studies are conducted for this purpose. Numerical observer models have been successfully used to predict human performance in several detection tasks. In particular, the task of signal detection using a channelized Hotelling observer (CHO) in simulated images has been widely explored. However, few studies have been done for clinically acquired images that also contain anatomic noise. In this paper, we investigate the performance of a CHO in the task of detecting lung nodules in real radiographic images of the chest. To evaluate the variability introduced by the limited available data, we employ a commonly used multi-reader multi-case (MRMC) study design, which accounts for both case and reader variability. Finally, we use the "one-shot" method to estimate the MRMC variance of the area under the ROC curve (AUC). The obtained AUC compares well to those reported for a human observer study on a similar data set. Furthermore, the "one-shot" analysis implies a fairly consistent performance of the CHO, with the variance of the AUC below 0.002. This indicates promising potential for numerical observers in the optimization of medical-imaging displays and encourages further investigation of the subject.
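For any observer that assigns a score to each case, the empirical AUC is simply the Mann-Whitney statistic over signal-present and signal-absent cases. A minimal sketch (the scores are illustrative, and the MRMC variance estimation is not reproduced here):

```python
def empirical_auc(signal_scores, noise_scores):
    """Empirical area under the ROC curve: the probability that a
    signal-present case outscores a signal-absent one (ties count 1/2)."""
    wins = sum((s > a) + 0.5 * (s == a)
               for s in signal_scores for a in noise_scores)
    return wins / (len(signal_scores) * len(noise_scores))
```

Applied to CHO test statistics on nodule-present and nodule-absent chest images, this is the quantity whose MRMC variance the "one-shot" method estimates.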
NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
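The component roll-up described above, in which aircraft attributes are obtained from the sum of the component attributes, can be sketched in a few lines (the component names and fields are illustrative, not NDARC's actual data model):

```python
from dataclasses import dataclass

@dataclass
class Component:
    """One NDARC-style component with example attributes."""
    name: str
    weight_kg: float
    drag_area_m2: float

def aircraft_attributes(components):
    """Roll up aircraft-level attributes as sums over the components."""
    return {
        "weight_kg": sum(c.weight_kg for c in components),
        "drag_area_m2": sum(c.drag_area_m2 for c in components),
    }

ship = [Component("fuselage", 900.0, 1.10),
        Component("rotor", 350.0, 0.25),
        Component("tail", 120.0, 0.15)]
```

Keeping each component's attribute calculations behind a common interface is what lets the code swap in new or higher-fidelity models per component without touching the aircraft-level roll-up.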
NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts.
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts.
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne R.
2009-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts.
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC - NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2015-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. 
Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
A WorkFlow Engine Oriented Modeling System for Hydrologic Sciences
NASA Astrophysics Data System (ADS)
Lu, B.; Piasecki, M.
2009-12-01
In recent years the use of workflow engines for carrying out modeling and data analysis tasks has gained increased attention in the science and engineering communities. Tasks like processing raw data coming from sensors and passing these raw data streams to filters for QA/QC procedures may require multiple, complicated steps that need to be repeated over and over again. A workflow sequence that carries out a number of steps of varying complexity is an ideal approach to such tasks because the sequence can be stored, called up, and repeated again and again. This has several advantages: for one, it ensures repeatability of processing steps and with that provenance, an issue that is increasingly important in the science and engineering communities. It also permits handing off lengthy, time-consuming, and error-prone tasks to a chain of processing actions that are carried out automatically, reducing the chance of error on the one hand and freeing up time for other tasks on the other. This paper presents the development of a workflow-engine-embedded modeling system that allows users to build up working sequences for carrying out numerical modeling tasks in the hydrologic sciences. Trident, which facilitates creating, running, and sharing scientific data analysis workflows, is taken as the central working engine of the modeling system. Current functionalities of the modeling system include digital watershed processing, online data retrieval, hydrologic simulation, and post-event analysis. These are stored as sequences or modules, respectively. The sequences can be invoked to carry out their preset tasks in order, for example, triangulating a watershed from a raw DEM, whereas the modules, each encapsulating a particular function, can be selected and connected through a GUI workboard to form new sequences.
This modeling system is demonstrated by setting up a new sequence for simulating rainfall-runoff processes. The sequence embeds the Penn State Integrated Hydrologic Model (PIHM) module as its simulation kernel; a DEM-processing sub-sequence that prepares geospatial data for PIHM; a data retrieval module that accesses time series data from an online data repository via web services or from a local database; and a post-processing data management module that stores, visualizes, and analyzes model outputs.
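The stored-sequence idea described above, composable processing modules chained into a repeatable workflow, can be sketched generically. The class and function names here are illustrative inventions, not Trident's or PIHM's actual APIs.

```python
# Generic sketch of a stored, repeatable workflow sequence: each module is a
# named processing step, and a sequence pipes each module's output into the
# next module. Names are hypothetical, not Trident's actual interfaces.

class Sequence:
    def __init__(self, name, modules):
        self.name = name
        self.modules = modules  # ordered list of callables

    def run(self, data):
        # Re-running the stored sequence repeats exactly the same steps,
        # which is what provides repeatability and provenance.
        for module in self.modules:
            data = module(data)
        return data

def qa_qc_filter(samples):
    # Drop obviously bad sensor readings (e.g., negative rainfall values).
    return [s for s in samples if s >= 0.0]

def to_daily_total(samples):
    return sum(samples)

rainfall_sequence = Sequence("rainfall-preprocess",
                             [qa_qc_filter, to_daily_total])
print(rainfall_sequence.run([1.5, -999.0, 0.5, 2.0]))  # 4.0
```

Because modules are plain callables, new sequences can be formed by reordering or recombining them, mirroring the GUI workboard composition the abstract describes.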
Space station data system analysis/architecture study. Task 4: System definition report
NASA Technical Reports Server (NTRS)
1985-01-01
Functional/performance requirements for the Space Station Data System (SSDS) are analyzed and architectural design concepts are derived and evaluated in terms of their performance and growth potential, technical feasibility and risk, and cost effectiveness. The design concepts discussed are grouped under five major areas: SSDS top-level architecture overview, end-to-end SSDS design and operations perspective, communications assumptions and traffic analysis, onboard SSDS definition, and ground SSDS definition.
Neural Correlates of Belief and Emotion Attribution in Schizophrenia.
Lee, Junghee; Horan, William P; Wynn, Jonathan K; Green, Michael F
2016-01-01
Impaired mental state attribution is a core social cognitive deficit in schizophrenia. With functional magnetic resonance imaging (fMRI), this study examined the extent to which the core neural system of mental state attribution is involved in belief attribution and emotion attribution. Fifteen schizophrenia outpatients and 14 healthy controls performed two mental state attribution tasks in the scanner. In a Belief Attribution Task, after reading a short vignette, participants were asked to infer either the belief of a character (a false belief condition) or a physical state of affairs (a false photograph condition). In an Emotion Attribution Task, participants were asked either to judge whether the character(s) in pictures felt an unpleasant, pleasant, or neutral emotion (other condition) or to look at pictures that did not have any human characters (view condition). fMRI data were analyzed focusing on a priori regions of interest (ROIs) of the core neural system of mental state attribution: the medial prefrontal cortex (mPFC), temporoparietal junction (TPJ), and precuneus. An exploratory whole-brain analysis was also performed. Both patients and controls showed greater activation in all four ROIs during the Belief Attribution Task than during the Emotion Attribution Task. Patients also showed less activation in the precuneus and left TPJ than controls during the Belief Attribution Task. No significant group difference was found during the Emotion Attribution Task in any of the ROIs. An exploratory whole-brain analysis showed a similar pattern of neural activations. These findings suggest that while schizophrenia patients rely on the same neural network as controls when attributing beliefs of others, they show reduced activation in key regions such as the TPJ. Further, this study did not find evidence for aberrant neural activation during emotion attribution or for recruitment of compensatory brain regions in schizophrenia.
Howes, Andrew; Lewis, Richard L; Vera, Alonso
2009-10-01
The authors assume that individuals adapt rationally to a utility function given constraints imposed by their cognitive architecture and the local task environment. This assumption underlies a new approach to modeling and understanding cognition, cognitively bounded rational analysis, that sharpens the predictive acuity of general, integrated theories of cognition and action. Such theories provide the necessary computational means to explain the flexible nature of human behavior but in doing so introduce extreme degrees of freedom in accounting for data. The new approach narrows the space of predicted behaviors through analysis of the payoff achieved by alternative strategies, rather than through fitting strategies and theoretical parameters to data. It extends and complements established approaches, including computational cognitive architectures, rational analysis, optimal motor control, bounded rationality, and signal detection theory. The authors illustrate the approach with a reanalysis of an existing account of psychological refractory period (PRP) dual-task performance and the development and analysis of a new theory of ordered dual-task responses. These analyses yield several novel results, including a new understanding of the role of strategic variation in existing accounts of PRP and the first predictive, quantitative account showing how the details of ordered dual-task phenomena emerge from the rational control of a cognitive system subject to the combined constraints of internal variance, motor interference, and a response selection bottleneck.
Scheduling Operational-Level Courses of Action
2003-10-01
Process modelling and analysis – process synchronisation techniques Information and knowledge management – Collaborative planning systems – Workflow...logistics – Some tasks may consume resources The military user may wish to impose synchronisation constraints among tasks A military end state can be...effects, – constrained with resource and synchronisation considerations, and – lead to the achievement of conditions set in the end state. The COA is
ERIC Educational Resources Information Center
Gosman, Minna L.
Developed as a result of an analysis of the task of transcribing as practiced in a health facility, this study guide was designed to teach the knowledge and skills required of a medical transcriber. The medical record department was identified as a major occupational area, and a task inventory for medical records was developed and used as a basis…
Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report
NASA Technical Reports Server (NTRS)
Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick
2009-01-01
The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decisionmakers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques.
The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts as well as performing major probabilistic assessments used to support flight rationale and help establish program requirements. During 2008, the Analysis Group performed more than 70 assessments. Although all these assessments were important, some were instrumental in the decisionmaking processes for the Shuttle and Constellation Programs. Two of the more significant tasks were the Space Transportation System (STS)-122 Low Level Cutoff PRA for the SSP and the Orion Pad Abort One (PA-1) PRA for the CxP. These two activities, along with the numerous other tasks the Analysis Group performed in 2008, are summarized in this report. This report also highlights several ongoing and upcoming efforts to provide crucial statistical and probabilistic assessments, such as the Extravehicular Activity (EVA) PRA for the Hubble Space Telescope service mission and the first fully integrated PRAs for the CxP's Lunar Sortie and ISS missions.
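The kind of probabilistic calculation underlying a PRA can be illustrated with a toy Monte Carlo sketch. The failure probabilities and the system structure below are invented for illustration; real PRAs such as JSC's rely on detailed event and fault trees built from flight data.

```python
import random

# Toy Monte Carlo sketch of a probabilistic risk assessment: estimate the
# probability of losing a mission given independent subsystem failure
# probabilities, with one redundant pair. All numbers are made up.

P_ENGINE = 0.002     # per-mission failure probability, single engine
P_AVIONICS = 0.001
P_STRUCTURE = 0.0005

def mission_fails(rng):
    # Two redundant engines: the mission is lost only if both fail.
    engines_fail = rng.random() < P_ENGINE and rng.random() < P_ENGINE
    return (engines_fail
            or rng.random() < P_AVIONICS
            or rng.random() < P_STRUCTURE)

def estimate_loss_probability(n_trials, seed=0):
    rng = random.Random(seed)
    failures = sum(mission_fails(rng) for _ in range(n_trials))
    return failures / n_trials

# Analytic check: 1 - (1 - P_ENGINE**2)(1 - P_AVIONICS)(1 - P_STRUCTURE),
# roughly 0.0015 for these made-up inputs.
print(estimate_loss_probability(200_000))
```

For rare events like these, analysts typically prefer the analytic cut-set expression; the Monte Carlo version is shown because it generalizes to dependent failures and mission timelines.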
An Analysis of the Plumbing Occupation.
ERIC Educational Resources Information Center
Carlton, Earnest L.; Hollar, Charles E.
The occupational analysis contains a brief job description, presenting for the occupation of plumbing 12 detailed task statements which specify job duties (tools, equipment, materials, objects acted upon, performance knowledge, safety considerations/hazards, decisions, cues, and errors) and learning skills (science, mathematics/number systems, and…
Frequency of distracting tasks people do while driving: an analysis of the ACAS FOT data.
DOT National Transportation Integrated Search
2007-06-01
This report describes further analysis of data from the advanced collision avoidance system (ACAS) field operational test, a naturalistic driving study. To determine how distracted and nondistracted driving differ, a stratified sample of 2,914 video ...
DOT National Transportation Integrated Search
1996-11-01
The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...
2004-01-01
Cognitive Task Analysis Abstract As Department of Defense (DoD) leaders rely more on modeling and simulation to provide information on which to base...capabilities and intent. Cognitive Task Analysis (CTA) Cognitive Task Analysis (CTA) is an extensive/detailed look at tasks and subtasks performed by a...Domain Analysis and Task Analysis: A Difference That Matters. In Cognitive Task Analysis , edited by J. M. Schraagen, S.
Measuring task-related changes in heart rate variability.
Moses, Ziev B; Luecken, Linda J; Eason, James C
2007-01-01
Small beat-to-beat differences in heart rate are the result of dynamic control of the cardiovascular system by the sympathetic and parasympathetic nervous systems. Heart rate variability (HRV) has been positively correlated with both mental and physical health. While many studies measure HRV under rest conditions, few have measured HRV during stressful situations. We describe an experimental protocol designed to measure baseline, task, and recovery values of HRV as a function of three different types of stressors. These stressors involve an attention task, a cold pressor test, and a videotaped speech presentation. We found a measurable change in heart rate in participants (n = 10) during each task (all p's < 0.05). The relative increase or decrease from pre-task heart rate was predicted by task (one-way ANOVA, p = 0.0001). Spectral analysis of HRV during the attention task revealed consistently decreased measures of both high-frequency (68 ± 7%, mean ± SE) and low-frequency (62 ± 13%) HRV components as compared to baseline. HRV spectra for the cold pressor and speech tasks revealed no consistent patterns of increase or decrease from baseline measurements. We also found no correlation in reactivity measures between any of our tasks. These findings suggest that each of the tasks in our experimental design elicits a different type of stress response in an individual. Our experimental approach may prove useful to biobehavioral researchers searching for factors that determine individual differences in responses to stress in daily life.
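The study above used frequency-domain HRV measures; the same idea of quantifying beat-to-beat variability can be sketched more simply with the standard time-domain measures SDNN and RMSSD computed from a series of R-R intervals. The data below are synthetic, generated to mimic a resting baseline versus a stressor task.

```python
import numpy as np

# Time-domain HRV measures from R-R intervals (ms). This is a simplified
# illustration of HRV quantification, not the spectral method of the study.

def sdnn(rr_ms):
    # Standard deviation of all R-R intervals (overall variability).
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms):
    # Root mean square of successive differences; reflects short-term,
    # parasympathetically mediated variability.
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

rng = np.random.default_rng(0)
baseline = 800 + 50 * rng.standard_normal(300)  # resting: slower, variable
task = 700 + 20 * rng.standard_normal(300)      # stressor: faster, flatter

print(sdnn(baseline) > sdnn(task))    # variability drops under load
print(rmssd(baseline) > rmssd(task))
```

A frequency-domain analysis like the study's would additionally resample the R-R series to an even time base and integrate spectral power in low- and high-frequency bands.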
Motor equivalence during multi-finger accurate force production
Mattos, Daniela; Schöner, Gregor; Zatsiorsky, Vladimir M.; Latash, Mark L.
2014-01-01
We explored stability of multi-finger cyclical accurate force production action by analysis of responses to small perturbations applied to one of the fingers and inter-cycle analysis of variance. Healthy subjects performed two versions of the cyclical task, with and without an explicit target. The “inverse piano” apparatus was used to lift/lower a finger by 1 cm over 0.5 s; the subjects were always instructed to perform the task as accurately as they could at all times. Deviations in the spaces of finger forces and modes (hypothetical commands to individual fingers) were quantified in directions that did not change total force (motor equivalent) and in directions that changed the total force (non-motor equivalent). Motor equivalent deviations started immediately with the perturbation and increased progressively with time. After a sequence of lifting-lowering perturbations leading back to the initial conditions, motor equivalent deviations were dominant. These phenomena were less pronounced for analysis performed with respect to the total moment of force about an axis parallel to the forearm/hand. Analysis of inter-cycle variance showed consistently higher variance in a subspace that did not change the total force as compared to the variance that affected total force. We interpret the results as reflections of task-specific stability of the redundant multi-finger system. Large motor equivalent deviations suggest that reactions of the neuromotor system to a perturbation involve large changes of neural commands that do not affect salient performance variables, even during actions whose purpose is to correct those salient variables. Consistency of the analyses of motor equivalence and variance provides additional support for the idea of task-specific stability ensured at a neural level. PMID:25344311
Collection, processing and dissemination of data for the national solar demonstration program
NASA Technical Reports Server (NTRS)
Day, R. E.; Murphy, L. J.; Smok, J. T.
1978-01-01
A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.
Information flow through threespine stickleback networks without social transmission
Atton, N.; Hoppitt, W.; Webster, M. M.; Galef, B. G.; Laland, K. N.
2012-01-01
Social networks can result in directed social transmission of learned information, thus influencing how innovations spread through populations. Here we presented shoals of threespine sticklebacks (Gasterosteus aculeatus) with two identical foraging tasks and applied network-based diffusion analysis (NBDA) to determine whether the order in which individuals in a social group contacted and solved the tasks was affected by the group's network structure. We found strong evidence for a social effect on discovery of the foraging tasks, with individuals tending to discover a task sooner when others in their group had previously done so, and with the spread of discovery of the foraging tasks influenced by groups' social networks. However, the same patterns of association did not reliably predict spread of solution to the tasks, suggesting that social interactions affected the time at which the tasks were discovered, but not the latency to solution following discovery. The present analysis, one of the first applications of NBDA to a natural animal system, illustrates how NBDA can lead to insight into the mechanisms supporting behaviour acquisition that more conventional statistical approaches might miss. Importantly, we provide the first compelling evidence that the spread of novel behaviours can result from social learning in the absence of social transmission, a phenomenon that we refer to as an untransmitted social effect on learning. PMID:22896644
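The process that NBDA models, an individual's chance of discovering a task rising with its network connection to prior discoverers, can be sketched with a small simulation. This simulates the diffusion process itself, not NBDA's likelihood-based estimation; the network, rates, and parameter names are invented for illustration.

```python
import random

# Toy simulation of network-influenced diffusion: each step, an individual's
# discovery probability is a base rate plus a social term proportional to its
# total connection strength to individuals that have already discovered the
# task. Returns the order of discovery.

def simulate_diffusion(adjacency, base_rate, social_rate,
                       seed=0, max_steps=10_000):
    rng = random.Random(seed)
    n = len(adjacency)
    discovered = [False] * n
    order = []
    for _ in range(max_steps):
        for i in range(n):
            if discovered[i]:
                continue
            social = sum(adjacency[i][j] for j in range(n) if discovered[j])
            p = min(1.0, base_rate + social_rate * social)
            if rng.random() < p:
                discovered[i] = True
                order.append(i)
        if all(discovered):
            break
    return order

# Two tightly bonded pairs (0,1) and (2,3), weakly linked across pairs.
adj = [[0, 1, 0.1, 0.1],
       [1, 0, 0.1, 0.1],
       [0.1, 0.1, 0, 1],
       [0.1, 0.1, 1, 0]]
print(simulate_diffusion(adj, base_rate=0.005, social_rate=0.05))
```

With this structure, once one member of a pair discovers the task its partner tends to follow quickly, which is the association-predicts-order pattern NBDA tests for.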
Cognitive task analysis: harmonizing tasks to human capacities.
Neerincx, M A; Griffioen, E
1996-04-01
This paper presents the development of a cognitive task analysis that assesses the task load of jobs and provides indicators for the redesign of jobs. General principles of human task performance were selected and subsequently integrated into current task modelling techniques. The resulting cognitive task analysis centres around four aspects of task load: the number of actions in a period, the ratio between knowledge- and rule-based actions, lengthy uninterrupted actions, and momentary overloading. The method consists of three stages: (1) construction of a hierarchical task model, (2) a time-line analysis and task load assessment, and (3), if necessary, adjustment of the task model. An application of the cognitive task analysis in railway traffic control showed its benefits over the 'old' task load analysis of the Netherlands Railways. It provided a provisional standard for traffic control jobs, revealed two load risks (momentary overloading and underloading), and resulted in proposals to satisfy the standard and to diminish the two load risks.
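The four task-load aspects named in this abstract can be computed from a simple time-line of actions, as a rough sketch. The field names, the example time-line, and the "lengthy action" threshold are illustrative assumptions, not the paper's actual formalism.

```python
from dataclasses import dataclass

# Sketch of the four task-load indicators: actions per period, ratio of
# knowledge- to rule-based actions, count of lengthy actions, and momentary
# overloading (peak number of concurrently active actions). Thresholds and
# field names are illustrative, not the method's actual definitions.

@dataclass
class Action:
    start: float            # minutes into the time-line
    duration: float         # minutes
    knowledge_based: bool   # True = knowledge-based, False = rule-based

def task_load_indicators(actions, period=60.0, lengthy=10.0):
    n = len(actions)
    kb = sum(a.knowledge_based for a in actions)
    end = max(a.start + a.duration for a in actions)
    return {
        # Action rate, extrapolated to actions per `period` minutes.
        "actions_per_period": n / (end / period),
        "knowledge_rule_ratio": kb / max(1, n - kb),
        "lengthy_actions": sum(a.duration >= lengthy for a in actions),
        # Peak number of actions whose time intervals overlap.
        "momentary_overload": max(
            sum(b.start < a.start + a.duration and
                a.start < b.start + b.duration for b in actions)
            for a in actions),
    }

timeline = [Action(0, 5, False), Action(3, 12, True), Action(20, 4, False)]
print(task_load_indicators(timeline))
```

In the paper's terms, a redesign proposal would then adjust the task model (stage 3) wherever these indicators exceed the provisional standard.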
1988-11-01
system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit...boxes into hierarchies suitable for computer implementation. 3. Structured Design uses tools, especially graphic ones, to render systems readily...LSA, PROCESSES, DATA FLOWS, DATA STORES, EXTERNAL ENTITIES, OVERALL SYSTEMS DESIGN PROCESS
Sheehan, Barbara; Lucero, Robert J
2015-01-01
Electronic personal health record-based (ePHR-based) self-management systems can improve patient engagement and have an impact on health outcomes. In order to realize the benefits of these systems, there is a need to develop and evaluate health information technology from the same theoretical underpinnings. Using an innovative usability approach based in human-centered distributed information design (HCDID), we tested an ePHR-based falls-prevention self-management system, Self-Assessment via a Personal Health Record (SAPHeR), designed using HCDID principles, in a laboratory, and we later evaluated SAPHeR's use by community-dwelling older adults at home. The innovative approach used in this study supported the analysis of four components: tasks, users, representations, and functions. Tasks were easily learned, and features such as text-associated images facilitated task completion. Task performance times were slow; however, user satisfaction was high. Nearly seven out of every ten features desired by design participants were evaluated in our usability testing of the SAPHeR system. The in vivo evaluation suggests that older adults could improve their confidence in performing indoor and outdoor activities after using the SAPHeR system. We have applied an innovative consumer-usability evaluation. Our approach addresses the limitations of other usability testing methods that do not utilize consistent theoretically based methods for designing and testing technology. We have successfully demonstrated the utility of testing consumer technology use across multiple components (i.e., task, user, representational, functional) to evaluate the usefulness, usability, and satisfaction of an ePHR-based self-management system.
DOT National Transportation Integrated Search
2015-12-01
This study used the National EMS Information System (NEMSIS) South Dakota data to develop data-driven performance metrics for EMS. Researchers used the data for three tasks: geospatial analysis of EMS events, optimization of station locations, and ser...
Automating a Detailed Cognitive Task Analysis for Structuring Curriculum
1991-06-01
Cognitive Task Analysis For... cognitive task analysis techniques. A rather substantial literature has been amassed relative to automated knowledge acquisition but only seven...references have been found in our data base search of literature specifically addressing cognitive task analysis. A variety of forms of cognitive task analysis
Evolutionary space platform concept study. Volume 2, part B: Manned space platform concepts
NASA Technical Reports Server (NTRS)
1982-01-01
Logical, cost-effective steps in the evolution of manned space platforms are investigated and assessed. Tasks included the analysis of requirements for a manned space platform, identifying alternative concepts, performing system analysis and definition of the concepts, comparing the concepts and performing programmatic analysis for a reference concept.
An Ideal Observer Analysis of Visual Working Memory
ERIC Educational Resources Information Center
Sims, Chris R.; Jacobs, Robert A.; Knill, David C.
2012-01-01
Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around…
Bueno, Mercedes; Fort, Alexandra; Francois, Mathilde; Ndiaye, Daniel; Deleurence, Philippe; Fabrigoule, Colette
2013-04-29
Forward Collision Warning Systems (FCWS) are expected to assist drivers; however, it is not completely clear whether these systems benefit distracted drivers as much as they do undistracted drivers. This study aims to further investigate the effectiveness of a surrogate FCWS according to the attentional state of participants. In this experiment, electrophysiological and behavioural data were recorded while participants were required to drive in a simple car simulator and to react to the braking of the lead vehicle, which could be announced by a warning system. The effectiveness of this warning system was evaluated when drivers were distracted or not by a secondary cognitive task. In a previous study, the warning signal was not completely effective, likely due to the presence of another predictor of the forthcoming braking that competed with the warning. By eliminating this secondary predictor in the present study, the results confirmed the negative effect of the secondary task and revealed the expected effectiveness of the warning system at behavioural and electrophysiological levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Kozora, E; Uluğ, A M; Erkan, D; Vo, A; Filley, C M; Ramon, G; Burleson, A; Zimmerman, R; Lockshin, M D
2016-11-01
Standardized cognitive tests and functional magnetic resonance imaging (fMRI) studies of systemic lupus erythematosus (SLE) patients demonstrate deficits in working memory and executive function. These neurobehavioral abnormalities are not well studied in antiphospholipid syndrome, which may occur independently of or together with SLE. This study compares an fMRI paradigm involving motor skills, working memory, and executive function in SLE patients without antiphospholipid antibody (aPL) (the SLE group), aPL-positive non-SLE patients (the aPL-positive group), and controls. Brain MRI, fMRI, and standardized cognitive assessment results were obtained from 20 SLE, 20 aPL-positive, and 10 healthy female subjects with no history of neuropsychiatric abnormality. Analysis of fMRI data showed no differences in performance across groups on bilateral motor tasks. When analysis of variance was used, significant group differences were found in 2 executive function tasks (word generation and word rhyming) and in a working memory task (N-Back). Patients positive for aPL demonstrated higher activation in bilateral frontal, temporal, and parietal cortices compared to controls during working memory and executive function tasks. SLE patients also demonstrated bilateral frontal and temporal activation during working memory and executive function tasks. Compared to controls, both aPL-positive and SLE patients had elevated cortical activation, primarily in the frontal lobes, during tasks involving working memory and executive function. These findings are consistent with cortical overactivation as a compensatory mechanism for early white matter neuropathology in these disorders. © 2016, American College of Rheumatology.
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
Error rate information in attention allocation pilot models
NASA Technical Reports Server (NTRS)
Faulkner, W. H.; Onstott, E. D.
1977-01-01
The Northrop urgency decision pilot model was used in a command tracking task to compare the optimized performance of multiaxis attention allocation pilot models whose urgency functions were (1) based on tracking error alone, and (2) based on both tracking error and error rate. A matrix of system dynamics and command inputs was employed to create both symmetric and asymmetric two-axis compensatory tracking tasks. All tasks were single loop on each axis. Analysis showed that a model that allocates control attention through nonlinear urgency functions using only error information could not achieve the performance of the full model, whose attention shifting algorithm included both error and error rate terms. Subsequent to this analysis, tracking performance predictions for the full model were verified by piloted flight simulation. Complete model and simulation data are presented.
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 4
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NDARC code performs design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance analysis, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. The principal tasks (sizing, mission analysis, flight performance analysis) are shown in the figure as boxes with heavy borders. Heavy arrows show control of subordinate tasks. The aircraft description consists of all the information, input and derived, that defines the aircraft. The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. This information can be the result of the sizing task; can come entirely from input, for a fixed model; or can come from the sizing task in a previous case or previous job. The aircraft description information is available to all tasks and all solutions. The sizing task determines the dimensions, power, and weight of a rotorcraft that can perform a specified set of design conditions and missions. The aircraft size is characterized by parameters such as design gross weight, weight empty, rotor radius, and engine power available. The relations between dimensions, power, and weight generally require an iterative solution. From the design flight conditions and missions, the task can determine the total engine power or the rotor radius (or both power and radius can be fixed), as well as the design gross weight, maximum takeoff weight, drive system torque limit, and fuel tank capacity. For each propulsion group, the engine power or the rotor radius can be sized. Missions are defined for the sizing task, and for the mission performance analysis. A mission consists of a number of mission segments, for which time, distance, and fuel burn are evaluated.
For the sizing task, certain missions are designated to be used for design gross weight calculations; for transmission sizing; and for fuel tank sizing. The mission parameters include mission takeoff gross weight and useful load. For specified takeoff fuel weight with adjustable segments, the mission time or distance is adjusted so the fuel required for the mission equals the takeoff fuel weight. The mission iteration is on fuel weight or energy. Flight conditions are specified for the sizing task, and for the flight performance analysis. For the sizing task, certain flight conditions are designated to be used for design gross weight calculations; for transmission sizing; for maximum takeoff weight calculations; and for anti-torque or auxiliary thrust rotor sizing. The flight condition parameters include gross weight and useful load. For flight conditions and mission takeoff, the gross weight can be maximized, such that the power required equals the power available. A flight state is defined for each mission segment and each flight condition. The aircraft performance can be analyzed for the specified state, or a maximum effort performance can be identified. The maximum effort is specified in terms of a quantity such as best endurance or best range, and a variable such as speed, rate of climb, or altitude.
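The iterative sizing solution described above can be illustrated with a toy fixed-point loop. This is a minimal sketch under invented assumptions, not NDARC code: the empty-weight model `k * DGW**e` and its constants are purely illustrative.

```python
def size_gross_weight(payload, fuel, k=0.5, e=0.95, tol=1e-6, max_iter=200):
    """Toy sizing loop: design gross weight (DGW) must equal payload plus
    fuel plus empty weight, where empty weight is modeled as a hypothetical
    function k * DGW**e of the gross weight itself."""
    dgw = payload + fuel  # initial guess, no empty weight
    for _ in range(max_iter):
        empty_weight = k * dgw ** e
        new_dgw = payload + fuel + empty_weight
        if abs(new_dgw - dgw) < tol:
            return new_dgw  # self-consistent design gross weight
        dgw = new_dgw
    raise RuntimeError("sizing iteration did not converge")
```

Because the assumed empty-weight model grows sublinearly with gross weight, the update is a contraction and the loop settles on a self-consistent design gross weight, mirroring the "iterative solution" between dimensions, power, and weight that the abstract mentions.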
Preliminary Face and Construct Validation Study of a Virtual Basic Laparoscopic Skill Trainer
Sankaranarayanan, Ganesh; Lin, Henry; Arikatla, Venkata S.; Mulcare, Maureen; Zhang, Likun; Derevianko, Alexandre; Lim, Robert; Fobert, David; Cao, Caroline; Schwaitzberg, Steven D.; Jones, Daniel B.
2010-01-01
Abstract Background The Virtual Basic Laparoscopic Skill Trainer (VBLaST™) is a developing virtual-reality–based surgical skill training system that incorporates several of the tasks of the Fundamentals of Laparoscopic Surgery (FLS) training system. This study aimed to evaluate the face and construct validity of the VBLaST™ system. Materials and Methods Thirty-nine subjects were recruited as volunteers at the Beth Israel Deaconess Medical Center (Boston, MA) and classified into two groups: experts (PGY 5, fellows, and practicing surgeons) and novices (PGY 1–4). They were then asked to perform three FLS tasks, consisting of peg transfer, pattern cutting, and endoloop, on both the VBLaST and FLS systems. The VBLaST performance scores were automatically computed, while the FLS scores were rated by a trained evaluator. Face validity was assessed using a 5-point Likert scale, ranging from not realistic/useful (1) to very realistic/useful (5). Results Face-validity scores showed that the VBLaST system was significantly realistic in portraying the three FLS tasks (3.95 ± 0.909), as well as in trocar placement and tool movements (3.67 ± 0.874). Construct-validity results showed that VBLaST was able to differentiate between the expert and novice groups (P = 0.015). However, of the two tasks used for evaluating VBLaST, only the peg-transfer task showed a significant difference between the expert and novice groups (P = 0.003). Spearman correlation coefficient analysis between the two scores showed significant correlation for the peg-transfer task (Spearman coefficient 0.364; P = 0.023). Conclusions VBLaST demonstrated significant face and construct validity. A further set of studies, involving improvements to the current VBLaST system, is needed to thoroughly demonstrate face and construct validity for all the tasks. PMID:20201683
Caballero Sánchez, Carla; Barbado Murillo, David; Davids, Keith; Moreno Hernández, Francisco J
2016-06-01
This study investigated the extent to which specific interacting constraints of performance might increase or decrease the emergent complexity in a movement system, and whether this could affect the relationship between observed movement variability and the central nervous system's capacity to adapt to perturbations during balancing. Fifty-two healthy volunteers performed eight trials where different performance constraints were manipulated: task difficulty (three levels) and visual biofeedback conditions (with and without the center of pressure (COP) displacement and a target displayed). Balance performance was assessed using COP-based measures: mean velocity magnitude (MVM) and bivariate variable error (BVE). To assess the complexity of COP, fuzzy entropy (FE) and detrended fluctuation analysis (DFA) were computed. ANOVAs showed that MVM and BVE increased when task difficulty increased. During biofeedback conditions, individuals showed higher MVM but lower BVE at the easiest level of task difficulty. Overall, higher FE and lower DFA values were observed when biofeedback was available. On the other hand, FE reduced and DFA increased as difficulty level increased, in the presence of biofeedback. However, when biofeedback was not available, the opposite trend in FE and DFA values was observed. Regardless of changes to task constraints and the variable investigated, balance performance was positively related to complexity in every condition. Data revealed how specificity of task constraints can result in an increase or decrease in complexity emerging in a neurobiological system during balance performance.
Linking normative models of natural tasks to descriptive models of neural response.
Jaini, Priyank; Burge, Johannes
2017-10-01
Understanding how nervous systems exploit task-relevant properties of sensory stimuli to perform natural tasks is fundamental to the study of perceptual systems. However, there are few formal methods for determining which stimulus properties are most useful for a given natural task. As a consequence, it is difficult to develop principled models for how to compute task-relevant latent variables from natural signals, and it is difficult to evaluate descriptive models fit to neural responses. Accuracy maximization analysis (AMA) is a recently developed Bayesian method for finding the optimal task-specific filters (receptive fields). Here, we introduce AMA-Gauss, a new, faster form of AMA that incorporates the assumption that the class-conditional filter responses are Gaussian distributed. Then, we use AMA-Gauss to show that its assumptions are justified for two fundamental visual tasks: retinal speed estimation and binocular disparity estimation. Next, we show that AMA-Gauss has striking formal similarities to popular quadratic models of neural response: the energy model and the generalized quadratic model (GQM). Together, these developments deepen our understanding of why the energy model of neural response has proven useful, improve our ability to evaluate results from subunit model fits to neural data, and should help accelerate psychophysics and neuroscience research with natural stimuli.
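The Gaussian assumption at the heart of AMA-Gauss makes the posterior over the latent variable a simple function of the filter responses. A minimal one-filter sketch follows; the class means, standard deviations, and priors are invented for illustration and this is not the authors' code:

```python
import math

def class_posterior(r, means, sds, priors):
    """Posterior probability of each latent-variable class given a scalar
    filter response r, assuming Gaussian class-conditional response
    distributions (the AMA-Gauss assumption)."""
    likelihoods = [
        p * math.exp(-0.5 * ((r - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        for m, s, p in zip(means, sds, priors)
    ]
    total = sum(likelihoods)
    return [l / total for l in likelihoods]
```

With equal class variances the log-posterior is linear in the response; unequal variances make it quadratic, which is the formal link to the energy model and the GQM that the abstract describes.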
Object versus spatial visual mental imagery in patients with schizophrenia
Aleman, André; de Haan, Edward H.F.; Kahn, René S.
2005-01-01
Objective Recent research has revealed a larger impairment of object perceptual discrimination than of spatial perceptual discrimination in patients with schizophrenia. It has been suggested that mental imagery may share processing systems with perception. We investigated whether patients with schizophrenia would show greater impairment regarding object imagery than spatial imagery. Methods Forty-four patients with schizophrenia and 20 healthy control subjects were tested on a task of object visual mental imagery and on a task of spatial visual mental imagery. Both tasks included a condition in which no imagery was needed for adequate performance, but which was in other respects identical to the imagery condition. This allowed us to adjust for nonspecific differences in individual performance. Results The results revealed a significant difference between patients and controls on the object imagery task (F1,63 = 11.8, p = 0.001) but not on the spatial imagery task (F1,63 = 0.14, p = 0.71). To test for a differential effect, we conducted a 2 (patients v. controls) × 2 (object task v. spatial task) analysis of variance. The interaction term was statistically significant (F1,62 = 5.2, p = 0.026). Conclusions Our findings suggest a differential dysfunction of systems mediating object and spatial visual mental imagery in schizophrenia. PMID:15644999
A SWOT analysis of the organization and financing of the Danish health care system.
Christiansen, Terkel
2002-02-01
The organization and financing of the Danish health care system was evaluated within a framework of a SWOT analysis (analysis of Strengths, Weaknesses, Opportunities and Threats) by a panel of five members with a background in health economics. The present paper describes the methods and materials used for the evaluation: selection of panel members, structure of the evaluation task according to the health care triangle model, selection of background material consisting of documents and literature on the Danish health care system, and a 1-week study visit.
Automating security monitoring and analysis for Space Station Freedom's electric power system
NASA Technical Reports Server (NTRS)
Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han
1990-01-01
Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A new approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.
Seifert, L; De Jesus, K; Komar, J; Ribeiro, J; Abraldes, J A; Figueiredo, P; Vilas-Boas, J P; Fernandes, R J
2016-06-01
The aim was to examine behavioural variability within and between individuals in a swimming task, to explore how athletes of different specialties (competitive short-distance swimming vs. triathlon) adapt to repeated events of sub-maximal intensity, controlled in speed but of various distances. Five swimmers and five triathletes randomly performed three variants (with steps of 200, 300 and 400 m) of a front crawl incremental step test until exhaustion. A multi-camera system was used to collect and analyse eight kinematic and swimming-efficiency parameters. Analysis of variance showed significant differences between swimmers and triathletes, with a significant individual effect. Cluster analysis grouped these parameters to investigate whether each individual used the same pattern(s), and one or several patterns, to achieve the task goal. Results exhibited ten patterns for the whole population, with only two behavioural patterns shared between swimmers and triathletes. Swimmers tended to use higher hand velocity and index of coordination than triathletes. Mono-stability occurred in swimmers whatever the task constraint, showing high stability, while triathletes revealed bi-stability because they switched to another pattern at mid-distance of the task. Finally, our analysis helped to explain and understand the effect of specialty and, more broadly, individual adaptation to task constraints. Copyright © 2016 Elsevier B.V. All rights reserved.
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J. A. Forester; V. N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis documented in a crew response tree (CRT); the development of the associated time-line to identify the critical tasks, i.e., those whose failure results in a human failure event (HFE); and an approach to quantification based on explanations of why the HFE might occur.
NPSS Multidisciplinary Integration and Analysis
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Rasche, Joseph; Simons, Todd A.; Hoyniak, Daniel
2006-01-01
The objective of this task was to enhance the capability of the Numerical Propulsion System Simulation (NPSS) by expanding its reach into the high-fidelity multidisciplinary analysis area. This task investigated numerical techniques for converting compressor blades from cold static to hot running geometry. Numerical calculations of blade deformations were performed iteratively, coupling high-fidelity flow simulations with high-fidelity structural analysis of the compressor blade. The flow simulations were performed with the Advanced Ducted Propfan Analysis (ADPAC) code, while structural analyses were performed with the ANSYS code. High-fidelity analyses were used to evaluate the effects on performance of variations in tip clearance, uncertainty in manufacturing tolerance, and variable inlet guide vane scheduling, as well as the effects of rotational speed on the hot running geometry of the compressor blades.
ERIC Educational Resources Information Center
Green, C. Paul; Orsak, Charles G.
Undertaking of a systems approach to curriculum development for solar training led to (1) a feasibility study to determine the role of the community college in solar energy technology, (2) a market analysis to determine the manpower need, and (3) a task analysis for development of a curriculum for training solar energy technicians at Navarro…
Automating a Detailed Cognitive Task Analysis for Structuring Curriculum
1991-08-01
Title: Automating a Detailed Cognitive Task Analysis for Structuring Curriculum. Activities: To date we have completed task... The Institute for Management Sciences. Although the particular application of the modified GOMS cognitive task analysis technique under development is... Research Plan, Year 1: Task 1.0 Design; Task 1.1 Conduct...
The case against specialized visual-spatial short-term memory.
Morey, Candice C
2018-05-24
The dominant paradigm for understanding working memory, or the combination of the perceptual, attentional, and mnemonic processes needed for thinking, subdivides short-term memory (STM) according to whether memoranda are encoded in aural-verbal or visual formats. This traditional dissociation has been supported by examples of neuropsychological patients who seem to selectively lack STM for aural-verbal, visual, or spatial memoranda, and by experimental research using dual-task methods. Though this evidence is the foundation of assumptions of modular STM systems, the case it makes for a specialized visual STM system is surprisingly weak. I identify the key evidence supporting a distinct verbal STM system (patients with apparent selective damage to verbal STM, and the resilience of verbal short-term memories to general dual-task interference) and apply these benchmarks to neuropsychological and experimental investigations of visual-spatial STM. Contrary to the evidence on verbal STM, patients with apparent visual or spatial STM deficits tend to experience a wide range of additional deficits, making it difficult to conclude that a distinct short-term store was damaged. Consistent with this, a meta-analysis of dual-task visual-spatial STM research shows that robust dual-task costs are consistently observed regardless of the domain or sensory code of the secondary task. Together, this evidence suggests that positing a specialized visual STM system is not necessary. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Youngblood, John N.; Saha, Aindam
1987-01-01
Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on the critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
Liu, Y; Wickens, C D
1994-11-01
The evaluation of mental workload is becoming increasingly important in system design and analysis. The present study examined the structure and assessment of mental workload in performing decision and monitoring tasks by focusing on two mental workload measurements: subjective assessment and time estimation. The task required the assignment of a series of incoming customers to the shortest of three parallel service lines displayed on a computer monitor. The subject was either in charge of the customer assignment (manual mode) or was monitoring an automated system performing the same task (automatic mode). In both cases, the subjects were required to detect the non-optimal assignments that they or the computer had made. Time pressure was manipulated by the experimenter to create fast and slow conditions. The results revealed a multi-dimensional structure of mental workload and a multi-step process of subjective workload assessment. The results also indicated that subjective workload was more influenced by the subject's participatory mode than by the factor of task speed. The time estimation intervals produced while performing the decision and monitoring tasks had significantly greater length and larger variability than those produced while either performing no other tasks or performing a well practised customer assignment task. This result seemed to indicate that time estimation was sensitive to the presence of perceptual/cognitive demands, but not to response related activities to which behavioural automaticity has developed.
GOATS Image Projection Component
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2011-01-01
When doing mission analysis and design of an imaging system in orbit around the Earth, answering the fundamental question of imaging performance requires an understanding of the image products that will be produced by the imaging system. GOATS software represents a series of MATLAB functions to provide for geometric image projections. Unique features of the software include function modularity, a standard MATLAB interface, easy-to-understand first-principles-based analysis, and the ability to perform geometric image projections of framing type imaging systems. The software modules are created for maximum analysis utility, and can all be used independently for many varied analysis tasks, or used in conjunction with other orbit analysis tools.
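At its simplest, a geometric image projection of the framing-camera type reduces to a pinhole model. The sketch below is a generic illustration in Python, not the GOATS MATLAB code; the function name and conventions are assumptions:

```python
def pinhole_project(point, focal_length):
    """Project a 3-D point in the camera frame (+z toward the scene,
    consistent length units) onto the image plane of an ideal pinhole
    camera with the given focal length."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal_length * x / z, focal_length * y / z)
```

Real projection modules add lens distortion, orbit geometry, and Earth-surface intersection on top of this first-principles core, which is the kind of modular analysis the abstract describes.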
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogden, J.M.; Steinbugler, M.; Dennis, E.
For several years, researchers at Princeton University's Center for Energy and Environmental Studies have carried out technical and economic assessments of hydrogen energy systems. Initially, we focussed on the long-term potential of renewable hydrogen. More recently we have explored how a transition to renewable hydrogen might begin. The goal of our current work is to identify promising strategies leading from near-term hydrogen markets and technologies toward eventual large-scale use of renewable hydrogen as an energy carrier. Our approach has been to assess the entire hydrogen energy system from production through end-use, considering technical performance, economics, infrastructure and environmental issues. This work is part of the systems analysis activity of the DOE Hydrogen Program. In this paper we first summarize the results of three tasks which were completed during the past year under NREL Contract No. XR-11265-2: in Task 1, we carried out assessments of near-term options for supplying hydrogen transportation fuel from natural gas; in Task 2, we assessed the feasibility of using the existing natural gas system with hydrogen and hydrogen blends; and in Task 3, we carried out a study of PEM fuel cells for residential cogeneration applications, a market which might have less stringent cost requirements than transportation. We then give preliminary results for two other tasks which are ongoing under DOE Contract No. DE-FG04-94AL85803: in Task 1 we are assessing the technical options for low-cost, small-scale production of hydrogen from natural gas, considering (a) steam reforming, (b) partial oxidation and (c) autothermal reforming, and in Task 2 we are assessing potential markets for hydrogen in Southern California.
Assessing performance in complex team environments.
Whitmore, Jeffrey N
2005-07-01
This paper provides a brief introduction to team performance assessment. It highlights some critical aspects leading to the successful measurement of team performance in realistic console operations; discusses the idea of process and outcome measures; presents two types of team data collection systems; and provides an example of team performance assessment. Team performance assessment is a complicated endeavor relative to assessing individual performance. Assessing team performance necessitates a clear understanding of each operator's task, both at the individual and team level, and requires planning for efficient data capture and analysis. Though team performance assessment requires considerable effort, the results can be very worthwhile. Most tasks performed in Command and Control environments are team tasks, and understanding this type of performance is becoming increasingly important to the evaluation of mission success and for overall system optimization.
Crew procedures for microwave landing system operations
NASA Technical Reports Server (NTRS)
Summers, Leland G.
1987-01-01
The objective of this study was to identify crew procedures involved in Microwave Landing System (MLS) operations and to obtain a preliminary assessment of crew workload. The crew procedures were identified for three different complements of airborne equipment coupled to an autopilot. Using these three equipment complements, crew tasks were identified for MLS approaches and precision departures and compared to an ILS approach and a normal departure. Workload comparisons between the approaches and departures were made by using a task-timeline analysis program that obtained workload indexes, i.e., the ratio of time required to complete the tasks to the time available. The results showed an increase in workload for the MLS scenario for one of the equipment complements. However, even this workload was within the capacity of two crew members.
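The workload index used in such task-timeline analyses is simply a ratio. A minimal sketch, where the function name and the overload interpretation are illustrative rather than taken from the study:

```python
def workload_index(task_times, time_available):
    """Ratio of the time required to complete the tasks to the time
    available; values above 1.0 suggest the crew member is overloaded."""
    if time_available <= 0:
        raise ValueError("time_available must be positive")
    return sum(task_times) / time_available
```

A timeline analysis evaluates this ratio over successive intervals of the approach or departure, flagging the intervals where demand exceeds capacity.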
From Text to Context: An Open Systems Approach to Research in Written Business Communication.
ERIC Educational Resources Information Center
Suchan, Jim; Dulek, Ron
1998-01-01
Discusses open systems thinking as a new lens to use when exploring written business communication--a lens that integrates task, organizational structure, control, and technology into the analysis of written business messages. Explores the influences these subsystems have on written communication and then develops these systems and subsystems into…
Research Staff | Advanced Manufacturing Research | NREL
Kevin Bennion leads NREL's Thermal Sciences and Systems research task, focused on vehicle thermal management and vehicle systems analysis. He came to NREL from Ford Motor Company, where he focused on thermal management and reliability for power electronics and electric machines for several
NASA Technical Reports Server (NTRS)
Simon, William E.; Li, Ku-Yen; Yaws, Carl L.; Mei, Harry T.; Nguyen, Vinh D.; Chu, Hsing-Wei
1994-01-01
A methyl acetate reactor was developed to perform a subscale kinetic investigation in the design and optimization of a full-scale metabolic simulator for long term testing of life support systems. Other tasks in support of the closed ecological life support system test program included: (1) heating, ventilation and air conditioning analysis of a variable pressure growth chamber, (2) experimental design for statistical analysis of plant crops, (3) resource recovery for closed life support systems, and (4) development of data acquisition software for automating an environmental growth chamber.
Mayo clinic NLP system for patient smoking status identification.
Savova, Guergana K; Ogren, Philip V; Duffy, Patrick H; Buntrock, James D; Chute, Christopher G
2008-01-01
This article describes our system entry for the 2006 I2B2 contest "Challenges in Natural Language Processing for Clinical Data" for the task of identifying the smoking status of patients. Our system makes the simplifying assumption that patient-level smoking status determination can be achieved by accurately classifying individual sentences from a patient's record. We created our system with reusable text analysis components built on the Unstructured Information Management Architecture and Weka. This reuse of code minimized the development effort related specifically to our smoking status classifier. We report precision, recall, F-score, and 95% exact confidence intervals for each metric. Recasting the classification task for the sentence level and reusing code from other text analysis projects allowed us to quickly build a classification system that performs with a system F-score of 92.64 based on held-out data tests and of 85.57 on the formal evaluation data. Our general medical natural language engine is easily adaptable to a real-world medical informatics application. Some of the limitations as applied to the use-case are negation detection and temporal resolution.
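The reported precision, recall, and F-score follow the standard definitions; a minimal sketch (the confusion counts below are illustrative, not the contest results):

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from true-positive, false-positive,
    and false-negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Illustrative counts for one smoking-status class.
p, r, f = precision_recall_f1(tp=90, fp=10, fn=5)
```

The exact 95% confidence intervals mentioned in the abstract would be computed per metric over the held-out evaluation set.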
Wagner, David W; Reed, Matthew P; Chaffin, Don B
2010-11-01
Accurate prediction of foot placements in relation to hand locations during manual materials handling tasks is critical for prospective biomechanical analysis. To address this need, the effects of lifting task conditions and anthropometric variables on foot placements were studied in a laboratory experiment. In total, 20 men and women performed two-handed object transfers that required them to walk to a shelf, lift an object from the shelf at waist height and carry the object to a variety of locations. Five different changes in the direction of progression following the object pickup were used, ranging from 45° to 180° relative to the approach direction. Object weights of 1.0 kg, 4.5 kg, and 13.6 kg were used. Whole-body motions were recorded using a 3-D optical retro-reflective marker-based camera system. A new parametric system for describing foot placements, the Quantitative Transition Classification System, was developed to facilitate the parameterisation of foot placement data. Foot placements chosen by the subjects during the transfer tasks appeared to facilitate a change in the whole-body direction of progression, in addition to aiding in performing the lift. Further analysis revealed that five different stepping behaviours accounted for 71% of the stepping patterns observed. More specifically, the most frequently observed behaviour revealed that the orientation of the lead foot during the actual lifting task was primarily affected by the amount of turn angle required after the lift (R² = 0.53). One surprising result was that the object mass (scaled by participant body mass) was not found to significantly affect any of the individual step placement parameters. Regression models were developed to predict the most prevalent step placements and are included in this paper to facilitate more accurate human motion simulations and ergonomics analyses of manual material lifting tasks.
STATEMENT OF RELEVANCE: This study proposes a method for parameterising the steps (foot placements) associated with manual material handling tasks. The influence of task conditions and subject anthropometry on the foot placements of the most frequently observed stepping pattern during a laboratory study is discussed. For prospective postural analyses conducted using digital human models, accurate prediction of the foot placements is critical to realistic postural analyses and improved biomechanical job evaluations.
Computer-enhanced laparoscopic training system (CELTS): bridging the gap.
Stylopoulos, N; Cotin, S; Maithel, S K; Ottensmeye, M; Jackson, P G; Bardsley, R S; Neumann, P F; Rattner, D W; Dawson, S L
2004-05-01
There is a large and growing gap between the need for better surgical training methodologies and the systems currently available for such training. In an effort to bridge this gap and overcome the disadvantages of the training simulators now in use, we developed the Computer-Enhanced Laparoscopic Training System (CELTS). CELTS is a computer-based system capable of tracking the motion of laparoscopic instruments and providing feedback about performance in real time. CELTS consists of a mechanical interface, a customizable set of tasks, and an Internet-based software interface. The special cognitive and psychomotor skills a laparoscopic surgeon should master were explicitly defined and transformed into quantitative metrics based on kinematics analysis theory. A single global standardized and task-independent scoring system utilizing a z-score statistic was developed. Validation exercises were performed. The scoring system clearly revealed a gap between experts and trainees, irrespective of the task performed; none of the trainees obtained a score above the threshold that distinguishes the two groups. Moreover, CELTS provided educational feedback by identifying the key factors that contributed to the overall score. Among the defined metrics, depth perception, smoothness of motion, instrument orientation, and the outcome of the task are major indicators of performance and key parameters that distinguish experts from trainees. Time and path length alone, which are the most commonly used metrics in currently available systems, are not considered good indicators of performance. CELTS is a novel and standardized skills trainer that combines the advantages of computer simulation with the features of the traditional and popular training boxes. CELTS can easily be used with a wide array of tasks and ensures comparability across different training conditions. 
This report further shows that a set of appropriate and clinically relevant performance metrics can be defined and a standardized scoring system can be designed.
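The task-independent z-score scoring described above can be sketched as standardizing each kinematic metric against an expert reference group (the metric names and values below are hypothetical):

```python
from statistics import mean, stdev

def z_scores(trainee_metrics, expert_samples):
    """Standardize each metric against expert reference data:
    z = (x - expert_mean) / expert_std."""
    out = {}
    for name, value in trainee_metrics.items():
        ref = expert_samples[name]
        out[name] = (value - mean(ref)) / stdev(ref)
    return out

# Hypothetical metrics for one laparoscopic task.
experts = {"path_length_mm": [400, 420, 380, 410],
           "smoothness": [0.90, 0.85, 0.95, 0.90]}
trainee = {"path_length_mm": 600, "smoothness": 0.60}
z = z_scores(trainee, experts)  # large |z| flags departure from expert norms
```

A single global score could then combine the sign-aligned z values across the metric set, in the spirit of the standardized scoring system the abstract describes.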
Using a Knowledge Representations Approach to Cognitive Task Analysis.
ERIC Educational Resources Information Center
Black, John B.; And Others
Task analyses have traditionally been framed in terms of overt behaviors performed in accomplishing tasks and goals. Pioneering work at the Learning Research and Development Center looked at what contribution a cognitive analysis might make to current task analysis procedures, since traditional task analysis methods neither elicit nor capture…
Instructional Design: System Strategies.
ERIC Educational Resources Information Center
Ledford, Bruce R.; Sleeman, Phillip J.
This book is intended as a source for those who desire to apply a coherent system of instructional design, thereby insuring accountability. Chapter 1 covers the instructional design process, including: instructional technology; the role of evaluation; goal setting; the psychology of teaching and learning; task analysis; operational objectives;…
Applications of active microwave imagery
NASA Technical Reports Server (NTRS)
Weber, F. P.; Childs, L. F.; Gilbert, R.; Harlan, J. C.; Hoffer, R. M.; Miller, J. M.; Parsons, J.; Polcyn, F.; Schardt, B. B.; Smith, J. L.
1978-01-01
The following topics were discussed in reference to active microwave applications: (1) Use of imaging radar to improve the data collection/analysis process; (2) Data collection tasks for radar that other systems will not perform; (3) Data reduction concepts; and (4) System and vehicle parameters: aircraft and spacecraft.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
2015-07-14
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
NASA Technical Reports Server (NTRS)
Maluf, David A. (Inventor); Bell, David G. (Inventor); Gurram, Mohana M. (Inventor); Gawdiak, Yuri O. (Inventor)
2009-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as a monthly report, a task plan report, a budget report and a risk management report, are generated and made available for display or further analysis. An extensible database allows searching for information based upon context and upon content.
Cognitive Architectures and Rational Analysis: Comment
1989-03-17
These last three are assumptions about the structure of the task environment, and are empirically... rationality is what cognitive psychology is all about. And the study of bounded rationality is not the study of optimization in relation to task environments... one must consider both the task environment and the limits upon the adaptive powers of the system. Only
ERIC Educational Resources Information Center
Mills, Caroline; Chapparo, Christine
2017-01-01
The aim of this study was to determine the impact of a classroom sensory activity schedule (SAS) on cognitive strategy use during task performance. The study used a single-system AB research design with seven students with autism and intellectual disability. Repeated measures using the Perceive, Recall, Plan and Perform (PRPP) Cognitive Task…
NASA Technical Reports Server (NTRS)
Murphy, M. R.
1980-01-01
A resource management approach to aircrew performance is defined and utilized in structuring an analysis of 84 exemplary incidents from the NASA Aviation Safety Reporting System. The distribution of enabling and associated (evolutionary) and recovery factors between and within five analytic categories suggests that resource management training be concentrated on: (1) interpersonal communications, with air traffic control information of major concern; (2) task management, mainly setting priorities and appropriately allocating tasks under varying workload levels; and (3) planning, coordination, and decisionmaking concerned with preventing and recovering from potentially unsafe situations in certain aircraft maneuvers.
Comparative Cognitive Task Analysis
2007-01-01
is to perform a task analysis to determine how people operate in a specific domain on a specific task. Cognitive Task Analysis (CTA) is a set of...accomplish a task. In this chapter, we build on CTA methods by suggesting that comparative cognitive task analysis (C2TA) can help solve the aforementioned
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. V.; Yerazunis, S. W.
1973-01-01
Problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars are reported. Problem areas include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis, terrain modeling and path selection; and chemical analysis of specimens. These tasks are summarized: vehicle model design, mathematical model of vehicle dynamics, experimental vehicle dynamics, obstacle negotiation, electrochemical controls, remote control, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, and chromatograph model evaluation and improvement.
Mixed Initiative Visual Analytics Using Task-Driven Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Cramer, Nicholas O.; Israel, David
2015-12-07
Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
Box truss analysis and technology development. Task 1: Mesh analysis and control
NASA Technical Reports Server (NTRS)
Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.
1985-01-01
An analytical tool was developed to model, analyze and predict RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom written programs for cord tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating together the various disciplines of design, finite element analysis, surface best fit analysis and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: The Mesh Tie System Model Generator, The Loadcase Generator, The Model Optimizer, The Model Solver, The Surface Topography Solver and The RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
NASA Technical Reports Server (NTRS)
Degani, Asaf; Mitchell, Christine M.; Chappell, Alan R.; Shafto, Mike (Technical Monitor)
1995-01-01
Task-analytic models structure essential information about operator interaction with complex systems, in this case pilot interaction with the autoflight system. Such models serve two purposes: (1) they allow researchers and practitioners to understand pilots' actions; and (2) they provide a compact, computational representation needed to design 'intelligent' aids, e.g., displays, assistants, and training systems. This paper demonstrates the use of the operator function model to trace the process of mode engagements while a pilot is controlling an aircraft via the autoflight system. The operator function model is a normative and nondeterministic model of how a well-trained, well-motivated operator manages multiple concurrent activities for effective real-time control. For each function, the model links the pilot's actions with the required information. Using the operator function model, this paper describes several mode engagement scenarios. These scenarios were observed and documented during a field study that focused on mode engagements and mode transitions during normal line operations. Data including time, ATC clearances, altitude, system states, active modes and sub-modes, and mode engagements were recorded during sixty-six flights. Using these data, seven prototypical mode engagement scenarios were extracted. One scenario details the decision of the crew to disengage a fully automatic mode in favor of a semi-automatic mode, and the consequences of this action. Another describes a mode error involving updating aircraft speed following the engagement of a speed submode. Other scenarios detail mode confusion at various phases of the flight. This analysis uses the operator function model to identify three aspects of mode engagement: (1) the progress of pilot-aircraft-autoflight system interaction; (2) control/display information required to perform mode management activities; and (3) the potential cause(s) of mode confusion.
The goal of this paper is twofold: (1) to demonstrate the use of the operator function model methodology to describe pilot-system interaction while engaging modes and monitoring the system, and (2) to initiate a discussion of how task-analytic models might inform design processes. While the operator function model is only one type of task-analytic representation, the hypothesis of this paper is that some type of task-analytic structure is a prerequisite for the design of effective human-automation interaction.
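As an illustration only (the mode names and transition table below are invented, not from the field study), a trace of mode engagements can be checked against a normative model in the spirit of the operator function model:

```python
# Hypothetical simplified autoflight mode set; a real operator
# function model also links each function to the control/display
# information the pilot needs.
VALID_TRANSITIONS = {
    "ALT_HOLD": {"VNAV", "FLCH"},
    "VNAV": {"ALT_HOLD", "FLCH"},
    "FLCH": {"ALT_HOLD", "VNAV"},
}

def check_trace(trace):
    """Flag engagements not reachable from the current mode:
    candidate points for mode-confusion analysis."""
    anomalies = []
    for prev, new in zip(trace, trace[1:]):
        if new not in VALID_TRANSITIONS.get(prev, set()):
            anomalies.append((prev, new))
    return anomalies

print(check_trace(["ALT_HOLD", "VNAV", "ALT_HOLD"]))  # []
```

Engagements flagged by such a check would mark points in the recorded flight data worth examining for mode confusion.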
NASA Technical Reports Server (NTRS)
Heinmiller, J. P.
1971-01-01
This document is the programmer's guide for the GNAT computer program developed under MSC/TRW Task 705-2 (Apollo cryogenic storage system analysis), subtask 2. Detailed logic flow charts and compiled program listings are provided for all program elements.
A Futures Approach to Policy Analysis.
ERIC Educational Resources Information Center
Morrison, James L.
An approach to policy analysis for college officials is described that is based on evaluating and using information about the external environment to consider policy options for the future. The futures approach involves the following tasks: establishing an environmental scanning system to identify critical trends and emerging issues, identifying…
Reentry Hazard Analysis Handbook
DOT National Transportation Integrated Search
2005-01-28
The Aerospace Corporation was tasked by the Volpe National Transportation Systems Center to provide technical support to the Federal Aviation Administration, Office of Commercial Space Transportation (FAA/AST), in developing acceptable methods of evalua...
Predicting operator workload during system design
NASA Technical Reports Server (NTRS)
Aldrich, Theodore B.; Szabo, Sandra M.
1988-01-01
A workload prediction methodology was developed in response to the need to measure workloads associated with operation of advanced aircraft. The application of the methodology will involve: (1) conducting mission/task analyses of critical mission segments and assigning estimates of workload for the sensory, cognitive, and psychomotor workload components of each task identified; (2) developing computer-based workload prediction models using the task analysis data; and (3) exercising the computer models to produce predictions of crew workload under varying automation and/or crew configurations. Critical issues include reliability and validity of workload predictors and selection of appropriate criterion measures.
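The first step of the methodology, assigning sensory, cognitive, and psychomotor workload estimates to each task and summing over concurrent tasks, can be sketched as follows (the rating scale and threshold are assumptions for illustration):

```python
def channel_workload(active_tasks):
    """Sum per-channel workload ratings across tasks active in the
    same timeline window (an ordinal rating scale is assumed)."""
    totals = {"sensory": 0, "cognitive": 0, "psychomotor": 0}
    for task in active_tasks:
        for channel, rating in task.items():
            totals[channel] += rating
    return totals

# Hypothetical concurrent tasks in one mission segment.
tasks = [
    {"sensory": 4, "cognitive": 5, "psychomotor": 1},  # monitor comms
    {"sensory": 4, "cognitive": 2, "psychomotor": 4},  # manual control
]
load = channel_workload(tasks)
overloaded = [c for c, v in load.items() if v > 7]  # assumed threshold
```

A prediction model would run such sums across the whole mission timeline and flag segments where any channel exceeds its threshold under a given automation or crew configuration.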
NASA Technical Reports Server (NTRS)
Smagala, Tom; Mcglew, Dave
1988-01-01
The expected pointing performance of an attached payload coupled to the Critical Evaluation Task Force Space Station via a payload pointing system (PPS) is determined. The PPS is a 3-axis gimbal which provides the capability for maintaining inertial pointing of a payload in the presence of disturbances associated with the Space Station environment. A system where the axes of rotation were offset from the payload center of mass (CM) by 10 in. in the Z axis was studied as well as a system having the payload CM offset by only 1 inch. There is a significant improvement in pointing performance when going from the 10 in. to the 1 in. gimbal offset.
NASA Astrophysics Data System (ADS)
Valasek, John; Henrickson, James V.; Bowden, Ezekiel; Shi, Yeyin; Morgan, Cristine L. S.; Neely, Haly L.
2016-05-01
As small unmanned aircraft systems become increasingly affordable, reliable, and formally recognized under federal regulation, they become increasingly attractive as novel platforms for civil applications. This paper details the development and demonstration of fixed-wing unmanned aircraft systems for precision agriculture tasks. Tasks such as soil moisture content measurement and high-throughput phenotyping are considered. Rationale for sensor, vehicle, and ground equipment selections is provided, in addition to developed flight operation procedures for minimal numbers of crew. Preliminary imagery results are presented and analyzed, and these results demonstrate that fixed-wing unmanned aircraft systems modified to carry non-traditional sensors at extended endurance durations can provide high-quality data that is usable for serious scientific analysis.
Integrating computer programs for engineering analysis and design
NASA Technical Reports Server (NTRS)
Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.
1983-01-01
The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.
Correlation Filtering of Modal Dynamics using the Laplace Wavelet
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.; Lind, Rick; Brenner, Martin J.
1997-01-01
Wavelet analysis allows processing of transient response data commonly encountered in vibration health monitoring tasks such as aircraft flutter testing. The Laplace wavelet is formulated as an impulse response of a single mode system to be similar to data features commonly encountered in these health monitoring tasks. A correlation filtering approach is introduced using the Laplace wavelet to decompose a signal into impulse responses of single mode subsystems. Applications using responses from flutter testing of aeroelastic systems demonstrate modal parameters and stability estimates can be estimated by correlation filtering free decay data with a set of Laplace wavelets.
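A minimal sketch of the correlation-filtering idea, matching a free-decay signal against Laplace wavelets (damped single-mode impulse responses) over a small parameter grid; the sampling rate and grid values are illustrative:

```python
import math

def laplace_wavelet(freq_hz, zeta, n, dt):
    """Impulse response of a single-mode system: an exponentially
    damped sinusoid starting at t = 0."""
    wn = 2.0 * math.pi * freq_hz
    wd = wn * math.sqrt(1.0 - zeta ** 2)
    return [math.exp(-zeta * wn * i * dt) * math.sin(wd * i * dt)
            for i in range(n)]

def best_match(signal, dt, freqs, zetas):
    """Pick the (frequency, damping) pair whose normalized wavelet
    correlates most strongly with the signal."""
    best, best_score = None, -1.0
    for f in freqs:
        for z in zetas:
            w = laplace_wavelet(f, z, len(signal), dt)
            norm = math.sqrt(sum(x * x for x in w))
            score = abs(sum(a * b for a, b in zip(signal, w))) / norm
            if score > best_score:
                best, best_score = (f, z), score
    return best

# Synthetic free-decay data: 5 Hz mode with 2% damping.
dt, n = 0.01, 400
data = laplace_wavelet(5.0, 0.02, n, dt)
f_est, z_est = best_match(data, dt, [4.0, 5.0, 6.0], [0.01, 0.02, 0.05])
```

By the Cauchy-Schwarz inequality the normalized correlation peaks at the matching parameters, which is what lets the filter decompose a response into single-mode subsystems.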
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.
1994-01-01
The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. The primary validation case was the film-cooled C3X vane. The cooling hole modeling included both a porous region and a grid in each discrete hole. Predictions for these models, as well as for the smooth wall, compared well with the experimental data.
Space station systems technology study (add-on task). Volume 2: Trade study and technology selection
NASA Technical Reports Server (NTRS)
1985-01-01
The current Space Station Systems Technology Study add on task was an outgrowth of the Advanced Platform Systems Technology Study (APSTS) that was completed in April 1983 and the subsequent Space Station System Technology Study completed in April 1984. The first APSTS proceeded from the identification of 106 technology topics to the selection of five for detailed trade studies. During the advanced platform study, the technical issues and options were evaluated through detailed trade processes, individual consideration was given to costs and benefits for the technologies identified for advancement, and advancement plans were developed. An approach similar to that was used in the subsequent study, with emphasis on system definition in four specific technology areas to facilitate a more in depth analysis of technology issues.
Adaptive automation of human-machine system information-processing functions.
Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P
2005-01-01
The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
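The adaptive allocation logic, handing a processing stage to automation when a secondary-task workload measure crosses a threshold, can be sketched as follows (the 0-1 workload scale and the threshold value are assumptions):

```python
STAGES = ["information acquisition", "information analysis",
          "decision making", "action implementation"]

def allocate(stage, workload, threshold=0.7):
    """Hand a processing stage to automation when the measured
    operator workload (assumed scaled 0-1) exceeds the threshold."""
    if stage not in STAGES:
        raise ValueError(f"unknown stage: {stage}")
    return "automated" if workload > threshold else "manual"

mode = allocate("action implementation", workload=0.85)
```

The study's finding suggests such a rule helps most when the automated stage is sensory or psychomotor (action implementation) rather than cognitive (information analysis or decision making).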
ERIC Educational Resources Information Center
Siebold, Guy L.
Research was conducted to assess the applicability of the Instructional Systems Development (ISD) job analysis procedures to nine technical aviation maintenance military occupational specialties (MOS). Job analysis questionnaires were developed for each of the nine aviation maintenance MOS's. Research teams administered the questionnaires to…
Mapping university students' epistemic framing of computational physics using network analysis
NASA Astrophysics Data System (ADS)
Bodin, Madelen
2012-06-01
Solving physics problems in university physics education using a computational approach requires knowledge and skills in several domains, for example, physics, mathematics, programming, and modeling. These competences are in turn related to students’ beliefs about the domains as well as about learning. These knowledge and belief components are referred to here as epistemic elements, which together represent the students’ epistemic framing of the situation. The purpose of this study was to investigate university physics students’ epistemic framing when solving and visualizing a physics problem using a particle-spring model system. Students’ epistemic framings are analyzed before and after the task using a network analysis approach on interview transcripts, producing visual representations as epistemic networks. The results show that students change their epistemic framing from a modeling task, with expectancies about learning programming, to a physics task, in which they are challenged to use physics principles and conservation laws in order to troubleshoot and understand their simulations. This implies that the task, even though it is not introducing any new physics, helps the students to develop a more coherent view of the importance of using physics principles in problem solving. The network analysis method used in this study is shown to give intelligible representations of the students’ epistemic framing and is proposed as a useful method of analysis of textual data.
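The network construction can be sketched as a co-occurrence count over coded interview segments (the epistemic codes below are hypothetical examples, not the study's coding scheme):

```python
from collections import Counter
from itertools import combinations

def epistemic_network(coded_segments):
    """Weighted co-occurrence network: nodes are epistemic codes,
    edge weight counts how often two codes appear in the same
    interview segment."""
    edges = Counter()
    for codes in coded_segments:
        for a, b in combinations(sorted(set(codes)), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical coding of three interview segments.
segments = [
    {"physics", "programming"},
    {"physics", "modeling", "programming"},
    {"modeling", "learning"},
]
net = epistemic_network(segments)
```

Comparing the pre-task and post-task networks then makes shifts in framing visible as changes in edge weights.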
DOT National Transportation Integrated Search
1997-11-01
The Clean Air Act Amendments of 1990 (CAAA) require that states submit State Implementation Plans (SIPs) detailing strategies to improve air quality in nonattainment areas. This analysis first reviews the administrative procedures that states and the...
Jamadar, Sharna D; Egan, Gary F; Calhoun, Vince D; Johnson, Beth; Fielding, Joanne
2016-07-01
Intrinsic brain activity provides the functional framework for the brain's full repertoire of behavioral responses; that is, a common mechanism underlies intrinsic and extrinsic neural activity, with extrinsic activity building upon the underlying baseline intrinsic activity. The generation of a motor movement in response to sensory stimulation is one of the most fundamental functions of the central nervous system. Since saccadic eye movements are among our most stereotyped motor responses, we hypothesized that individual variability in the ability to inhibit a prepotent saccade and make a voluntary antisaccade would be related to individual variability in intrinsic connectivity. Twenty-three individuals completed the antisaccade task and resting-state functional magnetic resonance imaging (fMRI). A multivariate analysis of covariance identified relationships between fMRI oscillations (0.01-0.2 Hz) of resting-state networks determined using high-dimensional independent component analysis and antisaccade performance (latency, error rate). Significant multivariate relationships between antisaccade latency and directional error rate were obtained in independent components across the entire brain. Some of the relationships were obtained in components that overlapped substantially with the task; however, many were obtained in components that showed little overlap with the task. The current results demonstrate that even in the absence of a task, spectral power in regions showing little overlap with task activity predicts an individual's performance on a saccade task.
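The core statistical idea, relating per-subject resting-state spectral power to antisaccade performance across individuals, can be illustrated with a simple bivariate correlation. This is a stand-in sketch for the multivariate analysis of covariance described above, with invented per-subject values.

```python
import math

# Hypothetical per-subject data: low-frequency spectral power of one
# resting-state component and antisaccade latency (ms). Values are invented.
power   = [0.8, 1.1, 0.6, 1.4, 0.9, 1.2]
latency = [250, 230, 270, 210, 245, 225]

def pearson_r(x, y):
    """Pearson correlation: a one-component stand-in for the multivariate
    power-vs-performance relationship the study tests."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(power, latency)
print(round(r, 2))  # strongly negative here: higher power, shorter latency
```

In the actual study this relationship is tested jointly across many independent components and two performance measures (latency and error rate), not one pair at a time.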
2012-01-01
Background: In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed considerably to this recent interest. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts in a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization, and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase, which is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID).
Results: We achieved competitive results in the GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and article bodies is stable. Coreference resolution results in a minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks.
Conclusions: The results demonstrate the viability of a robust, linguistically oriented methodology, which clearly distinguishes general semantic interpretation from shared-task-specific aspects, for biological event extraction. Our error analysis pinpoints some shortcomings, which we plan to address in future work within our incremental system development methodology. PMID:22759461
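The first-phase idea, composing events bottom-up from dependency relations using a trigger dictionary and argument rules, can be sketched in a few lines. The trigger entries, dependency labels, and sentence below are invented toy data, not the system's actual rules.

```python
# Minimal sketch of the two-phase approach: phase 1 detects event triggers
# via a trigger dictionary over dependency relations and attaches arguments
# bottom-up. Dictionary entries and the toy parse are invented.
TRIGGERS = {"phosphorylation": "Phosphorylation", "expression": "Gene_expression"}

# Toy dependency parse: (head, relation, dependent)
deps = [
    ("phosphorylation", "prep_of", "TRAF2"),
    ("inhibits", "nsubj", "phosphorylation"),
]

def extract_events(deps):
    events = []
    for head, rel, dep in deps:
        if head in TRIGGERS and rel.startswith("prep_of"):
            # bottom-up composition: the dependent fills the Theme slot
            events.append({"type": TRIGGERS[head], "trigger": head, "Theme": dep})
    return events

print(extract_events(deps))
# → [{'type': 'Phosphorylation', 'trigger': 'phosphorylation', 'Theme': 'TRAF2'}]
```

Phase 2 would then filter and reshape these underspecified events to match a specific shared task's event-type inventory.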
Development of Management Metrics for Research and Technology
NASA Technical Reports Server (NTRS)
Sheskin, Theodore J.
2003-01-01
Professor Ted Sheskin from CSU will be tasked with researching and investigating metrics that can be used to measure technical progress on advanced development and research tasks. These metrics will be implemented in a software environment that hosts engineering design, analysis, and management tools used to support power system and component research at GRC. Professor Sheskin is an industrial engineer who has been involved in issues related to the management of engineering tasks, and he will use his knowledge from this area to extrapolate into research and technology management. Over the course of the summer, Professor Sheskin will develop a bibliography of management papers covering current management methods that may be applicable to research management. At the completion of the summer work, we expect him to recommend a metric system to be reviewed prior to implementation in the software environment. This task has been discussed with Professor Sheskin, and some review material has already been given to him.
1991-01-01
Field 3. Training and Training Devices: a. Factory training; b. Instructor and key personnel training; c. New equipment training plan; d. New equipment... [Remainder is OCR residue of a tabular worksheet, "Supportability Alternative Trade-Off Analysis"; the table's contents are unrecoverable.]
Robot and Human Surface Operations on Solar System Bodies
NASA Technical Reports Server (NTRS)
Weisbin, C. R.; Easter, R.; Rodriguez, G.
2001-01-01
This paper presents a comparison of robot and human surface operations on solar system bodies. The topics include: 1) Long Range Vision of Surface Scenarios; 2) Human and Robots Complement Each Other; 3) Respective Human and Robot Strengths; 4) Need More In-Depth Quantitative Analysis; 5) Projected Study Objectives; 6) Analysis Process Summary; 7) Mission Scenarios Decompose into Primitive Tasks; 8) Features of the Projected Analysis Approach; and 9) The "Getting There Effect" is a Major Consideration. This paper is in viewgraph form.
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Delaney, Robert A.; Bettner, James L.
1991-01-01
The primary objective of this study was the development of a time-dependent three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The computer codes resulting from this study are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). This report is intended to serve as a computer program user's manual for the ADPAC developed under Task 2 of NASA Contract NAS3-25270, Unsteady Ducted Propfan Analysis. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was utilized for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted propfan flows. The solution scheme demonstrated efficiency and accuracy comparable with other schemes of this class.
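The scheme class the abstract names, a multi-stage Runge-Kutta time march with added numerical dissipation, can be illustrated on a much simpler problem. The sketch below applies a four-stage Runge-Kutta march with fourth-difference artificial dissipation to 1D linear advection on a periodic grid; it is not the ADPAC code, and the grid size, time step, and dissipation coefficient are invented.

```python
import math

# Illustrative parameters (invented): grid points, wave speed, spacing,
# time step, artificial-dissipation coefficient.
N, a, dx, dt, eps = 50, 1.0, 1.0 / 50, 0.01, 0.01

def residual(u):
    """Semi-discrete residual for u_t + a*u_x = 0 on a periodic grid,
    with a fourth-difference numerical dissipation term."""
    n = len(u)
    r = []
    for i in range(n):
        # central difference for the convective term
        conv = a * (u[(i + 1) % n] - u[(i - 1) % n]) / (2 * dx)
        # fourth-difference artificial dissipation
        d4 = (u[(i + 2) % n] - 4 * u[(i + 1) % n] + 6 * u[i]
              - 4 * u[(i - 1) % n] + u[(i - 2) % n])
        r.append(-conv - eps * d4 / dx)
    return r

def rk4_step(u):
    """Four-stage Runge-Kutta time march: u <- u0 + alpha_k * dt * R(u)."""
    u0, uk = u[:], u[:]
    for alpha in (0.25, 1 / 3, 0.5, 1.0):
        r = residual(uk)
        uk = [u0[i] + alpha * dt * r[i] for i in range(len(u))]
    return uk

u = [math.sin(2 * math.pi * i / N) for i in range(N)]
u = rk4_step(u)
print(len(u))  # grid size preserved after one step
```

The real solver applies this idea to the 3D Euler/Navier-Stokes equations on multi-block grids, adding implicit residual smoothing for time-accurate unsteady runs.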
Smart Sensor-Based Motion Detection System for Hand Movement Training in Open Surgery.
Sun, Xinyao; Byrns, Simon; Cheng, Irene; Zheng, Bin; Basu, Anup
2017-02-01
We introduce a smart sensor-based motion detection technique for objective measurement and assessment of surgical dexterity among users at different experience levels. The goal is to allow trainees to evaluate their performance based on a reference model shared through communication technology, e.g., the Internet, without the physical presence of an evaluating surgeon. While in the current implementation we used a Leap Motion Controller to obtain motion data for analysis, our technique can be applied to motion data captured by other smart sensors, e.g., OptiTrack. To differentiate motions captured from different participants, measurement and assessment in our approach are achieved using two strategies: (1) low-level descriptive statistical analysis, and (2) Hidden Markov Model (HMM) classification. Based on our surgical knot-tying task experiment, we can conclude that finger motions generated by users with different levels of surgical dexterity, e.g., expert and novice performers, display differences in path length, number of movements, and task completion time. In order to validate the discriminatory ability of the HMM for classifying different movement patterns, a non-surgical task was included in our analysis. Experimental results demonstrate that our approach had 100% accuracy in discriminating between expert and novice performances. Our proposed motion analysis technique applied to open surgical procedures is a promising step towards the development of objective computer-assisted assessment and training systems.
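Strategy (1), the low-level descriptive statistics, reduces to computing path length and completion time from a stream of fingertip positions. The sketch below assumes a simple (time, x, y, z) sample format; the trajectory values are invented and the sample layout is an assumption, not the Leap Motion API.

```python
import math

# Hypothetical fingertip trajectory samples (t_seconds, x, y, z), as a
# sensor such as the Leap Motion Controller might report; values invented.
samples = [
    (0.00, 0.0, 0.0, 0.0),
    (0.05, 1.0, 0.0, 0.0),
    (0.10, 1.0, 2.0, 0.0),
    (0.15, 1.0, 2.0, 2.0),
]

def descriptive_stats(samples):
    """Strategy (1) from the abstract: total path length and completion
    time of a hand-motion trajectory."""
    path = 0.0
    for (t0, *p0), (t1, *p1) in zip(samples, samples[1:]):
        path += math.dist(p0, p1)  # Euclidean step length
    completion = samples[-1][0] - samples[0][0]
    return path, completion

path, completion = descriptive_stats(samples)
print(path, completion)  # → 5.0 0.15
```

Strategy (2) would instead feed per-frame motion features into an HMM per class (expert/novice) and classify a new trajectory by which model assigns it higher likelihood.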
Applying Cognitive Work Analysis to Time Critical Targeting Functionality
2004-10-01
Cognitive Task Analysis (CTA), Human Factors, GUI (Graphical User Interface), Heuristic Evaluation... Cognitive Task Analysis MITRE Briefing, January 2000; Dynamic Battle Management Functional Architecture, Section 3, Human Factors... clear distinction between Cognitive Work Analysis (CWA) and Cognitive Task Analysis (CTA); therefore this document will refer to these
NASA Technical Reports Server (NTRS)
Perkinson, J. A.
1974-01-01
The application of associative memory processor equipment to conventional host processor systems is discussed. Efforts were made to demonstrate how such an application relieves the task burden of conventional systems and enhances system speed and efficiency. Data cover a comparative theoretical performance analysis, demonstration of expanded growth capabilities, and demonstrations of actual hardware in a simulated environment.