Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds
NASA Astrophysics Data System (ADS)
Li, Rui; Chen, Lei; Li, Wen-Syan
Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for internet-scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Current Hadoop (and many other frameworks) requires users to configure the cloud infrastructure via programs and APIs, and such configuration is fixed at runtime. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to the dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and for a workload consisting of multiple jobs running concurrently, aiming at maximum throughput with a minimum set of processors.
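The throughput-driven, minimum-processor idea described above can be illustrated with a small sketch. This is not CloudWeaver's actual algorithm; the stage names, throughput figures, and the proportional scale-down rule are assumptions made only for illustration.

```python
# Minimal sketch of throughput-driven worker allocation (illustrative only, not
# CloudWeaver's algorithm): give each operator stage just enough workers to keep
# up with its measured demand, then scale down if the processor budget is exceeded.

def rebalance(stages, total_workers):
    """stages: dict name -> dict(throughput=records/s per worker, demand=records/s)."""
    allocation = {}
    for name, s in stages.items():
        per_worker = max(s["throughput"], 1e-9)
        allocation[name] = max(1, round(s["demand"] / per_worker))
    needed = sum(allocation.values())
    if needed > total_workers:
        # Shrink proportionally so the plan fits the available processor budget.
        scale = total_workers / needed
        allocation = {n: max(1, int(w * scale)) for n, w in allocation.items()}
    return allocation

if __name__ == "__main__":
    stages = {
        "scan": {"throughput": 500.0, "demand": 4000.0},
        "join": {"throughput": 120.0, "demand": 4000.0},
        "aggr": {"throughput": 900.0, "demand": 4000.0},
    }
    print(rebalance(stages, total_workers=40))
```

In an adaptive setting, the measured per-worker throughput would be refreshed at each execution phase and the allocation recomputed.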
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10²) sites, O(10⁵) cores, O(10⁸) jobs per year, O(10³) users, and an ATLAS data volume of O(10¹⁷) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled ‘Next Generation Workload Management and Analysis System for Big Data’ (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to the integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We
NASA Astrophysics Data System (ADS)
Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.
2016-10-01
The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the Grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and it has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the
Chapman, Susan A; Mulvihill, Linda; Herrera, Carolina
2012-01-01
The Workload and Time Management Survey of Central Cancer Registries was conducted in 2011 to assess the amount of time spent on work activities usually performed by cancer registrars. A survey including 39 multi-item questions, together with a work activities data collection log, was sent by email to the central cancer registry (CCR) manager in each of the 50 states and the District of Columbia. Twenty-four central cancer registries (47%) responded to the survey. Results indicate that registries faced reductions in budgeted staffing from 2008-2009. The number of source records and total cases were important indicators of workload. Four core activities, including abstracting at the registry, visual editing, case consolidation, and resolving edit reports, accounted for about half of registry workload. We estimate an average of 12.4 full-time equivalents (FTEs) are required to perform all cancer registration activities tracked by the survey; however, estimates vary widely by registry size. These findings may be useful for registries as a benchmark for their own registry workload and time-management data and to develop staffing guidelines.
NASA Astrophysics Data System (ADS)
Altomare, Albino; Cesario, Eugenio; Mastroianni, Carlo
2016-10-01
The opportunity of using Cloud resources on a pay-as-you-go basis and the availability of powerful data centers and high bandwidth connections are speeding up the success and popularity of Cloud systems, making on-demand computing a common practice for enterprises and scientific communities. The reasons for this success include natural business distribution, the need for high availability and disaster tolerance, the sheer size of their computational infrastructure, and/or the desire to provide uniform access times to the infrastructure from widely distributed client sites. Nevertheless, the expansion of large data centers is resulting in a huge rise in the electrical power consumed by hardware facilities and cooling systems. The geographical distribution of data centers is becoming an opportunity: the variability of electricity prices, environmental conditions and client requests, both from site to site and over time, makes it possible to intelligently and dynamically (re)distribute the computational workload and achieve business goals as diverse as the reduction of costs, energy consumption and carbon emissions, the satisfaction of performance constraints, and adherence to the Service Level Agreements established with users. This paper proposes an approach that helps to achieve the business goals established by the data center administrators. The workload distribution is driven by a fitness function, evaluated for each data center, which weighs key parameters related to business objectives, among them the price of electricity, the carbon emission rate, and the balance of load among the data centers. For example, energy costs can be reduced by using a "follow the moon" approach, e.g. by migrating the workload to data centers where the price of electricity is lower at that time. Our approach uses data about historical usage of the data centers and data about environmental conditions to predict, with the help of regression models, the values of the
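The fitness-function idea can be sketched in a few lines. The weights, metric names and example figures below are assumptions for illustration only, not the authors' actual model; each metric is assumed to be pre-normalized to [0, 1].

```python
# Illustrative per-data-center fitness function: prefer cheap electricity, a low
# carbon emission rate and light load. Weights and field names are assumptions.

def fitness(dc, weights):
    return (weights["price"] * (1.0 - dc["electricity_price"]) +
            weights["carbon"] * (1.0 - dc["carbon_rate"]) +
            weights["load"] * (1.0 - dc["load"]))

def pick_data_center(centers, weights):
    return max(centers, key=lambda dc: fitness(dc, weights))

if __name__ == "__main__":
    centers = [
        {"name": "eu-1", "electricity_price": 0.8, "carbon_rate": 0.3, "load": 0.6},
        {"name": "us-1", "electricity_price": 0.4, "carbon_rate": 0.5, "load": 0.7},
        # Cheap night-time electricity: the "follow the moon" case.
        {"name": "ap-1", "electricity_price": 0.2, "carbon_rate": 0.9, "load": 0.3},
    ]
    print(pick_data_center(centers, {"price": 0.5, "carbon": 0.3, "load": 0.2})["name"])
```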
Integration of Panda Workload Management System with supercomputers
NASA Astrophysics Data System (ADS)
De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.
2016-09-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads
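The light-weight MPI wrapper approach can be illustrated with a short sketch using mpi4py. This is not the actual PanDA pilot code; the task list and the placeholder payload command are assumptions, and a real wrapper would launch the experiment's single-threaded executable with task-specific arguments.

```python
# Sketch of a light-weight MPI wrapper: one batch allocation, many ranks, each
# rank running its share of independent single-threaded payloads (illustrative).
import subprocess
import sys
from mpi4py import MPI

def main():
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_tasks = int(sys.argv[1])
    tasks = [f"job_{i:04d}" for i in range(n_tasks)]
    my_tasks = tasks[rank::size]          # round-robin split across ranks

    for task in my_tasks:
        # Placeholder payload; a real wrapper would invoke the experiment's
        # single-threaded executable here.
        subprocess.run(["echo", f"rank {rank} running {task}"], check=True)

    comm.Barrier()                        # hold the allocation until all ranks finish

if __name__ == "__main__":
    main()
```

Launched, for example, as `aprun -n 256 python wrapper.py 1024` on a Cray system (or `mpirun -n 256 ...` on a generic cluster), the single MPI job occupies one batch slot while fanning many serial payloads across the allocated cores.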
Integration of PanDA Workload Management System with Supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, K; Jha, S; Klimentov, A
2016-01-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This
James, Pam; Bebee, Patty; Beekman, Linda; Browning, David; Innes, Mathew; Kain, Jeannie; Royce-Westcott, Theresa; Waldinger, Marcy
2011-11-01
Quantifying data management and regulatory workload for clinical research is a difficult task that would benefit from a robust tool to assess and allocate effort. As in most clinical research environments, The University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) struggled to effectively allocate data management and regulatory time with frequently inaccurate estimates of how much time was required to complete the specific tasks performed by each role. In a dynamic clinical research environment in which volume and intensity of work ebbs and flows, determining requisite effort to meet study objectives was challenging. In addition, a data-driven understanding of how much staff time was required to complete a clinical trial was desired to ensure accurate trial budget development and effective cost recovery. Accordingly, the UMCCC CTO developed and implemented a Web-based effort-tracking application with the goal of determining the true costs of data management and regulatory staff effort in clinical trials. This tool was developed, implemented, and refined over a 3-year period. This article describes the process improvement and subsequent leveling of workload within data management and regulatory that enhanced the efficiency of UMCCC's clinical trials operation.
Interactive Query Processing in Big Data Systems: A Cross Industry Study of MapReduce Workloads
2012-04-02
invite cluster operators and the broader data management community to share additional knowledge about their MapReduce workloads. 9. ACKNOWLEDGMENTS...against real-life production MapReduce workloads. Knowledge of such workloads is currently limited to a handful of technology companies [19, 8, 48, 41...database management insights would benefit from checking workload assumptions against empirical measurements. The broad spectrum of workloads analyzed allows
Crew workload-management strategies - A critical factor in system performance
NASA Technical Reports Server (NTRS)
Hart, Sandra G.
1989-01-01
This paper reviews the philosophy and goals of the NASA/USAF Strategic Behavior/Workload Management Program. The philosophical foundation of the program is based on the assumption that an improved understanding of pilot strategies will clarify the complex and inconsistent relationships observed among objective task demands and measures of system performance and pilot workload. The goals are to: (1) develop operationally relevant figures of merit for performance, (2) quantify the effects of strategic behaviors on system performance and pilot workload, (3) identify evaluation criteria for workload measures, and (4) develop methods of improving pilots' abilities to manage workload extremes.
Role of Academic Managers in Workload and Performance Management of Academic Staff: A Case Study
ERIC Educational Resources Information Center
Graham, Andrew T.
2016-01-01
This small-scale case study focused on academic managers to explore the ways in which they control the workload of academic staff and the extent to which they use the workload model in performance management of academic staff. The links that exist between the workload and performance management were explored to confirm or refute the conceptual…
The Potential of Knowing More: A Review of Data-Driven Urban Water Management.
Eggimann, Sven; Mutzner, Lena; Wani, Omar; Schneider, Mariane Yvonne; Spuhler, Dorothee; Moy de Vitry, Matthew; Beutler, Philipp; Maurer, Max
2017-03-07
The promise of collecting and utilizing large amounts of data has never been greater in the history of urban water management (UWM). This paper reviews several data-driven approaches which play a key role in bringing forward a sea change. It critically investigates whether data-driven UWM offers a promising foundation for addressing current challenges and supporting fundamental changes in UWM. We discuss the examples of better rain-data management, urban pluvial flood-risk management and forecasting, drinking water and sewer network operation and management, integrated design and management, increasing water productivity, wastewater-based epidemiology and on-site water and wastewater treatment. The accumulated evidence from literature points toward a future UWM that offers significant potential benefits thanks to increased collection and utilization of data. The findings show that data-driven UWM allows us to develop and apply novel methods, to optimize the efficiency of the current network-based approach, and to extend functionality of today's systems. However, generic challenges related to data-driven approaches (e.g., data processing, data availability, data quality, data costs) and the specific challenges of data-driven UWM need to be addressed, namely data access and ownership, current engineering practices and the difficulty of assessing the cost benefits of data-driven UWM.
Paving the COWpath: data-driven design of pediatric order sets
Zhang, Yiye; Padman, Rema; Levin, James E
2014-01-01
Objective: Evidence indicates that users incur significant physical and cognitive costs in the use of order sets, a core feature of computerized provider order entry systems. This paper develops data-driven approaches for automating the construction of order sets that match closely with user preferences and workflow while minimizing physical and cognitive workload. Materials and Methods: We developed and tested optimization-based models embedded with clustering techniques using physical and cognitive click cost criteria. By judiciously learning from users’ actual actions, our methods identify items for constituting order sets that are relevant according to historical ordering data and grouped on the basis of order similarity and ordering time. We evaluated performance of the methods using 47,099 orders from the year 2011 for asthma, appendectomy and pneumonia management in a pediatric inpatient setting. Results: In comparison with existing order sets, those developed using the new approach significantly reduce the physical and cognitive workload associated with usage by 14–52%. This approach is also capable of accommodating variations in clinical conditions that affect order set usage and development. Discussion: There is a critical need to investigate the cognitive complexity imposed on users by complex clinical information systems, and to design their features according to ‘human factors’ best practices. Optimizing order set generation using cognitive cost criteria introduces a new approach that can potentially improve ordering efficiency, reduce unintended variations in order placement, and enhance patient safety. Conclusions: We demonstrate that data-driven methods offer a promising approach for designing order sets that are generalizable, data-driven, condition-based, and up to date with current best practices. PMID:24674844
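As a rough illustration of the data-driven idea, the toy sketch below promotes items into a candidate order set when they are ordered, alone or together, in a large enough fraction of historical sessions. The paper's optimization and clustering models are far richer; the items, sessions and support threshold here are invented for illustration.

```python
# Toy co-occurrence mining for a candidate order set (illustrative only).
from collections import Counter
from itertools import combinations

def candidate_order_set(sessions, min_support=0.75):
    """sessions: list of sets of items ordered in past encounters."""
    n = len(sessions)
    item_counts = Counter(i for s in sessions for i in s)
    pair_counts = Counter(frozenset(p) for s in sessions for p in combinations(sorted(s), 2))
    core = {i for i, c in item_counts.items() if c / n >= min_support}
    for pair, c in pair_counts.items():
        if c / n >= min_support:
            core |= set(pair)             # pull in items strongly tied to the core
    return sorted(core)

if __name__ == "__main__":
    sessions = [
        {"albuterol", "prednisone", "pulse_ox"},
        {"albuterol", "prednisone", "cbc"},
        {"albuterol", "pulse_ox"},
        {"albuterol", "prednisone", "pulse_ox", "chest_xray"},
    ]
    print(candidate_order_set(sessions))
```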
Single-Pilot Workload Management
NASA Technical Reports Server (NTRS)
Rogers, Jason; Williams, Kevin; Hackworth, Carla; Burian, Barbara; Pruchnicki, Shawn; Christopher, Bonny; Drechsler, Gena; Silverman, Evan; Runnels, Barry; Mead, Andy
2013-01-01
Integrated glass cockpit systems place a heavy cognitive load on pilots (Burian & Dismukes, 2007). Researchers from the NASA Ames Flight Cognition Lab and the FAA Flight Deck Human Factors Lab examined task and workload management by single pilots. This poster describes pilot performance regarding programming a reroute while at cruise and meeting a waypoint crossing restriction on the initial descent.
Pilot workload, performance and aircraft control automation
NASA Technical Reports Server (NTRS)
Hart, S. G.; Sheridan, T. B.
1984-01-01
Conceptual and practical issues associated with the design, operation, and performance of advanced systems and the impact of such systems on the human operators are reviewed. The development of highly automated systems is driven by the availability of new technology and the requirement that operators safely and economically perform more and more activities in increasingly difficult and hostile environments. It is noted that the operator's workload may become a major area of concern in future design considerations. Little research was done to determine how automation and workload relate to each other, although it is assumed that the abstract, supervisory, or management roles that are performed by operators of highly automated systems will impose increased mental workload. The relationship between performance and workload is discussed in relation to highly complex and automated environments.
Strategic workload management and decision biases in aviation
NASA Technical Reports Server (NTRS)
Raby, Mireille; Wickens, Christopher D.
1994-01-01
Thirty pilots flew three simulated landing approaches under conditions of low, medium, and high workload. Workload conditions were created by varying time pressure and external communications requirements. Our interest was in how the pilots strategically managed or adapted to the increasing workload. We independently assessed the pilot's ranking of the priority of different discrete tasks during the approach and landing. Pilots were found to sacrifice some aspects of primary flight control as workload increased. For discrete tasks, increasing workload increased the amount of time in performing the high priority tasks, decreased the time in performing those of lowest priority, and did not affect duration of performance episodes or optimality of scheduling of tasks of any priority level. Individual differences analysis revealed that high-performing subjects scheduled discrete tasks earlier in the flight and shifted more often between different activities.
ERIC Educational Resources Information Center
Torres, A. Chris
2016-01-01
An unsustainable workload is considered the primary cause of teacher turnover at Charter Management Organizations (CMOs), yet most reports provide anecdotal evidence to support this claim. This study uses 2010-2011 survey data from one large CMO and finds that teachers' perceptions of workload are significantly associated with decisions to leave…
Challenging data and workload management in CMS Computing with network-aware systems
NASA Astrophysics Data System (ADS)
D, Bonacorsi; T, Wildish
2014-06-01
After a successful first run at the LHC, and during the Long Shutdown (LS1) of the accelerator, the workload and data management sectors of the CMS Computing Model are entering an operational review phase in order to concretely assess areas of possible improvement and paths to exploit new promising technology trends. In particular, since the preparation activities for the LHC start, the networks have constantly been of paramount importance for the execution of CMS workflows, exceeding the original expectations - as from the MONARC model - in terms of performance, stability and reliability. The low-latency transfer of petabytes of CMS data among dozens of WLCG Tiers worldwide using the PhEDEx dataset replication system is an example of the importance of reliable networks. Another example is the exploitation of WAN data access over data federations in CMS. A new emerging area of work is the exploitation of Intelligent Network Services, including bandwidth-on-demand concepts. In this paper, we will review the work done in CMS on this, and the next steps.
Next Generation Workload Management and Analysis System for Big Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, Kaushik
We report on the activities and accomplishments of a four-year project (a three-year grant followed by a one-year no cost extension) to develop a next generation workload management system for Big Data. The new system is based on the highly successful PanDA software developed for High Energy Physics (HEP) in 2005. PanDA is used by the ATLAS experiment at the Large Hadron Collider (LHC), and the AMS experiment at the space station. The program of work described here was carried out by two teams of developers working collaboratively at Brookhaven National Laboratory (BNL) and the University of Texas at Arlington (UTA). These teams worked closely with the original PanDA team – for the sake of clarity the work of the next generation team will be referred to as the BigPanDA project. Their work has led to the adoption of BigPanDA by the COMPASS experiment at CERN, and many other experiments and science projects worldwide.
Arndt, Brian; Tuan, Wen-Jan; White, Jennifer; Schumacher, Jessica
2014-01-01
An understanding of primary care provider (PCP) workload is an important consideration in establishing optimal PCP panel size. However, no widely acceptable measure of PCP workload exists that incorporates the effort involved with both non-face-to-face patient care activities and face-to-face encounters. Accounting for this gap is critical given the increase in non-face-to-face PCP activities that has accompanied electronic health records (EHRs) (eg, electronic messaging). Our goal was to provide a comprehensive assessment of perceived PCP workload, accounting for aspects of both face-to-face and non-face-to-face encounters. Internal medicine, family medicine, and pediatric PCPs completed a self-administered survey about the perceived workload involved with face-to-face and non-face-to-face panel management activities as well as the perceived challenge associated with caring for patients with particular biomedical, demographic, and psychosocial characteristics (n = 185). Survey results were combined with EHR data at the individual patient and PCP service levels to assess PCP panel workload, accounting for face-to-face and non-face-to-face utilization. Of the multiple face-to-face and non-face-to-face activities associated with routine primary care, PCPs considered hospital admissions, obstetric care, hospital discharges, and new patient preventive health visits to be greater workload than non-face-to-face activities such as telephone calls, electronic communication, generating letters, and medication refills. Total workload within PCP panels at the individual patient level varied by overall health status, and the total workload of non-face-to-face panel management activities associated with routine primary care was greater than the total workload associated with face-to-face encounters regardless of health status. We used PCP survey results coupled with EHR data to assess PCP workload associated with both face-to-face as well as non-face-to-face panel management
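One way to operationalize such a combined measure is to weight EHR activity counts by survey-derived effort scores. The activity names and weights below are invented for illustration; the study's instrument defines its own categories.

```python
# Hedged sketch: panel workload as effort-weighted EHR activity counts.
EFFORT_WEIGHTS = {                       # relative perceived effort per occurrence (assumed)
    "new_patient_preventive_visit": 5.0,
    "hospital_discharge": 4.0,
    "routine_office_visit": 3.0,
    "phone_call": 1.0,
    "electronic_message": 0.5,
    "medication_refill": 0.5,
}

def panel_workload(activity_counts):
    """activity_counts: dict activity -> occurrences over a period, from the EHR."""
    return sum(EFFORT_WEIGHTS.get(activity, 1.0) * n for activity, n in activity_counts.items())

if __name__ == "__main__":
    ehr_counts = {"routine_office_visit": 40, "phone_call": 120,
                  "electronic_message": 300, "medication_refill": 90}
    print(f"panel workload score: {panel_workload(ehr_counts):.1f}")
```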
Workload: Measurement and Management
NASA Technical Reports Server (NTRS)
Gore, Brian Francis; Casner, Stephen
2010-01-01
Poster: The workload research project has as its task to survey the available literature on: (1) workload measurement techniques; and (2) the effects of workload on operator performance. The first set of findings provides practitioners with a collection of simple-to-use workload measurement techniques along with characterizations of the kinds of tasks each technique has been shown to reliably address. This allows design practitioners to select and use the most appropriate techniques for the task(s) at hand. The second set of findings provides practitioners with the guidance they need to design for appropriate kinds and amounts of workload across all tasks for which the operator is responsible. This guidance helps practitioners design systems and procedures that ensure appropriate levels of engagement across all tasks, and avoid designs and procedures that result in operator boredom, complacency, loss of awareness, undue levels of stress, or skill atrophy; such outcomes can result from workload that distracts operators from the tasks they perform and monitor, or from workload levels that are too low, too high, or too consistent or predictable. Only those articles were included that were peer reviewed, long standing and generally accepted in the field, and applicable to a relevant range of conditions in a select domain of interest, in "extreme" environments analogous to those in space. In addition, all articles were reviewed and evaluated on uni-dimensional and multi-dimensional considerations. Casner & Gore also examined the notion of thresholds and the conditions that may benefit most from the various methodological approaches. Other considerations included whether the tools would be suitable for guiding requirement-related and design-related questions. An initial review of over 225 articles was conducted and entered into an EndNote database. The reference list included a range of conditions in the domain of interest (subjective/objective measures), the seminal works in workload, as
User-driven product data manager system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-03-01
With the infusion of information technologies into product development and production processes, effective management of product data is becoming essential to modern production enterprises. When an enterprise-wide Product Data Manager (PDM) is implemented, PDM designers must satisfy the requirements of individual users with different job functions and requirements, as well as the requirements of the enterprise as a whole. Concern must also be shown for the interrelationships between information, methods for retrieving archival information and integration of the PDM into the product development process. This paper describes a user-driven approach applied to PDM design for an agile manufacturing pilot project at Sandia National Laboratories that has been successful in achieving a much faster design-to-production process for a precision electromechanical surety device.
Pilot and Controller Workload and Situation Awareness with Three Traffic Management Concepts
NASA Technical Reports Server (NTRS)
Vu, Kim-Phuong L.; Strybel, Thomas Z.; Kraut, Joshua; Bacon, Paige; Minakata, Katsumi; Battiste, Vernol; Johnson, Walter
2010-01-01
This paper reports on workload and situation awareness of pilots and controllers participating in a human-in-the-loop simulation using three different distributed air-ground traffic management concepts. Eight experimental pilots started the scenario in an en-route phase of flight and were asked to avoid convective weather while performing spacing and merging tasks along with a continuous descent approach (CDA) into Louisville Standiford Airport (SDF). Two controllers managed the sectors through which the pilots flew, with one managing a sector that included the Top of Descent, and the other managing a sector that included the merge point for arrival into SDF. At 3-minute intervals in the scenario, pilots and controllers were probed on their workload or situation awareness. We employed one of three concepts of operation that distributed separation responsibility across human controllers, pilots, and automation to measure changes in operator situation awareness and workload. We found that when pilots were responsible for separation, they had higher levels of awareness, but not necessarily higher levels of workload. When controllers were responsible and actively engaged, they showed higher workload levels compared to pilots and changes in awareness that were dependent on sector characteristics.
Shift manager workload assessment - A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berntson, K.; Kozak, A.; Malcolm, J. S.
2006-07-01
In early 2003, Bruce Power restarted two of its previously laid up units in the Bruce A generating station, Units 3 and 4. However, due to challenges relating to the availability of personnel with active Shift Manager licenses, an alternate shift structure was proposed to ensure the safe operation of the station. This alternate structure resulted in a redistribution of responsibility, and a need to assess the resulting changes in workload. Atomic Energy of Canada Limited was contracted to perform a workload assessment based on the new shift structure, and to provide recommendations, if necessary, to ensure Shift Managers had sufficient resources available to perform their required duties. This paper discusses the performance of that assessment, and lessons learned as a result of the work performed during the Restart project. (authors)
van den Oetelaar, W F J M; van Stel, H F; van Rhenen, W; Stellato, R K; Grolman, W
2016-11-10
Hospitals pursue different goals at the same time: excellent service to their patients, good quality care, operational excellence, retaining employees. This requires a good balance between patient needs and nursing staff. One way to ensure a proper fit between patient needs and nursing staff is to work with a workload management method. In our view, a nursing workload management method needs to have the following characteristics: easy to interpret; limited additional registration; applicable to different types of hospital wards; supported by nurses; covers all activities of nurses and suitable for prospective planning of nursing staff. At present, no such method is available. The research follows several steps to come to a workload management method for staff nurses. First, a list of patient characteristics relevant to care time will be composed by performing a Delphi study among staff nurses. Next, a time study of nurses' activities will be carried out. The 2 can be combined to estimate care time per patient group and estimate the time nurses spend on non-patient-related activities. These 2 estimates can be combined and compared with available nursing resources: this gives an estimate of nurses' workload. The research will take place in an academic hospital in the Netherlands. 6 surgical wards will be included, capacity 15-30 beds. The study protocol was submitted to the Medical Ethical Review Board of the University Medical Center (UMC) Utrecht and received a positive advice, protocol number 14-165/C. This method will be developed in close cooperation with staff nurses and ward management. The strong involvement of the end users will contribute to a broader support of the results. The method we will develop may also be useful for planning purposes; this is a strong advantage compared with existing methods, which tend to focus on retrospective analysis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence
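The comparison the protocol describes (estimated care time plus non-patient-related time, set against available nursing hours) reduces to a simple ratio. The patient groups, care-time estimates and staffing figures below are assumptions for illustration, not study results.

```python
# Illustrative ward workload ratio: demand minutes / available nursing minutes.
def ward_workload(census, care_minutes_per_group, non_patient_minutes, available_minutes):
    """census: dict patient_group -> number of patients on the shift."""
    demand = sum(care_minutes_per_group[g] * n for g, n in census.items())
    demand += non_patient_minutes
    return demand / available_minutes          # > 1.0 means demand exceeds staffing

if __name__ == "__main__":
    census = {"post_op_day_0": 4, "post_op_day_1_plus": 10, "pre_op": 6}
    care_minutes = {"post_op_day_0": 120, "post_op_day_1_plus": 60, "pre_op": 30}
    ratio = ward_workload(census, care_minutes,
                          non_patient_minutes=480,
                          available_minutes=4 * 8 * 60)   # 4 nurses x 8 h
    print(f"workload ratio: {ratio:.2f}")
```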
Evolution of the ATLAS PanDA workload management system for exascale computational science
NASA Astrophysics Data System (ADS)
Maeno, T.; De, K.; Klimentov, A.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.; Yu, D.; Atlas Collaboration
2014-06-01
An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of other data intensive scientific applications. Alpha-Magnetic Spectrometer [4], an astro-particle experiment on the International Space Station, and the Compact Muon Solenoid [5], an LHC experiment, have successfully evaluated PanDA and are pursuing its adoption. In this paper, a description of the new program of work to develop a generic version of PanDA will be given, as well as the progress in extending PanDA's capabilities to support supercomputers and clouds and to leverage intelligent networking. PanDA has demonstrated at a very large scale the value of automated dynamic brokering of diverse workloads across distributed computing resources. The next generation of PanDA will allow other data-intensive sciences and a wider exascale community employing a variety of computing platforms to benefit from ATLAS' experience and proven tools.
Physiological correlates of mental workload
NASA Technical Reports Server (NTRS)
Zacharias, G. L.
1980-01-01
A literature review was conducted to assess the basis of and techniques for physiological assessment of mental workload. The study findings reviewed had shortcomings involving one or more of the following basic problems: (1) physiologic arousal can be easily driven by non-workload factors, confounding any proposed metric; (2) the profound absence of underlying physiologic models has promulgated a multiplicity of seemingly arbitrary signal processing techniques; (3) the unspecified multidimensional nature of physiological "state" has given rise to a broad spectrum of competing noncommensurate metrics; and (4) the lack of an adequate definition of workload compels physiologic correlations to suffer either from the vagueness of implicit workload measures or from the variance of explicit subjective assessments. Using specific studies as examples, two basic signal processing/data reduction techniques in current use, time and ensemble averaging, are discussed.
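The two reductions named above, time averaging and ensemble averaging, are easy to show on synthetic data. The signal model below is purely illustrative.

```python
# Time (moving) average of one trace vs. ensemble average over event-aligned epochs.
import numpy as np

def time_average(signal, window):
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")

def ensemble_average(epochs):
    """epochs: 2-D array, one row per event-aligned trial."""
    return epochs.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 200)
    evoked = np.exp(-((t - 0.3) ** 2) / 0.005)                  # idealized evoked response
    trials = evoked + rng.normal(0.0, 0.5, size=(40, t.size))   # 40 noisy repetitions
    print(time_average(trials[0], window=10).shape)             # smoothed single trial
    print(np.abs(ensemble_average(trials) - evoked).mean())     # residual noise shrinks ~ 1/sqrt(40)
```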
Single-pilot workload management in entry-level jets.
DOT National Transportation Integrated Search
2013-09-01
Researchers from the NASA Ames Flight Cognition Lab and the FAA's Flight Deck Human Factors Research Laboratory at the Civil Aerospace Medical Institute (CAMI) examined task and workload management by single pilots in Very Light Jets (VLJs), also c...
Kilgore, Matthew D
and practice application of the KHFCM Acuity Tool. Quality improvement outcomes included a more valid reflection of encounter times and demonstration of the KHFCM Acuity Tool as a reliable, practical, credible, and satisfying tool for reflecting HF case manager workloads and HF disease severity. The KHFCM Acuity Tool defines workload simply as a function of the number of HFCM services performed and the duration of time spent on a client encounter. The design of the tool facilitates the measure of workload, service utilization, and HF disease characteristics, independently from the overall measure of acuity, so that differences in individual case manager practice, as well as client characteristics within sites, across sites, and potentially throughout annual seasons, can be demonstrated. Data produced from long-term applications of the KHFCM Acuity Tool, across all regions, could serve as a driver for establishing systemwide HFCM productivity benchmarks or standards of practice for HF case managers. Data produced from localized applications could serve as a reference for coordinating staffing resources or developing HFCM productivity benchmarks within individual regions or sites.
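Taking the stated definition at face value, a minimal sketch of such a workload score might multiply the number of services delivered by the encounter duration. The scoring rule and service names are assumptions for illustration, not the KHFCM Acuity Tool itself.

```python
# Toy encounter workload: number of services performed x encounter minutes (assumed rule).
def encounter_workload(services_performed, encounter_minutes):
    return len(services_performed) * encounter_minutes

def case_manager_workload(encounters):
    """encounters: list of (services_performed, encounter_minutes) tuples."""
    return sum(encounter_workload(s, m) for s, m in encounters)

if __name__ == "__main__":
    encounters = [
        (["med_reconciliation", "diet_education"], 30),
        (["symptom_triage"], 15),
        (["med_reconciliation", "follow_up_scheduling", "telemonitoring_review"], 45),
    ]
    print(case_manager_workload(encounters))   # 2*30 + 1*15 + 3*45 = 210
```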
Single-pilot workload management in entry-level jets : appendices.
DOT National Transportation Integrated Search
2013-09-01
Researchers from the NASA Ames Flight Cognition Lab and the FAA's Flight Deck Human Factors Research Laboratory at the Civil Aerospace Medical Institute (CAMI) examined task and workload management by single pilots in Very Light Jets (VLJs), also c...
Single-Pilot Workload Management in Entry-Level Jets
2013-09-01
under Instrument Flight Rules (IFR) in a Cessna Citation Mustang ELJ level 5 flight training device at CAMI. Eight of the pilots were Mustang owner... pilots flew an experimental flight with two legs involving high workload management under Instrument Flight Rules (IFR) in a Cessna Citation Mustang
The workload analysis in welding workshop
NASA Astrophysics Data System (ADS)
Wahyuni, D.; Budiman, I.; Tryana Sembiring, M.; Sitorus, E.; Nasution, H.
2018-03-01
This research was conducted in a welding workshop which produces doors, fences, canopies, etc., according to customers' orders. The symptoms of excessive workload were evident in employee complaints, requests for additional employees, and late completion times (11 of 28 orders were completed late, and 7 customers lodged complaints). The top management of the workshop assumed that the employees' workload was still within a tolerable limit. Therefore, a workload analysis was required to determine the number of employees needed. Workload was measured using a physiological method and a workload analysis. The results of this research can be utilized by the workshop for better workload management.
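A back-of-the-envelope version of the staffing question, translating measured task minutes into a required head count, might look like the sketch below. The order times, allowance factor and working hours are invented for illustration; the study itself relies on physiological measurement and a formal workload analysis.

```python
# Illustrative staffing estimate: total demand minutes / available minutes per welder.
import math

def required_employees(order_minutes, allowance_factor, minutes_per_employee):
    """allowance_factor: multiplier (> 1) covering rest, setup and rework time."""
    total_demand = sum(order_minutes) * allowance_factor
    return math.ceil(total_demand / minutes_per_employee)   # round up to whole people

if __name__ == "__main__":
    orders = [900, 1200, 600, 1500, 800]                    # estimated minutes per order this week
    n = required_employees(orders, allowance_factor=1.25,
                           minutes_per_employee=5 * 8 * 60)  # 5 days x 8 h
    print(f"employees required: {n}")
```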
Good, Marjorie J; Hurley, Patricia; Woo, Kaitlin M; Szczepanek, Connie; Stewart, Teresa; Robert, Nicholas; Lyss, Alan; Gönen, Mithat; Lilenbaum, Rogerio
2016-05-01
Clinical research program managers are regularly faced with the quandary of determining how much of a workload research staff members can manage while they balance clinical practice and still achieve clinical trial accrual goals, maintain data quality and protocol compliance, and stay within budget. A tool was developed to measure clinical trial-associated workload, to apply objective metrics toward documentation of work, and to provide clearer insight to better meet clinical research program challenges and aid in balancing staff workloads. A project was conducted to assess the feasibility and utility of using this tool in diverse research settings. Community-based research programs were recruited to collect and enter clinical trial-associated monthly workload data into a web-based tool for 6 consecutive months. Descriptive statistics were computed for self-reported program characteristics and workload data, including staff acuity scores and number of patient encounters. Fifty-one research programs that represented 30 states participated. Median staff acuity scores were highest for staff with patients enrolled in studies and receiving treatment, relative to staff with patients in follow-up status. Treatment trials typically resulted in higher median staff acuity, relative to cancer control, observational/registry, and prevention trials. Industry trials exhibited higher median staff acuity scores than trials sponsored by the National Institutes of Health/National Cancer Institute, academic institutions, or others. The results from this project demonstrate that trial-specific acuity measurement is a better measure of workload than simply counting the number of patients. The tool was shown to be feasible and useable in diverse community-based research settings. Copyright © 2016 by American Society of Clinical Oncology.
NASA TLA workload analysis support. Volume 3: FFD autopilot scenario validation data
NASA Technical Reports Server (NTRS)
Sundstrom, J. L.
1980-01-01
The data used to validate a seven-time-line analysis of the forward flight deck (FFD) autopilot mode for the pilot and copilot of the NASA B737 terminal configured vehicle are presented. Demand workloads are given in two forms: workload histograms and workload summaries (bar graphs). A report showing task length and task interaction is also presented.
Integration of PanDA workload management system with Titan supercomputer at OLCF
NASA Astrophysics Data System (ADS)
De, K.; Klimentov, A.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.
2015-12-01
The PanDA (Production and Distributed Analysis) workload management system (WMS) was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. While PanDA currently distributes jobs to more than 100,000 cores at well over 100 Grid sites, the future LHC data taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with Titan supercomputer at Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA pilot framework for job submission to Titan's batch queues and local data management, with light-weight MPI wrappers to run single threaded workloads in parallel on Titan's multicore worker nodes. It also gives PanDA new capability to collect, in real time, information about unused worker nodes on Titan, which allows precise definition of the size and duration of jobs submitted to Titan according to available free resources. This capability significantly reduces PanDA job wait time while improving Titan's utilization efficiency. This implementation was tested with a variety of Monte-Carlo workloads on Titan and is being tested on several other supercomputing platforms. Notice: This manuscript has been authored, by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
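The backfill mechanism described above (query how many worker nodes are idle and for how long, then shape the job to fit) can be sketched as follows. `query_free_resources` is a placeholder, not a real PanDA or OLCF interface; the events-per-minute rate and safety margin are likewise assumptions.

```python
# Conceptual backfill sizing: fit a job into the currently idle nodes and time window.
def query_free_resources():
    # Placeholder: a real implementation might parse the site scheduler's
    # backfill query (e.g. Moab's `showbf`) or a monitoring API.
    return {"free_nodes": 312, "free_minutes": 95}

def shape_backfill_job(free, minutes_per_event=0.5, cores_per_node=16, safety=0.8):
    """Return (nodes, walltime_minutes, events) sized to the available gap."""
    nodes = max(1, int(free["free_nodes"] * safety))
    walltime = max(10, int(free["free_minutes"] * safety))
    events = int(nodes * cores_per_node * walltime / minutes_per_event)
    return nodes, walltime, events

if __name__ == "__main__":
    print(shape_backfill_job(query_free_resources()))
```

Sizing jobs to the measured gap is what keeps wait times short: the job never asks for more nodes or walltime than the scheduler can start immediately.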
C3PO - A Dynamic Data Placement Agent for ATLAS Distributed Data Management
NASA Astrophysics Data System (ADS)
Beermann, T.; Lassnig, M.; Barisits, M.; Serfon, C.; Garonne, V.; ATLAS Collaboration
2017-10-01
This paper introduces a new dynamic data placement agent for the ATLAS distributed data management system. This agent is designed to pre-place potentially popular data to make it more widely available. It therefore incorporates information from a variety of sources. Those include input datasets and site workload information from the ATLAS workload management system, network metrics from different sources like FTS and PerfSonar, historical popularity data collected through a tracer mechanism, and more. With this data it decides if, when and where to place new replicas that can then be used by the WMS to distribute the workload more evenly over available computing resources and ultimately reduce job waiting times. This paper gives an overview of the architecture and the final implementation of this new agent. The paper also includes an evaluation of the placement algorithm by comparing the transfer times and the new replica usage.
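A much-simplified version of the placement decision (scoring candidate sites for a new replica by combining site load, transfer speed and free storage) is sketched below. The metric names, weights and example sites are illustrative; the real agent draws on FTS and PerfSonar metrics, workload data and the popularity tracer.

```python
# Illustrative replica placement scoring (not the C3PO algorithm).
def placement_score(site, dataset_size_gb):
    transfer_hours = dataset_size_gb / max(site["throughput_gb_per_h"], 1e-6)
    return (0.5 * (1.0 - site["queue_load"])        # prefer lightly loaded sites
            - 0.3 * (transfer_hours / 24.0)         # penalize slow transfers (normalized to a day)
            + 0.2 * site["free_storage_fraction"])  # prefer sites with spare storage

def choose_sites(sites, dataset_size_gb, n_replicas=2):
    ranked = sorted(sites, key=lambda s: placement_score(s, dataset_size_gb), reverse=True)
    return [s["name"] for s in ranked[:n_replicas]]

if __name__ == "__main__":
    sites = [
        {"name": "SITE_A", "queue_load": 0.9, "throughput_gb_per_h": 400, "free_storage_fraction": 0.5},
        {"name": "SITE_B", "queue_load": 0.4, "throughput_gb_per_h": 150, "free_storage_fraction": 0.2},
        {"name": "SITE_C", "queue_load": 0.2, "throughput_gb_per_h": 60, "free_storage_fraction": 0.7},
    ]
    print(choose_sites(sites, dataset_size_gb=2000))
```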
Design Insights for MapReduce from Diverse Production Workloads
2012-01-25
different industries [5]. Consequently, there is a need to develop systematic knowledge of MapReduce behavior at both established users within technol...relevant to MapReduce-like systems that combine data movements and computation. 5.2 Task granularity Many MapReduce workload management mechanisms make...executes the jobs given, versus what the jobs actually are. MapReduce workload managers currently optimize execution scheduling and placement
USDA-ARS?s Scientific Manuscript database
Recent years have witnessed a call for evidence-based decisions in conservation and natural resource management, including data-driven decision-making. Adaptive management (AM) is one prevalent model for integrating scientific data into decision-making, yet AM has faced numerous challenges and limit...
Single Pilot Workload Management During Cruise in Entry Level Jets
NASA Technical Reports Server (NTRS)
Burian, Barbara K.; Pruchnicki, Shawn; Christopher, Bonny; Silverman, Evan; Hackworth, Carla; Rogers, Jason; Williams, Kevin; Drechsler, Gena; Runnels, Barry; Mead, Andy
2013-01-01
Advanced technologies and automation are important facilitators of single pilot operations, but they also contribute to the workload management challenges faced by the pilot. We examined task completion, workload management, and automation use in an entry level jet (ELJ) flown by single pilots. Thirteen certificated Cessna Citation Mustang (C510-S) pilots flew an instrument flight rules (IFR) experimental flight in a Cessna Citation Mustang simulator. At one point participants had to descend to meet a crossing restriction prior to a waypoint and prepare for an instrument approach into an un-towered field while facilitating communication from a lost pilot who was flying too low for ATC to hear. Four participants experienced some sort of difficulty with regard to meeting the crossing restriction and almost half (n=6) had problems associated with the instrument approach. Additional errors were also observed including eight participants landing at the airport with an incorrect altimeter setting.
NASA Technical Reports Server (NTRS)
Groce, J. L.; Boucek, G. P.
1988-01-01
This study is a continuation of an FAA effort to alleviate the growing problems of assimilating and managing the flow of data and flight related information in the air transport flight deck. The nature and extent of known pilot interface problems arising from new NAS data management programs were determined by a comparative timeline analysis of crew tasking requirements. A baseline of crew tasking requirements was established for conventional and advanced flight decks operating in the current NAS environment and then compared to the requirements for operation in a future NAS environment emphasizing Mode-S data link and TCAS. Results showed that a CDU-based pilot interface for Mode-S data link substantially increased crew visual activity as compared to the baseline. It was concluded that alternative means of crew interface should be available during high visual workload phases of flight. Results for TCAS implementation showed substantial visual and motor tasking increases, and that there was little available time between crew tasks during a TCAS encounter. It was concluded that additional research should be undertaken to address issues of ATC coordination and the relative benefit of high workload TCAS features.
GP views on strategies to cope with increasing workload: a qualitative interview study.
Fisher, Rebecca Fr; Croxson, Caroline Hd; Ashdown, Helen F; Hobbs, Fd Richard
2017-02-01
The existence of a crisis in primary care in the UK is in little doubt. GP morale and job satisfaction are low, and workload is increasing. In this challenging context, finding ways for GPs to manage that workload is imperative. To explore what existing or potential strategies are described by GPs for dealing with their workload, and their views on the relative merits of each. Semi-structured, qualitative interviews with GPs working within NHS England. All GPs working within NHS England were eligible. Of those who responded to advertisements, a maximum-variation sample was selected and interviewed until data saturation was reached. Data were analysed thematically. Responses were received from 171 GPs, and, from these, 34 were included in the study. Four main themes emerged for workload management: patient-level, GP-level, practice-level, and systems-level strategies. A need for patients to take greater responsibility for self-management was clear, but many felt that GPs should not be responsible for this education. Increased delegation of tasks was felt to be key to managing workload, with innovative use of allied healthcare professionals and extended roles for non-clinical staff suggested. Telephone triage was a commonly used tool for managing workload, although not all participants found this helpful. This in-depth qualitative study demonstrates an encouraging resilience among GPs. They are proactively trying to manage workload, often using innovative local strategies. GPs do not feel that they can do this alone, however, and called repeatedly for increased recruitment and more investment in primary care. © British Journal of General Practice 2017.
NASA Technical Reports Server (NTRS)
Ligda, Sarah V.; Dao, Arik-Quang V.; Vu, Kim-Phuong; Strybel, Thomas Z.; Battiste, Vernol; Johnson, Walter W.
2010-01-01
Pilot workload was examined during simulated flights requiring flight deck-based merging and spacing while avoiding weather. Pilots used flight deck tools to avoid convective weather and space behind a lead aircraft during an arrival into Louisville International airport. Three conflict avoidance management concepts were studied: pilot, controller or automation primarily responsible. A modified Air Traffic Workload Input Technique (ATWIT) metric showed highest workload during the approach phase of flight and lowest during the en-route phase of flight (before deviating for weather). In general, the modified ATWIT was shown to be a valid and reliable workload measure, providing more detailed information than post-run subjective workload metrics. The trend across multiple workload metrics revealed lowest workload when pilots had both conflict alerting and responsibility of the three concepts, while all objective and subjective measures showed highest workload when pilots had no conflict alerting or responsibility. This suggests that pilot workload was not tied primarily to responsibility for resolving conflicts, but to gaining and/or maintaining situation awareness when conflict alerting is unavailable.
Managing Teacher Workload: Work-Life Balance and Wellbeing
ERIC Educational Resources Information Center
Bubb, Sara; Earley, Peter
2004-01-01
This book is divided into three sections. In the First Section, entitled "Wellbeing and Workload", the authors examine teacher workload and how teachers spend their time. Chapter 1 focuses on what the causes and effects of excessive workload are, especially in relation to wellbeing, stress and, crucially, recruitment and retention…
Operator strategies under varying conditions of workload
NASA Technical Reports Server (NTRS)
Arnegard, Ruth J.
1991-01-01
An attempt was made to operationally define and measure strategic behavior in a complex multiple-task environment. The Multi-Attribute Task battery was developed to simulate various aspects of flight and consisted of an auditory communication task, monitoring tasks, a tracking task, a resource management task which allowed a wide range of responding patterns, and a scheduling window which allowed operators to predict changes in workload. This battery was validated for its sensitivity to strategic behavior, and baseline measures for each individual task were collected. Twenty-four undergraduate and graduate students then performed the battery for four 64-minute sessions which took place over a period of 2 days. Each subject performed the task battery under four levels of workload, which were presented for equal lengths of time during all four sessions. Results indicated that, in general, performance improved as a function of experience with the battery, but that performance decreased as workload level increased. The data also showed that subjects developed strategies for responding to the resource management task which allowed them to manage the high workload levels more efficiently. This particular strategy developed over time but was also associated with errors of complacency. These results are presented along with implications for the aviation field and areas of future research.
Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei
2012-01-01
Aiming to ease the secondary use of clinical data in clinical research, we introduce a metadata-driven, web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, standalone software for metadata definition; and 2) a web-based data warehouse system for data management. With ClinData Express, all the researchers need to do is define the metadata and data model in the m-designer. The web interface for data collection and the specific database for data storage are then generated automatically. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease data collections in Chinese and one form from dbGaP. The system's flexibility gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress. PMID:23304327
Combining Quick-Turnaround and Batch Workloads at Scale
NASA Technical Reports Server (NTRS)
Matthews, Gregory A.
2012-01-01
NAS uses PBS Professional to schedule and manage the workload on Pleiades, an 11,000+ node InfiniBand cluster. At this scale the user experience for quick-turnaround jobs can degrade, which led NAS initially to set up two separate PBS servers, each dedicated to a particular workload. Recently we have employed PBS hooks and scheduler modifications to merge these workloads together under one PBS server, delivering sub-1-minute start times for the quick-turnaround workload, and enabling dynamic management of the resources set aside for that workload.
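The core decision in such a merged setup is whether an incoming job qualifies for the quick-turnaround pool. The sketch below shows that routing logic in plain Python; the thresholds, pool size and queue names are assumptions for illustration, not NAS's actual PBS configuration or hook code.

```python
# Toy routing decision between a quick-turnaround pool and the batch pool.
QUICK_MAX_WALLTIME_S = 2 * 3600   # jobs shorter than this count as quick-turnaround
 
def route_job(requested_nodes: int, requested_walltime_s: int,
              quick_pool_free_nodes: int) -> str:
    """Return the pool a job should go to, given current quick-pool capacity."""
    quick = (requested_walltime_s <= QUICK_MAX_WALLTIME_S
             and requested_nodes <= quick_pool_free_nodes)
    return "quick" if quick else "batch"

print(route_job(64, 1800, quick_pool_free_nodes=200))   # -> quick
print(route_job(512, 86400, quick_pool_free_nodes=200)) # -> batch
```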
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Hillyer, T. N.; Wilkins, J.
2012-12-01
The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
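Since the abstract identifies I/O as the contended resource and proposes a throttling workload manager, the following Python sketch shows the basic shape of such a throttle: submit queued jobs only while a measured I/O load stays under a cap. The load metric, cap and job names are illustrative assumptions, not the CERES Java/Perl implementation.

```python
# Minimal sketch of a throttling submission loop, assuming a caller-supplied
# io_load() function that returns current aggregate I/O pressure.
import time
from collections import deque

def run_throttled(jobs, io_load, io_cap, poll_s=5):
    """Submit queued jobs only while the measured I/O load is below io_cap."""
    queue = deque(jobs)
    while queue:
        if io_load() < io_cap:
            print("submit", queue.popleft())  # real code would call the scheduler
        else:
            time.sleep(poll_s)                # back off until I/O pressure drops

# Example with a dummy load metric that is always low.
run_throttled([f"ceres_job_{i:02d}" for i in range(3)],
              io_load=lambda: 0, io_cap=10, poll_s=0)
```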
A data-driven approach to quality risk management
Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David
2013-01-01
Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real-time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues; these included: whether the study used a placebo, whether the agent was a biologic, unusual packaging label, complex dosing, and over 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
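For readers unfamiliar with the two methods named above, here is a small Python sketch of a Wilcoxon rank-sum test and a logistic regression run on synthetic trial-level data. The data, variables and coefficients are invented for illustration and have no connection to the Pfizer dataset; the sketch assumes NumPy, SciPy and statsmodels are installed.

```python
# Illustrative association tests on made-up trial-level quality data.
import numpy as np
from scipy.stats import ranksums
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
planned_procedures = rng.integers(5, 40, n)
uses_placebo = rng.integers(0, 2, n)
# Synthetic outcome: a quality issue is more likely with many procedures.
p = 1 / (1 + np.exp(-0.08 * (planned_procedures - 25)))
quality_issue = rng.binomial(1, p)

# Wilcoxon rank-sum: do trials with issues have more planned procedures?
print(ranksums(planned_procedures[quality_issue == 1],
               planned_procedures[quality_issue == 0]))

# Logistic regression of quality issues on candidate risk factors.
X = sm.add_constant(np.column_stack([planned_procedures, uses_placebo]))
print(sm.Logit(quality_issue, X).fit(disp=0).params)
```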
The impact on the workload of the Ward Manager with the introduction of administrative assistants.
Locke, Rachel; Leach, Camilla; Kitsell, Fleur; Griffith, Jacki
2011-03-01
To evaluate the impact on the workload of the Ward Manager (WM) of the introduction of administrative assistants into eight trusts in the South of England in a year-long pilot. Ward Managers are nurse leaders who are responsible for ward management and delivering expert clinical care to patients. They have traditionally been expected to achieve this role without administrative assistance. Meeting the workload demands of multiple roles has meant that the leadership and clinical roles have suffered, creating issues of low morale among existing WMs and difficulties in recruiting the next generation of WMs. Sixty qualitative interviews were carried out with 16 WMs, 12 Ward Manager Assistants (WMAs), and six senior nurse executives about the impact of the introduction of the WMA post. Quantitative data to measure change in WM workload and ward activity were supplied by 24 wards. Ward Managers reported spending reduced time on administrative tasks and having increased time available to spend on the ward with patients and leading staff. With the introduction of WMAs, there was also improvement in key performance measures (the maintenance of quality under service pressures) and increased staff motivation. There was overwhelming support for the introduction of administrative assistants from participating WMs. The WMAs enabled WMs to spend more time with patients and, more widely, to provide greater support to ward teams. The success of the pilot is reflected in wards working hard to be able to extend the contracts of WMAs, and in wards that were not participants in the pilot, having observed the benefits of the post, working to secure funding to recruit their own WMAs. The widespread introduction of administrative assistance could increase ward productivity and provide support for clinical leaders. Continuing professional development for WMs needs to incorporate training about management responsibilities and how to best use administrative
Mental workload measurement: Event-related potentials and ratings of workload and fatigue
NASA Technical Reports Server (NTRS)
Biferno, M. A.
1985-01-01
Event-related potentials were elicited when a digitized word representing a pilot's call-sign was presented. This auditory probe was presented during 27 workload conditions in a 3x3x3 design where the following variables were manipulated: short-term load, tracking task difficulty, and time-on-task. Ratings of workload and fatigue were obtained between each trial of a 2.5-hour test. The data of each subject were analyzed individually to determine whether significant correlations existed between subjective ratings and ERP component measures. Results indicated that a significant number of subjects had positive correlations between: (1) ratings of workload and P300 amplitude, (2) ratings of workload and N400 amplitude, and (3) ratings of fatigue and P300 amplitude. These data are the first to show correlations between ratings of workload or fatigue and ERP components thereby reinforcing their validity as measures of mental workload and fatigue.
Epstein, R H; Dexter, F
2012-09-01
Perioperative interruptions generated electronically from anaesthesia information management systems (AIMS) can provide useful feedback, but may adversely affect task performance if distractions occur at inopportune moments. Ideally such interruptions would occur only at times when their impact would be minimal. In this study of AIMS data, we evaluated the times of comments, drugs, fluids and periodic assessments (e.g. electrocardiogram diagnosis and train-of-four) to develop recommendations for the timing of interruptions during the intraoperative period. The 39,707 cases studied were divided into intervals between: 1) enter operating room; 2) induction; 3) intubation; 4) surgical incision; and 5) end surgery. Five-minute intervals of no documentation were determined for each case. The offsets from the start of each interval when >50% of ongoing cases had completed initial documentation were calculated (MIN50). The primary endpoint for each interval was the percentage of all cases still ongoing at MIN50. The results showed that the intervals from entering the operating room to induction and from induction to intubation were unsuitable for interruptions, confirming prior observational studies of anaesthesia workload. At least 13 minutes after surgical incision was the most suitable time for interruptions, with 92% of cases still ongoing. Timing was minimally affected by the type of anaesthesia, surgical facility, surgical service, prone positioning or scheduled case duration. The implication of our results is that for mediated interruptions, waiting at least 13 minutes after the start of surgery is appropriate. Although we used AIMS data, operating room information system data are also suitable.
Evolutionary Multiobjective Query Workload Optimization of Cloud Data Warehouses
Dokeroglu, Tansel; Sert, Seyyit Alper; Cinar, Muhammet Serkan
2014-01-01
With the advent of Cloud databases, query optimizers need to find Pareto-optimal solutions in terms of response time and monetary cost. Our novel approach minimizes both objectives by deploying alternative virtual resources and query plans, making use of the virtual resource elasticity of the Cloud. We propose an exact multiobjective branch-and-bound and a robust multiobjective genetic algorithm for the optimization of distributed data warehouse query workloads on the Cloud. In order to investigate the effectiveness of our approach, we incorporate the devised algorithms into a prototype system. Finally, through several experiments that we have conducted with different workloads and virtual resource configurations, we report notable findings on alternative deployments as well as the advantages and disadvantages of the multiobjective algorithms we propose. PMID:24892048
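To clarify what "Pareto-optimal in response time and monetary cost" means in practice, the short Python sketch below filters a set of candidate deployments down to the non-dominated ones. The candidate plans and their numbers are invented for illustration; the paper's actual algorithms (branch-and-bound, genetic) search this space far more cleverly.

```python
# Keep only deployments that no other plan beats on both response time and cost.
def pareto_front(plans):
    front = []
    for name, (t, c) in plans.items():
        dominated = any(t2 <= t and c2 <= c and (t2, c2) != (t, c)
                        for t2, c2 in plans.values())
        if not dominated:
            front.append(name)
    return front

plans = {                       # (response time in s, cost in $/query)
    "2 small VMs": (120.0, 0.20),
    "4 small VMs": (70.0, 0.40),
    "1 large VM":  (90.0, 0.45),   # dominated by "4 small VMs"
    "8 small VMs": (65.0, 0.80),
}
print(pareto_front(plans))  # -> ['2 small VMs', '4 small VMs', '8 small VMs']
```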
Sanders, Gabriel J; Roll, Brad; Peacock, Corey A; Kollock, Roger O
2018-05-02
Sanders, GJ, Roll, B, Peacock, CA, and Kollock, RO. Maximum movement workloads and high-intensity workload demands by position in NCAA division I collegiate football. J Strength Cond Res XX(X): 000-000, 2018-The purpose of the study was to quantify the average and maximum (i.e., peak) movement workloads, and the percent of those workloads performed at high intensity, by NCAA division I football athletes during competitive games. Using global positioning system devices (Catapult Sports), low-, moderate-, and high-intensity and total multidirectional movement workloads were quantified for each position. Strategically achieving maximal workloads may improve both conditioning and rehabilitation protocols for athletes as they prepare for competition or return to play after an injury. A total of 40 football athletes were included in the analysis. For the data to be included, athletes were required to participate in ≥75% of the offensive or defensive snaps for any given game. There was a total of 286 data downloads from 13 different games for 8 different football positions. Data were calculated and compared by offensive and defensive position to establish the mean, SD, and maximum workloads during competitive games. The percent high-intensity workload profile was established to assess the total number and percent of high-intensity movement workloads by position. The profile was calculated by dividing a position's maximal high-intensity movement workload by the total (e.g., sum of maximal low, moderate, and high-intensity movements) movement workload. One-way analyses of variance revealed that there was a main effect of football position for total movement workloads and the percent of workloads performed at high intensities (p ≤ 0.025 for all). Maximal high-intensity workloads were 1.6-4.3 times greater than average high-intensity workloads, and the percent of total workloads performed at high intensities varied greatly by position. Strategically training for and using maximal
Heavy vehicle driver workload assessment. Task 3, task analysis data collection
DOT National Transportation Integrated Search
This technical report consists of a collection of task analytic data to support heavy vehicle driver workload assessment and protocol development. Data were collected from professional drivers to provide insights into the following issues: the meanin...
The Workload Curve: Subjective Mental Workload.
Estes, Steven
2015-11-01
In this paper I begin looking for evidence of a subjective workload curve. Results from subjective mental workload assessments are often interpreted linearly. However, I hypothesized that ratings of subjective mental workload increase nonlinearly with unitary increases in working memory load. Two studies were conducted. In the first, the participant provided ratings of the mental difficulty of a series of digit span recall tasks. In the second study, participants provided ratings of mental difficulty associated with recall of visual patterns. The results of the second study were then examined using a mathematical model of working memory. An S curve, predicted a priori, was found in the results of both the digit span and visual pattern studies. A mathematical model showed a tight fit between workload ratings and levels of working memory activation. This effort provides good initial evidence for the existence of a workload curve. The results support further study in applied settings and other facets of workload (e.g., temporal workload). Measures of subjective workload are used across a wide variety of domains and applications. These results bear on their interpretation, particularly as they relate to workload thresholds. © 2015, Human Factors and Ergonomics Society.
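The hypothesized S-shaped relationship between working-memory load and subjective workload can be written as a logistic function. The Python sketch below evaluates one such curve; the midpoint, steepness and scale values are illustrative assumptions, not the parameters fitted in the paper.

```python
# Logistic (S-shaped) mapping from working-memory load to a 0-100 workload rating.
import math

def workload_rating(memory_load, midpoint=5.0, steepness=1.2, max_rating=100.0):
    """Ratings rise slowly at low load, steeply near the midpoint, then saturate."""
    return max_rating / (1.0 + math.exp(-steepness * (memory_load - midpoint)))

for items in range(1, 10):
    print(items, round(workload_rating(items), 1))
```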
TASKILLAN II - Pilot strategies for workload management
NASA Technical Reports Server (NTRS)
Segal, Leon D.; Wickens, Christopher D.
1990-01-01
This study focused on the strategies used by pilots in managing their workload level, and their subsequent task performance. Sixteen licensed pilots flew 42 missions on a helicopter simulation, and were evaluated on their performance of the overall mission, as well as individual tasks. Pilots were divided into four groups, defined by the presence or absence of scheduling control over tasks and the availability of intelligence concerning the type and stage of difficulties imposed during the flight. Results suggest that intelligence supported strategies that yielded significantly higher performance levels, while scheduling control seemed to have no impact on performance. Both difficulty type and the stage of difficulty impacted performance significantly, with the strongest effects for time stress and difficulties imposed late in the flight.
On the Modeling and Management of Cloud Data Analytics
NASA Astrophysics Data System (ADS)
Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni
A new era is dawning where vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good performance at run-time to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
Gregório, João; Cavaco, Afonso Miguel; Lapão, Luís Velez
Primary health care disease management models are rooted in multidisciplinary participation; however, implementation of services is lagging behind desires and predictions. Barriers like workload and lack of demand have been described. The aim of this research is to observe the workload and work patterns of Portuguese community pharmacists, and relate them to the demand for pharmaceutical services. A time-and-motion observational study was performed to describe community pharmacists' workload in a sample of four pharmacies in the metropolitan Lisbon area. A reference list of activities to be observed was developed by reviewing other studies of community pharmacy work. This study took place during a weekday's 8-h shift, focusing on pharmacists' activities. Data to be collected included the type and duration of the activity, who performed it and where. To estimate the demand for pharmaceutical care services, "thematic-patient scenarios" were developed. These scenarios were based on the defined daily dose and package size of the most consumed medicines in Portugal, combined with data obtained from the four pharmacies' information systems on the day the observational study took place. Between 67.0% and 81.8% of the registered activities were pharmacist-patient interactions. These interactions summed to 158.44 min, with a mean duration of 3.98 min per interaction. On average, participating pharmacies' professionals handled 4.2 prescriptions and 0.9 over-the-counter (OTC) consultations per hour. About one third of the day was spent performing administrative and non-differentiated tasks. About 54.92 min were registered as free time, 50% of which were "micro pauses" of 1 min or less. The most dispensed therapeutic subgroup was antihypertensive drugs, while the dispensation of antidiabetics was characterized by a high number of packages sold per interaction. From the developed scenarios, one can estimate that a chronic patient may visit the pharmacy 4-9 times per year
Automated clustering-based workload characterization
NASA Technical Reports Server (NTRS)
Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena
1996-01-01
The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
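As a small illustration of the clustering step described above, the Python sketch below runs k-means on synthetic file sizes (in log space) and reports per-cluster statistics of the kind the tool produces. The data, the choice of k=3 and the use of scikit-learn and NumPy are assumptions for illustration, not the tool's implementation.

```python
# Cluster synthetic file sizes from a mass-storage log and summarize each cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for file sizes (bytes) parsed from system logs.
sizes = np.concatenate([rng.lognormal(10, 1, 500),   # small files
                        rng.lognormal(16, 1, 200),   # medium files
                        rng.lognormal(22, 1, 50)])   # large files

features = np.log10(sizes).reshape(-1, 1)            # cluster in log space
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for k in range(3):
    cluster = sizes[labels == k]
    print(f"cluster {k}: n={cluster.size}, mean={cluster.mean():.3e}, "
          f"min={cluster.min():.3e}, max={cluster.max():.3e}")
```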
MEASURING WORKLOAD OF ICU NURSES WITH A QUESTIONNAIRE SURVEY: THE NASA TASK LOAD INDEX (TLX).
Hoonakker, Peter; Carayon, Pascale; Gurses, Ayse; Brown, Roger; McGuire, Kerry; Khunlertkit, Adjhaporn; Walker, James M
2011-01-01
High workload of nurses in Intensive Care Units (ICUs) has been identified as a major patient safety and worker stress problem. However, relatively little attention has been dedicated to the measurement of workload in healthcare. The objectives of this study are to describe and examine several methods to measure the workload of ICU nurses. We then focus on the measurement of ICU nurses' workload using a subjective rating instrument: the NASA TLX. We conducted secondary data analysis on data from two multi-site, cross-sectional questionnaire studies to examine several instruments to measure ICU nurses' workload. The combined database contains the data from 757 ICU nurses in 8 hospitals and 21 ICUs. Results show that the different methods to measure workload of ICU nurses, such as patient-based and operator-based workload, are only moderately correlated, or not correlated at all. Results further show that among the operator-based instruments, the NASA TLX is the most reliable and valid questionnaire to measure workload and that the NASA TLX can be used in a healthcare setting. Managers of hospitals and ICUs can benefit from the results of this research as it provides benchmark data on workload experienced by nurses in a variety of ICUs.
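For readers unfamiliar with how a NASA-TLX score is actually computed, the Python sketch below shows the standard weighted version: six subscale ratings (0-100) combined with weights obtained from 15 pairwise comparisons. The sample ratings and weights are made up for illustration.

```python
# Standard weighted NASA-TLX: weighted average of six subscale ratings,
# with weights taken from 15 pairwise "which mattered more" comparisons.
SCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx(ratings, pairwise_wins):
    """ratings: 0-100 per scale; pairwise_wins: times each scale was chosen (sums to 15)."""
    assert sum(pairwise_wins.values()) == 15
    return sum(ratings[s] * pairwise_wins[s] for s in SCALES) / 15.0

ratings = {"mental": 70, "physical": 40, "temporal": 80,
           "performance": 30, "effort": 65, "frustration": 55}
wins = {"mental": 5, "physical": 1, "temporal": 4,
        "performance": 1, "effort": 3, "frustration": 1}
print(round(nasa_tlx(ratings, wins), 1))  # -> 66.0
```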
KNMI DataLab experiences in serving data-driven innovations
NASA Astrophysics Data System (ADS)
Noteboom, Jan Willem; Sluiter, Raymond
2016-04-01
Climate change research and innovations in weather forecasting rely more and more on (Big) data. Besides increasing data from traditional sources (such as observation networks, radars and satellites), the use of open data, crowd sourced data and the Internet of Things (IoT) is emerging. To deploy these sources of data optimally in our services and products, KNMI has established a DataLab to serve data-driven innovations in collaboration with public and private sector partners. Big data management, data integration, data analytics including machine learning and data visualization techniques are playing an important role in the DataLab. Cross-domain data-driven innovations that arise from public-private collaborative projects and research programmes can be explored, experimented with and/or piloted by the KNMI DataLab. Furthermore, advice can be requested on (Big) data techniques and data sources. In support of collaborative (Big) data science activities, scalable environments are offered with facilities for data integration, data analysis and visualization. In addition, Data Science expertise is provided directly or from a pool of internal and external experts. At the EGU conference, experiences gained and best practices in operating the KNMI DataLab to optimally serve data-driven innovations for weather and climate applications are presented.
[Evaluation of nurse workload in patients undergoing therapeutic hypothermia].
Argibay-Lago, Ana; Fernández-Rodríguez, Diego; Ferrer-Sala, Nuria; Prieto-Robles, Cristina; Hernanz-del Río, Alexandre; Castro-Rebollo, Pedro
2014-01-01
Therapeutic hypothermia (TH) is recommended to minimize neurological damage in patients surviving sudden cardiac arrest (SCA). There is a scarcity of data evaluating the nursing workload in these patients. The objective of the study is to assess the workload of nurses whilst treating patients undergoing TH after SCA. A 43-month prospective-retrospective comparative cohort study was designed. Patients admitted to the intensive care unit for recovered SCA and persistent coma were included. A comparison was made using the baseline characteristics, medical management, in-hospital mortality, and nursing workload during the first 96 hours using the Therapeutic Intervention Scoring System-28 (TISS-28), Nursing Activities Score (NAS), and Nine Equivalents of Nursing Manpower Use Score (NEMS) scales among patients who received TH and those who did not. A total of 46 patients were included: 26 in the TH group and 20 in the non-TH group. Regarding baseline characteristics and management, the TH group presented a higher prevalence of smoking habit (69 vs. 25%, p=0.012), out-of-hospital SCA (96 vs. 55%, p<0.001), and the performance of coronary angiography (96 vs. 65%, p=0.014) compared with the non-TH group. No differences were observed in the nursing workload, assessed by the TISS-28, NAS or NEMS scales, or in in-hospital mortality. In this study the performance of TH in SCA survivors is not associated with an increase in nursing workload. The installation of a TH program does not require the use of more nursing resources in terms of workload. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
Predicting operator workload during system design
NASA Technical Reports Server (NTRS)
Aldrich, Theodore B.; Szabo, Sandra M.
1988-01-01
A workload prediction methodology was developed in response to the need to measure workloads associated with operation of advanced aircraft. The application of the methodology will involve: (1) conducting mission/task analyses of critical mission segments and assigning estimates of workload for the sensory, cognitive, and psychomotor workload components of each task identified; (2) developing computer-based workload prediction models using the task analysis data; and (3) exercising the computer models to produce predictions of crew workload under varying automation and/or crew configurations. Critical issues include reliability and validity of workload predictors and selection of appropriate criterion measures.
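The timeline-summation idea behind such prediction models is straightforward: each task carries component workload estimates, and concurrent tasks are summed per channel and checked against a red-line. The Python sketch below illustrates that idea; the task names, scale values and overload threshold are assumptions, not the methodology's actual ratings.

```python
# Sum sensory/cognitive/psychomotor workload estimates over concurrent tasks
# and flag channels that exceed an assumed red-line.
TASKS = {
    "monitor instruments": {"sensory": 3.0, "cognitive": 1.0, "psychomotor": 0.0},
    "radio call":          {"sensory": 2.0, "cognitive": 3.0, "psychomotor": 2.0},
    "manual tracking":     {"sensory": 4.0, "cognitive": 2.0, "psychomotor": 5.0},
}
OVERLOAD = 7.0  # per-channel red-line (assumed)

def channel_load(active_tasks):
    totals = {"sensory": 0.0, "cognitive": 0.0, "psychomotor": 0.0}
    for task in active_tasks:
        for channel, value in TASKS[task].items():
            totals[channel] += value
    return {c: (v, v > OVERLOAD) for c, v in totals.items()}

print(channel_load(["monitor instruments", "radio call", "manual tracking"]))
```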
Curriculum Change Management and Workload
ERIC Educational Resources Information Center
Alkahtani, Aishah
2017-01-01
This study examines the ways in which Saudi teachers have responded or are responding to the challenges posed by a new curriculum. It also deals with issues relating to workload demands which affect teachers' performance when they apply a new curriculum in a Saudi Arabian secondary school. In addition, problems such as scheduling and sharing space…
GPs’ perceptions of workload in England: a qualitative interview study
Croxson, Caroline HD; Ashdown, Helen F; Hobbs, FD Richard
2017-01-01
Background GPs report the lowest levels of morale among doctors, job satisfaction is low, and the GP workforce is diminishing. Workload is frequently cited as negatively impacting on commitment to a career in general practice, and many GPs report that their workload is unmanageable. Aim To gather an in-depth understanding of GPs’ perceptions and attitudes towards workload. Design and setting All GPs working within NHS England were eligible. Advertisements were circulated via regional GP e-mail lists and national social media networks in June 2015. Of those GPs who responded, a maximum-variation sample was selected until data saturation was reached. Method Semi-structured, qualitative interviews were conducted. Data were analysed thematically. Results In total, 171 GPs responded, and 34 were included in this study. GPs described an increase in workload over recent years, with current working days being long and intense, raising concerns over the wellbeing of GPs and patients. Full-time partnership was generally not considered to be possible, and many participants felt workload was unsustainable, particularly given the diminishing workforce. Four major themes emerged to explain increased workload: increased patient needs and expectations; a changing relationship between primary and secondary care; bureaucracy and resources; and the balance of workload within a practice. Continuity of care was perceived as being eroded by changes in contracts and working patterns to deal with workload. Conclusion This study highlights the urgent need to address perceived lack of investment and clinical capacity in general practice, and suggests that managing patient expectations around what primary care can deliver, and reducing bureaucracy, have become key issues, at least until capacity issues are resolved. PMID:28093422
NASA Astrophysics Data System (ADS)
Pinner, J. W., IV
2016-02-01
Data from shipboard oceanographic sensors are collected in various ASCII, binary, open and proprietary formats. Acquiring all of these formats using a single, monolithic data acquisition system (DAS) can be cumbersome, complex and difficult to adapt to the ever-changing suite of emerging oceanographic sensors. Another approach to the at-sea data acquisition challenge is to utilize multiple DAS software packages and corral the resulting data files with a ship-wide data management system. The Open Vessel Data Management project (OpenVDM) implements this second approach to ship-wide data management and over the last three years has successfully demonstrated its ability to deliver a consistent cruise data package to scientists while reducing the workload placed on marine technicians. In addition to meeting the at-sea and post-cruise needs of scientists, OpenVDM is helping vessel operators better adhere to the recommendations and best practices set forth by third-party data management and data quality groups such as R2R and SAMOS. OpenVDM also includes tools for supporting telepresence-enabled ocean research/exploration such as bandwidth-efficient ship-to-shore data transfers, shore-side data access, data visualization and near-real-time data quality tests and data statistics. OpenVDM is currently operating aboard three vessels. The R/V Endeavor, operated by the University of Rhode Island, is a regional-class UNOLS research vessel operating under the traditional NSF, P.I.-driven model. The E/V Nautilus, operated by the Ocean Exploration Trust, specializes in ROV-based, telepresence-enabled oceanographic research. The R/V Falkor, operated by the Schmidt Ocean Institute, is an ocean research platform focusing on cutting-edge technology development. These three vessels all have different missions, sensor suites and operating models, yet all are able to leverage OpenVDM for managing their unique datasets and delivering a more consistent cruise data package to scientists and data
Academic Workload and Working Time: Retrospective Perceptions versus Time-Series Data
ERIC Educational Resources Information Center
Kyvik, Svein
2013-01-01
The purpose of this article is to examine the validity of perceptions by academic staff about their past and present workload and working hours. Retrospective assessments are compared with time-series data. The data are drawn from four mail surveys among academic staff in Norwegian universities undertaken in the period 1982-2008. The findings show…
O'Bryan, Linda; Krueger, Janelle; Lusk, Ruth
2002-03-01
Kindred Healthcare, Inc., the nation's largest full-service network of long-term acute care hospitals, initiated a 3-year strategic plan to re-evaluate its workload management system. Here, we follow the project's most important and difficult phase: designing and implementing the patient classification system.
Defining the subjective experience of workload
NASA Technical Reports Server (NTRS)
Hart, S. G.; Childress, M. E.; Bortolussi, M.
1981-01-01
Flight scenarios that represent different types and levels of pilot workload are needed in order to conduct research about, and develop measures of, pilot workload. In order to be useful, however, the workload associated with such scenarios and the component tasks must be determined independently. An initial study designed to provide such information was conducted by asking a panel of general aviation pilots to evaluate flight-related tasks for the overall, perceptual, physical, and cognitive workload they impose. These ratings will provide the nucleus for a data base of flight-related primary tasks that have been independently rated for workload to use in workload assessment research.
Orientations to Academic Workloads at Department Level
ERIC Educational Resources Information Center
Wolf, Amanda
2010-01-01
Universities confront many challenges in their efforts to manage staff activity with the aid of workload assessment and allocation systems. This article sets out fresh perspectives from an exploratory study designed to uncover patterns of subjective views about various aspects of workloads. Using Q methodology, academic staff in a single…
The WorkQueue project - a task queue for the CMS workload management system
NASA Astrophysics Data System (ADS)
Ryu, S.; Wakefield, S.
2012-12-01
We present the development and first experience of a new component (termed WorkQueue) in the CMS workload management system. This component provides a link between a global request system (Request Manager) and agents (WMAgents) which process requests at compute and storage resources (known as sites). These requests typically consist of creation or processing of a data sample (possibly terabytes in size). Unlike the standard concept of a task queue, the WorkQueue does not contain fully resolved work units (known typically as jobs in HEP). This would require the WorkQueue to run computationally heavy algorithms that are better suited to run in the WMAgents. Instead the request specifies an algorithm that the WorkQueue uses to split the request into reasonable size chunks (known as elements). An advantage of performing lazy evaluation of an element is that expanding datasets can be accommodated by having job details resolved as late as possible. The WorkQueue architecture consists of a global WorkQueue which obtains requests from the request system, expands them and forms an element ordering based on the request priority. Each WMAgent contains a local WorkQueue which buffers work close to the agent, this overcomes temporary unavailability of the global WorkQueue and reduces latency for an agent to begin processing. Elements are pulled from the global WorkQueue to the local WorkQueue and into the WMAgent based on the estimate of the amount of work within the element and the resources available to the agent. WorkQueue is based on CouchDB, a document oriented NoSQL database. The WorkQueue uses the features of CouchDB (map/reduce views and bi-directional replication between distributed instances) to provide a scalable distributed system for managing large queues of work. The project described here represents an improvement over the old approach to workload management in CMS which involved individual operators feeding requests into agents. This new approach allows for a
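To make the lazy-splitting and priority-ordering ideas concrete, here is a compact Python sketch in which a request is broken into coarse elements sized by an estimate of the work they contain, and an agent pulls the highest-priority elements that fit its free resources. The request names, chunk sizes and pull policy are illustrative assumptions, not the CMS WorkQueue implementation.

```python
# Split requests lazily into coarse elements and pull them by priority
# and estimated size, in the spirit of a global/local work queue.
import heapq

def split_request(request_id, total_units, chunk_units, priority):
    """Yield coarse elements; full job expansion happens later, in the agent."""
    for start in range(0, total_units, chunk_units):
        units = min(chunk_units, total_units - start)
        # heapq is a min-heap, so negate priority to pop high priority first.
        yield (-priority, request_id, start, units)

queue = []
for el in split_request("rereco_2012A", total_units=10_000, chunk_units=2_500, priority=90):
    heapq.heappush(queue, el)
for el in split_request("mc_gen_ttbar", total_units=5_000, chunk_units=2_500, priority=50):
    heapq.heappush(queue, el)

def pull(free_slots):
    """An agent pulls elements while it has capacity for their estimated work."""
    pulled = []
    while queue and free_slots >= queue[0][3]:
        el = heapq.heappop(queue)
        free_slots -= el[3]
        pulled.append(el)
    return pulled

print(pull(free_slots=6_000))  # pulls two high-priority elements of 2,500 units
```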
Heavy vehicle driver workload assessment. Task 1, task analysis data and protocols review
DOT National Transportation Integrated Search
This report contains a review of available task analytic data and protocols pertinent to heavy vehicle operation and determination of the availability and relevance of such data to heavy vehicle driver workload assessment. Additionally, a preliminary...
NASA Technical Reports Server (NTRS)
Brunstrom, Anna; Leutenegger, Scott T.; Simha, Rahul
1995-01-01
Traditionally, allocation of data in distributed database management systems has been determined by off-line analysis and optimization. This technique works well for static database access patterns, but is often inadequate for frequently changing workloads. In this paper we address how to dynamically reallocate data for partitionable distributed databases with changing access patterns. Rather than complicated and expensive optimization algorithms, a simple heuristic is presented and shown, via an implementation study, to improve system throughput by 30 percent in a local area network based system. Based on artificial wide area network delays, we show that dynamic reallocation can improve system throughput by a factor of two and a half for wide area networks. We also show that individual site load must be taken into consideration when reallocating data, and provide a simple policy that incorporates load in the reallocation decision.
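The following Python sketch illustrates one simple load-aware reallocation heuristic in the spirit of the abstract: move the hottest partition off the most loaded site whenever the load imbalance crosses a threshold. The threshold, loads and partition names are assumptions for illustration, not the paper's policy.

```python
# Toy load-aware reallocation: relocate the hottest partition from the most
# loaded site to the least loaded site when imbalance is large enough.
def rebalance(site_load, partition_home, partition_heat, imbalance=1.5):
    """Return a (partition, from_site, to_site) move, or None if balanced."""
    hot_site = max(site_load, key=site_load.get)
    cold_site = min(site_load, key=site_load.get)
    if site_load[hot_site] < imbalance * site_load[cold_site]:
        return None
    candidates = [p for p, s in partition_home.items() if s == hot_site]
    if not candidates:
        return None
    victim = max(candidates, key=lambda p: partition_heat[p])
    return (victim, hot_site, cold_site)

site_load = {"siteA": 0.90, "siteB": 0.30}
partition_home = {"p1": "siteA", "p2": "siteA", "p3": "siteB"}
partition_heat = {"p1": 120, "p2": 300, "p3": 80}
print(rebalance(site_load, partition_home, partition_heat))  # ('p2', 'siteA', 'siteB')
```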
Reconsidering the conceptualization of nursing workload: literature review.
Morris, Roisin; MacNeela, Padraig; Scott, Anne; Treacy, Pearl; Hyde, Abbey
2007-03-01
This paper reports a literature review that aimed to analyse the way in which nursing intensity and patient dependency have been considered to be conceptually similar to nursing workload, and to propose a model to show how these concepts actually differ in both theoretical and practical terms. The literature on nursing workload considers the concepts of patient 'dependency' and nursing 'intensity' in the realm of nursing workload. These concepts differ by definition but are used to measure the same phenomenon, i.e. nursing workload. The literature search was undertaken in 2004 using electronic databases, reference lists and other available literature. Papers were sourced from the Medline, Psychlit, CINAHL and Cochrane databases and through the general search engine Google. The keywords focussed on nursing workload, nursing intensity and patient dependency. Nursing work and workload concepts and labels are defined and measured in different and often contradictory ways. It is vitally important to understand these differences when using such conceptualizations to measure nursing workload. A preliminary model is put forward to clarify the relationships between nursing workload concepts. In presenting a preliminary model of nursing workload, it is hoped that nursing workload might be better understood so that it becomes more visible and recognizable. Increasing the visibility of nursing workload should have a positive impact on nursing workload management and on the provision of patient care.
MacDonald, Sharyn L S; Cowan, Ian A; Floyd, Richard A; Graham, Rob
2013-10-01
Accurate and transparent measurement and monitoring of radiologist workload is highly desirable for management of daily workflow in a radiology department, and for informing decisions on department staffing needs. It offers the potential for benchmarking between departments and assessing future national workforce and training requirements. We describe a technique for quantifying, with minimum subjectivity, all the work carried out by radiologists in a tertiary department. Six broad categories of clinical activities contributing to radiologist workload were identified: reporting, procedures, trainee supervision, clinical conferences and teaching, informal case discussions, and administration related to referral forms. Time required for reporting was measured using data from the radiology information system. Other activities were measured by observation and timing by observers, and based on these results and extensive consultation, the time requirements and frequency of each activity was agreed on. An activity list was created to record this information and to calculate the total clinical hours required to meet the demand for radiologist services. Diagnostic reporting accounted for approximately 35% of radiologist clinical time; procedures, 23%; trainee supervision, 15%; conferences and tutorials, 14%; informal case discussions, 10%; and referral-related administration, 3%. The derived data have been proven reliable for workload planning over the past 3 years. A transparent and robust method of measuring radiologists' workload has been developed, with subjective assessments kept to a minimum. The technique has value for daily workload and longer term planning. It could be adapted for widespread use. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.
Chiorri, Carlo; Garbarino, Sergio; Bracco, Fabrizio; Magnavita, Nicola
2015-01-01
Previous research has suggested that personality traits of the Five Factor Model play a role in worker's response to workload. The aim of this study was to investigate the association of personality traits of first responders with their perceived workload in real-life tasks. A flying column of 269 police officers completed a measure of subjective workload (NASA-Task Load Index) after intervention tasks in a major public event. Officers' scores on a measure of Five Factor Model personality traits were obtained from archival data. Linear Mixed Modeling was used to test the direct and interaction effects of personality traits on workload scores once controlling for background variables, task type and workload source (mental, temporal and physical demand of the task, perceived effort, dissatisfaction for the performance and frustration due to the task). All personality traits except extraversion significantly interacted at least with one workload source. Perceived workload in flying column police officers appears to be the result of their personality characteristics interacting with the workload source. The implications of these results for the development of support measures aimed at reducing the impact of workload in this category of workers are discussed. PMID:26640456
The Effects of Automation on Battle Manager Workload and Performance
2008-01-01
such as the National Aeronautics and Space Administration (NASA) Task Load Index (TLX) (Hart & Staveland, 1988), the Subjective Workload Assessment...Factor Metric Experience Demographic questionnaire Stress level NASA TLX SWAT Assessment Observer reports Confidence Logged performance data...Mahwah, New Jersey: Lawrence Erlbaum Associates. Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of
Workload Management Strategies for Online Educators
ERIC Educational Resources Information Center
Crews, Tena B.; Wilkinson, Kelly; Hemby, K. Virginia; McCannon, Melinda; Wiedmaier, Cheryl
2008-01-01
With increased use of online education, both students and instructors are adapting to the online environment. Online educators must adjust to the change in responsibilities required to teach online, as it is quite intensive during the designing, teaching, and revising stages. The purpose of this study is to examine and update workload management…
Pilot workload and fatigue: A critical survey of concepts and assessment techniques
NASA Technical Reports Server (NTRS)
Gartner, W. B.; Murphy, M. R.
1976-01-01
The principal unresolved issues in conceptualizing and measuring pilot workload and fatigue are discussed. These issues are seen as limiting the development of more useful working concepts and techniques and their application to systems engineering and management activities. A conceptual analysis of pilot workload and fatigue, an overview and critique of approaches to the assessment of these phenomena, and a discussion of current trends in the management of unwanted workload and fatigue effects are presented. Refinements and innovations in assessment methods are recommended for enhancing the practical significance of workload and fatigue studies.
Dynamic file-access characteristics of a production parallel scientific workload
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1994-01-01
Multiprocessors have permitted astounding increases in computational performance, but many cannot meet the intense I/O requirements of some scientific applications. An important component of any solution to this I/O bottleneck is a parallel file system that can provide high-bandwidth access to tremendous amounts of data in parallel to hundreds or thousands of processors. Most successful systems are based on a solid understanding of the expected workload, but thus far there have been no comprehensive workload characterizations of multiprocessor file systems. This paper presents the results of a three-week tracing study in which all file-related activity on a massively parallel computer was recorded. Our instrumentation differs from previous efforts in that it collects information about every I/O request and about the mix of jobs running in a production environment. We also present the results of a trace-driven caching simulation and recommendations for designers of multiprocessor file systems.
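As an illustration of what a trace-driven caching simulation involves, here is a toy replay of a block-access trace through an LRU cache; the synthetic trace and the single-level LRU policy are simplifying assumptions, not the paper's instrumentation or simulator.

```python
# Toy trace-driven caching simulation: replay a block-access trace through an
# LRU cache and report the hit ratio for several cache sizes.
from collections import OrderedDict

def lru_hit_ratio(trace, capacity):
    cache, hits = OrderedDict(), 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)       # mark as most recently used
        else:
            cache[block] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used block
    return hits / len(trace)

if __name__ == "__main__":
    import random
    random.seed(1)
    trace = [random.randint(0, 499) for _ in range(10_000)]
    for cap in (32, 128, 512):
        print(f"capacity={cap:4d}  hit ratio={lru_hit_ratio(trace, cap):.3f}")
```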
Student Workload and Assessment: Strategies to Manage Expectations and Inform Curriculum Development
ERIC Educational Resources Information Center
Scully, Glennda; Kerr, Rosemary
2014-01-01
This study reports the results of a survey of student study times and perceptions of workload in undergraduate and graduate accounting courses at a large Australian public university. The study was in response to student feedback expressing concerns about workload in courses. The presage factors of student workload and assessment in Biggs' 3P…
Namaganda, Grace; Oketcho, Vincent; Maniple, Everd; Viadro, Claire
2015-08-31
Uganda's health workforce is characterized by shortages and inequitable distribution of qualified health workers. To ascertain staffing levels, Uganda uses fixed government-approved norms determined by facility type. This approach cannot distinguish between facilities of the same type that have different staffing needs. The Workload Indicators of Staffing Need (WISN) method uses workload to determine number and type of staff required in a given facility. The national WISN assessment sought to demonstrate the limitations of the existing norms and generate evidence to influence health unit staffing and staff deployment for efficient utilization of available scarce human resources. A national WISN assessment (September 2012) used purposive sampling to select 136 public health facilities in 33/112 districts. The study examined staffing requirements for five cadres (nursing assistants, nurses, midwives, clinical officers, doctors) at health centres II (n = 59), III (n = 53) and IV (n = 13) and hospitals (n = 11). Using health management information system workload data (1 July 2010-30 June 2011), the study compared current and required staff, assessed workload pressure and evaluated the adequacy of the existing staffing norms. By the WISN method, all three types of health centres had fewer nurses (42-70%) and midwives (53-67%) than required and consequently exhibited high workload pressure (30-58%) for those cadres. Health centres IV and hospitals lacked doctors (39-42%) but were adequately staffed with clinical officers. All facilities displayed overstaffing of nursing assistants. For all cadres at health centres III and IV other than nursing assistants, the fixed norms or existing staffing or both fell short of the WISN staffing requirements, with, for example, only half as many nurses and midwives as required. The WISN results demonstrate the inadequacies of existing staffing norms, particularly for health centres III and IV. The results provide an evidence base to
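The WISN arithmetic behind these comparisons can be sketched roughly as follows: required staff is total annual service workload divided by one worker's annual available hours, and workload pressure reflects the shortfall. This is a simplified reading that omits the WHO method's allowance factors, and the numbers are hypothetical rather than the Ugandan assessment data.

```python
# Simplified sketch of the WISN arithmetic (illustrative only; the full WHO
# method also includes category and individual allowance factors, omitted here).
# Numbers below are hypothetical, not the Ugandan assessment data.

available_hours_per_year = 1688            # annual working time of one staff member
services = {
    # service: (annual volume at the facility, hours per service)
    "antenatal visit": (12000, 0.25),
    "normal delivery": (1800, 3.0),
    "outpatient consultation": (30000, 0.1),
}

required = sum(volume * hours for volume, hours in services.values()) / available_hours_per_year
current_staff = 6
shortage = max(required - current_staff, 0)
workload_pressure = 100 * shortage / required if required else 0.0

print(f"WISN required staff : {required:.1f}")
print(f"current staff       : {current_staff}")
print(f"workload pressure   : {workload_pressure:.0f}%")
```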
CompatPM: enabling energy efficient multimedia workloads for distributed mobile platforms
NASA Astrophysics Data System (ADS)
Nathuji, Ripal; O'Hara, Keith J.; Schwan, Karsten; Balch, Tucker
2007-01-01
The computation and communication abilities of modern platforms are enabling increasingly capable cooperative distributed mobile systems. An example is distributed multimedia processing of sensor data in robots deployed for search and rescue, where a system manager can exploit the application's cooperative nature to optimize the distribution of roles and tasks in order to successfully accomplish the mission. Because of limited battery capacities, a critical task a manager must perform is online energy management. While support for power management has become common for the components that populate mobile platforms, what is lacking is integration and explicit coordination across the different management actions performed in a variety of system layers. This paper develops an integration approach for distributed multimedia applications, where a global manager specifies both a power operating point and a workload for a node to execute. Surprisingly, when jointly considering power and QoS, experimental evaluations show that using a simple deadline-driven approach to assigning frequencies can be non-optimal. These trends are further affected by certain characteristics of underlying power management mechanisms, which, in our research, are identified as groupings that classify component power management as "compatible" (VFC) or "incompatible" (VFI) with voltage and frequency scaling. We build on these findings to develop CompatPM, a vertically integrated control strategy for power management in distributed mobile systems. Experimental evaluations of CompatPM indicate average energy improvements of 8% when platform resources are managed jointly rather than independently, demonstrating that previous attempts to maximize battery life by simply minimizing frequency are inappropriate from a platform-level perspective.
Driving Ms. Data: Creating Data-Driven Possibilities
ERIC Educational Resources Information Center
Hoffman, Richard
2005-01-01
This article describes how data-driven Web sites help schools and districts maximize their IT resources by making online content more "self-service" for users. It shows how to set up the capacity to create data-driven sites. By definition, a data-driven Web site is one in which the content comes from some back-end data source, such as a…
Generating Shifting Workloads to Benchmark Adaptability in Relational Database Systems
NASA Astrophysics Data System (ADS)
Rabl, Tilmann; Lang, Andreas; Hackl, Thomas; Sick, Bernhard; Kosch, Harald
A large body of research concerns the adaptability of database systems. Many commercial systems already contain autonomic processes that adapt configurations as well as data structures and data organization. Yet there is virtually no possibility for a fair measurement of the quality of such optimizations. While standard benchmarks have been developed that simulate real-world database applications very precisely, none of them considers variations in workloads produced by human factors. Today's benchmarks test the performance of database systems by measuring peak performance on homogeneous request streams. Nevertheless, in systems with user interaction, access patterns are constantly shifting. We present a benchmark that simulates a web information system with interaction of large user groups. It is based on the analysis of a real online eLearning management system with 15,000 users. The benchmark considers the temporal dependency of user interaction. The main focus is to measure the adaptability of a database management system according to shifting workloads. We will give details on our design approach that uses sophisticated pattern analysis and data mining techniques.
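One simple way to produce such shifting workloads is to let the request mix drift with simulated time of day; the sketch below does this with invented request types and weights, as an illustration of the idea rather than the benchmark's actual generator.

```python
# Sketch of a shifting-workload generator: the mix of request types drifts over
# simulated time of day, so a DBMS under test sees gradually changing access
# patterns rather than a homogeneous peak-load stream.
import random

REQUEST_TYPES = ["browse_course", "submit_quiz", "download_material", "admin_report"]

def type_weights(hour):
    """Hour-dependent mix, loosely imitating eLearning usage shifts."""
    if 8 <= hour < 12:
        return [0.5, 0.1, 0.3, 0.1]
    if 12 <= hour < 18:
        return [0.3, 0.4, 0.2, 0.1]
    if 18 <= hour < 23:
        return [0.2, 0.5, 0.3, 0.0]
    return [0.1, 0.05, 0.05, 0.8]   # night: mostly batch/admin work

def generate(requests_per_hour=100, seed=42):
    rng = random.Random(seed)
    for hour in range(24):
        for _ in range(requests_per_hour):
            yield hour, rng.choices(REQUEST_TYPES, weights=type_weights(hour))[0]

if __name__ == "__main__":
    from collections import Counter
    per_hour = Counter((h, t) for h, t in generate())
    print(per_hour[(9, "browse_course")], per_hour[(20, "submit_quiz")])
```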
Relationship between mental workload and musculoskeletal disorders among Alzahra Hospital nurses
Habibi, Ehsanollah; Taheri, Mohamad Reza; Hasanzadeh, Akbar
2015-01-01
Background: Musculoskeletal disorders (MSDs) are a serious problem among the nursing staff. Mental workload is the major cause of MSDs among nursing staff. The aim of this study was to investigate the mental workload dimensions and their association with MSDs among nurses of Alzahra Hospital, affiliated to Isfahan University of Medical Sciences. Materials and Methods: This descriptive cross-sectional study was conducted on 247 randomly selected nurses who worked in the Alzahra Hospital in Isfahan, Iran in the summer of 2013. The Persian versions of the National Aeronautics and Space Administration Task Load Index (NASA-TLX) questionnaire (measuring mental workload) and the Cornell Musculoskeletal Discomfort Questionnaire (CMDQ) were used for data collection. Data were collected and analyzed with Pearson and Spearman correlation coefficient tests in SPSS 20. Results: Pearson and Spearman correlation tests showed a significant association between the nurses' MSDs and the workload dimensions of frustration, total workload, temporal demand, effort, and physical demand (r = 0.304, 0.277, 0.277, 0.216, and 0.211, respectively). However, there was no significant association between the nurses' MSDs and the workload dimensions of performance and mental demand (P > 0.05). Conclusions: The nurses' frustration had a direct correlation with MSDs. This shows that stress is an inseparable component of the hospital workplace. Thus, reduction of stress in the nursing workplace should be one of the main priorities of hospital managers. PMID:25709683
Psychophysical workload in the operating room: primary surgeon versus assistant.
Rieger, Annika; Fenger, Sebastian; Neubert, Sebastian; Weippert, Matthias; Kreuzfeld, Steffi; Stoll, Regina
2015-07-01
Working in the operating room is characterized by high demands and overall workload of the surgical team. Surgeons often report that they feel more stressed when operating as a primary surgeon than in the function as an assistant which has been confirmed in recent studies. In this study, intra-individual workload was assessed in both intraoperative functions using a multidimensional approach that combined objective and subjective measures in a realistic work setting. Surgeons' intraoperative psychophysiologic workload was assessed through a mobile health system. 25 surgeons agreed to take part in the 24-hour monitoring by giving their written informed consent. The mobile health system contained a sensor electronic module integrated in a chest belt and measuring physiological parameters such as heart rate (HR), breathing rate (BR), and skin temperature. Subjective workload was assessed pre- and postoperatively using an electronic version of the NASA-TLX on a smartphone. The smartphone served as a communication unit and transferred objective and subjective measures to a communication server where data were stored and analyzed. Working as a primary surgeon did not result in higher workload. Neither NASA-TLX ratings nor physiological workload indicators were related to intraoperative function. In contrast, length of surgeries had a significant impact on intraoperative physical demands (p < 0.05; η(2) = 0.283), temporal demands (p < 0.05; η(2) = 0.260), effort (p < 0.05; η(2) = 0.287), and NASA-TLX sum score (p < 0.01; η(2) = 0.287). Intra-individual workload differences do not relate to intraoperative role of surgeons when length of surgery is considered as covariate. An intelligent operating management that considers the length of surgeries by implementing short breaks could contribute to the optimization of intraoperative workload and the preservation of surgeons' health, respectively. The value of mobile health systems for continuous psychophysiologic workload
Flight Crew Task Management in Non-Normal Situations
NASA Technical Reports Server (NTRS)
Schutte, Paul C.; Trujillo, Anna C.
1996-01-01
Task management (TM) is always performed on the flight deck, although not always explicitly, consistently, or rigorously. Nowhere is TM as important as it is in dealing with non-normal situations. The objective of this study was to analyze pilot TM behavior for non-normal situations. Specifically, the study observed pilots' performance in a full-workload environment in order to discern their TM strategies. This study identified four different TM prioritization and allocation strategies: Aviate-Navigate-Communicate-Manage Systems; Perceived Severity; Procedure Based; and Event/Interrupt Driven. Subjects used these strategies to manage their personal workload and to schedule monitoring and assessment of the situation. The Perceived Severity strategy for personal workload management combined with the Aviate-Navigate-Communicate-Manage Systems strategy for monitoring and assessing appeared to be the most effective (fewest errors and fastest response times) in responding to the novel system failure used in this study.
Crew workload strategies in advanced cockpits
NASA Technical Reports Server (NTRS)
Hart, Sandra G.
1990-01-01
Many methods of measuring and predicting operator workload have been developed that provide useful information in the design, evaluation, and operation of complex systems and that aid in developing models of human attention and performance. However, the relationships between such measures, imposed task demands, and measures of performance remain complex and even contradictory. It appears that we have ignored an important factor: people do not passively translate task demands into performance. Rather, they actively manage their time, resources, and effort to achieve an acceptable level of performance while maintaining a comfortable level of workload. While such adaptive, creative, and strategic behaviors are the primary reason that human operators remain an essential component of all advanced man-machine systems, they also result in individual differences in the way people respond to the same task demands and inconsistent relationships among measures. Finally, we are able to measure workload and performance, but interpreting such measures remains difficult; it is still not clear how much workload is too much or too little, nor what the consequences of suboptimal workload are for system performance and the mental, physical, and emotional well-being of the human operators. The rationale and philosophy of a program of research developed to address these issues will be reviewed and contrasted to traditional methods of defining, measuring, and predicting human operator workload. Viewgraphs are given.
Chen, Yuqian; Ke, Yufeng; Meng, Guifang; Jiang, Jin; Qi, Hongzhi; Jiao, Xuejun; Xu, Minpeng; Zhou, Peng; He, Feng; Ming, Dong
2017-12-01
As one of the most important brain-computer interface (BCI) paradigms, P300-Speller was shown to be significantly impaired once applied in practical situations due to effects of mental workload. This study aims to provide a new method of building training models to enhance performance of P300-Speller under mental workload. Three experimental conditions based on the row-column P300-Speller paradigm were performed: speller-only, 3-back-speller and mental-arithmetic-speller. Data from the dual-task conditions were added to the speller-only data to build new training models. Then the performance of classifiers with different models was compared under the same testing condition. The results showed that when the tasks of the imported training data and the testing data were the same, character recognition accuracies and round accuracies of P300-Speller with mixed-data training models significantly improved (FDR, p < 0.005). When they were different, performance significantly improved when tested on mental-arithmetic-speller (FDR, p < 0.05), while the improvement was modest when tested on n-back-speller (FDR, p < 0.1). The analysis of ERPs revealed that the ERP difference between training data and testing data was significantly diminished when the dual-task data were added to the training data (FDR, p < 0.05). The new method of training the classifier on mixed data proved effective in enhancing performance of P300-Speller under mental workload, confirming the feasibility of building a universal training model and overcoming the effects of mental workload in practical applications. Copyright © 2017 Elsevier B.V. All rights reserved.
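A conceptual sketch of the mixed-data training idea follows: augment speller-only trials with dual-task trials before fitting the target/non-target classifier. The random arrays stand in for ERP feature matrices, and LDA is used only as a common P300 classifier choice, not necessarily the one used in the study.

```python
# Mixed-data training sketch: combine speller-only and dual-task (workload)
# trials before fitting the P300 classifier. Arrays are random stand-ins for
# real ERP feature matrices (trials x features).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X_speller, y_speller = rng.normal(size=(300, 64)), rng.integers(0, 2, 300)
X_dual,    y_dual    = rng.normal(size=(300, 64)), rng.integers(0, 2, 300)

# Mixed-data training model: speller-only plus dual-task trials.
X_train = np.vstack([X_speller, X_dual])
y_train = np.concatenate([y_speller, y_dual])

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

X_test, y_test = rng.normal(size=(100, 64)), rng.integers(0, 2, 100)
print(f"accuracy on (random) test data: {clf.score(X_test, y_test):.2f}")
```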
Kang, Chun-Mei; Chiu, Hsiao-Ting; Hu, Yi-Chun; Chen, Hsiao-Lien; Lee, Pi-Hsia; Chang, Wen-Yin
2012-10-01
To assess the level of and the differences in managerial competencies, research capability, time management, executive power, workload and work-stress ratings among nurse administrators (NAs), and to determine the best predictors of managerial competencies for NAs. Although NAs require multifaceted managerial competencies, research related to NAs' managerial competencies is limited. A cross-sectional survey was conducted with 330 NAs from 16 acute care hospitals. Managerial competencies were determined through a self-developed questionnaire. Data were collected in 2011. All NAs gave themselves the highest rating on integrity and the lowest on both financial/budgeting and business acumen. All scores for managerial competencies, research capability, time management and executive power showed a statistically significant correlation. The stepwise regression analysis revealed that age; having received NA training; having completed a nursing project independently; and scores for research capability, executive power and workload could explain 63.2% of the total variance in managerial competencies. The present study provides recommendations for future administrative training programmes to increase NAs' managerial competency in fulfilling their management roles and functions. The findings inform leaders of hospitals where NAs need to develop additional competencies concerning the type of training NAs need to function proficiently. © 2012 Blackwell Publishing Ltd.
Application and Validation of Workload Assessment Techniques
1993-03-01
technical report documents the process and outcome of meeting this objective. Procedure: A series of eight separate studies was conducted using three... development process. The task analysis and simulation technique was shown to have the capability to track empirical workload ratings. More research is... operator workload during the systems acquisition process, and (b) a pamphlet for the managers of Army systems that describes the need and some procedures
Water quality and ecosystem management: Data-driven reality check of effects in streams and lakes
NASA Astrophysics Data System (ADS)
Destouni, Georgia; Fischer, Ida; Prieto, Carmen
2017-08-01
This study investigates nutrient-related water quality conditions and change trends in the first management periods of the EU Water Framework Directive (WFD; since 2009) and Baltic Sea Action Plan (BSAP; since 2007). With mitigation of nutrients in inland waters and their discharges to the Baltic Sea being a common WFD and BSAP target, we use Sweden as a case study of observable effects, by compiling and analyzing all openly available water and nutrient monitoring data across Sweden since 2003. The data compilation reveals that nutrient monitoring covers only around 1% (down to 0.2% for nutrient loads) of the total number of WFD-classified stream and lake water bodies in Sweden. The data analysis further shows that the hydro-climatically driven water discharge dominates the determination of waterborne loads of both total phosphorus and total nitrogen across Sweden. Both water discharge and the related nutrient loads are in turn well correlated with the ecosystem status classification of Swedish water bodies. Nutrient concentrations do not exhibit such correlation and their changes over the study period are on average small, but concentration increases are found for moderate-to-bad status waters, for which both the WFD and the BSAP have instead targeted concentration decreases. In general, these results indicate insufficient distinction and mitigation of human-driven nutrient components in inland waters and their discharges to the sea by the internationally harmonized applications of the WFD and the BSAP. The results call for further comparative investigations of observable large-scale effects of such regulatory/management frameworks in different parts of the world.
Flight Crew Workload Evaluation Based on the Workload Function Distribution Method.
Zheng, Yiyuan; Lu, Yanyu; Jie, Yuwen; Fu, Shan
2017-05-01
The minimum flight crew on the flight deck should be established according to the workload for individual crewmembers. Typical workload measures consist of three types: subjective rating scales, task performance, and psychophysiological measures. However, all these measures have their own limitations. To reflect flight crew workload more specifically and comprehensively within the flight environment, and to comply more directly with airworthiness regulations, the Workload Function Distribution Method, which combines the six basic workload functions, was proposed. The analysis was based on conditions defined by the number of workload functions performed. Each condition was analyzed from two aspects: overall proportion and effective proportion. Three types of approach tasks were used in this study and the NASA-TLX scale was implemented for comparison. Neither the one-function nor the two-function condition produced results consistent with NASA-TLX, whereas both the three-function and the four- to six-function conditions did. Further, within the four- to six-function conditions the two aspects differed: the overall proportion showed no significant differences, while the effective proportions did. The results show that conditions with one or two functions seemed to have no influence on workload, while executing three functions or four to six functions had an impact on workload. In addition, effective proportions of workload functions indicated workload more precisely than overall proportions, especially in conditions with multiple functions. Zheng Y, Lu Y, Jie Y, Fu S. Flight crew workload evaluation based on the workload function distribution method. Aerosp Med Hum Perform. 2017; 88(5):481-486.
Data-driven Science in Geochemistry & Petrology: Vision & Reality
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Ghiorso, M. S.; Spear, F. S.
2013-12-01
Science in many fields is increasingly 'data-driven'. Though referred to as a 'new' Fourth Paradigm (Hey, 2009), data-driven science is not new, and examples are cited in the Geochemical Society's data policy, including the compilation of Dziewonski & Anderson (1981) that led to PREM, and Zindler & Hart (1986), who compiled mantle isotope data to present for the first time a comprehensive view of the Earth's mantle. Today, rapidly growing data volumes, ubiquity of data access, and new computational and information management technologies enable data-driven science at a radically advanced scale of speed, extent, flexibility, and inclusiveness, with the ability to seamlessly synthesize observations, experiments, theory, and computation, and to statistically mine data across disciplines, leading to more comprehensive, well informed, and high impact scientific advances. Are geochemists, petrologists, and volcanologists ready to participate in this revolution of the scientific process? In the past year, researchers from the VGP community and related disciplines have come together at several cyberinfrastructure related workshops, in part prompted by the EarthCube initiative of the US NSF, to evaluate the status of cyberinfrastructure in their field, to put forth key scientific challenges, and identify primary data and software needs to address these. Science scenarios developed by workshop participants that range from non-equilibrium experiments focusing on mass transport, chemical reactions, and phase transformations (J. Hammer) to defining the abundance of elements and isotopes in every voxel in the Earth (W. McDonough), demonstrate the potential of cyberinfrastructure enabled science, and define the vision of how data access, visualization, analysis, computation, and cross-domain interoperability can and should support future research in VGP. The primary obstacle for data-driven science in VGP remains the dearth of accessible, integrated data from lab and sensor
Predicting the Consequences of Workload Management Strategies with Human Performance Modeling
NASA Technical Reports Server (NTRS)
Mitchell, Diane Kuhl; Samma, Charneta
2011-01-01
Human performance modelers at the US Army Research Laboratory have developed an approach for establishing Soldier high-workload conditions that can be used for analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the control and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools or conducting experiments or field tests can use the same approach.
Burnout and Workload Among Health Care Workers: The Moderating Role of Job Control
Portoghese, Igor; Galletta, Maura; Coppola, Rosa Cristina; Finco, Gabriele; Campagna, Marcello
2014-01-01
Background As health care workers face a wide range of psychosocial stressors, they are at a high risk of developing burnout syndrome, which in turn may affect hospital outcomes such as the quality and safety of provided care. The purpose of the present study was to investigate the moderating effect of job control on the relationship between workload and burnout. Methods A total of 352 hospital workers from five Italian public hospitals completed a self-administered questionnaire that was used to measure exhaustion, cynicism, job control, and workload. Data were collected in 2013. Results In contrast to previous studies, the results of this study supported the moderation effect of job control on the relationship between workload and exhaustion. Furthermore, the results found support for the sequential link from exhaustion to cynicism. Conclusion This study showed the importance for hospital managers to carry out management practices that promote job control and provide employees with job resources, in order to reduce the burnout risk. PMID:25379330
Advances in Electrically Driven Thermal Management
NASA Technical Reports Server (NTRS)
Didion, Jeffrey R.
2017-01-01
Electrically Driven Thermal Management is a vibrant technology development initiative incorporating ISS based technology demonstrations, development of innovative fluid management techniques and fundamental research efforts. The program emphasizes high temperature high heat flux thermal management required for future generations of RF electronics and power electronic devices. This presentation reviews i.) preliminary results from the Electrohydrodynamic (EHD) Long Term Flight Demonstration launched on STP-H5 payload in February 2017 ii.) advances in liquid phase flow distribution control iii.) development of the Electrically Driven Liquid Film Boiling Experiment under the NASA Microgravity Fluid Physics Program.
Workloads in Australian emergency departments a descriptive study.
Lyneham, Joy; Cloughessy, Liz; Martin, Valmai
2008-07-01
This study aimed to identify the current workload of clinical nurses, managers and educators in Australian emergency departments according to the classification of the department. Additionally, the relationship of experienced to inexperienced clinical staff was examined. A descriptive research method was used, with a survey distributed to 394 Australian emergency departments (21% response rate). Nursing workloads were calculated and a ratio of nurse to patient was established. The ratios included nurse to patient, manager to clinical staff, and educator to clinical staff. Additionally, the percentage of junior to senior clinical staff was also calculated. Across all categories of emergency departments the mean nurse:patient ratios were 1:15 (am shift), 1:7 (pm shift) and 1:4 (night shift). During this period an average of 17.1% of attendances were admitted to hospital. There were 27 staff members for each manager and 23.3 clinical staff for each educator. The percentage of junior staff rostered ranged from 10% to 38%. Emergency nurses cannot work under such pressure, as it may compromise the care given to patients and consequently have a negative effect on the nurse personally. However, emergency nurses are dynamically adjusting to the workload. Such conditions as described in this study could give rise to burnout and attrition of experienced emergency nurses as they cannot resolve the conflict between workload and providing quality nursing care.
Associations between attending physician workload, teaching effectiveness, and patient safety.
Wingo, Majken T; Halvorsen, Andrew J; Beckman, Thomas J; Johnson, Matthew G; Reed, Darcy A
2016-03-01
Prior studies suggest that high workload among attending physicians may be associated with reduced teaching effectiveness and poor patient outcomes, but these relationships have not been investigated using objective measures of workload and safety. To examine associations between attending workload, teaching effectiveness, and patient safety, hypothesizing that higher workload would be associated with lower teaching effectiveness and negative patient outcomes. We conducted a retrospective study of 69,386 teaching evaluation items submitted by 543 internal medicine residents for 107 attending physicians who supervised inpatient teaching services from July 2, 2005 to July 1, 2011. Attending workload measures included hospital service census, patient length of stay, daily admissions, daily discharges, and concurrent outpatient duties. Teaching effectiveness was measured using residents' evaluations of attendings. Patient outcomes considered were applicable patient safety indicators (PSIs), intensive care unit transfers, cardiopulmonary resuscitation/rapid response team calls, and patient deaths. Mixed linear models and generalized linear regression models were used for statistical analysis. Workload measures of midnight census and daily discharges were associated with lower teaching evaluation scores (both β = -0.026, P < 0.0001). The number of daily admissions was associated with higher teaching scores (β = 0.021, P = 0.001) and increased PSIs (odds ratio = 1.81, P = 0.0001). Several measures of attending physician workload were associated with slightly lower teaching effectiveness, and patient safety may be compromised when teams are managing new admissions. Ongoing efforts by residency programs to optimize the learning environment should include strategies to manage the workload of supervising attendings. © 2016 Society of Hospital Medicine.
NASA Astrophysics Data System (ADS)
Destouni, G.
2017-12-01
Measures for mitigating nutrient loads to aquatic ecosystems should have observable effects, e.g., in the Baltic region after the joint first periods of nutrient management actions under the Baltic Sea Action Plan (BSAP; since 2007) and the EU Water Framework Directive (WFD; since 2009). Looking for such observable effects, all openly available water and nutrient monitoring data since 2003 are compiled and analyzed for Sweden as a case study. Results show that hydro-climatically driven water discharge dominates the determination of waterborne loads of both phosphorus and nitrogen. Furthermore, the nutrient loads and water discharge are all similarly well correlated with the ecosystem status classification of Swedish water bodies according to the WFD. Nutrient concentrations, which are hydro-climatically correlated and should thus reflect human effects better than loads, have changed only slightly over the study period (2003-2013) and even increased in moderate-to-bad status waters, where the WFD and BSAP jointly target nutrient decreases. These results indicate insufficient distinction and mitigation of human-driven nutrient components by the internationally harmonized applications of both the WFD and the BSAP. Aiming for better general identification of such components, nutrient data for the large transboundary catchments of the Baltic Sea and the Sava River are compared. The comparison shows cross-regional consistency in nutrient relationships to driving hydro-climatic conditions (water discharge) for nutrient loads, and socio-economic conditions (population density and farmland share) for nutrient concentrations. A data-driven screening methodology is further developed for estimating nutrient input and retention-delivery in catchments. Its first application to nested Sava River catchments identifies characteristic regional values of nutrient input per area and relative delivery, and hotspots of much larger inputs, related to urban high-population areas.
A metadata-driven approach to data repository design.
Harvey, Matthew J; McLean, Andrew; Rzepa, Henry S
2017-01-01
The design and use of a metadata-driven data repository for research data management is described. Metadata is collected automatically during the submission process whenever possible and is registered with DataCite in accordance with their current metadata schema, in exchange for a persistent digital object identifier. Two examples of data preview are illustrated, including the demonstration of a method for integration with commercial software that confers rich domain-specific data analytics without introducing customisation into the repository itself.
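As a rough illustration of this metadata-first flow, the sketch below gathers a minimal metadata record at deposit time and posts it to a registration endpoint in exchange for an identifier. The payload loosely follows the shape of DataCite-style metadata, and REGISTRY_URL and API_TOKEN are hypothetical placeholders, not the repository's real API or the actual DataCite endpoint.

```python
# Sketch of a metadata-first submission: build a minimal record at deposit time
# and register it in exchange for a persistent identifier. Endpoint and token
# below are hypothetical placeholders.
import json
import urllib.request

REGISTRY_URL = "https://registry.example.org/dois"   # hypothetical
API_TOKEN = "changeme"                                # hypothetical

def register_dataset(title, creators, publisher, year):
    record = {
        "titles": [{"title": title}],
        "creators": [{"name": c} for c in creators],
        "publisher": publisher,
        "publicationYear": year,
        "resourceTypeGeneral": "Dataset",
    }
    req = urllib.request.Request(
        REGISTRY_URL,
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {API_TOKEN}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:      # returns the minted identifier
        return json.load(resp).get("doi")

# Example (not executed here because the endpoint is a placeholder):
# register_dataset("NMR spectra of compound 42", ["Doe, Jane"], "Example Lab", 2017)
```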
qPortal: A platform for data-driven biomedical research.
Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven
2018-01-01
Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on
Nursing workload in the acute-care setting: A concept analysis of nursing workload.
Swiger, Pauline A; Vance, David E; Patrician, Patricia A
2016-01-01
A pressing need in the field of nursing is the identification of optimal staffing levels to ensure patient safety. Effective staffing requires comprehensive measurement of nursing workload to determine staffing needs. Issues surrounding nursing workload are complex, and the volume of workload is growing; however, many workload systems do not consider the numerous workload factors that impact nursing today. The purpose of this concept analysis was to better understand and define nursing workload as it relates to the acute-care setting. Rogers' evolutionary method was used for this literature-based concept analysis. Nursing workload is influenced by more than patient care. The proposed definition of nursing workload may help leaders identify workload that is unnoticed and unmeasured. These findings could help leaders consider and identify workload that is unnecessary, redundant, or more appropriate for assignment to other members of the health care team. Published by Elsevier Inc.
Assessing Continuous Operator Workload With a Hybrid Scaffolded Neuroergonomic Modeling Approach.
Borghetti, Brett J; Giametta, Joseph J; Rusnock, Christina F
2017-02-01
We aimed to predict operator workload from neurological data using statistical learning methods to fit neurological-to-state-assessment models. Adaptive systems require real-time mental workload assessment to perform dynamic task allocations or operator augmentation as workload issues arise. Neuroergonomic measures have great potential for informing adaptive systems, and we combine these measures with models of task demand as well as information about critical events and performance to clarify the inherent ambiguity of interpretation. We use machine learning algorithms on electroencephalogram (EEG) input to infer operator workload based upon Improved Performance Research Integration Tool workload model estimates. Cross-participant models predict workload of other participants, statistically distinguishing between 62% of the workload changes. Machine learning models trained from Monte Carlo resampled workload profiles can be used in place of deterministic workload profiles for cross-participant modeling without incurring a significant decrease in machine learning model performance, suggesting that stochastic models can be used when limited training data are available. We employed a novel temporary scaffold of simulation-generated workload profile truth data during the model-fitting process. A continuous workload profile serves as the target to train our statistical machine learning models. Once trained, the workload profile scaffolding is removed and the trained model is used directly on neurophysiological data in future operator state assessments. These modeling techniques demonstrate how to use neuroergonomic methods to develop operator state assessments, which can be employed in adaptive systems.
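The cross-participant idea can be illustrated with a leave-one-participant-out evaluation: fit on all but one person's feature windows and test on the held-out person. The simulated EEG band-power features, the IMPRINT-style workload labels, and the random-forest classifier below are stand-in assumptions, not the authors' actual pipeline.

```python
# Illustrative cross-participant workload model: train on some participants'
# EEG features, evaluate on the held-out participant.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_participants, windows_each, n_features = 8, 120, 30
X = rng.normal(size=(n_participants * windows_each, n_features))   # EEG band-power features
y = rng.integers(0, 2, size=len(X))                                 # low/high workload label
groups = np.repeat(np.arange(n_participants), windows_each)         # participant id per window

scores = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                         X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean cross-participant accuracy: {scores.mean():.2f}")
```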
Workload of the VTS Sector Operator And Implications for Task Design
DOT National Transportation Integrated Search
1994-12-01
This study identifies the factors determining the VTS sector operator's workload and recommends : the most appropriate use of automation to manage that workload. Investigations were conducted at : VTS New York (Governors Island) and at VTS Puget Soun...
Bowen, Laura; Gross, Aleksander Stefan; Gimpel, Mo; Li, François-Xavier
2017-01-01
Aim The purpose of this study was to investigate the relationship between physical workload and injury risk in elite youth football players. Methods The workload data and injury incidence of 32 players were monitored throughout 2 seasons. Multiple regression was used to compare cumulative (1, 2, 3 and 4-weekly) loads and acute:chronic (A:C) workload ratios (acute workload divided by chronic workload) between injured and non-injured players for specific GPS and accelerometer-derived variables: total distance (TD), high-speed distance (HSD), accelerations (ACC) and total load. Workloads were classified into discrete ranges by z-scores and the relative risk was determined. Results A very high number of ACC (≥9254) over 3 weeks was associated with the highest significant overall (relative risk (RR)=3.84) and non-contact injury risk (RR=5.11). Non-contact injury risk was significantly increased when a high acute HSD was combined with low chronic HSD (RR=2.55), but not with high chronic HSD (RR=0.47). Contact injury risk was greatest when A:C TD and ACC ratios were very high (1.76 and 1.77, respectively) (RR=4.98). Conclusions In general, higher accumulated and acute workloads were associated with a greater injury risk. However, progressive increases in chronic workload may develop the players' physical tolerance to higher acute loads and resilience to injury risk. PMID:27450360
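The acute:chronic ratio itself is simple to compute: divide the most recent week's cumulative load by the average weekly load over the preceding four weeks. The sketch below does this for a simulated daily total-distance series; the data and thresholds are illustrative, not the study's.

```python
# Acute:chronic workload ratio for a GPS-derived variable: 1-week (acute) load
# divided by the 4-week rolling average weekly (chronic) load. Simulated data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2016-07-01", periods=120, freq="D")
total_distance = pd.Series(rng.normal(5500, 1200, len(days)).clip(min=0), index=days)

acute = total_distance.rolling("7D").sum()         # 1-week cumulative load
chronic = total_distance.rolling("28D").sum() / 4  # average weekly load over 4 weeks
ac_ratio = acute / chronic

print(ac_ratio.tail())
print("days with A:C ratio above 1.5:", int((ac_ratio > 1.5).sum()))
```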
Zhang, Yiye; Padman, Rema
2017-01-01
Patients with multiple chronic conditions (MCC) pose an increasingly complex health management challenge worldwide, particularly due to the significant gap in our understanding of how to provide coordinated care. Drawing on our prior research on learning data-driven clinical pathways from actual practice data, this paper describes a prototype, interactive platform for visualizing the pathways of MCC to support shared decision making. Created using Python web framework, JavaScript library and our clinical pathway learning algorithm, the visualization platform allows clinicians and patients to learn the dominant patterns of co-progression of multiple clinical events from their own data, and interactively explore and interpret the pathways. We demonstrate functionalities of the platform using a cluster of 36 patients, identified from a dataset of 1,084 patients, who are diagnosed with at least chronic kidney disease, hypertension, and diabetes. Future evaluation studies will explore the use of this platform to better understand and manage MCC.
Resources–tasks imbalance: Experiences of nurses from factors influencing workload to increase
Khademi, Mojgan; Mohammadi, Easa; Vanaki, Zohreh
2015-01-01
Background: While nursing workload is a worldwide challenge, less attention has been given to its determining factors. Understanding these factors is important and could help nursing managers to provide a suitable working environment and to manage the adverse outcomes of nursing workload. The aim of this study was to discover nurses' experiences of the determinant factors of their workload. Materials and Methods: In this qualitative study, the participants included 15 nurses working in two hospitals in Tehran, Iran. The data were collected through 26 unstructured interviews and were analyzed using conventional content analysis. Rigour was ensured through prolonged engagement, maximum variation sampling, member checking, and an audit trail. Results: Resource–task imbalance was the main theme of nurses' experiences. It means that the resources necessary to meet patients' needs were out of balance with the expectations and responsibilities placed on nurses. Resource–task imbalance included lack of resources, assignment without preparation, assignment of non-care tasks, and patients' and families' needs/expectations. Conclusions: A deep and comprehensive imbalance between resources and tasks and expectations was perceived by the participants to be the main source of work overload. Greater attention by managers to resource allocation, education of a quality workforce, and job descriptions is necessary. PMID:26257804
Commissioning the CERN IT Agile Infrastructure with experiment workloads
NASA Astrophysics Data System (ADS)
Medrano Llamas, Ramón; Harald Barreiro Megino, Fernando; Kucharczyk, Katarzyna; Kamil Denis, Marek; Cinquilli, Mattia
2014-06-01
In order to ease the management of their infrastructure, most of the WLCG sites are adopting cloud-based strategies. In the case of CERN, the Tier 0 of the WLCG, the resource and configuration management of the computing center is being completely restructured under the codename Agile Infrastructure. Its goal is to manage 15,000 Virtual Machines by means of an OpenStack middleware in order to unify all the resources in CERN's two data centers: the one in Meyrin and the new one in Wigner, Hungary. During the commissioning of this infrastructure, CERN IT is offering an attractive amount of computing resources to the experiments (800 cores for ATLAS and CMS) through a private cloud interface. ATLAS and CMS have joined forces to exploit them by running stress tests and simulation workloads since November 2012. This work will describe the experience of the first deployments of the current experiment workloads on the CERN private cloud testbed. The paper is organized as follows: the first section will explain the integration of the experiment workload management systems (WMS) with the cloud resources. The second section will revisit the performance and stress testing performed with HammerCloud in order to evaluate and compare the suitability for the experiment workloads. The third section will go deeper into the dynamic provisioning techniques, such as the use of the cloud APIs directly by the WMS. The paper finishes with a review of the conclusions and the challenges ahead.
NASA Technical Reports Server (NTRS)
Comstock, James R., Jr.; Baxley, Brian T.; Norman, Robert M.; Ellis, Kyle K. E.; Adams, Cathy A.; Latorella, Kara A.; Lynn, William A.
2010-01-01
This paper, to accompany a discussion panel, describes a collaborative FAA and NASA research study to determine the effect Data Communications (Data Comm) messages have on flight crew workload and eye scanning behavior in busy terminal area operations. In the Next Generation Air Transportation System Concept of Operations, for the period 2017-2022, the FAA envisions Data Comm between controllers and the flight crew to become the primary means of communicating non-time critical information. Four research conditions were defined that span current day to future equipage levels (Voice with Paper map, Data Comm with Paper map, Data Comm with Moving Map Display with ownship position displayed, Data Comm with Moving Map, ownship and taxi route displayed), and were used to create arrival and departure scenarios at Boston Logan Airport. Preliminary results for workload, situation awareness, and pilot head-up time are presented here. Questionnaire data indicated that pilot acceptability, workload, and situation awareness ratings were favorable for all of the conditions tested. Pilots did indicate that there were times during final approach and landing when they would prefer not to hear the message chime, and would not be able to make a quick response due to high priority tasks in the cockpit.
Managing business compliance using model-driven security management
NASA Astrophysics Data System (ADS)
Lang, Ulrich; Schreiner, Rudolf
Compliance with regulatory and governance standards is rapidly becoming one of the hot topics of information security today. This is because, especially with regulatory compliance, both business and government have to expect large financial and reputational losses if compliance cannot be ensured and demonstrated. One major difficulty of implementing such regulations is caused by the fact that they are captured at a high level of abstraction that is business-centric and not IT-centric. This means that the abstract intent needs to be translated in a trustworthy, traceable way into compliance and security policies that the IT security infrastructure can enforce. Carrying out this mapping process manually is time-consuming, maintenance-intensive, costly, and error-prone. Compliance monitoring is also critical in order to be able to demonstrate compliance at any given point in time. The problem is further complicated because of the need for business-driven IT agility, where IT policies and enforcement can change frequently, e.g. Business Process Modelling (BPM) driven Service Oriented Architecture (SOA). Model Driven Security (MDS) is an innovative technology approach that can solve these problems as an extension of identity and access management (IAM) and authorization management (also called entitlement management). In this paper we will illustrate the theory behind Model Driven Security for compliance, provide an improved and extended architecture, as well as a case study in the healthcare industry using our OpenPMF 2.0 technology.
glideinWMS—a generic pilot-based workload management system
NASA Astrophysics Data System (ADS)
Sfiligoi, I.
2008-07-01
The Grid resources are distributed among hundreds of independent Grid sites, requiring a higher level Workload Management System (WMS) to be used efficiently. Pilot jobs have been used for this purpose by many communities, bringing increased reliability, global fair share and just in time resource matching. glideinWMS is a WMS based on the Condor glidein concept, i.e. a regular Condor pool, with the Condor daemons (startds) being started by pilot jobs, and real jobs being vanilla, standard or MPI universe jobs. The glideinWMS is composed of a set of Glidein Factories, handling the submission of pilot jobs to a set of Grid sites, and a set of VO Frontends, requesting pilot submission based on the status of user jobs. This paper contains the structural overview of glideinWMS as well as a detailed description of the current implementation and the current scalability limits.
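For context, a "real job" submitted to a glidein pool is just an ordinary HTCondor job; the sketch below writes a minimal vanilla-universe submit description from Python. The executable and file names are placeholders, and the glidein factory and VO frontend configuration are deliberately out of scope here.

```python
# Minimal sketch of a user-side vanilla-universe HTCondor submit description,
# written out by a small helper. Placeholders only; glidein factories and VO
# frontends are configured separately and are not shown here.
SUBMIT_TEMPLATE = """\
universe   = vanilla
executable = {executable}
arguments  = {arguments}
output     = job.$(Cluster).$(Process).out
error      = job.$(Cluster).$(Process).err
log        = job.log
queue {count}
"""

def write_submit_file(path, executable, arguments="", count=1):
    with open(path, "w") as fh:
        fh.write(SUBMIT_TEMPLATE.format(executable=executable,
                                        arguments=arguments, count=count))

# Example usage (hypothetical file names):
# write_submit_file("analysis.sub", "run_analysis.sh", "input.dat", count=10)
# followed by: condor_submit analysis.sub
```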
glideinWMS - A generic pilot-based Workload Management System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sfiligoi, Igor; /Fermilab
The Grid resources are distributed among hundreds of independent Grid sites, requiring a higher level Workload Management System (WMS) to be used efficiently. Pilot jobs have been used for this purpose by many communities, bringing increased reliability, global fair share and just in time resource matching. GlideinWMS is a WMS based on the Condor glidein concept, i.e. a regular Condor pool, with the Condor daemons (startds) being started by pilot jobs, and real jobs being vanilla, standard or MPI universe jobs. The glideinWMS is composed of a set of Glidein Factories, handling the submission of pilot jobs to a set of Grid sites, and a set of VO Frontends, requesting pilot submission based on the status of user jobs. This paper contains the structural overview of glideinWMS as well as a detailed description of the current implementation and the current scalability limits.
Virtual Network Configuration Management System for Data Center Operations and Management
NASA Astrophysics Data System (ADS)
Okita, Hideki; Yoshizawa, Masahiro; Uehara, Keitaro; Mizuno, Kazuhiko; Tarui, Toshiaki; Naono, Ken
Virtualization technologies are widely deployed in data centers to improve system utilization. However, they increase the workload for operators, who have to manage the structure of virtual networks in data centers. A virtual-network management system that automates the integration of virtual-network configurations is provided. The proposed system collects the configurations from server virtualization platforms and VLAN-supported switches, and integrates these configurations according to a newly developed XML-based management information model for virtual-network configurations. Preliminary evaluations show that the proposed system helps operators by reducing the time needed to acquire the configurations from devices and to correct inconsistencies in the operators' configuration management database by about 40 percent. Further, they also show that the proposed system has excellent scalability; the system takes less than 20 minutes to acquire the virtual-network configurations from a large-scale network that includes 300 virtual machines. These results imply that the proposed system is effective for improving the configuration management process for virtual networks in data centers.
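To give a feel for what an XML-based management information model for virtual networks might look like, the sketch below merges per-device facts (VM placement and VLAN port membership) into a single XML document. The element and attribute names are invented for illustration and do not reproduce the paper's schema.

```python
# Sketch: merge per-device facts into one XML view of the virtual network.
# Element and attribute names are invented for illustration.
import xml.etree.ElementTree as ET

def build_virtual_network_config(vms, vlan_ports):
    """vms: {vm_name: host}, vlan_ports: {vlan_id: [switch ports]}."""
    root = ET.Element("virtualNetworkConfiguration")
    servers = ET.SubElement(root, "servers")
    for vm, host in sorted(vms.items()):
        ET.SubElement(servers, "virtualMachine", name=vm, host=host)
    vlans = ET.SubElement(root, "vlans")
    for vlan_id, ports in sorted(vlan_ports.items()):
        vlan = ET.SubElement(vlans, "vlan", id=str(vlan_id))
        for port in ports:
            ET.SubElement(vlan, "switchPort", name=port)
    return ET.tostring(root, encoding="unicode")

print(build_virtual_network_config(
    {"web01": "hypervisor-a", "db01": "hypervisor-b"},
    {100: ["sw1/0/1", "sw2/0/3"], 200: ["sw1/0/2"]},
))
```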
2010-01-01
Background Management decisions regarding quality and quantity of nurse staffing have important consequences for hospital budgets. Furthermore, these management decisions must address the nursing care requirements of the particular patients within an organizational unit. In order to determine optimal nurse staffing needs, the extent of nursing workload must first be known. Nursing workload is largely a function of the composite of the patients' individual health status, particularly with respect to functioning status, individual need for nursing care, and severity of symptoms. The International Classification of Functioning, Disability and Health (ICF) and the derived subsets, the so-called ICF Core Sets, are a standardized approach to describe patients' functioning status. The objectives of this study were to (1) examine the association between patients' functioning, as encoded by categories of the Acute ICF Core Sets, and nursing workload in patients in the acute care situation, (2) compare the variance in nursing workload explained by the ICF Core Set categories and with the Barthel Index, and (3) validate the Acute ICF Core Sets by their ability to predict nursing workload. Methods Patients' functioning at admission was assessed using the respective Acute ICF Core Set and the Barthel Index, whereas nursing workload data was collected using an established instrument. Associations between dependent and independent variables were modelled using linear regression. Variable selection was carried out using penalized regression. Results In patients with neurological and cardiopulmonary conditions, selected ICF categories and the Barthel Index Score explained the same variance in nursing workload (44% in neurological conditions, 35% in cardiopulmonary conditions), whereas ICF was slightly superior to Barthel Index Score for musculoskeletal conditions (20% versus 16%). Conclusions A substantial fraction of the variance in nursing workload in patients with rehabilitation
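The modelling step described here amounts to regressing nursing workload on ICF category scores with variable selection by penalization. The sketch below uses an L1-penalized (lasso) linear model on simulated ICF qualifier data as a stand-in; the study's exact penalization scheme and data are not reproduced.

```python
# Penalized-regression sketch: select ICF Core Set categories that predict
# nursing workload with an L1-penalized (lasso) linear model. Simulated data.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_patients, n_icf_categories = 200, 40
X = rng.integers(0, 5, size=(n_patients, n_icf_categories)).astype(float)  # ICF qualifiers 0-4
true_coef = np.zeros(n_icf_categories)
true_coef[:5] = [12, 9, 7, 5, 4]
y = X @ true_coef + rng.normal(scale=15, size=n_patients)                   # workload minutes

model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_ != 0)
print("explained variance (R^2):", round(model.score(X, y), 2))
print("ICF categories retained :", selected)
```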
NASA TLA workload analysis support. Volume 2: Metering and spacing studies validation data
NASA Technical Reports Server (NTRS)
Sundstrom, J. L.
1980-01-01
Four sets of graphic reports--one for each of the metering and spacing scenarios--are presented. The complete data file from which the reports were generated is also given. The data were used to validate the detailed tasks of both the pilot and copilot for the four metering and spacing scenarios. The output presents two measures of demand workload and a report showing task length and task interaction.
Psychophysiological measures of cognitive workload in laboratory and flight
NASA Technical Reports Server (NTRS)
Wilson, Glenn F.; Badeau, Albert
1993-01-01
Psychophysiological data have been recorded during different levels of cognitive workload in laboratory and flight settings. Cardiac, eye blink, and brain data have shown meaningful changes as a function of the levels of mental workload. Increased cognitive workload is generally associated with increased heart rates, decreased blink rates and eye closures, and decreased evoked potential amplitudes. However, comparisons of laboratory and flight data show that direct transference of laboratory findings to the flight environment is not possible in many cases. While the laboratory data are valuable, a data base from flight is required so that 'real world' data can be properly interpreted.
Vargas Bustamante, Arturo; Martinez, Ana; Chen, Xiao; Rodriguez, Hector P
2017-06-01
We examine whether two aspects of workplace climate, quality of staff relationships (QSR) and manageable clinic workload (MCW), are related to better patient care experiences and diabetes care in community health centers (CHCs) serving Latino and Chinese patients. Patient experience surveys of adult patients with type 2 diabetes and workplace climate surveys of clinicians and staff from CHCs were included in an analytic sample. Comparison-of-means analyses examined patient and provider characteristics. The associations of QSR and MCW with diabetes care management were examined using regression analyses. Diabetes care processes were more consistently provided in CHCs with high-quality staff relations and a more manageable clinic workload, but HbA1c, LDL cholesterol, and blood pressure outcomes were no different between clinics with high vs. low QSR and MCW. Focusing efforts on improvements in practice climate may lead to more consistent provision of important processes of diabetes care for these patients.
NASA Technical Reports Server (NTRS)
Johnson, Lee F.; Maneta, Marco P.; Kimball, John S.
2016-01-01
Water cycle extremes such as droughts and floods present a challenge for water managers and for policy makers responsible for the administration of water supplies in agricultural regions. In addition to the inherent uncertainties associated with forecasting extreme weather events, water planners need to anticipate water demands and water user behavior in atypical circumstances. This requires the use of decision support systems capable of simulating agricultural water demand with the latest available data. Unfortunately, managers from local and regional agencies often use different datasets of variable quality, which complicates coordinated action. In previous work we have demonstrated novel methodologies to use satellite-based observational technologies, in conjunction with hydro-economic models and state of the art data assimilation methods, to enable robust regional assessment and prediction of drought impacts on agricultural production, water resources, and land allocation. These methods create an opportunity for new, cost-effective analysis tools to support policy and decision-making over large spatial extents. The methods can be driven with information from existing satellite-derived operational products, such as the Satellite Irrigation Management Support system (SIMS) operational over California, the Cropland Data Layer (CDL), and using a modified light-use efficiency algorithm to retrieve crop yield from the synergistic use of MODIS and Landsat imagery. Here we present an integration of this modeling framework in a client-server architecture based on the Hydra platform. Assimilation and processing of resource intensive remote sensing data, as well as hydrologic and other ancillary information occur on the server side. This information is processed and summarized as attributes in water demand nodes that are part of a vector description of the water distribution network. With this architecture, our decision support system becomes a lightweight 'app' that
NASA Astrophysics Data System (ADS)
Maneta, M. P.; Johnson, L.; Kimball, J. S.
2016-12-01
Water cycle extremes such as droughts and floods present a challenge for water managers and for policy makers responsible for the administration of water supplies in agricultural regions. In addition to the inherent uncertainties associated with forecasting extreme weather events, water planners need to anticipate water demands and water user behavior in atypical circumstances. This requires the use of decision support systems capable of simulating agricultural water demand with the latest available data. Unfortunately, managers from local and regional agencies often use different datasets of variable quality, which complicates coordinated action. In previous work we have demonstrated novel methodologies to use satellite-based observational technologies, in conjunction with hydro-economic models and state-of-the-art data assimilation methods, to enable robust regional assessment and prediction of drought impacts on agricultural production, water resources, and land allocation. These methods create an opportunity for new, cost-effective analysis tools to support policy and decision-making over large spatial extents. The methods can be driven with information from existing satellite-derived operational products, such as the Satellite Irrigation Management Support system (SIMS) operational over California, the Cropland Data Layer (CDL), and using a modified light-use efficiency algorithm to retrieve crop yield from the synergistic use of MODIS and Landsat imagery. Here we present an integration of this modeling framework in a client-server architecture based on the Hydra platform. Assimilation and processing of resource-intensive remote sensing data, as well as hydrologic and other ancillary information, occur on the server side. This information is processed and summarized as attributes in water demand nodes that are part of a vector description of the water distribution network. With this architecture, our decision support system becomes a lightweight 'app' that
Comparison of workload measures on computer-generated primary flight displays
NASA Technical Reports Server (NTRS)
Nataupsky, Mark; Abbott, Terence S.
1987-01-01
Four Air Force pilots served as subjects in assessing a battery of subjective and physiological workload measures in a flight simulation environment in which two computer-generated primary flight display configurations were evaluated. High- and low-workload tasks were created by manipulating flight path complexity. Both SWAT and the NASA-TLX were shown to be effective in differentiating the high- and low-workload path conditions. Physiological measures were inconclusive. A battery of workload measures continues to be necessary for an understanding of the data. Based on workload, opinion, and performance data, it appears fruitful to pursue research with a primary flight display and a horizontal situation display integrated into a single display.
Li, Gang; He, Bin; Huang, Hongwei; Tang, Limin
2016-01-01
The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and the Kriging method is realized to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively to reduce energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data. PMID:27690035
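The spatial anomaly indicator described above compares each node's reading against a spatial estimate built from its neighbours. Below is a minimal sketch of that idea, with a simple inverse-distance-weighted estimate standing in for the paper's Kriging interpolator; node positions, readings, and the cut-off are all hypothetical.

```python
# Sketch of a spatial anomaly indicator for a linear sensor cluster
# (inverse-distance weighting used as a stand-in for the Kriging estimator).
import numpy as np

positions = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])   # sensor positions along a tunnel ring (m)
readings = np.array([10.1, 10.3, 10.2, 13.8, 10.4, 10.5])   # e.g. strain or settlement values

def idw_estimate(i, power=2.0):
    """Estimate node i's reading from the other nodes by inverse-distance weighting."""
    others = np.delete(np.arange(len(positions)), i)
    d = np.abs(positions[others] - positions[i])
    w = 1.0 / d ** power
    return np.sum(w * readings[others]) / np.sum(w)

residuals = np.array([readings[i] - idw_estimate(i) for i in range(len(readings))])
# Robust cut-off from the residual spread (assumed rule, not the paper's threshold).
threshold = 3.0 * np.median(np.abs(residuals - np.median(residuals)))
anomalous = np.nonzero(np.abs(residuals) > threshold)[0]
print("anomaly indicator flags nodes:", anomalous)           # expected to flag node 3
```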
General Purpose Data-Driven Monitoring for Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.
2009-01-01
As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grow. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault
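The clustering step is described only at a high level in the abstract; as a rough sketch of the general IMS-style idea (not NASA's actual implementation), nominal behavior can be characterized with k-means clusters of archived telemetry, and new samples flagged when their distance to the nearest cluster exceeds an empirically chosen threshold. All data, cluster counts, and thresholds below are hypothetical.

```python
# Sketch of cluster-based nominal-behavior monitoring (illustrative only).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical archived nominal telemetry: rows = time samples, cols = sensors.
nominal = rng.normal(loc=[0.0, 5.0, 20.0], scale=[1.0, 0.5, 2.0], size=(5000, 3))

# Characterize normal parameter interactions as clusters.
model = KMeans(n_clusters=8, n_init=10, random_state=0).fit(nominal)

def anomaly_score(sample):
    """Distance from a sample to the nearest nominal cluster center."""
    d = np.linalg.norm(model.cluster_centers_ - sample, axis=1)
    return d.min()

# Threshold taken from the empirical distribution of nominal scores (assumption).
nominal_scores = np.array([anomaly_score(x) for x in nominal[:1000]])
threshold = np.percentile(nominal_scores, 99.5)

new_sample = np.array([0.2, 5.1, 45.0])   # e.g. a drifting temperature channel
print("anomalous" if anomaly_score(new_sample) > threshold else "nominal")
```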
Authoring Data-Driven Videos with DataClips.
Amini, Fereshteh; Riche, Nathalie Henry; Lee, Bongshin; Monroy-Hernandez, Andres; Irani, Pourang
2017-01-01
Data videos, or short data-driven motion graphics, are an increasingly popular medium for storytelling. However, creating data videos is difficult as it involves pulling together a unique combination of skills. We introduce DataClips, an authoring tool aimed at lowering the barriers to crafting data videos. DataClips allows non-experts to assemble data-driven "clips" together to form longer sequences. We constructed the library of data clips by analyzing the composition of over 70 data videos produced by reputable sources such as The New York Times and The Guardian. We demonstrate that DataClips can reproduce over 90% of our data videos corpus. We also report on a qualitative study comparing the authoring process and outcome achieved by (1) non-experts using DataClips, and (2) experts using Adobe Illustrator and After Effects to create data-driven clips. Results indicated that non-experts are able to learn and use DataClips with a short training period. In the span of one hour, they were able to produce more videos than experts using a professional editing tool, and their clips were rated similarly by an independent audience.
Data-linked pilot reply time on controller workload and communication in a simulated terminal option
DOT National Transportation Integrated Search
2001-05-01
This report describes an analysis of air traffic control communication and workload in a simulated terminal radar approach control environment. The objective of this study was to investigate how pilot-to-controller data-link acknowledgment time m...
Mental workload during brain-computer interface training.
Felton, Elizabeth A; Williams, Justin C; Vanderheiden, Gregg C; Radwin, Robert G
2012-01-01
It is not well understood how people perceive the difficulty of performing brain-computer interface (BCI) tasks, which specific aspects of mental workload contribute the most, and whether there is a difference in perceived workload between participants who are able-bodied and disabled. This study evaluated mental workload using the NASA Task Load Index (TLX), a multi-dimensional rating procedure with six subscales: Mental Demands, Physical Demands, Temporal Demands, Performance, Effort, and Frustration. Able-bodied and motor disabled participants completed the survey after performing EEG-based BCI Fitts' law target acquisition and phrase spelling tasks. The NASA-TLX scores were similar for able-bodied and disabled participants. For example, overall workload scores (range 0-100) for 1D horizontal tasks were 48.5 (SD = 17.7) and 46.6 (SD = 10.3), respectively. The TLX can be used to inform the design of BCIs that will have greater usability by evaluating subjective workload between BCI tasks, participant groups, and control modalities. Mental workload of brain-computer interfaces (BCI) can be evaluated with the NASA Task Load Index (TLX). The TLX is an effective tool for comparing subjective workload between BCI tasks, participant groups (able-bodied and disabled), and control modalities. The data can inform the design of BCIs that will have greater usability.
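For reference, the NASA-TLX overall score used in studies like this one is conventionally a weighted average of the six subscale ratings, with weights taken from 15 pairwise comparisons. A minimal sketch with hypothetical ratings and tallies:

```python
# Minimal NASA-TLX weighted workload score (illustrative values only).
subscales = ["Mental", "Physical", "Temporal", "Performance", "Effort", "Frustration"]
ratings = {"Mental": 70, "Physical": 20, "Temporal": 55,
           "Performance": 40, "Effort": 65, "Frustration": 35}   # 0-100 subscale ratings
wins = {"Mental": 5, "Physical": 0, "Temporal": 3,
        "Performance": 2, "Effort": 4, "Frustration": 1}          # pairwise-comparison tallies

assert sum(wins.values()) == 15   # 15 pairwise comparisons in the standard procedure
overall = sum(ratings[s] * wins[s] for s in subscales) / 15.0
print(f"Weighted overall workload: {overall:.1f}")                # range 0-100
```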
Online EEG-Based Workload Adaptation of an Arithmetic Learning Environment.
Walter, Carina; Rosenstiel, Wolfgang; Bogdan, Martin; Gerjets, Peter; Spüler, Martin
2017-01-01
In this paper, we demonstrate a closed-loop EEG-based learning environment that adapts instructional learning material online to improve learning success in students during arithmetic learning. The amount of cognitive workload during learning is crucial for successful learning and should be held in the optimal range for each learner. Based on EEG data from 10 subjects, we created a prediction model that estimates the learner's workload to obtain an unobtrusive workload measure. Furthermore, we developed an interactive learning environment that uses the prediction model to estimate the learner's workload online based on the EEG data and adapts the difficulty of the learning material to keep the learner's workload in an optimal range. The EEG-based learning environment was used by 13 subjects to learn arithmetic addition in the octal number system, leading to a significant learning effect. The results suggest that it is feasible to use EEG as an unobtrusive measure of cognitive workload to adapt the learning content. Further, they demonstrate that prompt workload prediction is possible using a generalized prediction model without the need for user-specific calibration.
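The paper's generalized prediction model is not reproduced here; as a rough sketch of the closed-loop principle, a classifier trained on EEG band-power features could drive a simple controller that raises or lowers item difficulty whenever predicted workload leaves a target band. The features, labels, and thresholds below are hypothetical.

```python
# Sketch of online workload-adaptive difficulty control (hypothetical model and features).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical calibration data: EEG band-power features per epoch,
# labels 0 = low workload, 1 = high workload, pooled over subjects.
X_train = rng.normal(size=(400, 8))
y_train = (X_train[:, 0] - X_train[:, 3] + rng.normal(scale=0.5, size=400) > 0).astype(int)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def adapt_difficulty(features, difficulty, low=0.35, high=0.65):
    """Keep the predicted high-workload probability inside a target band (assumed thresholds)."""
    p_high = clf.predict_proba(features.reshape(1, -1))[0, 1]
    if p_high > high:
        difficulty = max(1, difficulty - 1)   # learner overloaded: present easier items
    elif p_high < low:
        difficulty += 1                       # learner underloaded: present harder items
    return difficulty, p_high

difficulty = 3
features = rng.normal(size=8)                 # band powers from the latest EEG epoch
difficulty, p = adapt_difficulty(features, difficulty)
print(difficulty, round(p, 2))
```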
Stress and workload of men and women in high-ranking positions.
Lundberg, U; Frankenhaeuser, M
1999-04-01
Psychological and physiological stress responses related to work and family were investigated in 21 female and 21 male managers and professional specialists in high-ranking positions. The main result was that both women and men experienced their jobs as challenging and stimulating, although almost all data indicated a more favorable situation for men than for women. In addition, women were more stressed by their greater unpaid workload and by a greater responsibility for duties related to home and family. Women had higher norepinephrine levels than men did, both during and after work, which reflected their greater workload. Women with children at home had significantly higher norepinephrine levels after work than did the other participants. The possible long-term health consequences of women's higher stress levels are discussed.
Heavy vehicle driver workload assessment. Task 4, review of workload and related research
DOT National Transportation Integrated Search
This report reviews literature on workload measures and related research. It describes the preliminary development of a theoretical basis for relating driving workload to highway safety and presents a selective review of driver performance evaluation, workload ...
Brossier, D; Villedieu, F; Letouzé, N; Pinto Da Costa, N; Jokic, M
2017-03-01
In routine practice, intensive care physicians rarely have to manage children under 18 years of age, particularly those under 15. This study's objectives were to assess the quality of training in pediatrics of adult intensive care teams, to document the workload generated by care of pediatric patients, and to identify the difficulties encountered in managing minors as patients. A survey was administered in Lower Normandy from 4 April 2012 to 1 September 2012. Physicians, residents, nurses, and nurses' aides practicing in one of the nine intensive care units of Lower Normandy were asked to complete an electronic or paper format questionnaire. This questionnaire assessed their level of pediatric training, the workload that management of pediatric patients entailed, and the challenges posed by these patients. One hundred and nine questionnaires were returned (by 26 attending physicians, 18 residents, 38 nurses, and 27 nurses' aides). Eighty-three of the respondents (76%) had no experience in a pediatric unit of any kind. Forty-two percent thought that the pediatric age range lies between 3 months and 15 years of age. However, more than 50% of respondents would like the upper limit to be 16 years or even older. Ninety-three respondents (85%) estimated having some exposure to pediatric patients in their routine practice, but this activity remained quite low. Seventy-three (67%) reported difficulties with the management of these young patients. This survey provides current information regarding the level of training of adult intensive care unit professionals and their concerns about managing patients under 18 years of age, both in terms of workload and specific challenges. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Consistent data-driven computational mechanics
NASA Astrophysics Data System (ADS)
González, D.; Chinesta, F.; Cueto, E.
2018-05-01
We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.
Injury risk-workload associations in NCAA American college football.
Sampson, J A; Murray, A; Williams, S; Halseth, T; Hanisch, J; Golden, G; Fullagar, H H K
2018-05-22
To determine injury risk-workload associations in collegiate American football. Retrospective analysis. Workload and injury data were recorded from 52 players during a full NCAA football season. Acute, chronic, and a range of acute:chronic workload ratios (ACWR: 7:14, 7:21 and 7:28 day) calculated using rolling and exponentially weighted moving averages (EWMA) were plotted against non-contact injuries (regardless of time lost or not) sustained within 3 and 7 days. Injury risks were also determined relative to position and experience. 105 non-contact injuries (18 game- and 87 training-related) were observed, with almost 40% sustained during the pre-season. 7:21-day EWMA ACWRs with a 3-day injury lag were most closely associated with injury (R^2 = 0.54). Relative injury risks were >3× greater with high compared with moderate and low ratios and were magnified when combined with low 21-day chronic workloads (injury probability = 92.1%). Injury risks were similar across positions. 'Juniors' presented likely and possibly increased overall injury risk compared to 'Freshmen' (RR: 1.94, CI 1.07-3.52) and 'Seniors' (RR: 1.7, CI 0.92-3.14), yet no specific ACWR-by-experience or ACWR-by-position interactions were identified. High injury rates during college football pre-season training may be associated with high acute loads. In-season injury risks were greatest with high ACWR and were evident even when including (more common and less serious) non-time-loss injuries. The substantially increased injury risk when low 21-day chronic workloads coincide with high EWMA ACWR highlights the importance of load management for individuals with chronic game (non-involved on game day) and/or training (following injury) absences. Copyright © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
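The exponentially weighted acute:chronic workload ratio referred to above is commonly computed from daily load with decay constants lambda = 2/(N+1). Below is a minimal pandas sketch of the 7:21-day pairing; the daily loads and the risk cut-offs are placeholders, not the study's values.

```python
# Minimal 7:21-day EWMA acute:chronic workload ratio (illustrative daily loads).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
days = pd.date_range("2017-08-01", periods=120, freq="D")
load = pd.Series(rng.gamma(shape=4.0, scale=120.0, size=len(days)), index=days)

acute = load.ewm(span=7, adjust=False).mean()     # lambda = 2/(7+1)
chronic = load.ewm(span=21, adjust=False).mean()  # lambda = 2/(21+1)
acwr = acute / chronic

# Flag days in the "high ratio, low chronic load" zone the study associates with risk
# (cut-offs below are arbitrary placeholders, not the paper's thresholds).
flagged = acwr[(acwr > 1.5) & (chronic < chronic.quantile(0.25))]
print(acwr.tail(3).round(2))
print(f"{len(flagged)} flagged days")
```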
A study of dynamic data placement for ATLAS distributed data management
NASA Astrophysics Data System (ADS)
Beermann, T.; Stewart, G. A.; Maettig, P.
2015-12-01
This contribution presents a study on the applicability and usefulness of dynamic data placement methods for data-intensive systems, such as ATLAS distributed data management (DDM). In this system the jobs are sent to the data, so a good distribution of data is important. Ways of forecasting workload patterns are examined, which are then used to redistribute data to achieve a better overall utilisation of computing resources and to reduce the waiting time for jobs before they can run on the grid. The method is based on a tracer infrastructure that is able to monitor and store historical data accesses and which is used to create popularity reports. These reports provide detailed summaries about data accesses in the past, including information about the accessed files, the involved users and the sites. From these past data it is possible to make near-term forecasts of data popularity. This study evaluates simple prediction methods as well as more complex methods like neural networks. Based on the outcome of the predictions, a redistribution algorithm deletes unused replicas and adds new replicas for potentially popular datasets. Finally, a grid simulator is used to examine the effects of the redistribution. The simulator replays workload on different data distributions while measuring the job waiting time and site usage. The study examines how the average waiting time is affected by the amount of data that is moved, how it differs for the various forecasting methods and how that compares to the optimal data distribution.
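As a toy illustration of the forecast-then-redistribute loop (not the actual ATLAS DDM algorithms), weekly accesses per dataset from the tracer history can be forecast with a simple moving average and replica counts adjusted accordingly; the dataset names and thresholds below are hypothetical.

```python
# Toy popularity forecast and replica adjustment (illustrative only).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
datasets = [f"data{i:03d}" for i in range(5)]

# Hypothetical tracer report: weekly access counts per dataset.
history = pd.DataFrame(rng.poisson(lam=[2, 10, 40, 5, 80], size=(12, 5)),
                       columns=datasets)

# Simple forecast: mean of the last 4 weeks (stand-in for the paper's predictors).
forecast = history.tail(4).mean()

replicas = pd.Series(2, index=datasets)            # current replica counts
target = forecast.apply(lambda a: 1 if a < 5 else (2 if a < 30 else 4))

for ds in datasets:
    if target[ds] > replicas[ds]:
        print(f"{ds}: add {target[ds] - replicas[ds]} replica(s)")
    elif target[ds] < replicas[ds]:
        print(f"{ds}: delete {replicas[ds] - target[ds]} unused replica(s)")
```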
How the workload impacts on cognitive cooperation: A pilot study.
Sciaraffa, Nicolina; Borghini, Gianluca; Arico, Pietro; Di Flumeri, Gianluca; Toppi, Jlenia; Colosimo, Alfredo; Bezerianos, Anastatios; Thakor, Nitish V; Babiloni, Fabio
2017-07-01
Cooperation degradation can be seen as one of the main causes of human errors. Poor cooperation could arise from aberrant mental processes, such as mental overload, that negatively affect the user's performance. Using different levels of difficulty in a cooperative task, we combined behavioural, subjective and neurophysiological data with the aim to i) quantify the mental workload under which the crew was operating, ii) evaluate the degree of their cooperation, and iii) assess the impact of the workload demands on the cooperation levels. The combination of such data showed that high workload demand impacted significantly on the performance, workload perception, and degree of cooperation.
Heavy vehicle driver workload assessment. Task 5, workload assessment protocol
DOT National Transportation Integrated Search
This report presents a description of a prescriptive workload assessment protocol for use in evaluating in-cab devices in heavy vehicles. The primary objective of this heavy vehicle driver workload assessment protocol is to identify the components an...
Metzger, Ulla; Parasuraman, Raja
2005-01-01
Future air traffic management concepts envisage shared decision-making responsibilities between controllers and pilots, necessitating that controllers be supported by automated decision aids. Even as automation tools are being introduced, however, their impact on the air traffic controller is not well understood. The present experiments examined the effects of an aircraft-to-aircraft conflict decision aid on performance and mental workload of experienced, full-performance level controllers in a simulated Free Flight environment. Performance was examined with both reliable (Experiment 1) and inaccurate automation (Experiment 2). The aid improved controller performance and reduced mental workload when it functioned reliably. However, detection of a particular conflict was better under manual conditions than under automated conditions when the automation was imperfect. Potential or actual applications of the results include the design of automation and procedures for future air traffic control systems.
The workload book: Assessment of operator workload to engineering systems
NASA Technical Reports Server (NTRS)
Gopher, D.
1983-01-01
The structure and initial work performed toward the creation of a handbook for workload analysis directed at the operational community of engineers and human factors psychologists are described. When complete, the handbook will make accessible to such individuals the results of theoretically based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of a laboratory study focused on the development of a subjective rating technique for workload that is based on psychophysical scaling techniques are described.
Fallahi, Majid; Motamedzade, Majid; Heidarimoghadam, Rashid; Soltanian, Ali Reza; Miyake, Shinji
2016-01-01
This study evaluated operators' mental workload while monitoring traffic density in a city traffic control center. To determine the mental workload, physiological signals (ECG, EMG) were recorded and the NASA Task Load Index (TLX) was administered for 16 operators. The results showed that the operators experienced a larger mental workload during high traffic density (HTD) than during low traffic density (LTD). The traffic control center stressors caused changes in heart rate variability features and EMG amplitude, and the average workload score was significantly higher in HTD conditions than in LTD conditions. The findings indicated that increasing traffic congestion had a significant effect on HR, RMSSD, SDNN, the LF/HF ratio, and EMG amplitude. The results suggested that when operators' workload increases, their mental fatigue and stress level increase and their mental health deteriorates. Therefore, it may be necessary to implement an ergonomic program to manage mental health. Furthermore, by evaluating mental workload, the traffic control center director can organize the center's traffic congestion operators to sustain an appropriate mental workload and improve traffic control management. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
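The heart-rate-variability features named above (SDNN, RMSSD, LF/HF) have standard definitions; the following minimal sketch computes them from a synthetic series of RR intervals, assuming a 4 Hz resampling for the spectral estimate.

```python
# Minimal HRV feature extraction: SDNN, RMSSD and LF/HF (synthetic RR intervals).
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(4)
rr = 0.8 + 0.05 * rng.standard_normal(600)        # RR intervals in seconds (~75 bpm)

sdnn = np.std(rr, ddof=1) * 1000.0                # time-domain features, in ms
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000.0

# Frequency domain: resample the RR series to an even 4 Hz grid, then Welch PSD.
t = np.cumsum(rr)
fs = 4.0
t_even = np.arange(t[0], t[-1], 1.0 / fs)
rr_even = np.interp(t_even, t, rr)
f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

df = f[1] - f[0]
lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df     # low-frequency band power
hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df     # high-frequency band power
print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  LF/HF={lf / hf:.2f}")
```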
Patient transfers in Australia: implications for nursing workload and patient outcomes.
Blay, Nicole; Duffield, Christine M; Gallagher, Robyn
2012-04-01
To discuss the impact of patient transfers on patient outcomes and nursing workload. Many patient transfers are essential and occur in response to patients' clinical changes. However, increasingly within Australia transfers are performed in response to reductions in bed numbers, resulting in 'bed block'. A discussion of the literature related to inpatient transfers, nursing workload and patient safety. Measures to increase patient flow such as short-stay units may result in an increase in patient transfers and nursing workload. Frequent patient transfers may also increase the risk of medication incidents, health-care acquired infections and patient falls. The continuing demand for health care has led to a reactionary bed management system that, in an attempt to accommodate patients, has resulted in increased transfers between wards. This can have a negative effect on nursing workload and affect patient outcomes. High nursing workload is cited as one reason for nurses leaving the profession. Reductions in non-essential transfers may reduce nurse workload, improve patient outcomes and enhance continuity of patient care. © 2011 Blackwell Publishing Ltd.
Development of techniques for measuring pilot workload
NASA Technical Reports Server (NTRS)
Spyker, D. A.; Stackhouse, S. P.; Khalafalla, A. S.; Mclane, R. C.
1971-01-01
An objective method of assessing information workload based on physiological measurements was developed. Information workload, or reserve capacity, was measured using a visual discrimination secondary task and subjective rating of task difficulty. The primary task was two-axis (pitch and roll) tracking, and the independent variables in this study were aircraft pitch dynamics and wind gust disturbances. The study was structured to provide: (1) a sensitive, nonloading measure of reserve capacity, and (2) an unencumbering, reliable measurement of the psychophysiological state. From these, a measured workload index (MWI) and physiological workload index (PWI) were extracted. An important measure of the success of this study was the degree to which the MWI and PWI agreed across the 243 randomly presented, four-minute trials (9 subjects X 9 tasks X 3 replications). The electrophysiological data collected included vectorcardiogram, respiration, electromyogram, skin impedance, and electroencephalogram. Special computer programs were created for the analysis of each physiological variable. The digital data base then consisted of 82 physiological features for each of the 243 trials. A prediction of workload based on physiological observations was formulated as a simultaneous least-squares prediction problem. A best subset of 10 features was chosen to predict the three measures of reserve capacity. The canonical correlation coefficient was .754, with a chi-squared value of 91.3, which allows rejection of the null hypothesis with p of .995.
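The canonical-correlation step can be illustrated with scikit-learn's CCA on synthetic stand-in data shaped like the study (243 trials, 82 physiological features, 3 reserve-capacity measures); this mirrors only the form of the analysis, not the original data.

```python
# Sketch of relating physiological features to workload measures via canonical correlation
# (synthetic data shaped like the study: 243 trials, 82 features, 3 measures).
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(5)
n_trials = 243
latent = rng.normal(size=(n_trials, 1))           # shared "workload" factor

X = latent @ rng.normal(size=(1, 82)) + 0.8 * rng.normal(size=(n_trials, 82))  # physiological features
Y = latent @ rng.normal(size=(1, 3)) + 0.8 * rng.normal(size=(n_trials, 3))    # reserve-capacity measures

cca = CCA(n_components=1).fit(X, Y)
Xc, Yc = cca.transform(X, Y)
r = np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]
print(f"First canonical correlation: {r:.2f}")
```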
Matima, Rangarirai; Murphy, Katherine; Levitt, Naomi S; BeLue, Rhonda; Oni, Tolu
2018-01-01
Current South African health policy for chronic disease management proposes integration of chronic services for better outcomes for chronic conditions; this approach is based on the Integrated Chronic Disease Model (ICDM). However, scant data exist on how patients with chronic multimorbidities currently experience the (re)-organisation of health services and what their perceived needs are in order to enhance the management of their conditions. A qualitative study was conducted in a community health centre treating both HIV and diabetes patients in Cape Town. The study was grounded in Shippee's Cumulative Complexity Model (CCM) and explored "patient workload" and "patient capacity" to manage chronic conditions. Individual interviews were conducted with 10 adult patient-participants with HIV and type 2 diabetes (T2D) multimorbidity and 6 healthcare workers who provided health services to these patient-participants. Patient-participants in this study experienced clinic-related workload such as two separate clinics for HIV and T2D and a perceived and experienced power mismatch between patients and healthcare workers. Self-care related workloads were largely around nutritional requirements, pill burden, and stigma. The burden of these demands varied in difficulty among patient-participants due to capacity factors such as positive attitudes, optimal health literacy, social support and availability of economic resources. Strategies mentioned by participants for improved continuity of care and self-management of multimorbidities included integration of chronic services, consolidated guidelines for healthcare workers, educational materials for patients, improved information systems and income for patients. Using the CCM to explore multimorbidity captured most of the themes around "patient workload" and "patient capacity", and was thus a suitable framework to explore multimorbidity in this high HIV/T2D burden setting. Integration of chronic services and addressing social
A comparison of policies on nurse faculty workload in the United States.
Ellis, Peggy A
2013-01-01
This article describes nurse faculty workload policies from across the nation in order to assess current practice. There is a well-documented shortage of nursing faculty, leading to an increase in workload demands. Increases in faculty workload result in difficulties with work-life balance and dissatisfaction, threatening to make nursing education less attractive to young faculty. In order to begin an examination of faculty workload in nursing, existing workloads must be known. Faculty workload data were solicited from nursing programs nationwide and analyzed to determine current workloads. The most common faculty teaching workload reported overall for nursing is 12 credit hours per semester; however, some variations exist. Consideration should be given to the multiple components of faculty workload. Research is needed to address the most effective and efficient workload allocation for nursing faculty.
Data to Decisions: Creating a Culture of Model-Driven Drug Discovery.
Brown, Frank K; Kopti, Farida; Chang, Charlie Zhenyu; Johnson, Scott A; Glick, Meir; Waller, Chris L
2017-09-01
Merck & Co., Inc., Kenilworth, NJ, USA, is undergoing a transformation in the way that it prosecutes R&D programs. Through the adoption of a "model-driven" culture, enhanced R&D productivity is anticipated, both in the form of decreased attrition at each stage of the process and by providing a rational framework for understanding and learning from the data generated along the way. This new approach focuses on the concept of a "Design Cycle" that makes use of all the data possible, internally and externally, to drive decision-making. These data can take the form of bioactivity, 3D structures, genomics, pathway, PK/PD, safety data, etc. Synthesis of high-quality data into models utilizing both well-established and cutting-edge methods has been shown to yield high confidence predictions to prioritize decision-making and efficiently reposition resources within R&D. The goal is to design an adaptive research operating plan that uses both modeled data and experiments, rather than just testing, to drive project decision-making. To support this emerging culture, an ambitious information management (IT) program has been initiated to implement a harmonized platform to facilitate the construction of cross-domain workflows to enable data-driven decision-making and the construction and validation of predictive models. These goals are achieved through depositing model-ready data, agile persona-driven access to data, a unified cross-domain predictive model lifecycle management platform, and support for flexible scientist-developed workflows that simplify data manipulation and consume model services. The end-to-end nature of the platform, in turn, not only supports but also drives the culture change by enabling scientists to apply predictive sciences throughout their work and over the lifetime of a project. This shift in mindset for both scientists and IT was driven by an early impactful demonstration of the potential benefits of the platform, in which expert-level early discovery
ERIC Educational Resources Information Center
Massachusetts State Legislature, Boston. Senate Committee on Post Audit and Oversight.
This Massachusetts Senate committee study examined the current policies and procedures used by the University of Massachusetts at Amherst (UMA) to monitor, manage, and report on the activities of its faculty, in particular its faculty workload. The study had originally intended to analyze faculty workload in terms of instruction (teaching),…
Prediction of Nursing Workload in Hospital.
Fiebig, Madlen; Hunstein, Dirk; Bartholomeyczik, Sabine
2018-01-01
A dissertation project at the Witten/Herdecke University [1] is investigating which (nursing sensitive) patient characteristics are suitable for predicting a higher or lower degree of nursing workload. For this research project four predictive modelling methods were selected. In a first step, SUPPORT VECTOR MACHINE, RANDOM FOREST, and GRADIENT BOOSTING were used to identify potential predictors from the nursing sensitive patient characteristics. The results were compared via FEATURE IMPORTANCE. To predict nursing workload the predictors identified in step 1 were modelled using MULTINOMIAL LOGISTIC REGRESSION. First results from the data mining process will be presented. A prognostic determination of nursing workload can be used not only as a basis for human resource planning in hospital, but also to respond to health policy issues.
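The two-step procedure described, tree-ensemble screening of predictors followed by multinomial logistic regression on the retained set, can be sketched as follows; the data and the three workload classes are placeholders, not the dissertation's variables.

```python
# Sketch of a two-step workload prediction pipeline (placeholder data and classes).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
X = rng.normal(size=(1500, 20))                       # nursing-sensitive patient characteristics
# 0 = lower, 1 = medium, 2 = higher nursing workload (synthetic labels).
y = np.digitize(X[:, 2] + X[:, 7] + rng.normal(scale=0.8, size=1500), [-0.7, 0.7])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 1: screen predictors by random-forest feature importance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:5]

# Step 2: multinomial logistic regression on the retained predictors.
logit = LogisticRegression(max_iter=2000).fit(X_tr[:, top], y_tr)
print("retained features:", top, " accuracy:", round(logit.score(X_te[:, top], y_te), 2))
```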
A VO-Driven Astronomical Data Grid in China
NASA Astrophysics Data System (ADS)
Cui, C.; He, B.; Yang, Y.; Zhao, Y.
2010-12-01
With the implementation of many ambitious observation projects, including LAMOST, FAST, and the Antarctic observatory at Dome A, observational astronomy in China is stepping into a brand new era with an emerging data avalanche. In the era of e-Science, both these cutting-edge projects and traditional astronomy research need much more powerful data management, sharing and interoperability. Based on the data-grid concept and taking advantage of IVOA interoperability technologies, China-VO is developing a VO-driven astronomical data grid environment to enable multi-wavelength science and large database science. In the paper, the latest progress and data flow of LAMOST, the architecture of the data grid, and its support for the VO are discussed.
FMP study of pilot workload. Quantification of workload via instrument scan
NASA Technical Reports Server (NTRS)
Tole, J. R.; Vivaudou, M.; Harris, R. L., Sr.; Ephrath, A.
1982-01-01
Various methods of measuring a pilot's mental workload are discussed. Special attention is given to the scanning of flight instruments, the quality of the scan pattern, and concurrent verbal tasks during instrument landings as means of measuring pilot workload.
Perceptions of mental workload in Dutch university employees of different ages: a focus group study.
Bos, Judith T; Donders, Nathalie C G M; van der Velden, Koos; van der Gulden, Joost W J
2013-03-18
As academic workload seems to be increasing, many studies examined factors that contribute to the mental workload of academics. Age-related differences in work motives and intellectual ability may lead to differences in experienced workload and in the way employees experience work features. This study aims to obtain a better understanding of age differences in sources of mental workload. 33 academics from one faculty discussed causes of workload during focus group interviews, stratified by age. Among our participants, the influence of ageing seems most evident in employees' actions and reactions, while the causes of workload mentioned seemed largely similar. These individual reactions to workload may also be driven by differences in tenure. Most positively assessed work characteristics were: interaction with colleagues and students and autonomy. Aspects most often indicated as increasing the workload, were organisational aspects as obstacles for 'getting the best out of people' and the feeling that overtime seems unavoidable. Many employees indicated to feel stretched between the 'greediness' of the organisation and their own high working standards, and many fear to be assigned even less time for research if they do not meet the rigorous output criteria. Moreover, despite great efforts on their part, promotion opportunities seem limited. A more pronounced role for the supervisor seems appreciated by employees of all ages, although the specific interpretation varied between individuals and career stages. To preserve good working conditions and quality of work, it seems important to scrutinize the output requirements and tenure-based needs for employee supervision.
Perceptions of mental workload in Dutch university employees of different ages: a focus group study
2013-01-01
Background As academic workload seems to be increasing, many studies examined factors that contribute to the mental workload of academics. Age-related differences in work motives and intellectual ability may lead to differences in experienced workload and in the way employees experience work features. This study aims to obtain a better understanding of age differences in sources of mental workload. 33 academics from one faculty discussed causes of workload during focus group interviews, stratified by age. Findings Among our participants, the influence of ageing seems most evident in employees’ actions and reactions, while the causes of workload mentioned seemed largely similar. These individual reactions to workload may also be driven by differences in tenure. Most positively assessed work characteristics were: interaction with colleagues and students and autonomy. Aspects most often indicated as increasing the workload, were organisational aspects as obstacles for ‘getting the best out of people’ and the feeling that overtime seems unavoidable. Many employees indicated to feel stretched between the ‘greediness’ of the organisation and their own high working standards, and many fear to be assigned even less time for research if they do not meet the rigorous output criteria. Moreover, despite great efforts on their part, promotion opportunities seem limited. A more pronounced role for the supervisor seems appreciated by employees of all ages, although the specific interpretation varied between individuals and career stages. Conclusions To preserve good working conditions and quality of work, it seems important to scrutinize the output requirements and tenure-based needs for employee supervision. PMID:23506458
Development of a Methodology for Assessing Aircrew Workloads.
1981-11-01
Keywords: analysis; simulation; standard time systems; switching; synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling. Conventional methods, including standard data systems, information content analysis, work sampling and job evaluation, were found to be deficient in accounting
NASA Astrophysics Data System (ADS)
Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.
2016-06-01
The growing interest in research on technicians' workloads is probably associated with the recent surge in competition, prompted by unprecedented technological development that triggers changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this worldwide intense competition in industries has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model which considers technicians' reliability as a complement to the factory information obtained. The information used emerged from technicians' productivity and earned-values, using a multi-objective modelling approach. Since technicians are expected to carry out routine and stochastic maintenance work, we consider these workloads as constraints. The influence of training, fatigue and experiential knowledge of technicians on workload management was considered. These workloads were combined with maintenance policy in optimising reliability, productivity and earned-values using the goal programming approach. Practical datasets were utilised in studying the applicability of the proposed model in practice. It was observed that the model was able to generate information that practicing maintenance engineers can apply in making more informed decisions on technicians' management.
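Goal programming of the kind mentioned can be illustrated with deviation variables and a linear-programming solver. The sketch below is a generic weighted goal program (two goals plus one workload constraint) and does not reproduce the paper's model; all coefficients and targets are placeholders.

```python
# Generic weighted goal program solved with scipy.optimize.linprog (illustrative only).
# Decision variables: x1, x2 (e.g. hours assigned to two technician groups),
# plus under/over-achievement deviations (d1m, d1p, d2m, d2p) for two goals.
import numpy as np
from scipy.optimize import linprog

# Variable order: [x1, x2, d1m, d1p, d2m, d2p]
# Goal 1 (productivity target 100):  3*x1 + 2*x2 + d1m - d1p = 100
# Goal 2 (reliability target  80):   1*x1 + 4*x2 + d2m - d2p = 80
A_eq = np.array([[3, 2, 1, -1, 0, 0],
                 [1, 4, 0, 0, 1, -1]], dtype=float)
b_eq = np.array([100.0, 80.0])

# Workload constraint: routine plus stochastic maintenance hours cannot exceed 60.
A_ub = np.array([[1, 1, 0, 0, 0, 0]], dtype=float)
b_ub = np.array([60.0])

# Minimize weighted under-achievement of both goals (weights are placeholders).
c = np.array([0, 0, 2.0, 0, 1.0, 0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method="highs")
print(res.x.round(2), "objective:", round(res.fun, 2))
```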
Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science
NASA Astrophysics Data System (ADS)
Baru, C.
2014-12-01
Big data technologies are evolving rapidly, driven by the need to manage ever increasing amounts of historical data; process relentless streams of human and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.
Comparison of nurse staffing based on changes in unit-level workload associated with patient churn.
Hughes, Ronda G; Bobay, Kathleen L; Jolly, Nicholas A; Suby, Chrysmarie
2015-04-01
This analysis compares the staffing implications of three measures of nurse staffing requirements: midnight census, turnover adjustment based on length of stay, and volume of admissions, discharges and transfers. Midnight census is commonly used to determine registered nurse staffing. Unit-level workload increases with patient churn, the movement of patients in and out of the nursing unit. Failure to account for patient churn in staffing allocation impacts nurse workload and may result in adverse patient outcomes. Secondary data analysis of unit-level data from 32 hospitals, where nursing units are grouped into three unit-type categories: intensive care, intermediate care, and medical-surgical. Midnight census alone did not adequately account for registered nurse workload intensity associated with patient churn. On average, units were staffed with a mixture of registered nurses and other nursing staff, not always to budgeted levels. Adjusting for patient churn increases nurse staffing across all units and shifts. The discharges and transfers adjustment to midnight census may be useful for adjusting RN staffing on a shift basis to account for patient churn. Nurse managers should understand the implications for nurse workload of the various methods of calculating registered nurse staff requirements. © 2013 John Wiley & Sons Ltd.
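The kind of adjustment discussed, adding an admission/discharge/transfer (ADT) increment to a census-based staffing estimate, reduces to a short calculation; the hours-per-patient-day and per-ADT values below are placeholders, not the study's parameters.

```python
# Placeholder churn adjustment to a census-based nurse staffing estimate.
midnight_census = 28          # patients on the unit at midnight
adt_events = 14               # admissions + discharges + transfers during the shift
hppd = 6.0                    # budgeted nursing hours per patient day (assumed)
hours_per_adt = 1.0           # extra nursing hours per ADT event (assumed)
shift_hours = 8.0

census_hours = midnight_census * hppd * (shift_hours / 24.0)
churn_hours = adt_events * hours_per_adt
rn_needed = (census_hours + churn_hours) / shift_hours
print(f"census-only: {census_hours / shift_hours:.1f} RNs, "
      f"churn-adjusted: {rn_needed:.1f} RNs")
```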
Quantitative EEG patterns of differential in-flight workload
NASA Technical Reports Server (NTRS)
Sterman, M. B.; Mann, C. A.; Kaiser, D. A.
1993-01-01
Four test pilots were instrumented for in-flight EEG recordings using a custom portable recording system. Each flew six two-minute tracking tasks in the Calspan NT-33 experimental trainer at Edwards AFB. With the canopy blacked out, pilots used a HUD to chase a simulated aircraft through a random flight course. Three configurations of flight controls altered the flight characteristics to achieve low, moderate, and high workload, as determined by normative Cooper-Harper ratings. The test protocol was administered by a command pilot in the back seat. Corresponding EEG and tracking data were compared off-line. Tracking performance was measured as deviation from the target aircraft and combined with control difficulty to achieve an estimate of 'cognitive workload'. Trended patterns of parietal EEG activity at 8-12 Hz were sorted according to this classification. In all cases, high workload produced a significantly greater suppression of 8-12 Hz activity than low workload. Further, a clear differentiation of EEG trend patterns was obtained in 80 percent of the cases. High workload produced a sustained suppression of 8-12 Hz activity, while moderate workload resulted in an initial suppression followed by a gradual increment. Low workload was associated with a modulated pattern lacking any periods of marked or sustained suppression. These findings suggest that quantitative analysis of appropriate EEG measures may provide an objective and reliable in-flight index of cognitive effort that could facilitate workload assessment.
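The 8-12 Hz trend analysis described can be approximated by computing band power in short windows with a Welch PSD and tracking its suppression relative to a low-workload reference; the sketch below uses a synthetic signal and an assumed 256 Hz sampling rate.

```python
# Sketch of an 8-12 Hz suppression trend from a single EEG channel (synthetic signal).
import numpy as np
from scipy.signal import welch

fs = 256                                        # assumed sampling rate (Hz)
rng = np.random.default_rng(7)
t = np.arange(0, 120, 1 / fs)                   # two-minute tracking task
alpha_amp = np.linspace(2.0, 0.5, t.size)       # synthetic gradual alpha suppression
eeg = alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)

def band_power(x, lo=8.0, hi=12.0):
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    sel = (f >= lo) & (f <= hi)
    return pxx[sel].sum() * (f[1] - f[0])

win = 10 * fs                                   # 10-second analysis windows
powers = [band_power(eeg[i:i + win]) for i in range(0, t.size - win + 1, win)]
baseline = powers[0]                            # first window as low-workload reference
suppression = [1.0 - p / baseline for p in powers]
print([round(s, 2) for s in suppression])       # larger values = stronger 8-12 Hz suppression
```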
[Do physicians' gender and workload affect patients?].
Finnvold, Jon Erik
2008-10-23
The article discusses the effect of general practitioners' gender and workload on patients' experience with consultation time, waiting hours, use of out-of-hours services and planned health visits. Data were retrieved from the 2003 version of Statistics Norway's household panel study (5000 persons) and the National Insurance administration's register of regular general practitioners. Health condition was the most important factor related to patient experiences. A high workload was neither associated with more frequent use of out-of-hours services nor with satisfaction with time spent in consultation. These results apply to physicians of both genders. Patients who used a female physician with a large workload had to wait longer for an appointment and more often reported dissatisfaction with the waiting time; this was not the case for male physicians. However, male physicians with a low workload had shorter waiting times. Patients who use practitioners with a high workload may have chosen their doctor more deliberately than others, which may explain the few negative outcomes for physicians with a high workload. It is unlikely that these physicians would be as popular if the patients had fewer appointments, shorter consultations or more often had to use the out-of-hours services. Longer waiting times for appointments with female doctors may be related to more part-time work and the fact that female physicians are more often engaged in group practices.
Workload of Team Leaders and Team Members During a Simulated Sepsis Scenario.
Tofil, Nancy M; Lin, Yiqun; Zhong, John; Peterson, Dawn Taylor; White, Marjorie Lee; Grant, Vincent; Grant, David J; Gottesman, Ronald; Sudikoff, Stephanie N; Adler, Mark; Marohn, Kimberly; Davidson, Jennifer; Cheng, Adam
2017-09-01
Crisis resource management principles dictate appropriate distribution of mental and/or physical workload so as not to overwhelm any one team member. Workload during pediatric emergencies is not well studied. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is a multidimensional tool designed to assess workload that has been validated in multiple settings. A score below 40 is defined as low workload, 40-60 as moderate, and above 60 as high workload. Our hypothesis was that workload among both team leaders and team members is moderate to high during a simulated pediatric sepsis scenario and that team leaders would have a higher workload than team members. Multicenter observational study. Nine pediatric simulation centers (five United States, three Canada, and one United Kingdom). Team leaders and team members during a 12-minute pediatric sepsis scenario. National Aeronautics and Space Administration-Task Load Index. One hundred twenty-seven teams were recruited from nine sites. One hundred twenty-seven team leaders and 253 team members completed the NASA-TLX. Team leaders had significantly higher overall workload than team members (51 ± 11 vs 44 ± 13; p < 0.01). Team leaders had higher workloads in all subcategories except performance, where the values were equal, and physical demand, where team members scored higher than team leaders (29 ± 22 vs 18 ± 16; p < 0.01). The highest category for each group was mental demand: 73 ± 13 for team leaders and 60 ± 20 for team members. For team leaders, two categories, mental demand (73 ± 17) and effort (66 ± 16), were in the high-workload range, whereas most domains for team members were at moderate workload levels. Team leaders and team members are under moderate workloads during a pediatric sepsis scenario, with team leaders under high workloads (> 60) in the mental demand and effort subscales. Team leaders averaged significantly higher workloads. Consideration of decreasing
Data-driven optimal binning for respiratory motion management in PET.
Kesner, Adam L; Meier, Joseph G; Burckhardt, Darrell D; Schwartz, Jazmin; Lynch, David A
2018-01-01
Respiratory gating has been used in PET imaging to reduce the amount of image blurring caused by patient motion. Optimal binning is an approach for using the motion-characterized data by binning it into a single, easy to understand and use, optimal bin. To date, optimal binning protocols have utilized externally driven motion characterization strategies that have been tuned with population-derived assumptions and parameters. In this work, we propose a new strategy with which to characterize motion directly from a patient's gated scan, and to use that signal to create a patient/instance-specific optimal bin image. Two hundred and nineteen phase-gated FDG PET scans, acquired using data-driven gating as described previously, were used as the input for this study. For each scan, a phase-amplitude motion characterization was generated and normalized using principal component analysis. A patient-specific "optimal bin" window was derived from this characterization, via methods that mirror traditional optimal window binning strategies. The resulting optimal bin images were validated by correlating quantitative and qualitative measurements in the population of PET scans. In 53% (n = 115) of the image population, the optimal bin was determined to include 100% of the image statistics. In the remaining images, the optimal binning windows averaged 60% of the statistics and ranged between 20% and 90%. Tuning the algorithm, through a single acceptance window parameter, allowed for adjustments of the algorithm's performance in the population toward conservation of motion or reduced noise, enabling users to incorporate their definition of optimal. In the population of images that were deemed appropriate for segregation, average lesion SUVmax values were 7.9, 8.5, and 9.0 for nongated images, optimal bin images, and gated images, respectively. The Pearson correlation of FWHM measurements between optimal bin images and gated images was better than with nongated images, 0.89 and 0
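The optimal-bin idea, trading retained counts against residual motion within a selected gate window, can be illustrated in a few lines; the per-gate displacements and the acceptance window below are hypothetical and do not reproduce the paper's patient-specific characterization.

```python
# Toy optimal-bin selection over respiratory gates (hypothetical motion characterization).
import numpy as np

# Per-gate centroid displacement (mm) and fraction of counts, e.g. from 8 phase gates.
displacement = np.array([0.5, 1.0, 2.5, 5.0, 7.5, 5.5, 3.0, 1.2])
counts = np.full(8, 1.0 / 8)

acceptance_mm = 3.0   # assumed motion tolerance (a single tunable parameter, as in the abstract)

# Choose the contiguous run of gates whose internal displacement range stays within
# the acceptance window while retaining the most counts.
best = (0, 0)
for i in range(8):
    for j in range(i, 8):
        span = displacement[i:j + 1]
        if span.max() - span.min() <= acceptance_mm:
            if counts[i:j + 1].sum() > counts[best[0]:best[1] + 1].sum():
                best = (i, j)

frac = counts[best[0]:best[1] + 1].sum()
print(f"optimal bin: gates {best[0]}-{best[1]}, {frac:.0%} of statistics retained")
```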
Federated data storage and management infrastructure
NASA Astrophysics Data System (ADS)
Zarochentsev, A.; Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Hristov, P.
2016-10-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate a growth of storage needs of at least orders of magnitude; this will require new approaches to data storage organization and data handling. In our project we address the fundamental problem of designing an architecture to integrate distributed heterogeneous disk resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. We have prototyped a federated storage for the Russian T1 and T2 centers located in Moscow, St. Petersburg and Gatchina, as well as a Russian/CERN federation. We have conducted extensive tests of the underlying network infrastructure and storage endpoints with synthetic performance measurement tools as well as with HENP-specific workloads, including ones running on supercomputing platforms, cloud computing and the Grid for the ALICE and ATLAS experiments. We will present our current accomplishments with running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment-wide within national academic facilities for High Energy and Nuclear Physics as well as for other data-intensive science applications, such as bio-informatics.
van den Hombergh, Pieter; Künzi, Beat; Elwyn, Glyn; van Doremalen, Jan; Akkermans, Reinier; Grol, Richard; Wensing, Michel
2009-01-01
Background The impact of high physician workload and job stress on the quality and outcomes of healthcare delivery is not clear. Our study explored whether high workload and job stress were associated with lower performance in general practices in the Netherlands. Methods Secondary analysis of data from 239 general practices, collected in practice visits between 2003 and 2006 in the Netherlands using a comprehensive set of measures of practice management. Data were collected by a practice visitor, a trained non-physician observer, using questionnaires for patients, doctors and staff. For this study we selected five measures of practice performance as outcomes and six measures of GP workload and job stress as predictors. A total of 79 indicators were used out of the 303 available indicators. Random coefficient regression models were applied to examine associations. Results and discussion Workload and job stress are associated with practice performance. Workload: working more hours as a GP was associated with more positive patient experiences of accessibility and availability (b = 0.16). After list size adjustment, practices with more GP-time per patient scored higher on GP care (b = 0.45). When GPs provided more than 20 hours per week per 1000 patients, patients scored over 80% on the Europep questionnaire for quality of GP care. Job stress: high GP job stress was associated with lower accessibility and availability (b = 0.21) and insufficient practice management (b = 0.25). Higher GP commitment and more satisfaction with the job were associated with more prevention and disease management (b = 0.35). Conclusion Providing more time in the practice and more time per patient, and experiencing less job stress, are all associated with perceptions by patients of better care and better practice performance. Workload and job stress should be assessed using list size adjusted data in order to realise better quality of care. Organisational development using this kind of data feedback
Endsley, Patricia
2017-02-01
The purpose of this scoping review was to survey the most recent (5 years) acute care, community health, and mental health nursing workload literature to understand themes and research avenues that may be applicable to school nursing workload research. The search for empirical and nonempirical literature was conducted using search engines such as Google Scholar, PubMed, CINAHL, and Medline. Twenty-nine empirical studies and nine nonempirical articles were selected for inclusion. Themes that emerged consistent with school nurse practice include patient classification systems, environmental factors, assistive personnel, missed nursing care, and nurse satisfaction. School nursing is a public health discipline and population studies are an inherent research priority but may overlook workload variables at the clinical level. School nurses need a consistent method of population assessment, as well as evaluation of appropriate use of assistive personnel and school environment factors. Assessment of tasks not directly related to student care and professional development must also be considered in total workload.
Heavy vehicle driver workload assessment. Task 6, baseline data study
DOT National Transportation Integrated Search
This report covers the sixth in a series of tasks involving the assessment of driver workload in heavy vehicle operation associated with in-cab devices or systems. A review of the overall study was provided by Tijerina, Kantowitz, Kiger, and Rockwell...
Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Arnegard, Ruth J.; Comstock, J. R., Jr.
1991-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of laboratory studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and the freedom to use nonpilot test subjects. Features not found in existing computer-based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies for maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
Kindt, Merel; van den Hout, Marcel; Arntz, Arnoud; Drost, Jolijn
2008-12-01
Ehlers and Clark [(2000). A cognitive model of posttraumatic stress disorder. Behaviour Research and Therapy, 38, 319-345] propose that a predominance of data-driven processing during the trauma predicts subsequent PTSD. We wondered whether, apart from data-driven encoding, sustained data-driven processing after the trauma is also crucial for the development of PTSD. Both hypotheses were tested in two analogue experiments. Experiment 1 demonstrated that relative to conceptually-driven processing (n=20), data-driven processing after the film (n=14), resulted in more intrusions. Experiment 2 demonstrated that relative to the neutral condition (n=24) and the data-driven encoding condition (n=24), conceptual encoding (n=25) reduced suppression of intrusions and a trend emerged for memory fragmentation. The difference between the two encoding styles was due to the beneficial effect of induced conceptual encoding and not to the detrimental effect of data-driven encoding. The data support the viability of the distinction between data-driven/conceptually-driven processing for the understanding of the development of PTSD.
Ontology-Driven Provenance Management in eScience: An Application in Parasite Research
NASA Astrophysics Data System (ADS)
Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.
Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
Fault and Error Latency Under Real Workload: an Experimental Study. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chillarege, Ram
1986-01-01
A practical methodology for the study of fault and error latency is demonstrated under a real workload. This is the first study that measures and quantifies the latency under real workload and fills a major gap in the current understanding of workload-failure relationships. The methodology is based on low level data gathered on a VAX 11/780 during the normal workload conditions of the installation. Fault occurrence is simulated on the data, and the error generation and discovery process is reconstructed to determine latency. The analysis proceeds to combine the low level activity data with high level machine performance data to yield a better understanding of the phenomena. A strong relationship exists between latency and workload and that relationship is quantified. The sampling and reconstruction techniques used are also validated. Error latency in the memory where the operating system resides was studied using data on the physical memory access. Fault latency in the paged section of memory was determined using data from physical memory scans. Error latency in the microcontrol store was studied using data on the microcode access and usage.
Supporting Academic Workloads in Online Learning
ERIC Educational Resources Information Center
Haggerty, Carmel E.
2015-01-01
Academic workloads in online learning are influenced by many variables, the complexity of which makes it difficult to measure academic workloads in isolation. While researching issues associated with academic workloads, professional development stood out as having a substantive impact on academic workloads. Many academics in applied health degrees…
ERIC Educational Resources Information Center
Ames, Michael D.
1976-01-01
A participatory process by which a useful costing and data management system was developed at Chapman College is described. The system summarizes information on instructional workloads, class sizes, and the costs per student credit hour for academic programs. Costs incurred in other areas to support each program are included. (Editor/LBH)
NASA Astrophysics Data System (ADS)
Giuliani, Matteo; Cominola, Andrea; Alshaf, Ahmad; Castelletti, Andrea; Anda, Martin
2016-04-01
The continuous expansion of urban areas worldwide is expected to highly increase residential water demand over the next few years, ultimately challenging the distribution and supply of drinking water. Several studies have recently demonstrated that actions focused only on the water supply side of the problem (e.g., augmenting existing water supply infrastructure) will likely fail to meet future demands, thus calling for the concurrent deployment of effective water demand management strategies (WDMS) to pursue water savings and conservation. However, to be effective, WDMS require a substantial understanding of water consumers' behaviors and consumption patterns at different spatial and temporal resolutions. Retrieving information on users' behaviors, as well as their explanatory and/or causal factors, is key to spotting potential areas for targeting water saving efforts and to designing user-tailored WDMS, such as education campaigns and personalized recommendations. In this work, we contribute a data-driven approach to identify household water users' consumption behavioural profiles and model their water use habits. State-of-the-art clustering methods are coupled with big data machine learning techniques with the aim of extracting dominant behaviors from a set of water consumption data collected at the household scale. This allows identifying heterogeneous groups of consumers from the studied sample and characterizing them with respect to several consumption features. Our approach is validated on a real-world household water consumption dataset associated with a variety of demographic and psychographic user data and household attributes, collected in nine towns of the Pilbara and Kimberley Regions of Western Australia. Results show the effectiveness of the proposed method in capturing the influence of candidate determinants on residential water consumption profiles and in attaining sufficiently accurate predictions of users' consumption behaviors, ultimately providing
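Purely as an illustration of the clustering step described above, the sketch below groups hypothetical household consumption features with k-means; the feature names, sample and algorithm choices are assumptions and not those of the study:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical per-household features (not the study's actual variables):
# mean daily use (L), morning-peak share, evening-peak share, weekend/weekday ratio.
X = rng.normal(loc=[420, 0.3, 0.4, 1.1], scale=[90, 0.08, 0.1, 0.15], size=(200, 4))

X_std = StandardScaler().fit_transform(X)            # put litres and ratios on a common scale
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_std)

for k in range(4):
    members = X[kmeans.labels_ == k]
    print(f"cluster {k}: n={len(members)}, mean daily use={members[:, 0].mean():.0f} L")
```

Standardising the features before clustering keeps the litre-scaled and dimensionless variables from being weighted unequally in the distance computation.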
DataHub: Knowledge-based data management for data discovery
NASA Astrophysics Data System (ADS)
Handley, Thomas H.; Li, Y. Philip
1993-08-01
Currently available database technology is largely designed for business data-processing applications, and seems inadequate for scientific applications. The research described in this paper, the DataHub, will address the issues associated with this shortfall in technology utilization and development. The DataHub development is addressing the key issues in scientific data management of scientific database models and resource sharing in a geographically distributed, multi-disciplinary, science research environment. Thus, the DataHub will be a server between the data suppliers and data consumers to facilitate data exchanges, to assist science data analysis, and to provide a systematic approach for science data management. More specifically, the DataHub's objectives are to provide support for (1) exploratory data analysis (i.e., data-driven analysis); (2) data transformations; (3) data semantics capture and usage; (4) analysis-related knowledge capture and usage; and (5) data discovery, ingestion, and extraction. Applying technologies that vary from deductive databases, semantic data models, data discovery, knowledge representation and inferencing, exploratory data analysis techniques and modern man-machine interfaces, DataHub will provide a prototype, integrated environment to support research scientists' needs in multiple disciplines (i.e., oceanography, geology, and atmospheric science) while addressing the more general science data management issues. Additionally, the DataHub will provide data management services to exploratory data analysis applications such as LinkWinds and NCSA's XIMAGE.
Effects of digital altimetry on pilot workload
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.; Glover, B. J.
1985-01-01
A series of VOR-DME instrument landing approaches was flown in the DC-9 full-workload simulator to compare pilot performance, scan behavior, and workload when using a computer-drum-pointer altimeter (CDPA) and a digital altimeter (DA). Six pilots executed two sets of instrument landing approaches, with a CDPA on one set and a DA on the other set. Pilot scanning parameters, flight performance, and subjective opinion data were evaluated. It is found that the processes of gathering information from the CDPA and the DA are different. The DA requires a higher mental workload than the CDPA for a VOR-DME type landing approach. Mental processing of altitude information after transitioning back to the attitude indicator is more evident with the DA than with the CDPA.
Spikes in acute workload are associated with increased injury risk in elite cricket fast bowlers.
Hulin, Billy T; Gabbett, Tim J; Blanch, Peter; Chapman, Paul; Bailey, David; Orchard, John W
2014-04-01
To determine if the comparison of acute and chronic workload is associated with increased injury risk in elite cricket fast bowlers. Data were collected from 28 fast bowlers who completed a total of 43 individual seasons over a 6-year period. Workloads were estimated by summarising the total number of balls bowled per week (external workload), and by multiplying the session rating of perceived exertion by the session duration (internal workload). One-week data (acute workload), together with 4-week rolling average data (chronic workload), were calculated for external and internal workloads. The size of the acute workload in relation to the chronic workload provided either a negative or positive training-stress balance. A negative training-stress balance was associated with an increased risk of injury in the week after exposure, for internal workload (relative risk (RR)=2.2 (CI 1.91 to 2.53), p=0.009), and external workload (RR=2.1 (CI 1.81 to 2.44), p=0.01). Fast bowlers with an internal workload training-stress balance of greater than 200% had a RR of injury of 4.5 (CI 3.43 to 5.90, p=0.009) compared with those with a training-stress balance between 50% and 99%. Fast bowlers with an external workload training-stress balance of more than 200% had a RR of injury of 3.3 (CI 1.50 to 7.25, p=0.033) in comparison to fast bowlers with an external workload training-stress balance between 50% and 99%. These findings demonstrate that large increases in acute workload are associated with increased injury risk in elite cricket fast bowlers.
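A worked sketch of the workload quantities described above, under one common reading in which the acute workload is the most recent week, the chronic workload is a 4-week rolling average, and the training-stress balance is the acute load expressed as a percentage of the chronic load (the weekly values are hypothetical):

```python
import numpy as np

# Hypothetical weekly internal loads (session RPE x minutes, arbitrary units), oldest first.
weekly_loads = [1800, 2100, 1950, 2200, 4300]     # the last value is the most recent week

acute = weekly_loads[-1]                           # 1-week (acute) workload
chronic = np.mean(weekly_loads[-4:])               # 4-week rolling average (chronic workload)
balance_pct = 100 * acute / chronic                # acute relative to chronic, as a percentage

print(f"acute={acute}, chronic={chronic:.0f}, acute:chronic={balance_pct:.0f}%")
# A week in which the acute load far exceeds the chronic load (e.g. >200%) corresponded,
# in the study's terms, to a markedly higher injury risk in the following week.
```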
Li, Xiangyu; Xie, Nijie; Tian, Xinyue
2017-01-01
This paper proposes a scheduling and power management solution for an energy-harvesting heterogeneous multi-core WSN node SoC such that the system continues to operate perennially and uses the harvested energy efficiently. The solution consists of a heterogeneous multi-core system oriented task scheduling algorithm and a low-complexity dynamic workload scaling and configuration optimization algorithm suitable for light-weight platforms. Moreover, considering that the power consumption of most WSN applications is data dependent, we introduce a branch-handling mechanism into the solution as well. The experimental result shows that the proposed algorithm can operate in real-time on a lightweight embedded processor (MSP430), and that it can enable a system to do more valuable work and use more than 99.9% of the power budget. PMID:28208730
Workload - An examination of the concept
NASA Technical Reports Server (NTRS)
Gopher, Daniel; Donchin, Emanuel
1986-01-01
The relations between task difficulty and workload and workload and performance are examined. The architecture and limitations of the central processor are discussed. Various procedures for measuring workload are described and evaluated. Consideration is given to normative and descriptive approaches; subjective, performance, and arousal measures; performance operating characteristics; and psychophysiological measures of workload.
Cardiac-Activity Measures for Assessing Airport Ramp-Tower Controller's Workload
NASA Technical Reports Server (NTRS)
Hayashi, Miwa; Dulchinos, Victoria L.
2016-01-01
Heart rate (HR) and heart rate variability (HRV) potentially offer objective, continuous, and non-intrusive measures of human-operator's mental workload. Such measurement capability is attractive for workload assessment in complex laboratory simulations or safety-critical field testing. The present study compares mean HR and HRV data with self-reported subjective workload ratings collected during a high-fidelity human-in-the-loop simulation of airport ramp traffic control operations, which involve complex cognitive and coordination tasks. Mean HR was found to be weakly sensitive to the workload ratings, while HRV was not sensitive or even contradictory to the assumptions. Until more knowledge on stress response mechanisms of the autonomic nervous system is obtained, it is recommended that these cardiac-activity measures be used with other workload assessment tools, such as subjective measures.
ERIC Educational Resources Information Center
LaFee, Scott
2002-01-01
Describes the use of data-driven decision-making in four school districts: Plainfield Public Schools, Plainfield, New Jersey; Palo Alto Unified School District, Palo Alto, California; Francis Howell School District in eastern Missouri, northwest of St. Louis; and Rio Rancho Public Schools, near Albuquerque, New Mexico. Includes interviews with the…
ERIC Educational Resources Information Center
Swan, Gerry; Mazur, Joan
2011-01-01
Although the term data-driven decision making (DDDM) is relatively new (Moss, 2007), the underlying concept of DDDM is not. For example, the practices of formative assessment and computer-managed instruction have historically involved the use of student performance data to guide what happens next in the instructional sequence (Morrison, Kemp, &…
Workload and Stress in New Zealand Universities.
ERIC Educational Resources Information Center
Boyd, Sally; Wylie, Cathy
This study examined the workloads of academic, general, support, library, and technical staff of New Zealand universities. It focused on current levels of workload, changes in workload levels and content, connections between workload and stress, and staff attitudes towards the effects of workload changes and educational reforms on the quality of…
Data-Driven and Expectation-Driven Discovery of Empirical Laws.
1982-10-10
Langley, Patrick W.; Bradshaw, Gary L.; Simon, Herbert A.; The Robotics Institute, Carnegie-Mellon University, Pittsburgh, Pennsylvania. Interim Report, 2/82-10/82. [Only fragments of the abstract survive extraction: "...occurred in small integer proportions to each other. In 1809, Joseph Gay-Lussac found evidence for his law of combining volumes, which stated that a..."]
Physician activity during outpatient visits and subjective workload.
Calvitti, Alan; Hochheiser, Harry; Ashfaq, Shazia; Bell, Kristin; Chen, Yunan; El Kareh, Robert; Gabuzda, Mark T; Liu, Lin; Mortensen, Sara; Pandey, Braj; Rick, Steven; Street, Richard L; Weibel, Nadir; Weir, Charlene; Agha, Zia
2017-05-01
We describe methods for capturing and analyzing EHR use and clinical workflow of physicians during outpatient encounters and relating activity to physicians' self-reported workload. We collected temporally-resolved activity data including audio, video, EHR activity, and eye-gaze along with post-visit assessments of workload. These data are then analyzed through a combination of manual content analysis and computational techniques to temporally align streams, providing a range of process measures of EHR usage, clinical workflow, and physician-patient communication. Data were collected from primary care and specialty clinics at the Veterans Administration San Diego Healthcare System and UCSD Health, which use the Electronic Health Record (EHR) platforms CPRS and Epic, respectively. Grouping visit activity by physician, site, specialty, and patient status enables rank-ordering activity factors by their correlation to physicians' subjective workload as captured by the NASA Task Load Index survey. We developed a coding scheme that enabled us to compare timing studies between CPRS and Epic and extract patient and visit complexity profiles. We identified similar patterns of EHR use and navigation at the 2 sites despite differences in functions, user interfaces and consequent coded representations. Both sites displayed similar proportions of EHR function use and navigation, and distribution of visit length, proportion of time physicians attended to EHRs (gaze), and subjective workload as measured by the task load survey. We found that visit activity was highly variable across individual physicians, and the observed activity metrics ranged widely as correlates to subjective workload. We discuss implications of our study for methodology, clinical workflow and EHR redesign. Copyright © 2017 Elsevier Inc. All rights reserved.
Krska, Janet; Palmer, Sharon; Dalzell-Brown, Annette; Nicholl, Pat
2013-07-01
To determine Citizen's Advice Bureaux (CAB) and general practice staff perceptions on the impact of a CAB Health Outreach (CABHO) service on staff workload. To quantify the frequency of mental health issues among patients referred to the CABHO service. To measure any impact of the CABHO service on appointments, referrals and prescribing for mental health. GPs and practice managers perceive that welfare rights services, provided by CAB, reduce practice staff workload, but this has not been quantified. Interviews with practice managers and GPs hosting and CAB staff providing an advisory service in nine general practices. Comparison of frequency of GP and nurse appointments, mental health referrals and prescriptions for hypnotics/anxiolytics and antidepressants issued before and after referral to the CABHO service, obtained from medical records of referred patients. Most GPs and CAB staff perceived the service reduced practice staff workload, although practice managers were less certain. CAB staff believed that many patients referred to them had mental health issues. Data were obtained for 148/250 referrals of whom 46% may have had a mental health issue. There were statistically significant reductions in the number of GP appointments and prescriptions for hypnotics/anxiolytics during the six months after referral to CABHO compared with six months before. There were also non-significant reductions in nurse appointments and prescriptions for antidepressants, but no change in appointments or referrals for mental health problems. The quantitative findings therefore confirmed perceptions among both CAB and practice staff of reduced workload and in addition suggest that prescribing may be reduced, although further larger-scale studies are required to confirm this.
Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem
NASA Astrophysics Data System (ADS)
Doyle, R. J.; Crichton, D.
2017-12-01
NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. Advances in data science offer
Prediction of Traffic Complexity and Controller Workload in Mixed Equipage NextGen Environments
NASA Technical Reports Server (NTRS)
Lee, Paul U.; Prevot, Thomas
2012-01-01
Controller workload is a key factor in limiting en route air traffic capacity. Past efforts to quantify and predict workload have resulted in identifying objective metrics that correlate well with subjective workload ratings during current air traffic control operations. Although these metrics provide a reasonable statistical fit to existing data, they do not provide a good mechanism for estimating controller workload for future air traffic concepts and environments that make different assumptions about automation, enabling technologies, and controller tasks. One such future environment is characterized by en route airspace with a mixture of aircraft equipped with and without Data Communications (Data Comm). In this environment, aircraft with Data Comm will impact controller workload less than aircraft requiring voice communication, altering the close correlation between aircraft count and controller workload that exists in current air traffic operations. This paper outlines a new trajectory-based complexity (TBX) calculation that was presented to controllers during a human-in-the-loop simulation. The results showed that TBX accurately estimated the workload in a mixed Data Comm equipage environment and the resulting complexity values were understood and readily interpreted by the controllers. The complexity was represented as a "modified aircraft count" that weighted different complexity factors and summed them in such a way that the controllers could effectively treat them as aircraft count. The factors were also relatively easy to tune without an extensive data set. The results showed that the TBX approach is well suited for presenting traffic complexity in future air traffic environments.
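The "modified aircraft count" idea can be sketched as a weighted sum of complexity factors added to the nominal aircraft count; the factor names and weights below are invented for illustration and are not the TBX weights used in the simulation:

```python
# Hypothetical complexity factors and weights; the real TBX factors and weights were tuned with controllers.
WEIGHTS = {"unequipped_aircraft": 0.6, "climb_descent": 0.3, "proximity_pairs": 0.8, "weather_cells": 1.5}

def tbx_complexity(aircraft_count, factors, weights=WEIGHTS):
    """Return a 'modified aircraft count': the nominal count plus weighted complexity factors."""
    return aircraft_count + sum(weights[name] * value for name, value in factors.items())

sector = {"unequipped_aircraft": 5, "climb_descent": 4, "proximity_pairs": 2, "weather_cells": 1}
print(f"TBX complexity = {tbx_complexity(14, sector):.1f} equivalent aircraft")
```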
Operator Workload: Comprehensive Review and Evaluation of Operator Workload Methodologies
1989-06-01
checking for system failures or emergency conditions. It seems fair to characterize the changes in operator functions as more mental or cognitive in nature ...that the operator, the system hardware, and the environment all interact in affecting performance and this interaction can change the nature of the task...(a) classifying the nature of the operator tasks and (b) classifying workload assessment techniques. Task taxonomies are useful because some workload
A computerized multidimensional measurement of mental workload via handwriting analysis.
Luria, Gil; Rosenblum, Sara
2012-06-01
The goal of this study was to test the effect of mental workload on handwriting behavior and to identify characteristics of low versus high mental workload in handwriting. We hypothesized differences between handwriting under three different load conditions and tried to establish a profile that integrated these indicators. Fifty-six participants wrote three numerical progressions of varying difficulty on a digitizer attached to a computer so that we could evaluate their handwriting behavior. Differences were found in temporal, spatial, and angular velocity handwriting measures, but no significant differences were found for pressure measures. Using data reduction, we identified three clusters of handwriting, two of which differentiated well according to the three mental workload conditions. We concluded that handwriting behavior is affected by mental workload and that each measure provides distinct information, so that they present a comprehensive indicator of mental workload.
A simulator study of the interaction of pilot workload with errors, vigilance, and decisions
NASA Technical Reports Server (NTRS)
Smith, H. P. R.
1979-01-01
A full mission simulation of a civil air transport scenario that had two levels of workload was used to observe the actions of the crews and the basic aircraft parameters and to record heart rates. The results showed that the number of errors was very variable among crews but the mean increased in the higher workload case. The increase in errors was not related to rise in heart rate but was associated with vigilance times as well as the days since the last flight. The recorded data also made it possible to investigate decision time and decision order. These also varied among crews and seemed related to the ability of captains to manage the resources available to them on the flight deck.
The multi-attribute task battery for human operator workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Comstock, J. Raymond, Jr.; Arnegard, Ruth J.
1992-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
A user-oriented synthetic workload generator
NASA Technical Reports Server (NTRS)
Kao, Wei-Lun
1991-01-01
A user-oriented synthetic workload generator that simulates users' file access behavior based on real workload characterization is described. The model for this workload generator is user-oriented and job-specific, represents file I/O operations at the system call level, allows general distributions for the usage measures, and assumes independence in the file I/O operation stream. The workload generator consists of three parts, which handle specification of distributions, creation of an initial file system, and selection and execution of file I/O operations. Experiments on SUN NFS are shown to demonstrate the usage of the workload generator.
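A toy version of such a generator, with a made-up operation mix and size distribution (not the thesis's parameters), might look like this:

```python
import random

random.seed(42)

# Hypothetical operation mix; the actual generator takes these distributions as user-specified input.
OPS = ["open", "read", "write", "close"]
OP_WEIGHTS = [0.15, 0.45, 0.25, 0.15]

def make_file_system(n_files=20):
    """Create an initial set of files for the trace to operate on."""
    return [f"/tmp/synth_{i}.dat" for i in range(n_files)]

def generate_trace(files, n_ops=10):
    """Draw independent file-I/O operations at the system-call level from the configured distributions."""
    trace = []
    for _ in range(n_ops):
        op = random.choices(OPS, weights=OP_WEIGHTS)[0]
        entry = {"op": op, "file": random.choice(files)}
        if op in ("read", "write"):
            entry["bytes"] = int(random.lognormvariate(8, 1))   # skewed request sizes
        trace.append(entry)
    return trace

for entry in generate_trace(make_file_system()):
    print(entry)
```

Drawing each operation independently mirrors the stated independence assumption for the file I/O operation stream.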
Memory and subjective workload assessment
NASA Technical Reports Server (NTRS)
Staveland, L.; Hart, S.; Yeh, Y. Y.
1986-01-01
Recent research suggested that subjective introspection of workload is not based upon specific retrieval of information from long term memory, and only reflects the average workload that is imposed upon the human operator by a particular task. These findings are based upon global ratings of workload for the overall task, suggesting that subjective ratings are limited in ability to retrieve specific details of a task from long term memory. To clarify the limits memory imposes on subjective workload assessment, the difficulty of task segments was varied and the workload of specified segments was retrospectively rated. The ratings were retrospectively collected on the manipulations of three levels of segment difficulty. Subjects were assigned to one of two memory groups. In the Before group, subjects knew before performing a block of trials which segment to rate. In the After group, subjects did not know which segment to rate until after performing the block of trials. The subjective ratings, RTs (reaction times), and MTs (movement times) were compared for within-group and between-group differences. Performance measures and subjective evaluations of workload reflected the experimental manipulations. Subjects were sensitive to different difficulty levels, and recalled the average workload of task components. Cueing did not appear to help recall, and memory group differences possibly reflected variations in the groups of subjects, or an additional memory task.
Debriefing decreases mental workload in surgical crisis: A randomized controlled trial.
Boet, Sylvain; Sharma, Bharat; Pigford, Ashlee-Ann; Hladkowicz, Emily; Rittenhouse, Neil; Grantcharov, Teodor
2017-05-01
Mental workload is the amount of mental effort involved in performing a particular task. Crisis situations may increase mental workload, which can subsequently negatively impact operative performance and patient safety. This study aims to measure the impact of learning through debriefing and a systematic approach to crisis on trainees' mental workload in a simulated surgical crisis. Twenty junior surgical residents participated in a high-fidelity, simulated, postoperative crisis in a surgical ward environment (pretest). Participants were randomized to either an instructor-led debriefing, including performance feedback (intervention; n = 10) or no debriefing (control; n = 10). Subjects then immediately managed a second simulated crisis (post-test). Mental workload was assessed in real time during the scenarios using a previously validated, wireless, vibrotactile device. Mental workload was represented by subject response times to the vibrations, which were recorded and analyzed using the Mann-Whitney U test. Participants in the debriefing arm had a significantly reduced median response time in milliseconds (post-test minus pretest -695, quartile range -2,136 to -297) compared to participants in the control arm (42, -1,191 to 763), (between-arm difference P = .049). Debriefing after simulated surgical crisis situations may improve performance by decreasing trainee's mental workload during a subsequent simulated surgical crisis. Copyright © 2016 Elsevier Inc. All rights reserved.
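The between-arm comparison of response-time changes can be reproduced in outline with SciPy's Mann-Whitney U test; the per-participant values below are hypothetical and only loosely shaped around the reported medians and quartiles:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical per-participant changes in median vibrotactile response time (ms),
# post-test minus pretest; negative values mean faster responses, i.e. lower mental workload.
debrief = np.array([-695, -2136, -297, -820, -410, -1500, -350, -900, -600, -1200])
control = np.array([42, -1191, 763, 120, -300, 500, -80, 200, 650, -150])

stat, p = mannwhitneyu(debrief, control, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.3f}")
```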
Training and testing ERP-BCIs under different mental workload conditions
NASA Astrophysics Data System (ADS)
Ke, Yufeng; Wang, Peiyuan; Chen, Yuqian; Gu, Bin; Qi, Hongzhi; Zhou, Peng; Ming, Dong
2016-02-01
Objective. As one of the most popular and extensively studied paradigms of brain-computer interfaces (BCIs), event-related potential-based BCIs (ERP-BCIs) are usually built and tested in ideal laboratory settings in most existing studies, with subjects concentrating on stimuli and intentionally avoiding possible distractors. This study is aimed at examining the effect of simultaneous mental activities on ERP-BCIs by manipulating various levels of mental workload during the training and/or testing of an ERP-BCI. Approach. Mental workload was manipulated during the training or testing of a row-column P300-speller to investigate how and to what extent the spelling performance and the ERPs evoked by the oddball stimuli are affected by simultaneous mental workload. Main results. Responses of certain ERP components (temporal-occipital N200 and the late reorienting negativity) evoked by the oddball stimuli, and the classifiability of ERP features between targets and non-targets, decreased as the mental workload encountered by the subject increased. However, the effect of mental workload on the performance of an ERP-BCI was not always negative but depended on the conditions under which the ERP-BCI was built and applied. The performance of an ERP-BCI built in an ideal lab setting without any irrelevant mental activities declined with the increasing mental workload of the testing data. However, the performance was significantly improved when an ERP-BCI was built under an appropriate mental workload level, compared to that built under speller-only conditions. Significance. The adverse effect of concurrent mental activities may present a challenge for ERP-BCIs trained in ideal lab settings but which are to be used in daily work, especially when users are performing demanding mental processing. On the other hand, the positive effects of the mental workload of the training data suggest that introducing appropriate mental workload during training ERP-BCIs is of potential benefit to the
Training and subjective workload in a category search task
NASA Technical Reports Server (NTRS)
Vidulich, Michael A.; Pandit, Parimal
1986-01-01
This study examined automaticity as a means by which training influences mental workload. Two groups were trained in a category search task. One group received a training paradigm designed to promote the development of automaticity; the other group received a training paradigm designed to prohibit it. Resultant performance data showed the expected improvement as a result of the development of automaticity. Subjective workload assessments mirrored the performance results in most respects. The results supported the position that subjective mental workload assessments may be sensitive to the effect of training when it produces a lower level of cognitive load.
Monitoring Workload in Throwing-Dominant Sports: A Systematic Review.
Black, Georgia M; Gabbett, Tim J; Cole, Michael H; Naughton, Geraldine
2016-10-01
used to monitor workload and purposes for monitoring workload, encompassing the relationship between workload and injury, individual responses to workloads, the effect of workload on subsequent performance and the future directions of workload-monitoring techniques. This systematic review highlighted a number of simple and effective workload-monitoring techniques implemented across a variety of throwing-dominant sports. The current literature placed an emphasis on the relationship between workload and injury. However, due to differences in chronological and training age, inconsistent injury definitions and time frames used for monitoring, injury thresholds remain unclear in throwing-dominant sports. Furthermore, although research has examined total workload, the intensity of workload is often neglected. Additional research on the reliability of self-reported workload data is also required to validate existing relationships between workload and injury. Considering the existing disparity within the literature, it is likely that throwing-dominant sports would benefit from the development of an automated monitoring tool to objectively assess throwing-related workloads in conjunction with well-established internal measures of load in athletes.
Measuring perceived mental workload in children.
Laurie-Rose, Cynthia; Frey, Meredith; Ennis, Aristi; Zamary, Amanda
2014-01-01
Little is known about the mental workload, or psychological costs, associated with information processing tasks in children. We adapted the highly regarded NASA Task Load Index (NASA-TLX) multidimensional workload scale (Hart & Staveland, 1988) to test its efficacy for use with elementary school children. We developed 2 types of tasks, each with 2 levels of demand, to draw differentially on resources from the separate subscales of workload. In Experiment 1, our participants were both typical and school-labeled gifted children recruited from 4th and 5th grades. Results revealed that task type elicited different workload profiles, and task demand directly affected the children's experience of workload. In general, gifted children experienced less workload than typical children. Objective response time and accuracy measures provide evidence for the criterion validity of the workload ratings. In Experiment 2, we applied the same method with 1st- and 2nd-grade children. Findings from Experiment 2 paralleled those of Experiment 1 and support the use of NASA-TLX with even the youngest elementary school children. These findings contribute to the fledgling field of educational ergonomics and attest to the innovative application of workload research. Such research may optimize instructional techniques and identify children at risk for experiencing overload.
Physiological Parameter Response to Variation of Mental Workload.
Marinescu, Adrian Cornelius; Sharples, Sarah; Ritchie, Alastair Campbell; Sánchez López, Tomas; McDowell, Michael; Morvan, Hervé P
2018-02-01
To examine the relationship between experienced mental workload and physiological response by noninvasive monitoring of physiological parameters. Previous studies have examined how individual physiological measures respond to changes in mental demand and subjective reports of workload. This study explores the response of multiple physiological parameters and quantifies their added value when estimating the level of demand. The study presented was conducted in laboratory conditions and required participants to perform a visual-motor task that imposed varying levels of demand. The data collected consisted of physiological measurements (heart interbeat intervals, breathing rate, pupil diameter, facial thermography), subjective ratings of workload (Instantaneous Self-Assessment Workload Scale [ISA] and NASA-Task Load Index), and the performance. Facial thermography and pupil diameter were demonstrated to be good candidates for noninvasive workload measurements: For seven out of 10 participants, pupil diameter showed a strong correlation ( R values between .61 and .79 at a significance value of .01) with mean ISA normalized values. Facial thermography measures added on average 47.7% to the amount of variability in task performance explained by a regression model. As with the ISA ratings, the relationship between the physiological measures and performance showed strong interparticipant differences, with some individuals demonstrating a much stronger relationship between workload and performance measures than others. The results presented in this paper demonstrate that physiological measures such as facial thermography and pupil diameter can be used for noninvasive real-time measurement of workload. The methods presented in this article, with current technological capabilities, are better suited for workplaces where the person is seated, offering the possibility of being applied to pilots and air traffic controllers.
ERIC Educational Resources Information Center
Vardi, Iris
2009-01-01
Increasing demands on academic work have resulted in many academics working long hours and expressing dissatisfaction with their working life. These concerns have led to a number of faculties and universities adopting workload allocation models to improve satisfaction and better manage workloads. This paper reports on a study which examined the…
NASA TLX: software for assessing subjective mental workload.
Cao, Alex; Chintamani, Keshav K; Pandya, Abhilash K; Ellis, R Darin
2009-02-01
The NASA Task Load Index (TLX) is a popular technique for measuring subjective mental workload. It relies on a multidimensional construct to derive an overall workload score based on a weighted average of ratings on six subscales: mental demand, physical demand, temporal demand, performance, effort, and frustration level. A program for implementing a computerized version of the NASA TLX is described. The software version assists in simplifying collection, postprocessing, and storage of raw data. The program collects raw data from the subject and calculates the weighted (or unweighted) workload score, which is output to a text file. The program can also be tailored to a specific experiment using a simple input text file, if desired. The program was designed in Visual Studio 2005 and is capable of running on a Pocket PC with Windows CE or on a PC with Windows 2000 or higher. The NASA TLX program is available for free download.
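A hedged sketch of the weighted-average computation that such a program implements (this follows the standard TLX procedure; the ratings and pairwise-comparison weights below are invented):

```python
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_score(ratings, weights=None):
    """
    ratings: dict of 0-100 ratings on the six TLX subscales.
    weights: dict of pairwise-comparison tallies (0-5 per subscale, summing to 15).
             If omitted, the unweighted ("raw TLX") mean is returned instead.
    """
    if weights is None:
        return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)
    assert sum(weights.values()) == 15, "the standard procedure uses 15 pairwise comparisons"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

# Hypothetical ratings and weights for one subject and task.
ratings = {"mental": 70, "physical": 20, "temporal": 60, "performance": 40, "effort": 65, "frustration": 35}
weights = {"mental": 5, "physical": 0, "temporal": 3, "performance": 2, "effort": 4, "frustration": 1}
print(f"weighted TLX = {tlx_score(ratings, weights):.1f}, raw TLX = {tlx_score(ratings):.1f}")
```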
Popularity Prediction Tool for ATLAS Distributed Data Management
NASA Astrophysics Data System (ADS)
Beermann, T.; Maettig, P.; Stewart, G.; Lassnig, M.; Garonne, V.; Barisits, M.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration
2014-06-01
This paper describes a popularity prediction tool for data-intensive data management systems, such as ATLAS distributed data management (DDM). It is fed by the DDM popularity system, which produces historical reports about ATLAS data usage, providing information about files, datasets, users and sites where data was accessed. The tool described in this contribution uses this historical information to make a prediction about the future popularity of data. It finds trends in the usage of data using a set of neural networks and a set of input parameters and predicts the number of accesses in the near-term future. This information can then be used in a second step to improve the distribution of replicas at sites, taking into account the cost of creating new replicas (bandwidth and load on the storage system) compared to the gain of having new ones (faster access to data for analysis). To evaluate the benefit of the redistribution, a grid simulator is introduced that is able to replay real workloads on different data distributions. This article describes the popularity prediction method and the simulator that is used to evaluate the redistribution.
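In spirit, the prediction step maps a window of recent access counts to the expected accesses in the near-term future. The sketch below trains a small neural network on synthetic weekly counts; it is an illustration only, not the DDM tool's actual features, network architecture or data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical weekly access counts for one dataset: a decaying trend plus noise.
weeks = np.arange(60)
accesses = 800 * np.exp(-weeks / 20) + rng.poisson(30, size=60)

# Sliding-window training pairs: the last 4 weekly counts predict the next week's count.
X = np.array([accesses[i:i + 4] for i in range(len(accesses) - 4)])
y = accesses[4:]

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)).fit(X, y)
next_week = model.predict(accesses[-4:].reshape(1, -1))[0]
print(f"predicted accesses next week: {next_week:.0f}")
```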
From trees to forest: relational complexity network and workload of air traffic controllers.
Zhang, Jingyu; Yang, Jiazhong; Wu, Changxu
2015-01-01
In this paper, we propose a relational complexity (RC) network framework based on RC metric and network theory to model controllers' workload in conflict detection and resolution. We suggest that, at the sector level, air traffic showing a centralised network pattern can provide cognitive benefits in visual search and resolution decision which will in turn result in lower workload. We found that the network centralisation index can account for more variance in predicting perceived workload and task completion time in both a static conflict detection task (Study 1) and a dynamic one (Study 2) in addition to other aircraft-level and pair-level factors. This finding suggests that linear combination of aircraft-level or dyad-level information may not be adequate and the global-pattern-based index is necessary. Theoretical and practical implications of using this framework to improve future workload modelling and management are discussed. We propose a RC network framework to model the workload of air traffic controllers. The effect of network centralisation was examined in both a static conflict detection task and a dynamic one. Network centralisation was predictive of perceived workload and task completion time over and above other control variables.
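One way to compute a network centralisation index of the kind described here is Freeman's degree centralisation, which is 1 for a star-like (centralised) pattern and near 0 for a spread-out one. A sketch with hypothetical conflict networks (this is a generic index, not necessarily the exact formulation used by the authors):

```python
import networkx as nx

def degree_centralisation(G):
    """Freeman degree centralisation of an undirected graph (1.0 for a star, lower for dispersed patterns)."""
    n = G.number_of_nodes()
    degrees = [d for _, d in G.degree()]
    max_d = max(degrees)
    return sum(max_d - d for d in degrees) / ((n - 1) * (n - 2))

# Hypothetical conflict networks: nodes are aircraft, edges are proximity/conflict relations.
star = nx.star_graph(5)       # one central aircraft in potential conflict with all the others
chain = nx.path_graph(6)      # conflicts spread along a stream of aircraft
print(f"centralised pattern:   {degree_centralisation(star):.2f}")
print(f"decentralised pattern: {degree_centralisation(chain):.2f}")
```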
CHROMagar Orientation Medium Reduces Urine Culture Workload
Manickam, Kanchana; Karlowsky, James A.; Adam, Heather; Lagacé-Wiens, Philippe R. S.; Rendina, Assunta; Pang, Paulette; Murray, Brenda-Lee
2013-01-01
Microbiology laboratories continually strive to streamline and improve their urine culture algorithms because of the high volumes of urine specimens they receive and the modest numbers of those specimens that are ultimately considered clinically significant. In the current study, we quantitatively measured the impact of the introduction of CHROMagar Orientation (CO) medium into routine use in two hospital laboratories and compared it to conventional culture on blood and MacConkey agars. Based on data extracted from our Laboratory Information System from 2006 to 2011, the use of CO medium resulted in a 28% reduction in workload for additional procedures such as Gram stains, subcultures, identification panels, agglutination tests, and biochemical tests. The average number of workload units (one workload unit equals 1 min of hands-on labor) per urine specimen was significantly reduced (P < 0.0001; 95% confidence interval [CI], 0.5326 to 1.047) from 2.67 in 2006 (preimplementation of CO medium) to 1.88 in 2011 (postimplementation of CO medium). We conclude that the use of CO medium streamlined the urine culture process and increased bench throughput by reducing both workload and turnaround time in our laboratories. PMID:23363839
An attempt to estimate students' workload.
Pogacnik, M; Juznic, P; Kosorok-Drobnic, M; Pogacnik, A; Cestnik, V; Kogovsek, J; Pestevsek, U; Fernandes, Tito
2004-01-01
Following the recent introduction of the European Credit Transfer System (ECTS) into several European university programs, a new interest has developed in determining students' workload. ECTS credits are numerical values describing the student workload required to complete course units; ECTS has the potential to facilitate comparison and create transparency between institutional curricula. ECTS credits are frequently listed alongside institutional credits in course outlines and module summaries. Measuring student workload has been difficult; to a large extent, estimates are based only upon anecdotal and casual information. To gather more systematic information, we asked students at the Veterinary Faculty, University of Ljubljana, to estimate the actual total workload they committed to fulfill their coursework obligations for specific subjects in the veterinary degree program by reporting their attendance at defined contact hours and their estimated time for outside study, including the time required for examinations and other activities. Students also reported the final grades they received for these subjects. The results show that certain courses require much more work than others, independent of credit unit assignment. Generally, the courses with more contact hours tend also to demand more independent work; the best predictor of both actual student workload and student success is the amount of contact time in which they participate. The data failed to show any strong connection between students' total workload and grades they received; rather, they showed some evidence that regular presence at contact hours was the most positive influence on grades. Less frequent presence at lectures tended to indicate less time spent on independent study. It was also found that pre-clinical and clinical courses tended to require more work from students than other, more general subjects. While the present study does not provide conclusive evidence, it does indicate the need for
Schnelle, John F; Schroyer, L Dale; Saraf, Avantika A; Simmons, Sandra F
2016-11-01
Nursing aides provide most of the labor-intensive activities of daily living (ADL) care to nursing home (NH) residents. Currently, most NHs do not determine nurse aide staffing requirements based on the time to provide ADL care for their unique resident population. The lack of an objective method to determine nurse aide staffing requirements suggests that many NHs could be understaffed in their capacity to provide consistent ADL care to all residents in need. Discrete event simulation (DES) mathematically models key work parameters (eg, time to provide an episode of care and available staff) to predict the ability of the work setting to provide care over time and offers an objective method to determine nurse aide staffing needs in NHs. This study had 2 primary objectives: (1) to describe the relationship between ADL workload and the level of nurse aide staffing reported by NHs; and, (2) to use a DES model to determine the relationship between ADL workload and nurse aide staffing necessary for consistent, timely ADL care. Minimum Data Set data related to the level of dependency on staff for ADL care for residents in over 13,500 NHs nationwide were converted into 7 workload categories that captured 98% of all residents. In addition, data related to the time to provide care for the ADLs within each workload category was used to calculate a workload score for each facility. The correlation between workload and reported nurse aide staffing levels was calculated to determine the association between staffing reported by NHs and workload. Simulations to project staffing requirements necessary to provide ADL care were then conducted for 65 different workload scenarios, which included 13 different nurse aide staffing levels (ranging from 1.6 to 4.0 total hours per resident day) and 5 different workload percentiles (ranging from the 5th to the 95th percentile). The purpose of the simulation model was to determine the staffing necessary to provide care within each workload
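A stripped-down discrete event simulation in the spirit described above: hypothetical ADL care times per workload category, a fixed shift length, and a count of residents whose care cannot be completed at a given nurse aide staffing level (the numbers are illustrative, not MDS-derived):

```python
import heapq

# Hypothetical minutes of morning ADL care per resident by workload category (not MDS-derived values).
CARE_MIN = {"low": 10, "medium": 25, "high": 45}
residents = ["low"] * 25 + ["medium"] * 20 + ["high"] * 10
SHIFT_MIN = 7.5 * 60                               # one nurse aide shift, in minutes

def simulate(n_aides):
    """Each care episode goes to the aide who frees up first; count episodes that cannot finish in the shift."""
    aides = [0.0] * n_aides                        # minute at which each aide next becomes free
    heapq.heapify(aides)
    undone = 0
    for cat in sorted(residents, key=lambda c: -CARE_MIN[c]):   # schedule the longest episodes first
        start = heapq.heappop(aides)
        finish = start + CARE_MIN[cat]
        if finish > SHIFT_MIN:
            undone += 1
            heapq.heappush(aides, start)           # episode skipped; the aide's schedule is unchanged
        else:
            heapq.heappush(aides, finish)
    return undone

for n in (2, 3, 4):
    print(f"{n} aides -> {simulate(n)} residents without completed ADL care")
```

Running the same resident mix at several staffing levels is what lets the model locate the point at which care can be delivered consistently.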
Low External Workloads Are Related to Higher Injury Risk in Professional Male Basketball Games
Caparrós, Toni; Casals, Martí; Solana, Álvaro; Peña, Javier
2018-01-01
The primary purpose of this study was to identify potential risk factors for sports injuries in professional basketball. An observational retrospective cohort study involving a male professional basketball team, using game tracking data, was conducted during three consecutive seasons. Thirty-three professional basketball players took part in this study. A total of 29 time-loss injuries were recorded during regular season games, accounting for 244 total missed games with a mean of 16.26 ± 15.21 per player and season. The tracking data included the following variables: minutes played, physiological load, physiological intensity, mechanical load, mechanical intensity, distance covered, walking maximal speed, maximal speed, sprinting maximal speed, maximal speed, average offensive speed, average defensive speed, level one acceleration, level two acceleration, level three acceleration, level four acceleration, level one deceleration, level two deceleration, level three deceleration, level four deceleration, player efficiency rating and usage percentage. The influence of demographic characteristics, tracking data and performance factors on the risk of injury was investigated using multivariate analysis with their incidence rate ratios (IRRs). Athletes with 3 or fewer decelerations per game (IRR, 4.36; 95% CI, 1.78-10.6) and those running 1.3 miles or less per game (lower workload) (IRR, 6.42; 95% CI, 2.52-16.3) had a higher risk of injury during games (p < 0.01 in both cases). Therefore, unloaded players have a higher risk of injury. Adequate management of training loads might be a relevant factor to reduce the likelihood of injury according to individual profiles. Key points: The number of decelerations and the total distance can be considered risk factors for injuries in professional basketball players. Unloaded players have a greater risk of injury compared to players with higher accumulated external workload. Workload management should be
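The incidence rate ratios reported above compare injury rates between exposure groups. A minimal sketch of the standard Poisson-based IRR and its Wald confidence interval, using hypothetical injury counts and player-game exposures (not the study's data):

```python
import math

def irr(inj_a, exp_a, inj_b, exp_b, z=1.96):
    """Incidence rate ratio of group A vs group B with a Wald log-scale 95% confidence interval."""
    ratio = (inj_a / exp_a) / (inj_b / exp_b)
    se = math.sqrt(1 / inj_a + 1 / inj_b)          # SE of log(IRR) for Poisson counts
    lo, hi = ratio * math.exp(-z * se), ratio * math.exp(z * se)
    return ratio, lo, hi

# Hypothetical counts: injuries and player-games in low-distance vs higher-distance games.
print("IRR = %.2f (95%% CI %.2f-%.2f)" % irr(inj_a=12, exp_a=300, inj_b=6, exp_b=900))
```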
A Model-Driven Approach to e-Course Management
ERIC Educational Resources Information Center
Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana
2018-01-01
This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…
Inflight workload assessment: comparison of subjective and physiological measurements.
Lee, Yung-Hui; Liu, Bor-Shong
2003-10-01
Assessment of pilot workload during flight is an important aviation safety consideration. The aim of this study was to assess inflight pilot workload using both physiological and multidimensional subjective-ratings measurements (heart rate and NASA Task Load Index, respectively), comparing relative sensitivity during the four phases of flight: take-off, cruise, approach, and landing. Ten male pilots volunteered to participate in the trials, which took place in a Boeing 747-400 flight simulator. Electrocardiography was performed throughout the test using the portable Cardiovis ECG system. Mean heart rate (HR) and incremental heart rate (delta HR) were considered indices of physiological workload. Peak HR was observed during take-off (83.2 bpm) and landing (88.6 bpm); moreover, delta HR was also greatest (14.2 bpm and 18.8 bpm). The Task Load Index (TLX) scale revealed that mental and performance demands were essential components of workload during flight. In addition, temporal demand was an important component of workload during take-off and physical demand was significant during cruise. Analysis of correlation revealed that the delta HR is significantly related to TLX scores (r = 0.81, n = 40). Management of the individual sources of stress, which tend to become predominant during different flight phases, should be emphasized in periodic recurrent training. For example, a pilot must be trained to cope with the increased temporal stresses associated with take-off. In addition, the recommendations will be concerned with maintaining vigilance, task allocation between pilots, and inflight rest during long-haul cruise.
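As a small illustration of the physiological indices described above, the sketch below computes mean heart rate from R-R intervals, an incremental (delta) HR relative to a baseline phase, and a plain Pearson correlation of the kind reported between delta HR and TLX scores. All numbers are invented and the study's Cardiovis processing chain is not reproduced.

```python
import math

def heart_rate(rr_ms):
    """Mean heart rate (bpm) from a list of R-R intervals in milliseconds."""
    return 60000.0 / (sum(rr_ms) / len(rr_ms))

def pearson_r(x, y):
    """Plain Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical phases: delta HR = phase HR minus the pilot's cruise baseline HR.
baseline = heart_rate([850, 870, 860, 855])     # ~70 bpm
takeoff = heart_rate([720, 710, 730, 715])      # shorter R-R intervals -> higher HR
print(round(takeoff - baseline, 1), "bpm delta HR")
print(round(pearson_r([5, 8, 14, 19], [32, 41, 55, 68]), 2))  # toy delta-HR vs TLX data
```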
The Data-Driven Approach to Spectroscopic Analyses
NASA Astrophysics Data System (ADS)
Ness, M.
2018-01-01
I review the data-driven approach to spectroscopy, The Cannon, a method for deriving fundamental diagnostics of galaxy formation, namely precise chemical compositions and stellar ages, across the many stellar surveys that are mapping the Milky Way. With The Cannon, the abundances and stellar parameters from the multitude of stellar surveys can be placed directly on the same scale, using stars in common between the surveys. Furthermore, the information that resides in the data can be fully extracted; this has resulted in higher-precision stellar parameters and abundances being delivered from spectroscopic data and has opened up new avenues in galactic archeology, for example in the determination of ages for red giant stars across the Galactic disk. Coupled with Gaia distances, proper motions, and derived orbit families, the stellar age and individual abundance information delivered at the precision obtained with the data-driven approach provides very strong constraints on the evolution and birthplaces of stars in the Milky Way. I also review the role of data-driven spectroscopy as we enter the era in which we have both the data and the tools to build the ultimate conglomerate of galactic information, and highlight further applications of data-driven models in the coming decade.
EEG Estimates of Cognitive Workload and Engagement Predict Math Problem Solving Outcomes
ERIC Educational Resources Information Center
Beal, Carole R.; Galan, Federico Cirett
2012-01-01
In the present study, the authors focused on the use of electroencephalography (EEG) data about cognitive workload and sustained attention to predict math problem solving outcomes. EEG data were recorded as students solved a series of easy and difficult math problems. Sequences of attention and cognitive workload estimates derived from the EEG…
Data-driven medicinal chemistry in the era of big data.
Lusher, Scott J; McGuire, Ross; van Schaik, René C; Nicholson, C David; de Vlieg, Jacob
2014-07-01
Science, and the way we undertake research, is changing. The increasing rate of data generation across all scientific disciplines is providing incredible opportunities for data-driven research, with the potential to transform our current practices. The exploitation of so-called 'big data' will enable us to undertake research projects never previously possible but should also stimulate a re-evaluation of all our data practices. Data-driven medicinal chemistry approaches have the potential to improve decision making in drug discovery projects, providing that all researchers embrace the role of 'data scientist' and uncover the meaningful relationships and patterns in available data. Copyright © 2013 Elsevier Ltd. All rights reserved.
Murphy, Katherine; Levitt, Naomi S.; BeLue, Rhonda; Oni, Tolu
2018-01-01
Background Current South African health policy for chronic disease management proposes integration of chronic services for better outcomes for chronic conditions, a proposal based on the Integrated Chronic Disease Model (ICDM). However, scant data exist on how patients with chronic multimorbidities currently experience the (re)-organisation of health services and what their perceived needs are in order to enhance the management of their conditions. Methods A qualitative study was conducted in a community health centre treating both HIV and diabetes patients in Cape Town. The study was grounded in Shippee's Cumulative Complexity Model (CCM) and explored "patient workload" and "patient capacity" to manage chronic conditions. Individual interviews were conducted with 10 adult patient-participants with HIV and type 2 diabetes (T2D) multimorbidity and 6 healthcare workers who provided health services to these patient-participants. Results Patient-participants in this study experienced clinic-related workload, such as attending two separate clinics for HIV and T2D, and perceived and experienced a power mismatch between patients and healthcare workers. Self-care-related workloads were largely around nutritional requirements, pill burden, and stigma. The burden of these demands varied among patient-participants according to capacity factors such as positive attitudes, optimal health literacy, social support and availability of economic resources. Strategies mentioned by participants for improved continuity of care and self-management of multimorbidities included integration of chronic services, consolidated guidelines for healthcare workers, educational materials for patients, improved information systems and income for patients. Conclusion Using the CCM to explore multimorbidity captured most of the themes around "patient workload" and "patient capacity", and was thus a suitable framework to explore multimorbidity in this high HIV/T2D burden setting. Integration of
The impact of workload on the ability to localize audible alarms.
Edworthy, Judy; Reid, Scott; Peel, Katie; Lock, Samantha; Williams, Jessica; Newbury, Chloe; Foster, Joseph; Farrington, Martin
2018-10-01
Very little is known about people's ability to localize sound under varying workload conditions, though it would be expected that increasing workload should degrade performance. A set of eight auditory clinical alarms already known to have relatively high localizability (the ease with which their location is identified) when tested alone were tested in six conditions where workload was varied. Participants were required to indicate the location of a series of alarms emanating at random from one of eight speaker locations. Additionally, they were asked to read, carry out mental arithmetic tasks, be exposed to typical ICU noise, or carry out either the reading task or the mental arithmetic task in ICU noise. Performance in the localizability task was best in the control condition (no secondary task) and worst in those tasks which involved both a secondary task and noise. The data therefore demonstrate the typical pattern of increasing workload degrading performance on a primary task, in an area where little data exist. In addition, the data demonstrate that performance in the control condition results in a missed alarm on one in ten occurrences, whereas performance in the heaviest workload conditions results in a missed alarm on every fourth occurrence. This finding has implications for the understanding of both 'inattentional deafness' and 'alarm fatigue' in clinical environments. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Didion, Jeffrey R.
2018-01-01
Electrically Driven Thermal Management is an active research and technology development initiative incorporating ISS technology flight demonstrations (STP-H5), development of a Microgravity Science Glovebox (MSG) flight experiment, and laboratory-based investigations of electrically based thermal management techniques. The program targets integrated thermal management for future generations of RF electronics and power electronic devices. This presentation reviews four program elements: i) results from the Electrohydrodynamic (EHD) Long Term Flight Demonstration launched in February 2017; ii) development of the Electrically Driven Liquid Film Boiling Experiment; iii) two university-based research efforts; and iv) development of an Oscillating Heat Pipe evaluation at Goddard Space Flight Center.
Scaling Deep Learning Workloads: NVIDIA DGX-1/Pascal and Intel Knights Landing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gawande, Nitin A.; Landwehr, Joshua B.; Daily, Jeffrey A.
Deep Learning (DL) algorithms have become ubiquitous in data analytics. As a result, major computing vendors --- including NVIDIA, Intel, AMD and IBM --- have architectural road-maps influenced by DL workloads. Furthermore, several vendors have recently advertised new computing products as accelerating DL workloads. Unfortunately, it is difficult for data scientists to quantify the potential of these different products. This paper provides a performance and power analysis of important DL workloads on two major parallel architectures: NVIDIA DGX-1 (eight Pascal P100 GPUs interconnected with NVLink) and Intel Knights Landing (KNL) CPUs interconnected with Intel Omni-Path. Our evaluation consists of a cross section of convolutional neural net workloads: CifarNet, CaffeNet, AlexNet and GoogleNet topologies using the Cifar10 and ImageNet datasets. The workloads are vendor optimized for each architecture. Our analysis indicates that although GPUs provide the highest overall raw performance, the gap can close for some convolutional networks, and KNL can be competitive when considering performance/watt. Furthermore, NVLink is critical to GPU scaling.
ERIC Educational Resources Information Center
Gold, Stephanie
2005-01-01
The concept of data-driven professional development is both straight-forward and sensible. Implementing this approach is another story, which is why many administrators are turning to sophisticated tools to help manage data collection and analysis. These tools allow educators to assess and correlate student outcomes, instructional methods, and…
CERN data services for LHC computing
NASA Astrophysics Data System (ADS)
Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.
2017-10-01
Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent complex production workloads. In parallel, our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as large-scale storage; CERNBox for end-user access and sharing; Ceph as data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we summarise the experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.
Subjective scaling of mental workload in a multi-task environment
NASA Technical Reports Server (NTRS)
Daryanian, B.
1982-01-01
Those factors in a multi-task environment that contribute to the operators' "sense" of mental workload were identified. Subjective judgment, as the conscious experience of mental effort, was chosen as the appropriate method of measurement. Thurstone's law of comparative judgment was employed to construct interval scales of subjective mental workload from paired-comparison data. An experimental paradigm (Simulated Multi-Task Decision-Making Environment) was employed to represent the ideal experimentally controlled environment in which human operators were asked to "attend" to different cases of Tulga's decision-making tasks. Through various statistical analyses it was found that, in general, a lower number of tasks-to-be-processed per unit time (a condition associated with longer interarrival times) results in a lower mental workload, a higher consistency of judgments within a subject, a higher degree of agreement among the subjects, and larger distances between the cases on the Thurstone scale of subjective mental workload. The effects of various control variables and their interactions, and of the different characteristics of the subjects, on the variation of subjective mental workload are demonstrated.
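Thurstone's law of comparative judgment (Case V) turns a matrix of paired-comparison proportions into an interval scale by averaging inverse-normal transforms. The snippet below is a textbook sketch of that computation, not the report's exact procedure, and the proportion matrix is hypothetical.

```python
from statistics import NormalDist

def thurstone_case_v(p):
    """Thurstone Case V scaling from a matrix of paired-comparison proportions.

    p[i][j] is the proportion of judgments in which condition j was rated as
    higher workload than condition i. Returns one interval-scale value per
    condition (higher value = greater judged workload). Textbook sketch only.
    """
    nd = NormalDist()
    n = len(p)

    def z(prop):
        # clip to avoid infinite z-scores when a comparison is unanimous
        return nd.inv_cdf(min(max(prop, 0.01), 0.99))

    zmat = [[z(p[i][j]) if i != j else 0.0 for j in range(n)] for i in range(n)]
    return [sum(zmat[i][j] for i in range(n)) / n for j in range(n)]

# Hypothetical proportions for three task-load conditions.
p = [[0.5, 0.8, 0.9],
     [0.2, 0.5, 0.7],
     [0.1, 0.3, 0.5]]
print(thurstone_case_v(p))
```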
Approximate entropy: a new evaluation approach of mental workload under multitask conditions
NASA Astrophysics Data System (ADS)
Yao, Lei; Li, Xiaoling; Wang, Wei; Dong, Yuanzhe; Jiang, Ying
2014-04-01
There are numerous instruments and an abundance of complex information in the traditional cockpit display-control system, and pilots require a long time to familiarize themselves with the cockpit interface. This can cause accidents when they cope with emergency events, suggesting that it is necessary to evaluate pilot cognitive workload. In order to establish a simplified method to evaluate cognitive workload under multitask conditions, we designed a series of experiments involving different instrument panels and collected electroencephalograms (EEG) from 10 healthy volunteers. The data were classified and analyzed with approximate entropy (ApEn) signal processing. ApEn increased with increasing experiment difficulty, suggesting that it can be used to evaluate cognitive workload. Our results demonstrate that ApEn can be used as an evaluation criterion of cognitive workload and has good specificity and sensitivity. Moreover, we determined an empirical formula to assess the cognitive workload interval, which can simplify cognitive workload evaluation under multitask conditions.
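Approximate entropy is a self-contained calculation, so a direct re-implementation of Pincus' definition helps clarify what the EEG analysis above measures. The sketch below is illustrative only: the tolerance r, embedding dimension m and the synthetic signals are conventional defaults, not the study's EEG preprocessing.

```python
import math
import random

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D signal, following Pincus' definition.

    Illustrative re-implementation; the study's EEG preprocessing is not shown.
    """
    n = len(x)
    if r is None:
        mean = sum(x) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
        r = 0.2 * sd                          # a common default tolerance

    def phi(m):
        patterns = [x[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for p in patterns:
            c = sum(1 for q in patterns
                    if max(abs(a - b) for a, b in zip(p, q)) <= r)
            total += math.log(c / len(patterns))
        return total / len(patterns)

    return phi(m) - phi(m + 1)

# A regular signal should yield a lower ApEn than a noisier version of itself.
random.seed(0)
regular = [math.sin(i / 5) for i in range(300)]
noisy = [v + random.gauss(0, 0.5) for v in regular]
print(round(approximate_entropy(regular), 3), round(approximate_entropy(noisy), 3))
```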
Implementing EVM Data Analysis Adding Value from a NASA Project Manager's Perspective
NASA Technical Reports Server (NTRS)
Counts, Stacy; Kerby, Jerald
2006-01-01
Data Analysis is one of the keys to an effective Earned Value Management (EVM) process. Project Managers (PMs) must continually evaluate data in assessing the health of their projects. Good analysis of data can assist PMs in making better decisions in managing projects. To better support our PMs, the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) recently renewed its emphasis on sound EVM data analysis practices and processes. During this presentation we will discuss the approach that MSFC followed in implementing better data analysis across its Center. We will address our approach to effectively equip and support our projects in applying a sound data analysis process. In addition, the PM for the Space Station Biological Research Project will share her experiences of how effective data analysis can benefit a PM in the decision making process. The PM will discuss how the emphasis on data analysis has helped create a solid method for assessing the project's performance. Using data analysis successfully can be an effective and efficient tool in today's environment with increasing workloads and downsizing workforces.
Data Science in Supply Chain Management: Data-Related Influences on Demand Planning
ERIC Educational Resources Information Center
Jin, Yao
2013-01-01
Data-driven decisions have become an important aspect of supply chain management. Demand planners are tasked with analyzing volumes of data that are being collected at a torrential pace from myriad sources in order to translate them into actionable business intelligence. In particular, demand volatilities and planning are vital for effective and…
Quantitative assessment of workload and stressors in clinical radiation oncology.
Mazur, Lukasz M; Mosaly, Prithima R; Jackson, Marianne; Chang, Sha X; Burkhardt, Katharin Deschesne; Adams, Robert D; Jones, Ellen L; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B
2012-08-01
Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of reported radiotherapy incidents by the World Health Organization (WHO). Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance ANOVA, multivariate ANOVA, and Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and frequency of reported radiotherapy incidents by the WHO was found (r = 0.87, P value=.045). Workload level and sources of stressors vary
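The NASA TLX score used in this study is a weighted mean of six subscale ratings, with weights taken from 15 pairwise comparisons. The snippet below shows that standard computation with hypothetical ratings and weights; it is not tied to the study's data.

```python
def nasa_tlx_score(ratings, tally):
    """Weighted NASA TLX workload score.

    ratings: the six subscale ratings on a 0-100 scale.
    tally: how many of the 15 pairwise comparisons each subscale won (sums to 15).
    Standard textbook formula; the numbers below are hypothetical.
    """
    assert sum(tally.values()) == 15, "pairwise-comparison weights must sum to 15"
    return sum(ratings[k] * tally[k] for k in ratings) / 15.0

ratings = {"mental": 70, "physical": 30, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 50}
tally = {"mental": 5, "physical": 1, "temporal": 3,
         "performance": 2, "effort": 3, "frustration": 1}
print(round(nasa_tlx_score(ratings, tally), 1))   # -> 59.0 on the 0-100 scale
```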
Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazur, Lukasz M., E-mail: lukasz_mazur@ncsu.edu; Industrial Extension Service, North Carolina State University, Raleigh, North Carolina; Biomedical Engineering, North Carolina State University, Raleigh, North Carolina
2012-08-01
Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of reported radiotherapy incidents by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance ANOVA, multivariate ANOVA, and Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and frequency of reported radiotherapy incidents by the WHO was found (r = 0.87, P value=.045
Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria
2016-09-23
The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, the hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in the data-driven analysis, fruits and whole grains did not appear in any pattern. High intakes of sodium, fats and sugars were observed in the hypothesis-driven analysis, in agreement with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations.
Electronic Health Record Alert-Related Workload as a Predictor of Burnout in Primary Care Providers.
Gregory, Megan E; Russo, Elise; Singh, Hardeep
2017-07-05
Electronic health records (EHRs) have been shown to increase physician workload. One EHR feature that contributes to increased workload is asynchronous alerts (also known as inbox notifications) related to test results, referral responses, medication refill requests, and messages from physicians and other health care professionals. This alert-related workload results in negative cognitive outcomes, but its effect on affective outcomes, such as burnout, has been understudied. To examine EHR alert-related workload (both objective and subjective) as a predictor of burnout in primary care providers (PCPs), in order to ultimately inform interventions aimed at reducing burnout due to alert workload. A cross-sectional questionnaire and focus group of 16 PCPs at a large medical center in the southern United States. Subjective, but not objective, alert workload was related to two of the three dimensions of burnout, including physical fatigue (p = 0.02) and cognitive weariness (p = 0.04), when controlling for organizational tenure. To reduce alert workload and subsequent burnout, participants indicated a desire to have protected time for alert management, fewer unnecessary alerts, and improvements to the EHR system. Burnout associated with alert workload may be in part due to subjective differences at an individual level, and not solely a function of the objective work environment. This suggests the need for both individual and organizational-level interventions to improve alert workload and subsequent burnout. Additional research should confirm these findings in larger, more representative samples.
Event management for large scale event-driven digital hardware spiking neural networks.
Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean
2013-09-01
The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
Pilot Workload and Speech Analysis: A Preliminary Investigation
NASA Technical Reports Server (NTRS)
Bittner, Rachel M.; Begault, Durand R.; Christopher, Bonny R.
2013-01-01
Prior research has questioned the effectiveness of speech analysis to measure the stress, workload, truthfulness, or emotional state of a talker. The question remains regarding the utility of speech analysis for restricted vocabularies such as those used in aviation communications. A part-task experiment was conducted in which participants performed Air Traffic Control read-backs in different workload environments. Participant's subjective workload and the speech qualities of fundamental frequency (F0) and articulation rate were evaluated. A significant increase in subjective workload rating was found for high workload segments. F0 was found to be significantly higher during high workload while articulation rates were found to be significantly slower. No correlation was found to exist between subjective workload and F0 or articulation rate.
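Fundamental frequency can be estimated from a short voiced frame by autocorrelation peak picking, which is one common basis for the F0 measure discussed above. The sketch below is a classroom-style illustration on a synthetic tone; production pitch trackers (and the study's own analysis chain) add windowing, voicing decisions and smoothing that are not shown here.

```python
import math

def estimate_f0(frame, fs, fmin=75.0, fmax=300.0):
    """Estimate the fundamental frequency of a voiced frame via autocorrelation.

    A simplified sketch: searches for the autocorrelation peak in the lag range
    corresponding to [fmin, fmax] Hz and returns fs / best_lag.
    """
    n = len(frame)
    mean = sum(frame) / n
    frame = [s - mean for s in frame]                 # remove DC offset
    lag_min, lag_max = int(fs / fmax), int(fs / fmin)
    best_lag, best_r = None, 0.0
    for lag in range(lag_min, min(lag_max, n - 1) + 1):
        r = sum(frame[i] * frame[i + lag] for i in range(n - lag))
        if r > best_r:
            best_lag, best_r = lag, r
    return fs / best_lag if best_lag else None

# Synthetic 150 Hz tone sampled at 8 kHz; the estimate should land near 150 Hz.
fs = 8000
tone = [math.sin(2 * math.pi * 150 * t / fs) for t in range(800)]
print(round(estimate_f0(tone, fs), 1))
```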
Knowledge management in healthcare: towards 'knowledge-driven' decision-support services.
Abidi, S S
2001-09-01
In this paper, we highlight the involvement of Knowledge Management in a healthcare enterprise. We argue that the 'knowledge quotient' of a healthcare enterprise can be enhanced by procuring diverse facets of knowledge from the seemingly placid healthcare data repositories, and subsequently operationalising the procured knowledge to derive a suite of Strategic Healthcare Decision-Support Services that can impact strategic decision-making, planning and management of the healthcare enterprise. In this paper, we firstly present a reference Knowledge Management environment-a Healthcare Enterprise Memory-with the functionality to acquire, share and operationalise the various modalities of healthcare knowledge. Next, we present the functional and architectural specification of a Strategic Healthcare Decision-Support Services Info-structure, which effectuates a synergy between knowledge procurement (vis-à-vis Data Mining) and knowledge operationalisation (vis-à-vis Knowledge Management) techniques to generate a suite of strategic knowledge-driven decision-support services. In conclusion, we argue that the proposed Healthcare Enterprise Memory is an attempt to rethink the possible sources of leverage to improve healthcare delivery, hereby providing a valuable strategic planning and management resource to healthcare policy makers.
Data-Driven Instructional Leadership
ERIC Educational Resources Information Center
Blink, Rebecca
2006-01-01
With real-world examples from actual schools, this book illustrates how to nurture a culture of continuous improvement, meet the needs of individual students, foster an environment of high expectations, and meet the requirements of NCLB. Each component of the Data-Driven Instructional Leadership (DDIS) model represents several branches of…
What Data for Data-Driven Learning?
ERIC Educational Resources Information Center
Boulton, Alex
2012-01-01
Corpora have multiple affordances, not least for use by teachers and learners of a foreign language (L2) in what has come to be known as "data-driven learning" or DDL. The corpus and concordance interface were originally conceived by and for linguists, so other users need to adopt the role of "language researcher" to make the most of them. Despite…
A new costing model in hospital management: time-driven activity-based costing system.
Öker, Figen; Özyapıcı, Hasan
2013-01-01
Traditional cost systems cause cost distortions because they cannot meet the requirements of today's businesses. Therefore, a new and more effective cost system is needed. Consequently, time-driven activity-based costing system has emerged. The unit cost of supplying capacity and the time needed to perform an activity are the only 2 factors considered by the system. Furthermore, this system determines unused capacity by considering practical capacity. The purpose of this article is to emphasize the efficiency of the time-driven activity-based costing system and to display how it can be applied in a health care institution. A case study was conducted in a private hospital in Cyprus. Interviews and direct observations were used to collect the data. The case study revealed that the cost of unused capacity is allocated to both open and laparoscopic (closed) surgeries. Thus, by using the time-driven activity-based costing system, managers should eliminate the cost of unused capacity so as to obtain better results. Based on the results of the study, hospital management is better able to understand the costs of different surgeries. In addition, managers can easily notice the cost of unused capacity and decide how many employees to be dismissed or directed to other productive areas.
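Time-driven activity-based costing needs only two estimates: the cost of capacity supplied and the practical capacity in time units, from which a cost rate, per-activity costs and the cost of unused capacity follow. The sketch below shows that arithmetic with invented figures, not the Cypriot hospital's data.

```python
def tdabc_cost(total_capacity_cost, practical_capacity_minutes, activities):
    """Time-driven activity-based costing.

    cost rate = cost of capacity supplied / practical capacity (minutes);
    each activity's cost = rate * minutes consumed; leftover minutes expose the
    cost of unused capacity. All figures below are invented for illustration.
    """
    rate = total_capacity_cost / practical_capacity_minutes      # cost per minute
    costs = {name: rate * minutes for name, minutes in activities.items()}
    used = sum(activities.values())
    unused_capacity_cost = rate * (practical_capacity_minutes - used)
    return rate, costs, unused_capacity_cost

rate, costs, unused = tdabc_cost(
    total_capacity_cost=120000.0,           # e.g. monthly cost of theatre staff
    practical_capacity_minutes=9600.0,      # practical capacity, e.g. 80% of paid minutes
    activities={"open_surgery": 5200.0, "laparoscopic_surgery": 3100.0})
print(round(rate, 2), {k: round(v) for k, v in costs.items()}, round(unused))
```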
Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype
NASA Technical Reports Server (NTRS)
Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.
2010-01-01
In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.
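IMS scores new observations by their distance to clusters learned from nominal data. The sketch below is a deliberately simplified stand-in, using raw nearest-neighbour distance to retained nominal points rather than IMS's clustering; the variable names and example values are hypothetical.

```python
import math

def fit_nominal(train):
    """'Train' by simply retaining nominal operating points (IMS builds clusters instead)."""
    return list(train)

def anomaly_score(model, x):
    """Distance from x to the closest nominal point; large values flag anomalies."""
    return min(math.dist(x, p) for p in model)

# Hypothetical two-sensor nominal data, e.g. (pressure, actuator current).
nominal = [(1.0, 0.20), (1.1, 0.25), (0.9, 0.18), (1.05, 0.22)]
model = fit_nominal(nominal)
print(round(anomaly_score(model, (1.02, 0.21)), 3))   # near nominal -> small score
print(round(anomaly_score(model, (1.80, 0.90)), 3))   # off-nominal -> large score
```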
The Structural Consequences of Big Data-Driven Education.
Zeide, Elana
2017-06-01
Educators and commenters who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved-and perhaps unresolvable-issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools' pedagogical decision-making, and, in doing so, change fundamental aspects of America's education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing. First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers' academic autonomy, obscure student evaluation, and reduce parents' and students' ability to participate or challenge education decision-making. Third, big data-driven tools define what "counts" as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc. Given education
Relationship-Driven Classroom Management: Strategies That Promote Student Motivation.
ERIC Educational Resources Information Center
Vitto, John M.
This book combines information about resiliency, classroom management, and discipline into a user-friendly discussion suitable for all teachers. The material covers both preventive strategies and reactive strategies. The chapters of part 1, "Preventive Strategies," are: (1) "Relationship-Driven Classroom Management and Resilience"; (2)…
ERIC Educational Resources Information Center
Varnavas, Andreas P.; Soteriou, Andreas C.
2002-01-01
Presents and discusses the approach used by the Higher Hotel Institute in Cyprus to incorporate total quality management through establishment of a customer-driven management culture in its hospitality education program. Discusses how it collects and uses service-quality related data from future employers, staff, and students in pursuing this…
Eye Tracking Metrics for Workload Estimation in Flight Deck Operation
NASA Technical Reports Server (NTRS)
Ellis, Kyle; Schnell, Thomas
2010-01-01
Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set itself also provides a general model of human eye-movement behavior, and thus ostensibly of visual attention distribution in the cockpit, for approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.
Centralized Data Management in a Multicountry, Multisite Population-based Study.
Rahman, Qazi Sadeq-ur; Islam, Mohammad Shahidul; Hossain, Belal; Hossain, Tanvir; Connor, Nicholas E; Jaman, Md Jahiduj; Rahman, Md Mahmudur; Ahmed, A S M Nawshad Uddin; Ahmed, Imran; Ali, Murtaza; Moin, Syed Mamun Ibne; Mullany, Luke; Saha, Samir K; El Arifeen, Shams
2016-05-01
A centralized data management system was developed for data collection and processing for the Aetiology of Neonatal Infection in South Asia (ANISA) study. ANISA is a longitudinal cohort study involving neonatal infection surveillance and etiology detection in multiple sites in South Asia. The primary goal of designing such a system was to collect and store data from different sites in a standardized way to pool the data for analysis. We designed the data management system centrally and implemented it to enable data entry at individual sites. This system uses validation rules and audit that reduce errors. The study sites employ a dual data entry method to minimize keystroke errors. They upload collected data weekly to a central server via internet to create a pooled central database. Any inconsistent data identified in the central database are flagged and corrected after discussion with the relevant site. The ANISA Data Coordination Centre in Dhaka provides technical support for operations, maintenance and updating the data management system centrally. Password-protected login identifications and audit trails are maintained for the management system to ensure the integrity and safety of stored data. Centralized management of the ANISA database helps to use common data capture forms (DCFs), adapted to site-specific contextual requirements. DCFs and data entry interfaces allow on-site data entry. This reduces the workload as DCFs do not need to be shipped to a single location for entry. It also improves data quality as all collected data from ANISA goes through the same quality check and cleaning process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carranza, E. J. M., E-mail: carranza@itc.nl; Woldai, T.; Chikambwe, E. M.
A case application of data-driven estimation of evidential belief functions (EBFs) is demonstrated for prospectivity mapping in Lundazi district (eastern Zambia). Spatial data used to represent recognition criteria of prospectivity for aquamarine-bearing pegmatites include mapped granites, mapped faults/fractures, mapped shear zones, and radioelement concentration ratios derived from gridded airborne radiometric data. Data-driven estimates of EBFs take into account not only (a) spatial association between an evidential map layer and target deposits but also (b) spatial relationships between classes of evidence in an evidential map layer. Data-driven estimates of EBFs can indicate which spatial data provide positive or negative evidence of prospectivity. Data-driven estimates of EBFs of only spatial data providing positive evidence of prospectivity were integrated according to Dempster's rule of combination. The map of integrated degrees of belief was used to delineate zones of relative degrees of prospectivity for aquamarine-bearing pegmatites. The predictive map has at least an 85% prediction rate and at least a 79% success rate of delineating training and validation deposits, respectively. The results illustrate the usefulness of data-driven estimation of EBFs in GIS-based predictive mapping of mineral prospectivity. The results also show the usefulness of EBFs in managing uncertainties associated with evidential maps.
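Dempster's rule of combination, used above to integrate the evidential map layers, can be written compactly for basic probability assignments over a two-element frame (prospective / not prospective, plus uncertainty). The snippet below is a generic textbook implementation with hypothetical masses, not the authors' GIS workflow.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    Focal elements are frozensets over the frame {'T', 'F'} (prospective / not
    prospective), with frozenset({'T', 'F'}) carrying the uncertainty. A generic
    textbook implementation, not the authors' GIS workflow.
    """
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb            # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: the masses cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

T, F = frozenset({'T'}), frozenset({'F'})
THETA = T | F
# Hypothetical belief/disbelief/uncertainty masses for two evidential layers.
granite_layer = {T: 0.4, F: 0.1, THETA: 0.5}
fault_layer = {T: 0.3, F: 0.2, THETA: 0.5}
print(dempster_combine(granite_layer, fault_layer))
```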
Patient Safety Incidents and Nursing Workload 1
Carlesi, Katya Cuadros; Padilha, Kátia Grillo; Toffoletto, Maria Cecília; Henriquez-Roldán, Carlos; Juan, Monica Andrea Canales
2017-01-01
ABSTRACT Objective: to identify the relationship between the workload of the nursing team and the occurrence of patient safety incidents linked to nursing care in a public hospital in Chile. Method: quantitative, analytical, cross-sectional research through review of medical records. The estimation of workload in Intensive Care Units (ICUs) was performed using the Therapeutic Interventions Scoring System (TISS-28) and for the other services, we used the nurse/patient and nursing assistant/patient ratios. Descriptive univariate and multivariate analysis were performed. For the multivariate analysis we used principal component analysis and Pearson correlation. Results: 879 post-discharge clinical records and the workload of 85 nurses and 157 nursing assistants were analyzed. The overall incident rate was 71.1%. A high positive correlation was found among the workload variables (r = 0.9611 to r = 0.9919) and between workload and the rate of falls (r = 0.8770). The medication error rates, mechanical containment incidents and self-removal of invasive devices were not correlated with the workload. Conclusions: the workload was high in all units except the intermediate care unit. Only the rate of falls was associated with the workload. PMID:28403334
Heavy vehicle driver workload assessment : executive summary
DOT National Transportation Integrated Search
1996-10-01
This report summarizes a program of research to develop methods, data, and guidelines to conduct heavy vehicle driver-oriented workload assessments of new, high-technology, in-cab devices. Many such devices are being developed and implemented in heav...
Pupillometric measurement of operator workload
NASA Technical Reports Server (NTRS)
Beatty, J.
1981-01-01
Pupillometry as a method of measuring workload is described. Pupillometric measures provide an indication of momentary fluctuations in central nervous system excitability that occur as cognitive operations are performed; the magnitude of these changes may serve as a sensitive indicator of the workload imposed by cognitive tasks.
Data Driven Math Intervention: What the Numbers Say
ERIC Educational Resources Information Center
Martin, Anthony W.
2013-01-01
This study was designed to determine whether or not data driven math skills groups would be effective in increasing student academic achievement. From this topic three key questions arose: "Would the implementation of data driven math skills groups improve student academic achievement more than standard instruction as measured by the…
Transport pilot workload - A comparison of two subjective techniques
NASA Technical Reports Server (NTRS)
Battiste, Vernol; Bortolussi, Michael
1988-01-01
Although SWAT and NASA-TLX workload scales have been compared on numerous occasions, they have not been compared in the context of transport operations. Transport pilot workload has traditionally been classified as long periods of low workload with occasional spikes of high workload. Thus, the relative sensitivity of the scales to variations in workload at the low end of the scale were evaluated. This study was a part of a larger study which investigated workload measures for aircraft certification, conducted in a Phase II certified Link/Boeing 727 simulator. No significant main effects were found for any performance-based measures of workload. However, both SWAT and NASA-TLX were sensitive to differences between high and low workload flights and to differences among flight segments. NASA-TLX (but not SWAT) was sensitive to the increase in workload during the cruise segment of the high workload flight. Between-subject variability was high for SWAT. NASA-TLX was found to be stable when compared in the test/retest paradigm. A test/retest by segment interaction suggested that this was not the case for SWAT ratings.
Multiplexing Low and High QoS Workloads in Virtual Environments
NASA Astrophysics Data System (ADS)
Verboven, Sam; Vanmechelen, Kurt; Broeckhove, Jan
Virtualization technology has introduced new ways for managing IT infrastructure. The flexible deployment of applications through self-contained virtual machine images has removed the barriers for multiplexing, suspending and migrating applications with their entire execution environment, allowing for a more efficient use of the infrastructure. These developments have given rise to an important challenge regarding the optimal scheduling of virtual machine workloads. In this paper, we specifically address the VM scheduling problem in which workloads that require guaranteed levels of CPU performance are mixed with workloads that do not require such guarantees. We introduce a framework to analyze this scheduling problem and evaluate to what extent such mixed service delivery is beneficial for a provider of virtualized IT infrastructure. Traditionally providers offer IT resources under a guaranteed and fixed performance profile, which can lead to underutilization. The findings of our simulation study show that through proper tuning of a limited set of parameters, the proposed scheduling algorithm allows for a significant increase in utilization without sacrificing on performance dependability.
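One simple way to picture the mixed-QoS scheduling problem is a policy that reserves capacity for guaranteed workloads and lets best-effort workloads share whatever remains. The sketch below encodes only that toy policy; it is not the scheduling algorithm evaluated in the paper, and the VM names and capacities are invented.

```python
def allocate_cpu(capacity, guaranteed, best_effort):
    """Split CPU capacity between QoS-guaranteed VMs and best-effort VMs.

    guaranteed maps VM name -> reserved CPU units; best-effort VMs share the
    remainder equally. A toy policy for illustration, not the paper's scheduler.
    """
    reserved = sum(guaranteed.values())
    if reserved > capacity:
        raise ValueError("guaranteed reservations exceed capacity")
    spare = capacity - reserved
    share = spare / len(best_effort) if best_effort else 0.0
    allocation = dict(guaranteed)
    allocation.update({vm: share for vm in best_effort})
    return allocation

# 16 CPU units: 6 reserved for guaranteed VMs, 10 shared by three best-effort VMs.
print(allocate_cpu(16, {"db": 4, "web": 2}, ["batch1", "batch2", "batch3"]))
```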
Data-Driven School Administrator Behaviors and State Report Card Results
ERIC Educational Resources Information Center
Spencer, James A., Jr.
2014-01-01
The purpose of this study was to identify the principal behaviors that would define an instructional leader as being a data-driven school administrator and to assess current school administrators' levels of being data-driven. This research attempted to examine the relationship between the degree to which a principal was data-driven and the…
Validation of air traffic controller workload models
DOT National Transportation Integrated Search
1979-09-01
During the past several years, computer models have been developed for off-site estimation of controller's workload. The inputs to these models are audio and digital data normally recorded at an Air Route Traffic Control Center (ARTCC). This ...
Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart
2010-03-01
MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.
De Feo, G; De Gisi, S; Galasso, M
2013-01-01
The aim of the present study is to define a simple (and easy to use) method to equalize the workload of personnel operating several small wastewater treatment plants (SWWTPs). The approach is illustrated through a case study which is the result of collaboration between researchers and a water and wastewater management company operating in Southern Italy. The topic is important since personnel have a significant impact on the operating costs of SWWTPs, and the approach outlined results in the minimum number of staff being required to assure the management of the service. Four kinds of work units are considered: plant managers, assistant plant managers, laboratory technicians and executives. In order to develop a practical, feasible and easy to use method, the workload was evaluated considering only the population equivalent (PE) and the number of plants managed. The core of the method is the evaluation of the percentage of time that the personnel units devote to the operation of SWWTPs of the municipality considered. The proposed procedure offers a useful tool to equalize the workload, both in terms of PE and the number of plants managed, the procedure being easily modifiable to introduce other evaluation criteria. By using familiar concepts such as PE and number of plants managed, the approach of the method can easily be understood by management. It can also be readily adapted to other similar situations.
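The method above scores each plant's workload from its population equivalent and a per-plant overhead, and spreads plants across staff so that shares come out as even as possible. The sketch below is a hedged, greedy illustration of that idea only; the weights, scoring and assignment rule are invented rather than the authors' calibrated procedure.

```python
def workload_shares(operators, plants, w_pe=0.5, w_count=0.5):
    """Spread plant workload across operators so their shares come out similar.

    A plant's workload share mixes its population equivalent (PE) with a fixed
    per-plant overhead; plants are assigned greedily to the least loaded operator.
    Weights and scoring are illustrative, not the paper's calibration.
    """
    total_pe = sum(pe for _, pe in plants)

    def share(pe):
        return w_pe * pe / total_pe + w_count / len(plants)

    load = {op: 0.0 for op in operators}
    assignment = {op: [] for op in operators}
    for name, pe in sorted(plants, key=lambda item: -item[1]):   # biggest plants first
        op = min(load, key=load.get)
        load[op] += share(pe)
        assignment[op].append(name)
    return load, assignment

plants = [("A", 12000), ("B", 4000), ("C", 2500), ("D", 9000), ("E", 1500)]
load, assignment = workload_shares(["manager_1", "manager_2"], plants)
print({op: round(v, 2) for op, v in load.items()}, assignment)
```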
Nind, Thomas; Galloway, James; McAllister, Gordon; Scobbie, Donald; Bonney, Wilfred; Hall, Christopher; Tramma, Leandro; Reel, Parminder; Groves, Martin; Appleby, Philip; Doney, Alex; Guthrie, Bruce; Jefferson, Emily
2018-05-22
The Health Informatics Centre (HIC) at the University of Dundee provides a service to securely host clinical datasets and extract relevant data for anonymised cohorts to researchers to enable them to answer key research questions. As is common in research using routine healthcare data, the service was historically delivered using ad-hoc processes resulting in the slow provision of data whose provenance was often hidden to the researchers using it. This paper describes the development and evaluation of the Research Data Management Platform (RDMP): an open source tool to load, manage, clean, and curate longitudinal healthcare data for research and provide reproducible and updateable datasets for defined cohorts to researchers. Between 2013 and 2017, RDMP tool implementation tripled the productivity of Data Analysts producing data releases for researchers from 7.1 to 25.3 per month; and reduced the error rate from 12.7% to 3.1%. The effort on data management reduced from a mean of 24.6 to 3.0 hours per data release. The waiting time for researchers to receive data after agreeing a specification reduced from approximately 6 months to less than one week. The software is scalable and currently manages 163 datasets. 1,321 data extracts for research have been produced with the largest extract linking data from 70 different datasets. The tools and processes that encompass the RDMP not only fulfil the research data management requirements of researchers but also support the seamless collaboration of data cleaning, data transformation, data summarisation and data quality assessment activities by different research groups.
Data-Driven Hint Generation from Peer Debugging Solutions
ERIC Educational Resources Information Center
Liu, Zhongxiu
2015-01-01
Data-driven methods have been a successful approach to generating hints for programming problems. However, the majority of previous studies are focused on procedural hints that aim at moving students to the next closest state to the solution. In this paper, I propose a data-driven method to generate remedy hints for BOTS, a game that teaches…
Building an Ontology-driven Database for Clinical Immune Research
Ma, Jingming
2006-01-01
Clinical research on the immune response usually generates a huge amount of biomedical testing data over a certain period of time. User-friendly data management systems based on relational databases will help immunologists/clinicians to fully manage these data. On the other hand, the same biological assays, such as ELISPOT and flow cytometric assays, are involved in immunological experiments regardless of the study purpose. The reuse of biological knowledge is one of the driving forces behind this ontology-driven data management. Therefore, an ontology-driven database will help to handle different clinical immune research studies and help immunologists/clinicians easily understand each other's immunological data. We discuss an outline for building ontology-driven data management for clinical immune research (ODMim). PMID:17238637
Scaling deep learning workloads: NVIDIA DGX-1/Pascal and Intel Knights Landing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gawande, Nitin A.; Landwehr, Joshua B.; Daily, Jeffrey A.
Deep Learning (DL) algorithms have become ubiquitous in data analytics. As a result, major computing vendors --- including NVIDIA, Intel, AMD, and IBM --- have architectural road-maps influenced by DL workloads. Furthermore, several vendors have recently advertised new computing products as accelerating large DL workloads. Unfortunately, it is difficult for data scientists to quantify the potential of these different products. This paper provides a performance and power analysis of important DL workloads on two major parallel architectures: NVIDIA DGX-1 (eight Pascal P100 GPUs interconnected with NVLink) and Intel Knights Landing (KNL) CPUs interconnected with Intel Omni-Path or Cray Aries. Our evaluation consists of a cross section of convolutional neural net workloads: CifarNet, AlexNet, GoogLeNet, and ResNet50 topologies using the Cifar10 and ImageNet datasets. The workloads are vendor-optimized for each architecture. Our analysis indicates that although GPUs provide the highest overall performance, the gap can close for some convolutional networks; and the KNL can be competitive in performance/watt. We find that NVLink facilitates scaling efficiency on GPUs. However, its importance is heavily dependent on neural network architecture. Furthermore, for weak-scaling --- sometimes encouraged by restricted GPU memory --- NVLink is less important.
Mental workload measurement for emergency operating procedures in digital nuclear power plants.
Gao, Qin; Wang, Yang; Song, Fei; Li, Zhizhong; Dong, Xiaolu
2013-01-01
Mental workload is a major consideration for the design of emergency operation procedures (EOPs) in nuclear power plants. Continuous and objective measures are desired. This paper compares seven mental workload measurement methods (pupil size, blink rate, blink duration, heart rate variability, parasympathetic/sympathetic ratio, total power, and a GOMS-KLM (Goals, Operators, Methods, and Selection rules / Keystroke-Level Model)-based workload index) with regard to sensitivity, validity and intrusiveness. Eighteen participants performed two computerised EOPs of different complexity levels, and mental workload measures were collected during the experiment. The results show that the blink rate is sensitive to both the difference in the overall task complexity and changes in peak complexity within EOPs, that the error rate is sensitive to the level of arousal and correlates with the step error rate, and that blink duration increases over the task period in both low and high complexity EOPs. Cardiac measures were able to distinguish tasks with different overall complexity. The intrusiveness of the physiological instruments is acceptable. Finally, the six physiological measures were integrated using the group method of data handling to predict perceived overall mental workload. The study compared seven measures for evaluating mental workload with emergency operating procedures in nuclear power plants. An experiment with simulated procedures was carried out, and the results show that eye-response measures are useful for assessing temporal changes of workload whereas cardiac measures are useful for evaluating the overall workload.
Flight deck crew coordination indices of workload and situation awareness in terminal operations
NASA Astrophysics Data System (ADS)
Ellis, Kyle Kent Edward
Crew coordination in the context of aviation is a specifically choreographed set of tasks performed by each pilot, defined for each phase of flight. Based on the constructs of effective Crew Resource Management and SOPs for each phase of flight, a shared understanding of crew workload and task responsibility is considered representative of well-coordinated crews. Nominal behavior is therefore defined by SOPs and CRM theory, detectable through pilot eye-scan. This research investigates the relationship between the eye-scan exhibited by each pilot and the level of coordination between crewmembers. Crew coordination was evaluated based on each pilot's understanding of the other crewmember's workload. By contrasting each pilot's workload-understanding, crew coordination was measured as the summed absolute difference of each pilot's understanding of the other crewmember's reported workload, resulting in a crew coordination index. The crew coordination index rates crew coordination on a scale ranging across Excellent, Good, Fair and Poor. Eye-scan behavior metrics were found to reliably identify a reduction in crew coordination. Additionally, crew coordination was successfully characterized by eye-scan behavior data using machine learning classification methods. Identifying eye-scan behaviors on the flight deck indicative of reduced crew coordination can be used to inform training programs and design enhanced avionics that improve the overall coordination between the crewmembers and the flight deck interface. Additionally, characterization of crew coordination can be used to develop methods to increase shared situation awareness and crew coordination to reduce operational and flight technical errors. Ultimately, the ability to reduce operational and flight technical errors made by pilot crews improves the safety of aviation.
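The crew coordination index described above sums, across the two pilots, the absolute difference between one pilot's estimate of the other's workload and that crewmember's own report, then bins the total into Excellent/Good/Fair/Poor. The sketch below illustrates that calculation; the rating scale and bin thresholds are hypothetical, not the dissertation's.

```python
def coordination_index(crew_ratings):
    """Crew coordination index from mutual workload understanding.

    crew_ratings holds, for each pilot, (self-reported workload, estimate of the
    other pilot's workload). The index sums the absolute estimation errors and
    bins the total; the scale and bin edges below are hypothetical.
    """
    (self_a, est_of_b), (self_b, est_of_a) = crew_ratings
    index = abs(est_of_b - self_b) + abs(est_of_a - self_a)
    for label, limit in (("Excellent", 1), ("Good", 2), ("Fair", 4)):
        if index <= limit:
            return index, label
    return index, "Poor"

# Captain self-reports 6 and estimates the FO at 4; FO self-reports 5 and estimates 6.
print(coordination_index([(6, 4), (5, 6)]))    # -> (1, 'Excellent')
```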
Inflight evaluation of pilot workload measures for rotorcraft research
NASA Technical Reports Server (NTRS)
Shively, Robert J.; Bortolussi, Michael R.; Battiste, Vernol; Hart, Sandra G.; Pepitone, David D.; Matsumoto, Joy Hamerman
1987-01-01
The effectiveness of heart-rate monitoring and the NASA TLX workload rating scale (Hart et al., 1985) in measuring helicopter-pilot workloads is investigated experimentally. Four NASA test pilots flew two 2-h missions each in an SH-3G helicopter, following scenarios with takeoff, hover, cross-country, and landing tasks; pilot performance on the tasks undertaken near the landing area was measured by laser tracking. The results are presented in graphs and discussed in detail, and it is found that the TLX ratings clearly distinguish the flight segments and are well correlated with the performance data. The mean heart rate (measured as interbeat interval) is correlated (r = -0.69) with the TLX workload, but only the standard deviation of the interbeat interval is able to distinguish between flight segments; the correlation between standard deviation and TLX ratings is negative but not significant.
ERIC Educational Resources Information Center
Bowyer, Kyle
2012-01-01
Student workload is a contributing factor to students deciding to withdraw from their study before completion of the course, at significant cost to students, institutions and society. The aim of this paper is to create a basic workload model for a group of undergraduate students studying business law units at Curtin University in Western…
Ross-Walker, Cheryl; Rogers-Clark, Cath; Pearce, Susanne
Nursing workload is an issue that affects both the recruitment and retention of nurses, and patient safety. Historically, measurement has focussed on the delivery of direct patient care and excluded the workload of facilitating hands-on care and supporting the organisation via duties that reflect organisational culture and climate needs. Qualitative research is appropriate to understand this complexity. To determine the best available evidence in relation to registered nurses' experiences of workplace cultural and climatic factors that influence nursing workloads, in an acute health care setting. This review sought high quality studies which explored registered nurses' experiences of the influence of cultural and climatic factors on their workloads. Qualitative research studies and opinion-based text were considered. An extensive search of the literature was conducted to identify published and unpublished studies between January 1990 and June 2011 in English, and indexed in the following databases: CINAHL, Medline, Medline-In Process, PsychINFO, Emerald, Current Contents, TRIP, JSTOR Nursing Consult Psychology & Behavioural Sciences collections, Emerald Management Reviews, Emerald Full Text Journals, Embase, Dissertation Abstracts, ERIC, Proquest and MedNar, EBSCOhost, Science Direct, Wiley Interscience. Two independent reviewers (CRW and CRC), using appraisal tools from the Joanna Briggs Institute (JBI), assessed fifteen articles; one was excluded. Data were extracted from included papers using standardised tools developed by the JBI. Data from qualitative studies and textual/opinion papers were meta-synthesised separately using standardised instruments. Data synthesis involved pooling the findings, which were then grouped into categories on the basis of similarity of meaning. The categories were further aggregated into synthesised findings. 14 papers were identified as high quality and meeting the inclusion criteria. 81 findings were identified from the 10 qualitative research
Optimizing Data Management in Grid Environments
NASA Astrophysics Data System (ADS)
Zissimos, Antonis; Doka, Katerina; Chazapis, Antony; Tsoumakos, Dimitrios; Koziris, Nectarios
Grids currently serve as platforms for numerous scientific as well as business applications that generate and access vast amounts of data. In this paper, we address the need for efficient, scalable and robust data management in Grid environments. We propose a fully decentralized and adaptive mechanism comprising two components: a Distributed Replica Location Service (DRLS) and a data transfer mechanism called GridTorrent. They both adopt Peer-to-Peer techniques in order to overcome performance bottlenecks and single points of failure. On one hand, DRLS ensures resilience by relying on a Byzantine-tolerant protocol and is able to handle massive concurrent requests even during node churn. On the other hand, GridTorrent allows for maximum bandwidth utilization through collaborative sharing among the various data providers and consumers. The proposed integrated architecture is completely backwards-compatible with already deployed Grids. To demonstrate these points, experiments have been conducted in LAN as well as WAN environments under various workloads. The evaluation shows that our scheme vastly outperforms the conventional mechanisms in both efficiency (up to 10 times faster) and robustness in case of failures and flash crowd instances.
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without using formal methods. With informal methods, the MIS is developed across its lifecycle without any models, which causes many problems such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly correspond to changes of business logics or implementing technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems, applying the model-driven development method to a component of the model theory approach. The experiment has shown that the method reduces development effort by more than 30%.
A comprehensive prediction and evaluation method of pilot workload
Feng, Chuanyan; Wanyan, Xiaoru; Yang, Kun; Zhuang, Damin; Wu, Xu
2018-01-01
BACKGROUND: The prediction and evaluation of pilot workload is a key problem in human factors airworthiness of the cockpit. OBJECTIVE: A pilot traffic pattern task was designed in a flight simulation environment in order to carry out the pilot workload prediction and improve the evaluation method. METHODS: Predictions of typical flight subtasks and dynamic workloads (cruise, approach, and landing) were built up based on multiple resource theory, and favorable validity was confirmed by correlation analysis between sensitive physiological data and the predicted values. RESULTS: Statistical analysis indicated that eye movement indices (fixation frequency, mean fixation time, saccade frequency, mean saccade time, and mean pupil diameter), Electrocardiogram indices (mean normal-to-normal interval and the ratio between low frequency and sum of low frequency and high frequency), and Electrodermal Activity indices (mean tonic and mean phasic) were all sensitive to typical workloads of subjects. CONCLUSION: A multinomial logistic regression model based on a combination of physiological indices (fixation frequency, mean normal-to-normal interval, the ratio between low frequency and sum of low frequency and high frequency, and mean tonic) was constructed, and the discrimination accuracy was satisfactory at 84.85%. PMID:29710742
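For readers who want to reproduce the modelling step, a hedged sketch of a multinomial logistic regression over the four physiological indices named in the conclusion follows. The column names, synthetic data, class labels (cruise/approach/landing) and cross-validation scheme are placeholder assumptions; only the general technique matches the abstract.

```python
# Hedged sketch of a multinomial logistic regression over four physiological
# indices (fixation frequency, mean NN interval, LF/(LF+HF), mean tonic EDA);
# the values below are synthetic placeholders, not the study's dataset.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 90
df = pd.DataFrame({
    "fixation_freq": rng.normal(3.0, 0.6, n),
    "mean_nn_ms": rng.normal(800, 60, n),
    "lf_ratio": rng.uniform(0.3, 0.8, n),
    "mean_tonic_eda": rng.normal(4.0, 1.0, n),
})
phase = np.repeat(["cruise", "approach", "landing"], n // 3)  # hypothetical workload labels

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
acc = cross_val_score(clf, df, phase, cv=5).mean()
print(f"cross-validated discrimination accuracy: {acc:.2%}")
```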
Filter bank common spatial patterns in mental workload estimation.
Arvaneh, Mahnaz; Umilta, Alberto; Robertson, Ian H
2015-01-01
EEG-based workload estimation technology provides a real-time means of assessing mental workload. Such technology can effectively enhance the performance of human-machine interaction and the learning process. When designing workload estimation algorithms, a crucial signal processing component is the feature extraction step. Despite several studies in this field, the spatial properties of the EEG signals were mostly neglected. Since EEG inherently has a poor spatial resolution, features extracted individually from each EEG channel may not be sufficiently efficient. This problem becomes more pronounced when we use low-cost but convenient EEG sensors with limited stability, which is the case in practical scenarios. To address this issue, in this paper, we introduce a filter bank common spatial patterns algorithm combined with a feature selection method to extract spatio-spectral features discriminating different mental workload levels. To evaluate the proposed algorithm, we carry out a comparative analysis between two representative types of working memory tasks using data recorded from an Emotiv EPOC headset, which is a mobile low-cost EEG recording device. The experimental results showed that the proposed spatial filtering algorithm outperformed the state-of-the-art algorithms in terms of the classification accuracy.
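A hedged sketch of the filter bank common spatial patterns (FBCSP) idea follows: band-pass the EEG into several sub-bands, learn CSP spatial filters per band from the class covariance matrices, and use log-variance projections as spatio-spectral features. The channel count, band edges and synthetic trials are assumptions for illustration, not Emotiv EPOC recordings, and the feature-selection stage mentioned in the abstract is omitted.

```python
# Hedged sketch of FBCSP features for two workload levels; synthetic EEG trials.
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt

def bandpass(trials, lo, hi, fs):
    # trials: (n_trials, n_channels, n_samples)
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, trials, axis=-1)

def csp_filters(class_a, class_b, n_pairs=2):
    """CSP filters from the generalized eigenproblem C_a w = lambda (C_a + C_b) w."""
    cov = lambda X: np.mean([t @ t.T / np.trace(t @ t.T) for t in X], axis=0)
    Ca, Cb = cov(class_a), cov(class_b)
    _, W = eigh(Ca, Ca + Cb)                      # eigenvalues in ascending order
    return np.hstack([W[:, :n_pairs], W[:, -n_pairs:]])

def log_var_features(trials, W):
    feats = []
    for X in trials:
        Z = W.T @ X                               # project onto spatial filters
        v = np.var(Z, axis=1)
        feats.append(np.log(v / v.sum()))
    return np.array(feats)

# Synthetic two-class example: 14 channels, 2 s at 128 Hz, a small filter bank.
rng = np.random.default_rng(2)
low = rng.normal(size=(30, 14, 256))
high = rng.normal(size=(30, 14, 256)) * 1.3
features = []
for lo, hi in [(4, 8), (8, 12), (12, 16), (16, 24)]:
    a, b_ = bandpass(low, lo, hi, 128), bandpass(high, lo, hi, 128)
    W = csp_filters(a, b_)
    features.append(np.vstack([log_var_features(a, W), log_var_features(b_, W)]))
X = np.hstack(features)       # spatio-spectral feature matrix
print(X.shape)                # (60, 16): 4 bands x 4 filters per band
```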
Van Bogaert, Peter; Clarke, Sean; Willems, Riet; Mondelaers, Mieke
2013-07-01
To study the relationships between nurse practice environment, workload, burnout, job outcomes and nurse-reported quality of care in psychiatric hospital staff. Nurses' practice environments in general hospitals have been extensively investigated. Potential variations across practice settings, for instance in psychiatric hospitals, have been much less studied. A cross-sectional design with a survey. A structural equation model previously tested in acute hospitals was evaluated using survey data from a sample of 357 registered nurses, licensed practical nurses, and non-registered caregivers from two psychiatric hospitals in Belgium between December 2010 and April 2011. The model included paths between practice environment dimensions and outcome variables, with burnout in a mediating position. A workload measure was also tested as a potential mediator between the practice environment and outcome variables. An improved model, slightly modified from the one validated earlier in samples of acute care nurses, was confirmed. This model explained 50% and 38% of the variance in job outcomes and nurse-reported quality of care respectively. In addition, workload was found to play a mediating role in accounting for job outcomes and significantly improved a model that ultimately explained 60% of the variance in these variables. In psychiatric hospitals as in general hospitals, nurse-physician relationships and other organizational dimensions such as nursing and hospital management were closely associated with perceptions of workload and with burnout and job satisfaction, turnover intentions, and nurse-reported quality of care. Mechanisms linking key variables and differences across settings in these relationships merit attention by managers and researchers. © 2012 Blackwell Publishing Ltd.
Mental workload in decision and control
NASA Technical Reports Server (NTRS)
Sheridan, T. B.
1979-01-01
This paper briefly reviews the problems of defining and measuring the 'mental workload' of aircraft pilots and other human operators of complex dynamic systems. Of the alternative approaches the author indicates a clear preference for the use of subjective scaling. Some recent experiments from MIT and elsewhere are described which utilize subjective mental workload scales in conjunction with human decision and control tasks in the laboratory. Finally a new three-dimensional mental workload rating scale, under current development for use by IFR aircraft pilots, is presented.
Managing Workload in Human-Robot Interaction: A Review of Empirical Studies
2010-01-01
central concern in determining successful teleoperation. Regardless of the sophistication of the technology, a robot is operated – with different levels...by many characteristics, including the type of workload manipulation, the apparatus used, task characteristics, and/or type of outcome measures. Due...linguistic patterns. Furthermore, this interference may not even be detected if operators do not explicitly measure team communication performance, or re
The relationship between workload and training - An introduction
NASA Technical Reports Server (NTRS)
Hart, Sandra G.
1986-01-01
This paper reviews the relationships among workload, performance, and training. Its goal is to introduce the concepts of workload and training and to suggest how they may be related. It suggests some of the practical and theoretical benefits to be derived from their joint consideration. Training effectiveness can be improved by monitoring trainee workload and the reliability of workload predictions, and measures can be improved by identifying and controlling the training levels of experimental subjects.
[Study on mental workload of teachers in primary schools].
Xiao, Yuan-mei; Wang, Zhi-ming; Wang, Mian-zhen; Lan, Ya-jia; Fan, Guang-qin; Feng, Chang
2011-12-01
To investigate the distribution characteristics and influencing factors of mental workload of teachers in primary schools. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) was used to assess the mental workload levels of 397 primary school teachers in a city. The mental workload (64.34±10.56) of female teachers was significantly higher than that (61.73±9.77) of male teachers (P<0.05). The mental workload (65.66±10.42) of the "-35" years old group was the highest. When teachers were younger than 35 years old, there was a positive correlation between mental workload and age (r=0.146, P<0.05). When teachers were older than 35 years old, there was a negative correlation between mental workload and age (r=-0.190, P<0.05). Teachers with a higher education level reported higher mental workload (unstandardized coefficient B=1.524, standardized coefficient β=0.111, P<0.05). There was a positive correlation between mental workload and working hours per day (unstandardized coefficient B=4.659, standardized coefficient β=0.223, P<0.001). Mental workload of teachers in primary schools is closely related to age, educational level and work hours per day. Work hours per day is an important risk factor for mental workload. Reducing work hours per day (to 8 hours) is an effective measure for alleviating the mental workload of teachers in primary schools.
Relationship between workload and mind-wandering in simulated driving
2017-01-01
Mental workload and mind-wandering are highly related to driving safety. This study investigated the relationship between mental workload and mind-wandering while driving. Participants (N = 40) were asked to perform a car-following task in a driving simulator, and report whether they had experienced mind-wandering upon hearing a tone. After driving, participants reported their workload using the NASA-Task Load Index (TLX). Results revealed an interaction between workload and mind-wandering from two different perspectives. First, there was a negative correlation between workload and mind-wandering (r = -0.459, p < 0.01) for different individuals. Second, from a temporal perspective, workload and mind-wandering frequency increased significantly over task time and were positively correlated. Together, these findings contribute to understanding the roles of workload and mind-wandering in driving. PMID:28467513
Practical guidelines for workload assessment
NASA Technical Reports Server (NTRS)
Tattersall, Andrew J.
1994-01-01
The practical problems that might be encountered in carrying out workload evaluations in work settings have been outlined. Different approaches have been distinguished that may determine the type of research design used and provide assistance in the difficult choice between workload assessment techniques. One approach to workload assessment is to examine the short-term consequences of combining various tasks. Theoretical models of attention allocation will underpin specific studies of interference and the consequences of task demand and task conflict for performance. A further approach with a different temporal orientation may lead us to a better understanding of the relationships between work demands and strain through the analysis of individual differences in cognitive control processes. The application of these processes may depend on individual differences in long-term styles and short-term strategies, but may be used to prevent decrements in work performance under difficult conditions. However, control may attract costs as well as benefits in terms of changes in affective state and physiological activity. Thus, strain associated with work demands may only be measurable in the form of tradeoffs between performance and other domains of individual activity. The methodological implications are to identify patterns of adjustment to workload variations using repeated measures and longitudinal sampling of performance as well as subjective and physiological measures. Possible enhancements to workplace design must take into account these human factors considerations of workload in order to avoid potential decrements in individual performance and associated organizational problems.
Measuring workload in collaborative contexts: trait versus state perspectives.
Helton, William S; Funke, Gregory J; Knott, Benjamin A
2014-03-01
In the present study, we explored the state versus trait aspects of measures of task and team workload in a disaster simulation. There is often a need to assess workload in both individual and collaborative settings. Researchers in this field often use the NASA Task Load Index (NASA-TLX) as a global measure of workload by aggregating the NASA-TLX's component items. Using this practice, one may overlook the distinction between traits and states. Fifteen dyadic teams (11 inexperienced, 4 experienced) completed five sessions of a tsunami disaster simulator. After every session, individuals completed a modified version of the NASA-TLX that included team workload measures. We then examined the workload items by using a between-subjects and within-subjects perspective. Between-subjects and within-subjects correlations among the items indicated the workload items are more independent within subjects (as states) than between subjects (as traits). Correlations between the workload items and simulation performance were also different at the trait and state levels. Workload may behave differently at trait (between-subjects) and state (within-subjects) levels. Researchers interested in workload measurement as a state should take a within-subjects perspective in their analyses.
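The trait-versus-state distinction above comes down to where the correlation is computed: across team means (between-subjects) or across within-team deviations from those means (within-subjects). A minimal sketch follows, assuming a long-format table of sessions with two hypothetical NASA-TLX items; the data are made up for illustration.

```python
# Hedged sketch of the trait (between-subjects) vs. state (within-subjects)
# decomposition of the correlation between two NASA-TLX items.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
long = pd.DataFrame({
    "team": np.repeat(np.arange(15), 5),
    "session": np.tile(np.arange(1, 6), 15),
    "mental": rng.normal(60, 15, 75),
    "temporal": rng.normal(55, 15, 75),
})

# Trait level: correlate each team's mean ratings across its five sessions.
team_means = long.groupby("team")[["mental", "temporal"]].mean()
r_between = team_means["mental"].corr(team_means["temporal"])

# State level: remove each team's mean, then correlate the pooled deviations.
deviations = long[["mental", "temporal"]] - long.groupby("team")[["mental", "temporal"]].transform("mean")
r_within = deviations["mental"].corr(deviations["temporal"])

print(f"between-subjects r = {r_between:.2f}, within-subjects r = {r_within:.2f}")
```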
2009-01-01
Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way and developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data, and facilitate data-sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community. PMID:19941647
Korshøj, Mette; Krustrup, Peter; Jørgensen, Marie Birk; Prescott, Eva; Hansen, Åse Marie; Kristiansen, Jesper; Skotte, Jørgen Henrik; Mortensen, Ole Steen; Søgaard, Karen; Holtermann, Andreas
2012-08-13
Prevalence of cardiovascular risk factors is unevenly distributed among occupational groups. The working environment, as well as lifestyle and socioeconomic status contribute to the disparity and variation in prevalence of these risk factors. High physical work demands have been shown to increase the risk for cardiovascular disease and mortality, contrary to leisure time physical activity. High physical work demands in combination with a low cardiorespiratory fitness infer a high relative workload and an excessive risk for cardiovascular mortality. Therefore, the aim of this study is to examine whether a worksite aerobic exercise intervention will reduce the relative workload and cardiovascular risk factors by an increased cardiorespiratory fitness. A cluster-randomized controlled trial is performed to evaluate the effect of the worksite aerobic exercise intervention on cardiorespiratory fitness and cardiovascular risk factors among cleaners. Cleaners are eligible if they are employed ≥ 20 hours/week, at one of the enrolled companies. In the randomization, strata are formed according to the manager the participant reports to. The clusters will be balanced on the following criteria: Geographical work location, gender, age and seniority. Cleaners are randomized to either I) a reference group, receiving lectures concerning healthy living, or II) an intervention group, performing worksite aerobic exercise "60 min per week". Data collection will be conducted at baseline, four months and 12 months after baseline, at the worksite during working hours. The data collection will consist of a questionnaire-based interview, physiological testing of health and capacity-related measures, and objective diurnal measures of heart rate, physical activity and blood pressure. Primary outcome is cardiorespiratory fitness. Information is lacking about whether an improved cardiorespiratory fitness will affect the cardiovascular health, and additionally decrease the objectively
Preliminary Investigation of Workload on Intrastate Bus Traffic Controllers
NASA Astrophysics Data System (ADS)
Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad
2016-11-01
The daily routine of bus traffic controllers, which involves high mental processes, has a direct impact on their level of workload. To date, the level of workload on bus traffic controllers in Malaysia is relatively unknown. Excessive workload on bus traffic controllers would affect the control and efficiency of the system. This paper serves to study the workload on bus traffic controllers and justify the need to conduct further detailed research in this field. The objective of this research is to identify the level of workload on intrastate bus traffic controllers. Based on the results, recommendations will be proposed for improvements and future studies. The level of workload for the bus traffic controllers is quantified using a questionnaire adapted from the NASA-TLX. Interview sessions were conducted for validation of workload. Sixteen respondents were involved and it was found that the average level of workload based on the NASA-TLX was 6.91. It was found that workload is not affected by gender and marital status. This study also showed that the level of workload and working experience of bus traffic controllers have a strong positive linear relationship. This study would serve as guidance and a reference for this field. Since this study is a preliminary investigation, further detailed studies could be conducted to obtain a better comprehension regarding the bus traffic controllers.
The T.M.R. Data Dictionary: A Management Tool for Data Base Design
Ostrowski, Maureen; Bernes, Marshall R.
1984-01-01
In January 1981, a dictionary-driven ambulatory care information system known as TMR (The Medical Record) was installed at a large private medical group practice in Los Angeles. TMR's data dictionary has enabled the medical group to adapt the software to meet changing user needs largely without programming support. For top management, the dictionary is also a tool for navigating through the system's complexity and assuring the integrity of management goals.
de Bont, Eefje G P M; Lepot, Julie M M; Hendrix, Dagmar A S; Loonen, Nicole; Guldemond-Hecker, Yvonne; Dinant, Geert-Jan; Cals, Jochen W L
2015-01-01
Objective Even though childhood fever is mostly self-limiting, children with fever constitute a considerable workload in primary care. Little is known about the number of contacts and management during general practitioners' (GPs) out-of-hours care. We investigated all fever related telephone contacts, consultations, antibiotic prescriptions and paediatric referrals of children during GP out-of-hours care within 1 year. Design Observational cohort study. Setting and patients We performed an observational cohort study at a large Dutch GP out-of-hours service. Children (<12 years) whose parents contacted the GP out-of-hours service for a fever related illness in 2012 were included. Main outcome measures Number of contacts and consultations, antibiotic prescription rates and paediatric referral rates. Results We observed an average of 14.6 fever related contacts for children per day at GP out-of-hours services, with peaks during winter months. Of 17 170 contacts in 2012, 5343 (31.1%) were fever related and 70.0% resulted in a GP consultation. One in four consultations resulted in an antibiotic prescription. Prescriptions increased with age and referrals to secondary care decreased with age (p<0.001). The majority of parents (89.5%) contacted the out-of-hours service only once during a fever episode and 7.6% of children were referred to secondary care. Conclusions This study shows that childhood fever does account for a large workload at GP out-of-hours services. One in three contacts is fever related and 70% of those febrile children are called in to be assessed by a GP. One in four consultations for childhood fever results in antibiotic prescribing and most consultations are managed in primary care without referral. PMID:25991452
Scaling Deep Learning workloads: NVIDIA DGX-1/Pascal and Intel Knights Landing
Gawande, Nitin A.; Daily, Jeff A.; Siegel, Charles; ...
2018-05-05
Deep Learning (DL) algorithms have become ubiquitous in data analytics. As a result, major computing vendors—including NVIDIA, Intel, AMD, and IBM—have architectural road maps influenced by DL workloads. Furthermore, several vendors have recently advertised new computing products as accelerating large DL workloads. Unfortunately, it is difficult for data scientists to quantify the potential of these different products. Here, this article provides a performance and power analysis of important DL workloads on two major parallel architectures: NVIDIA DGX-1 (eight Pascal P100 GPUs interconnected with NVLink) and Intel Knights Landing (KNL) CPUs interconnected with Intel Omni-Path or Cray Aries. Our evaluation consists of a cross section of convolutional neural net workloads: CifarNet, AlexNet, GoogLeNet, and ResNet50 topologies using the Cifar10 and ImageNet datasets. The workloads are vendor-optimized for each architecture. We use sequentially equivalent implementations to maintain iso-accuracy between parallel and sequential DL models. Our analysis indicates that although GPUs provide the highest overall performance, the gap can close for some convolutional networks; and the KNL can be competitive in performance/watt. We find that NVLink facilitates scaling efficiency on GPUs. However, its importance is heavily dependent on neural network architecture. Furthermore, for weak-scaling—sometimes encouraged by restricted GPU memory—NVLink is less important.
Mental Workload during Brain-Computer Interface Training
Felton, Elizabeth A.; Williams, Justin C.; Vanderheiden, Gregg C.; Radwin, Robert G.
2012-01-01
It is not well understood how people perceive the difficulty of performing brain-computer interface (BCI) tasks, which specific aspects of mental workload contribute the most, and whether there is a difference in perceived workload between participants who are able-bodied and disabled. This study evaluated mental workload using the NASA Task Load Index (TLX), a multi-dimensional rating procedure with six subscales: Mental Demands, Physical Demands, Temporal Demands, Performance, Effort, and Frustration. Able-bodied and motor disabled participants completed the survey after performing EEG-based BCI Fitts’ law target acquisition and phrase spelling tasks. The NASA-TLX scores were similar for able-bodied and disabled participants. For example, overall workload scores (range 0 – 100) for 1D horizontal tasks were 48.5 (SD = 17.7) and 46.6 (SD 10.3), respectively. The TLX can be used to inform the design of BCIs that will have greater usability by evaluating subjective workload between BCI tasks, participant groups, and control modalities. PMID:22506483
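For reference, the common unweighted ("raw TLX") way to turn the six subscale ratings into an overall workload score on a 0-100 scale is simply their mean; whether this or the weighted variant was used in the study above is not stated in the abstract, so the sketch below is an assumption, with made-up ratings.

```python
# Hedged sketch of the unweighted ("raw TLX") overall workload score: the mean
# of the six 0-100 subscale ratings. The example ratings are invented.
from statistics import mean

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings: dict) -> float:
    """Overall workload (0-100) as the unweighted mean of the six subscales."""
    return mean(ratings[s] for s in SUBSCALES)

example = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 40}
print(raw_tlx(example))   # about 48.3
```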
A data-driven approach to modeling physical fatigue in the workplace using wearable sensors.
Sedighi Maman, Zahra; Alamdar Yazdi, Mohammad Ali; Cavuoto, Lora A; Megahed, Fadel M
2017-11-01
Wearable sensors are currently being used to manage fatigue in professional athletics, transportation and mining industries. In manufacturing, physical fatigue is a challenging ergonomic/safety "issue" since it lowers productivity and increases the incidence of accidents. Therefore, physical fatigue must be managed. There are two main goals for this study. First, we examine the use of wearable sensors to detect physical fatigue occurrence in simulated manufacturing tasks. The second goal is to estimate the physical fatigue level over time. In order to achieve these goals, sensory data were recorded for eight healthy participants. Penalized logistic and multiple linear regression models were used for physical fatigue detection and level estimation, respectively. Important features from the five sensors locations were selected using Least Absolute Shrinkage and Selection Operator (LASSO), a popular variable selection methodology. The results show that the LASSO model performed well for both physical fatigue detection and modeling. The modeling approach is not participant and/or workload regime specific and thus can be adopted for other applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
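A hedged sketch of the two modelling steps described above, i.e. an L1-penalized (LASSO) logistic regression for detecting fatigue occurrence and a LASSO linear regression for estimating fatigue level, is given below. The feature matrix, regularization strengths and targets are synthetic placeholders, not the study's wearable-sensor data.

```python
# Hedged sketch: LASSO-penalized detection and level estimation of fatigue.
import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 30))          # placeholder features from five sensor sites
fatigued = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200)) > 0
fatigue_level = 2.0 * X[:, 0] + X[:, 3] + rng.normal(scale=0.5, size=200)

# Detection: penalized logistic regression; the L1 term drives most coefficients to zero.
detector = make_pipeline(StandardScaler(),
                         LogisticRegression(penalty="l1", solver="liblinear", C=0.5))
detector.fit(X, fatigued)
print("features kept for detection:", np.flatnonzero(detector[-1].coef_))

# Level estimation: LASSO multiple linear regression on the same features.
estimator = make_pipeline(StandardScaler(), Lasso(alpha=0.1))
estimator.fit(X, fatigue_level)
print("features kept for level estimation:", np.flatnonzero(estimator[-1].coef_))
```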
Hu, Jiangbi; Wang, Ronghua
2018-02-17
Guaranteeing a safe and comfortable driving workload can contribute to reducing traffic injuries. In order to provide safe and comfortable threshold values, this study attempted to classify driving workload from the aspects of human factors mainly affected by highway geometric conditions and to determine the thresholds of different workload classifications. This article states the hypothesis that the values of driver workload change within a certain range. Driving workload scales were defined based on a comprehensive literature review. Through comparative analysis of different psychophysiological measures, heart rate variability (HRV) was chosen as the representative measure for quantifying driving workload by field experiments. Seventy-two participants (36 car drivers and 36 large truck drivers) and 6 highways with different geometric designs were selected to conduct field experiments. A wearable wireless dynamic multiparameter physiological detector (KF-2) was employed to detect physiological data that were simultaneously correlated with the speed changes recorded by a Global Positioning System (GPS) (testing time, driving speeds, running track, and distance). Through performing statistical analyses, including the distribution of HRV during the flat, straight segments and P-P plots of modified HRV, a driving workload calculation model was proposed. Integrating driving workload scales with values, the threshold of each scale of driving workload was determined by classification and regression tree (CART) algorithms. The driving workload calculation model was suitable for driving speeds in the range of 40 to 120 km/h. The experimental data of 72 participants revealed that driving workload had a significant effect on modified HRV, revealing a change in driving speed. When the driving speed was between 100 and 120 km/h, drivers showed an apparent increase in the corresponding modified HRV. The threshold value of the normal driving workload K was between -0.0011 and 0
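The thresholding step can be illustrated with a small classification and regression tree (CART) whose learned split points on the modified HRV serve as candidate boundaries between workload scales. The labels, HRV values and tree depth below are invented for illustration and do not reproduce the thresholds reported in the study.

```python
# Hedged sketch of deriving workload-scale thresholds from modified HRV via CART.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
modified_hrv = rng.uniform(-0.004, 0.006, size=500).reshape(-1, 1)
# Hypothetical labelling rule standing in for the scaled workload ratings.
labels = np.digitize(modified_hrv.ravel(), bins=[-0.0011, 0.0015, 0.0035])

cart = DecisionTreeClassifier(max_depth=2, random_state=0)
cart.fit(modified_hrv, labels)

# The learned split points are the candidate thresholds between workload scales.
thresholds = sorted(t for t in cart.tree_.threshold if t != -2.0)   # -2 marks leaf nodes
print("CART threshold candidates:", [round(t, 4) for t in thresholds])
```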
The Sternberg Task as a Workload Metric in Flight Handling Qualities Research
NASA Technical Reports Server (NTRS)
Hemingway, J. C.
1984-01-01
The objective of this research was to determine whether the Sternberg item-recognition task, employed as a secondary task measure of spare mental capacity for flight handling qualities (FHQ) simulation research, could help to differentiate between different flight-control conditions. FHQ evaluations were conducted on the Vertical Motion Simulator at Ames Research Center to investigate different primary flight-control configurations, and selected stability and control augmentation levels for helicopters engaged in low-level flight regimes. The Sternberg task was superimposed upon the primary flight-control task in a balanced experimental design. The results of parametric statistical analysis of Sternberg secondary task data failed to support the continued use of this task as a measure of pilot workload. In addition to the secondary task, subjects provided Cooper-Harper pilot ratings (CHPR) and responded to a workload questionnaire. The CHPR data also failed to provide reliable statistical discrimination between FHQ treatment conditions; some insight into the behavior of the secondary task was gained from the workload questionnaire data.
NASA Technical Reports Server (NTRS)
James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.
1990-01-01
As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to get a rich picture of the influences on sequences of verbal or nonverbal behavior.
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
2011-01-01
Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allows both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created a RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting
Implications for Academic Workload of the Changing Role of Distance Educators
ERIC Educational Resources Information Center
Bezuidenhout, Adéle
2015-01-01
The changing work roles and resulting workloads of distance educators hold significant implications for the wellbeing and mental health of academics. New work roles include redesigning curricula for online delivery, increasing staff-student ratios and demands for student-support, management of part-time staff, and 24-h availability. This research…
Women and Academic Workloads: Career Slow Lane or Cul-de-Sac?
ERIC Educational Resources Information Center
Barrett, Lucinda; Barrett, Peter
2011-01-01
Career progression for women academics to higher levels is not in proportion to their representation within the profession. This paper looks at theories about this and relates them to current practices within universities for allocating work. The management of workloads can disadvantage women through a number of interactive factors. Interruptions…
NASA Astrophysics Data System (ADS)
Liu, Y.; Zhou, J.; Song, L.; Zou, Q.; Guo, J.; Wang, Y.
2014-02-01
In recent years, an important development in flood management has been the focal shift from flood protection towards flood risk management. This change greatly promoted the progress of flood control research in a multidisciplinary way. Moreover, given the growing complexity and uncertainty in many decision situations of flood risk management, traditional methods, e.g., tight-coupling integration of one or more quantitative models, are not enough to provide decision support for managers. Within this context, this paper presents a beneficial methodological framework to enhance the effectiveness of decision support systems, through the dynamic adaptation of support regarding the needs of the decision-maker. In addition, we illustrate a loose-coupling technical prototype for integrating heterogeneous elements, such as multi-source data, multidisciplinary models, GIS tools and existing systems. The main innovation is the application of model-driven concepts, which put the system in a state of continuous iterative optimization. We define the new system as a model-driven decision support system (MDSS). Two characteristics that differentiate the MDSS are as follows: (1) it is made accessible to non-technical specialists; and (2) it has a higher level of adaptability and compatibility. Furthermore, the MDSS was employed to manage the flood risk in the Jingjiang flood diversion area, located in central China near the Yangtze River. Compared with traditional solutions, we believe that this model-driven method is efficient, adaptable and flexible, and thus has bright prospects of application for comprehensive flood risk management.
Designing workload analysis questionnaire to evaluate needs of employees
NASA Astrophysics Data System (ADS)
Astuti, Rahmaniyah Dwi; Navi, Muhammad Abdu Haq
2018-02-01
A mismatch between workload and work capacity is one of the main obstacles to achieving optimal results. In office settings, it is difficult to determine workload because the work is non-repetitive. Employees do work based on the targets set in a working period. At the end of the period, an evaluation of employee performance is usually carried out to evaluate the needs of employees. The aim of this study is to design a workload analysis questionnaire tool to evaluate the efficiency level of a position as an indicator of staffing needs, based on the Indonesian State Employment Agency Regulation on workload analysis. This research is applied to the State-Owned Enterprise PT. X by selecting 3 positions as a pilot project. Position A is held by 2 employees, position B is held by 7 employees, and position C is held by 6 employees. From the calculation results, position A has an efficiency level of 1.33 or "very good", position B has an efficiency level of 1.71 or "enough", and position C has an efficiency level of 1.03 or "very good". Applying this tool suggests that position A needs 3 employees, position B needs 5, and position C needs 6. The difference between the current number of employees and the calculated need is then analyzed by interviewing the employees to obtain more data about personal perceptions. It can be concluded that this workload evaluation tool can be used as an alternative solution for evaluating staffing needs in an office.
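The kind of calculation such a questionnaire feeds can be sketched as a ratio of annual task workload to staffing capacity, with the implied headcount rounded up. The effective-hours figure, the formula and the category cut-offs below are assumptions chosen only to mirror the reported values, not the regulation's exact rules.

```python
# Hedged sketch of a position's efficiency level and implied headcount; the
# constants and category bands are assumptions for illustration.
import math

EFFECTIVE_HOURS_PER_YEAR = 1250   # assumed effective working hours per employee

def efficiency_level(task_hours_per_year: float, current_staff: int) -> float:
    return task_hours_per_year / (current_staff * EFFECTIVE_HOURS_PER_YEAR)

def required_staff(task_hours_per_year: float) -> int:
    return math.ceil(task_hours_per_year / EFFECTIVE_HOURS_PER_YEAR)

def category(level: float) -> str:
    # Hypothetical bands: close to 1.0 is well balanced, well above 1.0 is overloaded.
    if level > 1.5:
        return "enough (more staff needed)"
    if level >= 0.9:
        return "very good"
    return "underloaded"

workload_hours = 3300                      # illustrative annual total for one position
print(efficiency_level(workload_hours, 2)) # 1.32 with 2 incumbents
print(required_staff(workload_hours))      # 3 employees suggested
print(category(efficiency_level(workload_hours, 2)))
```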
A workload model and measures for computer performance evaluation
NASA Technical Reports Server (NTRS)
Kerner, H.; Kuemmerle, K.
1972-01-01
A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors, CPU, I/O processor, etc., and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of the type is defined by the weights of its elementary processes and its structure by the amount and sequence of transitions between its elementary processes. A set of types is batched to a mix. Mixes of identical cost are considered as equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions enable determination of cost effectiveness of different workloads on a machine. Subsequently performance parameters such as throughput rate, gain factor, internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large scale computer system.
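A hedged sketch of this formalism follows: elementary processes carry an execution cost, a type is a weighted composition of elementary processes, and a mix batches types so that mixes of equal total cost count as equivalent amounts of workload. The example costs and weights are invented for illustration.

```python
# Hedged sketch of the workload formalism: elementary processes, types, mixes.
from dataclasses import dataclass

@dataclass(frozen=True)
class ElementaryProcess:
    name: str       # e.g. "cpu", "io"
    cost: float     # cost of one execution on its processor

@dataclass
class WorkloadType:
    # weights: how many executions of each elementary process one unit of this type performs
    weights: dict   # ElementaryProcess -> count

    def cost(self) -> float:
        return sum(ep.cost * n for ep, n in self.weights.items())

def mix_cost(mix) -> float:
    """Total cost of a mix given as [(WorkloadType, multiplicity), ...]."""
    return sum(t.cost() * k for t, k in mix)

cpu = ElementaryProcess("cpu", cost=1.0)
io = ElementaryProcess("io", cost=4.0)

cpu_bound = WorkloadType({cpu: 8, io: 1})    # cost 12 per unit
io_bound = WorkloadType({cpu: 2, io: 3})     # cost 14 per unit

# Two differently composed mixes scaled to the same total cost are treated as
# equivalent amounts of workload when comparing machines.
print(mix_cost([(cpu_bound, 7), (io_bound, 6)]))   # 7*12 + 6*14 = 168
print(mix_cost([(cpu_bound, 14)]))                 # 14*12 = 168
```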
SITE TECHNOLOGY CAPSULE: GIS/KEY ENVIRONMENTAL DATA MANAGEMENT SYSTEM
GIS/Key™ is a comprehensive environmental database management system that integrates site data and graphics, enabling the user to create geologic cross-sections; boring logs; potentiometric, isopleth, and structure maps; summary tables; and hydrographs. GIS/Key™ is menu-driven an...
Modified Petri net model sensitivity to workload manipulations
NASA Technical Reports Server (NTRS)
White, S. A.; Mackinnon, D. P.; Lyman, J.
1986-01-01
Modified Petri Nets (MPNs) are investigated as a workload modeling tool. The results of an exploratory study of the sensitivity of MPNs to workload manipulations in a dual task are described. Petri nets have been used to represent systems with asynchronous, concurrent and parallel activities (Peterson, 1981). These characteristics led some researchers to suggest the use of Petri nets in workload modeling, where concurrent and parallel activities are common. Petri nets are represented by places and transitions. In the workload application, places represent operator activities and transitions represent events. MPNs have been used to formally represent task events and activities of a human operator in a man-machine system. Some descriptive applications demonstrate the usefulness of MPNs in the formal representation of systems. It is the general hypothesis herein that in addition to descriptive applications, MPNs may be useful for workload estimation and prediction. The results are reported of the first of a series of experiments designed to develop and test a MPN system of workload estimation and prediction. This first experiment is a screening test of MPN model general sensitivity to changes in workload. Positive results from this experiment will justify the more complicated analyses and techniques necessary for developing a workload prediction system.
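A minimal sketch of the Petri-net representation follows: places hold tokens (e.g. operator availability and pending task events) and a transition fires only when all of its input places are marked, which is how concurrent dual-task demands show up as contention for a single operator token. The toy net below is an invented example, not the MPN model used in the study.

```python
# Hedged sketch of a Petri-net style model of dual-task operator activity.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)            # place -> token count
        self.transitions = {}                   # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Dual-task toy: the operator is free, a tracking event and a monitoring event arrive.
net = PetriNet({"operator_free": 1, "tracking_event": 1, "monitor_event": 1})
net.add_transition("start_tracking", ["operator_free", "tracking_event"], ["tracking_active"])
net.add_transition("start_monitoring", ["operator_free", "monitor_event"], ["monitoring_active"])
net.fire("start_tracking")
print(net.enabled("start_monitoring"))   # False: concurrent demand exceeds operator capacity
print(net.marking)
```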
Data-driven Modelling for decision making under uncertainty
NASA Astrophysics Data System (ADS)
Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus
2018-01-01
Decision making under uncertainty has become a topic of intense discussion in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for a decision-making problem under uncertainty by using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Candidate models are tested against error criteria, and the model with the smallest error is selected as the best model to use.
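A hedged sketch of the selection step follows: fit several candidate data-driven models and keep the one with the smallest held-out error. The candidate set and synthetic data are illustrative assumptions, not the paper's case study.

```python
# Hedged sketch: pick the candidate model with the smallest held-out error.
import numpy as np
from sklearn.linear_model import BayesianRidge, LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(6)
X = rng.uniform(-2, 2, size=(120, 1))
y = 1.5 * X.ravel() ** 2 + rng.normal(scale=0.4, size=120)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

candidates = {
    "linear": LinearRegression(),
    "quadratic": make_pipeline(PolynomialFeatures(2), LinearRegression()),
    "bayesian_ridge": make_pipeline(PolynomialFeatures(2), BayesianRidge()),
}
errors = {}
for name, model in candidates.items():
    model.fit(X_tr, y_tr)
    errors[name] = mean_squared_error(y_te, model.predict(X_te))

best = min(errors, key=errors.get)
print({k: round(v, 3) for k, v in errors.items()}, "-> best model:", best)
```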
An investigation of the 'von Restorff' phenomenon in post-test workload ratings
NASA Technical Reports Server (NTRS)
Thornton, D. C.
1985-01-01
The von Restorff effect in post-task ratings of task difficulty is examined. Nine subjects performed a hovercraft simulation task which combined elements of skill-based tracking and rule- and knowledge-based process control for five days of one-hour sessions. The effects of isolated increases in workload on ratings of task performance, and on the number of command errors and river bank hits, are analyzed. It is observed that the position of the workload increase affects the number of bank hits and command errors. The data reveal that factors not directly related to task performance influence subjective ratings, and post-task ratings of workload are biased.
Pilot mental workload: how well do pilots really perform?
Morris, Charles H; Leung, Ying K
2006-12-15
The purpose of this study was to investigate the effects of increasing mental demands on various aspects of aircrew performance. In particular, the robustness of the prioritization and allocation hierarchy of aviate-navigate-communicate was examined, a hierarchy commonly used within the aviation industry. A total of 42 trainee pilots were divided into three workload groups (low, medium, high) to complete a desktop, computer-based exercise that simulated combinations of generic flight deck activities: flight control manipulation, rule-based actions and higher level cognitive processing, in addition to Air Traffic Control instructions that varied in length from one chunk of auditory information to seven chunks. It was found that as mental workload and auditory input increased, participants experienced considerable difficulty in carrying out the primary manipulation task. A similar decline in prioritization was also observed. Moreover, when pilots were under a high mental workload their ability to comprehend more than two chunks of auditory data deteriorated rapidly.
Heavy physician workloads: impact on physician attitudes and outcomes.
Williams, Eric S; Rondeau, Kent V; Xiao, Qian; Francescutti, Louis H
2007-11-01
The intensity of physician workload has been increasing with the well-documented changes in the financing, organization and delivery of care. It is possible that these stressors have reached a point where they pose a serious policy issue for the entire healthcare system through their diminution of physicians' ability to interact effectively with patients as physicians become burned out, stressed and dissatisfied. This policy question is framed in a conceptual model linking workloads with five key outcomes (patient care quality, individual performance, absenteeism, turnover and organizational performance) mediated by physician stress and satisfaction. This model showed a good fit to the data in a structural equation analysis. Ten of the 12 hypothesized pathways between variables were significant and supported the mediating role of stress and satisfaction. These results suggest that workloads, stress and satisfaction have significant and material impacts on patient care quality, individual performance, absenteeism, turnover and organizational performance. Implications of these results and directions for future research are discussed.
Yurko, Yuliya Y; Scerbo, Mark W; Prabhu, Ajita S; Acker, Christina E; Stefanidis, Dimitrios
2010-10-01
Increased workload during task performance may increase fatigue and facilitate errors. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is a previously validated tool for workload self-assessment. We assessed the relationship of workload and performance during simulator training on a complex laparoscopic task. NASA-TLX workload data from three separate trials were analyzed. All participants were novices (n = 28), followed the same curriculum on the Fundamentals of Laparoscopic Surgery suturing model, and were tested in the animal operating room (OR) on a Nissen fundoplication model after training. Performance and workload scores were recorded at baseline, after proficiency achievement, and during the test. Performance, NASA-TLX scores, and inadvertent injuries during the test were analyzed and compared. Workload scores declined during training and mirrored performance changes. NASA-TLX scores correlated significantly with performance scores (r = -0.5, P < 0.001). Participants with higher workload scores caused more inadvertent injuries to adjacent structures in the OR (r = 0.38, P < 0.05). Increased mental and physical workload scores at baseline correlated with higher workload scores in the OR (r = 0.52-0.82; P < 0.05) and more inadvertent injuries (r = 0.52, P < 0.01). Increased workload is associated with inferior task performance and higher likelihood of errors. The NASA-TLX questionnaire accurately reflects workload changes during simulator training and may identify individuals more likely to experience high workload and more prone to errors during skill transfer to the clinical environment.
[Nursing workloads and working conditions: integrative review].
Schmoeller, Roseli; Trindade, Letícia de Lima; Neis, Márcia Binder; Gelbcke, Francine Lima; de Pires, Denise Elvira Pires
2011-06-01
This study reviews the theoretical production concerning workloads and working conditions for nurses. For that, an integrative review was carried out using scientific articles, theses and dissertations indexed in two Brazilian databases, Virtual Health Care Library (Biblioteca Virtual de Saúde) and Digital Database of Dissertations (Banco Digital de Teses), over the last ten years. From 132 identified studies, 27 were selected. Results indicate workloads as responsible for professional weariness, affecting the occurrence of work accidents and health problems. In order to achieve adequate workloads, studies indicate some strategies, such as having an adequate number of employees, continuing education, and better working conditions. The challenge is to continue research that reveals more precisely the relationships between workloads, working conditions, and health of the nursing team.
Tubbs-Cooley, Heather L; Mara, Constance A; Carle, Adam C; Gurses, Ayse P
2018-02-12
The NASA Task Load Index (NASA-TLX) is a subjective workload assessment scale developed for use in aviation and increasingly applied to healthcare. The scale purports to measure overall workload as a single variable calculated by summing responses to six items. Since no data address the validity of this scoring approach in health care, we evaluated the single factor structure of the NASA-TLX as a measure of overall workload among intensive care nurses. Confirmatory factor analysis of data from two studies of nurse workload in neonatal, paediatric, and adult intensive care units. Study 1 data were obtained from 136 nurses in one neonatal intensive care unit. Study 2 data were collected from 300 nurses in 17 adult, paediatric and neonatal units. Nurses rated their workload using the NASA-TLX's paper version. A single factor model testing whether all six items measured a single overall workload variable fit least well (RMSEA = 0.14; CFI = 0.91; TLI = 0.85). A second model that specified two items as outcomes of overall workload had acceptable fit (RMSEA = 0.08; CFI = 0.97; TLI = 0.95) while a third model of four items fit best (RMSEA = 0.06; CFI > 0.99; TLI = 0.99). A summed score from four of six NASA-TLX items appears to most reliably measure a single overall workload variable among intensive care nurses. Copyright © 2018 Elsevier Ltd. All rights reserved.
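A sketch of how the competing single-factor model can be specified and its fit indices obtained is given below, assuming the Python semopy package (lavaan-style syntax) and a data frame whose columns are the six subscale ratings; the synthetic data exist only to make the example runnable and will not reproduce the reported RMSEA, CFI or TLI values.

```python
# Hedged sketch of the single-factor confirmatory model, assuming semopy.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(7)
latent = rng.normal(size=300)
df = pd.DataFrame({name: 10 * (latent + rng.normal(scale=s, size=300)) + 50
                   for name, s in [("mental", 0.6), ("physical", 0.9), ("temporal", 0.7),
                                   ("performance", 1.4), ("effort", 0.6), ("frustration", 1.2)]})

# Model 1: all six items load on one overall-workload factor.
single_factor = "workload =~ mental + physical + temporal + performance + effort + frustration"
m = semopy.Model(single_factor)
m.fit(df)
stats = semopy.calc_stats(m)
print(stats[["RMSEA", "CFI", "TLI"]])   # compare against the fit indices reported above
```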
File-access characteristics of parallel scientific workloads
NASA Technical Reports Server (NTRS)
Nieuwejaar, Nils; Kotz, David; Purakayastha, Apratim; Best, Michael; Ellis, Carla Schlatter
1995-01-01
Phenomenal improvements in the computational performance of multiprocessors have not been matched by comparable gains in I/O system performance. This imbalance has resulted in I/O becoming a significant bottleneck for many scientific applications. One key to overcoming this bottleneck is improving the performance of parallel file systems. The design of a high-performance parallel file system requires a comprehensive understanding of the expected workload. Unfortunately, until recently, no general workload studies of parallel file systems have been conducted. The goal of the CHARISMA project was to remedy this problem by characterizing the behavior of several production workloads, on different machines, at the level of individual reads and writes. The first set of results from the CHARISMA project describe the workloads observed on an Intel iPSC/860 and a Thinking Machines CM-5. This paper is intended to compare and contrast these two workloads for an understanding of their essential similarities and differences, isolating common trends and platform-dependent variances. Using this comparison, we are able to gain more insight into the general principles that should guide parallel file-system design.
Subjective rating scales as a workload
NASA Technical Reports Server (NTRS)
Bird, K. L.
1981-01-01
A multidimensional bipolar-adjective rating scale is employed as a subjective measure of operator workload in the performance of a one-axis tracking task. The rating scale addressed several dimensions of workload, including cognitive, physical, and perceptual task loading as well as fatigue and stress effects. Eight subjects performed a one-axis tracking task (with six levels of difficulty) and rated these tasks on several workload dimensions. Performance measures were tracking error RMS (root-mean square) and the standard deviation of control stick output. Significant relationships were observed between these performance measures and skill required, task complexity, attention level, task difficulty, task demands, and stress level.
Operating room clinicians' ratings of workload: a vignette simulation study.
Wallston, Kenneth A; Slagle, Jason M; Speroff, Ted; Nwosu, Sam; Crimin, Kimberly; Feurer, Irene D; Boettcher, Brent; Weinger, Matthew B
2014-06-01
Increased clinician workload is associated with medical errors and patient harm. The Quality and Workload Assessment Tool (QWAT) measures anticipated (pre-case) and perceived (post-case) clinical workload during actual surgical procedures using ratings of individual and team case difficulty from every operating room (OR) team member. The purpose of this study was to examine the QWAT ratings of OR clinicians who were not present in the OR but who read vignettes compiled from actual case documentation to assess interrater reliability and agreement with ratings made by clinicians involved in the actual cases. Thirty-six OR clinicians (13 anesthesia providers, 11 surgeons, and 12 nurses) used the QWAT to rate 6 cases varying from easy to moderately difficult based on actual ratings made by clinicians involved with the cases. Cases were presented and rated in random order. Before rating anticipated individual and team difficulty, the raters read prepared clinical vignettes containing case synopses and much of the same written case information that was available to the actual clinicians before the onset of each case. Then, before rating perceived individual and team difficulty, they read part 2 of the vignette consisting of detailed role-specific intraoperative data regarding the anesthetic and surgical course, unusual events, and other relevant contextual factors. Surgeons had higher interrater reliability on the QWAT than did OR nurses or anesthesia providers. For the anticipated individual and team workload ratings, there were no statistically significant differences between the actual ratings and the ratings obtained from the vignettes. There were differences for the 3 provider types in perceived individual workload for the median difficulty cases and in the perceived team workload for the median and more difficult cases. The case difficulty items on the QWAT seem to be sufficiently reliable and valid to be used in other studies of anticipated and perceived clinical workload.
Data-driven discovery of new Dirac semimetal materials
NASA Astrophysics Data System (ADS)
Yan, Qimin; Chen, Ru; Neaton, Jeffrey
In recent years, a significant amount of materials property data from high-throughput computations based on density functional theory (DFT) and the application of database technologies have enabled the rise of data-driven materials discovery. In this work, we initiate the extension of the data-driven materials discovery framework to the realm of topological semimetal materials and to accelerate the discovery of novel Dirac semimetals. We implement currently available workflows and develop new ones to data-mine the Materials Project database for novel Dirac semimetals with desirable band structures and symmetry protected topological properties. This data-driven effort relies on the successful development of several automatic data generation and analysis tools, including a workflow for the automatic identification of topological invariants and pattern recognition techniques to find specific features in a massive number of computed band structures. Utilizing this approach, we successfully identified more than 15 novel Dirac point and Dirac nodal line systems that have not been theoretically predicted or experimentally identified. This work is supported by the Materials Project Predictive Modeling Center through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231.
Progress in mental workload measurement
NASA Technical Reports Server (NTRS)
Moray, Neville; Turksen, Burhan; Aidie, Paul; Drascic, David; Eisen, Paul
1986-01-01
Two new techniques are described, one using subjective, the other physiological data for the measurement of workload in complex tasks. The subjective approach uses fuzzy measurement to analyze and predict the difficulty of combinations of skill based and rule based behavior from the difficulty of skill based behavior and rule based behavior measured separately. The physiological technique offers an on-line real-time filter for measuring the Mulder signal at 0.1 Hz in the heart rate variability spectrum.
Distributed data analysis in ATLAS
NASA Astrophysics Data System (ADS)
Nilsson, Paul; Atlas Collaboration
2012-12-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.
The effects of practice on tracking and subjective workload
NASA Technical Reports Server (NTRS)
Hancock, P. A.; Robinson, M. A.; Chu, A. L.; Hansen, D. R.; Vercruyssen, M.
1989-01-01
Six college-age male subjects performed one hundred, two-minute trials on a second-order tracking task. After each trial, subjects estimated perceived workload using both the NASA TLX and SWAT workload assessment procedures. Results confirmed an expected performance improvement on the tracking task which followed traditional learning curves within the performance of each individual. Perceived workload also decreased for both scales across trials. While performance variability significantly decreased across trials, workload variability remained constant. One month later, the same subjects returned to complete the second experiment in the sequence which was a retention replication of the first experiment. Results replicated those for the first experiment except that both performance error and workload were at reduced overall levels. Results in general affirm a parallel workload reduction with performance improvement, an observation consistent with a resource-based view of automaticity.
Timesharing performance as an indicator of pilot mental workload
NASA Technical Reports Server (NTRS)
Casper, Patricia A.; Kantowitz, Barry H.; Sorkin, Robert D.
1988-01-01
Attentional deficits (workloads) were evaluated in a timesharing task. The results from this and other experiments were incorporated into an expert system designed to provide workload metric selection advice to non-experts in the field interested in operator workload.
Timesharing performance as an indicator of pilot mental workload
NASA Technical Reports Server (NTRS)
Casper, Patricia A.
1988-01-01
The research was performed in two simultaneous phases, each intended to identify and manipulate factors related to operator mental workload. The first phase concerned evaluation of attentional deficits (workloads) in a timesharing task. Work in the second phase involved incorporating the results from these and other experiments into an expert system designed to provide workload metric selection advice to nonexperts in the field interested in operator workload. The results of the experiments conducted are summarized.
Measuring Pilot Workload in a Moving-base Simulator. Part 2: Building Levels of Workload
NASA Technical Reports Server (NTRS)
Kantowitz, B. H.; Hart, S. G.; Bortolussi, M. R.; Shively, R. J.; Kantowitz, S. C.
1984-01-01
Studies of pilot behavior in flight simulators often use a secondary task as an index of workload. It is routine to regard flying as the primary task and some less complex task as the secondary task. While this assumption is quite reasonable for most secondary tasks used to study mental workload in aircraft, the treatment of flying a simulator through some carefully crafted flight scenario as a unitary task is less justified. The present research acknowledges that total mental workload depends upon the specific nature of the sub-tasks that a pilot must complete. As a first approximation, flight tasks were divided into three levels of complexity. The simplest level (called the Base Level) requires elementary maneuvers that do not utilize all the degrees of freedom of which an aircraft, or a moving-base simulator, is capable. The second level (called the Paired Level) requires the pilot to simultaneously execute two Base Level tasks. The third level (called the Complex Level) imposes three simultaneous constraints upon the pilot.
Mental workload during n-back task-quantified in the prefrontal cortex using fNIRS.
Herff, Christian; Heger, Dominic; Fortmann, Ole; Hennrich, Johannes; Putze, Felix; Schultz, Tanja
2013-01-01
When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving) it is desired to not distract the users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, a robust single trial classification is a crucial prerequisite for the purpose of dynamically adapting HCIs to the workload of its user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two, or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS, can be used to robustly quantify and classify mental workload. Single trial analysis is still a young field that suffers from a general lack of standards. To increase comparability of fNIRS methods and results, the data corpus for this study is made available online.
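As a rough illustration of what single-trial workload classification involves (not the authors' pipeline), one can reduce each trial to a feature vector, for example the mean oxygenated/deoxygenated haemoglobin change per prefrontal channel, and cross-validate a simple classifier over the three n-back levels. Channel count, feature choice, classifier, and the synthetic data below are all assumptions:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 90 trials x 16 features (mean HbO and HbR change for
# 8 prefrontal fNIRS channels), with labels 1/2/3 for the n-back level.
n_trials, n_features = 90, 16
y = np.repeat([1, 2, 3], n_trials // 3)
X = rng.normal(size=(n_trials, n_features)) + 0.6 * y[:, None]  # toy workload effect

# Single-trial discrimination of workload level, estimated with 5-fold CV.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean single-trial accuracy: {scores.mean():.2f} (chance ~ 0.33)")
```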
Mental workload during n-back task—quantified in the prefrontal cortex using fNIRS
Herff, Christian; Heger, Dominic; Fortmann, Ole; Hennrich, Johannes; Putze, Felix; Schultz, Tanja
2014-01-01
When interacting with technical systems, users experience mental workload. Particularly in multitasking scenarios (e.g., interacting with the car navigation system while driving) it is desired to not distract the users from their primary task. For such purposes, human-machine interfaces (HCIs) are desirable which continuously monitor the users' workload and dynamically adapt the behavior of the interface to the measured workload. While memory tasks have been shown to elicit hemodynamic responses in the brain when averaging over multiple trials, a robust single trial classification is a crucial prerequisite for the purpose of dynamically adapting HCIs to the workload of its user. The prefrontal cortex (PFC) plays an important role in the processing of memory and the associated workload. In this study of 10 subjects, we used functional Near-Infrared Spectroscopy (fNIRS), a non-invasive imaging modality, to sample workload activity in the PFC. The results show up to 78% accuracy for single-trial discrimination of three levels of workload from each other. We use an n-back task (n ∈ {1, 2, 3}) to induce different levels of workload, forcing subjects to continuously remember the last one, two, or three of rapidly changing items. Our experimental results show that measuring hemodynamic responses in the PFC with fNIRS, can be used to robustly quantify and classify mental workload. Single trial analysis is still a young field that suffers from a general lack of standards. To increase comparability of fNIRS methods and results, the data corpus for this study is made available online. PMID:24474913
Understanding I/O workload characteristics of a Peta-scale storage system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Youngjae; Gunasekaran, Raghul
2015-01-01
Understanding workload characteristics is critical for optimizing and improving the performance of current systems and software, and for architecting new storage systems based on observed workload patterns. In this paper, we characterize the I/O workloads of scientific applications on one of the world's fastest high-performance computing (HPC) storage clusters, Spider, at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF flagship petascale simulation platform, Titan, and other large HPC clusters, with over 250 thousand compute cores in total, depend on Spider for their I/O needs. We characterize the system utilization, the demands of reads and writes, idle time, storage space utilization, and the distribution of read requests to write requests for the peta-scale storage system. From this study, we develop synthesized workloads, and we show that the read and write I/O bandwidth usage as well as the inter-arrival time of requests can be modeled as a Pareto distribution. We also study the I/O load imbalance problems using I/O performance data collected from the Spider storage system.
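A minimal sketch of the kind of distribution fit reported here, using synthetic request inter-arrival times in place of the Spider traces (the shape and scale values below are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic I/O request inter-arrival times (seconds) standing in for trace data.
true_shape = 1.8
inter_arrival = stats.pareto.rvs(true_shape, scale=0.01, size=50_000, random_state=rng)

# Fit a Pareto distribution by maximum likelihood (location pinned at 0).
shape, loc, scale = stats.pareto.fit(inter_arrival, floc=0)
print(f"fitted Pareto shape={shape:.2f}, scale={scale:.4f}")

# Quick sanity check: compare empirical and fitted tail probabilities.
for t in (0.02, 0.05, 0.1):
    emp = np.mean(inter_arrival > t)
    fit = stats.pareto.sf(t, shape, loc=loc, scale=scale)
    print(f"P(inter-arrival > {t:>4}s): empirical {emp:.3f} vs fitted {fit:.3f}")
```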
Factors Related to School Nurse Workload
ERIC Educational Resources Information Center
Jameson, Beth E.; Engelke, Martha Keehner; Anderson, Lori S.; Endsley, Patricia; Maughan, Erin D.
2018-01-01
Recognizing the need for a school nurse workload model based on more than the number of students in a caseload, the National Association of School Nurses issued recommendations related to measuring school nurse workload. Next, a workforce acuity task force (WATF) was charged with identifying the steps needed to further the recommendations. As a…
ERIC Educational Resources Information Center
Endsley, Patricia
2017-01-01
The purpose of this scoping review was to survey the most recent (5 years) acute care, community health, and mental health nursing workload literature to understand themes and research avenues that may be applicable to school nursing workload research. The search for empirical and nonempirical literature was conducted using search engines such as…
Library Faculty Workload: A Case Study in Implementing a Teaching Faculty Model.
ERIC Educational Resources Information Center
Goudy, Frank Wm.
In the January 1988 issue of "Library Administration & Management," an article titled "The Dilemma of Library Faculty Workload: One Solution" described the efforts of the library faculty at Western Illinois University to achieve a more equitable situation compared to other faculty on the campus. A totally new approach to…
Heart Rate Variability as a Measure of Airport Ramp-Traffic Controllers Workload
NASA Technical Reports Server (NTRS)
Hayashi, Miwa; Dulchinos, Victoria Lee
2016-01-01
Heart Rate Variability (HRV) has been reported to reflect a person's cognitive and emotional stress levels, and may offer an objective measure of human operators' workload levels that can be recorded continuously and unobtrusively during task performance. The present paper compares the HRV data collected during a human-in-the-loop simulation of airport ramp-traffic control operations with the controller participants' own verbal self-reported ratings of their workload.
Catastrophe models for cognitive workload and fatigue in N-back tasks.
Guastello, Stephen J; Reiter, Katherine; Malon, Matthew; Timm, Paul; Shircel, Anton; Shaline, James
2015-04-01
N-back tasks place a heavy load on working memory, and thus make good candidates for studying cognitive workload and fatigue (CWLF). This study extended previous work on CWLF which separated the two phenomena with two cusp catastrophe models. Participants were 113 undergraduates who completed 2-back and 3-back tasks with both auditory and visual stimuli simultaneously. Task data were complemented by several measures hypothesized to be related to cognitive elasticity and compensatory abilities and the NASA TLX ratings of subjective workload. The adjusted R2 was .980 for the workload model, which indicated a highly accurate prediction with six bifurcation (elasticity versus rigidity) effects: algebra flexibility, TLX performance, effort, and frustration; and psychosocial measures of inflexibility and monitoring. There were also two cognitive load effects (asymmetry): 2 vs. 3-back and TLX temporal demands. The adjusted R2 was .454 for the fatigue model, which contained two bifurcation variables indicating the amount of work done, and algebra flexibility as the compensatory ability variable. Both cusp models were stronger than the next best linear alternative model. The study makes an important step forward by uncovering an apparently complete model for workload, finding the role of subjective workload in the context of performance dynamics, and finding CWLF dynamics in yet another type of memory-intensive task. The results were also consistent with the developing notion that performance deficits induced by workload and deficits induced by fatigue result from the impact of the task on the workspace and executive functions of working memory respectively.
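For readers unfamiliar with how a cusp catastrophe model is estimated from data, the sketch below illustrates the polynomial-regression form commonly associated with Guastello's approach: the change in the standardized performance variable is regressed on a cubic term, a quadratic term, a bifurcation variable multiplied by performance, and an asymmetry (load) variable. The variable names and simulated data are assumptions, not the study's exact specification:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical standardized scores: z1 = performance at time 1, a = load
# (asymmetry), b = elasticity/rigidity (bifurcation), dz = change in performance.
z1 = rng.normal(size=n)
a = rng.normal(size=n)   # e.g. 2-back vs 3-back load
b = rng.normal(size=n)   # e.g. a flexibility or TLX-based elasticity measure
dz = -0.8 * z1**3 + 0.5 * b * z1 + 0.6 * a + rng.normal(scale=0.3, size=n)

# Cusp difference-equation regression:
#   dz = beta0 + beta1*z1^3 + beta2*z1^2 + beta3*(b*z1) + beta4*a
X = np.column_stack([np.ones(n), z1**3, z1**2, b * z1, a])
beta, *_ = np.linalg.lstsq(X, dz, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((dz - pred) ** 2) / np.sum((dz - dz.mean()) ** 2)
print("coefficients (const, z^3, z^2, b*z, a):", np.round(beta, 2))
print(f"R^2 of the cusp regression: {r2:.3f}")
```

In the study's terms, a strong fit of this cusp form relative to the best linear alternative is what supports the catastrophe interpretation of workload and fatigue effects.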
Use of EEG workload indices for diagnostic monitoring of vigilance decrement.
Kamzanova, Altyngul T; Kustubayeva, Almira M; Matthews, Gerald
2014-09-01
A study was run to test which of five electroencephalographic (EEG) indices was most diagnostic of loss of vigilance at two levels of workload. EEG indices of alertness include conventional spectral power measures as well as indices combining measures from multiple frequency bands, such as the Task Load Index (TLI) and the Engagement Index (EI). However, it is unclear which indices are optimal for early detection of loss of vigilance. Ninety-two participants were assigned to one of two experimental conditions, cued (lower workload) and uncued (higher workload), and then performed a 40-min visual vigilance task. Performance on this task is believed to be limited by attentional resource availability. EEG was recorded continuously. Performance, subjective state, and workload were also assessed. The task showed a vigilance decrement in performance; cuing improved performance and reduced subjective workload. Lower-frequency alpha (8 to 10.9 Hz) and TLI were most sensitive to the task parameters. The magnitude of temporal change was larger for lower-frequency alpha. Surprisingly, higher TLI was associated with superior performance. Frontal theta and EI were influenced by task workload only in the final period of work. Correlational data also suggested that the indices are distinct from one another. Lower-frequency alpha appears to be the optimal index for monitoring vigilance on the task used here, but further work is needed to test how diagnosticity of EEG indices varies with task demands. Lower-frequency alpha may be used to diagnose loss of operator alertness on tasks requiring vigilance.
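The two composite indices named here are typically formed as ratios of band powers. A minimal sketch follows; the band edges, electrode sites, and synthetic signal are common conventions and assumptions, and the exact definitions used in the study may differ:

```python
import numpy as np
from scipy.signal import welch

fs = 256  # sampling rate in Hz (assumed)
rng = np.random.default_rng(7)

def band_power(x, fs, lo, hi):
    """Integrated PSD of signal x between lo and hi Hz (Welch estimate)."""
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    mask = (f >= lo) & (f < hi)
    return np.trapz(pxx[mask], f[mask])

# Synthetic one-minute recordings standing in for frontal (Fz) and parietal (Pz) EEG.
fz = rng.normal(size=fs * 60)
pz = rng.normal(size=fs * 60)

theta_fz = band_power(fz, fs, 4, 8)        # frontal theta
theta_pz = band_power(pz, fs, 4, 8)
alpha_lo_pz = band_power(pz, fs, 8, 11)    # lower-frequency alpha (roughly 8-10.9 Hz)
alpha_pz = band_power(pz, fs, 8, 13)
beta_pz = band_power(pz, fs, 13, 30)

# Task Load Index (TLI): frontal theta relative to parietal alpha.
tli = theta_fz / alpha_pz
# Engagement Index (EI): beta relative to alpha plus theta.
ei = beta_pz / (alpha_pz + theta_pz)

print(f"lower-frequency alpha power (Pz): {alpha_lo_pz:.3e}")
print(f"TLI = {tli:.2f}, EI = {ei:.2f}")
```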
Predictors of nursing workload in elderly patients admitted to intensive care units.
Sousa, Cleber Ricardo de; Gonçalves, Leilane Andrade; Toffoleto, Maria Cecília; Leão, Karine; Padilha, Kátia Grillo
2008-01-01
The age of patients is a controversial issue in admission to intensive care units (ICU). The aim of this study was to compare the severity and nursing workload of elderly patients aged 60-69, 70-79, and ≥80 years and to identify predictors of nursing workload in elderly patients. A cross-sectional study was performed with a sample of 71 elderly patients admitted to three ICUs in the city of Sao Paulo, Brazil from October to November 2004. Data were prospectively collected using the Nursing Activities Score (NAS) and Simplified Acute Physiology Score II (SAPS II). There was no significant difference in nursing workload among the elderly patients' age subgroups (p=0.84). Multiple regression analysis indicated that the independent risk factors for high nursing workload were severity, age ≥70 years, and being a surgical ICU patient. Age as an isolated factor should not be discriminative for elderly patients' admission to the ICU.
Exploring Individual Differences in Workload Assessment
2014-12-26
Excerpt: Do the physiological measures (blinks, saccades, heart rate (HR), and heart rate variability (HRV), as determined from electrooculography (EOG) and electrocardiography (ECG) signals) correlate with the objective workload profile for all divergent participants?
A self-analysis of the NASA-TLX workload measure.
Noyes, Jan M; Bruneau, Daniel P J
2007-04-01
Computer use and, more specifically, the administration of tests and materials online continue to proliferate. A number of subjective, self-report workload measures exist, but the National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is probably the most well known and used. The aim of this paper is to consider the workload costs associated with the computer-based and paper versions of the NASA-TLX measure. It was found that there is a significant difference between the workload scores for the two media, with the computer version of the NASA-TLX incurring more workload. This has implications for the practical use of the NASA-TLX as well as for other computer-based workload measures.
An Annotated Bibliography on Operator Mental Workload Assessment
1980-03-26
Excerpt: The descriptors associated with each citation designate the general workload classification, the specific workload classification, and the type of ... systems, with all of their advanced sensors and avionics, must be compatible with the capabilities and limitations of the aircrew. During the design ... constructs or models were included only if mental workload was at least potentially assessable from the constructs or models. C. Experimental design.
Workload Measurement in Human Autonomy Teaming: How and Why?
NASA Technical Reports Server (NTRS)
Shively, Jay
2016-01-01
This is an invited talk on autonomy and workload for an AFRL Blue Sky workshop sponsored by the Florida Institute for Human Machine Studies. The presentation reviews various metrics of workload and how to move forward with measuring workload in a human-autonomy teaming environment.
Mental workload measurement in operator control room using NASA-TLX
NASA Astrophysics Data System (ADS)
Sugarindra, M.; Suryoputro, M. R.; Permana, A. I.
2017-12-01
The workload encountered by workers, a combination of physical and mental workload, is a consequence of their work activities. The central control room is one department in an oil processing company, where employees are tasked with monitoring the processing unit 24 hours nonstop in a combination of 3 shifts of 8 hours each. NASA-TLX (NASA Task Load Index) is a subjective mental workload measurement using six factors, namely Mental demand (MD), Physical demand (PD), Temporal demand (TD), Performance (OP), Effort (EF), and Frustration level (FR). Subjective mental workload measurement is the most widely used approach because it has a high degree of validity. Based on the calculation of mental workload, 5 units (DTU, NPU, HTU, DIST and OPS) in the control room scored 94, 83.33, 94.67, 81.33 and 94.67 respectively, which is categorized as very high mental workload. The high level of mental workload on operators in the central control room stems from the requirement for high accuracy and alertness and the ability to make decisions quickly.
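For reference, the NASA-TLX overall score behind per-unit figures like these is conventionally computed either as the unweighted mean of the six subscale ratings ("raw TLX") or as a weighted mean in which each subscale's weight is the number of times it was chosen across the 15 pairwise comparisons. A minimal sketch with invented ratings and weights:

```python
# Hypothetical ratings (0-100) and pairwise-comparison weights for one operator.
ratings = {"MD": 90, "PD": 70, "TD": 95, "OP": 60, "EF": 85, "FR": 75}
weights = {"MD": 5, "PD": 2, "TD": 4, "OP": 1, "EF": 3, "FR": 0}  # must sum to 15

assert sum(weights.values()) == 15, "15 pairwise comparisons in total"

raw_tlx = sum(ratings.values()) / len(ratings)                 # unweighted mean
weighted_tlx = sum(ratings[k] * weights[k] for k in ratings) / 15

print(f"raw TLX: {raw_tlx:.1f}")
print(f"weighted TLX: {weighted_tlx:.1f}")  # compare with the per-unit scores reported above
```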
The acute:chronic workload ratio in relation to injury risk in professional soccer.
Malone, Shane; Owen, Adam; Newton, Matt; Mendes, Bruno; Collins, Kieran D; Gabbett, Tim J
2017-06-01
To examine the association between combined sRPE measures and injury risk in elite professional soccer. Observational cohort study. Forty-eight professional soccer players (mean±SD age of 25.3±3.1 yr) from two elite European teams were involved within a one season study. Players completed a test of intermittent-aerobic capacity (Yo-YoIR1) to assess player's injury risk in relation to intermittent aerobic capacity. Weekly workload measures and time loss injuries were recorded during the entire period. Rolling weekly sums and week-to-week changes in workload were measured, allowing for the calculation of the acute:chronic workload ratio, which was calculated by dividing the acute (1-weekly) workload by the chronic (4-weekly) workload. All derived workload measures were modelled against injury data using logistic regression. Odds ratios (OR) were reported against a reference group. Players who exerted pre-season 1-weekly loads of ≥1500 to ≤2120AU were at significantly higher risk of injury compared to the reference group of ≤1500AU (OR=1.95, p=0.006). Players with increased intermittent-aerobic capacity were better able to tolerate increased 1-weekly absolute changes in training load than players with lower fitness levels (OR=4.52, p=0.011). Players who exerted in-season acute:chronic workload ratios of >1.00 to <1.25 (OR=0.68, p=0.006) were at significantly lower risk of injury compared to the reference group (≤0.85). These findings demonstrate that an acute:chronic workload of between 1.00 and 1.25 is protective for professional soccer players. A higher intermittent-aerobic capacity appears to offer greater injury protection when players are exposed to rapid changes in workload in elite soccer players. Moderate workloads, coupled with moderate-low to moderate-high acute:chronic workload ratios, appear to be protective for professional soccer players. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
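The ratio itself is simple to compute from weekly load totals. A minimal sketch with hypothetical session-RPE loads (AU) follows, using a rolling 4-week average as the chronic load, which is one common convention and an assumption here:

```python
import pandas as pd

# Hypothetical weekly sRPE training loads (arbitrary units) for one player.
weekly_load = pd.Series([1400, 1600, 1750, 1500, 2100, 1300, 1800, 1650],
                        index=range(1, 9), name="load_AU")

acute = weekly_load                                 # acute load: the current week
chronic = weekly_load.rolling(window=4).mean()      # chronic load: 4-week rolling mean
acwr = (acute / chronic).round(2)

summary = pd.DataFrame({"acute": acute, "chronic": chronic.round(1), "ACWR": acwr})
print(summary)
# Ratios between roughly 1.00 and 1.25 correspond to the range the study found
# to be protective; larger week-to-week spikes push the ratio higher.
```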
Influence of patient and provider factors on the workload of on-call physicians
Hsu, Nin-Chieh; Huang, Chun-Che; Jerng, Jih-Shuin; Hsu, Chia-Hao; Yang, Ming-Chin; Chang, Ray-E; Ko, Wen-Je; Yu, Chong-Jen
2016-01-01
Factors associated with the physician workload are scarcely reported. The study aims to investigate the associated factors of on-call physician workload based on a published conceptual framework. The study was conducted in a general internal medicine unit of National Taiwan University Hospital. On-call physician workloads were recorded on a shift basis from 1198 hospitalized patients between May 2010 and April 2011. The proxy of on-call workloads included night calls, bedside evaluation/management (E/M), and performing clinical procedures in a shift. Multivariable logistic and negative binomial regression models were used to determine the factors associated with the workloads of on-call physicians. During the study period, 378 (31.6%) of patients had night calls with related workloads. Multivariate analysis showed that the number of patients with unstable conditions in a shift (odds ratio [OR] 1.89 and 1.66, respectively) and the intensive care unit (ICU) training of the nurse leader (OR 2.87 and 3.08, respectively) resulted in higher likelihood of night calls to and bedside E/M visits by the on-call physician. However, ICU training of nurses (OR = 0.37, 95% confidence interval: 0.16–0.86) decreased the demand of performing clinical procedures by the on-call physician. Moreover, the number of patients with unstable conditions (risk ratio [RR] 1.52 and 1.55, respectively) significantly increased the number of night calls and bedside E/M by on-call physicians by around 50%. Nurses with N1 level (RR 2.16 and 2.71, respectively) were more likely to place night calls and facilitate bedside E/M by the on-call physician compared to nurses with N0 level. In addition, the nurse leaders with ICU training (RR 1.72 and 3.07, respectively) had significant increases in night calls and bedside E/M by the on-call physician compared to those without ICU training. On-call physician workload is associated with patient factors and the training of nurses. Number of
Domestic chores workload and depressive symptoms among children affected by HIV/AIDS in China.
Yu, Yun; Li, Xiaoming; Zhang, Liying; Zhao, Junfeng; Zhao, Guoxiang; Zheng, Yu; Stanton, Bonita
2013-01-01
Limited data are available regarding the effects of domestic chores workload on psychological problems among children affected by HIV/AIDS in China. The current study aims to examine association between children's depressive symptoms and the domestic chores workload (i.e., the frequency and the amount of time doing domestic chores). Data were derived from the baseline survey of a longitudinal study which investigated the impact of parental HIV/AIDS on psychological problems of children. A total of 1449 children in family-based care were included in the analysis: 579 orphaned children who lost one or both parents due to AIDS, 466 vulnerable children living with one or both parents being infected with HIV, and 404 comparison children who did not have HIV/AIDS-infected family members in their families. Results showed differences on domestic chores workload between children affected by HIV/AIDS (orphans and vulnerable children) and the comparison children. Children affected by HIV/AIDS worked more frequently and worked longer time on domestic chores than the comparison children. Multivariate linear regression analysis showed that domestic chores workload was positively associated with depressive symptoms. The data suggest that children affected by HIV/AIDS may face increasing burden of domestic chores and it is necessary to reduce the excessive workload of domestic chores among children affected by HIV/AIDS through increasing community-based social support for children in the families affected by HIV/AIDS.
Domestic chores workload and depressive symptoms among children affected by HIV/AIDS in China
Yu, Yun; Li, Xiaoming; Zhang, Liying; Zhao, Junfeng; Zhao, Guoxiang; Zheng, Yu; Stanton, Bonita
2012-01-01
Limited data are available regarding the effects of domestic chores workload on psychological problems among children affected by HIV/AIDS in China. The current study aims to examine association between children’s depressive symptoms and the domestic chores workload (i.e., the frequency and the amount of time doing domestic chores). Data were derived from the baseline survey of a longitudinal study which investigated the impact of parental HIV/AIDS on psychological problems of children. A total of 1,449 children in family-based care were included in the analysis: 579 orphaned children who lost one or both parents due to AIDS, 466 vulnerable children living with one or both parents being infected with HIV, and 404 comparison children who did not have HIV/AIDS infected family members in their families. Results showed differences on domestic chores workload between children affected by HIV/AIDS (orphans and vulnerable children) and the comparison children. Children affected by HIV/AIDS worked more frequently and worked longer time on domestic chores than the comparison children. Multivariate linear regression analysis showed that domestic chores workload was positively associated with depressive symptoms. The data suggest that children affected by HIV/AIDS may face increasing burden of domestic chores and it is necessary to reduce the excessive workload of domestic chores among children affected by HIV/AIDS through increasing community-based social support for children in the families affected by HIV/AIDS. PMID:22970996
Front-line ordering clinicians: matching workforce to workload.
Fieldston, Evan S; Zaoutis, Lisa B; Hicks, Patricia J; Kolb, Susan; Sladek, Erin; Geiger, Debra; Agosto, Paula M; Boswinkel, Jan P; Bell, Louis M
2014-07-01
Matching workforce to workload is particularly important in healthcare delivery, where an excess of workload for the available workforce may negatively impact processes and outcomes of patient care and resident learning. Hospitals currently lack a means to measure and match dynamic workload and workforce factors. This article describes our work to develop and obtain consensus for use of an objective tool to dynamically match the front-line ordering clinician (FLOC) workforce to clinical workload in a variety of inpatient settings. We undertook development of a tool to represent hospital workload and workforce based on literature reviews, discussions with clinical leadership, and repeated validation sessions. We met with physicians and nurses from every clinical care area of our large, urban children's hospital at least twice. We successfully created a tool in a matrix format that is objective and flexible and can be applied to a variety of settings. We presented the tool in 14 hospital divisions and received widespread acceptance among physician, nursing, and administrative leadership. The hospital uses the tool to identify gaps in FLOC coverage and guide staffing decisions. Hospitals can better match workload to workforce if they can define and measure these elements. The Care Model Matrix is a flexible, objective tool that quantifies the multidimensional aspects of workload and workforce. The tool, which uses multiple variables that are easily modifiable, can be adapted to a variety of settings. © 2014 Society of Hospital Medicine.
Comparing capacity coefficient and dual task assessment of visual multitasking workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie M.
Capacity coefficient analysis could offer a theoretically grounded alternative approach to subjective measures and dual task assessment of cognitive workload. Workload capacity or workload efficiency is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system given a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision-making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in the capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-attribute Task Battery.
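In its standard two-task (OR) form, the capacity coefficient reduces to a ratio of integrated hazard functions estimated from response-time distributions: with H(t) = -ln S(t) for the empirical survivor function S, C(t) = H_AB(t) / (H_A(t) + H_B(t)), where values near 1 indicate unlimited-capacity processing. A minimal sketch on simulated response times (the parametric RT distributions and time grid are assumptions, and the paper's multitasking generalization may differ in detail):

```python
import numpy as np

rng = np.random.default_rng(3)

def integrated_hazard(rts, t):
    """H(t) = -ln S(t), with S the empirical survivor function of the RTs."""
    surv = np.array([(rts > ti).mean() for ti in t])
    surv = np.clip(surv, 1e-6, 1.0)   # avoid log(0) in the far tail
    return -np.log(surv)

# Simulated response times (seconds): two single tasks and the dual task.
rt_a = rng.gamma(shape=4.0, scale=0.10, size=2000)
rt_b = rng.gamma(shape=4.0, scale=0.12, size=2000)
rt_ab = rng.gamma(shape=4.0, scale=0.15, size=2000)   # dual task is slower here

t = np.linspace(0.2, 1.0, 9)
capacity = integrated_hazard(rt_ab, t) / (integrated_hazard(rt_a, t) + integrated_hazard(rt_b, t))

for ti, c in zip(t, capacity):
    # C(t) < 1: limited capacity; ~1: unlimited; > 1: super capacity
    print(f"t = {ti:.1f}s  C(t) = {c:.2f}")
```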
The most economical cadence increases with increasing workload.
Foss, Øivind; Hallén, Jostein
2004-08-01
Several studies have suggested that the most economical cadence in cycling increases with increasing workload. However, none of these studies have been able to demonstrate this relationship with experimental data. The purpose of this study was to test the hypothesis that the most economical cadence in elite cyclists increases with increasing workload and to explore the effect of cadence on performance. Six elite road cyclists performed submaximal and maximal tests at four different cadences (60, 80, 100 and 120 rpm) on separate days. Respiratory data was measured at 0, 50, 125, 200, 275 and 350 W during the submaximal test and at the end of the maximal test. The maximal test was carried out as an incremental test, conducted to reveal differences in maximal oxygen uptake and time to exhaustion (short-term performance) between cadences. The results showed that the lowest oxygen uptake, i.e. the best work economy, shifted from 60 rpm at 0 W to 80 rpm at 350 W ( P<0.05). No difference was found in maximal oxygen uptake among cadences ( P>0.05), while the best performance was attained at the same cadence that elicited the best work economy (80 rpm) at 350 W ( P<0.05). This study demonstrated that the most economical cadence increases with increasing workload in elite cyclists. It was further shown that work economy and performance are related during short efforts (approximately 5 min) over a wide range of cadences.
Costs, effectiveness, and workload impact of management strategies for women with an adnexal mass.
Havrilesky, Laura J; Dinan, Michaela; Sfakianos, Gregory P; Curtis, Lesley H; Barnett, Jason C; Van Gorp, Toon; Myers, Evan R
2015-01-01
We compared the estimated clinical outcomes, costs, and physician workload resulting from available strategies for deciding which women with an adnexal mass should be referred to a gynecologic oncologist. We used a microsimulation model to compare five referral strategies: 1) American Congress of Obstetricians and Gynecologists (ACOG) guidelines, 2) Multivariate Index Assay (MIA) algorithm, 3) Risk of Malignancy Algorithm (ROMA), 4) CA125 alone with lowered cutoff values to prioritize test sensitivity over specificity, 5) referral of all women (Refer All). Test characteristics and relative survival were obtained from the literature and data from a biomarker validation study. Medical costs were estimated using Medicare reimbursements. Travel costs were estimated using discharge data from Surveillance, Epidemiology and End Results-Medicare and State Inpatient Databases. Analyses were performed separately for pre- and postmenopausal women (60 000 "subjects" in each), repeated 10 000 times. Refer All was cost-effective compared with less expensive strategies in both postmenopausal (incremental cost-effectiveness ratio [ICER] $9423/year of life saved (LYS) compared with CA125) and premenopausal women (ICER $10 644/YLS compared with CA125), but would result in an additional 73 cases/year/subspecialist. MIA was more expensive and less effective than Refer All in pre- and postmenopausal women. If Refer All is not a viable option, CA125 is an optimal strategy in postmenopausal women. Referral of all women to a subspecialist is an efficient strategy for managing women with adnexal masses requiring surgery, assuming sufficient capacity for additional surgical volume. If a test-based triage strategy is needed, CA125 with lowered cutoff values is a cost-effective strategy. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Hemer, Susan R.
2014-01-01
University workloads, their impact on staff and how they can be managed, are the subject of considerable research and discussion. This paper addresses strategies to deal with the impact of workloads on teaching practices in higher education. In particular, it aims to discover the implicit theories and tacit assumptions that underlie perceptions of…
Measurement of operator workload in an information processing task
NASA Technical Reports Server (NTRS)
Jenney, L. L.; Older, H. J.; Cameron, B. J.
1972-01-01
This was an experimental study to develop an improved methodology for measuring workload in an information processing task and to assess the effects of shift length and communication density (rate of information flow) on the ability to process and classify verbal messages. Each of twelve subjects was exposed to combinations of three shift lengths and two communication densities in a counterbalanced, repeated measurements experimental design. Results indicated no systematic variation in task performance measures or in other dependent measures as a function of shift length or communication density. This is attributed to the absence of a secondary loading task, an insufficiently taxing work schedule, and the lack of psychological stress. Subjective magnitude estimates of workload showed fatigue (and to a lesser degree, tension) to be a power function of shift length. Estimates of task difficulty and fatigue were initially lower but increased more sharply over time under low density than under high density conditions. An interpretation of findings and recommendations for future research are included. This research has major implications for human workload problems in information processing of air traffic control verbal data.
Driver's workload comparison in waste collection vehicle routing problem
NASA Astrophysics Data System (ADS)
Benjamin, Aida Mauziah; Abdul-Rahman, Syariza
2016-10-01
This paper compares the workload of the drivers for a waste collection benchmark problem. The problem involves ten data sets with different numbers of customers to be served and different numbers of disposal facilities available. Previous studies proposed a heuristic algorithm, namely Different Initial Customer (DIC), to solve the problem by constructing initial vehicle routes for the drivers with two main objectives: to minimize the total distance travelled and to minimize the total number of vehicles needed to collect the waste. The results from DIC compared well with other solutions in the literature. However, the balance of the workload among the vehicle drivers is not considered in the solutions. Thus in this paper, we evaluate the quality of the solutions in terms of the total number of customers served by each driver. Then the computational result is compared in terms of the total distance travelled, which has been presented in a previous study. Comparison results show that the drivers' workloads are unbalanced in terms of these two factors, which may cause dissatisfaction among the drivers as well as for management.
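The balance comparison described here can be made concrete with simple per-driver statistics. A minimal sketch on invented routes (the customer counts, distances, and the coefficient-of-variation measure are illustrative choices, not the paper's metric):

```python
import statistics

# Hypothetical solution: customers served and distance travelled per driver.
routes = {
    "driver_1": {"customers": 120, "distance_km": 85.0},
    "driver_2": {"customers": 95,  "distance_km": 72.5},
    "driver_3": {"customers": 160, "distance_km": 110.0},
}

def imbalance(values):
    """Coefficient of variation: one simple measure of workload (im)balance."""
    return statistics.pstdev(values) / statistics.mean(values)

customers = [r["customers"] for r in routes.values()]
distances = [r["distance_km"] for r in routes.values()]

print(f"customer imbalance (CV): {imbalance(customers):.2f}")
print(f"distance imbalance (CV): {imbalance(distances):.2f}")
# A balanced assignment would drive both coefficients of variation toward zero,
# alongside the original objectives of total distance and fleet size.
```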
Comparative evaluation of workload estimation techniques in piloting tasks
NASA Technical Reports Server (NTRS)
Wierwille, W. W.
1983-01-01
Techniques to measure operator workload in a wide range of situations and tasks were examined. The sensitivity and intrusion of a wide variety of workload assessment techniques in simulated piloting tasks were investigated. Four different piloting tasks, representing psychomotor, perceptual, mediational, and communication aspects of piloting behavior, were selected. Techniques to determine relative sensitivity and intrusion were applied. Sensitivity is the relative ability of a workload estimation technique to discriminate statistically significant differences in operator loading. High sensitivity requires discriminable changes in score means as a function of load level and low variation of the scores about the means. Intrusion is an undesirable change in the task for which workload is measured, resulting from the introduction of the workload estimation technique or apparatus.
NASA Technical Reports Server (NTRS)
Hoh, Roger H.; Smith, James C.; Hinton, David A.
1987-01-01
An analytical and experimental research program was conducted to develop criteria for pilot interaction with advanced controls and displays in single pilot instrument flight rules (SPIFR) operations. The analytic phase reviewed fundamental considerations for pilot workload taking into account existing data, and using that data to develop a divided attention SPIFR pilot workload model. The pilot model was utilized to interpret the two experimental phases. The first experimental phase was a flight test program that evaluated pilot workload in the presence of current and near-term displays and autopilot functions. The second experiment was conducted on a King Air simulator, investigating the effects of co-pilot functions in the presence of very high SPIFR workload. The results indicate that the simplest displays tested were marginal for SPIFR operations. A moving map display aided the most in mental orientation, but had inherent deficiencies as a stand alone replacement for an HSI. Autopilot functions were highly effective for reducing pilot workload. The simulator tests showed that extremely high workload situations can be adequately handled when co-pilot functions are provided.
NASA Technical Reports Server (NTRS)
Horst, Richard L.; Mahaffey, David L.; Munson, Robert C.
1989-01-01
The present Phase 2 small business innovation research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general purpose, computerized, workload assessment system that can function in simulators such as the ACFS.
Näswall, Katharina; Burt, Christopher D B; Pearce, Megan
2015-01-01
This study investigated the impact of workload demands on perceived job risk using the Job Demand-Control model as a research framework. The primary objective was to test the hypothesis that employee control over work scheduling and overtime would moderate the relationship between workload demands and perceived job risk. Ninety-six participants working in a variety of industries completed measures of workload demands, and of control over work scheduling and overtime, and a measure of perceived job risk. Workload demands predicted higher perceptions of job risk. However, the results also suggest that control over overtime moderated this relationship, where those with the combination of high workload demands and low control over overtime reported higher levels of perceived risk. The results indicate that the JDC model is applicable to safety research. The results suggest that employee control over workload demands is an important variable to consider in terms of managing workplace safety. The present study also points to important areas for future research to explore in order to further understand the connection between demands and safety.
An Approach to Quantify Workload in a System of Agents
NASA Technical Reports Server (NTRS)
Stocker, Richard; Rungta, Neha; Mercer, Eric; Raimondi, Franco; Holbrook, Jon; Cardoza, Colleen; Goodrich, Michael
2015-01-01
The role of humans in aviation and other domains continues to shift from manual control to automation monitoring. Studies have found that humans are often poorly suited for monitoring roles, and workload can easily spike in off-nominal situations. Current workload measurement tools, like NASA TLX, use human operators to assess their own workload after using a prototype system. Such measures are used late in the design process and can result in expensive alterations when problems are discovered. Our goal in this work is to provide a quantitative workload measure for use early in the design process. We leverage research in human cognition to define metrics that can measure workload on belief-desire-intentions based multi-agent systems. These measures can alert designers to potential workload issues early in design. We demonstrate the utility of our approach by characterizing quantitative differences in the workload for a single pilot operations model compared to a traditional two pilot model.
Evaluation of Mental Workload among ICU Ward's Nurses.
Mohammadi, Mohsen; Mazloumi, Adel; Kazemi, Zeinab; Zeraati, Hojat
2015-01-01
High level of workload has been identified among stressors of nurses in intensive care units (ICUs). The present study investigated nursing workload and identified its influencing performance obstacles in ICUs. This cross-sectional study was conducted, in 2013, on 81 nurses working in ICUs in Imam Khomeini Hospital in Tehran, Iran. NASA-TLX was applied for assessment of workload. Moreover, the ICUs Performance Obstacles Questionnaire was used to identify performance obstacles associated with ICU nursing. Physical demand (mean=84.17) was perceived as the most important dimension of workload by nurses. The most critical performance obstacles affecting workload included: difficulty in finding a place to sit down, hectic workplace, disorganized workplace, poor-conditioned equipment, waiting for using a piece of equipment, spending much time seeking supplies in the central stock, poor quality of medical materials, delay in getting medications, unpredicted problems, disorganized central stock, outpatient surgery, spending much time dealing with family needs, late, inadequate, and useless help from nurse assistants, and ineffective morning rounds (P-value<0.05). Various performance obstacles are correlated with nurses' workload, which affirms the significance of nursing work system characteristics. Interventions are recommended based on the results of this study in the work settings of nurses in ICUs.
Performance Analysis of the NAS Y-MP Workload
NASA Technical Reports Server (NTRS)
Bergeron, Robert J.; Kutler, Paul (Technical Monitor)
1997-01-01
This paper describes the performance characteristics of the computational workloads on the NAS Cray Y-MP machines, a Y-MP 832 and later a Y-MP 8128. Hardware measurements indicated that the Y-MP workload performance matured over time, ultimately sustaining an average throughput of 0.8 GFLOPS and a vector operation fraction of 87%. The measurements also revealed an operation rate exceeding 1 per clock period, a well-balanced architecture featuring a strong utilization of vector functional units, and an efficient memory organization. Introduction of the larger memory 8128 increased throughput by allowing a more efficient utilization of CPUs. Throughput also depended on the metering of the batch queues; low-idle Saturday workloads required a buffer of small jobs to prevent memory starvation of the CPU. UNICOS required about 7% of total CPU time to service the 832 workloads; this overhead decreased to 5% for the 8128 workloads. While most of the system time went to service I/O requests, efficient scheduling prevented excessive idle time due to I/O wait. System measurements disclosed no obvious bottlenecks in the response of the machine and UNICOS to the workloads. In most cases, Cray-provided software tools were quite sufficient for measuring the performance of both the machine and the operating system.
Evaluation of Mental Workload among ICU Ward's Nurses
Mohammadi, Mohsen; Mazloumi, Adel; Kazemi, Zeinab; Zeraati, Hojat
2015-01-01
Background: High level of workload has been identified among stressors of nurses in intensive care units (ICUs). The present study investigated nursing workload and identified its influencing performance obstacles in ICUs. Methods: This cross-sectional study was conducted, in 2013, on 81 nurses working in ICUs in Imam Khomeini Hospital in Tehran, Iran. NASA-TLX was applied for assessment of workload. Moreover, the ICUs Performance Obstacles Questionnaire was used to identify performance obstacles associated with ICU nursing. Results: Physical demand (mean=84.17) was perceived as the most important dimension of workload by nurses. The most critical performance obstacles affecting workload included: difficulty in finding a place to sit down, hectic workplace, disorganized workplace, poor-conditioned equipment, waiting for using a piece of equipment, spending much time seeking supplies in the central stock, poor quality of medical materials, delay in getting medications, unpredicted problems, disorganized central stock, outpatient surgery, spending much time dealing with family needs, late, inadequate, and useless help from nurse assistants, and ineffective morning rounds (P-value<0.05). Conclusion: Various performance obstacles are correlated with nurses' workload, which affirms the significance of nursing work system characteristics. Interventions are recommended based on the results of this study in the work settings of nurses in ICUs. PMID:26933647
Aviation human-in-the-loop simulation studies : experimental planning, design, and data management.
DOT National Transportation Integrated Search
2014-01-01
Researchers from the NASA Ames Flight Cognition Lab and the FAA's Aerospace Human Factors Research Lab at the Civil Aerospace Medical Institute examined task and workload management by single pilots in very light jets, also called entry-level jets. ...
Lavner, Justin A; Clark, Malissa A
2017-08-01
Although many studies have found that higher workloads covary with lower levels of marital satisfaction, the question of whether workloads may also predict changes in marital satisfaction over time has been overlooked. To address this question, we investigated the lagged association between own and partner workload and marital satisfaction using eight waves of data collected every 6 months over the first four years of marriage from 172 heterosexual couples. Significant crossover, but not spillover, effects were found, indicating that partners of individuals with higher workloads at one time point experience greater declines in marital satisfaction by the following time point compared to the partners of individuals with lower workloads. These effects were not moderated by gender or parental status. These findings suggest that higher partner workloads can prove deleterious for relationship functioning over time and call for increased attention to the long-term effects of spillover and crossover from work to marital functioning.
GIS\\KEY™ ENVIRONMENTAL DATA MANAGEMENT SYSTEM - INNOVATIVE TECHNOLOGY EVALUATION REPORT
GIS/Key™ is a comprehensive environmental database management system that integrates site data and graphics, enabling the user to create geologic cross-sections; boring logs; potentiometric, isopleth, and structure maps; summary tables; and hydrographs. GIS/Key™ is menu-driven an...
Data management integration for biomedical core facilities
NASA Astrophysics Data System (ADS)
Zhang, Guo-Qiang; Szymanski, Jacek; Wilson, David
2007-03-01
We present the design, development, and pilot-deployment experiences of MIMI, a web-based, Multi-modality Multi-Resource Information Integration environment for biomedical core facilities. MIMI is an easily customizable, web-based software tool that integrates scientific and administrative support for a biomedical core facility involving a common set of entities: researchers; projects; equipment and devices; support staff; services; samples and materials; experimental workflow; and large, complex data. With this software, one can register users, manage projects, schedule resources, bill services, perform site-wide searches, and archive, back up, and share data. With its customizable, expandable, and scalable characteristics, MIMI not only provides a cost-effective solution, unavailable in the marketplace, to the overarching data management problem of biomedical core facilities, but also lays a foundation for data federation to facilitate and support discovery-driven research.
Assessing physician job satisfaction and mental workload.
Boultinghouse, Oscar W; Hammack, Glenn G; Vo, Alexander H; Dittmar, Mary Lynne
2007-12-01
Physician job satisfaction and mental workload were evaluated in a pilot study of five physicians engaged in a telemedicine practice at The University of Texas Medical Branch at Galveston Electronic Health Network. Several previous studies have examined physician satisfaction with specific telemedicine applications; however, few have attempted to identify the underlying factors that contribute to physician satisfaction or the lack thereof. One factor that has been found to affect well-being and functionality in the workplace, particularly with regard to human interaction with complex systems and tasks as seen in telemedicine, is mental workload. Workload is generally defined as the "cost" to a person for performing a complex task or tasks; however, prior to this study, it was unexplored as a variable that influences physician satisfaction. Two measures of job satisfaction were used: the Job Descriptive Index and the Job In General scales. Mental workload was evaluated by means of the National Aeronautics and Space Administration Task Load Index. The measures were administered by means of Web-based surveys and were given twice over a 6-month period. Nonparametric statistical analyses revealed that physician job satisfaction was generally high relative to that of the general population and other professionals. Mental workload scores associated with the practice of telemedicine in this environment were also high and appeared stable over time. In addition, they are commensurate with scores found in individuals practicing tasks with elevated information-processing demands, such as quality control engineers and air traffic controllers. No relationship was found between the measures of job satisfaction and mental workload.
What's skill got to do with it? Vehicle automation and driver mental workload.
Young, M S; Stanton, N A
2007-08-01
Previous research has found that vehicle automation systems can reduce driver mental workload, with implications for attentional resources that can be detrimental to performance. The present paper considers how the development of automaticity within the driving task may influence performance in underload situations. Driver skill and vehicle automation were manipulated in a driving simulator, with four levels of each variable. Mental workload was assessed using a secondary task measure and eye movements were recorded to infer attentional capacity. The effects of automation on driver mental workload were quite robust across skill levels, but the most intriguing findings were from the eye movement data. It was found that, with little exception, attentional capacity and mental workload were directly related at all levels of driver skill, consistent with earlier studies. The results are discussed with reference to applied theories of cognition and the design of automation.
FY17 ASC CSSE L2 Milestone 6018: Power Usage Characteristics of Workloads Running on Trinity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pedretti, Kevin
The overall goal of this work was to utilize the Advanced Power Management (APM) capabilities of the ATS-1 Trinity platform to understand the power usage behavior of ASC workloads running on Trinity and gain insight into the potential for utilizing power management techniques on future ASC platforms.
NASA Astrophysics Data System (ADS)
Berenter, J. S.; Mueller, J. M.; Morrison, I.
2016-12-01
… of support systems for use of Earth observation data is thus required to maximize the value of data-driven forest fire management in the MBR. Findings further validate a need for continued cooperation between scientific and governance institutions to disseminate and integrate geospatial data into environmental decision-making.
Progress in Multi-Disciplinary Data Life Cycle Management
NASA Astrophysics Data System (ADS)
Jung, C.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.
2015-12-01
Modern science is most often driven by data. Improvements in state-of-the-art technologies and methods in many scientific disciplines lead not only to increasing data rates, but also to the need to improve or even completely overhaul data life cycle management. Communities usually face two kinds of challenges: generic ones, such as federated authorization and authentication infrastructures and data preservation, and ones that are specific to their community and their respective data life cycle. In practice, the specific requirements often hinder the use of generic tools and methods. The German Helmholtz Association project "Large-Scale Data Management and Analysis" (LSDMA) addresses both challenges: its five Data Life Cycle Labs (DLCLs) closely collaborate with communities in joint research and development to optimize the communities' data life cycle management, while its Data Services Integration Team (DSIT) provides generic data tools and services. We present the most recent developments and results from the DLCLs, covering communities ranging from heavy ion physics and photon science to high-throughput microscopy, and from DSIT.
Activity-based differentiation of pathologists' workload in surgical pathology.
Meijer, G A; Oudejans, J J; Koevoets, J J M; Meijer, C J L M
2009-06-01
Adequate budget control in pathology practice requires accurate allocation of resources. Any changes in the types and numbers of specimens handled or in the protocols used will directly affect the pathologists' workload and consequently the allocation of resources. The aim of the present study was to develop a model for measuring pathologists' workload that can take such changes into account. The diagnostic process was analyzed and broken up into separate activities, and the time needed to perform these activities was measured. Based on linear regression analysis, the time needed for each activity was calculated as a function of the number of slides or blocks involved. The total pathologist time required for a range of specimens was calculated based on standard protocols and validated by comparison with the actually measured workload. Cutting up, microscopic procedures, and dictating turned out to be highly correlated with the number of blocks and/or slides per specimen. Calculated workload per type of specimen was significantly correlated with the actually measured workload. Modeling pathologists' workload with formulas that express workload per specimen type as a function of the number of blocks and slides provides a basis for a comprehensive, yet flexible, activity-based costing system for pathology.
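The activity-based idea described above is essentially a set of per-activity regressions of time on block/slide counts. A minimal sketch of that step, using synthetic data rather than the study's measurements, might look like this:

```python
# Sketch of the activity-based workload idea: regress measured activity time on
# the number of slides, then predict per-specimen pathologist time.
# The data below are synthetic; coefficients in the study will differ.
import numpy as np
from scipy import stats

# measured time (minutes) for the microscopy activity vs. number of slides
slides = np.array([1, 2, 3, 4, 5, 6, 8, 10, 12, 15])
minutes = np.array([3.1, 4.8, 6.2, 8.1, 9.7, 11.4, 15.2, 18.9, 22.3, 27.8])

fit = stats.linregress(slides, minutes)
print(f"time ≈ {fit.intercept:.2f} + {fit.slope:.2f} × n_slides  (r={fit.rvalue:.3f})")

# Predicted workload for a standard protocol, e.g. a specimen yielding 6 slides
# (hypothetical protocol, for illustration only).
n_slides = 6
predicted = fit.intercept + fit.slope * n_slides
print(f"predicted microscopy time for {n_slides} slides: {predicted:.1f} min")
```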
JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.
Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J
2010-04-01
The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
ERIC Educational Resources Information Center
Melda, Kerri, Ed.
This guide discusses participant-driven managed support in which people with disabilities and their families steer their own futures by having more control over the money used to provide long-term supports. After an introductory chapter, chapter 2, "What Is Managed Care," describes managed care, traditional managed care players, and the 10 tools…
Writing through Big Data: New Challenges and Possibilities for Data-Driven Arguments
ERIC Educational Resources Information Center
Beveridge, Aaron
2017-01-01
As multimodal writing continues to shift and expand in the era of Big Data, writing studies must confront the new challenges and possibilities emerging from data mining, data visualization, and data-driven arguments. Often collected under the broad banner of "data literacy," students' experiences of data visualization and data-driven…
Mission Driven and Data Informed Leadership
ERIC Educational Resources Information Center
Holter, Anthony C.; Frabutt, James M.
2012-01-01
The contemporary challenges facing Catholic schools and Catholic school leaders are widely known. Effective and systemic solutions to these mounting challenges are less widely known or discussed. This article highlights the skills, knowledge, and dispositions associated with mission driven and data informed leadership--an orientation to school…
Data-Driven Approaches to Empirical Discovery
1988-10-31
Keywords: empirical discovery; history of science; data-driven heuristics; numeric laws; theoretical terms; scope of laws. … to the normative side. Machine Discovery and the History of Science: the history of science studies the actual path followed by scientists over the …
Workload Characterization of CFD Applications Using Partial Differential Equation Solvers
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Workload characterization is used for modeling and evaluation of computing systems at different levels of detail. We present a workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high performance computing platforms: the SGI Origin2000, the IBM SP-2, and a cluster of Intel Pentium Pro based PCs. We executed extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which form the basis of the workload characterization. Our approach yields a coarse-grain resource utilization behavior that is being applied to performance modeling and evaluation of distributed high performance metacomputing systems. In addition, this study enhances our understanding of the interactions between PDE solver workloads and high performance computing platforms and is useful for tuning these applications.
Effects of workload preview on task scheduling during simulated instrument flight.
Andre, A D; Heers, S T; Cashion, P A
1995-01-01
Our study examined pilot scheduling behavior in the context of simulated instrument flight. Over the course of the flight, pilots flew along specified routes while scheduling and performing several flight-related secondary tasks. The first phase of flight was flown under low-workload conditions, whereas the second phase of flight was flown under high-workload conditions in the form of increased turbulence and a disorganized instrument layout. Six pilots were randomly assigned to each of three workload preview groups. Subjects in the no-preview group were not given preview of the increased-workload conditions. Subjects in the declarative preview group were verbally informed of the nature of the flight workload manipulation but did not receive any practice under the high-workload conditions. Subjects in the procedural preview group received the same instructions as the declarative preview group but also flew half of the practice flight under the high-workload conditions. The results show that workload preview fostered efficient scheduling strategies. Specifically, those pilots with either declarative or procedural preview of future workload demands adopted an efficient strategy of scheduling more of the difficult secondary tasks during the low-workload phase of flight. However, those pilots given a procedural preview showed the greatest benefits in overall flight performance.
State of science: mental workload in ergonomics.
Young, Mark S; Brookhuis, Karel A; Wickens, Christopher D; Hancock, Peter A
2015-01-01
Mental workload (MWL) is one of the most widely used concepts in ergonomics and human factors and represents a topic of increasing importance. Since modern technology in many working environments imposes ever more cognitive demands upon operators while physical demands diminish, understanding how MWL impinges on performance is increasingly critical. Yet, MWL is also one of the most nebulous concepts, with numerous definitions and dimensions associated with it. Moreover, MWL research has had a tendency to focus on complex, often safety-critical systems (e.g. transport, process control). Here we provide a general overview of the current state of affairs regarding the understanding, measurement and application of MWL in the design of complex systems over the last three decades. We conclude by discussing contemporary challenges for applied research, such as the interaction between cognitive workload and physical workload, and the quantification of workload 'redlines' which specify when operators are approaching or exceeding their performance tolerances.
Emotional exhaustion and workload predict clinician-rated and objective patient safety
Welp, Annalena; Meier, Laurenz L.; Manser, Tanja
2015-01-01
Aims: To investigate the role of clinician burnout and of demographic and organizational characteristics in predicting subjective and objective indicators of patient safety. Background: Maintaining clinician health and ensuring safe patient care are important goals for hospitals. While these goals are not independent of each other, the interplay between clinician psychological health, demographic and organizational variables, and objective patient safety indicators is poorly understood. The present study addresses this gap. Method: Participants were 1425 physicians and nurses working in intensive care. Multilevel regression analysis was used to investigate the effect of burnout as an indicator of psychological health, and of demographic (e.g., professional role and experience) and organizational (e.g., workload, predictability) characteristics, on standardized mortality ratios, length of stay and clinician-rated patient safety. Results: Clinician-rated patient safety was associated with burnout, trainee status, and professional role. Mortality was predicted by emotional exhaustion. Length of stay was predicted by workload. Contrary to our expectations, burnout did not predict length of stay, and workload and predictability did not predict standardized mortality ratios. Conclusion: At least in the short term, clinicians seem to be able to maintain safety despite high workload and low predictability. Nevertheless, burnout poses a safety risk. Subjectively, burnt-out clinicians rated safety lower, and objectively, units with high emotional exhaustion had higher standardized mortality ratios. In summary, our results indicate that clinician psychological health and patient safety could be managed simultaneously. Further research needs to establish causal relationships between these variables and support the development of managerial guidelines to ensure clinicians' psychological health and patients' safety. PMID:25657627
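As a rough illustration of the multilevel regression described in the Method section, the sketch below fits a mixed-effects model with a random intercept per unit on simulated data; the variable names, effect sizes, and unit structure are assumptions, not the study's data.

```python
# Illustrative multilevel (mixed-effects) regression in the spirit of the study:
# clinician-rated safety predicted by emotional exhaustion and workload, with a
# random intercept per unit. All data are simulated, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_units, n_per_unit = 30, 40
unit = np.repeat(np.arange(n_units), n_per_unit)
unit_effect = rng.normal(0, 0.3, n_units)[unit]       # unit-level variation
exhaustion = rng.normal(0, 1, unit.size)
workload = rng.normal(0, 1, unit.size)
safety = 4.0 - 0.4 * exhaustion - 0.1 * workload + unit_effect \
         + rng.normal(0, 0.5, unit.size)

df = pd.DataFrame(dict(unit=unit, exhaustion=exhaustion,
                       workload=workload, safety=safety))
model = smf.mixedlm("safety ~ exhaustion + workload", df, groups=df["unit"])
print(model.fit().summary())
```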
A data driven control method for structure vibration suppression
NASA Astrophysics Data System (ADS)
Xie, Yangmin; Wang, Chao; Shi, Hang; Shi, Junwei
2018-02-01
High radio-frequency space applications have motivated continuous research on vibration suppression of large space structures in both academia and industry. This paper introduces a novel data-driven control method to suppress vibrations of flexible structures and experimentally validates its suppression performance. Unlike model-based control approaches, the data-driven control method designs a controller directly from the input-output test data of the structure, without requiring parametric dynamics, and hence is free of system modeling. It utilizes the discrete frequency response obtained via spectral analysis and formulates a non-convex optimization problem to obtain optimized controller parameters for a predefined controller structure. The approach is then experimentally applied to an end-driven flexible beam-mass structure. The experimental results show that the presented method can achieve disturbance rejection competitive with a model-based mixed-sensitivity controller under the same design criterion, but with a much lower controller order and less design effort, demonstrating that the proposed data-driven control is an effective approach for vibration suppression of flexible structures.
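The following sketch illustrates the general data-driven tuning idea on simulated input-output data: a non-parametric frequency response is estimated by spectral analysis and a fixed-structure controller is tuned against it by numerical optimization. The plant, the PD structure, and the peak-sensitivity cost are stand-ins for illustration; the paper's actual mixed-sensitivity-style criterion and stability safeguards are not reproduced here.

```python
# Illustrative sketch of the data-driven idea: estimate the plant frequency
# response directly from input-output test data, then tune a fixed-structure
# controller over that response. The "plant" below is simulated only so the
# script runs end to end.
import numpy as np
from scipy import signal, optimize

fs = 1000.0
t = np.arange(0, 20, 1 / fs)
u = np.random.default_rng(1).normal(size=t.size)            # excitation signal
plant = signal.TransferFunction([50.0], [1.0, 2.0, 50.0])   # stand-in structure
_, y, _ = signal.lsim(plant, u, t)                          # "measured" output

# Non-parametric frequency response G(jw) = S_uy(w) / S_uu(w)
f, s_uu = signal.csd(u, u, fs=fs, nperseg=2048)
_, s_uy = signal.csd(u, y, fs=fs, nperseg=2048)
G = (s_uy / s_uu)[1:200]
w = 2 * np.pi * f[1:200]

def cost(p):
    kp, kd = p
    C = kp + kd * 1j * w                  # fixed-structure PD controller
    S = 1.0 / (1.0 + C * G)               # sensitivity on the frequency grid
    # crude criterion: peak sensitivity plus a small effort penalty
    return np.max(np.abs(S)) + 1e-3 * (kp ** 2 + kd ** 2)

res = optimize.minimize(cost, x0=[1.0, 0.01], method="Nelder-Mead")
print("tuned [kp, kd]:", res.x, "cost:", res.fun)
```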
A Data-Driven Approach to Interactive Visualization of Power Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Jun
Driven by emerging industry standards, electric utilities and grid coordination organizations are eager to seek advanced tools to assist grid operators in performing mission-critical tasks and enable them to make quick and accurate decisions. The emerging field of visual analytics holds tremendous promise for improving business practices in today's electric power industry. The investigation conducted here, however, has revealed that existing commercial power grid visualization tools rely heavily on human designers, hindering users' ability to discover. Additionally, for a large grid, it is very labor-intensive and costly to build and maintain the pre-designed visual displays. This project proposes a data-driven approach to overcome these common challenges. The proposed approach relies on developing powerful data manipulation algorithms to create visualizations based on the characteristics of empirically or mathematically derived data. The resulting visual presentations emphasize what the data is rather than how the data should be presented, thus fostering comprehension and discovery. Furthermore, the data-driven approach formulates visualizations on the fly; it does not require a visualization design stage, completely eliminating or significantly reducing the cost of building and maintaining visual displays. The research and development (R&D) conducted in this project was divided into two phases. The first phase (Phase I & II) focused on developing data-driven techniques for visualization of the power grid and its operation. Various data-driven visualization techniques were investigated, including pattern recognition for auto-generation of one-line diagrams, fuzzy-model-based rich data visualization for situational awareness, etc. The R&D conducted during the second phase (Phase IIB) focused on enhancing the prototyped data-driven visualization tool based on the gathered requirements and use cases. The goal is to evolve the prototyped tool developed …
Event-driven management algorithm of an Engineering documents circulation system
NASA Astrophysics Data System (ADS)
Kuzenkov, V.; Zebzeev, A.; Gromakov, E.
2015-04-01
A development methodology for an engineering document circulation system in a design company is reviewed. Discrete event-driven automata models are proposed for describing project management algorithms, and the use of Petri nets is proposed for the dynamic design of projects.
Retrospective data-driven respiratory gating for PET/CT
NASA Astrophysics Data System (ADS)
Schleyer, Paul J.; O'Doherty, Michael J.; Barrington, Sally F.; Marsden, Paul K.
2009-04-01
Respiratory motion can adversely affect both PET and CT acquisitions. Respiratory gating allows an acquisition to be divided into a series of motion-reduced bins according to the respiratory signal, which is typically hardware acquired. In order that the effects of motion can potentially be corrected for, we have developed a novel, automatic, data-driven gating method which retrospectively derives the respiratory signal from the acquired PET and CT data. PET data are acquired in listmode and analysed in sinogram space, and CT data are acquired in cine mode and analysed in image space. Spectral analysis is used to identify regions within the CT and PET data which are subject to respiratory motion, and the variation of counts within these regions is used to estimate the respiratory signal. Amplitude binning is then used to create motion-reduced PET and CT frames. The method was demonstrated with four patient datasets acquired on a 4-slice PET/CT system. To assess the accuracy of the data-derived respiratory signal, a hardware-based signal was acquired for comparison. Data-driven gating was successfully performed on PET and CT datasets for all four patients. Gated images demonstrated respiratory motion throughout the bin sequences for all PET and CT series, and image analysis and direct comparison of the traces derived from the data-driven method with the hardware-acquired traces indicated accurate recovery of the respiratory signal.
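A conceptual sketch of the data-driven gating steps (spectral selection of motion-affected elements, surrogate signal extraction, amplitude binning) is shown below on simulated dynamic data; the frame rate, thresholds, and respiratory frequency are assumptions, and the published method operates on PET sinograms and cine CT images rather than this toy array.

```python
# Conceptual sketch of data-driven respiratory gating: find spatial elements
# whose temporal spectrum peaks in the respiratory band, sum their counts to
# form a surrogate respiratory signal, then amplitude-bin the time frames.
import numpy as np

rng = np.random.default_rng(0)
fs = 2.0                                   # frames per second (assumed)
t = np.arange(0, 120, 1 / fs)              # 2 min of dynamic frames
resp = np.sin(2 * np.pi * 0.25 * t)        # 0.25 Hz breathing (simulated)

n_elem = 500                               # spatial elements (voxels / sino bins)
moving = rng.random(n_elem) < 0.2          # 20% of elements affected by motion
data = rng.normal(100, 5, (t.size, n_elem))
data[:, moving] += 30 * resp[:, None]      # motion modulates those elements

# Spectral analysis: keep elements with dominant power in the 0.1-0.5 Hz band
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spec = np.abs(np.fft.rfft(data - data.mean(0), axis=0)) ** 2
band = (freqs > 0.1) & (freqs < 0.5)
score = spec[band].sum(0) / spec[1:].sum(0)          # fraction of power in band
selected = score > np.percentile(score, 90)

signal_est = data[:, selected].sum(1)                # derived respiratory trace
signal_est = (signal_est - signal_est.mean()) / signal_est.std()
print("correlation with true breathing:", np.corrcoef(signal_est, resp)[0, 1].round(3))

# Amplitude binning into motion-reduced bins
n_bins = 5
edges = np.quantile(signal_est, np.linspace(0, 1, n_bins + 1))
bin_index = np.clip(np.digitize(signal_est, edges[1:-1]), 0, n_bins - 1)
print("frames per amplitude bin:", np.bincount(bin_index, minlength=n_bins))
```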
A simulation study on the constancy of cardiac energy metabolites during workload transition.
Saito, Ryuta; Takeuchi, Ayako; Himeno, Yukiko; Inagaki, Nobuya; Matsuoka, Satoshi
2016-12-01
The cardiac energy metabolites such as ATP, phosphocreatine, ADP and NADH are kept relatively constant during physiological cardiac workload transition. How this is accomplished is not yet clarified, though Ca2+ has been suggested to be one of the possible mechanisms. We constructed a detailed mathematical model of cardiac mitochondria based on experimental data and studied whether known Ca2+-dependent regulation mechanisms play roles in the metabolite constancy. Model simulations revealed that the Ca2+-dependent regulation mechanisms have important roles under the in vitro condition of isolated mitochondria, where malate and glutamate were mitochondrial substrates, while they have only a minor role and the composition of substrates has marked influence on the metabolite constancy during workload transition under the simulated in vivo condition where many substrates exist. These results help us understand the regulation mechanisms of cardiac energy metabolism during physiological cardiac workload transition. The cardiac energy metabolites such as ATP, phosphocreatine, ADP and NADH are kept relatively constant over a wide range of cardiac workload, though the mechanisms are not yet clarified. One possible regulator of mitochondrial metabolism is Ca2+, because it activates several mitochondrial enzymes and transporters. Here we constructed a mathematical model of cardiac mitochondria, including oxidative phosphorylation, substrate metabolism and ion/substrate transporters, based on experimental data, and studied whether the Ca2+-dependent activation mechanisms play roles in metabolite constancy. Under the in vitro condition of isolated mitochondria, where malate and glutamate were used as mitochondrial substrates, the model well reproduced the Ca2+ and inorganic phosphate (Pi) dependences of oxygen consumption, NADH level and mitochondrial membrane potential. The Ca2+-dependent activations of the aspartate/glutamate carrier and the F1Fo-ATPase, and …
The impact of automation on workload and dispensing errors in a hospital pharmacy.
James, K Lynette; Barlow, Dave; Bithell, Anne; Hiom, Sarah; Lord, Sue; Pollard, Mike; Roberts, Dave; Way, Cheryl; Whittlesea, Cate
2013-04-01
To determine the effect of installing an original-pack automated dispensing system (ADS) on dispensary workload and prevented dispensing incidents in a hospital pharmacy. Data on dispensary workload and prevented dispensing incidents, defined as dispensing errors detected and reported before medication had left the pharmacy, were collected over 6 weeks at a National Health Service hospital in Wales before and after the installation of an ADS. Workload was measured by non-participant observation using the event recording technique. Prevented dispensing incidents were self-reported by pharmacy staff on standardised forms. Median workloads (measured as items dispensed/person/hour) were compared using Mann-Whitney U tests, and rates of prevented dispensing incidents were compared using the Chi-square test. Spearman's rank correlation was used to examine the association between workload and prevented dispensing incidents. A P value of ≤0.05 was considered statistically significant. Median dispensary workload was significantly lower pre-automation (9.20 items/person/h) compared to post-automation (13.17 items/person/h, P < 0.001). The rate of prevented dispensing incidents was significantly lower post-automation (0.28%) than pre-automation (0.64%, P < 0.0001), but there was no difference (P = 0.277) in the types of dispensing incidents. A positive association existed between workload and prevented dispensing incidents both pre- (ρ = 0.13, P = 0.015) and post-automation (ρ = 0.23, P < 0.001). Dispensing incidents were found to occur during prolonged periods of moderate workload or after a busy period. The study findings suggest that automation improves dispensing efficiency and reduces the rate of prevented dispensing incidents. It is proposed that prevented dispensing incidents frequently occurred during periods of high workload due to involuntary automaticity. Prevented dispensing incidents occurring after a busy period were attributed to staff …
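The statistical comparisons named above map directly onto standard SciPy routines; the snippet below runs them on made-up numbers purely to show the mechanics, not to reproduce the study's results.

```python
# Minimal sketch of the statistical comparisons described above, on made-up
# data: Mann-Whitney U for workload, chi-square for incident rates, and
# Spearman's rank correlation between workload and prevented incidents.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pre_workload = rng.normal(9.2, 2.0, 200)     # items/person/hour, pre-automation
post_workload = rng.normal(13.2, 2.5, 200)   # items/person/hour, post-automation
print(stats.mannwhitneyu(pre_workload, post_workload))

# prevented dispensing incidents vs. items dispensed (hypothetical counts)
table = np.array([[64, 9936],     # pre-automation:  incidents, non-incidents
                  [28, 9972]])    # post-automation: incidents, non-incidents
print(stats.chi2_contingency(table)[:2])     # chi-square statistic, p-value

workload = rng.normal(11, 3, 100)
incidents = 0.02 * workload + rng.normal(0, 0.1, 100)
print(stats.spearmanr(workload, incidents))
```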
Li, Li; Lee, Nathan J; Glicksberg, Benjamin S; Radbill, Brian D; Dudley, Joel T
2016-01-01
The Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey is the first publicly reported nationwide survey to evaluate and compare hospitals. Increasing patient satisfaction is an important goal as it aims to achieve a more effective and efficient healthcare delivery system. In this study, we develop and apply an integrative, data-driven approach to identify clinical risk factors that associate with patient satisfaction outcomes. We included 1,771 unique adult patients who completed the HCAHPS survey and were discharged from the inpatient Medicine service from 2010 to 2012. We collected 266 clinical features, including patient demographics, lab measurements, medications, disease categories, and procedures, and applied the data-driven approach to identify risk factors associated with patient satisfaction outcomes. We identified 102 significant risk factors associated with 18 surveyed questions. The most significantly recurrent clinical risk factors were self-evaluation of health, education level, Asian race, White race, treatment in the BMT oncology division, and being prescribed a new medication. Patients who were prescribed pregabalin were less satisfied, particularly in relation to communication with nurses and pain management. Explanation of medication usage was associated with communication with nurses (q = 0.001), whereas explanation of medication side effects was associated with communication with doctors (q = 0.003). Overall hospital rating was associated with hospital environment, communication with doctors, and communication about medicines. However, patients' likelihood of recommending the hospital was associated with hospital environment, communication about medicines, pain management, and communication with nurses. Our study identified a number of putatively novel clinical risk factors for patient satisfaction that suggest new opportunities to better understand and manage patient satisfaction. Hospitals can use a data-driven approach to …
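The reported q-values imply a multiple-testing correction across the many feature-question associations; the sketch below shows one common way to do this (Benjamini-Hochberg FDR) on simulated data. The correction method and the univariate test used here are assumptions, not necessarily those of the study.

```python
# Sketch of the multiple-testing step implied by the reported q-values:
# test many clinical features against a satisfaction outcome and control the
# false discovery rate (Benjamini-Hochberg). Data are simulated, not HCAHPS.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n_patients, n_features = 1771, 266
X = rng.normal(size=(n_patients, n_features))
satisfied = rng.random(n_patients) < 0.6           # binary outcome (simulated)
X[satisfied, :5] += 0.3                            # 5 truly associated features

pvals = np.array([stats.mannwhitneyu(X[satisfied, j], X[~satisfied, j]).pvalue
                  for j in range(n_features)])
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("features passing FDR 0.05:", np.flatnonzero(reject))
print("smallest q-values:", np.sort(qvals)[:5].round(4))
```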
Algorithmic Management for Improving Collective Productivity in Crowdsourcing.
Yu, Han; Miao, Chunyan; Chen, Yiqiang; Fauvel, Simon; Li, Xiaoming; Lesser, Victor R
2017-10-02
Crowdsourcing systems are complex not only because of the huge number of potential strategies for assigning workers to tasks, but also due to the dynamic characteristics associated with workers. Maximizing social welfare in such situations is known to be NP-hard. To address these fundamental challenges, we propose the surprise-minimization-value-maximization (SMVM) approach. By analysing typical crowdsourcing system dynamics, we established a simple and novel worker desirability index (WDI) jointly considering the effect of each worker's reputation, workload and motivation to work on collective productivity. Through evaluating workers' WDI values, SMVM influences individual workers in real time about courses of action which can benefit the workers and lead to high collective productivity. Solutions can be produced in polynomial time and are proven to be asymptotically bounded by a theoretical optimal solution. High resolution simulations based on a real-world dataset demonstrate that SMVM significantly outperforms state-of-the-art approaches. A large-scale 3-year empirical study involving 1,144 participants in over 9,000 sessions shows that SMVM outperforms human task delegation decisions over 80% of the time under common workload conditions. The approach and results can help engineer highly scalable data-driven algorithmic management decision support systems for crowdsourcing.
NASA Technical Reports Server (NTRS)
Norman, R. Michael; Baxley, Brian T.; Adams, Cathy A.; Ellis, Kyle K. E.; Latorella, Kara A.; Comstock, James R., Jr.
2013-01-01
This document describes a collaborative FAA/NASA experiment using 22 commercial airline pilots to determine the effect of using Data Comm to issue messages during busy, terminal area operations. Four conditions were defined that span current day to future flight deck equipage: Voice communication only, Data Comm only, Data Comm with Moving Map Display, and Data Comm with Moving Map displaying taxi route. Each condition was used in an arrival and a departure scenario at Boston Logan Airport. Of particular interest was the flight crew response to D-TAXI, the use of Data Comm by Air Traffic Control (ATC) to send taxi instructions. Quantitative data was collected on subject reaction time, flight technical error, operational errors, and eye tracking information. Questionnaires collected subjective feedback on workload, situation awareness, and acceptability to the flight crew for using Data Comm in a busy terminal area. Results showed that 95% of the Data Comm messages were responded to by the flight crew within one minute and 97% of the messages within two minutes. However, post experiment debrief comments revealed almost unanimous consensus that two minutes was a reasonable expectation for crew response. Flight crews reported that Expected D-TAXI messages were useful, and employment of these messages acceptable at all altitude bands evaluated during arrival scenarios. Results also indicate that the use of Data Comm for all evaluated message types in the terminal area was acceptable during surface operations, and during arrivals at any altitude above the Final Approach Fix, in terms of response time, workload, situation awareness, and flight technical performance. The flight crew reported the use of Data Comm as implemented in this experiment as unacceptable in two instances: in clearances to cross an active runway, and D-TAXI messages between the Final Approach Fix and 80 knots during landing roll. Critical cockpit tasks and the urgency of out-the window scan made the
Physical and mental workloads in professional dance teachers.
Wanke, Eileen M; Schmidt, Mike; Leslie-Spinks, Jeremy; Fischer, Axel; Groneberg, David A
2015-03-01
The aim of this cross-sectional study was to investigate the level of mental and physical workload in professional dance teachers depending on the trained students' age, technique level, or dance style. A total of 133 professional dance pedagogues responded to an online cross-sectional questionnaire survey on self-assessment of the physical and mental workloads occurring during dance units. The majority of dance teachers estimated their level of physical and mental workload to be almost as high as that of their students, with differences in physical and mental workload observed depending on dance style, age of students, and technical level. More than 60% of the teachers were convinced that their occupation has positive effects on their own health in terms of self-realization (78.2%), the musculoskeletal system (66.9%), and social relationships (61.7%). Of all respondents, 58.6% stated that their musculoskeletal system was jeopardized by the physically demanding activity; this was followed by fear of financial insecurity (50.4%). The majority of all dance teachers (males 65.4%, females 63.9%) would like to obtain further education on preventing or dealing with physical workload. Physical and mental workloads play an important role in dance teaching, and coping with or preventing these loads could be key to a lifelong, healthy career as a professional dance teacher. Future trials should look at clinical parameters of physical and mental load.
Assessment of Crew Workload for the RAH-66 Comanche Force Development Experiment 1
2001-10-01
… Scale and a cockpit controls and displays usability questionnaire. Results of the assessment indicate that (a) workload was tolerable for the pilots … (The report also covers workload levels between front seat and back seat, pilot responses to the controls and displays usability questionnaire, and HMD symbology.) The data were analyzed to determine if the pilot flying the aircraft (pilot on controls) and the pilot operating the mission equipment …
Mental workload prediction based on attentional resource allocation and information processing.
Xiao, Xu; Wanyan, Xiaoru; Zhuang, Damin
2015-01-01
Mental workload is an important component in complex human-machine systems. The limited applicability of empirical workload measures produces the need for workload modeling and prediction methods. In the present study, a mental workload prediction model is built on the basis of attentional resource allocation and information processing to ensure pilots' accuracy and speed in understanding large amounts of flight information on the cockpit display interface. Validation with an empirical study of an abnormal attitude recovery task showed that this model's prediction of mental workload highly correlated with experimental results. This mental workload prediction model provides a new tool for optimizing human factors interface design and reducing human errors.
Robbins, Anthony S; Moilanen, Dale A; Fonseca, Vincent P; Chao, Susan Y
2002-04-01
A study was conducted to examine the relationship between two types of trends in the Air Force Medical Service Direct Care System (AFMS/DCS): trends in expenditures, total and by category, and trends in medical workload, defined as the sum of inpatient admissions and outpatient clinic visits. Expenditure and medical workload data were extracted from the Medical Expense and Performance Reporting System Executive Query System. Medical inflation data were obtained from the Bureau of Labor Statistics Producer Price Index series. Between fiscal years 1995 and 1999, the AFMS/DCS experienced a 21.2% decrease in medical workload, but total (nominal) expenditures declined only 3.6%. Of all expenditure categories, only inpatient medical care, outpatient medical care, and military-funded private sector care for active duty personnel (supplemental care) had any direct relationship with AFMS/DCS medical workload. Real expenditures for these three categories decreased by 20.3% during the 5-year period. Accounting for inflation and considering only expenditures related to medical workload, these results suggest that the AFMS/DCS is spending approximately 20% less money to do approximately 20% less work.
EEG-based workload estimation across affective contexts
Mühl, Christian; Jeunet, Camille; Lotte, Fabien
2014-01-01
Workload estimation from electroencephalographic signals (EEG) offers a highly sensitive tool to adapt the human–computer interaction to the user state. To create systems that reliably work in the complexity of the real world, a robustness against contextual changes (e.g., mood), has to be achieved. To study the resilience of state-of-the-art EEG-based workload classification against stress we devise a novel experimental protocol, in which we manipulated the affective context (stressful/non-stressful) while the participant solved a task with two workload levels. We recorded self-ratings, behavior, and physiology from 24 participants to validate the protocol. We test the capability of different, subject-specific workload classifiers using either frequency-domain, time-domain, or both feature varieties to generalize across contexts. We show that the classifiers are able to transfer between affective contexts, though performance suffers independent of the used feature domain. However, cross-context training is a simple and powerful remedy allowing the extraction of features in all studied feature varieties that are more resilient to task-unrelated variations in signal characteristics. Especially for frequency-domain features, across-context training is leading to a performance comparable to within-context training and testing. We discuss the significance of the result for neurophysiology-based workload detection in particular and for the construction of reliable passive brain–computer interfaces in general. PMID:24971046
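As a rough illustration of the frequency-domain branch of such a pipeline, the sketch below extracts theta and alpha band power from simulated epochs and cross-validates a linear classifier; channel counts, band edges, and the simulated workload effect are assumptions, and real pipelines would add filtering, artifact handling, and the cross-context splits studied in the paper.

```python
# Rough sketch of a frequency-domain workload classifier of the kind compared
# in the study: band-power features (theta, alpha) per channel feed a linear
# classifier. The "EEG" here is simulated noise with an injected theta effect.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
fs, n_trials, n_channels, n_samples = 256, 120, 8, 512
eeg = rng.normal(size=(n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)               # low vs. high workload
eeg[labels == 1, :, :] += 0.3 * np.sin(             # crude theta increase
    2 * np.pi * 6 * np.arange(n_samples) / fs)

def band_power(x, lo, hi):
    f, p = welch(x, fs=fs, nperseg=256)
    return p[:, :, (f >= lo) & (f < hi)].mean(-1)   # mean power in band

features = np.concatenate([band_power(eeg, 4, 8),    # theta
                           band_power(eeg, 8, 13)],  # alpha
                          axis=1)                    # (trials, 2*channels)
scores = cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```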
Yoon, Sang-Young; Ko, Jeonghan; Jung, Myung-Chul
2016-07-01
The aim of this study was to suggest a job rotation schedule by developing a mathematical model in order to reduce the cumulative workload caused by successive use of the same body region. Workload assessment using the rapid entire body assessment (REBA) method was performed for the model in three automotive assembly lines (chassis, trim, and finishing) to identify which body parts were exposed to relatively high workloads at each workstation. The workloads were incorporated into the model to develop a job rotation schedule. The proposed schedules prevent successive exposure to high workloads on the same body region and minimize between-worker variance in cumulative daily workload, whereas under no job rotation and under serial job rotation some workers were successively assigned to high-workload workstations. This model would help to reduce the potential for work-related musculoskeletal disorders (WMSDs) without additional cost for engineering work, although it may need more computational time and relatively complex job rotation sequences. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.
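A highly simplified sketch of the scheduling idea is given below: workers are reassigned each rotation period by a cost-based assignment that penalizes consecutive high load on the same body region and imbalanced cumulative load. The REBA-style scores and the greedy assignment heuristic are illustrative stand-ins, not the paper's mathematical model.

```python
# Simplified illustration of the job-rotation idea: assign workers to
# workstations each rotation period so that no one faces high load on the same
# body region twice in a row, while balancing cumulative load.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(5)
n_workers = n_stations = 6
n_periods = 4
regions = ["neck", "trunk", "arm", "leg"]
# workload[station, region]: REBA-like score per body region (hypothetical)
workload = rng.integers(1, 8, size=(n_stations, len(regions)))

cum_load = np.zeros((n_workers, len(regions)))    # cumulative load per worker
prev_high = np.zeros((n_workers, len(regions)), dtype=bool)
schedule = []

for period in range(n_periods):
    # cost = projected cumulative load + big penalty for repeating a region
    # that was highly loaded (score >= 6) in the previous period
    cost = np.zeros((n_workers, n_stations))
    for w in range(n_workers):
        for s in range(n_stations):
            high_now = workload[s] >= 6
            penalty = 100.0 * np.any(prev_high[w] & high_now)
            cost[w, s] = (cum_load[w] + workload[s]).sum() + penalty
    rows, cols = linear_sum_assignment(cost)
    schedule.append(cols)
    for w, s in zip(rows, cols):
        cum_load[w] += workload[s]
        prev_high[w] = workload[s] >= 6

print("station assigned to each worker, per period:")
print(np.array(schedule))
print("between-worker std of cumulative load:", cum_load.sum(1).std().round(2))
```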
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun
2017-11-01
In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems. Copyright © 2017 Elsevier B.V. All rights reserved.
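For orientation, the Ogata-Banks solution referred to above is the closed-form concentration profile for one-dimensional advection-dispersion under steady flow with continuous injection. The sketch below fits its parameters to a synthetic breakthrough series and extrapolates forward; the monitoring distance, parameter bounds, and observations are assumptions, not the EIT site data.

```python
# Sketch of using the Ogata-Banks solution (1-D advection-dispersion, steady
# flow) as the data model: fit velocity v and dispersion D to a concentration
# breakthrough series at a fixed monitoring distance, then extrapolate in time.
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

X = 5.0   # monitoring distance from the source (m, assumed)

def ogata_banks(t, v, D, c0):
    """C(x=X, t) for continuous injection at concentration c0."""
    t = np.asarray(t, dtype=float)
    a = (X - v * t) / (2.0 * np.sqrt(D * t))
    b = (X + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(a) + np.exp(v * X / D) * erfc(b))

# synthetic "training" observations with noise
t_obs = np.linspace(1, 60, 40)                        # days
true = ogata_banks(t_obs, v=0.2, D=0.5, c0=1.0)
obs = true + np.random.default_rng(6).normal(0, 0.02, t_obs.size)

popt, _ = curve_fit(ogata_banks, t_obs, obs, p0=[0.1, 0.2, 1.0],
                    bounds=([1e-3, 0.05, 0.1], [1.0, 2.0, 5.0]))
print("fitted v, D, c0:", np.round(popt, 3))

t_future = np.linspace(60, 120, 5)                    # prediction horizon
print("predicted concentrations:", np.round(ogata_banks(t_future, *popt), 3))
```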
Downing, Amanda; Mortimer, Molly; Hiers, Jill
2016-03-01
Warfarin is a high alert medication and a challenge to dose and monitor. Pharmacist-driven warfarin management has been shown to decrease the time the international normalized ratio (INR) is out of range, which may reduce undesired outcomes. The purpose of this study is to assess the effect of the implementation of a pharmacist-driven warfarin management protocol on the achievement of therapeutic INRs. A warfarin management protocol was developed using evidence-based literature and similar protocols from other institutions. Pharmacists utilized the protocol to provide patient-specific warfarin dosing upon provider referral. To evaluate the protocol's impact, a retrospective chart review pre- and post-implementation was completed for admitted patients receiving warfarin. Three hundred twenty-seven charts were reviewed for pre- and post-implementation data. INRs within therapeutic range increased from 27.8% before protocol implementation to 38.5% after implementation. There was also a reduction in subtherapeutic INRs (55.3% pre to 39% post) and supratherapeutic INRs of 5 or above (3.7% pre to 2.6% post). Supratherapeutic INRs between 3 and 5 did increase, from 13.2% before protocol implementation to 19.9% in the pharmacist-managed group. In addition to reducing the time to achievement of therapeutic INRs by 0.5 days, implementation of the protocol resulted in an increase in the number of patients with at least one therapeutic INR during admission (35% pre to 40% post). The implementation of a pharmacist-driven warfarin dosing protocol increased therapeutic INRs and decreased the time to therapeutic range, as well as the proportion of subtherapeutic INRs and supratherapeutic INRs of 5 or greater. Additional benefits of the protocol include documentation of Joint Commission National Patient Safety Goal compliance, promotion of interdisciplinary collaboration and increased continuity of care. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
The reliability and validity of flight task workload ratings
NASA Technical Reports Server (NTRS)
Childress, M. E.; Hart, S. G.; Bortolussi, M. R.
1982-01-01
Twelve instrument-rated general aviation pilots each flew two scenarios in a motion-base simulator. During each flight, the pilots verbally estimated their workload every three minutes. Following each flight, they again estimated workload for each flight segment and also rated their overall workload, perceived performance, and 13 specific factors on a bipolar scale. The results indicate that time (a priori, inflight, or postflight) of eliciting ratings, period to be covered by the ratings (a specific moment in time or a longer period), type of rating scale, and rating method (verbal, written, or other) may be important variables. Overall workload ratings appear to be predicted by different specific scales depending upon the situation, with activity level the best predictor. Perceived performance seems to bear little relationship to observer-rated performance when pilots rate their overall performance and an observer rates specific behaviors. Perceived workload and performance also seem unrelated.
Data-Driven Learning of Q-Matrix
ERIC Educational Resources Information Center
Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2012-01-01
The recent surge of interests in cognitive assessment has led to developments of novel statistical models for diagnostic classification. Central to many such models is the well-known "Q"-matrix, which specifies the item-attribute relationships. This article proposes a data-driven approach to identification of the "Q"-matrix and estimation of…
Voice-stress measure of mental workload
NASA Technical Reports Server (NTRS)
Alpert, Murray; Schneider, Sid J.
1988-01-01
In a planned experiment, male subjects between the ages of 18 and 50 will be required to produce speech while performing various tasks. Analysis of the speech produced should reveal which aspects of voice prosody are associated with increased workload. Preliminary results with two female subjects suggest a possible trend for voice frequency and amplitude to be higher, and the variance of the voice frequency to be lower, in the high-workload condition.
Health impairment of system engineers working on projects with heavy workload.
Shimizui, Hayato; Ooshima, Kirika; Miki, Akiko; Matsushita, Yoshie; Hattori, Youji; Sugita, Minoru
2011-03-01
It has been reported that many system engineers must work hard to produce computer systems, and some of them suffer from health impairment due to their hard work. The purpose of the present cross-sectional study was to investigate the health status of system engineers working on projects with high job strain, and to discuss countermeasures against health impairment in such projects from the perspective of occupational health practice. The study subjects were five superiors and their 35 subordinates working on computer system projects with high job strain at a large computer systems corporation in the Tokyo area. The control group comprised three superiors and their 18 subordinates in the same corporation. From July to November 2006, the subjects were interviewed by six occupational health nurses, who evaluated their health and recorded their health evaluation scores. The problems involved in producing the computer systems were sometimes very difficult to solve, even when long hours were spent working on them. The study detected a tendency in the overloaded, high-strain projects for healthy superiors to have unhealthy subordinates and unhealthy superiors to have healthy subordinates, a pattern not seen in the control group. A few employees whose health deteriorated were faced with very hard jobs in the overloaded projects. This means that heavy workloads were unevenly distributed among superiors and their subordinates in the overloaded projects, and the health of a few members with heavy workloads deteriorated as a result. To improve this situation, it may be important not only to assign a sufficient number of highly capable employees to the section, but also to even out the workload in the overloaded project by informing all project members of the health impairment suffered by a few members due to heavy workload, from the viewpoint of occupational health practice.
Does daily nurse staffing match ward workload variability? Three hospitals' experiences.
Gabbay, Uri; Bukchin, Michael
2009-01-01
… The real challenge will be to develop planning systems and implement corrective interventions, such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method, based on readily available data, for evaluating the appropriateness of daily staffing is presented; it is easy to implement and operate. The statistical process control method enables intra-ward evaluation, while standardization (converting crude into relative measures) enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. The method is not limited to crude measures; rather, it can use weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
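A minimal sketch of the statistical-process-control idea, assuming a crude patients-per-nurse ratio as the daily workload measure and conventional 3-sigma control limits, is shown below; the numbers are invented and the paper's own staffing indicators are more refined.

```python
# Simple sketch of the statistical-process-control idea: track a daily
# workload-per-nurse ratio for a ward, standardize it against the ward's own
# mean, and flag days falling outside the control limits.
import numpy as np

rng = np.random.default_rng(7)
days = 60
patients = rng.poisson(24, days)           # daily patient load (proxy workload)
nurses_on_duty = rng.integers(5, 9, days)  # daily staffing
ratio = patients / nurses_on_duty          # crude patients-per-nurse measure

center = ratio.mean()
sigma = ratio.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma    # 3-sigma control limits

out_of_control = np.flatnonzero((ratio > ucl) | (ratio < lcl))
print(f"center={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
print("days where staffing did not match workload:", out_of_control)

# Converting the crude ratio into a relative (standardized) measure allows
# comparison across wards with different case mixes, as the paper suggests.
z = (ratio - center) / sigma
print("share of days within ±2 SD:", np.mean(np.abs(z) <= 2).round(2))
```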
ERIC Educational Resources Information Center
Johnson, Adam W.
2013-01-01
As a growing entity within higher education organizational structures, enrollment managers (EMs) are primarily tasked with projecting, recruiting, and retaining the student population of their campuses. Enrollment managers are expected by institutional presidents as well as through industry standards to make data-driven planning decisions to reach…
Target volume and artifact evaluation of a new data-driven 4D CT.
Martin, Rachael; Pan, Tinsu
Four-dimensional computed tomography (4D CT) is often used to define the internal gross target volume (IGTV) for radiation therapy of lung cancer. Traditionally, this technique requires the use of an external motion surrogate; however, a new, image-data-driven 4D CT has become available. This study aims to describe this data-driven 4D CT and compare target contours created with it to those created using standard 4D CT. Cine CT data of 35 patients undergoing stereotactic body radiation therapy were collected and sorted into phases using standard and data-driven 4D CT. IGTV contours were drawn using a semiautomated method on maximum intensity projection images of both 4D CT methods. Errors resulting from the reproducibility of the method were characterized. A comparison of phase image artifacts was made using a normalized cross-correlation method that assigned a score from +1 (data-driven "better") to -1 (standard "better"). The volume difference between the data-driven and standard IGTVs was not significant (the data-driven IGTV was 2.1 ± 1.0% smaller, P = .08). The Dice similarity coefficient showed good similarity between the contours (0.949 ± 0.006). The mean surface separation was 0.4 ± 0.1 mm and the Hausdorff distance was 3.1 ± 0.4 mm. An average artifact score of +0.37 indicated that the data-driven method had significantly fewer and/or less severe artifacts than the standard method (P = 1.5 × 10^-5 for difference from 0). On average, the difference between IGTVs derived from data-driven and standard 4D CT was not clinically relevant or statistically significant, suggesting that data-driven 4D CT can be used in place of standard 4D CT without adjustments to IGTVs. The relatively large differences in some patients were usually attributed to limitations in automatic contouring or differences in artifacts. Artifact reduction and setup simplicity suggest a clinical advantage to data-driven 4D CT. Published by Elsevier Inc.
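The comparison metrics quoted above (Dice similarity, mean surface separation, Hausdorff distance) can be computed for any pair of binary contour masks; the sketch below does so for two synthetic spherical IGTVs. Voxel spacing is ignored and clinical workflows would operate on DICOM-RT structures, so this is illustrative only.

```python
# Sketch of the contour-comparison metrics reported above for two 3-D binary
# masks. Masks are synthetic spheres standing in for IGTV contours.
import numpy as np
from scipy import ndimage

def sphere(shape, center, radius):
    grid = np.indices(shape)
    dist = np.sqrt(sum((g - c) ** 2 for g, c in zip(grid, center)))
    return dist <= radius

igtv_a = sphere((64, 64, 64), (32, 32, 32), 12)   # e.g. data-driven 4D CT IGTV
igtv_b = sphere((64, 64, 64), (32, 33, 32), 12)   # e.g. standard 4D CT IGTV

dice = 2 * np.logical_and(igtv_a, igtv_b).sum() / (igtv_a.sum() + igtv_b.sum())

# surface distances via distance transforms of each mask's surface shell
surf_a = igtv_a ^ ndimage.binary_erosion(igtv_a)
surf_b = igtv_b ^ ndimage.binary_erosion(igtv_b)
dt_a = ndimage.distance_transform_edt(~surf_a)    # distance to nearest A surface voxel
dt_b = ndimage.distance_transform_edt(~surf_b)    # distance to nearest B surface voxel
mean_sep = 0.5 * (dt_b[surf_a].mean() + dt_a[surf_b].mean())
hausdorff = max(dt_b[surf_a].max(), dt_a[surf_b].max())

print(f"Dice = {dice:.3f}, mean surface separation = {mean_sep:.2f} voxels, "
      f"Hausdorff = {hausdorff:.2f} voxels")
```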
Making Data-Driven Decisions: Silent Reading
ERIC Educational Resources Information Center
Trudel, Heidi
2007-01-01
Due in part to conflicting opinions and research results, the practice of sustained silent reading (SSR) in schools has been questioned. After a frustrating experience with SSR, the author of this article began a data-driven decision-making process to gain new insights on how to structure silent reading in a classroom, including a comparison…
Exploration of an oculometer-based model of pilot workload
NASA Technical Reports Server (NTRS)
Krebs, M. J.; Wingert, J. W.; Cunningham, T.
1977-01-01
Potential relationships between eye behavior and pilot workload are discussed. A Honeywell Mark IIA oculometer was used to obtain the eye data in a fixed-base transport aircraft simulation facility. The data were analyzed to determine those parameters of eye behavior that were related to changes in the level of task difficulty of the simulated manual approach and landing on instruments. A number of trends and relationships between eye variables and pilot ratings were found, and a preliminary equation was written based on the results of a stepwise linear regression. High variability in the time spent on various instruments was related to differences in scanning strategy among pilots. A more detailed analysis of individual runs by individual pilots was performed to investigate the source of this variability more closely. The results indicated a high degree of intra-pilot variability in instrument scanning, and no consistent workload-related trends were found. Pupil diameter, which had demonstrated a strong relationship to task difficulty, was extensively re-examined.
Social Capital in Data-Driven Community College Reform
ERIC Educational Resources Information Center
Kerrigan, Monica Reid
2015-01-01
The current rhetoric around using data to improve community college student outcomes with only limited research on data-driven decision-making (DDDM) within postsecondary education compels a more comprehensive understanding of colleges' capacity for using data to inform decisions. Based on an analysis of faculty and administrators' perceptions and…
Effect of time span and task load on pilot mental workload
NASA Technical Reports Server (NTRS)
Berg, S. L.; Sheridan, T. B.
1986-01-01
Two sets of simulations are described that examined how a pilot's mental workload is affected by continuous manual-control activity versus discrete mental tasks, including the length of time between receiving an assignment and executing it. The first experiment evaluated two types of measures: objective performance indicators and subjective ratings. Subjective ratings for the two missions were different, but the objective performance measures were similar. In the second experiment, workload levels were increased and a second performance measure was taken. Mental workload had no influence on either performance-based workload measure. Subjective ratings discriminated among the scenarios and correlated with performance measures for high-workload flights. The number of mental tasks performed did not influence error rates, although high manual workloads did increase errors.
NASA Technical Reports Server (NTRS)
Acton, W. H.; Crabtree, M. S.; Simons, J. C.; Gomer, F. E.; Eckel, J. S.
1983-01-01
Information theoretic analysis and subjective paired-comparison and task ranking techniques were employed in order to scale the workload of 20 communications-related tasks frequently performed by the captain and first officer of transport category aircraft. Tasks were drawn from taped conversations between aircraft and air traffic controllers (ATC). Twenty crewmembers performed subjective message comparisons and task rankings on the basis of workload. Information theoretic results indicated a broad range of task difficulty levels, and substantial differences between captain and first officer workload levels. Preliminary subjective data tended to corroborate these results. A hybrid scale reflecting the results of both the analytical and the subjective techniques is currently being developed. The findings will be used to select representative sets of communications for use in high fidelity simulation.
Estimating workload using EEG spectral power and ERPs in the n-back task
NASA Astrophysics Data System (ADS)
Brouwer, Anne-Marie; Hogervorst, Maarten A.; van Erp, Jan B. F.; Heffelaar, Tobias; Zimmerman, Patrick H.; Oostenveld, Robert
2012-08-01
Previous studies indicate that both electroencephalogram (EEG) spectral power (in particular the alpha and theta band) and event-related potentials (ERPs) (in particular the P300) can be used as a measure of mental work or memory load. We compare their ability to estimate workload level in a well-controlled task. In addition, we combine both types of measures in a single classification model to examine whether this results in higher classification accuracy than either one alone. Participants watched a sequence of visually presented letters and indicated whether or not the current letter was the same as the one (n instances) before. Workload was varied by varying n. We developed different classification models using ERP features, frequency power features or a combination (fusion). Training and testing of the models simulated an online workload estimation situation. All our ERP, power and fusion models provide classification accuracies between 80% and 90% when distinguishing between the highest and the lowest workload condition after 2 min. For 32 out of 35 participants, classification was significantly higher than chance level after 2.5 s (or one letter) as estimated by the fusion model. Differences between the models are rather small, though the fusion model performs better than the other models when only short data segments are available for estimating workload.
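As a rough illustration of the feature-fusion idea described above, the sketch below concatenates ERP-style features with band-power-style features and scores a linear discriminant classifier with cross-validation. All arrays are synthetic placeholders and the feature names are assumptions; this is not the authors' classification model.

```python
# Synthetic illustration of feature-level fusion for workload classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_epochs = 200
erp_feats = rng.normal(size=(n_epochs, 8))     # stand-ins for P300 amplitudes per channel
power_feats = rng.normal(size=(n_epochs, 10))  # stand-ins for theta/alpha log band power
y = rng.integers(0, 2, size=n_epochs)          # 0 = low workload, 1 = high workload (random here)

X_fusion = np.hstack([erp_feats, power_feats]) # simple feature-level fusion
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(clf, X_fusion, y, cv=5).mean())
```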
Optimally Distributed Kalman Filtering with Data-Driven Communication
Dormann, Katharina
2018-01-01
For multisensor data fusion, distributed state estimation techniques that enable a local processing of sensor data are the means of choice in order to minimize storage and communication costs. In particular, a distributed implementation of the optimal Kalman filter has recently been developed. A significant disadvantage of this algorithm is that the fusion center needs access to each node so as to compute a consistent state estimate, which requires full communication each time an estimate is requested. In this article, different extensions of the optimally distributed Kalman filter are proposed that employ data-driven transmission schemes in order to reduce communication expenses. As a first relaxation of the full-rate communication scheme, it can be shown that each node only has to transmit every second time step without endangering consistency of the fusion result. Also, two data-driven algorithms are introduced that even allow for lower transmission rates, and bounds are derived to guarantee consistent fusion results. Simulations demonstrate that the data-driven distributed filtering schemes can outperform a centralized Kalman filter that requires each measurement to be sent to the center node. PMID:29596392
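The core idea of data-driven (event-triggered) transmission can be illustrated with a deliberately simple scalar example: a node sends its measurement only when the predicted-measurement residual is large enough to be informative. The threshold, noise levels, and dynamics below are invented for illustration, and the code is not the paper's optimally distributed algorithm or its consistency bounds.

```python
# Toy scalar Kalman filter with event-triggered ("data-driven") transmission.
import numpy as np

rng = np.random.default_rng(2)
a, q, r, delta = 0.95, 0.1, 0.5, 1.0   # assumed dynamics, noise variances, trigger threshold
x_true, x_hat, p = 0.0, 0.0, 1.0
sent = 0

for k in range(200):
    x_true = a * x_true + rng.normal(scale=np.sqrt(q))      # true state evolution
    z = x_true + rng.normal(scale=np.sqrt(r))               # local measurement

    x_hat, p = a * x_hat, a * a * p + q                     # time update (prediction)

    # data-driven communication: transmit only when the residual is informative
    if abs(z - x_hat) > delta:
        sent += 1
        k_gain = p / (p + r)
        x_hat, p = x_hat + k_gain * (z - x_hat), (1 - k_gain) * p

print(f"transmitted {sent}/200 measurements, final estimation error {abs(x_true - x_hat):.3f}")
```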
Research papers and publications (1981-1987): Workload research program
NASA Technical Reports Server (NTRS)
Hart, Sandra G. (Compiler)
1987-01-01
An annotated bibliography of the research reports written by participants in NASA's Workload Research Program since 1981 is presented, representing the results of theoretical and applied research conducted at Ames Research Center and at universities and industrial laboratories funded by the program. The major program elements included: 1) developing an understanding of the workload concept; 2) providing valid, reliable, and practical measures of workload; and 3) creating a computer model to predict workload. The goal is to provide workload-related design principles, measures, guidelines, and computational models. The research results are transferred to user groups by establishing close ties with manufacturers, civil and military operators of aerospace systems, and regulatory agencies; publishing scientific articles; participating in and sponsoring workshops and symposia; providing information, guidelines, and computer models; and contributing to the formulation of standards. In addition, the methods and theories developed have been applied to specific operational and design problems at the request of a number of industry and government agencies.
Academic context and perceived mental workload of psychology students.
Rubio-Valdehita, Susana; López-Higes, Ramón; Díaz-Ramiro, Eva
2014-01-01
The excessive workload of university students is an academic stressor. Consequently, it is necessary to evaluate and control the workload in education. This research applies the NASA-TLX scale as a measure of the workload. The objectives of this study were: (a) to measure the workload levels of a sample of 367 psychology students, (b) to group students according to their positive or negative perception of academic context (AC), and (c) to analyze the effects of AC on workload. To assess the perceived AC, we used an ad hoc questionnaire designed according to Demand-Control-Social Support and Effort-Reward Imbalance models. Using cluster analysis, participants were classified into two groups (positive versus negative context). The differences between groups show that a positive AC improves performance (p < .01) and reduces feelings of overload (p < .02), temporal demand (p < .02), and nervousness and frustration (p < .001). Social relationships with peers and teachers, student autonomy and result satisfaction were relevant dimensions of the AC (p < .001 in all cases).
Voice measures of workload in the advanced flight deck: Additional studies
NASA Technical Reports Server (NTRS)
Schneider, Sid J.; Alpert, Murray
1989-01-01
These studies investigated acoustical analysis of the voice as a measure of workload in individual operators. In the first study, voice samples were recorded from a single operator during high, medium, and low workload conditions. Mean amplitude, frequency, syllable duration, and emphasis all tended to increase as workload increased. In the second study, NASA test pilots performed a laboratory task, and used a flight simulator under differing work conditions. For two of the pilots, high workload in the simulator brought about greater amplitude, peak duration, and stress. In both the laboratory and simulator tasks, high workload tended to be associated with more statistically significant drop-offs in the acoustical measures than were lower workload levels. There was a great deal of intra-subject variability in the acoustical measures. The results suggested that in individual operators, increased workload might be revealed by high initial amplitude and frequency, followed by rapid drop-offs over time.
Driver distraction : eye glance analysis and conversation workload.
DOT National Transportation Integrated Search
2015-11-01
The objective of this project was to assess the risk of performing a secondary task while driving a commercial motor vehicle (CMV). The risk of conversation workload while driving a CMV was also assessed. Conversation workload is a proxy for cogn...
Adult social position and sick leave: the mediating effect of physical workload.
Corbett, Karina; Gran, Jon Michaeal; Kristensen, Petter; Mehlum, Ingrid Sivesind
2015-11-01
This study aimed to quantify how much of the adult social gradient in sick leave can be attributed to the mediating role of physical workload while accounting for the role of childhood and adolescent social position and neuroticism. Our sample consisted of 2099 women and 1229 men from a Norwegian birth cohort study (born 1967-1976) who participated in the Nord-Trøndelag Health Study (2006-2008) (HUNT3). Data on sick leave (defined as >16 calendar days; 2006-2009) and social position during childhood, adolescence, and adulthood were obtained from national registers. Study outcome was time-to-first sick leave spell. Physical workload and neuroticism were self-reported in HUNT3. Mediating effects through physical workload were estimated using a method based on the additive hazards survival model. A hypothetical change from highest to lowest group in adult social position was, for women, associated with 51.6 [95% confidence interval (95% CI) 24.7-78.5] additional spells per 100,000 person-days at risk, in a model adjusted for childhood and adolescent social position and neuroticism. The corresponding rate increase for men was 41.1 (95% CI 21.4-60.8). Of these additional spells, the proportion mediated through physical workload was 24% (95% CI 10-49) and 30% (95% CI 10-63) for women and men, respectively. The effect of adult social position on sick leave was partly mediated through physical workload, even while accounting for earlier life course factors. Our findings provide support that interventions aimed at reducing physical workload among those with lower adult social position could reduce sick leave risk.
File-System Workload on a Scientific Multiprocessor
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1995-01-01
Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload on an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.
Using Neural Networks to Explore Air Traffic Controller Workload
NASA Technical Reports Server (NTRS)
Martin, Lynne; Kozon, Thomas; Verma, Savita; Lozito, Sandra C.
2006-01-01
When a new system, concept, or tool is proposed in the aviation domain, one concern is the impact that this will have on operator workload. Because workload is experienced subjectively, it is difficult to measure in a way that allows comparison of proposed systems with those already in existence. Chatterji and Sridhar (2001) suggested a method by which airspace parameters can be translated into workload ratings, using a neural network. This approach was employed and modified to accept input from a non-real-time airspace simulation model. The following sections describe the preparation and testing work that will enable comparison of a future airspace concept with a current-day baseline in terms of workload levels.
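A minimal sketch of the general mapping described above (a small feed-forward network regressing a workload rating on a handful of airspace parameters) follows. The input features, the synthetic target function, and the network size are illustrative assumptions, not the Chatterji and Sridhar model.

```python
# Placeholder feed-forward regression from airspace parameters to workload.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 500
# hypothetical inputs: aircraft count, climbing count, descending count, mean separation (scaled 0-1)
airspace = rng.uniform(0, 1, size=(n, 4))
workload = (2 + 4 * airspace[:, 0] + 1.5 * airspace[:, 1] - 2 * airspace[:, 3]
            + rng.normal(scale=0.3, size=n))                # invented target function

net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000,
                   random_state=0).fit(airspace, workload)
print("predicted workload for a busy sector:", net.predict([[0.9, 0.6, 0.5, 0.2]])[0])
```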
Evaluation of Workload and its Impact on Satisfaction Among Pharmacy Academicians in Southern India.
Ahmad, Akram; Khan, Muhammad Umair; Srikanth, Akshaya B; Patel, Isha; Nagappa, Anantha Naik; Jamshed, Shazia Qasim
2015-06-01
The purpose of this study was to determine the level of workload among pharmacy academicians working in public and private sector universities in India. The study also aimed to assess the satisfaction of academicians towards their workload. A cross-sectional study was conducted for a period of 2 months among pharmacy academicians in Karnataka state of Southern India. Convenience sampling was used to select the sample, which was contacted via email and/or social networking sites. A questionnaire designed through a thorough literature review was used as a tool to collect data on workload (teaching, research, extracurricular services) and satisfaction. Of 214 participants, 95 returned the filled questionnaire, giving a response rate of 44.39%. Private sector academicians had a higher teaching load (p=0.046) and appeared to be less involved in research activities (p=0.046) compared with public sector academicians. More than half of the respondents (57.9%) were satisfied with their workload, with Assistant Professors being the least satisfied compared with Professors (p=0.01). Overall, private sector academicians are more burdened by teaching load and are also less satisfied with their workload. Revision of private universities' policies may aid in addressing this issue.
Evaluating MODIS satellite versus terrestrial data driven productivity estimates in Austria
NASA Astrophysics Data System (ADS)
Petritsch, R.; Boisvenue, C.; Pietsch, S. A.; Hasenauer, H.; Running, S. W.
2009-04-01
Sensors, such as the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite, are developed for monitoring global and/or regional ecosystem fluxes like net primary production (NPP). Although these systems should allow us to assess carbon sequestration issues, forest management impacts, etc., relatively little is known about the consistency and accuracy of the resulting satellite-driven estimates versus production estimates driven from ground data. In this study we compare the following NPP estimation methods: (i) NPP estimates as derived from MODIS and available on the internet; (ii) estimates resulting from the off-line version of the MODIS algorithm; (iii) estimates using regional meteorological data within the off-line algorithm; (iv) NPP estimates from a species-specific biogeochemical ecosystem model adopted for Alpine conditions; and (v) NPP estimates calculated from individual tree measurements. Single tree measurements were available from 624 forested sites across Austria, but only the data from 165 sample plots included all the necessary information for performing the comparison on plot level. To ensure independence of satellite-driven and ground-based predictions, only latitude and longitude for each site were used to obtain MODIS estimates. Along with the comparison of the different methods, we discuss problems like the differing dates of field campaigns (<1999) and acquisition of satellite images (2000-2005) or incompatible productivity definitions within the methods and come up with a framework for combining terrestrial and satellite data based productivity estimates. On average, MODIS estimates agreed well with the output of the model's self-initialization (spin-up), and biomass increment calculated from tree measurements is not significantly different from model results; however, correlations between satellite-derived and terrestrial estimates are relatively poor. Considering the different scales as they are 9km² from MODIS and
EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks.
Berka, Chris; Levendowski, Daniel J; Lumicao, Michelle N; Yau, Alan; Davis, Gene; Zivkovic, Vladimir T; Olmstead, Richard E; Tremoulet, Patrice D; Craven, Patrick L
2007-05-01
The ability to continuously and unobtrusively monitor levels of task engagement and mental workload in an operational environment could be useful in identifying more accurate and efficient methods for humans to interact with technology. This information could also be used to optimize the design of safer, more efficient work environments that increase motivation and productivity. The present study explored the feasibility of monitoring electroencephalographic (EEG) indices of engagement and workload acquired unobtrusively and quantified during performance of cognitive tests. EEG was acquired from 80 healthy participants with a wireless sensor headset (F3-F4, C3-C4, Cz-POz, F3-Cz, Fz-C3, Fz-POz) during tasks including: multi-level forward/backward-digit-span, grid-recall, trails, mental-addition, 20-min 3-Choice Vigilance, and image-learning and memory tests. EEG metrics for engagement and workload were calculated for each 1 s of EEG. Across participants, engagement but not workload decreased over the 20-min vigilance test. Engagement and workload were significantly increased during the encoding period of verbal and image-learning and memory tests when compared with the recognition/recall period. Workload but not engagement increased linearly as level of difficulty increased in forward and backward-digit-span, grid-recall, and mental-addition tests. EEG measures correlated with both subjective and objective performance metrics. These data in combination with previous studies suggest that EEG engagement reflects information-gathering, visual processing, and allocation of attention. EEG workload increases with increasing working memory load and during problem solving, integration of information, analytical reasoning, and may be more reflective of executive functions. Inspection of EEG on a second-by-second timescale revealed associations between workload and engagement levels when aligned with specific task events providing preliminary evidence that second
Individual differences and subjective workload assessment - Comparing pilots to nonpilots
NASA Technical Reports Server (NTRS)
Vidulich, Michael A.; Pandit, Parimal
1987-01-01
Results by two groups of subjects, pilots and nonpilots, for two subjective workload assessment techniques (the SWAT and NASA-TLX tests) intended to evaluate individual differences in the perception and reporting of subjective workload are compared with results obtained for several traditional personality tests. The personality tests were found to discriminate between the groups while the workload tests did not. It is concluded that although the workload tests may provide useful information with respect to the interaction between tasks and personality, they are not effective as pure tests of individual differences.
NASA Technical Reports Server (NTRS)
Scerbo, Mark; Coyne, Joseph; Burt, Jennifer L. (Technical Monitor)
2002-01-01
My work at NASA Langley has focused on Aviation Weather Information (AWIN) displays. The majority of my time at LaRC has been spent on the Workload and Relative Position (WaRP) study. The goal of this project is to determine how an AWIN display at various positions within the cockpit affects pilot performance and workload. The project is being conducted in Langley's Cessna 206H research aircraft. During the past year the design of the experiment was finalized and approved. Despite facing several delays, the data collection was completed in early February. After the completion of the data collection, an extensive data entry task began. This required recording air speed, altitude, course heading, bank angle, and vertical speed information from videos of the primary flight displays. These data were then used to determine root mean square error (RMSE) for each experimental condition. In addition to the performance data (RMSE) taken from flight path deviation, the study also collected data on the pilot's accuracy in reporting weather information, and a subjective rating of workload from the pilot. The data for this experiment are currently being analyzed. Overall, the current experiment should help to determine potential costs and benefits associated with AWIN displays. The data will be used to determine whether a private pilot can safely fly a general aviation aircraft while operating a weather display. Clearly, a display that adds to the pilot's already heavy workload represents a potential problem. The study will compare the use of an AWIN display to conventional means of acquiring weather data. The placement of the display within the cockpit (i.e., either on the yoke, kneeboard, or panel) will also be compared in terms of workload, performance, and pilot preference.
The dissociation of subjective measures of mental workload and performance
NASA Technical Reports Server (NTRS)
Yeh, Y. H.; Wickens, C. D.
1984-01-01
Dissociation between performance and subjective workload measures was investigated in the theoretical framework of the multiple resources model. Subjective measures do not preserve the vector characteristics in the multidimensional space described by the model. A theory of dissociation was proposed to locate the sources that may produce dissociation between the two workload measures. According to the theory, performance is affected by every aspect of processing whereas subjective workload is sensitive to the amount of aggregate resource investment and is dominated by the demands on the perceptual/central resources. The proposed theory was tested in three experiments. Results showed that performance improved but subjective workload was elevated with an increasing amount of resource investment. Furthermore, subjective workload was not as sensitive as was performance to differences in the amount of resource competition between two tasks. The demand on perceptual/central resources was found to be the most salient component of subjective workload. Dissociation occurred when the demand on this component was increased by the number of concurrent tasks or by the number of display elements. However, demands on response resources were weighted in subjective introspection as much as demands on perceptual/central resources. The implications of these results for workload practitioners are described.
MyDiabetesMyWay: An Evolving National Data Driven Diabetes Self-Management Platform.
Wake, Deborah J; He, Jinzhang; Czesak, Anna Maria; Mughal, Fezan; Cunningham, Scott G
2016-09-01
MyDiabetesMyWay (MDMW) is an award-winning national electronic personal health record and self-management platform for diabetes patients in Scotland. This platform links multiple national institutional and patient-recorded data sources to provide a unique resource for patient care and self-management. This review considers the current evidence for online interventions in diabetes and discusses these in the context of current and ongoing developments for MDMW. Evaluation of MDMW through patient-reported outcomes demonstrates a positive impact on self-management. User feedback has highlighted barriers to uptake and has guided platform evolution from an education resource website to an electronic personal health record now encompassing remote monitoring, communication tools and personalized education links. Challenges in delivering digital interventions for long-term conditions include integration of data between institutional and personal recorded sources to perform big data analytics and facilitating technology use in those with disabilities, low digital literacy, low socioeconomic status and in minority groups. The potential for technology supported health improvement is great, but awareness and adoption by health workers and patients remains a significant barrier. © 2016 Diabetes Technology Society.
Development and early application of the Scottish Community Nursing Workload Measurement Tool.
Grafen, May; Mackenzie, Fiona C
2015-02-01
This article describes the development and early application of the Scottish Community Nursing Workload Measurement Tool, part of a suite of tools aiming to ensure a consistent approach to measuring nursing workload across NHS Scotland. The tool, which enables community nurses to record and report their actual workload by collecting information on six categories of activity, is now being used by all NHS boards as part of a triangulated approach. Data being generated by the tool at national level include indications that approximately 50% of band 6 district nurses' time is spent in face-to-face and non-face-to-face contact and planned sessions with patients, and that over 60% of face-to-face contacts are at 'moderate' and 'complex' levels of intervention (2012 data). These data are providing hard evidence of key elements of community nursing activity and practice that will enable informed decisions about workforce planning to be taken forward locally and nationally. The article features an account of the early impact of the tool's implementation in an NHS board by an associate director of nursing. Positive effects from implementation include the generation of reliable data to inform planning decisions, identification of issues around nursing time spent on administrative tasks, clarification of school nursing roles, and information being fed back to teams on various aspects of performance.
NASA Technical Reports Server (NTRS)
Waller, M. C.
1976-01-01
An electro-optical device called an oculometer which tracks a subject's lookpoint as a time function has been used to collect data in a real-time simulation study of instrument landing system (ILS) approaches. The data describing the scanning behavior of a pilot during the instrument approaches have been analyzed by use of a stepwise regression analysis technique. A statistically significant correlation between pilot workload, as indicated by pilot ratings, and scanning behavior has been established. In addition, it was demonstrated that parameters derived from the scanning behavior data can be combined in a mathematical equation to provide a good representation of pilot workload.
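The approach described above, combining oculometer-derived scanning parameters into a regression equation for rated workload, can be sketched generically with forward feature selection and ordinary least squares. The feature names, the simulated data, and the use of scikit-learn's sequential selector are all illustrative assumptions; the original analysis used a stepwise regression on real scanning data.

```python
# Illustrative forward selection of eye-scan features for a workload rating model.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_runs = 60
# hypothetical columns: dwell_attitude, dwell_hsi, transition_rate, pupil_mm
X = rng.normal(size=(n_runs, 4))
ratings = 3.0 + 1.2 * X[:, 2] - 0.8 * X[:, 0] + rng.normal(scale=0.5, size=n_runs)

selector = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2,
                                     direction="forward").fit(X, ratings)
model = LinearRegression().fit(X[:, selector.get_support()], ratings)
print("selected feature columns:", np.flatnonzero(selector.get_support()))
print("coefficients:", model.coef_, "intercept:", model.intercept_)
```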
Emergency nursing workload and patient dependency in the ambulance bay: A prospective study.
Varndell, Wayne; Ryan, Elizabeth; Jeffers, Alison; Marquez-Hunt, Nadya
2016-11-01
The purpose of this prospective observational study was to characterise patients occupying the ambulance bay and to determine the ensuing nursing workload. The number of patients presenting to ED by ambulance is increasing. During periods of peak demand and access block in the ED, patients with ongoing care needs, requiring continual assessment and symptom management by emergency nurses, can remain in the ambulance bay for extended periods of time. The profile of these patients and the related nursing workload are not well known. A prospective observational study design based upon a convenience sample of patients was conducted over a randomly selected four-week period. Nursing workload was assessed using the Jones Dependency Tool (JDT). A modified Work Observation Method By Activity Timing technique was used to estimate direct nursing care time. Of 4068 presentations to ED, 640 (16%) occupied the ambulance bay following triage, of which the majority (n=408; 64%) had arrived by ambulance. Of those occupying the ambulance bay, 205 (32%) were evaluated using the JDT. The majority of patients had potentially life-threatening symptoms (ATS 3, n=424; 66%), were moderately dependent (n=134; 65%), and consumed approximately 152.1h of direct nursing care time. A large proportion of direct nursing care time was spent on patient reassessment (60.4h) and pain management (29.6h). Patients occupying the ambulance bay had an average ED length of stay of 5.6h (4.6h), of which 1.8h (SD 1.8h) was spent delayed in the ambulance bay. Early detailed assessment and symptom management of patients occupying the ambulance bay is extensively undertaken by emergency nurses. The frequency and number of patients off-loaded into non-clinical areas is not currently monitored or reported upon. This study has demonstrated that patients managed in the ambulance bay consume large amounts of nursing resources, and commonly require acute-level care and hospital admission. Copyright © 2016 College of Emergency
Postoperative electrolyte management: Current practice patterns of surgeons and residents.
Angarita, Fernando A; Dueck, Andrew D; Azouz, Solomon M
2015-07-01
Managing postoperative electrolyte imbalances often is driven by dogma. To identify areas of improvement, we assessed the practice pattern of postoperative electrolyte management among surgeons and residents. An online survey was distributed among attending surgeons and surgical residents at the University of Toronto. The survey was designed according to a systematic approach for formulating self-administered questionnaires. Questions addressed workload, decision making in hypothetical clinical scenarios, and improvement strategies. Of 232 surveys distributed, 156 were completed (response rate: 67%). The majority stated that junior residents were responsible for managing electrolytes at 13 University of Toronto-affiliated hospitals. Supervision was carried out predominately by senior residents (75%). Thirteen percent reported management went unsupervised. Approximately 59% of residents were unaware how often attending surgeons assessed patients' electrolytes. Despite the majority of residents (53.7%) reporting they were never given tools or trained in electrolyte replacement, they considered themselves moderately or extremely confident. The management of hypothetical clinical scenarios differed between residents and attending surgeons. The majority (50.5%) of respondents considered that an electrolyte replacement protocol is the most appropriate improvement strategy. Electrolyte replacement represents an important component of surgeons' workload. Despite reporting that formal training in electrolyte management is limited, residents consider themselves competent; however, their practice is highly variable and often differs from pharmacologic-directed recommendations. Optimizing how postoperative electrolytes are managed in surgical wards requires building a framework that improves knowledge, training, and limits unnecessary interventions. Copyright © 2015 Elsevier Inc. All rights reserved.
Application of statistical mining in healthcare data management for allergic diseases
NASA Astrophysics Data System (ADS)
Wawrzyniak, Zbigniew M.; Martínez Santolaya, Sara
2014-11-01
The paper aims to discuss data mining techniques based on statistical tools for medical data management in the case of long-term diseases. Data collected from a population survey are the source for reasoning about and identifying the disease processes responsible for a patient's illness and its symptoms, and for supporting knowledge-based decisions on the course of action to correct the patient's condition. The case considered here, as an example of this constructive approach to data management, is the dependence of chronic allergic diseases on selected symptoms and environmental conditions. The knowledge, summarized systematically as accumulated experience, constitutes a simplified experiential model of the diseases whose feature space is built from a small set of indicators. We present a disease-symptom-opinion model with knowledge discovery for data management in healthcare. Notably, the model is purely data-driven: it evaluates knowledge of disease processes and the probability dependence of future disease events on symptoms and other attributes. The example, based on the outcomes of a survey on long-term (chronic) disease, shows that a small set of core indicators, such as four or more symptoms and opinions, can be very helpful in reflecting health status change over disease causes. Furthermore, a data-driven understanding of disease mechanisms gives physicians a basis for treatment choices, which highlights the need for data governance in this research domain of knowledge discovered from surveys.
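To make the probabilistic dependence concrete, the sketch below fits a naive Bayes model that estimates the probability of a chronic allergic-disease indicator from a few binary symptom/opinion indicators. The survey data, indicator names, and model choice are assumptions for illustration only; the paper describes a statistical-mining model rather than this specific classifier.

```python
# Simulated survey: probability of a disease indicator given binary symptom indicators.
import numpy as np
from sklearn.naive_bayes import BernoulliNB

rng = np.random.default_rng(7)
n = 1000
symptoms = rng.integers(0, 2, size=(n, 4))   # hypothetical: rhinitis, cough, itching, dust exposure
logit = -2.0 + 1.2 * symptoms[:, 0] + 0.8 * symptoms[:, 1] + 1.0 * symptoms[:, 3]
disease = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated disease labels

model = BernoulliNB().fit(symptoms, disease)
print("P(disease | all four indicators present) =",
      model.predict_proba([[1, 1, 1, 1]])[0, 1])
```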
Crew procedures and workload of retrofit concepts for microwave landing system
NASA Technical Reports Server (NTRS)
Summers, Leland G.; Jonsson, Jon E.
1989-01-01
Crew procedures and workload for Microwave Landing Systems (MLS) that could be retrofitted into existing transport aircraft were evaluated. Two MLS receiver concepts were developed. One is capable of capturing a runway centerline and the other is capable of capturing a segmented approach path. Crew procedures were identified and crew task analyses were performed using each concept. Crew workload comparisons were made between the MLS concepts and an ILS baseline using a task-timeline workload model. Workload indexes were obtained for each scenario. The results showed that workload was comparable to the ILS baseline for the MLS centerline capture concept, but significantly higher for the segmented path capture concept.
Time management for case managers--so much work, so little time.
Cesta, Toni
2014-08-01
The world of a case manager is a busy one, and you may not have all the resources you need each and every day. If you can maintain a routine it will make the workload more manageable for you and will allow room for those surprises that invariably happen. Whether you are a new or a seasoned case manager, organizing your workload can always help smooth out the rough edges in anyone's hectic day!
Data-Driven H∞ Control for Nonlinear Distributed Parameter Systems.
Luo, Biao; Huang, Tingwen; Wu, Huai-Ning; Yang, Xiong
2015-11-01
The data-driven H∞ control problem of nonlinear distributed parameter systems is considered in this paper. An off-policy learning method is developed to learn the H∞ control policy from real system data rather than the mathematical model. First, Karhunen-Loève decomposition is used to compute the empirical eigenfunctions, which are then employed to derive a reduced-order model (ROM) of the slow subsystem based on the singular perturbation theory. The H∞ control problem is reformulated based on the ROM, which can theoretically be transformed to solving the Hamilton-Jacobi-Isaacs (HJI) equation. To learn the solution of the HJI equation from real system data, a data-driven off-policy learning approach is proposed based on the simultaneous policy update algorithm and its convergence is proved. For implementation purposes, a neural network (NN)-based action-critic structure is developed, where a critic NN and two action NNs are employed to approximate the value function, control, and disturbance policies, respectively. Subsequently, a least-squares NN weight-tuning rule is derived with the method of weighted residuals. Finally, the developed data-driven off-policy learning approach is applied to a nonlinear diffusion-reaction process, and the obtained results demonstrate its effectiveness.
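The Karhunen-Loève (proper orthogonal decomposition) step mentioned above, extracting empirical eigenfunctions from snapshot data and projecting onto a reduced-order basis, can be sketched with a plain SVD. The snapshot matrix, energy threshold, and mode count below are synthetic assumptions, not the paper's process data or its H∞ learning scheme.

```python
# Snapshot-based empirical eigenfunctions via SVD (generic POD sketch).
import numpy as np

rng = np.random.default_rng(4)
n_space, n_snapshots = 200, 80
x = np.linspace(0, 1, n_space)
# synthetic snapshots: two dominant spatial modes plus small noise
snapshots = (np.outer(np.sin(np.pi * x), rng.normal(size=n_snapshots))
             + 0.3 * np.outer(np.sin(2 * np.pi * x), rng.normal(size=n_snapshots))
             + 0.01 * rng.normal(size=(n_space, n_snapshots)))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.99)) + 1      # modes capturing 99% of the energy
phi = U[:, :k]                                  # empirical eigenfunctions (basis)
reduced_state = phi.T @ snapshots[:, -1]        # ROM coordinates of the latest snapshot
print(f"retained {k} modes; reduced state = {reduced_state}")
```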
Using Psychophysiological Sensors to Assess Mental Workload During Web Browsing.
Jimenez-Molina, Angel; Retamal, Cristian; Lira, Hernan
2018-02-03
Knowledge of the mental workload induced by a Web page is essential for improving users' browsing experience. However, continuously assessing the mental workload during a browsing task is challenging. To address this issue, this paper leverages the correlation between stimuli and physiological responses, which are measured with high-frequency, non-invasive psychophysiological sensors during very short span windows. An experiment was conducted to identify levels of mental workload through the analysis of pupil dilation measured by an eye-tracking sensor. In addition, a method was developed to classify mental workload by appropriately combining different signals (electrodermal activity (EDA), electrocardiogram, photoplethysmography (PPG), electroencephalogram (EEG), temperature and pupil dilation) obtained with non-invasive psychophysiological sensors. The results show that the Web browsing task involves four levels of mental workload. Also, by combining all the sensors, the efficiency of the classification reaches 93.7%.
Fallahi, Majid; Motamedzade, Majid; Heidarimoghadam, Rashid; Soltanian, Ali Reza; Miyake, Shinji
2016-01-01
The present study aimed to evaluate the operators' mental workload (MW) of cement, city traffic control and power plant control centers using subjective and objective measures during system vital parameters monitoring. This cross-sectional study was conducted from June 2014 to February 2015 at the cement, city traffic control and power plant control centers. Electrocardiography and electroencephalography data were recorded from forty males during their daily work in resting, low mental workload (LMW), high mental workload (HMW) and recovery conditions (each block 5 minutes). The NASA-Task Load Index (TLX) was used to evaluate the subjective workload of the operators. The results showed that increasing MW had a significant effect on the operators' subjective responses in the two conditions (F[1, 53] = 216.303, P < 0.001, η² = 0.803). Also, the Task-MW interaction effect on operators' subjective responses was significant (F[3, 53] = 12.628, P < 0.001, η² = 0.417). Repeated measures analysis of variance (ANOVA) indicated that increasing mental demands had a significant effect on heart rate, low frequency/high frequency ratio, theta and alpha band activity. The results suggested that when operators' mental demands increased, especially in the traffic control and power plant tasks, their mental fatigue and stress level increased and their mental health deteriorated. Therefore, it may be necessary to implement an ergonomic program or administrative controls to manage probable mental health problems in these control centers. Furthermore, by evaluating MW, the control center director can organize the human resources for each MW condition to sustain appropriate performance as well as improve system functions.
Cognitive Workload and Sleep Restriction Interact to Influence Sleep Homeostatic Responses
Goel, Namni; Abe, Takashi; Braun, Marcia E.; Dinges, David F.
2014-01-01
Study Objectives: Determine the effects of high versus moderate workload on sleep physiology and neurobehavioral measures, during sleep restriction (SR) and no sleep restriction (NSR) conditions. Design: Ten-night experiment involving cognitive workload and SR manipulations. Setting: Controlled laboratory environment. Participants: Sixty-three healthy adults (mean ± standard deviation: 33.2 ± 8.7 y; 29 females), age 22–50 y. Interventions: Following three baseline 8 h time in bed (TIB) nights, subjects were randomized to one of four conditions: high cognitive workload (HW) + SR; moderate cognitive workload (MW) + SR; HW + NSR; or MW + NSR. SR entailed 5 consecutive nights at 4 h TIB; NSR entailed 5 consecutive nights at 8 h TIB. Subjects received three workload test sessions/day consisting of 15-min preworkload assessments, followed by a 60-min (MW) or 120-min (HW) workload manipulation comprised of visually based cognitive tasks, and concluding with 15-min of postworkload assessments. Experimental nights were followed by two 8-h TIB recovery sleep nights. Polysomnography was collected on baseline night 3, experimental nights 1, 4, and 5, and recovery night 1 using three channels (central, frontal, occipital [C3, Fz, O2]). Measurements and Results: High workload, regardless of sleep duration, increased subjective fatigue and sleepiness (all P < 0.05). In contrast, sleep restriction produced cumulative increases in Psychomotor Vigilance Test (PVT) lapses, fatigue, and sleepiness and decreases in PVT response speed and Maintenance of Wakefulness Test (MWT) sleep onset latencies (all P < 0.05). High workload produced longer sleep onset latencies (P < 0.05, d = 0.63) and less wake after sleep onset (P < 0.05, d = 0.64) than moderate workload. Slow-wave energy—the putative marker of sleep homeostasis—was higher at O2 than C3 only in the HW + SR condition (P < 0.05). Conclusions: High cognitive workload delayed sleep onset, but it also promoted sleep homeostatic
[Effects of mental workload on work ability in primary and secondary school teachers].
Xiao, Yuanmei; Li, Weijuan; Ren, Qingfeng; Ren, Xiaohui; Wang, Zhiming; Wang, Mianzhen; Lan, Yajia
2015-02-01
To investigate how primary and secondary school teachers' work ability changes with their mental workload. A total of 901 primary and secondary school teachers were selected by random cluster sampling, and their mental workload and work ability were then assessed by the National Aeronautics and Space Administration-Task Load Index (NASA-TLX) and Work Ability Index (WAI) questionnaires, whose reliability and validity had been tested. The effects of their mental workload on work ability were analyzed. Primary and secondary school teachers' work ability reached the highest level at a certain level of mental workload (55.73 < mental workload ≤ 64.10). When their mental workload was lower than this level, their work ability had a positive correlation with the mental workload. Their work ability increased or remained stable with increasing mental workload. Moreover, the percentage of teachers with good work ability increased, while that of teachers with moderate work ability decreased. But when their mental workload was higher than this level, their work ability had a negative correlation with the mental workload. Their work ability significantly decreased with increasing mental workload (P < 0.01). Furthermore, the percentage of teachers with good work ability decreased, while that of teachers with moderate work ability increased (P < 0.001). Too high or too low a mental workload will result in a decline in primary and secondary school teachers' work ability. A moderate mental workload (55.73∼64.10) will benefit the maintenance and stability of their work ability.
Efficient mental workload estimation using task-independent EEG features.
Roy, R N; Charbonnier, S; Campagne, A; Bonnet, S
2016-04-01
Mental workload is frequently estimated by EEG-based mental state monitoring systems. Usually, these systems use spectral markers and event-related potentials (ERPs). To our knowledge, no study has directly compared their performance for mental workload assessment, nor evaluated the stability in time of these markers and of the performance of the associated mental workload estimators. This study proposes a comparison of two processing chains, one based on the power in five frequency bands, and one based on ERPs, both including a spatial filtering step (respectively CSP and CCA), an FLDA classification and a 10-fold cross-validation. To get closer to a real life implementation, spectral markers were extracted from a short window (i.e. towards reactive systems) that did not include any motor activity and the analyzed ERPs were elicited by a task-independent probe that required a reflex-like answer (i.e. close to the ones required by dead man's vigilance devices). The data were acquired from 20 participants who performed a Sternberg memory task for 90 min (i.e. 2/6 digits to memorize) inside which a simple detection task was inserted. The results were compared both when the testing was performed at the beginning and end of the session. Both chains performed significantly better than random; however the one based on the spectral markers had a low performance (60%) and was not stable in time. Conversely, the ERP-based chain gave very high results (91%) and was stable in time. This study demonstrates that an efficient and stable in time workload estimation can be achieved using task-independent spatially filtered ERPs elicited in a minimally intrusive manner.
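The spectral-marker side of the comparison above boils down to estimating power in a few canonical frequency bands on a short EEG window. The sketch below does this with Welch's method on a synthetic signal; the sampling rate, window length, and band edges are common defaults assumed for illustration, not necessarily the authors' exact settings.

```python
# Band-power features from a short synthetic EEG window using Welch's method.
import numpy as np
from scipy.signal import welch

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)               # 2-s analysis window
eeg = (np.sin(2 * np.pi * 10 * t)           # alpha-like component
       + 0.5 * np.sin(2 * np.pi * 6 * t)    # theta-like component
       + 0.2 * np.random.default_rng(6).normal(size=t.size))

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 45)}
freqs, psd = welch(eeg, fs=fs, nperseg=fs)
df = freqs[1] - freqs[0]
features = {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in bands.items()}
print(features)
```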
Efficient mental workload estimation using task-independent EEG features
NASA Astrophysics Data System (ADS)
Roy, R. N.; Charbonnier, S.; Campagne, A.; Bonnet, S.
2016-04-01
Objective. Mental workload is frequently estimated by EEG-based mental state monitoring systems. Usually, these systems use spectral markers and event-related potentials (ERPs). To our knowledge, no study has directly compared their performance for mental workload assessment, nor evaluated the stability in time of these markers and of the performance of the associated mental workload estimators. This study proposes a comparison of two processing chains, one based on the power in five frequency bands, and one based on ERPs, both including a spatial filtering step (respectively CSP and CCA), an FLDA classification and a 10-fold cross-validation. Approach. To get closer to a real life implementation, spectral markers were extracted from a short window (i.e. towards reactive systems) that did not include any motor activity and the analyzed ERPs were elicited by a task-independent probe that required a reflex-like answer (i.e. close to the ones required by dead man’s vigilance devices). The data were acquired from 20 participants who performed a Sternberg memory task for 90 min (i.e. 2/6 digits to memorize) inside which a simple detection task was inserted. The results were compared both when the testing was performed at the beginning and end of the session. Main results. Both chains performed significantly better than random; however the one based on the spectral markers had a low performance (60%) and was not stable in time. Conversely, the ERP-based chain gave very high results (91%) and was stable in time. Significance. This study demonstrates that an efficient and stable in time workload estimation can be achieved using task-independent spatially filtered ERPs elicited in a minimally intrusive manner.
ERIC Educational Resources Information Center
Johnson, Adam W.
2016-01-01
As a growing entity within higher education organizational structures, enrollment managers (EMs) are primarily tasked with projecting, recruiting, and retaining the student population of their campuses. Enrollment managers are expected by institutional presidents as well as through industry standards to make data-driven planning decisions to reach…
Mental workload associated with operating an agricultural sprayer: an empirical approach.
Dey, A K; Mann, D D
2011-04-01
Agricultural spraying involves two major tasks: guiding a sprayer in response to a GPS navigation device, and simultaneous monitoring of rear-attached booms under various illumination and terrain difficulty levels. The aim of the present study was to investigate the effect of illumination, task difficulty, and task level on the mental workload of an individual operating an agricultural sprayer in response to a commercial GPS lightbar, and to explore the sensitivity of the NASA-TLX and SSWAT subjective rating scales in discriminating the subjective experienced workload under various task, illumination, and difficulty levels. Mental workload was measured using performance measures (lateral root mean square error and reaction time), physiological measures (0.1 Hz power of HRV, latency of the P300 component of event-related potential, and eye-glance behavior), and two subjective rating scales (NASA-TLX and SSWAT). Sixteen male university students participated in this experiment, and a fixed-base high-fidelity agricultural tractor simulator was used to create a simulated spraying task. All performance measures, the P300 latency, and subjective rating scales showed a common trend that mental workload increased with the change in illumination from day to night, with task difficulty from low to high, and with task type from single to dual. The 0.1 Hz power of HRV contradicted the performance measures. Eye-glance data showed that under night illumination, participants spent more time looking at the lightbar for guidance information. A similar trend was observed with the change in task type from single to dual. Both subjective rating scales showed a common trend of increasing mental workload with the change in illumination, difficulty, and task levels. However, the SSWAT scale was more sensitive than the NASA-TLX scale. With the change in illumination, difficulty, and task levels, participants spent more mental resources to meet the increased task demand; hence, the
A simplified method for assessing cytotechnologist workload.
Vaickus, Louis J; Tambouret, Rosemary
2014-01-01
Examining cytotechnologist workflow and how it relates to job performance and patient safety is important in determining guidelines governing allowable workloads. This report discusses the development of a software tool that significantly simplifies the process of analyzing cytotechnologist workload while simultaneously increasing the quantity and resolution of the data collected. The program runs in Microsoft Excel and minimizes manual data entry and data transcription by automating as many tasks as is feasible. Data show the cytotechnologists tested were remarkably consistent in the amount of time it took them to screen a cervical cytology (Gyn) or a nongynecologic cytology (Non-Gyn) case and that this amount of time was directly proportional to the number of slides per case. Namely, the time spent per slide did not differ significantly in Gyn versus Non-Gyn cases (216 ± 3.4 seconds and 235 ± 24.6 seconds, respectively; P=.16). There was no significant difference in the amount of time needed to complete a Gyn case between the morning and the evening (314 ± 4.7 seconds and 312 ± 7.1 seconds; P=.39), but a significantly increased time spent screening Non-Gyn cases (slide-adjusted) in the afternoon hours (323 ± 20.1 seconds and 454 ± 67.6 seconds; P=.027), which was largely the result of significantly increased time spent on prescreening activities such as checking the electronic medical record (62 ± 6.9 seconds and 145 ± 36 seconds; P=.006). This Excel-based data collection tool generates highly detailed data in an unobtrusive manner and is highly customizable to the individual working environment and clinical climate. © 2013 American Cancer Society.
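The timing comparisons reported above are essentially two-sample tests on per-case screening durations. As a hedged illustration only, the sketch below runs Welch's t-test on simulated morning versus afternoon times; the values are invented and do not reproduce the study's measurements.

```python
# Welch's t-test on simulated per-case screening times (illustration only).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(8)
morning_s = rng.normal(loc=320, scale=40, size=30)    # simulated seconds per case
afternoon_s = rng.normal(loc=450, scale=60, size=30)

t_stat, p_value = ttest_ind(afternoon_s, morning_s, equal_var=False)
print(f"t = {t_stat:.2f}, P = {p_value:.4f}")
```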
Novel method of measuring the mental workload of anaesthetists during clinical practice.
Byrne, A J; Oliver, M; Bodger, O; Barnett, W A; Williams, D; Jones, H; Murphy, A
2010-12-01
Cognitive overload has been recognized as a significant cause of error in industries such as aviation and measuring mental workload has become a key method of improving safety. The aim of this study was to pilot the use of a new method of measuring mental workload in the operating theatre using a previously published methodology. The mental workload of the anaesthetists was assessed by measuring their response times to a wireless vibrotactile device and the NASA TLX subjective workload score during routine surgical procedures. Primary task workload was inferred from the phase of anaesthesia. Significantly increased response time was associated with the induction phase of anaesthesia compared with maintenance/emergence, non-consultant grade, and during more complex cases. Increased response was also associated with self-reported mental load, physical load, and frustration. These findings are consistent with periods of increased mental workload and with the findings of other studies using similar techniques. These findings confirm the importance of mental workload to the performance of anaesthetists and suggest that increased mental workload is likely to be a common problem in clinical practice. Although further studies are required, the method described may be useful for the measurement of the mental workload of anaesthetists.
"Time Is Not Enough." Workload in Higher Education: A Student Perspective
ERIC Educational Resources Information Center
Kyndt, Eva; Berghmans, Inneke; Dochy, Filip; Bulckens, Lydwin
2014-01-01
Students' workload has been recognised as a major factor in the teaching and learning environment. This paper starts by structuring the different conceptualisations of workload described in the scientific literature. Besides the traditional distinction between objective and subjective or perceived workload, a distinction between conceptualisations…
Construction of a survey to assess workload and fatigue among AMT operators in Mexico.
Hernández Arellano, Juan Luis; Ibarra Mejía, Gabriel; Serratos Pérez, J Nieves; García Alcaraz, Jorge Luis; Brunette, María Julia
2012-01-01
Operators of machinery classified as Advanced Manufacturing Technology (AMT) are exposed to high levels of workload and fatigue. However, only a few studies have been conducted on this topic among workers in Hispanic-American countries. Several instruments can be used to assess workload and fatigue; however, only a few of them have been adapted to the Spanish language. This paper reports on the development and validity testing of a survey instrument in Spanish, aiming to subjectively assess workload and fatigue among AMT operators in Mexico. After an exhaustive literature review in search of already available measurement instruments, they were adapted for content and later translated into Spanish; a pilot test was conducted to evaluate validity and reliability, after which appropriate modifications were made to the testing instruments. The final version of the instrument was applied to a group of 121 operators of CNC lathes. Reliability was analyzed using KMO and Cronbach alpha indices. For the assessment of workload, both the NASA-TLX and ISTAS 21 methods were incorporated into the survey instrument. The fatigue assessment tools were the SOFI-S, FAS, and OFER questionnaires. Results show KMO values and Cronbach alpha above 0.6. In conclusion, the survey instrument as designed allows the collection of reliable and valid data regarding workload and fatigue among AMT operators in Mexico.
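Since the reliability analysis above relies on Cronbach's alpha, a generic illustration of that computation is sketched below for a block of simulated questionnaire items. The respondent count mirrors the 121 operators only for flavor; the data and item structure are assumptions, not the survey's responses.

```python
# Cronbach's alpha for a simulated block of questionnaire items.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of numeric responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(121, 1))                          # one latent trait, 121 respondents
responses = latent + rng.normal(scale=0.8, size=(121, 6))   # six correlated items
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```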
Evaluation of Workload and its Impact on Satisfaction Among Pharmacy Academicians in Southern India
Khan, Muhammad Umair; Srikanth, Akshaya B.; Patel, Isha; Nagappa, Anantha Naik; Jamshed, Shazia Qasim
2015-01-01
Objective The purpose of this study was to determine the level of workload among pharmacy academicians working in public and private sector universities in India. The study also aimed to assess the satisfaction of academicians towards their workload. Materials and Methods A cross-sectional study was conducted for a period of 2 months among pharmacy academicians in Karnataka state of Southern India. Convenience sampling was used to select the sample, which was contacted via email and/or social networking sites. A questionnaire designed through a thorough literature review was used as a tool to collect data on workload (teaching, research, extracurricular services) and satisfaction. Results Of 214 participants, 95 returned the filled questionnaire, giving a response rate of 44.39%. Private sector academicians had a higher teaching load (p=0.046) and appeared to be less involved in research activities (p=0.046) compared with public sector academicians. More than half of the respondents (57.9%) were satisfied with their workload, with Assistant Professors being the least satisfied compared with Professors (p=0.01). Conclusion Overall, private sector academicians are more burdened by teaching load and are also less satisfied with their workload. Revision of private universities' policies may aid in addressing this issue. PMID:26266133
Shakouri, Mahmoud; Ikuma, Laura H; Aghazadeh, Fereydoun; Punniaraj, Karthy; Ishak, Sherif
2014-10-01
This paper investigates the effect of changing work zone configurations and traffic density on performance variables and subjective workload. Data regarding travel time, average speed, maximum percent braking force and location of lane changes were collected by using a full-size driving simulator. The NASA-TLX was used to measure self-reported workload ratings during the driving task. Conventional lane merge (CLM) and joint lane merge (JLM) were modeled in a driving simulator, and thirty participants (seven female and 23 male) navigated through the two configurations with two levels of traffic density. The mean maximum braking force was 34% lower in the JLM configuration, and drivers going through the JLM configuration remained in the closed lane longer. However, no significant differences in speed were found between the two merge configurations. The analysis of self-reported workload ratings shows that participants reported 15.3% lower total workload when driving through the JLM. In conclusion, the implemented changes in the JLM make it a more favorable merge configuration in both high and low traffic densities in terms of optimizing traffic flow, by increasing the time and distance over which cars use both lanes, and in terms of improving safety, due to lower braking forces and lower reported workload. Copyright © 2014 Elsevier Ltd. All rights reserved.
Using Psychophysiological Sensors to Assess Mental Workload During Web Browsing
Jimenez-Molina, Angel; Retamal, Cristian; Lira, Hernan
2018-01-01
Knowledge of the mental workload induced by a Web page is essential for improving users’ browsing experience. However, continuously assessing the mental workload during a browsing task is challenging. To address this issue, this paper leverages the correlation between stimuli and physiological responses, which are measured with high-frequency, non-invasive psychophysiological sensors during very short span windows. An experiment was conducted to identify levels of mental workload through the analysis of pupil dilation measured by an eye-tracking sensor. In addition, a method was developed to classify mental workload by appropriately combining different signals (electrodermal activity (EDA), electrocardiogram, photoplethysmography (PPG), electroencephalogram (EEG), temperature and pupil dilation) obtained with non-invasive psychophysiological sensors. The results show that the Web browsing task involves four levels of mental workload. Also, by combining all the sensors, the efficiency of the classification reaches 93.7%. PMID:29401688
Baethge, Anja; Müller, Andreas; Rigotti, Thomas
2016-03-01
The aim of this study was to investigate whether selective optimization with compensation constitutes an individualized action strategy for nurses wanting to maintain job performance under high workload. High workload is a major threat to healthcare quality and performance. Selective optimization with compensation is considered to enhance the efficient use of intra-individual resources and, therefore, is expected to act as a buffer against the negative effects of high workload. The study applied a diary design. Over five consecutive workday shifts, self-report data on workload were collected at three randomized occasions during each shift. Self-reported job performance was assessed in the evening. Self-reported selective optimization with compensation was assessed prior to the diary reporting. Data were collected in 2010. Overall, 136 nurses from 10 German hospitals participated. Selective optimization with compensation was assessed with a nine-item scale that was specifically developed for nursing. The NASA-TLX scale indicating the pace of task accomplishment was used to measure workload. Job performance was assessed with one item each concerning performance quality and forgetting of intentions. There was a weaker negative association between workload and both indicators of job performance in nurses with a high level of selective optimization with compensation, compared with nurses with a low level. Considering the separate strategies, selection and compensation turned out to be effective. The use of selective optimization with compensation is conducive to nurses' job performance under high workload levels. This finding is in line with calls to empower nurses' individual decision-making. © 2015 John Wiley & Sons Ltd.
DCMS: A data analytics and management system for molecular simulation.
Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni
Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, experiments generate data on a very large number of atoms whose spatial and temporal relationships must be observed for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges on data access, management, and analysis. To date, existing MS software systems fall short on the storage and handling of MS data, mainly because of the lack of a platform that supports applications involving intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed over the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be of interest to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We have also used it as a platform to test other data management issues such as security and compression.
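As an illustration of the database-centric idea, the following is a minimal sketch of the kind of declarative analytical query such a design enables. The paper's prototype uses PostgreSQL with custom indexes and in-DBMS functions; the sketch below instead uses Python's standard-library SQLite, and the table layout (frame, atom_id, x, y, z) and the per-frame pairwise-distance query are assumptions for illustration only.

```python
# Hedged sketch: store trajectory frames in a relational table and ask an
# analytical question (distance between two atoms per frame) declaratively.
import math
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE atoms (frame INTEGER, atom_id INTEGER, x REAL, y REAL, z REAL)")
conn.executemany(
    "INSERT INTO atoms VALUES (?, ?, ?, ?, ?)",
    [(0, 1, 0.0, 0.0, 0.0), (0, 2, 1.5, 0.0, 0.0),
     (1, 1, 0.1, 0.0, 0.0), (1, 2, 1.6, 0.1, 0.0)],
)

# Squared distance computed in SQL; square root taken in Python for portability.
query = """
SELECT a.frame,
       (a.x-b.x)*(a.x-b.x) + (a.y-b.y)*(a.y-b.y) + (a.z-b.z)*(a.z-b.z) AS d2
FROM atoms a JOIN atoms b ON a.frame = b.frame
WHERE a.atom_id = 1 AND b.atom_id = 2
ORDER BY a.frame
"""
for frame, d2 in conn.execute(query):
    print(frame, round(math.sqrt(d2), 3))
```

In DCMS itself such queries would be accelerated by the custom index structures and co-processor algorithms described in the abstract; the sketch only shows the declarative interface.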
Data-Driven Planning: Using Assessment in Strategic Planning
ERIC Educational Resources Information Center
Bresciani, Marilee J.
2010-01-01
Data-driven planning or evidence-based decision making represents nothing new in its concept. For years, business leaders have claimed they have implemented planning informed by data that have been strategically and systematically gathered. Within higher education and student affairs, there may be less evidence of the actual practice of…
Data-driven risk identification in phase III clinical trials using central statistical monitoring.
Timmermans, Catherine; Venet, David; Burzykowski, Tomasz
2016-02-01
Our interest lies in quality control for clinical trials, in the context of risk-based monitoring (RBM). We specifically study the use of central statistical monitoring (CSM) to support RBM. Under an RBM paradigm, we claim that CSM has a key role to play in identifying the "risks to the most critical data elements and processes" that will drive targeted oversight. In order to support this claim, we first consider how to characterize the risks that may affect clinical trials. We then discuss how CSM can be understood as a tool for providing a set of data-driven key risk indicators (KRIs), which help to organize adaptive targeted monitoring. Several case studies are provided in which issues in a clinical trial were identified through targeted investigation after a risk had been flagged by CSM. Using CSM to build data-driven KRIs helps to identify different kinds of issues in clinical trials. This ability is directly linked to the exhaustiveness of the CSM approach and its flexibility in defining the risks that are searched for when identifying the KRIs. In practice, a CSM assessment of the clinical database seems essential to ensure data quality. The atypical data patterns found in some centers and variables are treated as KRIs under an RBM approach. Targeted monitoring or data management queries can then be used to confirm whether the KRIs point to an actual issue or not.
Street, A; Strong, J; Karp, S
2001-01-01
One of the most frequently cited reasons for poor recruitment to multicentre randomized clinical trials is the additional workload placed on clinical staff. We report the effect on patient recruitment of employing a data manager to support clinical staff in an English district general hospital (DGH). In addition, we explore the effect data managers have on the quality of data collected, proxied by the number of queries arising with the trial organizers. We estimate that the cost of employing a data manager on a full-time basis is £502 per patient recruited but may amount to £326 if the appointment is part-time. Data quality is high when full responsibility lies with a data manager but falls when responsibility is shared. Whether the costs of employing a data manager to recruit patients from a DGH are worth incurring depends on the value placed on the speed at which multicentre trials can be completed, how important it is to broaden the research base beyond the traditional setting of teaching hospitals, and the amount of evaluative data required.
Motakis, E S; Nason, G P; Fryzlewicz, P; Rutter, G A
2006-10-15
Many standard statistical techniques are effective on data that are normally distributed with constant variance. Microarray data typically violate these assumptions since they come from non-Gaussian distributions with a non-trivial mean-variance relationship. Several methods have been proposed that transform microarray data to stabilize variance and draw its distribution towards the Gaussian. Some methods, such as log or generalized log, rely on an underlying model for the data. Others, such as the spread-versus-level plot, do not. We propose an alternative data-driven multiscale approach, called the Data-Driven Haar-Fisz for microarrays (DDHFm) with replicates. DDHFm has the advantage of being 'distribution-free' in the sense that no parametric model for the underlying microarray data is required to be specified or estimated; hence, DDHFm can be applied very generally, not just to microarray data. DDHFm achieves very good variance stabilization of microarray data with replicates and produces transformed intensities that are approximately normally distributed. Simulation studies show that it performs better than other existing methods. Application of DDHFm to real one-color cDNA data validates these results. The R package of the Data-Driven Haar-Fisz transform (DDHFm) for microarrays is available in Bioconductor and CRAN.
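The DDHFm transform itself is distributed as an R package (Bioconductor/CRAN), so it is not reproduced here. As background, the following is a minimal sketch of the simpler model-based baselines the abstract contrasts it with, the log and generalized-log transforms, which aim to flatten a mean-variance relationship; the toy data and the tuning constant c are assumptions for illustration.

```python
# Hedged sketch: variance stabilization with log-type transforms on toy
# intensity data whose standard deviation grows with the mean.
import numpy as np

def glog(x, c=1.0):
    """Generalized log transform; c is an illustrative tuning constant."""
    return np.log((x + np.sqrt(x**2 + c**2)) / 2.0)

rng = np.random.default_rng(0)
mu = rng.uniform(10, 1000, size=5000)
intensities = rng.normal(loc=mu, scale=0.1 * mu)   # sd proportional to mean

low, high = mu < 100, mu > 500
print("raw  sd ratio (high/low mean):", round(intensities[high].std() / intensities[low].std(), 2))
print("glog sd ratio (high/low mean):", round(glog(intensities)[high].std() / glog(intensities)[low].std(), 2))
```

DDHFm pursues the same goal without assuming a parametric mean-variance model, estimating the relationship from the replicate data themselves.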
Creation of a Book Order Management System Using a Microcomputer and a DBMS.
ERIC Educational Resources Information Center
Neill, Charlotte; And Others
1985-01-01
Describes management decisions and resultant technology-based system that allowed a medical library to meet increasing workloads without accompanying increases in resources available. Discussion covers system analysis; capabilities of book-order management system, "BOOKDIRT;" software and training; hardware; data files; data entry;…
CMS Readiness for Multi-Core Workload Scheduling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.
In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016 is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.
CMS readiness for multi-core workload scheduling
NASA Astrophysics Data System (ADS)
Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.; Aftab Khan, F.; Letts, J.; Mason, D.; Verguilov, V.
2017-10-01
In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016 is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.
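To make the partitionable-pilot idea concrete, here is a toy sketch (not the actual HTCondor/GlideinWMS logic) of how a pilot advertising a fixed number of cores can be dynamically carved into sub-slots so that single-core and multi-core jobs are scheduled side by side; the job names and core counts are invented.

```python
# Hedged toy sketch: first-fit packing of mixed single/multi-core jobs into
# one partitionable pilot slot of a given core count.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cores: int

def fill_pilot(total_cores, queue):
    """Return (scheduled jobs, jobs left in queue, idle cores) for one pilot."""
    free = total_cores
    scheduled, pending = [], []
    for job in queue:
        if job.cores <= free:
            scheduled.append(job)
            free -= job.cores
        else:
            pending.append(job)
    return scheduled, pending, free

queue = [Job("reco_multicore", 8), Job("analysis_1", 1), Job("sim_multicore", 4),
         Job("analysis_2", 1), Job("reco_multicore_2", 8)]
scheduled, pending, idle = fill_pilot(total_cores=16, queue=queue)
print("scheduled:", [j.name for j in scheduled], "| idle cores:", idle)
print("left in queue:", [j.name for j in pending])
```

The real system adds many refinements (pilot lifetimes, draining, priorities, fair share across sites), but the core scheduling question, how to keep a multi-core slot busy with a heterogeneous job mix, is the one sketched above.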
Overview of ATLAS PanDA Workload Management
NASA Astrophysics Data System (ADS)
Maeno, T.; De, K.; Wenaus, T.; Nilsson, P.; Stewart, G. A.; Walker, R.; Stradling, A.; Caballero, J.; Potekhin, M.; Smith, D.; ATLAS Collaboration
2011-12-01
The Production and Distributed Analysis System (PanDA) plays a key role in the ATLAS distributed computing infrastructure. All ATLAS Monte-Carlo simulation and data reprocessing jobs pass through the PanDA system. We will describe how PanDA manages job execution on the grid using dynamic resource estimation and data replication together with intelligent brokerage in order to meet the scaling and automation requirements of ATLAS distributed computing. PanDA is also the primary ATLAS system for processing user and group analysis jobs, bringing further requirements for quick, flexible adaptation to the rapidly evolving analysis use cases of the early data-taking phase, in addition to the high reliability, robustness and usability needed to provide efficient and transparent utilization of the grid for analysis users. We will describe how PanDA meets ATLAS requirements, the evolution of the system in light of operational experience, how the system has performed during the first LHC data-taking phase and plans for the future.
Overview of ATLAS PanDA Workload Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maeno T.; De K.; Wenaus T.
2011-01-01
The Production and Distributed Analysis System (PanDA) plays a key role in the ATLAS distributed computing infrastructure. All ATLAS Monte-Carlo simulation and data reprocessing jobs pass through the PanDA system. We will describe how PanDA manages job execution on the grid using dynamic resource estimation and data replication together with intelligent brokerage in order to meet the scaling and automation requirements of ATLAS distributed computing. PanDA is also the primary ATLAS system for processing user and group analysis jobs, bringing further requirements for quick, flexible adaptation to the rapidly evolving analysis use cases of the early data-taking phase, in addition to the high reliability, robustness and usability needed to provide efficient and transparent utilization of the grid for analysis users. We will describe how PanDA meets ATLAS requirements, the evolution of the system in light of operational experience, how the system has performed during the first LHC data-taking phase and plans for the future.
ERIC Educational Resources Information Center
Gonzalez, Sylvia; Bernard, Hinsdale
2006-01-01
The focus of this investigation was to determine the possible relationship of workload typologies and other selected demographic variables to levels of burnout among full-time faculty in Seventh-day Adventist colleges and universities in North America. Four typologies of academic workload emerged from the study of the data. The results revealed…
Data-driven modeling, control and tools for cyber-physical energy systems
NASA Astrophysics Data System (ADS)
Behl, Madhur
inverse model accuracy and control performance, which can be used to make informed decisions about sensor requirements and data accuracy. We also present DR-Advisor, a data-driven demand response recommender system for the building's facilities manager, which provides suitable control actions to meet the desired load curtailment while maintaining operations and maximizing the economic reward. We develop a model-based control with regression trees (mbCRT) algorithm, which allows us to perform closed-loop control for DR strategy synthesis for large commercial buildings. Our data-driven control synthesis algorithm outperforms rule-based demand response methods for a large DoE commercial reference building and leads to a significant amount of load curtailment (380 kW) and over $45,000 in savings, which is 37.9% of the summer energy bill for the building. The performance of DR-Advisor is also evaluated for 8 buildings on Penn's campus, where it achieves 92.8% to 98.9% prediction accuracy. We also compare DR-Advisor with other data-driven methods and rank 2nd on ASHRAE's benchmarking data set for energy prediction.
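The following is a minimal sketch in the spirit of the regression-tree-based control idea, not the authors' implementation: fit a tree that maps weather, occupancy and a controllable setpoint to building load, then scan candidate setpoints and recommend the one with the lowest predicted load during a demand-response event. All feature names, units and data below are invented for illustration.

```python
# Hedged sketch: regression-tree surrogate of building load used to pick a
# DR control action (setpoint) under current conditions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 2000
outdoor_temp = rng.uniform(20, 35, n)          # deg C
occupancy = rng.uniform(0, 1, n)               # fraction of full occupancy
setpoint = rng.uniform(21, 26, n)              # controllable variable (deg C)
load_kw = 200 + 15 * (outdoor_temp - setpoint) + 120 * occupancy + rng.normal(0, 10, n)

X = np.column_stack([outdoor_temp, occupancy, setpoint])
tree = DecisionTreeRegressor(max_depth=6).fit(X, load_kw)

# "Strategy synthesis": evaluate candidate setpoints for the current conditions.
current_temp, current_occ = 33.0, 0.8
candidates = np.arange(21.0, 26.5, 0.5)
preds = tree.predict(np.column_stack([np.full_like(candidates, current_temp),
                                      np.full_like(candidates, current_occ),
                                      candidates]))
best = candidates[np.argmin(preds)]
print(f"recommended setpoint: {best} C, predicted load: {preds.min():.0f} kW")
```

The published mbCRT work additionally closes the loop over time and handles comfort and operational constraints; the sketch only conveys the "learn a tree, then optimize over the controllable input" pattern.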
Workshop on Workload and Training, and Examination of their Interactions: Executive summary
NASA Technical Reports Server (NTRS)
Donchin, Emanuel; Hart, Sandra G.; Hartzell, Earl J.
1987-01-01
The goal of the workshop was to bring together experts in the fields of workload and training and representatives from the Dept. of Defense and industrial organizations who are responsible for specifying, building, and managing advanced, complex systems. The challenging environments and requirements imposed by military helicopter missions and space station operations were presented as the focus for the panel discussions. The workshop permitted a detailed examination of the theoretical foundations of the fields of training and workload, as well as their practical applications. Furthermore, it created a forum where government, industry, and academic experts were able to examine each other's concepts, values, and goals. The discussions pointed out the necessity for a more efficient and effective flow of information among the groups represented. The executive summary describes the rationale of the meeting, summarizes the primary points of discussion, and lists the participants and some of their summary comments.
Measurement of Workload: Physics, Psychophysics, and Metaphysics
NASA Technical Reports Server (NTRS)
Gopher, D.
1984-01-01
The present paper reviews the results of two experiments in which workload analysis was conducted based upon performance measures, brain evoked potentials and magnitude estimations of subjective load. The three types of measures were jointly applied to the description of the behavior of subjects in a wide battery of experimental tasks. Data analysis shows both instances of association and dissociation between types of measures. A general conceptual framework and methodological guidelines are proposed to account for these findings.
Weigl, Matthias; Antoniadis, Sophia; Chiapponi, Costanza; Bruns, Christiane; Sevdalis, Nick
2015-01-01
Surgeons' intra-operative workload is critical for effective and safe surgical performance. Detrimental conditions in the operating room (OR) environment may add to perceived workload and jeopardize surgical performance and outcomes. This study aims to evaluate the impact of different intra-operative workflow interruptions on surgeons' capacity to manage their workload safely and efficiently. This was an observational study of intra-operative interruptions and self-rated workload in two surgical specialties (general, orthopedic/trauma surgery). Intra-operative interruptions were assessed via expert observation using a well-validated observation tool. Surgeons, nurses, and anesthesiologists assessed their intra-operative workload directly after case completion based on three items of the validated Surgery Task Load Index (mental demand, situational stress, distraction). A total of 56 elective cases (35 open, 21 laparoscopic) with 94 workload ratings were included. Mean intra-operative duration was 1 h 37 min. Intra-operative interruptions were on average observed 9.78 times per hour. People who entered/exited the OR (30.6%) as well as telephone-/beeper-related disruptions (23.6%) occurred most often. Equipment and OR environment-related interruptions were associated with highest interference with team functioning particularly in laparoscopic procedures. After identifying task and procedural influences, partial correlational analyses revealed that case-irrelevant communications were negatively associated with surgeons' mental fatigue and situational stress, whereas surgeons' reported distraction was increased by case-irrelevant communication and procedural disruptions. OR nurses' and anesthesiologists' perceived workload was also related to intra-operative interruption events. Our study documents the unique contribution of different interruptions on surgeons' workload; whereas case-irrelevant communications may be beneficial for mental fatigue and stress in routine
Jørgensen, Marie Birk; Nabe-Nielsen, Kirsten; Clausen, Thomas; Holtermann, Andreas
2013-03-15
Prospective cohort study. To investigate the independent effects of physical workload and childhood socioeconomic status (CSES) on low back pain (LBP) and LBP-related sickness absence among female health care workers. The role of physical workload in LBP, independently of CSES, is still subject to controversy. We used questionnaire data from 1661 female social and health care workers responding to a questionnaire in 2004, 2005, and 2006. We collected information on CSES (parental occupation), physical workload, LBP prevalence (no LBP, subchronic LBP, and frequent LBP), and LBP-related sickness absence. The participants were categorized into 5 groups according to CSES (I = highest, V = lowest). Data were analyzed using logistic regression analysis. Irrespective of CSES, high physical workload increased the odds ratio (OR) of future subchronic LBP (OR = 2.03; 95% confidence interval [CI], 1.61-2.57) and frequent LBP (OR = 2.20; 95% CI, 1.65-3.00), but not of LBP-related sickness absence. The odds of subchronic LBP were lower in CSES groups II (OR = 0.62; 95% CI, 0.42-0.93) and III (OR = 0.58; 95% CI, 0.39-0.86) than in CSES group I, irrespective of physical workload. The odds of short-term LBP-related sickness absence were higher in CSES groups III (OR = 2.78; 95% CI, 1.41-5.47) and IV (OR = 2.18; 95% CI, 1.11-4.27) than in CSES group I, irrespective of physical workload. We found no interaction between physical workload and CSES. Physical workload and CSES are independently associated with future LBP within a group with similar occupational status. N/A.
Effect of block weight on work demands and physical workload during masonry work.
Van Der Molen, H F; Kuijer, P P F M; Hopmans, P P W; Houweling, A G; Faber, G S; Hoozemans, M J M; Frings-Dresen, M H W
2008-03-01
The effect of block weight on work demands and physical workload was determined for masons who laid sandstone building blocks over the course of a full work day. Three groups of five sandstone block masons participated. Each group worked with a different block weight: 11 kg, 14 kg or 16 kg. Productivity and durations of tasks and activities were assessed through real time observations at the work site. Energetic workload was also assessed through monitoring the heart rate and oxygen consumption at the work site. Spinal load of the low back was estimated by calculating the cumulated elastic energy stored in the lumbar spine using durations of activities and previous data on corresponding compression forces. Block weight had no effect on productivity, duration or frequency of tasks and activities, energetic workload or cumulative spinal load. Working with any of the block weights exceeded exposure guidelines for work demands and physical workload. This implies that, regardless of block weight in the range of 11 to 16 kg, mechanical lifting equipment or devices to adjust work height should be implemented to substantially lower the risk of low back injuries.
Pilot Mental Workload with Predictive System Status Information
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.
1998-01-01
Research has shown a strong pilot preference for predictive information about aircraft system status on the flight deck. However, the mental workload associated with using this predictive information has not been ascertained. The study described here attempted to measure that mental workload. In this simulator experiment, three types of predictive information (none, whether a parameter was changing abnormally, and the time for a parameter to reach an alert range) and four initial times to a parameter alert range (1 minute, 5 minutes, 15 minutes, and ETA+45 minutes) were tested to determine their effects on subjects' mental workload. Subjective workload ratings increased with increasing predictive information (whether a parameter was changing abnormally or the time for a parameter to reach an alert range). Subjective situation awareness decreased with more predictive information but became greater with increasing initial times to a parameter alert range. Also, subjective focus changed depending on the type of predictive information. Lastly, skin temperature fluctuated less as the initial time to a parameter alert range increased.
Workload Characterization of a Leadership Class Storage Cluster
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Youngjae; Gunasekaran, Raghul; Shipman, Galen M
2010-01-01
Understanding workload characteristics is critical for optimizing and improving the performance of current systems and software, and for architecting new storage systems based on observed workload patterns. In this paper, we characterize the scientific workloads of the world's fastest HPC (High Performance Computing) storage cluster, Spider, at the Oak Ridge Leadership Computing Facility (OLCF). Spider provides an aggregate bandwidth of over 240 GB/s with over 10 petabytes of RAID 6 formatted capacity. OLCF's flagship petascale simulation platform, Jaguar, and other large HPC clusters, with over 250 thousand compute cores in total, depend on Spider for their I/O needs. We characterize the system utilization, the demands of reads and writes, idle time, and the distribution of read requests to write requests for the storage system observed over a period of 6 months. From this study we develop synthesized workloads, and we show that the read and write I/O bandwidth usage as well as the inter-arrival time of requests can be modeled as a Pareto distribution.
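As a minimal illustration of the modeling step described at the end of the abstract, the sketch below fits a Pareto distribution to request inter-arrival times. The data are synthetic, not the Spider traces, and the parameter values are arbitrary; only the fitting technique is the point.

```python
# Hedged sketch: fit a Pareto distribution to (synthetic) request
# inter-arrival times and sanity-check the tail against the empirical data.
import numpy as np
from scipy import stats

inter_arrival = stats.pareto.rvs(b=1.8, scale=0.05, size=10_000, random_state=42)

# Fit shape (b) and scale, keeping location fixed at 0 as is natural for durations.
b, loc, scale = stats.pareto.fit(inter_arrival, floc=0)
print(f"fitted tail index b = {b:.2f}, scale = {scale:.3f}")

# Crude goodness check: compare empirical and fitted 99th percentiles.
print("empirical p99:", np.quantile(inter_arrival, 0.99))
print("fitted    p99:", stats.pareto.ppf(0.99, b, loc=loc, scale=scale))
```

A synthetic workload generator would then draw inter-arrival times and request sizes from the fitted distributions to replay a statistically similar load against a new storage design.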
Examining Data-Driven Decision Making in Private/Religious Schools
ERIC Educational Resources Information Center
Hanks, Jason Edward
2011-01-01
The purpose of this study was to investigate non-mandated data-driven decision making in private/religious schools. The school culture support of data use, teacher use of data, leader facilitation of using data, and the availability of data were investigated in three schools. A quantitative survey research design was used to explore the research…
Student Burnout as a Function of Personality, Social Support, and Workload.
ERIC Educational Resources Information Center
Jacobs, Sheri R.; Dodd, David K.
2003-01-01
Measures of social support, personality, and workload were related to psychological burnout among 149 college students. High levels of burnout were predicted by negative temperament and subjective workload, but actual workload (academic and vocational) had little to do with burnout. Low levels of burnout were predicted by positive temperament,…
Data-driven train set crash dynamics simulation
NASA Astrophysics Data System (ADS)
Tang, Zhao; Zhu, Yunrui; Nie, Yinyu; Guo, Shihui; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2017-02-01
Traditional finite element (FE) methods are computationally expensive for simulating train crashes. The high computational cost limits their direct application to investigating the dynamic behaviour of an entire train set for crashworthiness design and structural optimisation. In contrast, multi-body modelling is widely used because of its low computational cost, with a trade-off in accuracy. In this study, a data-driven train crash modelling method is proposed to improve the performance of multi-body dynamics simulations of train set crashes without increasing the computational burden. This is achieved with a parallel random forest algorithm, a machine learning approach that extracts useful patterns from force-displacement curves and predicts a force-displacement relation for a given collision condition from a collection of offline FE simulation data covering various collision conditions, namely different crash velocities in our analysis. Using the FE simulation results as a benchmark, we compared our method with traditional multi-body modelling methods; the results show that our data-driven method improves the accuracy of train crash simulation over traditional multi-body models while running at the same level of efficiency.
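The sketch below illustrates the surrogate-modelling idea in the abstract (not the authors' model or their FE data): learn force as a function of displacement and impact velocity from a handful of "offline" simulated curves, then query the learned relation at a velocity that was never simulated. The toy force law and all numbers are invented.

```python
# Hedged sketch: random-forest surrogate of a force-displacement relation
# parameterized by crash velocity, trained on synthetic "offline FE" curves.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
velocities = np.array([10.0, 15.0, 20.0, 25.0])            # m/s, offline runs
rows, forces = [], []
for v in velocities:
    disp = np.linspace(0, 1.0, 200)                         # crush displacement (m)
    force = (300 + 20 * v) * np.tanh(5 * disp) + rng.normal(0, 5, disp.size)  # kN, toy law
    rows.append(np.column_stack([disp, np.full_like(disp, v)]))
    forces.append(force)

X, y = np.vstack(rows), np.concatenate(forces)
model = RandomForestRegressor(n_estimators=200, n_jobs=-1).fit(X, y)

# Query the surrogate at an unseen collision condition (18 m/s).
disp_query = np.linspace(0, 1.0, 5)
pred = model.predict(np.column_stack([disp_query, np.full_like(disp_query, 18.0)]))
print(np.round(pred, 1))
```

In the paper's pipeline, the predicted force-displacement relation would then feed the multi-body crash model in place of an expensive online FE evaluation.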
Automation - Changes in cognitive demands and mental workload
NASA Technical Reports Server (NTRS)
Tsang, Pamela S.; Johnson, Walter W.
1987-01-01
The effect of partial automation on mental workload in man/machine tasks is investigated experimentally. Subjective workload measures are obtained from six subjects after performance of a task battery comprising two manual tasks (flight-path control, FC, and target acquisition, TA) and one decision-making task (engine failure, EF); the FC task was performed both in a fully manual mode (altitude and lateral control) and in a semiautomated mode (automatic lateral control). The performance results and subjective evaluations are presented in graphs and characterized in detail. The automation is shown to improve objective performance and to lower subjective workload significantly in the combined FC/TA task, but not in the FC task alone or in the FC/EF task.
Strayer, David L; Cooper, Joel M; Turrill, Jonna; Coleman, James R; Hopman, Rachel J
2017-06-01
The goal of this research was to examine the impact of voice-based interactions using 3 different intelligent personal assistants (Apple's Siri, Google's Google Now for Android phones, and Microsoft's Cortana) on the cognitive workload of the driver. In 2 experiments using an instrumented vehicle on suburban roadways, we measured the cognitive workload of drivers when they used the voice-based features of each smartphone to place a call, select music, or send text messages. Cognitive workload was derived from primary task performance through video analysis, secondary-task performance using the Detection Response Task (DRT), and subjective mental workload. We found that workload was significantly higher than that measured in the single-task drive. There were also systematic differences between the smartphones: the Google system placed lower cognitive demands on the driver than the Apple and Microsoft systems, which did not differ. Video analysis revealed that the difference in mental workload between the smartphones was associated with the number of system errors, the time to complete an action, and the complexity and intuitiveness of the devices. Finally, surprisingly high levels of cognitive workload were observed when drivers were interacting with the devices: "on-task" workload measures did not systematically differ from those associated with a mentally demanding Operation Span (OSPAN) task. The analysis also found residual costs associated with using each of the smartphones that took a significant time to dissipate. The data suggest that caution is warranted in the use of smartphone voice-based technology in the vehicle because of the high levels of cognitive workload associated with these interactions. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Fang, Michele; Linson, Eric; Suneja, Manish; Kuperman, Ethan F
2017-02-22
Excellence in Graduate Medical Education requires the right clinical environment, with an appropriate workload in which residents see enough patients to gain proficiency in medicine while having optimal time for reflection. The Accreditation Council for Graduate Medical Education (ACGME) has focused more on work hours than on workload; however, high resident workload has been associated with lower resident participation in education and with fatigue-related errors. Recognizing the potential risks associated with high resident workload, and mindful of the costs of reducing it, we sought to reduce residents' workload by adding an advanced practice provider (APP) to the surgical comanagement service (SCM) and to study the effect on resident satisfaction and the perceived educational value of the rotation. In Fiscal Years (FY) 2014 and 2015, an additional faculty member was added to the SCM rotation: in FY 2014 a staff physician, and in FY 2015 an APP. Resident workload was assessed using billing data. We measured residents' perceptions of the rotation using an anonymous electronic survey tool and compared FY2014-2015 data to the FY2013 baseline. The number of patients seen per resident per day decreased from 8.0 (SD 3.3) in FY2013 to 5.0 (SD 1.9) in FY2014 (p < 0.001) and 5.7 (SD 2.0) in FY2015 (p < 0.001). A higher proportion of residents reported "just right" patient volume (64.4%, 91.7%, 96.7% in FY2013, 2014, 2015 respectively, p < 0.001), meeting curricular goals (79.9%, 95.0%, 97.2% in FY2013, 2014 and 2015 respectively, p < 0.001), and overall educational value of the rotation (40.0%, 72.2%, 72.6% in FY2013, 2014, 2015 respectively, p < 0.001). Decreasing resident workload by adding clinical faculty (both staff physicians and APPs) was associated with improvements in residents' perceived educational value and clinical experience of a medical consultation rotation.
Enhancing Extensive Reading with Data-Driven Learning
ERIC Educational Resources Information Center
Hadley, Gregory; Charles, Maggie
2017-01-01
This paper investigates using data-driven learning (DDL) as a means of stimulating greater lexicogrammatical knowledge and reading speed among lower proficiency learners in an extensive reading program. For 16 weekly 90-minute sessions, an experimental group (12 students) used DDL materials created from a corpus developed from the Oxford Bookworms…
Planning Training Workload in Football Using Small-Sided Games' Density.
Sangnier, Sebastien; Cotte, Thierry; Brachet, Olivier; Coquart, Jeremy; Tourny, Claire
2018-05-08
Sangnier, S, Cotte, T, Brachet, O, Coquart, J, and Tourny, C. Planning training workload in football using small-sided games' density. J Strength Cond Res XX(X): 000-000, 2018 - To develop physical qualities, the density of small-sided games (SSGs) may be essential in soccer. Small-sided games are games in which the pitch size, number of players, and rules differ from those of traditional soccer matches. The purpose was to assess the relation between training workload and SSG density. The 33 density values (41 practice games and 3 full games) were analyzed through global positioning system (GPS) data collected from 25 professional soccer players (80.7 ± 7.0 kg; 1.83 ± 0.05 m; 26.4 ± 4.9 years). Based on total distance, metabolic power distance, sprint distance, and acceleration distance, the GPS data were divided into 4 categories: endurance, power, speed, and strength. The statistical analysis examined the relation between GPS values and SSG densities, and 3 methods were applied to assess the models (R-squared, root-mean-square error, and Akaike information criterion). The results suggest that all the GPS metrics match the players' essential athletic skills and were all correlated with the game's density. Acceleration distance, deceleration distance, metabolic power, and total distance followed a logarithmic regression model, whereas sprint distance and the number of sprints followed a linear regression model. The research reveals options for monitoring the training workload. Coaches could anticipate the load resulting from SSGs and adjust the field size to the number of players. Taking the field size into account during SSGs enables coaches to target the most favorable density for developing the expected physical qualities. Calibrating intensity during SSGs would allow coaches to assess each athletic skill under the same intensity conditions as in competition.
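The model-comparison step described above can be sketched very simply: fit one linear and one logarithmic model of a workload metric against SSG density and compare R-squared values. The numbers, the assumed density unit, and the toy metric below are illustrative only, not the study's GPS data.

```python
# Hedged sketch: compare linear vs. logarithmic fits of one GPS workload
# metric against SSG density, mirroring the abstract's model comparison.
import numpy as np

density = np.linspace(50, 350, 30)                       # m^2 per player (assumed unit)
rng = np.random.default_rng(4)
total_distance = 40 * np.log(density) + rng.normal(0, 5, density.size)   # toy metric

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

lin = np.polyfit(density, total_distance, 1)             # y = a*x + b
log = np.polyfit(np.log(density), total_distance, 1)     # y = a*ln(x) + b
print("R2 linear :", round(r_squared(total_distance, np.polyval(lin, density)), 3))
print("R2 log    :", round(r_squared(total_distance, np.polyval(log, np.log(density))), 3))
```

The study additionally compares models with root-mean-square error and the Akaike information criterion; the same fitted predictions feed those criteria directly.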
The relationship between physical workload and quality within line-based assembly.
Ivarsson, Anna; Eek, Frida
2016-07-01
Reducing costs and improvement of product quality are considered important to ensure productivity within a company. Quality deviations during production processes and ergonomics have previously shown to be associated. This study explored the relationship between physical workload and real (found during production processes) and potential (need of extra time and assistance to complete tasks) quality deviations in a line-based assembly plant. The physical workload on and the work rotation between 52 workstations were assessed. As the outcome, real and potential quality deviations were studied during 10 weeks. Results show that workstations with higher physical workload had significantly more real deviations compared to lower workload stations. Static work posture had significantly more potential deviations. Rotation between high and low workload was related to fewer quality deviations compared to rotation between only high workload stations. In conclusion, physical ergonomics seems to be related to real and potential quality deviation within line-based assembly. Practitioner Summary: To ensure good productivity in manufacturing industries, it is important to reduce costs and improve product quality. This study shows that high physical workload is associated with quality deviations and need of extra time and assistance to complete tasks within line-based assembly, which can be financially expensive for a company.
Exploring the Utility of Workload Models in Academe: A Pilot Study
ERIC Educational Resources Information Center
Boyd, Leanne
2014-01-01
The workload of academics in Australia is increasing. Among the potential ramifications of this are work-related stress and burnout. Unions have negotiated workload models in employment agreements as a means of distributing workload in a fair and transparent manner. This qualitative pilot study aimed to explore how academics perceive their current…
Data-driven approaches in the investigation of social perception
Adolphs, Ralph; Nummenmaa, Lauri; Todorov, Alexander; Haxby, James V.
2016-01-01
The complexity of social perception poses a challenge to traditional approaches to understand its psychological and neurobiological underpinnings. Data-driven methods are particularly well suited to tackling the often high-dimensional nature of stimulus spaces and of neural representations that characterize social perception. Such methods are more exploratory, capitalize on rich and large datasets, and attempt to discover patterns often without strict hypothesis testing. We present four case studies here: behavioural studies on face judgements, two neuroimaging studies of movies, and eyetracking studies in autism. We conclude with suggestions for particular topics that seem ripe for data-driven approaches, as well as caveats and limitations. PMID:27069045
EFFECTIVE INDICES FOR MONITORING MENTAL WORKLOAD WHILE PERFORMING MULTIPLE TASKS.
Hsu, Bin-Wei; Wang, Mao-Jiun J; Chen, Chi-Yuan; Chen, Fang
2015-08-01
This study identified several physiological indices that can accurately monitor mental workload while participants performed multiple tasks under the strategy of maintaining stable performance and maximizing accuracy. Thirty male participants completed three 10-min simulated multitasks: MATB (Multi-Attribute Task Battery) with three workload levels. Twenty-five commonly used mental workload measures were collected, including heart rate, 12 HRV (heart rate variability) indices, 10 EEG (electroencephalography) indices (α, β, θ, α/θ, θ/β from O1-O2 and F4-C4), and two subjective measures. Analyses of index sensitivity showed that two EEG indices, θ and α/θ (F4-C4), one time-domain HRV index, SDNN (standard deviation of inter-beat intervals), and four frequency-domain HRV indices, VLF (very low frequency), LF (low frequency), %HF (percentage of high frequency), and LF/HF, were sensitive in differentiating high workload. EEG α/θ (F4-C4) and LF/HF were most effective for monitoring high mental workload. LF/HF showed the highest correlations with the other physiological indices. EEG α/θ (F4-C4) showed strong correlations with the subjective measures across different mental workload levels. The operation strategy appeared to affect the sensitivity of EEG α (F4-C4) and HF.
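One of the indices highlighted above, the LF/HF ratio, can be computed from an RR-interval series in a few lines. The sketch below uses synthetic RR data and the commonly used band limits (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz); it is a generic HRV computation, not the authors' processing pipeline.

```python
# Hedged sketch: LF/HF ratio from RR intervals via resampling and Welch's PSD.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(5)
rr = 0.8 + 0.05 * rng.standard_normal(300)        # RR intervals in seconds (toy)
t = np.cumsum(rr)                                  # beat times

fs = 4.0                                           # resampling frequency (Hz)
t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
rr_uniform = np.interp(t_uniform, t, rr)           # evenly spaced RR series

f, pxx = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)
lf_band = (f >= 0.04) & (f < 0.15)
hf_band = (f >= 0.15) & (f < 0.40)
lf = np.trapz(pxx[lf_band], f[lf_band])
hf = np.trapz(pxx[hf_band], f[hf_band])
print(f"LF/HF = {lf / hf:.2f}")
```

In a monitoring setting the same computation would be repeated over sliding windows so that rises in LF/HF can be tracked against task demands.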
Ackermann, O; Heigel, U; Lazic, D; Vogel, T; Schofer, M D; Rülander, C
2012-04-01
For the clinical planning of mass events, emergency departments are of critical importance, but no data are yet available on their workload in such situations. As these data are essential for effective medical preparation, we calculated the workload based on the ICD codes of the victims of the Loveparade 2010 in Duisburg. Based on the patient data of the Loveparade 2010, we used filter diagnoses to estimate the number of shock room patients, regular admissions, surgical wound treatments, applications of casts or splints, and diagnoses of drug abuse. In addition, every patient was classified into a Manchester Triage System (MTS) category. This resulted in a chronological and quantitative workload profile of the emergency department, which was evaluated against the clinical experience of the department's medical staff. The workload profile as a whole displayed a realistic image of the actual situation on July 24, 2010. While the number, diagnoses and chronology of medical and surgical patients were realistic, the MTS classification was not. The emergency department had a maximum of 6 emergency room admissions, 6 regular admissions, 4-5 surgical wound treatments, 3 casts and 2 drug abuse patients per hour. The calculation of workload from ICD data is a reasonable tool for the retrospective estimation of the workload of an emergency department, and the data can be used for future planning. Retrospective MTS grouping is at present not suitable for a realistic calculation, and retrospective measures in the MTS groups are at present not sufficiently suitable for valid data publication. © Georg Thieme Verlag KG Stuttgart · New York.
Prototype Development: Context-Driven Dynamic XML Ophthalmologic Data Capture Application
Schwei, Kelsey M; Kadolph, Christopher; Finamore, Joseph; Cancel, Efrain; McCarty, Catherine A; Okorie, Asha; Thomas, Kate L; Allen Pacheco, Jennifer; Pathak, Jyotishman; Ellis, Stephen B; Denny, Joshua C; Rasmussen, Luke V; Tromp, Gerard; Williams, Marc S; Vrabec, Tamara R; Brilliant, Murray H
2017-01-01
Background The capture and integration of structured ophthalmologic data into electronic health records (EHRs) has historically been a challenge. However, the importance of this activity for patient care and research is critical. Objective The purpose of this study was to develop a prototype of a context-driven dynamic extensible markup language (XML) ophthalmologic data capture application for research and clinical care that could be easily integrated into an EHR system. Methods Stakeholders in the medical, research, and informatics fields were interviewed and surveyed to determine data and system requirements for ophthalmologic data capture. On the basis of these requirements, an ophthalmology data capture application was developed to collect and store discrete data elements with important graphical information. Results The context-driven data entry application supports several features, including ink-over drawing capability for documenting eye abnormalities, context-based Web controls that guide data entry based on preestablished dependencies, and an adaptable database or XML schema that stores Web form specifications and allows for immediate changes in form layout or content. The application utilizes Web services to enable data integration with a variety of EHRs for retrieval and storage of patient data. Conclusions This paper describes the development process used to create a context-driven dynamic XML data capture application for optometry and ophthalmology. The list of ophthalmologic data elements identified as important for care and research can be used as a baseline list for future ophthalmologic data collection activities. PMID:28903894
Using the NASA Task Load Index to Assess Workload in Electronic Medical Records.
Hudson, Darren; Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
Electronic medical records (EMRs) have been expected to decrease health professionals' workload. The NASA Task Load Index has become an important tool for assessing workload in many domains, but its application to assessing the impact of an EMR on nurses' workload has remained unexplored. In this paper, we report the results of a workload study and explore the utility of applying the NASA Task Load Index to assess the impact of an EMR at the end of its lifecycle on nurses' workload. Mental and temporal demands were found to contribute most to the workload. Further work along these lines is recommended.
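For readers unfamiliar with the instrument, the standard NASA-TLX score referenced here is a weighted average of six subscales rated 0-100, with weights taken from 15 pairwise comparisons. The sketch below shows that scoring scheme; the ratings and weights are invented examples, not data from this study.

```python
# Hedged sketch: standard NASA-TLX weighted workload score.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_weighted_score(ratings, weights):
    """ratings: 0-100 per subscale; weights: pairwise-comparison tallies summing to 15."""
    assert sum(weights.values()) == 15, "pairwise-comparison tallies must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 80, "physical": 20, "temporal": 75,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 5, "physical": 0, "temporal": 4,
           "performance": 2, "effort": 3, "frustration": 1}
print(f"overall workload: {tlx_weighted_score(ratings, weights):.1f} / 100")
```

Many applied studies, including apparently this one, also inspect the per-subscale ratings directly (here, mental and temporal demand) rather than only the weighted total.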
Data-driven region-of-interest selection without inflating Type I error rate.
Brooks, Joseph L; Zoumpoulaki, Alexia; Bowman, Howard
2017-01-01
In ERP and other large multidimensional neuroscience data sets, researchers often select regions of interest (ROIs) for analysis. The method of ROI selection can critically affect the conclusions of a study by causing the researcher to miss effects in the data or to detect spurious effects. In practice, to avoid inflating Type I error rate (i.e., false positives), ROIs are often based on a priori hypotheses or independent information. However, this can be insensitive to experiment-specific variations in effect location (e.g., latency shifts) reducing power to detect effects. Data-driven ROI selection, in contrast, is nonindependent and uses the data under analysis to determine ROI positions. Therefore, it has potential to select ROIs based on experiment-specific information and increase power for detecting effects. However, data-driven methods have been criticized because they can substantially inflate Type I error rate. Here, we demonstrate, using simulations of simple ERP experiments, that data-driven ROI selection can indeed be more powerful than a priori hypotheses or independent information. Furthermore, we show that data-driven ROI selection using the aggregate grand average from trials (AGAT), despite being based on the data at hand, can be safely used for ROI selection under many circumstances. However, when there is a noise difference between conditions, using the AGAT can inflate Type I error and should be avoided. We identify critical assumptions for use of the AGAT and provide a basis for researchers to use, and reviewers to assess, data-driven methods of ROI localization in ERP and other studies. © 2016 Society for Psychophysiological Research.
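The AGAT procedure described above can be illustrated with a toy single-channel simulation: define the ROI window from the grand average aggregated over both conditions (so the selection is not driven by the condition contrast), then test the condition difference only inside that window. This is a sketch of the general idea on invented data, not the authors' code or simulation setup.

```python
# Hedged sketch: AGAT-style ROI selection followed by a condition contrast.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n_trials, n_time = 40, 200
component = np.exp(-0.5 * ((np.arange(n_time) - 120) / 10) ** 2)   # peak near sample 120
cond_a = 1.0 * component + rng.normal(0, 1, (n_trials, n_time))
cond_b = 0.6 * component + rng.normal(0, 1, (n_trials, n_time))

# ROI from the aggregate grand average across *both* conditions (peak +/- 10 samples).
agat = np.vstack([cond_a, cond_b]).mean(axis=0)
peak = int(np.argmax(agat))
roi = slice(max(peak - 10, 0), peak + 10)

t, p = stats.ttest_ind(cond_a[:, roi].mean(axis=1), cond_b[:, roi].mean(axis=1))
print(f"ROI centred on sample {peak}: t = {t:.2f}, p = {p:.4f}")
```

As the abstract cautions, this selection is only safe under certain assumptions (notably comparable noise levels across conditions); with unequal noise the AGAT-based ROI can inflate the Type I error rate.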
Measuring Mental Workload: A Performance Battery.
1987-09-01
Internal consistency and validity of a new physical workload questionnaire
Bot, S; Terwee, C; van der Windt, D A W M; Feleus, A; Bierma-Zeinstra, S; Knol, D; Bouter, L; Dekker, J
2004-01-01
Aims: To examine the dimensionality, internal consistency, and construct validity of a new physical workload questionnaire in employees with musculoskeletal complaints. Methods: Factor analysis was applied to the responses in three study populations with musculoskeletal disorders (n = 406, 300, and 557) on 26 items related to physical workload. The internal consistency of the resulting subscales was examined. It was hypothesised that physical workload would vary among different occupational groups. The occupations of all subjects were classified into four groups on the basis of expected workload (heavy physical load; long lasting postures and repetitive movements; both; no physical load). Construct validity of the subscales created was tested by comparing the subscale scores among these occupational groups. Results: The pattern of the factor loadings of items was almost identical for the three study populations. Two interpretable factors were found: items related to heavy physical workload loaded highly on the first factor, and items related to static postures or repetitive work loaded highly on the second factor. The first constructed subscale "heavy physical work" had a Cronbach's α of 0.92 to 0.93 and the second subscale "long lasting postures and repetitive movements", of 0.86 to 0.87. Six of eight hypotheses regarding the construct validity of the subscales were confirmed. Conclusions: The results support the internal structure, internal consistency, and validity of the new physical workload questionnaire. Testing this questionnaire in non-symptomatic employees and comparing its performance with objective assessments of physical workload are important next steps in the validation process. PMID:15550603
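The internal-consistency statistic reported for the two subscales, Cronbach's alpha, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the summed score). The sketch below computes it on toy questionnaire responses; it is a generic implementation, not the authors' analysis.

```python
# Hedged sketch: Cronbach's alpha for one questionnaire subscale.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one subscale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(7)
trait = rng.normal(size=(300, 1))                      # shared "workload" factor
responses = trait + 0.6 * rng.normal(size=(300, 8))    # 8 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Values around 0.9, as reported for the "heavy physical work" subscale, indicate that the items largely measure a single underlying construct.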
Saleh, Ziad H; Jeong, Jeho; Quinn, Brian; Mechalakos, James; St Germain, Jean; Dauer, Lawrence T
2017-05-01
The workload for shielding purposes of modern linear accelerators (linacs) consists of primary and scatter radiation, which depends on the dose delivered to isocenter (cGy), and leakage radiation, which depends on the monitor units (MUs). In this study, we report on the workload for 10 treatment vaults in terms of dose to isocenter (cGy), monitor units delivered (MUs), and number of treatment sessions (Txs), as well as use factors (U) and modulation factors (CI) for different treatment techniques. The survey was performed for the years between 2006 and 2015 and included 16 treatment machines representing different generations of Varian linear accelerators (6EX, 600C, 2100C, 2100EX, and TrueBeam) operating at different electron and x-ray energies (6, 9, 12, 16 and 20 MeV electrons, and 6 and 15 MV x-rays). Institutional review board (IRB) approval was acquired to perform this study. Data regarding patient workload, dose to isocenter, number of monitor units delivered, beam energies, gantry angles, and treatment techniques were exported from an ARIA treatment management system (Varian Medical Systems, Palo Alto, CA) into Excel spreadsheets, and data analysis was performed in Matlab. The average (± std-dev) number of treatment sessions, dose to isocenter, and number of monitor units delivered per week per machine in 2006 was 119 ± 39 Txs, (300 ± 116) × 10^2 cGy, and (78 ± 28) × 10^3 MUs, respectively. In contrast, the workload in 2015 was 112 ± 40 Txs, (337 ± 124) × 10^2 cGy, and (111 ± 46) × 10^3 MUs. 60% of the workload (cGy) was delivered using 6 MV and 30% using 15 MV, while the remaining 10% was delivered using electron beams. The modulation factors (MU/cGy) for IMRT and VMAT were 5.0 (± 3.4) and 4.6 (± 1.6), respectively. Use factors over 90° gantry angle intervals were equally distributed (~0.25) but varied considerably among different treatment techniques. The workload, in terms of dose to isocenter (cGy) and subsequently monitor units (MUs), has
Knaepen, Kristel; Marusic, Uros; Crea, Simona; Rodríguez Guerrero, Carlos D; Vitiello, Nicola; Pattyn, Nathalie; Mairesse, Olivier; Lefeber, Dirk; Meeusen, Romain
2015-04-01
Walking with a lower limb prosthesis comes at a high cognitive workload for amputees, possibly affecting their mobility, safety and independency. A biocooperative prosthesis which is able to reduce the cognitive workload of walking could offer a solution. Therefore, we wanted to investigate whether different levels of cognitive workload can be assessed during symmetrical, asymmetrical and dual-task walking and to identify which parameters are the most sensitive. Twenty-four healthy subjects participated in this study. Cognitive workload was assessed through psychophysiological responses, physical and cognitive performance and subjective ratings. The results showed that breathing frequency and heart rate significantly increased, and heart rate variability significantly decreased with increasing cognitive workload during walking (p<.05). Performance measures (e.g., cadence) only changed under high cognitive workload. As a result, psychophysiological measures are the most sensitive to identify changes in cognitive workload during walking. These parameters reflect the cognitive effort necessary to maintain performance during complex walking and can easily be assessed regardless of the task. This makes them excellent candidates to feed to the control loop of a biocooperative prosthesis in order to detect the cognitive workload. This information can then be used to adapt the robotic assistance to the patient's cognitive abilities. Copyright © 2015 Elsevier B.V. All rights reserved.
Matthews, R; Turner, P J; McDonald, N J; Ermolaev, K; Manus, T; Shelby, R A; Steindorf, M
2008-01-01
This paper describes a compact, lightweight and ultra-low power ambulatory wireless EEG system based upon QUASAR's innovative noninvasive bioelectric sensor technologies. The sensors operate through hair without skin preparation or conductive gels. Mechanical isolation built into the harness permits the recording of high quality EEG data during ambulation. Advanced algorithms developed for this system permit real time classification of workload during subject motion. Measurements made using the EEG system during ambulation are presented, including results for real time classification of subject workload.
[Variability in nursing workload within Swiss Diagnosis Related Groups].
Baumberger, Dieter; Bürgin, Reto; Bartholomeyczik, Sabine
2014-04-01
Nursing care inputs represent one of the major cost components in the Swiss Diagnosis Related Group (DRG) structure. High and low nursing workloads in individual cases are supposed to balance out within a DRG group. Research results indicating possible problems in this area cannot be reliably extrapolated to SwissDRG. An analysis of nursing workload figures against DRG indicators was carried out in order to decide whether there is a need to develop SwissDRG classification criteria that are specific to nursing care. The case groups were determined with SwissDRG 0.1, and nursing workload with LEP Nursing 2. Robust statistical methods were used. Classification accuracy was evaluated with R2 as the measure of variance reduction and with the coefficient of homogeneity (CH). To ensure reliable conclusions, statistical tests with bootstrapping methods were performed. The sample included 213 groups with a total of 73930 cases from ten hospitals. The DRG classification was seen to have limited explanatory power for the variability in nursing workload, both for all cases (R2 = 0.16) and for inliers (R2 = 0.32). Nursing workload homogeneity was significantly unsatisfactory (CH < 0.67) in 123 groups, including 24 groups in which it was significantly defective (CH < 0.60). There is therefore a high risk that high and low nursing workloads do not balance out in these groups and, as a result, that financial resources are wrongly allocated. The development of nursing-care-specific SwissDRG classification criteria for improved homogeneity and variance reduction is therefore indicated.
Fallahi, Majid; Motamedzade, Majid; Heidarimoghadam, Rashid; Soltanian, Ali Reza; Miyake, Shinji
2016-01-01
Background: The present study aimed to evaluate the mental workload (MW) of operators at cement, city traffic control and power plant control centers using subjective and objective measures during the monitoring of vital system parameters. Methods: This cross-sectional study was conducted from June 2014 to February 2015 at the cement, city traffic control and power plant control centers. Electrocardiography and electroencephalography data were recorded from forty males while they performed their daily work under resting, low mental workload (LMW), high mental workload (HMW) and recovery conditions (each block 5 minutes). The NASA-Task Load Index (TLX) was used to evaluate the subjective workload of the operators. Results: The results showed that increasing MW had a significant effect on the operators' subjective responses in the two conditions (F[1, 53] = 216.303, P < 0.001, η2 = 0.803). Also, the Task-MW interaction effect on operators' subjective responses was significant (F[3, 53] = 12.628, P < 0.001, η2 = 0.417). Repeated measures analysis of variance (ANOVA) indicated that increasing mental demands had a significant effect on heart rate, the low frequency/high frequency ratio, and theta and alpha band activity. Conclusion: The results suggest that when operators' mental demands increased, especially in the traffic control and power plant tasks, their mental fatigue and stress level increased and their mental health deteriorated. Therefore, it may be necessary to implement an ergonomic program or administrative controls to manage mental health in these control centers. Furthermore, by evaluating MW, the control center director can organize the human resources for each MW condition to sustain appropriate performance as well as improve system functions. PMID:27386425
The psychometrics of mental workload: multiple measures are sensitive but divergent.
Matthews, Gerald; Reinerman-Jones, Lauren E; Barber, Daniel J; Abich, Julian
2015-02-01
A study was run to test the sensitivity of multiple workload indices to the differing cognitive demands of four military monitoring task scenarios and to investigate relationships between indices. Various psychophysiological indices of mental workload exhibit sensitivity to task factors. However, the psychometric properties of multiple indices, including the extent to which they intercorrelate, have not been adequately investigated. One hundred fifty participants performed in four task scenarios based on a simulation of unmanned ground vehicle operation. Scenarios required threat detection and/or change detection. Both single- and dual-task scenarios were used. Workload metrics for each scenario were derived from the electroencephalogram (EEG), electrocardiogram, transcranial Doppler sonography, functional near-infrared sensing, and eye tracking. Subjective workload was also assessed. Several metrics showed sensitivity to the differing demands of the four scenarios. Eye fixation duration and the Task Load Index metric derived from EEG were diagnostic of single- versus dual-task performance. Several other metrics differentiated the two single tasks but were less effective in differentiating single- from dual-task performance. Psychometric analyses confirmed the reliability of individual metrics but failed to identify any general workload factor. An analysis of difference scores between low- and high-workload conditions suggested an effort factor defined by heart rate variability and frontal cortex oxygenation. General workload is not well defined psychometrically, although various individual metrics may satisfy conventional criteria for workload assessment. Practitioners should exercise caution in using multiple metrics that may not correspond well, especially at the level of the individual operator.
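The core psychometric question here is whether per-participant workload metrics intercorrelate strongly enough to support a single "general workload" factor. The sketch below is not the study's analysis; metric names are placeholders and the data are simulated difference scores (high minus low workload).

```python
"""Illustrative sketch: intercorrelation of workload metrics and a crude
single-factor check. Metric names and data are placeholders."""
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 150  # participants

# Simulated per-participant difference scores (high minus low workload).
metrics = pd.DataFrame({
    "eeg_tli": rng.normal(0.5, 1.0, n),
    "hr_variability": rng.normal(-0.3, 1.0, n),
    "frontal_oxygenation": rng.normal(0.4, 1.0, n),
    "fixation_duration": rng.normal(0.6, 1.0, n),
    "nasa_tlx": rng.normal(1.0, 1.0, n),
})

# A general workload factor would appear as consistently positive,
# sizeable pairwise correlations between the metrics.
print(metrics.corr().round(2))

# Crude single-factor check: share of variance on the first principal component.
z = (metrics - metrics.mean()) / metrics.std(ddof=1)
eigvals = np.linalg.eigvalsh(np.cov(z.T))[::-1]
print(f"First component explains {eigvals[0] / eigvals.sum():.0%} of variance")
```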
W-MAC: A Workload-Aware MAC Protocol for Heterogeneous Convergecast in Wireless Sensor Networks
Xia, Ming; Dong, Yabo; Lu, Dongming
2011-01-01
The power consumption and latency of existing MAC protocols for wireless sensor networks (WSNs) are high in heterogeneous convergecast, where each sensor node generates different amounts of data in one convergecast operation. To solve this problem, we present W-MAC, a workload-aware MAC protocol for heterogeneous convergecast in WSNs. A subtree-based iterative cascading scheduling mechanism and a workload-aware time slice allocation mechanism are proposed to minimize the power consumption of nodes, while offering a low data latency. In addition, an efficient schedule adjustment mechanism is provided for adapting to data traffic variation and network topology change. Analytical and simulation results show that the proposed protocol provides a significant energy saving and latency reduction in heterogeneous convergecast, and can effectively support data aggregation to further improve the performance. PMID:22163753
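The workload-aware idea can be illustrated with a small scheduling sketch: in a convergecast tree, a node's transmission slot must cover its own data plus everything forwarded from its subtree. This is not the published W-MAC protocol, only a rough model of the slot-sizing principle; node IDs, packet counts and the slot-per-packet constant are illustrative.

```python
"""Rough sketch of workload-aware slot allocation for heterogeneous
convergecast. Bottom-up ordering lets children transmit before parents."""
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: int
    local_packets: int                      # data generated by this node per round
    children: list = field(default_factory=list)

def subtree_workload(node: Node) -> int:
    """Packets this node must transmit: its own plus all forwarded traffic."""
    return node.local_packets + sum(subtree_workload(c) for c in node.children)

def assign_slots(node: Node, slot_per_packet: float = 1.0, schedule=None):
    """Post-order slot assignment so each parent wakes after its children."""
    if schedule is None:
        schedule = []
    for child in node.children:
        assign_slots(child, slot_per_packet, schedule)
    schedule.append((node.node_id, subtree_workload(node) * slot_per_packet))
    return schedule

# Small heterogeneous convergecast tree rooted at the sink (node 0);
# the sink's own entry can be ignored since it only receives.
leaf_a, leaf_b = Node(3, local_packets=5), Node(4, local_packets=1)
relay = Node(1, local_packets=2, children=[leaf_a, leaf_b])
sink = Node(0, local_packets=0, children=[relay, Node(2, local_packets=3)])

for node_id, slot_len in assign_slots(sink):
    print(f"node {node_id}: slot length {slot_len:.1f}")
```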
Safety analysis of proposed data-driven physiologic alarm parameters for hospitalized children.
Goel, Veena V; Poole, Sarah F; Longhurst, Christopher A; Platchek, Terry S; Pageler, Natalie M; Sharek, Paul J; Palma, Jonathan P
2016-12-01
Modification of alarm limits is one approach to mitigating alarm fatigue. We aimed to create and validate heart rate (HR) and respiratory rate (RR) percentiles for hospitalized children, and analyze the safety of replacing current vital sign reference ranges with proposed data-driven, age-stratified 5th and 95th percentile values. In this retrospective cross-sectional study, nurse-charted HR and RR data from a training set of 7202 hospitalized children were used to develop percentile tables. We compared 5th and 95th percentile values with currently accepted reference ranges in a validation set of 2287 patients. We analyzed 148 rapid response team (RRT) and cardiorespiratory arrest (CRA) events over a 12-month period, using HR and RR values in the 12 hours prior to the event, to determine the proportion of patients with out-of-range vitals based upon reference versus data-driven limits. There were 24,045 (55.6%) fewer out-of-range measurements using data-driven vital sign limits. Overall, 144/148 RRT and CRA patients had out-of-range HR or RR values preceding the event using current limits, and 138/148 were abnormal using data-driven limits. Chart review of RRT and CRA patients with abnormal HR and RR per current limits considered normal by data-driven limits revealed that clinical status change was identified by other vital sign abnormalities or clinical context. A large proportion of vital signs in hospitalized children are outside presently used norms. Safety evaluation of data-driven limits suggests they are as safe as those currently used. Implementation of these parameters in physiologic monitors may mitigate alarm fatigue. Journal of Hospital Medicine 2015;11:817-823. © 2016 Society of Hospital Medicine.
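The data-driven limits are simply age-stratified empirical percentiles of charted vitals. The sketch below is not the study's code: age bands, the fixed reference range used for comparison, and the synthetic vitals are all placeholders.

```python
"""Minimal sketch: derive age-stratified 5th/95th percentile heart-rate
limits and compare out-of-range rates against a fixed reference range.
All numeric values and bands are illustrative assumptions."""
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Toy nurse-charted vitals: age in years and heart rate in beats per minute.
vitals = pd.DataFrame({
    "age_years": rng.uniform(0, 18, 20000),
    "heart_rate": rng.normal(110, 25, 20000).clip(40, 220),
})
vitals["age_band"] = pd.cut(vitals["age_years"], [0, 1, 3, 6, 12, 18], include_lowest=True)

# Data-driven limits: 5th and 95th percentile per age band.
limits = (vitals.groupby("age_band", observed=True)["heart_rate"]
                .quantile([0.05, 0.95]).unstack())
limits.columns = ["p5", "p95"]

merged = vitals.join(limits, on="age_band")
out_data_driven = (merged["heart_rate"] < merged["p5"]) | (merged["heart_rate"] > merged["p95"])

# Placeholder fixed reference range applied to all ages, for comparison only.
out_reference = (vitals["heart_rate"] < 80) | (vitals["heart_rate"] > 140)

print(f"out-of-range, reference limits:   {out_reference.mean():.1%}")
print(f"out-of-range, data-driven limits: {out_data_driven.mean():.1%}")
```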
Data-Driven Geospatial Visual Analytics for Real-Time Urban Flooding Decision Support
NASA Astrophysics Data System (ADS)
Liu, Y.; Hill, D.; Rodriguez, A.; Marini, L.; Kooper, R.; Myers, J.; Wu, X.; Minsker, B. S.
2009-12-01
Urban flooding is responsible for the loss of life and property as well as the release of pathogens and other pollutants into the environment. Previous studies have shown that the spatial distribution of intense rainfall significantly impacts the triggering and behavior of urban flooding. However, no general-purpose tools yet exist for deriving rainfall data and rendering them in real time at the resolution of the hydrologic units used for analyzing urban flooding. This paper presents a new visual analytics system that derives and renders rainfall data from the NEXRAD weather radar system at the sewershed (i.e. urban hydrologic unit) scale in real time for a Chicago stormwater management project. We introduce a lightweight Web 2.0 approach which takes advantage of scientific workflow management and publishing capabilities developed at NCSA (National Center for Supercomputing Applications), a streaming-data-aware semantic content management repository, web-based Google Earth/Maps, and time-aware KML (Keyhole Markup Language). A collection of polygon-based virtual sensors is created from the NEXRAD Level II data using spatial, temporal and thematic transformations at the sewershed level in order to produce persistent virtual rainfall data sources for the animation. An animated, color-coded rainfall map of the sewersheds can be played in real time as a movie using time-aware KML inside web-browser-based Google Earth, allowing visual analysis of the spatiotemporal patterns of rainfall intensity across the sewersheds. Such a system provides valuable information for situational awareness and improved decision support during extreme storm events in an urban area. Our further work includes incorporating additional data (such as basement flooding event data) or physics-based predictive models that can be used for more integrated data-driven decision support.
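The time-aware KML mechanism is what drives the animation: each polygon gets a TimeSpan and a fill color for one radar time step, and Google Earth's time slider plays the sequence. The hand-rolled sketch below is not the NCSA system; the coordinates, time steps, rainfall values and color ramp are made up.

```python
"""Sketch of generating time-aware KML for one sewershed polygon across
two radar time steps. All values are illustrative."""
from xml.sax.saxutils import escape

def rain_color(mm_per_hr: float) -> str:
    """Map rainfall intensity to a KML aabbggrr color (light -> intense)."""
    if mm_per_hr < 2:
        return "7fffcc99"
    if mm_per_hr < 10:
        return "7fff9933"
    return "7f3333ff"

def placemark(name, coords, start, end, mm_per_hr):
    ring = " ".join(f"{lon},{lat},0" for lon, lat in coords)
    closing = f"{coords[0][0]},{coords[0][1]},0"       # close the ring
    return f"""
  <Placemark>
    <name>{escape(name)}: {mm_per_hr:.1f} mm/h</name>
    <TimeSpan><begin>{start}</begin><end>{end}</end></TimeSpan>
    <Style><PolyStyle><color>{rain_color(mm_per_hr)}</color></PolyStyle></Style>
    <Polygon><outerBoundaryIs><LinearRing>
      <coordinates>{ring} {closing}</coordinates>
    </LinearRing></outerBoundaryIs></Polygon>
  </Placemark>"""

# One toy sewershed polygon over two 5-minute radar time steps.
poly = [(-87.70, 41.85), (-87.69, 41.85), (-87.69, 41.86), (-87.70, 41.86)]
steps = [("2009-09-12T10:00:00Z", "2009-09-12T10:05:00Z", 1.2),
         ("2009-09-12T10:05:00Z", "2009-09-12T10:10:00Z", 12.8)]

body = "".join(placemark("Sewershed A", poly, s, e, r) for s, e, r in steps)
print('<?xml version="1.0" encoding="UTF-8"?>\n'
      f'<kml xmlns="http://www.opengis.net/kml/2.2"><Document>{body}\n</Document></kml>')
```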
A psychophysiological assessment of operator workload during simulated flight missions
NASA Technical Reports Server (NTRS)
Kramer, Arthur F.; Sirevaag, Erik J.; Braune, Rolf
1987-01-01
The applicability of the dual-task event-related (brain) potential (ERP) paradigm to the assessment of an operator's mental workload and residual capacity in a complex situation of a flight mission was demonstrated using ERP measurements and subjective workload ratings of student pilots flying a fixed-base single-engine simulator. Data were collected during two separate 45-min flights differing in difficulty; flight demands were examined by dividing each flight into four segments: takeoff, straight and level flight, holding patterns, and landings. The P300 ERP component in particular was found to discriminate among the levels of task difficulty in a systematic manner, decreasing in amplitude with an increase in task demands. The P300 amplitude is shown to be negatively correlated with deviations from command headings across the four flight segments.
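The P300 measure in such studies is typically obtained by averaging epochs time-locked to secondary-task probes and taking the mean voltage in a late window. The sketch below is not the original analysis: sampling rate, measurement window, and the synthetic single-trial data are assumptions chosen only to show the computation.

```python
"""Simplified sketch of a P300 amplitude measure from epoched EEG.
All signal parameters and data are illustrative."""
import numpy as np

fs = 250                             # Hz (assumed sampling rate)
t = np.arange(-0.2, 0.8, 1 / fs)     # epoch from -200 ms to 800 ms

def simulate_epochs(n_epochs: int, p300_uv: float) -> np.ndarray:
    """Synthetic single-trial epochs: Gaussian P300 around 350 ms plus noise."""
    p300 = p300_uv * np.exp(-((t - 0.35) ** 2) / (2 * 0.05 ** 2))
    return p300 + np.random.default_rng(4).normal(0, 10, (n_epochs, t.size))

def p300_amplitude(epochs: np.ndarray, window=(0.30, 0.50)) -> float:
    """Average the epochs, then take mean amplitude in the P300 window (µV)."""
    erp = epochs.mean(axis=0)
    mask = (t >= window[0]) & (t <= window[1])
    return float(erp[mask].mean())

# Higher task demand -> smaller probe-elicited P300 (less residual capacity).
for segment, amp in [("straight and level", 8.0), ("holding pattern", 5.0), ("landing", 3.0)]:
    print(f"{segment:20s} P300 ~ {p300_amplitude(simulate_epochs(60, amp)):.1f} µV")
```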
Methodological integrative review of the work sampling technique used in nursing workload research.
Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael
2014-11-01
To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparison of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload were included. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
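Two of the quantities such studies need to report consistently are the number of observations required for a target precision and inter-observer agreement. The sketch below is not drawn from the review; the activity proportion, precision target and example ratings are illustrative.

```python
"""Back-of-the-envelope sketch: required sample size for work sampling and
Cohen's kappa for inter-observer agreement. All inputs are illustrative."""
import math
from collections import Counter

def required_observations(p: float, error: float, z: float = 1.96) -> int:
    """n = z^2 * p * (1 - p) / e^2 for an activity occupying a proportion p of time."""
    return math.ceil(z ** 2 * p * (1 - p) / error ** 2)

def cohens_kappa(rater_a, rater_b) -> float:
    """Chance-corrected agreement between two observers' activity codes."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[k] * pb[k] for k in set(pa) | set(pb)) / n ** 2
    return (observed - expected) / (1 - expected)

print(required_observations(p=0.30, error=0.03))   # about 897 observations
a = ["direct", "indirect", "direct", "admin", "direct", "indirect"]
b = ["direct", "indirect", "admin",  "admin", "direct", "direct"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```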
NASA Technical Reports Server (NTRS)
Damos, D. L.
1984-01-01
Human factors practitioners are often concerned with mental workload in multiple-task situations. Investigations of these situations have demonstrated repeatedly that individuals differ in their subjective estimates of workload. These differences may be attributed in part to individual differences in definitions of workload. However, after allowing for differences in the definition of workload, there are still unexplained individual differences in workload ratings. The relation between individual differences in multiple-task performance, subjective estimates of workload, information processing abilities, and the Type A personality trait was examined.
Event-related potential indices of workload in a single task paradigm
NASA Technical Reports Server (NTRS)
Horst, R. L.; Munson, R. C.; Ruchkin, D. S.
1984-01-01
Many previous studies of both behavioral and physiological correlates of cognitive workload have burdened subjects with a contrived secondary task in order to assess the workload of a primary task. The present study investigated event-related potential (ERP) indices of workload in a single task paradigm. Subjects monitored changing digital readouts for values that went 'out-of-bounds'. The amplitude of a long-latency positivity in the ERPs elicited by readout changes increased with the number of readouts being monitored. This effect of workload on ERPs is reported, along with plans for additional analyses to address theoretical implications.
Analysis of DISMS (Defense Integrated Subsistence Management System) Increment 4
1988-12-01
response data entry; and rationale supporting an on-line system based on real time management information needs. Keywords: Automated systems; Subsistence; Workload capacity; Bid response; Contract administration; Computer systems.
Data flow machine for data driven computing
Davidson, George S.; Grafe, Victor G.
1995-01-01
A data flow computer and method of computing are disclosed which utilize a data driven processor node architecture. The apparatus in a preferred embodiment includes a plurality of First-In-First-Out (FIFO) registers, a plurality of related data flow memories, and a processor. The processor makes the necessary calculations and includes a control unit to generate signals to enable the appropriate FIFO register receiving the result. In a particular embodiment, there are three FIFO registers per node: an input FIFO register to receive input information from an outside source and provide it to the data flow memories; an output FIFO register to provide output information from the processor to an outside recipient; and an internal FIFO register to provide information from the processor back to the data flow memories. The data flow memories comprise four commonly addressed memories. A parameter memory holds the A and B parameters used in the calculations; an opcode memory holds the instruction; a target memory holds the output address; and a tag memory contains status bits for each parameter. One status bit indicates whether the corresponding parameter is in the parameter memory, and one indicates whether the stored information in the corresponding parameter is to be reused. The tag memory outputs a "fire" signal (signal R VALID) when all of the necessary information has been stored in the data flow memories, and thus when the instruction is ready to be fired to the processor.
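The firing rule described above can be modelled in a few lines: parameter, opcode, target and tag memories share an address, and an instruction fires only when its tag bits show that both parameters have arrived. The sketch below is a behavioural illustration only, not the patented hardware; opcodes, addresses and the reuse policy are simplified assumptions.

```python
"""Behavioural sketch of a single data-flow node with tag-driven firing."""
from collections import deque

class DataFlowNode:
    def __init__(self):
        self.param_a, self.param_b = {}, {}          # parameter memory (A and B)
        self.opcode = {}                             # opcode memory
        self.target = {}                             # target memory (output address)
        self.tags = {}                               # tag memory: [a_present, b_present]
        self.output_fifo = deque()                   # results routed to their targets

    def load_instruction(self, addr, opcode, target):
        self.opcode[addr], self.target[addr] = opcode, target
        self.tags[addr] = [False, False]

    def deliver(self, addr, slot, value):
        """Token arrives from the input FIFO; fire once both parameters are present."""
        (self.param_a if slot == "A" else self.param_b)[addr] = value
        self.tags[addr][0 if slot == "A" else 1] = True
        if all(self.tags[addr]):                     # "R VALID": ready to fire
            self._fire(addr)

    def _fire(self, addr):
        a, b = self.param_a[addr], self.param_b[addr]
        result = {"ADD": a + b, "MUL": a * b}[self.opcode[addr]]
        self.output_fifo.append((self.target[addr], result))
        self.tags[addr] = [False, False]             # parameters consumed (no reuse here)

node = DataFlowNode()
node.load_instruction(addr=0x10, opcode="ADD", target=0x20)
node.deliver(0x10, "A", 7)         # nothing fires yet: B is missing
node.deliver(0x10, "B", 5)         # both tags set -> instruction fires
print(node.output_fifo.popleft())  # (32, 12), i.e. result 12 routed to address 0x20
```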
Carey, David L; Blanch, Peter; Ong, Kok-Leong; Crossley, Kay M; Crow, Justin; Morris, Meg E
2017-01-01
Aims (1) To investigate whether a daily acute:chronic workload ratio informs injury risk in Australian football players; (2) to identify which combination of workload variable, acute and chronic time window best explains injury likelihood. Methods Workload and injury data were collected from 53 athletes over 2 seasons in a professional Australian football club. Acute:chronic workload ratios were calculated