Performance Support Tools: Delivering Value when and where It Is Needed
ERIC Educational Resources Information Center
McManus, Paul; Rossett, Allison
2006-01-01
Some call them Electronic Performance Support Systems (EPSSs). Others prefer Performance Support Tools (PSTs) or decision support tools. One might call EPSSs or PSTs job aids on steroids, technological tools that provide critical information or advice needed to move forward at a particular moment in time. Characteristic advantages of an EPSS or a…
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
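The core idea above, that a single model description can feed both a performance engine and a reliability engine, can be sketched as follows. This is a minimal illustration with invented component names, rates, and formulas (serial latency; exponential lifetimes in a series system), not the paper's actual tool:

```python
import math

class Component:
    """One entry in a shared model consumed by both analysis engines."""
    def __init__(self, name, service_time_s, failure_rate_per_hour):
        self.name = name
        self.service_time_s = service_time_s
        self.failure_rate_per_hour = failure_rate_per_hour

def end_to_end_latency(components):
    """Performance view: serial latency through the pipeline."""
    return sum(c.service_time_s for c in components)

def mission_reliability(components, mission_hours):
    """Reliability view: probability all components survive the mission."""
    total_rate = sum(c.failure_rate_per_hour for c in components)
    return math.exp(-total_rate * mission_hours)

# The same model object drives both studies -- no remodeling step.
model = [Component("sensor", 0.002, 1e-4), Component("bus", 0.001, 5e-5)]
latency = end_to_end_latency(model)             # 0.003 s
reliability = mission_reliability(model, 100)   # exp(-0.015)
```

Because both functions read the same `model` list, the two analyses cannot drift apart the way separately maintained models can, which is the drawback the paper sets out to eliminate.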
Usability of Operational Performance Support Tools - Findings from Sea Test II
NASA Technical Reports Server (NTRS)
Byrne, Vicky; Litaker, Harry; McGuire, Kerry
2014-01-01
Sea Test II, also known as NASA Extreme Environment Mission Operations 17 (NEEMO 17), took place in the Aquarius undersea habitat off Florida. This confined underwater environment provides an excellent analog for space habitation, with similarities such as a hostile environment, difficult logistics, autonomous operations, and remote communications. This study collected subjective feedback on the usability of two performance support tools, Google Glass and the iPad, during the Sea Test II mission, September 10-14, 2013. The two main objectives were to: - Assess the overall functionality and usability of each performance support tool in a mission analog environment. - Assess the advantages and disadvantages of each tool when performing operational procedures and Just-In-Time Training (JITT).
Using Performance Tools to Support Experiments in HPC Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian
2014-01-01
The high performance computing (HPC) community is working to address fault tolerance and resilience concerns for current and future large scale computing platforms. This is driving enhancements in programming environments, specifically research on enhancing message passing libraries to support fault tolerant computing capabilities. The community has also recognized that tools for resilience experimentation are greatly lacking. However, we argue that there are several parallels between performance tools and resilience tools. As such, we believe the rich set of HPC performance-focused tools can be extended (repurposed) to benefit the resilience community. In this paper, we describe the initial motivation to leverage standard HPC performance analysis techniques to aid in developing diagnostic tools to assist fault tolerance experiments for HPC applications. These diagnosis procedures help to provide context for the system when errors (failures) occurred. We describe our initial work in leveraging an MPI performance trace tool to assist in providing global context during fault injection experiments. Such tools will assist the HPC resilience community as they extend existing and new application codes to support fault tolerance.
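The mechanism described, reusing a performance-style event trace to supply context when a fault is injected, can be sketched in a few lines. Class and operation names here are illustrative assumptions, not the actual trace tool's API:

```python
from collections import deque

class TraceBuffer:
    """Ring buffer of recent (rank, operation) events, as a trace tool might keep."""
    def __init__(self, capacity=100):
        self.events = deque(maxlen=capacity)

    def record(self, rank, op):
        self.events.append((rank, op))

    def context_for_fault(self, rank):
        """On fault injection, return what the failed rank was doing recently."""
        return [e for e in self.events if e[0] == rank]

trace = TraceBuffer()
trace.record(0, "MPI_Send->1")
trace.record(1, "MPI_Recv<-0")
trace.record(0, "MPI_Barrier")

# Inject a fault at rank 0 and ask the trace for its recent activity:
ctx = trace.context_for_fault(0)
```

The same buffer that would otherwise feed a performance profile here answers the resilience question "what was the application doing when the error occurred?".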
Kurosawa, Hiroshi; Ikeyama, Takanari; Achuff, Patricia; Perkel, Madeline; Watson, Christine; Monachino, Annemarie; Remy, Daphne; Deutsch, Ellen; Buchanan, Newton; Anderson, Jodee; Berg, Robert A; Nadkarni, Vinay M; Nishisaki, Akira
2014-03-01
Recent evidence shows poor retention of Pediatric Advanced Life Support provider skills. Frequent refresher training and in situ simulation are promising interventions. We developed a "Pediatric Advanced Life Support-reconstructed" recertification course by deconstructing the training into six 30-minute in situ simulation scenario sessions delivered over 6 months. We hypothesized that in situ Pediatric Advanced Life Support-reconstructed implementation is feasible and as effective as standard Pediatric Advanced Life Support recertification. A prospective randomized, single-blinded trial. Single-center, large, tertiary PICU in a university-affiliated children's hospital. Nurses and respiratory therapists in PICU. Simulation-based modular Pediatric Advanced Life Support recertification training. Simulation-based pre- and postassessment sessions were conducted to evaluate participants' performance. Video-recorded sessions were rated by trained raters blinded to allocation. The primary outcome was skill performance measured by a validated Clinical Performance Tool, and the secondary outcome was behavioral performance measured by a Behavioral Assessment Tool. A mixed-effect model was used to account for baseline differences. Forty participants were prospectively randomized to Pediatric Advanced Life Support reconstructed versus standard Pediatric Advanced Life Support, with no significant difference in demographics. The Clinical Performance Tool score was similar at baseline in both groups and improved after Pediatric Advanced Life Support reconstructed (pre, 16.3 ± 4.1 vs post, 22.4 ± 3.9; p < 0.001), but not after standard Pediatric Advanced Life Support (pre, 14.3 ± 4.7 vs post, 14.9 ± 4.4; p = 0.59). Improvement of the Clinical Performance Tool score was significantly higher in Pediatric Advanced Life Support reconstructed compared with standard Pediatric Advanced Life Support (p = 0.006).
Behavioral Assessment Tool improved in both groups: Pediatric Advanced Life Support reconstructed (pre, 33.3 ± 4.5 vs post, 35.9 ± 5.0; p = 0.008) and standard Pediatric Advanced Life Support (pre, 30.5 ± 4.7 vs post, 33.6 ± 4.9; p = 0.02), with no significant difference of improvement between both groups (p = 0.49). For PICU-based nurses and respiratory therapists, simulation-based "Pediatric Advanced Life Support-reconstructed" in situ training is feasible and more effective than standard Pediatric Advanced Life Support recertification training for skill performance. Both Pediatric Advanced Life Support recertification training courses improved behavioral performance.
Design of a Cognitive Tool to Enhance Problem-Solving Performance
ERIC Educational Resources Information Center
Lee, Youngmin; Nelson, David
2005-01-01
The design of a cognitive tool that supports problem-solving performance through external representation of knowledge is described. The limitations of conventional knowledge maps are analyzed in proposing the tool, and its design principles and specifications are described. This tool is expected to enhance learners' problem-solving performance by allowing…
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
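The lumberjack effect described above can be given a toy numeric form: performance rises with the degree of automation in routine operation but falls more steeply with it when the automation fails. The linear shapes and coefficients below are invented for illustration and are not the authors' models:

```python
def predicted_performance(degree_of_automation, automation_failed):
    """Toy lumberjack-analogy model.

    degree_of_automation is in [0, 1]; higher return value = better performance.
    Coefficients are illustrative assumptions, not fitted values.
    """
    baseline = 0.6
    if automation_failed:
        # The higher the tree, the harder the fall.
        return baseline - 0.4 * degree_of_automation
    # Routine operations: more automation helps.
    return baseline + 0.3 * degree_of_automation

# Compare a high-automation design (0.8) with a low-automation one (0.2):
routine_gain = predicted_performance(0.8, False) - predicted_performance(0.2, False)
failure_loss = predicted_performance(0.2, True) - predicted_performance(0.8, True)
# Both differences are positive: automation helps routinely, hurts on failure.
```

A model-based design tool in this spirit would sweep such parameters across candidate designs and rank them, which is what the validations above compare against expert judgment and empirical data.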
DOT National Transportation Integrated Search
2016-09-01
This report documents the use of the NASA Design and Analysis of Rotorcraft (NDARC) helicopter performance software tool in developing data for the FAA's Aviation Environmental Design Tool (AEDT). These data support the Rotorcraft Performance Model (RP...
Cabrera, V E
2018-01-01
The objective of this review paper is to describe the development and application of a suite of more than 40 computerized dairy farm decision support tools hosted on the University of Wisconsin-Madison (UW) Dairy Management website, http://DairyMGT.info. These data-driven decision support tools aim to help dairy farmers improve their decision-making, environmental stewardship, and economic performance. Dairy farm systems are highly dynamic: changing market conditions and prices, evolving policies and environmental restrictions, and increasingly variable climate conditions together determine performance. Dairy farm systems are also highly integrated, with heavily interrelated components such as the dairy herd, soils, crops, weather, and management. Under these premises, it is critical to evaluate a dairy farm following a dynamic, integrated systems approach. For this approach, it is crucial to use meaningful data records, which are increasingly available. These data records should be used within decision support tools for optimal decision-making and economic performance. The decision support tools on the UW Dairy Management website (http://DairyMGT.info) have been developed by combining and adapting multiple methods together with empirical techniques, always with the primary goals that the tools be: (1) highly user-friendly, (2) built with the latest software and computer technologies, (3) farm- and user-specific, (4) grounded in the best scientific information available, (5) relevant over time, and (6) able to provide fast, concrete, and simple answers to complex farmers' questions. DairyMGT.info is a translational, innovative research website covering various areas of dairy farm management, including nutrition, reproduction, calf and heifer management, replacement, price risk, and environment. This paper discusses the development and application of 20 selected DairyMGT.info decision support tools.
Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L
2018-02-13
While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. A new assessment framework was developed to collect and analyze qualitative evidence based on CHW perspectives on seven program components associated with effectiveness (selection; training; community embeddedness; peer support; supportive supervision; relationship with other healthcare workers; retention and incentive structures). Focus groups were conducted with four high/medium-performing CHW teams and four low-performing CHW teams selected through random, stratified sampling. Content analysis involved organizing focus group transcripts according to the seven program effectiveness components, and assigning scores to each component per focus group. Four components, 'supportive supervision', 'good relationships with other healthcare workers', 'peer support', and 'retention and incentive structures' received the lowest overall scores. Variances in scores between 'high'/'medium'- and 'low'-performing CHW teams were largest for 'supportive supervision' and 'good relationships with other healthcare workers.' Our analysis suggests that in the Bushenyi intervention context, CHW team performance is highly correlated with the quality of supervision and relationships with other healthcare workers. CHWs identified key performance-related issues of absentee supervisors, referral system challenges, and lack of engagement/respect by health workers. Other less-correlated program components warrant further study and may have been impacted by relatively consistent program implementation within our limited study area. 
Process-oriented measurement tools are needed to better understand CHW performance-related factors and to build a supportive environment for CHW program effectiveness and sustainability. Findings from the qualitative, multi-component tool developed and applied in this study suggest that factors related to (1) supportive supervision and (2) relationships with other healthcare workers may be strongly associated with variances in performance outcomes within a program. Careful consideration of supervisory structure and health worker orientation during program implementation is among the strategies proposed to increase CHW performance.
Software project management tools in global software development: a systematic mapping study.
Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio
2016-01-01
Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal, and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study was performed by means of automatic searches in five sources; we then synthesized the extracted data and present the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination, and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. The results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.
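The classification step described above, tagging each tool with the 3C dimensions it supports and summarizing coverage, can be sketched as follows. The tool names and tags are invented examples, not entries from the study's actual dataset:

```python
# Each tool is tagged with the 3C collaboration dimensions it supports.
tools = {
    "ChatTool":  {"communication"},
    "TaskBoard": {"coordination"},
    "WikiSuite": {"communication", "cooperation"},
    "PlanTrack": {"coordination", "cooperation"},
}

def coverage(tools, dimension):
    """Fraction of tools supporting a given 3C dimension."""
    hits = sum(1 for tags in tools.values() if dimension in tags)
    return hits / len(tools)

comm_coverage = coverage(tools, "communication")   # 2 of 4 tools
coop_coverage = coverage(tools, "cooperation")     # 2 of 4 tools
```

Aggregating such coverage fractions per life cycle process is how a mapping study exposes under-supported areas, the kind of gap the abstract reports for SPM in GSD.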
Examining Students' Use of Online Annotation Tools in Support of Argumentative Reading
ERIC Educational Resources Information Center
Lu, Jingyan; Deng, Liping
2013-01-01
This study examined how students in a Hong Kong high school used Diigo, an online annotation tool, to support their argumentative reading activities. Two year 10 classes, a high-performance class (HPC) and an ordinary-performance class (OPC), highlighted passages of text and wrote and attached sticky notes to them to clarify argumentation…
A knowledge based search tool for performance measures in health care systems.
Beyan, Oya D; Baykal, Nazife
2012-02-01
Performance measurement is vital for improving health care systems. However, we are still far from having accepted performance measurement models, and researchers and developers are seeking comparable performance indicators. We developed an intelligent search tool that identifies appropriate measures for specific requirements by matching diverse care settings. We reviewed the literature and analyzed 229 performance measurement studies published after 2000. These studies were evaluated with an original theoretical framework and stored in a database. A semantic network was designed to represent domain knowledge and support reasoning, and we applied knowledge based decision support techniques to cope with uncertainty. The resulting tool simplifies the performance indicator search process and provides the most relevant indicators by employing knowledge based systems.
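The matching idea, retrieving indicators whose care-setting annotations overlap a query, can be sketched with a toy ranking function. The indicator names, tags, and scoring rule below are assumed for illustration and are much simpler than the semantic network the authors describe:

```python
# Indicators annotated with the care settings they apply to (invented examples).
indicators = {
    "readmission_rate": {"hospital", "acute"},
    "wait_time":        {"hospital", "outpatient"},
    "vaccination_rate": {"primary_care", "community"},
}

def search(query_tags, indicators):
    """Rank indicators by the number of setting tags shared with the query."""
    scored = [(len(tags & query_tags), name) for name, tags in indicators.items()]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

results = search({"hospital", "acute"}, indicators)
# 'readmission_rate' (2 shared tags) ranks above 'wait_time' (1 shared tag).
```

A semantic network generalizes this by also matching related concepts (e.g. a query for "acute" reaching indicators tagged with narrower terms), rather than requiring exact tag overlap.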
ERIC Educational Resources Information Center
Dickover, Noel T.
2002-01-01
Explains performance-centered learning (PCL), an approach to optimize support for performance on the job by making corporate assets available to knowledge workers so they can solve actual problems. Illustrates PCL with a Web site that provides just-in-time learning, collaboration, and performance support tools to improve performance at the…
Extending BPM Environments of Your Choice with Performance Related Decision Support
NASA Astrophysics Data System (ADS)
Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter
What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful when it can be generated automatically out of Business Process Management (BPM) environments, from the existing business process models and from performance parameters monitored on the executed business process instances. Currently, some of the available BPM environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at the business process level. In this paper, an approach is presented that allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into existing BPM environments. The approach abstracts from the process modelling technique, which enables automatic decision support spanning processes across numerous BPM environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
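A minimal what-if sketch in the spirit of this approach: take activity durations monitored from executed process instances and predict end-to-end time under a hypothetical change. The process model and numbers are invented, and real what-if tooling would simulate branching, queueing, and resource contention rather than a simple sum:

```python
# Monitored mean durations (hours) per activity of a toy sequential process.
process = [("receive_order", 2.0), ("check_credit", 5.0), ("ship", 3.0)]

def end_to_end(process, overrides=None):
    """Predict end-to-end duration, optionally overriding some activities."""
    overrides = overrides or {}
    return sum(overrides.get(name, duration) for name, duration in process)

as_is = end_to_end(process)                           # current process
what_if = end_to_end(process, {"check_credit": 1.5})  # e.g. automate credit check
```

The point of the non-intrusive integration described above is that such predictions are derived from the BPM environment's own models and monitoring data, so the analyst never rebuilds the process in a separate simulation tool.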
The Changing Role of Instructors in Distance Education: Impact on Tool Support.
ERIC Educational Resources Information Center
Biedebach, Anke; Bomsdorf, Birgit; Schlageter, Gunter
At the University of Hagen, much experience exists in performing Web-based teaching and in implementing tools that support e-learning. To share this knowledge, (inexperienced) instructors increasingly ask for tool-based assistance in designing and administering e-learning courses. Considering experience from other universities, it becomes…
Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis.
Suter, Esther; Oelke, Nelly D; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana
2017-11-13
Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, "overall integration" tools may be useful for a broad assessment of the overall state of a system. Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.
DOT National Transportation Integrated Search
2017-03-01
The performance-planning tool developed as part of this project is intended for use with the guidebook for establishing and using rural performance based transportation system assessment, monitoring, planning, and programming to support the rural pla...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allan, Benjamin A.
We report on the use and design of a portable, extensible performance data collection tool motivated by the modeling needs of the high performance computing systems co-design community. The lightweight performance data collector with Eiger support is intended to be a tailorable tool, not a shrink-wrapped library product, as profiling needs vary widely. A single code markup scheme is reported which, based on compilation flags, can send performance data from parallel applications to CSV files, to an Eiger mysql database, or (in a non-database environment) to flat files for later merging and loading on a host with mysql available. The tool supports C, C++, and Fortran applications.
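The dispatch idea, one markup/collection API whose backend (CSV here; a database elsewhere) is selected by a build or runtime flag, can be sketched as follows. The function names and record layout are illustrative assumptions, not the tool's real API, and only the CSV backend is shown:

```python
import csv
import io

def make_sink(backend, stream):
    """Return an emit(region, metric, value) callable for the chosen backend."""
    if backend == "csv":
        writer = csv.writer(stream)
        writer.writerow(["region", "metric", "value"])  # header row
        return lambda region, metric, value: writer.writerow([region, metric, value])
    # A real tool would add "mysql" and "flatfile" branches here.
    raise ValueError(f"unsupported backend: {backend}")

buf = io.StringIO()
emit = make_sink("csv", buf)
emit("solver_loop", "time_s", 0.42)   # one marked-up code region reporting a metric
lines = buf.getvalue().splitlines()   # header row plus one data row
```

Keeping the emit signature identical across backends is what lets the instrumented application code stay unchanged while the compilation flags swap the destination.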
Building Internet-Based Electronic Performance Support for Teaching and Learning.
ERIC Educational Resources Information Center
Laffey, James M.; Musser, Dale
The College of Education, University of Missouri-Columbia is developing and testing a suite of tools that utilize the Internet and work as a system to support learning from field experiences. These tools are built to support preservice teachers, field-based mentors, and college faculty as they collaborate, engage in practice, document their…
HPC Profiling with the Sun Studio™ Performance Tools
NASA Astrophysics Data System (ADS)
Itzkowitz, Marty; Maruyama, Yukon
In this paper, we describe how to use the Sun Studio Performance Tools to understand the nature and causes of application performance problems. We first explore CPU and memory performance problems for single-threaded applications, giving some simple examples. Then, we discuss multi-threaded performance issues, such as locking and false-sharing of cache lines, in each case showing how the tools can help. We go on to describe OpenMP applications and the support for them in the performance tools. Then we discuss MPI applications, and the techniques used to profile them. Finally, we present our conclusions.
ERIC Educational Resources Information Center
Oluwuo, S. O.; Enefaa, Bestman Briggs Anthonia
2016-01-01
The study investigated the application of education information management support tools in the promotion of teaching/learning and management of students' performance in federal universities in the South-South zone of Nigeria. Two research questions and two null hypotheses guided the study. The study adopted a descriptive survey design. The…
NASA Technical Reports Server (NTRS)
Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.
1992-01-01
Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.
NASA Astrophysics Data System (ADS)
Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.
2017-12-01
Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
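The regionalization analysis described, scoring each candidate grouping by how much of the variance in BMP performance it explains, can be sketched with a between-group over total sum-of-squares ratio. The data values and the "climate zone" grouping below are made up for illustration:

```python
from statistics import mean

def variance_explained(values, groups):
    """Fraction of variance explained by a grouping (between-group SS / total SS)."""
    grand = mean(values)
    total = sum((v - grand) ** 2 for v in values)
    by_group = {}
    for v, g in zip(values, groups):
        by_group.setdefault(g, []).append(v)
    between = sum(len(vs) * (mean(vs) - grand) ** 2 for vs in by_group.values())
    return between / total

# Toy pollutant-removal efficiencies for four BMP sites and a candidate grouping.
removal = [0.9, 0.8, 0.3, 0.2]
climate_zone = ["wet", "wet", "dry", "dry"]
r2 = variance_explained(removal, climate_zone)  # near 1: grouping explains most variance
```

Running such a score over each candidate regionalization of the national dataset and keeping the highest is one simple way to pick the scheme that best parameterizes the tool by region.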
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, and multi-faceted performance concerns, and to support both post-mortem performance analysis, to identify program features that contribute to problematic performance, and on-line performance analysis, to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible, since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on: the development of new techniques for measurement and analysis of performance on modern parallel architectures; enhancements to HPCToolkit's software infrastructure to support our research goals and use on sophisticated applications; engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs; engaging operating system developers with feature requests for enhanced monitoring support; engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms, including processors, accelerators, and networks; and finally, collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.
Bennett, Hunter; Davison, Kade; Arnold, John; Slattery, Flynn; Martin, Max; Norton, Kevin
2017-10-01
Multicomponent movement assessment tools have become commonplace to measure movement quality, proposing to indicate injury risk and performance capabilities. Despite popular use, there has been no attempt to compare the components of each tool reported in the literature, the processes in which they were developed, or the underpinning rationale for their included content. As such, the objective of this systematic review was to provide a comprehensive summary of current movement assessment tools and appraise the evidence supporting their development. A systematic literature search was performed using PRISMA guidelines to identify multicomponent movement assessment tools. Commonalities between tools and the evidence provided to support the content of each tool was identified. Each tool underwent critical appraisal to identify the rigor in which it was developed, and its applicability to professional practice. Eleven tools were identified, of which 5 provided evidence to support their content as assessments of movement quality. One assessment tool (Soccer Injury Movement Screen [SIMS]) received an overall score of above 65% on critical appraisal, with a further 2 tools (Movement Competency Screen [MCS] and modified 4 movement screen [M4-MS]) scoring above 60%. Only the MCS provided clear justification for its developmental process. The remaining 8 tools scored between 40 and 60%. On appraisal, the MCS, M4-MS, and SIMS seem to provide the most practical value for assessing movement quality as they provide the strongest reports of developmental rigor and an identifiable evidence base. In addition, considering the evidence provided, these tools may have the strongest potential for identifying performance capabilities and guiding exercise prescription in athletic and sport-specific populations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as "pipelines" of high-quality tool building blocks. These building blocks provide common performance tool functionality and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open-source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet flexible enough to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools matched to a machine architecture and to the performance problem that needs to be understood. Another benefit is that application engineers can use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools.
This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundational infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.
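The "pipelines of tool building blocks" idea described above can be sketched generically. All names below are illustrative, not the Open|SpeedShop API; the sketch only shows how small, reusable acquisition, aggregation, and presentation blocks compose into a tool:

```python
# Minimal sketch of a performance tool assembled from reusable
# building blocks, in the spirit of the pipeline design described
# above. All names are illustrative, not the Open|SpeedShop API.

def sample_collector(samples):
    """Acquisition block: yields raw (function, time_us) samples."""
    yield from samples

def aggregate(samples):
    """Aggregation block: sums time per function (an online reduction)."""
    totals = {}
    for func, t in samples:
        totals[func] = totals.get(func, 0) + t
    return totals

def top_n(totals, n=3):
    """Presentation block: ranks the hottest functions."""
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Assemble the blocks into a simple profiling pipeline.
raw = [("solve", 120), ("io", 30), ("solve", 200), ("comm", 55)]
report = top_n(aggregate(sample_collector(raw)))
```

Because each stage consumes and produces plain data, blocks can be swapped (e.g., a different aggregation policy) without touching the rest of the pipeline, which is the interoperability property the project emphasizes.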
Supporting Technology Integration within a Teacher Education System
ERIC Educational Resources Information Center
Schaffer, Scott P.; Richardson, Jennifer C.
2004-01-01
The purpose of this case study was to examine a teacher education system relative to the degree of performance support for the use of technology to support learning. Performance support was measured by the presence of factors such as clear expectations, feedback, tools, rewards, incentives, motivation, capacity, skills, and knowledge within the…
Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non-Uniform Memory Access (ccNUMA) architectures and the increasing disparity between memory and processor speeds, data locality overheads are becoming the greatest obstacle to realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, considerable time and effort are still needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source-code-level "what-if" modifications in a program to assist a user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
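Trace-driven cache simulation, the general technique CPMP builds on, can be illustrated with a minimal sketch. This is a plain direct-mapped cache model, not CPMP itself (which notably avoids materializing full address traces):

```python
# Minimal trace-driven cache simulator: a direct-mapped cache where an
# access hits if the tag stored for its cache line matches the
# address's tag. Illustrative only; real tools model associativity,
# coherence, and NUMA effects.

def simulate_cache(trace, line_size=64, num_lines=256):
    lines = [None] * num_lines          # tag stored per cache line
    hits = misses = 0
    for addr in trace:
        block = addr // line_size       # which memory block is touched
        index = block % num_lines       # cache line it maps to
        tag = block // num_lines
        if lines[index] == tag:
            hits += 1
        else:
            misses += 1
            lines[index] = tag          # fill the line on a miss
    return hits, misses

# Sequential 8-byte accesses: the first touch of each 64-byte line
# misses, and the remaining touches of that line hit.
trace = [i * 8 for i in range(64)]
hits, misses = simulate_cache(trace)
```

Feeding the simulator two traces, one per source-code variant, is the essence of the "what-if" comparison CPMP performs at a much higher level of sophistication.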
NASA Astrophysics Data System (ADS)
Spahr, K.; Hogue, T. S.
2016-12-01
Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision makers at all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will characterize the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experiences in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of each decision support tool will be performed. Gaps in each tool's analysis, such as the lack of certain critical functionalities, will be identified, and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.
A Performance Support Tool for Cisco Training Program Managers
ERIC Educational Resources Information Center
Benson, Angela D.; Bothra, Jashoda; Sharma, Priya
2004-01-01
Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…
Arcmancer: Geodesics and polarized radiative transfer library
NASA Astrophysics Data System (ADS)
Pihajoki, Pauli; Mannerkoski, Matias; Nättilä, Joonas; Johansson, Peter H.
2018-05-01
Arcmancer computes geodesics and performs polarized radiative transfer in user-specified spacetimes. The library supports Riemannian and semi-Riemannian spaces of any dimension and metric; it also supports multiple simultaneous coordinate charts, embedded geometric shapes, local coordinate systems, and automatic parallel propagation. Arcmancer can be used to solve various problems in numerical geometry, such as solving the curve equation of motion using adaptive integration with configurable tolerances, or integrating differential equations along precomputed curves. It also provides support for curves with an arbitrary acceleration term, as well as generic utilities for generating ray initial conditions and performing parallel computation over the image.
Behavioral Health Support of NASA Astronauts for International Space Station Missions
NASA Technical Reports Server (NTRS)
Sipes, Walter
2000-01-01
Two areas of focus for optimizing behavioral health and human performance during International Space Station missions are 1) sleep and circadian assessment and 2) behavioral medicine. The Mir experience provided the opportunity to examine the use and potential effectiveness of tools and procedures to support the behavioral health of the crew. The experience of NASA has shown that on-orbit performance can be better maintained if behavioral health, sleep, and circadian issues are effectively monitored and properly addressed. For example, schedules can be tailored based upon the fatigue level of crews and other behavioral and cognitive indicators to maximize performance. Previous research and experience with long duration missions have resulted in the development and upgrade of tools used to monitor fatigue, stress, cognitive function, and behavioral health. Self-assessment and objective tools such as the Spaceflight Cognitive Assessment Tool have been developed and refined to effectively address behavioral medicine countermeasures in space.
A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth
2005-03-15
The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to rewrite a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal of providing application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images.
The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
A survey on annotation tools for the biomedical literature.
Neves, Mariana; Leser, Ulf
2014-03-01
New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms, and for better understanding the information sought by means of examples. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts, and should generate an easy-to-parse output format. Today, a range of tools implementing some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experience whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a truly comprehensive solution.
Runtime Performance Monitoring Tool for RTEMS System Software
NASA Astrophysics Data System (ADS)
Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.
2007-08-01
RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used both during the software development phase and in in-orbit operation. Our implemented target agent is lightweight and incurs small overhead over the SpaceWire interface. Efforts to reduce overhead further and to add other monitoring parameters are currently under research.
An integrated modeling and design tool for advanced optical spacecraft
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1992-01-01
Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.
Mayer, Simone; Teufel, Martin; Schaeffeler, Norbert; Keim, Ulrike; Garbe, Claus; Eigentler, Thomas Kurt; Zipfel, Stephan; Forschner, Andrea
2017-09-01
Despite an increasing number of promising treatment options, only a limited number of studies concerning melanoma patients' psycho-oncological distress have been carried out. However, multiple screening tools are in use to assess the need for psycho-oncological support. This study aimed first to identify parameters in melanoma patients that are associated with a higher risk of psycho-oncological distress, and second to compare patients' self-evaluation of their need for psycho-oncological support with the results of established screening tools. We performed a cross-sectional study including 254 melanoma patients from the Center for Dermatooncology at the University of Tuebingen. The study was performed between June 2010 and February 2013. Several screening instruments were included: the Distress Thermometer (DT), the Hospital Anxiety and Depression Scale, and the patients' subjective evaluation of their need for psycho-oncological support. Binary logistic regression was performed to identify factors that indicate the need for psycho-oncological support. Patients' subjective evaluation of their need for psycho-oncological support, female gender, and psychotherapeutic or psychiatric treatment at present or in the past had the highest impact on values above the threshold of the DT. The odds ratio of patients' self-evaluation (9.89) was even higher than that of somatic factors such as female gender (1.85), duration of illness (0.99), or increasing age (0.97). Patients' self-evaluation of their need for psycho-oncological support showed a moderate correlation with the results of the screening tools included. In addition to the results obtained by screening tools such as the DT, we could demonstrate that patients' self-evaluation is an important instrument for identifying patients who need psycho-oncological support.
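For readers unfamiliar with how odds ratios like the 9.89 reported above arise, a binary logistic regression yields an odds ratio as the exponential of a fitted coefficient. The toy data and plain gradient-ascent fit below are only a sketch of the principle, not the study's actual analysis or software:

```python
import math

# Sketch: odds ratios from binary logistic regression are exp(coefficient).
# Toy data (illustrative, not the study's): x = 1 if the patient
# self-reports a need for support, y = 1 if the DT score is above
# threshold. With one binary predictor, the fitted odds ratio equals
# the sample odds ratio.

def fit_logistic(xs, ys, lr=0.5, steps=20000):
    """Fit log-odds(y) = b0 + b1*x by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p            # gradient w.r.t. intercept
            g1 += (y - p) * x      # gradient w.r.t. slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

xs = [1, 1, 1, 1, 0, 0, 0, 0]
ys = [1, 1, 1, 0, 1, 0, 0, 0]
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)   # > 1: self-reported need raises the odds of distress
```

In this toy 2x2 table the fit converges toward the sample odds ratio (3·3)/(1·1) = 9; an odds ratio near 1, like the study's 0.99 for duration of illness, indicates essentially no effect.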
Managing personal health information in distributed research network environments.
Bredfeldt, Christine E; Butani, Amy L; Pardee, Roy; Hitz, Paul; Padmanabhan, Sandy; Saylor, Gwyn
2013-10-08
Studying rare outcomes, new interventions and diverse populations often requires collaborations across multiple health research partners. However, transferring healthcare research data from one institution to another can increase the risk of data privacy and security breaches. A working group of multi-site research programmers evaluated the need for tools to support data security and data privacy. The group determined that data privacy support tools should: 1) allow for a range of allowable Protected Health Information (PHI); 2) clearly identify what type of data should be protected under the Health Insurance Portability and Accountability Act (HIPAA); and 3) help analysts identify which protected health information data elements are allowable in a given project and how they should be protected during data transfer. Based on these requirements we developed two performance support tools to support data programmers and site analysts in exchanging research data. The first tool, a workplan template, guides the lead programmer through effectively communicating the details of multi-site programming, including how to run the program, what output the program will create, and whether the output is expected to contain protected health information. The second performance support tool is a checklist that site analysts can use to ensure that multi-site program output conforms to expectations and does not contain protected health information beyond what is allowed under the multi-site research agreements. Together the two tools create a formal multi-site programming workflow designed to reduce the chance of accidental PHI disclosure.
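The checklist idea described above can be sketched as a simple allow-list check over output columns. The element names and the permitted set below are illustrative, not the working group's actual lists:

```python
# Sketch of the site-analyst checklist as an allow-list check: before
# transferring multi-site program output, flag any PHI columns not
# covered by the project's data-use agreement. Element names are
# illustrative, not an authoritative HIPAA enumeration.

HIPAA_PHI_ELEMENTS = {"name", "birth_date", "ssn", "zip_code", "mrn"}

def check_output_columns(columns, allowed_phi):
    """Return PHI columns present in the output but NOT permitted."""
    phi_present = set(columns) & HIPAA_PHI_ELEMENTS
    return sorted(phi_present - set(allowed_phi))

# Hypothetical project agreement permitting birth_date only:
violations = check_output_columns(
    ["patient_id", "birth_date", "ssn", "bmi"],
    allowed_phi={"birth_date"},
)
```

An empty result means the output conforms to expectations; a non-empty one tells the analyst exactly which columns to remove before transfer.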
Parallelization of NAS Benchmarks for Shared Memory Multiprocessors
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)
1998-01-01
This paper presents our experience parallelizing the sequential implementation of the NAS benchmarks using compiler directives on an SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow users to exploit parallelism. Native compilers on the SGI Origin2000 support multiprocessing directives that allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing the sequential implementation of the NAS benchmarks. Results reported in this paper indicate that, with minimal effort, the performance gain is comparable with that of the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.
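The loop-level parallelism that such directives expose can be sketched with a shared-memory thread pool standing in for SGI's multiprocessing directives (illustrative only; the paper used compiler directives on Fortran/C code, and in CPython the GIL limits speedup for pure-Python loop bodies):

```python
from concurrent.futures import ThreadPoolExecutor

# Directive-based parallelization marks a loop whose iterations are
# independent so they may run concurrently on a shared-memory machine.
# A rough Python analogue of "parallel for" over independent iterations:

def body(i):
    """One loop iteration: independent work on index i."""
    return i * i

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(body, range(10)))  # iterations run concurrently
```

The key property, as with the compiler directives, is that no iteration reads data written by another, so the runtime is free to schedule them across processors in any order.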
Performance Support Tools for Space Medical Operations
NASA Technical Reports Server (NTRS)
Byrne, Vicky E.; Schmidt, Josef; Barshi, Immanuel
2009-01-01
The early Constellation space missions are expected to have medical capabilities very similar to those currently on the Space Shuttle and International Space Station (ISS). For Crew Exploration Vehicle (CEV) missions to ISS, medical equipment will be located on ISS and carried into the CEV in the event of an emergency. Flight Surgeons (FS) on the ground in Mission Control will be expected to direct the Crew Medical Officer (CMO) during medical situations. If there is a loss of signal and the crew is unable to communicate with the ground, a CMO would be expected to carry out medical procedures without the aid of a FS. In these situations, performance support tools can be used to reduce errors and the time needed to perform emergency medical tasks. Human factors personnel at Johnson Space Center have recently investigated medical performance support tools for CMOs on-orbit and FSs on the ground. This research involved the feasibility of Just-in-Time (JIT) training techniques and concepts for real-time medical procedures. In Phase 1, preliminary feasibility data were gathered for two types of prototype display technologies: a hand-held PDA and a Head Mounted Display (HMD). The PDA and HMD were compared while performing a simulated medical procedure using ISS flight-like medical equipment. Based on the outcome of Phase 1, including data on user preferences, further testing was completed using the PDA only. Phase 2 explored a wrist-mounted PDA and compared it to a paper cue card. For each phase, time to complete procedures, errors, and user satisfaction were captured. Information needed by the FS during ISS mission support, especially in an emergency situation (e.g., fire onboard ISS), may be located in many different places around the FS's console. A performance support tool prototype is being developed to address this issue by bringing all of the relevant information together in one place.
The tool is designed to include procedures and other information needed by a FS during an emergency, as well as procedures and information to be used after the emergency is resolved. Several walkthroughs of the prototype with FSs have been completed within a mockup of an ISS FS console. Feedback on the current tool design as well as recommendations for existing ISS FS displays were captured.
The SEA of the Future: Leveraging Performance Management to Support School Improvement. Volume 1
ERIC Educational Resources Information Center
Gross, Betheny, Ed.; Jochim, Ashley, Ed.
2013-01-01
"The SEA of the Future" is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This inaugural edition of…
Regulation of Tool-Use within a Blended Course: Student Differences and Performance Effects
ERIC Educational Resources Information Center
Lust, Griet; Elen, Jan; Clarebout, Geraldine
2013-01-01
Given the rising popularity of content management systems (CMSs) in higher education, the current study investigates how students use tools in CMS supported courses. More specifically, the current study investigates how students regulate their tool-use throughout the course period by considering the moment tools are used. This temporal dimension…
ERIC Educational Resources Information Center
Mass Insight Education (NJ1), 2011
2011-01-01
The District Self-Assessment Tool is designed to support districts, unions and Lead Partners, when analyzing their existing collective bargaining agreement (CBA) with the intention of making targeted modifications to support the implementation of dramatic reform in the district's lowest-performing schools. By outlining objectives and suggested…
NASA Astrophysics Data System (ADS)
Le, Anh H.; Deshpande, Ruchi; Liu, Brent J.
2010-03-01
An electronic patient record (ePR) has been developed for prostate cancer patients treated with proton therapy (PT). The ePR has functionality to accept digital input from patient data, perform outcome analysis and patient and physician profiling, provide clinical decision support and suggest courses of treatment, and distribute information across different platforms and health information systems. In previous years, we have presented the infrastructure of a medical imaging informatics based ePR for PT with functionality to accept digital patient information and distribute this information across geographical locations using Internet protocols. In this paper, we present the ePR decision support tools, which utilize the image processing tools and data collected in the ePR. Two decision support tools, a treatment plan navigator and a radiation toxicity tool, are presented for evaluating prostate cancer treatment, improving proton therapy operation, and improving treatment outcomes analysis.
Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis
NASA Technical Reports Server (NTRS)
Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige
2005-01-01
We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.
Development of transportation asset management decision support tools : final report.
DOT National Transportation Integrated Search
2017-08-09
This study developed a web-based prototype decision support platform to demonstrate the benefits of transportation asset management in monitoring asset performance, supporting asset funding decisions, planning budget tradeoffs, and optimizing resourc...
Performance Assessment as a Diagnostic Tool for Science Teachers
NASA Astrophysics Data System (ADS)
Kruit, Patricia; Oostdam, Ron; van den Berg, Ed; Schuitema, Jaap
2018-04-01
Information on students' development of science skills is essential for teachers to evaluate and improve their own education, as well as to provide adequate support and feedback to the learning process of individual students. The present study explores and discusses the use of performance assessments as a diagnostic tool for formative assessment to inform teachers and guide instruction of science skills in primary education. Three performance assessments were administered to more than 400 students in grades 5 and 6 of primary education. Students performed small experiments using real materials while following the different steps of the empirical cycle. The mutual relationship between the three performance assessments is examined to provide evidence for the value of performance assessments as useful tools for formative evaluation. Differences in response patterns are discussed, and the diagnostic value of performance assessments is illustrated with examples of individual student performances. Findings show that the performance assessments were difficult for grades 5 and 6 students but that much individual variation exists regarding the different steps of the empirical cycle. Evaluation of scores as well as a more substantive analysis of students' responses provided insight into typical errors that students make. It is concluded that performance assessments can be used as a diagnostic tool for monitoring students' skill performance as well as to support teachers in evaluating and improving their science lessons.
DOT National Transportation Integrated Search
2008-12-15
Intelligent Transportation Systems (ITS) planning requires the use of tools to assess the performance of ITS deployment alternatives relative to each other and to other types of transportation system improvement alternatives. This research project in...
Implementation of TAMSIM and EROW right-of-way acquisition decision-support tools.
DOT National Transportation Integrated Search
2011-04-01
An implementation project was performed to initiate use of the TAMSIM and EROW tools in region offices and the Right of Way (ROW) Division. The research team worked with Texas Department of Transportation regional ROW staffs to apply both tools to a ...
Tools to manage the enterprise-wide picture archiving and communications system environment.
Lannum, L M; Gumpf, S; Piraino, D
2001-06-01
The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station has allowed our organization to create a service support structure that gives us proactive control of our environment and lets us meet the service-level performance expectations of users. The Radiology Help Desk uses the MagicWatch PACS monitoring station as an applications support tool to monitor network activity and individual system performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment enables the group to proactively identify possible performance issues and resolve problems. The PACS/operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for complete automation of the software distribution, installation, and configuration process across all nodes in the system. The tool has enabled the standardization of workstations and provides central configuration control for establishing and maintaining system standards. This report will describe PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network- and system-monitoring tool.
Performance Measurement, Visualization and Modeling of Parallel and Distributed Programs
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Sarukkai, Sekhar R.; Mehra, Pankaj; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
This paper presents a methodology for debugging the performance of message-passing programs on both tightly coupled and loosely coupled distributed-memory machines. The AIMS (Automated Instrumentation and Monitoring System) toolkit, a suite of software tools for measurement and analysis of performance, is introduced and its application illustrated using several benchmark programs drawn from the field of computational fluid dynamics. AIMS includes (i) Xinstrument, a powerful source-code instrumentor, which supports both Fortran77 and C as well as a number of different message-passing libraries including Intel's NX, Thinking Machines' CMMD, and PVM; (ii) Monitor, a library of timestamping and trace-collection routines that run on supercomputers (such as Intel's iPSC/860, Delta, and Paragon and Thinking Machines' CM5) as well as on networks of workstations (including Convex Cluster and SparcStations connected by a LAN); (iii) Visualization Kernel, a trace-animation facility that supports source-code clickback, simultaneous visualization of computation and communication patterns, as well as analysis of data movements; (iv) Statistics Kernel, an advanced profiling facility that associates a variety of performance data with various syntactic components of a parallel program; (v) Index Kernel, a diagnostic tool that helps pinpoint performance bottlenecks through the use of abstract indices; (vi) Modeling Kernel, a facility for automated modeling of message-passing programs that supports both simulation-based and analytical approaches to performance prediction and scalability analysis; (vii) Intrusion Compensator, a utility for recovering true performance from observed performance by removing the overheads of monitoring and their effects on the communication pattern of the program; and (viii) Compatibility Tools, which convert AIMS-generated traces into formats used by other performance-visualization tools, such as ParaGraph, Pablo, and certain AVS/Explorer modules.
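The timestamping-and-trace idea behind a monitor library like the one described above can be sketched in a few lines. The names and the decorator approach here are illustrative assumptions, not AIMS's actual API:

```python
import time
from functools import wraps

# Hypothetical sketch of a trace-collection wrapper in the spirit of a
# monitor library: each instrumented call appends a
# (event, begin/end, timestamp) record to a trace buffer.
trace = []

def instrument(event_name):
    """Decorator that records begin/end timestamps for a call."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            trace.append((event_name, "begin", time.perf_counter()))
            try:
                return func(*args, **kwargs)
            finally:
                trace.append((event_name, "end", time.perf_counter()))
        return wrapper
    return decorator

@instrument("compute")
def compute(n):
    return sum(i * i for i in range(n))

compute(1000)

# Elapsed time for an event is recovered from the paired records,
# analogous to post-processing a trace file for profiling.
begin = next(t for name, kind, t in trace if kind == "begin")
end = next(t for name, kind, t in trace if kind == "end")
elapsed = end - begin
```

A real instrumentor works at the source-code level and also records message-passing events, but the begin/end record pairing is the same basic mechanism.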
Investigating Analytic Tools for e-Book Design in Early Literacy Learning
ERIC Educational Resources Information Center
Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah
2009-01-01
Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…
ERIC Educational Resources Information Center
Gallardo, Matilde; Heiser, Sarah; Arias McLaughlin, Ximena
2015-01-01
In modern language (ML) distance learning programmes, teachers and students use online tools to facilitate, reinforce and support independent learning. This makes it essential for teachers to develop pedagogical expertise in using online communication tools to perform their role. Teachers frequently raise questions of how best to support the needs…
Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
ERIC Educational Resources Information Center
Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.
2011-01-01
This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…
NASA Technical Reports Server (NTRS)
Stohlgren, Tom; Schnase, John; Morisette, Jeffrey; Most, Neal; Sheffner, Ed; Hutchinson, Charles; Drake, Sam; Van Leeuwen, Willem; Kaupp, Verne
2005-01-01
The National Institute of Invasive Species Science (NIISS), through collaboration with NASA's Goddard Space Flight Center (GSFC), recently began incorporating NASA observations and predictive modeling tools to fulfill its mission. These enhancements, labeled collectively as the Invasive Species Forecasting System (ISFS), are now in place in the NIISS in their initial state (V1.0). The ISFS is the primary decision support tool of the NIISS for the management and control of invasive species on Department of Interior and adjacent lands. The ISFS is the backbone for a unique information services line-of-business for the NIISS, and it provides the means for delivering advanced decision support capabilities to a wide range of management applications. This report describes the operational characteristics of the ISFS, a decision support tool of the United States Geological Survey (USGS). Recent enhancements to the performance of the ISFS, attained through the integration of observations, models, and systems engineering from NASA, are benchmarked; i.e., described quantitatively and evaluated in relation to the performance of the USGS system before incorporation of the NASA enhancements. This report benchmarks Version 1.0 of the ISFS.
Sedig, Kamran; Parsons, Paul; Dittmer, Mark; Ola, Oluwakemi
2012-01-01
Public health professionals work with a variety of information sources to carry out their everyday activities. In recent years, interactive computational tools have become deeply embedded in such activities. Unlike the early days of computational tool use, the potential of tools nowadays is not limited to simply providing access to information; rather, they can act as powerful mediators of human-information discourse, enabling rich interaction with public health information. If public health informatics tools are designed and used properly, they can facilitate, enhance, and support the performance of complex cognitive activities that are essential to public health informatics, such as problem solving, forecasting, sense-making, and planning. However, the effective design and evaluation of public health informatics tools requires an understanding of the cognitive and perceptual issues pertaining to how humans work and think with information to perform such activities. This paper draws on research that has examined some of the relevant issues, including interaction design, complex cognition, and visual representations, to offer some human-centered design and evaluation considerations for public health informatics tools.
Human Factors in Training: Space Medical Proficiency Training
NASA Technical Reports Server (NTRS)
Byrne, Vicky E.; Barshi, I.; Arsintescu, L.; Connell, E.
2010-01-01
The early Constellation space missions are expected to have medical capabilities very similar to those currently on the Space Shuttle and the International Space Station (ISS). For Crew Exploration Vehicle (CEV) missions to the ISS, medical equipment will be located on the ISS, and carried into the CEV in the event of an emergency. Flight surgeons (FS) on the ground in Mission Control will be expected to direct the crew medical officer (CMO) during medical situations. If there is a loss of signal and the crew is unable to communicate with the ground, a CMO would be expected to carry out medical procedures without the aid of a FS. In these situations, performance support tools can be used to reduce errors and time to perform emergency medical tasks. The space medical training work is part of the Human Factors in Training Directed Research Project (DRP) of the Space Human Factors Engineering (SHFE) Project under the Space Human Factors and Habitability (SHFH) Element of the Human Research Program (HRP). This is a joint project consisting of human factors teams from the Ames Research Center (ARC), with Immanuel Barshi as Principal Investigator, and the Johnson Space Center (JSC). Human factors researchers at JSC have recently investigated medical performance support tools for CMOs on-orbit and FSs on the ground, and researchers at the Ames Research Center performed a literature review on medical errors. Work on medical training has been conducted in collaboration with the Medical Training Group at the Johnson Space Center (JSC) and with Wyle Laboratories, which provides medical training to crew members, biomedical engineers (BMEs), and flight surgeons under the Bioastronautics contract. One area of research, building on activities from FY08, involved the feasibility of just-in-time (JIT) training techniques and concepts for real-time medical procedures. A second area of research involves FS performance support tools.
Information needed by the FS during ISS mission support, especially for an emergency situation (e.g., fire onboard the ISS), may be located in many different places around the FS's console. A performance support tool prototype is being developed to address this issue by bringing all of the relevant information together in one place. The tool is designed to include procedures and other information needed by a FS during an emergency, as well as procedures and information to be used after the emergency is resolved. Several walkthroughs of the prototype with FSs have been completed within a mockup of an ISS FS console. Feedback on the current tool design as well as recommendations for existing ISS FS displays were captured. The tool could have different uses depending on the situation and the skill of the user. An experienced flight surgeon could use it during an emergency situation as a decision and performance support tool, whereas a new flight surgeon could use it as JITT, or as part of his/her regular training. The work proposed for FY10 continues to build on this strong collaboration with the Space Medical Training Group and previous research.
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
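The core measured-versus-simulated comparison that such a tool automates can be illustrated with a minimal sketch. The data, the function name, and the 20% tolerance below are invented for illustration and are not taken from SEE IT:

```python
# Illustrative sketch (not the actual SEE IT implementation) of the core
# comparison step: align measured and simulated hourly energy data and
# flag hours where they diverge beyond a relative tolerance.
measured  = [12.0, 14.5, 13.2, 30.1, 12.8]  # kWh per hour (synthetic)
simulated = [12.5, 14.0, 13.0, 15.2, 13.1]

def flag_deviations(measured, simulated, rel_tol=0.20):
    """Return indices where measured deviates from simulated by more
    than rel_tol (relative to the simulated value)."""
    flagged = []
    for i, (m, s) in enumerate(zip(measured, simulated)):
        if s != 0 and abs(m - s) / abs(s) > rel_tol:
            flagged.append(i)
    return flagged

problem_hours = flag_deviations(measured, simulated)
# Hour 3 (30.1 measured vs 15.2 simulated) is flagged as a candidate
# performance problem worth investigating.
```

In the tool described above, this kind of comparison is presented graphically (time series, scatter, and carpet plots) rather than as flagged indices.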
Software Tools to Support the Assessment of System Health
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2013-01-01
This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements and constraints.
S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the Launch Vehicle process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
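As a rough illustration of merit-function-driven sensor selection in the spirit of S4: the fault table, the costs, and the exhaustive search below are invented assumptions for a toy problem, not S4's actual algorithm or data.

```python
from itertools import combinations

# Toy sensor-selection problem: each candidate sensor detects a set of
# faults and has a cost; find the cheapest suite with maximal fault
# coverage under a cost budget. All values are synthetic.
detects = {
    "s1": {"f1", "f2"},
    "s2": {"f2", "f3"},
    "s3": {"f3"},
    "s4": {"f1", "f3"},
}
cost = {"s1": 3.0, "s2": 2.0, "s3": 1.0, "s4": 2.5}
faults = {"f1", "f2", "f3"}

def best_suite(max_cost):
    """Exhaustively evaluate sensor subsets (the merit function here is
    simply fault coverage, with cost as a tiebreaker)."""
    best = (set(), 0, float("inf"))  # (suite, coverage, cost)
    for r in range(1, len(detects) + 1):
        for suite in combinations(detects, r):
            c = sum(cost[s] for s in suite)
            if c > max_cost:
                continue
            covered = len(set().union(*(detects[s] for s in suite)) & faults)
            if covered > best[1] or (covered == best[1] and c < best[2]):
                best = (set(suite), covered, c)
    return best

suite, coverage, total_cost = best_suite(max_cost=5.0)
# {s1, s3} covers all three faults at cost 4.0, beating the pricier
# full-coverage alternatives {s1, s2} and {s2, s4}.
```

A production tool would replace the exhaustive enumeration with one of several optimization approaches, as the abstract notes, since the subset space grows exponentially with the number of candidate sensors.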
30 CFR 57.3202 - Scaling tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Support-Surface and Underground § 57.3202 Scaling tools. Where manual scaling is performed, a scaling bar shall be provided. This bar shall be of a length and design that will allow the removal of loose...
Access to Teacher Evaluations Divides Advocates
ERIC Educational Resources Information Center
Sawchuk, Stephen
2012-01-01
As the movement to overhaul teacher evaluation marches onward, an emerging question is splitting the swath of advocates who support the new tools used to gauge teacher performance: Who should get access to the resulting information? Supporters of tying teacher evaluations to student performance differ over whether individuals' results should be…
Performance Support Tools for Space Medical Operations
NASA Technical Reports Server (NTRS)
Byrne, Vicky; Schmid, Josef; Barshi, Immanuel
2010-01-01
Early Constellation space missions are expected to have medical capabilities similar to those currently on board the Space Shuttle and International Space Station (ISS). Flight surgeons on the ground in Mission Control will direct the Crew Medical Officer (CMO) during medical situations. If the crew is unable to communicate with the ground, the CMO will carry out medical procedures without the aid of a flight surgeon. In these situations, use of performance support tools can reduce errors and time to perform emergency medical tasks. The research presented here is part of the Human Factors in Training Directed Research Project of the Space Human Factors Engineering Project under the Space Human Factors and Habitability Element of the Human Research Program. This is a joint project consisting of human factors teams from the Johnson Space Center (JSC) and the Ames Research Center (ARC). Work on medical training has been conducted in collaboration with the Medical Training Group at JSC and with Wyle that provides medical training to crew members, biomedical engineers (BMEs), and flight surgeons under the Bioastronautics contract. Human factors personnel at Johnson Space Center have investigated medical performance support tools for CMOs and flight surgeons.
A Decision Support Prototype Tool for Predicting Student Performance in an ODL Environment
ERIC Educational Resources Information Center
Kotsiantis, S. B.; Pintelas, P. E.
2004-01-01
Machine Learning algorithms fed with data sets which include information such as attendance data, test scores and other student information can provide tutors with powerful tools for decision-making. Until now, much of the research has been limited to the relation between single variables and student performance. Combining multiple variables as…
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
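As a minimal sketch of the forecasting component such a BA tool might include: the weekly scan counts and the simple moving-average model below are illustrative assumptions; a production tool would use richer predictive models.

```python
# Synthetic weekly radiology throughput (scan counts); a simple moving
# average forecasts next week's volume for resource-planning purposes.
weekly_scans = [410, 395, 430, 445, 420, 460, 455, 470]

def moving_average_forecast(series, window=4):
    """Forecast the next value as the mean of the last `window` points."""
    recent = series[-window:]
    return sum(recent) / len(recent)

next_week = moving_average_forecast(weekly_scans)
# (420 + 460 + 455 + 470) / 4 = 451.25 expected scans
```

Even this crude baseline illustrates the "all time horizons" idea: historical data drive a forward-looking estimate that managers can act on.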
ERIC Educational Resources Information Center
Beausaert, Simon A. J.; Segers, Mien S. R.; Gijselaers, Wim H.
2011-01-01
Today, organizations are increasingly implementing assessment tools such as Personal Development Plans. Although the true power of the tool lies in supporting the employee's continuing professional development, organizations implement the tool for various different purposes, professional development purposes on the one hand and promotion/salary…
DOT National Transportation Integrated Search
2018-02-02
The objective of this study is to develop an evidence-based research implementation database and tool to support research implementation at the Georgia Department of Transportation (GDOT). A review was conducted drawing from the (1) implementati...
Report Central: quality reporting tool in an electronic health record.
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H; Middleton, Blackford; Einbinder, Jonathan S
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow.
360-degree physician performance assessment.
Dubinsky, Isser; Jennings, Kelly; Greengarten, Moshe; Brans, Amy
2010-01-01
Few jurisdictions have a robust common approach to assessing the quantitative and qualitative dimensions of physician performance. In this article, we examine the need for 360-degree physician performance assessment and review the literature supporting comprehensive physician assessment. An evidence-based, "best practice" approach to the development of a 360-degree physician performance assessment framework is presented, including an overview of a tool kit to support implementation. The focus of the framework is to support physician career planning and to enhance the quality of patient care. Finally, the legal considerations related to implementing 360-degree physician performance assessment are explored.
Code of Federal Regulations, 2014 CFR
2014-10-01
... System (CPARS) and Past Performance Information Retrieval System (PPIRS) metric tools to measure the... CONTRACT ADMINISTRATION AND AUDIT SERVICES Contractor Performance Information 42.1501 General. (a) Past performance information (including the ratings and supporting narratives) is relevant information, for future...
Code of Federal Regulations, 2013 CFR
2013-10-01
... Reporting System (CPARS) and Past Performance Information Retrieval System (PPIRS) metric tools to measure... CONTRACT ADMINISTRATION AND AUDIT SERVICES Contractor Performance Information 42.1501 General. (a) Past performance information (including the ratings and supporting narratives) is relevant information, for future...
Greenhouse Gas Mitigation Options Database (GMOD) and Tool
Greenhouse Gas Mitigation Options Database (GMOD) is a decision support database and tool that provides cost and performance information for GHG mitigation options for the power, cement, refinery, landfill and pulp and paper sectors. The GMOD includes approximately 450 studies fo...
Ergonomics action research II: a framework for integrating HF into work system design.
Neumann, W P; Village, J
2012-01-01
This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process-focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.
Benndorf, Matthias; Kotter, Elmar; Langer, Mathias; Herda, Christoph; Wu, Yirong; Burnside, Elizabeth S
2015-06-01
To develop and validate a decision support tool for mammographic mass lesions based on a standardized descriptor terminology (BI-RADS lexicon) to reduce variability of practice. We used separate training data (1,276 lesions, 138 malignant) and validation data (1,177 lesions, 175 malignant). We created naïve Bayes (NB) classifiers from the training data with tenfold cross-validation. Our "inclusive model" comprised BI-RADS categories, BI-RADS descriptors, and age as predictive variables; our "descriptor model" comprised BI-RADS descriptors and age. The resulting NB classifiers were applied to the validation data. We evaluated and compared classifier performance with ROC analysis. In the training data, the inclusive model yields an AUC of 0.959; the descriptor model yields an AUC of 0.910 (P < 0.001). The inclusive model is superior to the clinical performance (BI-RADS categories alone, P < 0.001); the descriptor model performs similarly. When applied to the validation data, the inclusive model yields an AUC of 0.935; the descriptor model yields an AUC of 0.876 (P < 0.001). Again, the inclusive model is superior to the clinical performance (P < 0.001); the descriptor model performs similarly. We consider our classifier a step towards a more uniform interpretation of combinations of BI-RADS descriptors. We provide our classifier at www.ebm-radiology.com/nbmm/index.html. • We provide a decision support tool for mammographic masses at www.ebm-radiology.com/nbmm/index.html. • Our tool may reduce variability of practice in BI-RADS category assignment. • A formal analysis of BI-RADS descriptors may enhance radiologists' diagnostic performance.
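The descriptor-model idea, predicting malignancy from categorical descriptors with a naive Bayes classifier, can be sketched as follows. The descriptor values, training cases, and add-one smoothing below are illustrative, not the authors' data or code:

```python
from collections import defaultdict

# Minimal categorical naive Bayes sketch: estimate P(malignant | case)
# from categorical descriptors. Cases are synthetic, not BI-RADS data.
train = [
    ({"shape": "irregular", "margin": "spiculated"},   1),  # malignant
    ({"shape": "irregular", "margin": "indistinct"},   1),
    ({"shape": "oval",      "margin": "circumscribed"}, 0),  # benign
    ({"shape": "round",     "margin": "circumscribed"}, 0),
    ({"shape": "oval",      "margin": "indistinct"},    0),
]

def fit(train):
    """Count class priors and per-class feature-value frequencies."""
    priors = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))  # (cls, feat) -> value -> n
    values = defaultdict(set)                       # feat -> observed values
    for features, cls in train:
        priors[cls] += 1
        for feat, val in features.items():
            counts[(cls, feat)][val] += 1
            values[feat].add(val)
    return priors, counts, values

def posterior_malignant(case, priors, counts, values):
    """P(class=1 | case) with Laplace (add-one) smoothing."""
    total = sum(priors.values())
    score = {}
    for cls in priors:
        p = priors[cls] / total
        for feat, val in case.items():
            n_val = counts[(cls, feat)][val] + 1
            n_cls = priors[cls] + len(values[feat])
            p *= n_val / n_cls
        score[cls] = p
    return score[1] / (score[0] + score[1])

model = fit(train)
p_susp = posterior_malignant({"shape": "irregular", "margin": "spiculated"}, *model)
p_ben  = posterior_malignant({"shape": "oval", "margin": "circumscribed"}, *model)
# The suspicious-looking case scores well above the benign-looking one.
```

The published classifier also incorporates age and, in the inclusive model, the assigned BI-RADS category, and its performance is evaluated with ROC analysis on held-out data rather than on the training cases.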
A concept for performance management for Federal science programs
Whalen, Kevin G.
2017-11-06
The demonstration of clear linkages between planning, funding, outcomes, and performance management has created unique challenges for U.S. Federal science programs. An approach is presented here that characterizes science program strategic objectives by one of five “activity types”: (1) knowledge discovery, (2) knowledge development and delivery, (3) science support, (4) inventory and monitoring, and (5) knowledge synthesis and assessment. The activity types relate to performance measurement tools for tracking outcomes of research funded under the objective. The result is a multi-time scale, integrated performance measure that tracks individual performance metrics synthetically while also measuring progress toward long-term outcomes. Tracking performance on individual metrics provides explicit linkages to root causes of potentially suboptimal performance and captures both internal and external program drivers, such as customer relations and science support for managers. Functionally connecting strategic planning objectives with performance measurement tools is a practical approach for publicly funded science agencies that links planning, outcomes, and performance management—an enterprise that has created unique challenges for public-sector research and development programs.
Verification and Validation of NASA-Supported Enhancements to Decision Support Tools of PECAD
NASA Technical Reports Server (NTRS)
Ross, Kenton W.; McKellip, Rodney; Moore, Roxzana F.; Fendley, Debbie
2005-01-01
This section of the evaluation report summarizes the verification and validation (V&V) of recently implemented, NASA-supported enhancements to the decision support tools of the Production Estimates and Crop Assessment Division (PECAD). The implemented enhancements include operationally tailored Moderate Resolution Imaging Spectroradiometer (MODIS) products and products of the Global Reservoir and Lake Monitor (GRLM). The MODIS products are currently made available through two separate decision support tools: the MODIS Image Gallery and the U.S. Department of Agriculture (USDA) Foreign Agricultural Service (FAS) MODIS Normalized Difference Vegetation Index (NDVI) Database. Both the Global Reservoir and Lake Monitor and MODIS Image Gallery provide near-real-time products through PECAD's CropExplorer. This discussion addresses two areas: 1. Assessments of the standard NASA products on which these enhancements are based. 2. Characterizations of the performance of the new operational products.
Performance Analysis of and Tool Support for Transactional Memory on BG/Q
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schindewolf, M
2011-12-08
Martin Schindewolf worked during his internship at the Lawrence Livermore National Laboratory (LLNL) under the guidance of Martin Schulz at the Computer Science Group of the Center for Applied Scientific Computing. We studied the performance of the TM subsystem of BG/Q as well as researched the possibilities for tool support for TM. To study the performance, we ran CLOMP-TM, a benchmark designed to quantify the overhead of OpenMP and compare different synchronization primitives. To advance CLOMP-TM, we added Message Passing Interface (MPI) routines for a hybrid parallelization. This makes it possible to run multiple MPI tasks, each running OpenMP, on one node. With these enhancements, a beneficial MPI task to OpenMP thread ratio is determined. Further, the synchronization primitives are ranked as a function of the application characteristics. To demonstrate the usefulness of these results, we investigated a real Monte Carlo simulation called the Monte Carlo Benchmark (MCB). Applying the lessons learned yields the best task to thread ratio. Further, we were able to tune the synchronization by transactifying the MCB. We also developed tools that capture the performance of the TM run time system and present it to the application's developer. The performance of the TM run time system relies on the built-in statistics. These tools use the Blue Gene Performance Monitoring (BGPM) interface to correlate the statistics from the TM run time system with performance counter values. This combination provides detailed insight into the run time behavior of the application and makes it possible to track down the cause of degraded performance. In addition, one tool has been implemented that separates the performance counters into three categories: Successful Speculation, Unsuccessful Speculation, and No Speculation. All of the tools are crafted around IBM's xlc compiler for C and C++ and have been run and tested on a Q32 early access system.
Report Central: Quality Reporting Tool in an Electronic Health Record
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S.; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H.; Middleton, Blackford; Einbinder, Jonathan S.
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow. PMID:17238590
Rovira, Ericka; Cross, Austin; Leitch, Evan; Bonaceto, Craig
2014-09-01
The impact of a decision support tool designed to embed contextual mission factors was investigated. Contextual information may enable operators to infer the appropriateness of the data underlying the automation's algorithm. Research has shown the costs of imperfect automation are more detrimental than those of perfectly reliable automation when operators are provided with decision support tools. Operators may trust and rely on the automation more appropriately if they understand the automation's algorithm. The need to develop decision support tools that are understandable to the operator provides the rationale for the current experiment. A total of 17 participants performed a simulated task involving rapid retasking of intelligence, surveillance, and reconnaissance (ISR) assets under manual control, decision automation, or contextual decision automation, at two levels of task demand: low or high. Automation reliability was set at 80%, resulting in participants experiencing a mixture of reliable and automation-failure trials. Dependent variables included ISR coverage and response time for replanning routes. Reliable automation significantly improved ISR coverage when compared with manual performance. Although performance suffered under imperfect automation, contextual decision automation helped to reduce some of the decrements in performance. Contextual information helps overcome the costs of imperfect decision automation. Designers may mitigate some of the performance decrements experienced with imperfect automation by providing operators with interfaces that display contextual information, that is, the state of factors that affect the reliability of the automation's recommendation.
National Energy Audit Tool for Multifamily Buildings Development Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malhotra, Mini; MacDonald, Michael; Accawi, Gina K
The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit tool, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Further development is expected to be needed if more capabilities are to be added. A rough schedule for development of the version 1 tool is presented. The components and capabilities described in this plan will serve as the starting point for development of the proposed new multifamily energy audit tool for WAP.
Woods, Cindy; Carlisle, Karen; Larkins, Sarah; Thompson, Sandra Claire; Tsey, Komla; Matthews, Veronica; Bailie, Ross
2017-01-01
Continuous Quality Improvement is a process for raising the quality of primary health care (PHC) across Indigenous PHC services. In addition to clinical auditing using plan, do, study, and act cycles, engaging staff in a process of reflecting on the systems that support quality care is vital. The One21seventy Systems Assessment Tool (SAT) supports staff in assessing systems performance in terms of five key components. This study examines quantitative and qualitative SAT data from five high-improving Indigenous PHC services in northern Australia to understand the systems used to support quality care. High-improving services were selected for the study by calculating quality of care indices for Indigenous health services participating in the Audit and Best Practice in Chronic Disease National Research Partnership; services that reported continuing high improvement in quality of care delivered across two or more audit tools in three or more audits were selected. Precollected SAT data (from annual team SAT meetings) are presented longitudinally using radar plots of quantitative scores for each component, and content analysis is used to describe strengths and weaknesses of performance in each systems component. High-improving services were able to demonstrate strong processes for assessing system performance and consistent improvement in systems to support quality care across components. Key strengths in the quality support systems included an adequate and well-orientated workforce, appropriate health system supports, and engagement with other organizations and the community; weaknesses included gaps in service infrastructure, in staff recruitment, retention, and support, and additional costs. Qualitative data revealed clear voices from health service staff expressing concerns with performance, and subsequent SAT data provided evidence of changes made to address those concerns.
Learning from the processes and strengths of high-improving services may be useful as we work with services striving to improve the quality of care provided in other areas.
Programming Tools: Status, Evaluation, and Comparison
NASA Technical Reports Server (NTRS)
Cheng, Doreen Y.; Cooper, D. M. (Technical Monitor)
1994-01-01
In this tutorial I will first describe the characteristics of scientific applications and their developers, and describe the computing environment in a typical high-performance computing center. I will define the user requirements for tools that support application portability and present the difficulties in satisfying them. These form the basis of the evaluation and comparison of the tools. I will then describe the tools available in the market and the tools available in the public domain. Specifically, I will describe tools for converting sequential programs, tools for developing portable new programs, tools for debugging and performance tuning, tools for partitioning and mapping, and tools for managing networks of resources. I will introduce the main goals and approaches of the tools, and show the main features of a few tools in each category. Along the way, I will compare tool usability for real-world application development and compare their different technological approaches. Finally, I will indicate the future directions of the tools in each category.
Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation
NASA Technical Reports Server (NTRS)
Richardson, Marilou R.
2010-01-01
This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.
Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft
NASA Astrophysics Data System (ADS)
Boozer, Charles Maxwell
A multidisciplinary shape optimization tool coupling aerodynamics, structures, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled under the multidisciplinary feasible optimization architecture, the aircraft geometry is modified to optimize the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body/delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.
ATAMM enhancement and multiprocessor performance evaluation
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.
1991-01-01
ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation is discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horiike, S.; Okazaki, Y.
This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. Discrete event simulation with detailed models is adopted for efficient performance estimation. The tool includes basic models of the components constituting a platform, e.g., Ethernet, communication protocols, and operating systems. Application software is modeled by specifying CPU time, disk access size, communication data size, and so on. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.
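At its core, a discrete event simulation of this kind tracks when shared resources become free. A minimal sketch, assuming a single shared communication link with a fixed per-message service time (both parameters hypothetical, not from the paper):

```python
def single_server_latency(arrivals, service_time):
    """Event-driven sketch of one shared resource (e.g. a communication
    link): each message waits until the link is free, then occupies it for
    `service_time`. Returns per-message completion times."""
    free_at = 0.0
    finish_times = []
    for t in sorted(arrivals):        # process arrival events in time order
        start = max(t, free_at)       # wait if the link is busy
        free_at = start + service_time
        finish_times.append(free_at)
    return finish_times

# Hypothetical workload: 4 messages arriving 1 ms apart on a link
# that needs 2 ms per message; queueing delay accumulates.
finish = single_server_latency([0, 1, 2, 3], 2.0)
print(finish)  # → [2.0, 4.0, 6.0, 8.0]
```

A full tool layers many such resource models (CPU, disk, protocol stack) and routes events between them, but the bookkeeping pattern is the same.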
Upham, Susan J; Janamian, Tina; Crossland, Lisa; Jackson, Claire L
2016-04-18
To determine the relevance and utility of online tools and resources to support organisational performance development in primary care and to complement the Primary Care Practice Improvement Tool (PC-PIT). A purposively recruited Expert Advisory Panel of 12 end users used a modified Delphi technique to evaluate 53 tools and resources identified through a previously conducted systematic review. The panel comprised six practice managers and six general practitioners who had participated in the PC-PIT pilot study in 2013-2014. Tools and resources were reviewed in three rounds using a standard pre-tested assessment form. Recommendations, scores and reasons for recommending or rejecting each tool or resource were analysed to determine the final suite of tools and resources. The evaluation was conducted from November 2014 to August 2015. Recommended tools and resources scored highly (mean score, 16/20) in Rounds 1 and 2 of review (n = 25). These tools and resources were perceived to be easily used, useful to the practice and supportive of the PC-PIT. Rejected resources scored considerably lower (mean score, 5/20) and were noted to have limitations such as having no value to the practice and poor utility (n = 6). A final review (Round 3) of 28 resources resulted in a suite of 21 to support the elements of the PC-PIT. This suite of tools and resources offers one approach to supporting the quality improvement initiatives currently in development in primary care reform.
NASA Technical Reports Server (NTRS)
Yeh, H. Y. Jannivine; Brown, Cheryl B.; Jeng, Frank F.; Anderson, Molly; Ewert, Michael K.
2009-01-01
The development of the Advanced Life Support (ALS) Sizing Analysis Tool (ALSSAT) using Microsoft® Excel was initiated by the Crew and Thermal Systems Division (CTSD) of Johnson Space Center (JSC) in 1997 to support the ALS and Exploration Offices in Environmental Control and Life Support System (ECLSS) design and studies. It aids the user in performing detailed sizing of the ECLSS for different combinations of the Exploration Life Support (ELS) regenerative system technologies. This analysis tool assists the user in performing ECLSS preliminary design and trade studies, as well as system optimization, efficiently and economically. The latest ALSSAT-related publication, in ICES 2004, detailed ALSSAT's development status, including the completion of all six ELS Subsystems (ELSS), namely the Air Management Subsystem, the Biomass Subsystem, the Food Management Subsystem, the Solid Waste Management Subsystem, the Water Management Subsystem, and the Thermal Control Subsystem, and two external interfaces, Extravehicular Activity and Human Accommodations. Since 2004, many more regenerative technologies in the ELSS have been implemented in ALSSAT. ALSSAT has also been used for the ELS Research and Technology Development Metric Calculation for FY02 through FY06. It was also used to conduct the Lunar Outpost Metric calculation for FY08 and was integrated as part of a Habitat Model developed at Langley Research Center to support the Constellation program. This paper gives an update on the analysis tool's current development status and presents the analytical results of one of the trade studies that was performed.
Harnessing the power of emerging petascale platforms
NASA Astrophysics Data System (ADS)
Mellor-Crummey, John
2007-07-01
As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
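Loop tiling, one of the classic loop-nest transformations such a tool applies, reorders iterations for cache reuse without changing which iterations run. A minimal Python sketch of the reordering (illustrative only; the actual CScADS tool operates on compiled scientific loop nests, and the tile size here is arbitrary):

```python
def tiled_indices(n, tile):
    """Loop-tiling sketch: yield (i, j) pairs for an n x n traversal in
    tile x tile blocks, so data touched by inner iterations stays hot
    in cache. The iteration *set* is unchanged; only the order differs."""
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for i in range(ii, min(ii + tile, n)):
                for j in range(jj, min(jj + tile, n)):
                    yield (i, j)

# Same iteration space as the plain doubly nested loop, different order.
plain = {(i, j) for i in range(6) for j in range(6)}
tiled = list(tiled_indices(6, 4))
print(len(tiled), set(tiled) == plain)  # → 36 True
```

Because the iteration set is identical, any reduction computed over it (a sum, a stencil update with disjoint writes) produces the same result before and after the transformation; only memory access order, and hence performance, changes.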
Can You Help Me with My Pitch? Studying a Tool for Real-Time Automated Feedback
ERIC Educational Resources Information Center
Schneider, Jan; Borner, Dirk; van Rosmalen, Peter; Specht, Marcus
2016-01-01
In our pursuit of effective real-time feedback in Technology Enhanced Learning, we developed the Presentation Trainer, a tool designed to support the practice of nonverbal communication skills for public speaking. The tool tracks the user's voice and body to analyze her performance, and selects the type of real-time feedback to be presented.…
Effects of Planning on Task Load, Knowledge, and Tool Preference: A Comparison of Two Tools
ERIC Educational Resources Information Center
Bonestroo, Wilco J.; de Jong, Ton
2012-01-01
Self-regulated learners are expected to plan their own learning. Because planning is a complex task, it is not self-evident that all learners can perform this task successfully. In this study, we examined the effects of two planning support tools on the quality of created plans, planning behavior, task load, and acquired knowledge. Sixty-five…
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
MACHETE: Environment for Space Networking Evaluation
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Segui, John S.; Woo, Simon
2010-01-01
Space exploration missions require the design and implementation of space networking that differs from terrestrial networks. In a space networking architecture, interplanetary communication protocols need to be designed, validated, and evaluated carefully to support different mission requirements. Because actual systems are expensive to build, it is essential to have a low-cost method to validate and verify mission/system designs and operations. This can be accomplished through simulation. Simulation can aid design decisions where alternative solutions are being considered, support trade studies, and enable fast study of what-if scenarios. It can be used to identify risks, verify system performance against requirements, and serve as an initial test environment as one moves toward emulation and actual hardware implementation of the systems. We describe the development of the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) and its use cases in supporting architecture trade studies and protocol performance, and its role in hybrid simulation/emulation. The MACHETE environment contains various tools and interfaces so that users may select the set of tools tailored to the specific simulation end goal. The use cases illustrate tool combinations for simulating space networking in different mission scenarios. This simulation environment is useful in supporting space networking design for planned and future missions, as well as evaluating the performance of existing networks where non-determinism exists in data traffic and/or link conditions.
[Intelligent systems tools in the diagnosis of acute coronary syndromes: A systematic review].
Sprockel, John; Tejeda, Miguel; Yate, José; Diaztagle, Juan; González, Enrique
2017-03-27
Acute myocardial infarction is the leading cause of non-communicable deaths worldwide. Its diagnosis is a highly complex task, for which modelling through automated methods has been attempted. A systematic review of the literature was performed on diagnostic tests that applied intelligent systems tools in the diagnosis of acute coronary syndromes. A systematic review of the literature is presented using Medline, Embase, Scopus, IEEE/IET Electronic Library, ISI Web of Science, Latindex and LILACS databases for articles that include the diagnostic evaluation of acute coronary syndromes using intelligent systems. The review process was conducted independently by 2 reviewers, and discrepancies were resolved through the participation of a third person. The operational characteristics of the studied tools were extracted. A total of 35 references met the inclusion criteria. In 22 (62.8%) cases, neural networks were used. In five studies, the performances of several intelligent systems tools were compared. Thirteen studies sought to perform diagnoses of all acute coronary syndromes, and in 22, only infarctions were studied. In 21 cases, clinical and electrocardiographic aspects were used as input data, and in 10, only electrocardiographic data were used. Most intelligent systems use the clinical context as a reference standard. High rates of diagnostic accuracy were found with better performance using neural networks and support vector machines, compared with statistical tools of pattern recognition and decision trees. Extensive evidence was found that shows that using intelligent systems tools achieves a greater degree of accuracy than some clinical algorithms or scales and, thus, should be considered appropriate tools for supporting diagnostic decisions of acute coronary syndromes. Copyright © 2017 Instituto Nacional de Cardiología Ignacio Chávez. Publicado por Masson Doyma México S.A. All rights reserved.
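Reviews of diagnostic tools such as this one typically extract each tool's operating characteristics from its reported confusion counts. A minimal sketch of how sensitivity, specificity, and accuracy are computed from predictions (the ten patients below are invented for illustration; 1 = acute coronary syndrome present):

```python
def diagnostic_metrics(y_true, y_pred):
    """Standard operating characteristics of a binary diagnostic test,
    computed from true labels and the tool's predictions."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "accuracy": (tp + tn) / len(y_true),
    }

# Hypothetical predictions from a classifier on 10 patients.
m = diagnostic_metrics([1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
                       [1, 1, 1, 0, 0, 0, 0, 0, 1, 0])
print(m)  # sensitivity 0.75, specificity ~0.833, accuracy 0.8
```

The same counts feed any of the comparisons the review performs across neural networks, support vector machines, and decision trees.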
Tool for Sizing Analysis of the Advanced Life Support System
NASA Technical Reports Server (NTRS)
Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.
2005-01-01
Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems whether their operation is dynamic or steady in nature. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies obtained by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
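The Equivalent System Mass metric that drives such trade studies converts non-mass quantities (volume, power, cooling, crew time) into mass via mission-specific equivalency factors. A hedged sketch of the calculation; the factor values and inputs below are illustrative placeholders, not ALSSAT's:

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kW, cooling_kW,
                           crewtime_hr_yr, duration_yr,
                           v_eq, p_eq, c_eq, ct_eq):
    """Equivalent System Mass (ESM) in kg: non-mass resource demands are
    converted to mass through mission-specific equivalency factors."""
    return (mass_kg
            + volume_m3 * v_eq                       # kg per m^3 of pressurized volume
            + power_kW * p_eq                        # kg per kW of power generation
            + cooling_kW * c_eq                      # kg per kW of heat rejection
            + crewtime_hr_yr * duration_yr * ct_eq)  # kg per crew-hour of upkeep

# Illustrative subsystem: 1200 kg hardware, 10 m^3, 3 kW power and
# cooling, 100 crew-hours/year over a 2-year mission.
esm = equivalent_system_mass(
    mass_kg=1200, volume_m3=10, power_kW=3, cooling_kW=3,
    crewtime_hr_yr=100, duration_yr=2,
    v_eq=66.7, p_eq=237, c_eq=60, ct_eq=0.5)
print(esm)
```

Because every term ends up in kilograms, two candidate technology combinations can be ranked on a single number, which is what makes ESM useful for multiple-case trade studies.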
Situation Awareness and Workload Measures for SAFOR
NASA Technical Reports Server (NTRS)
DeMaio, Joe; Hart, Sandra G.; Allen, Ed (Technical Monitor)
1999-01-01
The present research was conducted in support of the NASA Safe All-Weather Flight Operations for Rotorcraft (SAFOR) program. The purpose of the work was to investigate the utility of two measurement tools developed by the British Defense Evaluation Research Agency. These tools were a subjective workload assessment scale, the DRA Workload Scale (DRAWS), and a situation awareness measurement tool in which the crew's self-evaluation of performance is compared against actual performance. The two measurement tools were evaluated in the context of a test of an innovative approach to alerting the crew by way of a helmet-mounted display. The DRAWS was found to be usable, but it offered no advantages over extant scales, and it had only limited resolution. The performance self-evaluation metric of situation awareness was found to be highly effective.
Ishibashi, Ryo; Mima, Tatsuya; Fukuyama, Hidenao; Pobric, Gorana
2017-01-01
Using a variety of tools is a common and essential component of modern human life. Patients with brain damage or neurological disorders frequently have cognitive deficits in their recognition and manipulation of tools. In this study, we focused on improving tool-related cognition using transcranial direct current stimulation (tDCS). Converging evidence from neuropsychology, neuroimaging and non-invasive brain stimulation has identified the anterior temporal lobe (ATL) and inferior parietal lobule (IPL) as brain regions supporting action semantics. We observed enhanced performance in tool cognition with anodal tDCS over ATL and IPL in two cognitive tasks that require rapid access to semantic knowledge about the function or manipulation of common tools. ATL stimulation improved access to both function and manipulation knowledge of tools. The effect of IPL stimulation showed a trend toward better manipulation judgments. Our findings support previous studies of tool semantics and provide a novel approach for manipulation of underlying circuits.
Analysis of Alternatives for Risk Assessment Methodologies and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.
The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.
Structural considerations for fabrication and mounting of the AXAF HRMA optics
NASA Technical Reports Server (NTRS)
Cohen, Lester M.; Cernoch, Larry; Mathews, Gary; Stallcup, Michael
1990-01-01
A methodology is described which minimizes optics distortion in the fabrication, metrology, and launch configuration phases. The significance of finite element modeling and breadboard testing is described with respect to performance analyses of support structures and material effects in NASA's AXAF X-ray optics. The paper outlines the requirements for AXAF performance, optical fabrication, metrology, and glass support fixtures, as well as the specifications for mirror sensitivity and the high-resolution mirror assembly. Analytical modeling of the tools is shown to coincide with grinding and polishing experiments, and is useful for designing large-area polishing and grinding tools. Metrological subcomponents that have undergone initial testing show evidence of meeting force requirements.
ERIC Educational Resources Information Center
Mass Insight Education (NJ1), 2009
2009-01-01
Given the importance of good teaching and leadership for school success, turnaround schools should think carefully about how to structure professional environments that reward and motivate excellence. A system of "Pay-for-Contribution" that includes tools such as hard-to-staff and skill shortage pay, performance pay, and/or retention…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belles, Randy; Jain, Prashant K.; Powers, Jeffrey J.
The Oak Ridge National Laboratory (ORNL) has a rich history of support for light water reactor (LWR) and non-LWR technologies. That history involves the operation of 13 reactors at ORNL, including the graphite reactor dating back to World War II, two aqueous homogeneous reactors, two molten salt reactors (MSRs), a fast-burst health physics reactor, and seven LWRs. Operation of the High Flux Isotope Reactor (HFIR) has been ongoing since 1965. Expertise exists among the ORNL staff to provide non-LWR training; support evaluation of non-LWR licensing and safety issues; perform modeling and simulation using advanced computational tools; run laboratory experiments using equipment such as the liquid salt component test facility; and perform in-depth fuel performance and thermal-hydraulic technology reviews using a vast suite of computer codes and tools. Summaries of this expertise are included in this paper.
NASA Astrophysics Data System (ADS)
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
2017-12-01
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has served as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided the scientific community with innovative web-based point-of-access tools, including: the Runs-On-Request system, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps for viewing space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, and the CCMC tools that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
Multi-modal virtual environment research at Armstrong Laboratory
NASA Technical Reports Server (NTRS)
Eggleston, Robert G.
1995-01-01
One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.
The use of power tools in the insertion of cortical bone screws.
Elliott, D
1992-01-01
Cortical bone screws are commonly used in fracture surgery; most patterns are non-self-tapping and require a thread to be pre-cut. This is traditionally performed using hand tools rather than their powered counterparts. Reasons given usually imply that power tools are more dangerous and cut a less precise thread, but there is no evidence to support this supposition. A series of experiments was performed showing that the thread pattern cut with either method is identical and that over-penetration with the powered tap is easy to control. The conclusion reached is that both methods produce consistently reliable results, but the use of power tools is much faster.
Which species? A decision-support tool to guide plant selection in stormwater biofilters
NASA Astrophysics Data System (ADS)
Payne, Emily G. I.; Pham, Tracey; Deletic, Ana; Hatt, Belinda E.; Cook, Perran L. M.; Fletcher, Tim D.
2018-03-01
Plant species are diverse in form, function and environmental response. This provides enormous potential for designing nature-based stormwater treatment technologies, such as biofiltration systems. However, species can vary dramatically in their pollutant-removal performance, particularly for nitrogen removal. Currently, there is a lack of information on how to efficiently select from the vast palette of species. This study aimed to identify plant traits beneficial to performance and create a decision-support tool to screen species for further testing. A laboratory experiment using 220 biofilter columns paired plant morphological characteristics with nitrogen removal and water loss for 20 Australian native species and two lawn grasses. Testing was undertaken during wet and dry conditions, for two biofilter designs (saturated zone and free-draining). An extensive root system and high total biomass were critical to the effective removal of total nitrogen (TN) and nitrate (NO3-), driven by high nitrogen assimilation. The same characteristics were key to performance under dry conditions, and were associated with high water use for Australian native plants; linking assimilation and transpiration. The decision-support tool uses these scientific relationships and readily-available information to identify the morphology, natural distribution and stress tolerances likely to be good predictors of plant nitrogen and water uptake.
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
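The keyword-spotting step on the caller's channel can be illustrated with a minimal sketch. Note the hedges: the keyword list and transcript below are invented, and real keyword spotting operates on audio, whereas this sketch scans an already-transcribed word stream.

```python
# Minimal keyword-spotting sketch over a transcribed word stream,
# loosely mirroring how specific items are picked out of the caller's
# side of the dialogue. Keywords and transcript are invented.
KEYWORDS = {"diagnosis", "chemotherapy", "pain", "insurance"}

def spot_keywords(word_stream, keywords=KEYWORDS):
    """Return each keyword found, mapped to the index of its first hit."""
    hits = {}
    for i, word in enumerate(word_stream):
        w = word.lower().strip(".,?!")
        if w in keywords and w not in hits:
            hits[w] = i
    return hits

transcript = ("My mother got her diagnosis last week "
              "and the pain is getting worse").split()
print(spot_keywords(transcript))  # {'diagnosis': 4, 'pain': 9}
```

In the SACA design, hits like these would populate cells of the abstract "score" representing the dialogue.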
The road to successful ITS software acquisition : executive summary
DOT National Transportation Integrated Search
1999-04-01
The Long Term Pavement Performance (LTPP) program was established to support a broad range of pavement performance analyses leading to improved engineering tools to design, construct, and manage pavements. Since 1989, LTPP has collected data on the p...
Timmings, Caitlyn; Khan, Sobia; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Straus, Sharon E
2016-02-24
To address challenges related to selecting a valid, reliable, and appropriate readiness assessment measure in practice, we developed an online decision support tool to aid frontline implementers in healthcare settings in this process. The focus of this paper is to describe a multi-step, end-user driven approach to developing this tool for use during the planning stages of implementation. A multi-phase, end-user driven approach was used to develop and test the usability of a readiness decision support tool. First, readiness assessment measures that are valid, reliable, and appropriate for healthcare settings were identified from a systematic review. Second, a mapping exercise was performed to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a modified Delphi process was used to collect stakeholder ratings of the included measures on domains of feasibility, relevance, and likelihood to recommend. Fourth, two versions of a decision support tool prototype were developed and evaluated for usability. Nine valid and reliable readiness assessment measures were included in the decision support tool. The mapping exercise revealed that of the nine measures, most (78%) focused on assessing readiness for change at the organizational versus the individual level, and that four measures (44%) represented all constructs of organizational readiness. During the modified Delphi process, stakeholders rated most measures as feasible and relevant for use in practice, and reported that they would be likely to recommend use of most measures. Using data from the mapping exercise and stakeholder panel, an algorithm was developed to link users to a measure based on characteristics of their organizational setting and their readiness for change assessment priorities. Usability testing yielded recommendations that were used to refine the Ready, Set, Change! decision support tool.
The Ready, Set, Change! decision support tool is an implementation support designed to facilitate the routine incorporation of a readiness assessment as an early step in implementation. Use of this tool in practice may offer time and resource savings for implementation.
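The matching step described, linking a user to a measure from setting characteristics and assessment priorities, might look schematically like the filter below. The measure names, attributes, and rules are invented for illustration; the real tool draws on nine published measures and richer criteria.

```python
# Schematic measure-matching filter. Measure records and selection
# rules are invented; the real algorithm uses mapping-exercise and
# stakeholder-panel data.
MEASURES = [
    {"name": "Measure A", "level": "organizational", "constructs": "all"},
    {"name": "Measure B", "level": "organizational", "constructs": "partial"},
    {"name": "Measure C", "level": "individual",     "constructs": "partial"},
]

def recommend(level, need_all_constructs):
    """Return measure names matching the assessment level and, if
    requested, full coverage of the readiness constructs."""
    out = []
    for m in MEASURES:
        if m["level"] != level:
            continue
        if need_all_constructs and m["constructs"] != "all":
            continue
        out.append(m["name"])
    return out

print(recommend("organizational", need_all_constructs=True))  # ['Measure A']
print(recommend("individual", need_all_constructs=False))     # ['Measure C']
```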
A front-end automation tool supporting design, verification and reuse of SOC.
Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing
2004-09-01
This paper describes an in-house developed language tool called VPerl, used in developing a 250 MHz 32-bit high-performance low-power embedded CPU core. The authors show that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and significantly reduce the bug rate. The tool can also be used to enhance the reusability of an intellectual property model and to facilitate porting designs to different platforms.
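VPerl itself is Perl-based; the sketch below illustrates the underlying idea in Python: generating repetitive HDL from a compact template is what lets such a preprocessor shrink hand-written Verilog severalfold. The function name and output format are invented for illustration, not VPerl's actual syntax.

```python
# Template-expansion sketch: one short call emits many similar Verilog
# declarations, the kind of repetition a front-end preprocessor removes
# from hand-written source. Format is illustrative only.
def gen_register_file(n_regs, width):
    """Emit `n_regs` Verilog reg declarations of the given bit width."""
    lines = [f"reg [{width - 1}:0] r{i};" for i in range(n_regs)]
    return "\n".join(lines)

verilog = gen_register_file(4, 32)
print(verilog)  # four lines, from "reg [31:0] r0;" to "reg [31:0] r3;"
```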
BilKristal 2.0: A tool for pattern information extraction from crystal structures
NASA Astrophysics Data System (ADS)
Okuyan, Erhan; Güdükbay, Uğur
2014-01-01
We present a revised version of the BilKristal tool of Okuyan et al. (2007). We migrated the development environment to Microsoft Visual Studio 2005 to resolve compatibility issues, added multi-core CPU support, and improved the graphics functions to increase performance. Discovered bugs were fixed, and export functionality to a material visualization tool was added.
Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA
2008-01-01
Background: Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results: EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real-time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy-to-use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools, have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms.
Conclusion: EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776
EPA/ECLSS consumables analyses for the Spacelab 1 flight
NASA Technical Reports Server (NTRS)
Steines, G. J.; Pipher, M. D.
1976-01-01
The results of electrical power system (EPS) and environmental control/life support system (ECLSS) consumables analyses of the Spacelab 1 mission are presented. The analyses were performed to assess the capability of the orbiter systems to support the proposed mission and to establish the various non-propulsive consumables requirements. The EPS analysis was performed using the shuttle electrical power system (SEPS) analysis computer program. The ECLSS analysis was performed using the shuttle environmental consumables requirements evaluation tool (SECRET) program.
The paper discusses a computer-based decision support tool that has been developed to assist local governments in evaluating the cost and environmental performance of integrated municipal solid waste (MSW) management systems. Ongoing case studies of the tool at the local level are...
Driving Improvement with a Balanced Scorecard
ERIC Educational Resources Information Center
Cowart, Scott K.
2010-01-01
This article describes how a school district's use of a transparent tool coalesced support for systemic improvement. The author was looking for a way to push improvement in his 4,000-student school system when he discovered the balanced scorecard, a strategic tool for performance management. The author details how the balanced scorecard helped him…
Managing Skills and Knowledge Using Online Tools
ERIC Educational Resources Information Center
Waller, Dave; Holland, Tom
2009-01-01
Purpose: This paper aims to explore a structured approach to measuring skills and knowledge, and to outline how such an approach can be beneficial for improving performance and supporting strategy. It also seeks to examine how online tools can help with this process and to look at implications for the wider UK and European skills development…
WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Integrated modeling of advanced optical systems
NASA Astrophysics Data System (ADS)
Briggs, Hugh C.; Needels, Laura; Levine, B. Martin
1993-02-01
This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.
Probability and Statistics in Sensor Performance Modeling
2010-12-01
The software program is called Environmental Awareness for Sensor and Emitter Employment (EASEE), a decision-support tool (DST). The report addresses important numerical issues in the implementation and the statistical analysis used to measure sensor performance, including the cumulative distribution function (cdf) and its complement (ccdf).
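The cdf/ccdf pairing that appears in this record is the standard way sensor detection probability is expressed: given a detection threshold, the probability that the sensed signal exceeds it is a ccdf evaluated at that threshold. The sketch below is a generic illustration assuming Gaussian noise, not the report's actual model.

```python
import math

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative distribution function of a normal distribution."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def detection_probability(threshold, signal_mean, noise_sigma):
    """ccdf: probability a noisy sensor reading exceeds the threshold."""
    return 1.0 - gaussian_cdf(threshold, mu=signal_mean, sigma=noise_sigma)

# A signal mean 2 sigma above the threshold is detected ~97.7% of the time.
p = detection_probability(threshold=1.0, signal_mean=3.0, noise_sigma=1.0)
print(round(p, 3))  # 0.977
```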
Potential Impacts of Accelerated Climate Change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, L. R.; Vail, L. W.
2016-05-31
This research project is part of the U.S. Nuclear Regulatory Commission’s (NRC’s) Probabilistic Flood Hazard Assessment (PFHA) Research plan in support of developing a risk-informed licensing framework for flood hazards and design standards at proposed new facilities and significance determination tools for evaluating potential deficiencies related to flood protection at operating facilities. The PFHA plan aims to build upon recent advances in deterministic, probabilistic, and statistical modeling of extreme precipitation events to develop regulatory tools and guidance for NRC staff with regard to PFHA for nuclear facilities. The tools and guidance developed under the PFHA plan will support and enhance NRC’s capacity to perform thorough and efficient reviews of license applications and license amendment requests. They will also support risk-informed significance determination of inspection findings, unusual events, and other oversight activities.
NASA Technical Reports Server (NTRS)
Hall, Laverne
1995-01-01
Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.
Supporting Learners' Experiment Design
ERIC Educational Resources Information Center
van Riesen, Siswa; Gijlers, Hannie; Anjewierden, Anjo; de Jong, Ton
2018-01-01
Inquiry learning is an educational approach in which learners actively construct knowledge and in which performing investigations and conducting experiments is central. To support learners in designing informative experiments we created a scaffold, the Experiment Design Tool (EDT), that provided learners with a step-by-step structure to select…
Yucca Mountain licensing support network archive assistant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunlavy, Daniel M.; Bauer, Travis L.; Verzi, Stephen J.
2008-03-01
This report describes the Licensing Support Network (LSN) Assistant--a set of tools for categorizing e-mail messages and documents, and investigating and correcting existing archives of categorized e-mail messages and documents. The two main tools in the LSN Assistant are the LSN Archive Assistant (LSNAA) tool for recategorizing manually labeled e-mail messages and documents and the LSN Realtime Assistant (LSNRA) tool for categorizing new e-mail messages and documents. This report focuses on the LSNAA tool. There are two main components of the LSNAA tool. The first is the Sandia Categorization Framework, which is responsible for providing categorizations for documents in an archive and storing them in an appropriate Categorization Database. The second is the actual user interface, which primarily interacts with the Categorization Database, providing a way to find and correct categorization errors in the database. A procedure for applying the LSNAA tool and an example use case of the LSNAA tool applied to a set of e-mail messages are provided. Performance results of the categorization model designed for this example use case are presented.
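The categorize-then-correct workflow can be sketched schematically as below. The categories, keyword rules, and messages are invented for illustration; the actual LSNAA uses a trained categorization model and a database-backed interface, not keyword matching.

```python
# Schematic categorize-then-correct flow: assign a label to each
# document, store it, then let an analyst correct stored labels.
# Categories and keyword rules are invented.
RULES = {
    "licensing": {"license", "permit", "application"},
    "technical": {"geology", "repository", "canister"},
}

def categorize(text):
    """Return the first category whose vocabulary overlaps the text."""
    words = set(text.lower().split())
    for category, vocab in RULES.items():
        if words & vocab:
            return category
    return "uncategorized"

db = {
    1: categorize("repository canister welding report"),  # 'technical'
    2: categorize("schedule for public hearing"),         # 'uncategorized'
}
db[2] = "licensing"  # analyst reviews the archive and corrects the label
print(db)  # {1: 'technical', 2: 'licensing'}
```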
Training and Assessment of Hysteroscopic Skills: A Systematic Review.
Savran, Mona Meral; Sørensen, Stine Maya Dreier; Konge, Lars; Tolsgaard, Martin G; Bjerrum, Flemming
2016-01-01
The aim of this systematic review was to identify studies on hysteroscopic training and assessment. PubMed, Excerpta Medica, the Cochrane Library, and Web of Science were searched in January 2015. Manual screening of references and citation tracking were also performed. Studies on hysteroscopic educational interventions were selected without restrictions on study design, populations, language, or publication year. A qualitative data synthesis including the setting, study participants, training model, training characteristics, hysteroscopic skills, assessment parameters, and study outcomes was performed by 2 authors working independently. Effect sizes were calculated when possible. Overall, 2 raters independently evaluated sources of validity evidence supporting the outcomes of the hysteroscopy assessment tools. A total of 25 studies on hysteroscopy training were identified, of which 23 were performed in simulated settings. Overall, 10 studies used virtual-reality simulators and reported effect sizes for technical skills ranging from 0.31 to 2.65; 12 used inanimate models and reported effect sizes for technical skills ranging from 0.35 to 3.19. One study involved live animal models; 2 studies were performed in clinical settings. The validity evidence supporting the assessment tools used was low. Consensus between the 2 raters on the reported validity evidence was high (94%). This systematic review demonstrated large variations in the effect of different tools for hysteroscopy training. The validity evidence supporting the assessment of hysteroscopic skills was limited. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
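Effect sizes of the kind reported here (0.31 to 3.19) are commonly computed as Cohen's d with a pooled standard deviation. The sketch below uses invented pre-/post-training scores; it is a generic illustration, not data from any of the reviewed studies.

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, pooled standard deviation."""
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):  # sample variance (n - 1 denominator)
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    n1, n2 = len(group_a), len(group_b)
    pooled = math.sqrt(((n1 - 1) * var(group_a) + (n2 - 1) * var(group_b))
                       / (n1 + n2 - 2))
    return (mean(group_b) - mean(group_a)) / pooled

# Invented pre-/post-training scores on a technical-skills checklist.
pre = [10, 12, 11, 13, 9]
post = [15, 17, 14, 16, 18]
print(round(cohens_d(pre, post), 2))  # 3.16
```

A d above 0.8 is conventionally read as a large training effect, which is why values near 3 in simulator studies signal substantial skill gains.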
GPS: Shaping Student Success One Conversation at a Time
ERIC Educational Resources Information Center
Star, Mikhael; Collette, Lanita
2010-01-01
Increasing instructor-student interactions and improving support personnel interventions with students positively affect their academic performance, retention, and graduation rates. This article discusses the Grade Performance Status (GPS), which is Northern Arizona University's new online academic early alert tool for increasing instructor…
Indentured Parts List Maintenance and Part Assembly Capture Tool - IMPACT
NASA Technical Reports Server (NTRS)
Jain, Bobby; Morris, Jill; Sharpe, Kelly
2004-01-01
Johnson Space Center's (JSC's) indentured parts list (IPL) maintenance and parts assembly capture tool (IMPACT) is an easy-to-use graphical interface for viewing and maintaining the complex assembly hierarchies of large databases. IMPACT, already in use at JSC to support the International Space Station (ISS), queries, updates, modifies, and views IPL and associated resource data; with modification, it can perform the same functions for any large commercial database. By enabling its users to efficiently view and manipulate IPL hierarchical data, IMPACT performs a function unlike that of any other tool. Through IMPACT, users will achieve results quickly, efficiently, and cost effectively.
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Connor, Mary M. (Technical Monitor)
1998-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods, which are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs.
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the aircrew.
Abstract: Managing urban water infrastructures faces the challenge of jointly dealing with assets of diverse types, useful life, cost, ages and condition. Service quality and sustainability require sound long-term planning, well aligned with tactical and operational planning and m...
Developing a Dynamic Inference Expert System to Support Individual Learning at Work
ERIC Educational Resources Information Center
Hung, Yu Hsin; Lin, Chun Fu; Chang, Ray I.
2015-01-01
In response to the rapid growth of information in recent decades, knowledge-based systems have become an essential tool for organizational learning. The application of electronic performance-support systems in learning activities has attracted considerable attention from researchers. Nevertheless, the vast, ever-increasing amount of information is…
Objective Situation Awareness Measurement Based on Performance Self-Evaluation
NASA Technical Reports Server (NTRS)
DeMaio, Joe
1998-01-01
The research was conducted in support of the NASA Safe All-Weather Flight Operations for Rotorcraft (SAFOR) program. The purpose of the work was to investigate the utility of two measurement tools developed by the British Defense Evaluation Research Agency. These tools were a subjective workload assessment scale, the DRA Workload Scale, and a situation awareness measurement tool. The situation awareness tool uses a comparison of the crew's self-evaluation of performance against actual performance in order to determine what information the crew attended to during the performance. These two measurement tools were evaluated in the context of a test of an innovative approach to alerting the crew by way of a helmet-mounted display. The situation assessment data are reported here. The performance self-evaluation metric of situation awareness was found to be highly effective. It was used to evaluate situation awareness on a tank reconnaissance task, a tactical navigation task, and a stylized task used to evaluate handling qualities. Using the self-evaluation metric, it was possible to evaluate situation awareness without exact knowledge of the relevant information in some cases, and to identify information to which the crew attended or failed to attend in others.
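The core of the self-evaluation metric, comparing self-rated against measured performance per task, can be sketched as below. The task names, scores, and tolerance are invented; a large positive gap (overconfidence) is treated as a sign of information the crew failed to attend to.

```python
# Sketch of the self-evaluation comparison: flag tasks where the
# crew's self-rating exceeds measured performance by more than a
# tolerance. Tasks, scores, and tolerance are invented.
def sa_gaps(self_ratings, actual_scores, tolerance=1.0):
    """Return tasks with self-rating > actual + tolerance
    (overconfidence suggesting missed cues)."""
    return [task for task in self_ratings
            if self_ratings[task] - actual_scores[task] > tolerance]

self_ratings = {"reconnaissance": 8.0, "navigation": 6.0, "handling": 7.0}
actual_scores = {"reconnaissance": 5.5, "navigation": 5.5, "handling": 6.8}
print(sa_gaps(self_ratings, actual_scores))  # ['reconnaissance']
```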
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
Janamian, Tina; Upham, Susan J; Crossland, Lisa; Jackson, Claire L
2016-04-18
To conduct a systematic review of the literature to identify existing online primary care quality improvement tools and resources to support organisational improvement related to the seven elements in the Primary Care Practice Improvement Tool (PC-PIT), with the identified tools and resources to progress to a Delphi study for further assessment of relevance and utility. Systematic review of the international published and grey literature. CINAHL, Embase and PubMed databases were searched in March 2014 for articles published between January 2004 and December 2013. GreyNet International and other relevant websites and repositories were also searched in March-April 2014 for documents dated between 1992 and 2012. All citations were imported into a bibliographic database. Published and unpublished tools and resources were included in the review if they were in English, related to primary care quality improvement and addressed any of the seven PC-PIT elements of a high-performing practice. Tools and resources that met the eligibility criteria were then evaluated for their accessibility, relevance, utility and comprehensiveness using a four-criteria appraisal framework. We used a data extraction template to systematically extract information from eligible tools and resources. A content analysis approach was used to explore the tools and resources and collate relevant information: name of the tool or resource, year and country of development, author, name of the organisation that provided access and its URL, accessibility information or problems, overview of each tool or resource and the quality improvement element(s) it addresses. If available, a copy of the tool or resource was downloaded into the bibliographic database, along with supporting evidence (published or unpublished) on its use in primary care. 
This systematic review identified 53 tools and resources that can potentially be provided as part of a suite of tools and resources to support primary care practices in improving the quality of their practice, to achieve improved health outcomes.
Tool-Use in a Blended Undergraduate Course: In Search of User Profiles
ERIC Educational Resources Information Center
Lust, Griet; Vandewaetere, Mieke; Ceulemans, Eva; Elen, Jan; Clarebout, Geraldine
2011-01-01
The popularity of today's blended courses in higher education is driven by the assumption that students are provided with a rich toolset that supports them in their learning process. However, little is known on how students actually use these tools and how this affects their performance for the course. The current study investigates how students…
ERIC Educational Resources Information Center
Page, Deb
2012-01-01
The digitized collections of artifacts known as electronic portfolios are creating solutions to a variety of performance improvement needs in ways that are cost-effective and improve both individual and group learning and performance. When social media functionality is embedded in e-portfolios, the tools support collaboration, social learning,…
Development of a Relay Performance Web Tool for the Mars Network
NASA Technical Reports Server (NTRS)
Allard, Daniel A.; Edwards, Charles D.
2009-01-01
Modern Mars surface missions rely upon orbiting spacecraft to relay communications to and from Earth systems. An important component of this multi-mission relay process is the collection of relay performance statistics supporting strategic trend analysis and tactical anomaly identification and tracking.
NASA Technical Reports Server (NTRS)
Tahmasebi, Farhad; Pearce, Robert
2016-01-01
A tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is described. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.
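The original tool is built in Excel VBA; the Python sketch below shows the same kind of roll-up: score each Outcome against the four stated criteria and flag those falling behind. The criterion weights, scores, and 0.5 threshold are invented for illustration.

```python
# Weighted portfolio roll-up over the four evaluation criteria named
# in the abstract. Weights, scores, and threshold are invented.
CRITERIA = {"stakeholder_buy_in": 0.25, "partnership": 0.25,
            "tech_challenges": 0.30, "enablement_forecast": 0.20}

def outcome_progress(scores):
    """Weighted progress score in [0, 1] for one strategic Outcome."""
    return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

portfolio = {
    "Outcome 1": {"stakeholder_buy_in": 0.9, "partnership": 0.8,
                  "tech_challenges": 0.7, "enablement_forecast": 0.6},
    "Outcome 2": {"stakeholder_buy_in": 0.4, "partnership": 0.3,
                  "tech_challenges": 0.5, "enablement_forecast": 0.2},
}
lagging = [o for o, s in portfolio.items() if outcome_progress(s) < 0.5]
print(lagging)  # ['Outcome 2']
```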
GenSAA: A tool for advancing satellite monitoring with graphical expert systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Luczak, Edward C.
1993-01-01
During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.
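The parameter-combination monitoring that GenSAA automates can be sketched as a small rule table. This is a schematic illustration only: the telemetry parameter names, limits, and rules below are invented and do not reflect GenSAA's actual rule syntax or any real spacecraft.

```python
# Minimal rule-based telemetry monitor in the spirit of a fault-isolation
# expert system: each rule tests a combination of parameter values and
# fires a message. Parameters, limits, and rules are invented.
RULES = [
    ("battery discharge while in sunlight",
     lambda t: t["solar_array_current"] > 1.0 and t["battery_current"] < 0.0),
    ("transmitter over-temperature",
     lambda t: t["tx_temp_c"] > 85.0),
]

def check_telemetry(frame):
    """Return the messages of all rules that fire on one telemetry frame."""
    return [msg for msg, test in RULES if test(frame)]

frame = {"solar_array_current": 2.4, "battery_current": -0.3, "tx_temp_c": 61.0}
print(check_telemetry(frame))  # ['battery discharge while in sunlight']
```

Evaluating such rule tables continuously against real-time data is exactly the task that becomes infeasible for human analysts as parameter counts grow.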
The Generic Spacecraft Analyst Assistant (GenSAA): A Tool for Developing Graphical Expert Systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.
1993-01-01
During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data. The analysts must watch for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As the satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At NASA GSFC, fault-isolation expert systems are in operation supporting this data monitoring task. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will readily support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.
Validity evidence as a key marker of quality of technical skill assessment in OTL-HNS.
Labbé, Mathilde; Young, Meredith; Nguyen, Lily H P
2018-01-13
Quality monitoring of assessment practices should be a priority in all residency programs. Validity evidence is one of the main hallmarks of assessment quality and should be collected to support the interpretation and use of assessment data. Our objective was to identify, synthesize, and present the validity evidence reported in support of different technical skill assessment tools in otolaryngology-head and neck surgery (OTL-HNS). We performed a secondary analysis of data generated through a systematic review of all published tools for assessing technical skills in OTL-HNS (n = 16). For each tool, we coded validity evidence according to the five types of evidence described in the American Educational Research Association's interpretation of Messick's validity framework. Descriptive statistical analyses were conducted. All 16 tools included in our analysis were supported by validity evidence for internal structure and relationships to variables. Eleven articles presented evidence supporting content. Response process was discussed in only one article, and no study reported evidence exploring consequences. We present the validity evidence reported for 16 rater-based tools that could be used for work-based assessment of OTL-HNS residents in the operating room. The articles included in our review were consistently deficient in evidence for response process and consequences. Rater-based assessment tools that support high-stakes decisions affecting learners and programs should be supported by several sources of validity evidence. Thus, any assessment should be used with careful consideration of the context-specific validity evidence supporting score interpretation, and we encourage deliberate, continual monitoring of assessment quality. NA. Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.
Supporting performance and configuration management of GTE cellular networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Ming; Lafond, C.; Jakobson, G.
GTE Laboratories, in cooperation with GTE Mobilnet, has developed and deployed PERFEX (PERFormance EXpert), an intelligent system for performance and configuration management of cellular networks. PERFEX assists cellular network performance and radio engineers in the analysis of large volumes of cellular network performance and configuration data. It helps them locate and determine the probable causes of performance problems, and provides intelligent suggestions about how to correct them. The system combines an expert cellular network performance tuning capability with a map-based graphical user interface, data visualization programs, and a set of special cellular engineering tools. PERFEX is in daily use at more than 25 GTE Mobile Switching Centers. Since the first deployment of the system in late 1993, PERFEX has become a major GTE cellular network performance optimization tool.
Orbit Software Suite
NASA Technical Reports Server (NTRS)
Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim
2012-01-01
Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during or after the Equipment Rehost-3 Project.
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)
NASA Technical Reports Server (NTRS)
Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.
2007-01-01
An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for sizing Extravehicular Activity System (EVAS) architectures and supporting related studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics of user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems is assumed on the part of the intended user group, both in defining architectures and in analyzing results. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.
A web platform for integrated surface water - groundwater modeling and data management
NASA Astrophysics Data System (ADS)
Fatkhutdinov, Aybulat; Stefan, Catalin; Junghanns, Ralf
2016-04-01
Model-based decision support systems are considered to be reliable and time-efficient tools for resources management in various hydrology-related fields. However, searching for and acquiring the required data, preparing the data sets for simulations, and post-processing, visualizing, and publishing the simulation results often require significantly more work and time than performing the modeling itself. The purpose of the developed software is to combine data storage facilities, data processing instruments, and modeling tools in a single platform, which can potentially reduce the time required for performing simulations and hence for decision making. The system is developed within the INOWAS (Innovative Web Based Decision Support System for Water Sustainability under a Changing Climate) project. The platform integrates spatially distributed catchment-scale rainfall-runoff, infiltration, and groundwater flow models with data storage, processing, and visualization tools. The concept is implemented in the form of a web-GIS application and is built from free and open-source components, including the PostgreSQL database management system, the Python programming language for modeling purposes, MapServer for visualizing and publishing the data, and OpenLayers for building the user interface, among others. Configuration of the system allows data input, storage, pre- and post-processing, and visualization to be performed in a single uninterrupted workflow. In addition, realization of the decision support system as a web service makes it easy to retrieve and share data sets as well as simulation results over the internet, which offers significant advantages for collaborative work and can substantially increase the usability of the decision support system.
Reaction Decoder Tool (RDT): extracting features from chemical reactions.
Rahman, Syed Asad; Torrance, Gilliean; Baldacci, Lorenzo; Martínez Cuesta, Sergio; Fenninger, Franz; Gopal, Nimish; Choudhary, Saket; May, John W; Holliday, Gemma L; Steinbeck, Christoph; Thornton, Janet M
2016-07-01
Extracting chemical features like Atom-Atom Mapping (AAM), Bond Changes (BCs) and Reaction Centres from biochemical reactions helps us understand the chemical composition of enzymatic reactions. Reaction Decoder is a robust command-line tool which performs this task with high accuracy. It supports standard chemical input/output exchange formats, i.e. RXN/SMILES, computes AAM, highlights BCs and creates images of the mapped reaction. This aids in the analysis of metabolic pathways and enables comparative studies of chemical reactions based on these features. The software is implemented in Java, supported on Windows, Linux and Mac OSX, and freely available at https://github.com/asad/ReactionDecoder. Contact: asad@ebi.ac.uk or s9asad@gmail.com. © The Author 2016. Published by Oxford University Press.
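The record above describes extracting features from reactions expressed as SMILES. As a toy illustration of one such bookkeeping task, the sketch below checks that a `reactants>>products` reaction conserves heavy atoms. This is a deliberately simplified parser for bracket-free SMILES, not RDT's Java implementation, and the names `atom_counts` and `is_balanced` are hypothetical.

```python
import re
from collections import Counter

def atom_counts(smiles: str) -> Counter:
    """Count element symbols in a simple SMILES string.

    Handles only the organic subset written without brackets (C, N, O,
    Cl, Br, ...); aromatic lowercase atoms (c, n, o, s) are counted
    under their uppercase element. A toy parser, not a full SMILES
    reader; hydrogens are implicit and ignored.
    """
    tokens = re.findall(r"Cl|Br|[BCNOPSFI]|[cnos]", smiles)
    return Counter(t.capitalize() if t.islower() else t for t in tokens)

def is_balanced(reaction: str) -> bool:
    """Check that a 'reactants>>products' reaction conserves heavy atoms."""
    reactants, products = reaction.split(">>")
    return atom_counts(reactants) == atom_counts(products)
```

For example, the esterification `CC(=O)O.CO>>CC(=O)OC.O` balances (three carbons and three oxygens on each side), while `CC>>C` does not.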
On-line analysis capabilities developed to support the AFW wind-tunnel tests
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.
1992-01-01
A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs are also presented. The on-line analysis tools worked well before, during, and after the wind tunnel test and proved to be a vital and important part of the entire test effort.
Advancements in Risk-Informed Performance-Based Asset Management for Commercial Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liming, James K.; Ravindra, Mayasandra K.
2006-07-01
Over the past several years, ABSG Consulting Inc. (ABS Consulting) and the South Texas Project Nuclear Operating Company (STPNOC) have developed a decision support process and associated software for risk-informed, performance-based asset management (RIPBAM) of nuclear power plant facilities. RIPBAM applies probabilistic risk assessment (PRA) tools and techniques in the realm of plant physical and financial asset management. The RIPBAM process applies a tiered set of models and supporting performance measures (or metrics) that can ultimately be applied to support decisions affecting the allocation and management of plant resources (e.g., funding, staffing, scheduling, etc.). In general, the ultimate goal of the RIPBAM process is to continually support decision-making to maximize a facility's net present value (NPV) and long-term profitability for its owners. While the initial applications of RIPBAM have been for nuclear power stations, the methodology can easily be adapted to other types of power station or complex facility decision-making support. RIPBAM can also be designed to focus on performance metrics other than NPV and profitability (e.g., mission reliability, operational availability, probability of mission success per dollar invested, etc.). Recent advancements in the RIPBAM process focus on expanding the scope of previous RIPBAM applications to include not only operations, maintenance, and safety issues, but also broader risk perception components affecting plant owner (stockholder), operator, and regulator biases. Conceptually, RIPBAM is a comprehensive risk-informed cash flow model for decision support. It originated as a tool to help manage plant refueling outage scheduling, and was later expanded to include the full spectrum of operations and maintenance decision support.
However, it differs from conventional business modeling tools in that it employs a systems engineering approach with broadly based probabilistic analysis of organizational 'value streams'. The scope of value stream inclusion in the process can be established by the user, but in its broadest applications, RIPBAM can be used to address how risk perceptions of plant owners and regulators are impacted by plant performance. Plant staffs can expand and refine RIPBAM model scope via a phased program of activities over time. This paper shows how the multi-metric uncertainty analysis feature of RIPBAM can apply a wide spectrum of decision-influencing factors to support decisions designed to maximize the probability of achieving, maintaining, and improving upon plant goals and objectives. In this paper, the authors show how this approach can be extremely valuable to plant owners and operators in supporting plant value-impacting decision-making processes. (authors)
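As a rough illustration of the NPV-centered, probabilistic decision support the abstract describes, the sketch below estimates expected NPV by Monte Carlo over perturbed cash flows. All function names and numbers are hypothetical; RIPBAM's actual value-stream models are far richer than this minimal analogue.

```python
import random

def npv(cash_flows, rate):
    """Discount a list of yearly cash flows (year 0 first) at `rate`."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def monte_carlo_npv(base_flows, rate, sigma, trials=10_000, seed=42):
    """Estimate expected NPV when each cash flow is uncertain.

    Each flow is perturbed by Gaussian noise with relative spread
    `sigma`, a stand-in for the probabilistic 'value streams' that a
    full asset-management model would represent in detail.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        flows = [cf * (1.0 + rng.gauss(0.0, sigma)) for cf in base_flows]
        total += npv(flows, rate)
    return total / trials
```

Because the noise is zero-mean and NPV is linear in the flows, the Monte Carlo estimate converges to the deterministic NPV; the value of the simulation lies in the spread of outcomes, which a decision-maker can compare against risk tolerances.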
DOT National Transportation Integrated Search
2016-09-30
A field study was performed at 40 uncontrolled midblock crosswalks and 26 signalized intersections on low-speed roadways selected from the areas surrounding three major urban college campuses across lower Michigan. An array of existing traffic control ...
Demystifying Results-Based Performance Measurement.
ERIC Educational Resources Information Center
Jorjani, Hamid
Many evaluators are convinced that Results-based Performance Measurement (RBPM) is an effective tool to improve service delivery and cost effectiveness in both public and private sectors. Successful RBPM requires self-directed and cross-functional work teams and the supporting infrastructure to make it work. There are many misconceptions and…
User-Centric Approach for Benchmark RDF Data Generator in Big Data Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purohit, Sumit; Paulson, Patrick R.; Rodriguez, Luke R.
This research focuses on a user-centric approach to building such tools and proposes a flexible, extensible, and easy-to-use framework to support performance analysis of Big Data systems. Finally, case studies from two different domains are presented to validate the framework.
Wilson, Fernando A; Araz, Ozgur M; Thompson, Ronald W; Ringle, Jay L; Mason, W Alex; Stimpson, Jim P
2016-06-01
Family-centered program research has demonstrated its effectiveness in improving adolescent outcomes. However, given current fiscal constraints faced by governmental agencies, a recent report from the Institute of Medicine and National Research Council highlighted the need for cost-benefit analyses to inform decision making by policymakers. Furthermore, performance management tools such as balanced scorecards and dashboards do not generally include cost-benefit analyses. In this paper, we describe the development of an Excel-based decision support tool that can be used to evaluate a selected family-based program for at-risk children and adolescents relative to a comparison program or the status quo. This tool incorporates the use of an efficient, user-friendly interface with results provided in concise tabular and graphical formats that may be interpreted without need for substantial training in economic evaluation. To illustrate, we present an application of this tool to evaluate use of Boys Town's In-Home Family Services (IHFS) relative to detention and out-of-home placement in New York City. Use of the decision support tool can help mitigate the need for programs to contract experts in economic evaluation, especially when there are financial or time constraints. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Decision Support System for Predicting Students' Performance
ERIC Educational Resources Information Center
Livieris, Ioannis E.; Mikropoulos, Tassos A.; Pintelas, Panagiotis
2016-01-01
Educational data mining is an emerging research field concerned with developing methods for exploring the unique types of data that come from educational context. These data allow the educational stakeholders to discover new, interesting and valuable knowledge about students. In this paper, we present a new user-friendly decision support tool for…
Team Machine: A Decision Support System for Team Formation
ERIC Educational Resources Information Center
Bergey, Paul; King, Mark
2014-01-01
This paper reports on the cross-disciplinary research that resulted in a decision-support tool, Team Machine (TM), which was designed to create maximally diverse student teams. TM was used at a large United States university between 2004 and 2012, and resulted in significant improvement in the performance of student teams, superior overall balance…
New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data
NASA Astrophysics Data System (ADS)
Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.
2007-12-01
High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.
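A minimal sketch of the greedy OTU-picking idea mentioned above: sequences are clustered against cluster representatives by pairwise identity. This illustrates the general technique, not the specific method in the authors' workflow, and assumes pre-aligned, equal-length toy sequences; all names are hypothetical.

```python
def identity(a: str, b: str) -> float:
    """Fraction of matching positions between two aligned sequences."""
    matches = sum(x == y for x, y in zip(a, b))
    return matches / max(len(a), len(b))

def greedy_otu_pick(seqs, threshold=0.97):
    """Greedily cluster sequences into OTUs.

    Each sequence joins the first OTU whose representative it matches
    at >= `threshold` identity; otherwise it seeds a new OTU. This is
    the general idea behind greedy OTU pickers, simplified to exact
    positional identity instead of a real alignment score.
    """
    reps, clusters = [], []
    for s in seqs:
        for i, r in enumerate(reps):
            if identity(s, r) >= threshold:
                clusters[i].append(s)
                break
        else:
            # No representative was close enough: start a new OTU.
            reps.append(s)
            clusters.append([s])
    return clusters
```

At a 75% threshold, for instance, `["AAAA", "AAAT", "TTTT"]` collapses into two OTUs.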
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
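The HDT ranking rests on a dominance (partial-order) relation among procedures. A minimal sketch, assuming lower criterion values are better, finds the non-dominated procedures that sit at the top of a Hasse diagram; the procedure names and criterion values are hypothetical, not the study's 11 variables.

```python
def dominates(a, b):
    """a dominates b if a is at least as good on every criterion and
    strictly better on at least one (lower values assumed better)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def hasse_maximal(procedures):
    """Return the names of procedures not dominated by any other.

    `procedures` maps a name to its tuple of criterion values; the
    non-dominated elements are the maximal elements of the partial
    order, i.e. the top level of the Hasse diagram.
    """
    return sorted(
        name for name, crit in procedures.items()
        if not any(dominates(other, crit)
                   for o, other in procedures.items() if o != name)
    )
```

Note that incomparable procedures (better on one criterion, worse on another) can both be maximal, which is exactly why the metrological and "green" rankings in the study can disagree.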
Command Center Training Tool (C2T2)
NASA Technical Reports Server (NTRS)
Jones, Phillip; Drucker, Nich; Mathews, Reejo; Stanton, Laura; Merkle, Ed
2012-01-01
This abstract presents the training approach taken to create a management-centered, experiential learning solution for the Virginia Port Authority's Port Command Center. The resultant tool, called the Command Center Training Tool (C2T2), follows a holistic approach integrated across the training management cycle and within a single environment. The approach allows a single training manager to progress from training design through execution and after-action review (AAR). The approach starts with modeling the training organization, identifying the organizational elements and their individual and collective performance requirements, including organization-specific performance scoring ontologies. Next, the developer specifies conditions, the problems, and constructs that compose exercises and drive experiential learning. These conditions are defined by incidents, which denote a single, multi-media datum, and scenarios, which are stories told by incidents. To these layered, modular components, previously developed meta-data is attached, including associated performance requirements. The components are then stored in a searchable library. An event developer can create a training event by searching the library based on metadata and then selecting and loading the resultant modular pieces. This loading process brings into the training event all the previously associated task and teamwork material as well as AAR preparation materials. The approach includes tools within an integrated management environment that places these materials at the fingertips of the event facilitator such that, in real time, the facilitator can track training audience performance and resultantly modify the training event. The approach also supports the concentrated knowledge management requirements for rapid preparation of an extensive AAR. This approach supports the integrated training cycle and allows a management-based perspective and advanced tools, through which a complex, thorough training event can be developed.
Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment
McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.
2014-01-01
Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884
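The weighted questions and per-section suitability cutoffs described above lend themselves to a simple scoring sketch. The question identifiers, weights, and cutoffs below are hypothetical placeholders; SOAR's actual 35 questions and their weights are defined in the published tool.

```python
def section_score(answers, weights):
    """Weighted fraction of 'yes' answers for one topic section.

    `answers` maps question id -> bool; `weights` maps question id ->
    importance weight assigned by the domain experts.
    """
    total = sum(weights.values())
    earned = sum(w for q, w in weights.items() if answers.get(q))
    return earned / total

def passes_screen(answers, sections, cutoffs):
    """A study passes screening only if every section meets its cutoff.

    `sections` maps section name -> that section's question weights;
    `cutoffs` maps section name -> minimum acceptable section score.
    """
    return all(section_score(answers, sections[s]) >= cutoffs[s]
               for s in sections)
```

This mirrors the screening logic of a weighted questionnaire with per-section thresholds: a study strong in three sections but deficient in the fourth still fails.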
On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics
Calcagno, Cristina; Coppo, Mario
2014-01-01
This paper presents enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool to perform the modeling, the tuning, and the sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories turning into big data that should be analysed by statistic and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage that immediately produces a partial result. The simulation-analysis workflow is validated for performance and effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming that provide key features to the software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed. PMID:25050327
Tigres Workflow Library: Supporting Scientific Pipelines on HPC Systems
Hendrix, Valerie; Fox, James; Ghoshal, Devarshi; ...
2016-07-21
The growth in scientific data volumes has resulted in the need for new tools that enable users to operate on and analyze data on large-scale resources. In the last decade, a number of scientific workflow tools have emerged. These tools often target distributed environments, and often need expert help to compose and execute the workflows. Data-intensive workflows are often ad-hoc; they involve an iterative development process that includes users composing and testing their workflows on desktops, and scaling up to larger systems. In this paper, we present the design and implementation of Tigres, a workflow library that supports the iterative workflow development cycle of data-intensive workflows. Tigres provides an application programming interface to a set of programming templates (sequence, parallel, split, merge) that can be used to compose and execute computational and data pipelines. We discuss the results of our evaluation of scientific and synthetic workflows showing Tigres performs with minimal template overheads (mean of 13 seconds over all experiments). We also discuss various factors (e.g., I/O performance, execution mechanisms) that affect the performance of scientific workflows on HPC systems.
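The template idea (sequence, parallel, split, merge) can be sketched with standard-library concurrency. This is a generic illustration of the pattern, not the Tigres API; all function names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(tasks, data):
    """Run tasks one after another, piping each result to the next."""
    for t in tasks:
        data = t(data)
    return data

def parallel(task, inputs):
    """Apply one task to many inputs concurrently, preserving order."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(task, inputs))

def split_merge(splitter, task, merger, data):
    """Split the input, process the pieces in parallel, merge the outputs."""
    return merger(parallel(task, splitter(data)))
```

A pipeline composed from these templates runs unchanged whether the `parallel` stage is backed by threads on a desktop or by a larger pool on an HPC node, which is the portability the template abstraction is after.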
On designing multicore-aware simulators for systems biology endowed with OnLine statistics.
Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo
2014-01-01
This paper presents enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool to perform the modeling, the tuning, and the sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories turning into big data that should be analysed by statistic and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage that immediately produces a partial result. The simulation-analysis workflow is validated for performance and effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming that provide key features to the software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
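The pipelined simulation-to-analysis idea, where partial statistics become available while trajectories are still streaming in, can be sketched with an online (Welford) estimator. This is a single-threaded Python analogue, not the FastFlow-based C++ implementation, and all names are hypothetical.

```python
import random

def simulate_endpoints(trials, seed=1):
    """Toy 'simulation stage': stream endpoint values of random walks."""
    rng = random.Random(seed)
    for _ in range(trials):
        yield sum(rng.choice((-1, 1)) for _ in range(100))

def running_stats(stream):
    """'Analysis stage': consume a stream of samples, yielding
    (n, mean, sample variance) after each one using Welford's online
    algorithm, so partial results exist before the stream is exhausted."""
    n, mean, m2 = 0, 0.0, 0.0
    for x in stream:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
        yield n, mean, (m2 / (n - 1) if n > 1 else 0.0)
```

Chaining the two generators, `running_stats(simulate_endpoints(10_000))`, pipelines the stages without ever holding all trajectories in memory, which is the in-memory big-data management point the abstract makes.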
Gilfoyle, Elaine; Koot, Deanna A; Annear, John C; Bhanji, Farhan; Cheng, Adam; Duff, Jonathan P; Grant, Vincent J; St George-Hyslop, Cecilia E; Delaloye, Nicole J; Kotsakis, Afrothite; McCoy, Carolyn D; Ramsay, Christa E; Weiss, Matthew J; Gottesman, Ronald D
2017-02-01
To measure the effect of a 1-day team training course for pediatric interprofessional resuscitation team members on adherence to Pediatric Advanced Life Support guidelines, team efficiency, and teamwork in a simulated clinical environment. Multicenter prospective interventional study. Four tertiary-care children's hospitals in Canada from June 2011 to January 2015. Interprofessional pediatric resuscitation teams including resident physicians, ICU nurse practitioners, registered nurses, and registered respiratory therapists (n = 300; 51 teams). A 1-day simulation-based team training course was delivered, involving an interactive lecture, group discussions, and four simulated resuscitation scenarios, each followed by a debriefing. The first scenario of the day (PRE) was conducted prior to any team training. The final scenario of the day (POST) was the same scenario, with a slightly modified patient history. All scenarios included standardized distractors designed to elicit and challenge specific teamwork behaviors. Primary outcome measure was change (before and after training) in adherence to Pediatric Advanced Life Support guidelines, as measured by the Clinical Performance Tool. Secondary outcome measures were as follows: 1) change in times to initiation of chest compressions and defibrillation and 2) teamwork performance, as measured by the Clinical Teamwork Scale. Correlation between Clinical Performance Tool and Clinical Teamwork Scale scores was also analyzed. Teams significantly improved Clinical Performance Tool scores (67.3-79.6%; p < 0.0001), time to initiation of chest compressions (60.8-27.1 s; p < 0.0001), time to defibrillation (164.8-122.0 s; p < 0.0001), and Clinical Teamwork Scale scores (56.0-71.8%; p < 0.0001). A positive correlation was found between Clinical Performance Tool and Clinical Teamwork Scale (R = 0.281; p < 0.0001). 
Participation in a simulation-based team training educational intervention significantly improved surrogate measures of clinical performance, time to initiation of key clinical tasks, and teamwork during simulated pediatric resuscitation. A positive correlation between clinical and teamwork performance suggests that effective teamwork improves clinical performance of resuscitation teams.
The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis
NASA Technical Reports Server (NTRS)
Burks, Jason Eric; Sperow, Ken
2015-01-01
A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.
Small scale sequence automation pays big dividends
NASA Technical Reports Server (NTRS)
Nelson, Bill
1994-01-01
Galileo sequence design and integration are supported by a suite of formal software tools. Sequence review, however, is largely a manual process with reviewers scanning hundreds of pages of cryptic computer printouts to verify sequence correctness. Beginning in 1990, a series of small, PC based sequence review tools evolved. Each tool performs a specific task but all have a common 'look and feel'. The narrow focus of each tool means simpler operation, and easier creation, testing, and maintenance. Benefits from these tools are (1) decreased review time by factors of 5 to 20 or more with a concomitant reduction in staffing, (2) increased review accuracy, and (3) excellent returns on time invested.
DOT National Transportation Integrated Search
2016-11-15
While a number of studies have developed Safety Performance Functions (SPFs) for : motorized traffic, there has been a very limited focus on developing SPFs for non-motorized : traffic. Lack of exposure measures for pedestrians and bicyclists has bee...
The SEA of the Future: Prioritizing Productivity. Volume 2
ERIC Educational Resources Information Center
Gross, Betheny, Ed.; Jochim, Ashley, Ed.
2013-01-01
"The SEA of the Future" is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This volume, the second in the…
Policy to Performance Toolkit: Transitioning Adults to Opportunity
ERIC Educational Resources Information Center
Alamprese, Judith A.; Limardo, Chrys
2012-01-01
The "Policy to Performance Toolkit" is designed to provide state adult education staff and key stakeholders with guidance and tools to use in developing, implementing, and monitoring state policies and their associated practices that support an effective state adult basic education (ABE) to postsecondary education and training transition…
A Queue Simulation Tool for a High Performance Scientific Computing Center
NASA Technical Reports Server (NTRS)
Spear, Carrie; McGalliard, James
2007-01-01
The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high performance highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
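The discrete-event approach the NCCS abstract describes can be illustrated with a minimal sketch. This is not the NCCS tool itself; the FIFO dispatch policy, function name, and job format are illustrative assumptions, and each job is assumed to need no more CPUs than the system has.

```python
import heapq

def simulate_fifo(jobs, total_cpus):
    """Discrete-event sketch of a FIFO batch queue.

    jobs: list of (arrival_time, cpus_needed, runtime).
    Returns the mean time jobs spend waiting in the queue.
    """
    running = []            # min-heap of (finish_time, cpus) for jobs in flight
    free, now, waits = total_cpus, 0.0, []
    for arrival, cpus, runtime in sorted(jobs):
        now = max(now, arrival)
        while free < cpus:  # reclaim CPUs from the earliest-finishing job
            finish, c = heapq.heappop(running)
            now, free = max(now, finish), free + c
        waits.append(now - arrival)
        heapq.heappush(running, (now + runtime, cpus))
        free -= cpus
    return sum(waits) / len(waits)
```

A production simulator like the one the abstract mentions would layer queue priorities, allocation policies, and workload forecasts on top of this basic event loop.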
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Spot and Runway Departure Advisor
NASA Technical Reports Server (NTRS)
Jung, Yoon Chul
2013-01-01
The Spot and Runway Departure Advisor (SARDA) is a research prototype of a decision support tool for ATC tower controllers to assist in managing and controlling traffic on the surface of an airport. SARDA employs a scheduler to generate an optimal runway schedule and a gate push-back/spot release sequence and schedule that improve the efficiency of surface operations. The advisories for ATC tower controllers are displayed on an Electronic Flight Strip (EFS) system. A human-in-the-loop simulation of the SARDA tool was conducted for east operations of Dallas-Ft. Worth International Airport (DFW) to evaluate the performance of the SARDA tool and human factors such as situational awareness and workload. The results indicate noticeable taxi delay reduction and fuel savings from using the SARDA tool. Reductions in controller workload were also observed throughout the scenario runs. Future plans include modeling and simulation of the ramp operations of the Charlotte International Airport and development of a decision support tool for the ramp controllers.
A solution for exposure tool optimization at the 65-nm node and beyond
NASA Astrophysics Data System (ADS)
Itai, Daisuke
2007-03-01
As device geometries shrink, tolerances for critical dimension, focus, and overlay control decrease. For the stable manufacture of semiconductor devices at (and beyond) the 65-nm node, neither performance variability nor drift in exposure tools is a negligible factor any longer. With EES (Equipment Engineering System) as a guidepost, expectations of improving the productivity of semiconductor manufacturing are growing. We are developing a system, EESP (Equipment Engineering Support Program), based on the EES concept. The EESP system collects and stores large volumes of detailed data generated by Canon lithographic equipment while product is being manufactured. It uses those data to monitor both equipment characteristics and process characteristics, which cannot be examined without such a system. The goal of EESP is to maximize equipment capabilities by feeding the results back to APC/FDC and the equipment maintenance list. We conducted a collaborative study of the system's effectiveness at a device maker's factories. We analyzed the performance variability of exposure tools using focus residual data and attempted to optimize tool performance using the analysis results. The EESP system makes the optimum performance of exposure tools available to the device maker.
Planetary quarantine, supporting research and technology
NASA Technical Reports Server (NTRS)
1975-01-01
The impact of satisfying satellite quarantine requirements on current outer planet mission and spacecraft designs was determined, and the tools required to perform trajectory and navigation analyses for determining satellite impact probabilities were developed.
APMS: An Integrated Suite of Tools for Measuring Performance and Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)
1997-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods, which are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs.
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
CSM digital autopilot testing in support of ASTP experiments control requirements
NASA Technical Reports Server (NTRS)
Rue, D. L.
1975-01-01
Results are presented of CSM digital autopilot (DAP) testing. The testing was performed to demonstrate and evaluate control modes which are currently planned or could be considered for use in support of experiments on the ASTP mission. The testing was performed on the Lockheed Guidance, Navigation, and Control System Functional Simulator (GNCFS). This simulator, which was designed to test the Apollo and Skylab DAP control system, has been used extensively and is a proven tool for CSM DAP analysis.
NASA Technical Reports Server (NTRS)
Chung, William; Chachad, Girish; Hochstetler, Ronald
2016-01-01
The Integrated Gate Turnaround Management (IGTM) concept was developed to improve gate turnaround performance at the airport by leveraging relevant historical data to support optimization of airport gate operations, which include taxi to the gate, gate services, push back, taxi to the runway, and takeoff, based on available resources, constraints, and uncertainties. By analyzing gate-operation events, the primary performance-dependent attributes of these events were identified for the historical data analysis, so that performance models could be developed under uncertainty to support descriptive, predictive, and prescriptive functions. A system architecture was developed to examine system requirements in support of the concept. An IGTM prototype was developed to demonstrate the concept, using a distributed network and collaborative decision tools for stakeholders to meet on-time pushback performance under uncertainties.
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
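The figure-of-merit comparison step described above can be sketched as a weighted scoring exercise. The FOM names, weights, and scores below are invented for the example, not the Program's actual metrics.

```python
def rank_scenarios(scenarios, weights):
    """Rank scenario names by weighted figure-of-merit score, best first.

    scenarios: {name: {fom_name: score}}; weights: {fom_name: weight}.
    """
    def total(foms):
        return sum(weights[f] * score for f, score in foms.items())
    return sorted(scenarios, key=lambda name: total(scenarios[name]), reverse=True)
```

A real FOM evaluation would also carry uncertainty from the probabilistic evaluation component; this sketch shows only the direct-comparison mechanics.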
ERIC Educational Resources Information Center
Haezendonck, Elvira; Willems, Kim; Hillemann, Jenny
2017-01-01
Universities, and higher education institutions in general, are ever more influenced by output-driven performance indicators and models that originally stem from the profit-organisational context. As a result, universities are increasingly considering management tools that support them in the (decision) process for attaining their strategic goals.…
41 CFR 102-192.100 - How do we submit our annual mail management report to GSA?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION... annual reports using the GSA web-based Electronic Performance Support Tool (EPST). Agency mail managers...
ERIC Educational Resources Information Center
Thoms, Paul
1989-01-01
Argues that Music in Our Schools Month (MIOSM) activities can serve as an excellent public relations tool to strengthen community support of school music programs. Points out that a musical performance offers opportunities for good publicity. (LS)
Investigating University Educators' Design Thinking and the Implications for Design Support Tools
ERIC Educational Resources Information Center
Bennett, Sue; Agostinho, Shirley; Lockyer, Lori
2016-01-01
All university educators perform design work as they prepare and plan learning experiences for their students. How such design work is undertaken, conceptualised, and optimally supported is the focus of ongoing research for the authors. The purpose of this article is to present the results of a research study that sought to gain a richer…
Computer-assisted knowledge acquisition for hypermedia systems
NASA Technical Reports Server (NTRS)
Steuck, Kurt
1990-01-01
The use of procedural and declarative knowledge to set up the structure, or 'web', of a hypermedia environment is described. An automated knowledge acquisition tool was developed that helps a knowledge engineer elicit and represent an expert's knowledge involved in performing procedural tasks. The tool represents both the procedural knowledge and the prerequisite, declarative knowledge that supports each activity performed by the expert. This knowledge is output and subsequently read by a hypertext scripting language to generate links between blank but labeled cards. Each step of the expert's activity and each piece of supporting declarative knowledge is set up as an empty node. An instructional developer can then enter detailed instructional material about each step and the related declarative knowledge into these empty nodes. Other research is also described that facilitates the translation of knowledge from one form into a form more readily usable by computerized systems.
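The card-and-link generation step described above can be sketched as follows. The input layout (a list of procedural steps, each with prerequisite declarative topics) and the function name are hypothetical stand-ins for the tool's actual formats.

```python
def build_web(steps):
    """Build a hypermedia web from elicited procedural knowledge.

    steps: list of (step_name, [prerequisite_topics]).
    Returns (nodes, links): empty labeled nodes, plus sequential links
    between steps and supporting links to declarative-knowledge nodes.
    """
    nodes, links = {}, []
    prev = None
    for step, prereqs in steps:
        nodes[step] = ""                 # empty card, filled in later by a developer
        if prev is not None:
            links.append((prev, step))   # sequential procedural link
        for topic in prereqs:
            nodes.setdefault(topic, "")  # shared declarative node, created once
            links.append((step, topic))  # supporting declarative link
        prev = step
    return nodes, links
```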
Modeling a Wireless Network for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Yaprak, Ece; Lamouri, Saad
2000-01-01
This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computers and remote instruments with the high-performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. These data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.
ERIC Educational Resources Information Center
Su, Addison Y. S.; Yang, Stephen J. H.; Hwang, Wu-Yuin; Huang, Chester S. J.; Tern, Ming-Yu
2014-01-01
For more than 2 years, Scratch programming has been taught in Taiwanese elementary schools. However, past studies have shown that it is difficult to find appropriate learning methods or tools to boost students' Scratch programming performance. This inability to readily identify tutoring tools has become one of the primary challenges addressed in…
Gender Differences in Mobile Phone Usage for Language Learning, Attitude, and Performance
ERIC Educational Resources Information Center
Hilao, Marites Piguing; Wichadee, Saovapa
2017-01-01
Mobile phone technology, which has a huge impact on students' lives in the digital age, may offer a new type of learning. The use of an effective tool to support learning can be affected by gender. The current research compared how male and female students perceived mobile phones as a language learning tool, used mobile phones to learn…
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.
An Infrastructure for UML-Based Code Generation Tools
NASA Astrophysics Data System (ADS)
Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.
The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a way to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects that have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.
Realizing the Potential of Patient Engagement: Designing IT to Support Health in Everyday Life
Novak, Laurie L.; Unertl, Kim M.; Holden, Richard J.
2017-01-01
Maintaining health or managing a chronic condition involves performing and coordinating potentially new and complex tasks in the context of everyday life. Tools such as reminder apps and online health communities are being created to support patients in carrying out these tasks. Research has documented mixed effectiveness and problems with continued use of these tools, and suggests that more widespread adoption may be aided by design approaches that facilitate integration of eHealth technologies into patients’ and family members’ daily routines. Given the need to augment existing methods of design and implementation of eHealth tools, this contribution discusses frameworks and associated methods that engage patients and explore contexts of use in ways that can produce insights for eHealth designers. PMID:27198106
National trends in safety performance of electronic health record systems in children's hospitals.
Chaparro, Juan D; Classen, David C; Danforth, Melissa; Stockwell, David C; Longhurst, Christopher A
2017-03-01
To evaluate the safety of computerized physician order entry (CPOE) and associated clinical decision support (CDS) systems in electronic health record (EHR) systems at pediatric inpatient facilities in the US using the Leapfrog Group's pediatric CPOE evaluation tool. The Leapfrog pediatric CPOE evaluation tool, a previously validated tool to assess the ability of a CPOE system to identify orders that could potentially lead to patient harm, was used to evaluate 41 pediatric hospitals over a 2-year period. Evaluation of the last available test for each institution was performed, assessing performance overall as well as by decision support category (eg, drug-drug, dosing limits). Longitudinal analysis of test performance was also carried out to assess the impact of testing and the overall trend of CPOE performance in pediatric hospitals. Pediatric CPOE systems were able to identify 62% of potential medication errors in the test scenarios, but ranged widely from 23-91% in the institutions tested. The highest scoring categories included drug-allergy interactions, dosing limits (both daily and cumulative), and inappropriate routes of administration. We found that hospitals with longer periods since their CPOE implementation did not have better scores upon initial testing, but after initial testing there was a consistent improvement in testing scores of 4 percentage points per year. Pediatric computerized physician order entry (CPOE) systems on average are able to intercept a majority of potential medication errors, but vary widely among implementations. Prospective and repeated testing using the Leapfrog Group's evaluation tool is associated with improved ability to intercept potential medication errors. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
DisTeam: A decision support tool for surgical team selection
Ebadi, Ashkan; Tighe, Patrick J.; Zhang, Lei; Rashidi, Parisa
2018-01-01
Objective: Surgical service providers play a crucial role in the healthcare system. Amongst all the influencing factors, surgical team selection might affect the patients’ outcome significantly. The performance of a surgical team not only can depend on the individual members, but it can also depend on the synergy among team members, and could possibly influence patient outcome such as surgical complications. In this paper, we propose a tool for facilitating decision making in surgical team selection based on considering history of the surgical team, as well as the specific characteristics of each patient. Methods: DisTeam (a decision support tool for surgical team selection) is a metaheuristic framework for objective evaluation of surgical teams and finding the optimal team for a given patient, in terms of number of complications. It identifies a ranked list of surgical teams personalized for each patient, based on prior performance of the surgical teams. DisTeam takes into account the surgical complications associated with teams and their members, their teamwork history, as well as patient’s specific characteristics such as age, body mass index (BMI) and Charlson comorbidity index score. Results: We tested DisTeam using intra-operative data from 6065 unique orthopedic surgery cases. Our results suggest high effectiveness of the proposed system in a health-care setting. The proposed framework converges quickly to the optimal solution and provides two sets of answers: a) the best surgical team over all the generations, and b) the best population, which consists of different teams that can be used as an alternative solution. This increases the flexibility of the system as a complementary decision support tool. Conclusion: DisTeam is a decision support tool for assisting in surgical team selection.
It can facilitate the job of scheduling personnel in the hospital which involves an overwhelming number of factors pertaining to patients, individual team members, and team dynamics and can be used to compose patient-personalized surgical teams with minimum (potential) surgical complications. PMID:28363285
DisTeam: A decision support tool for surgical team selection.
Ebadi, Ashkan; Tighe, Patrick J; Zhang, Lei; Rashidi, Parisa
2017-02-01
Surgical service providers play a crucial role in the healthcare system. Amongst all the influencing factors, surgical team selection might affect the patients' outcome significantly. The performance of a surgical team not only can depend on the individual members, but it can also depend on the synergy among team members, and could possibly influence patient outcome such as surgical complications. In this paper, we propose a tool for facilitating decision making in surgical team selection based on considering history of the surgical team, as well as the specific characteristics of each patient. DisTeam (a decision support tool for surgical team selection) is a metaheuristic framework for objective evaluation of surgical teams and finding the optimal team for a given patient, in terms of number of complications. It identifies a ranked list of surgical teams personalized for each patient, based on prior performance of the surgical teams. DisTeam takes into account the surgical complications associated with teams and their members, their teamwork history, as well as patient's specific characteristics such as age, body mass index (BMI) and Charlson comorbidity index score. We tested DisTeam using intra-operative data from 6065 unique orthopedic surgery cases. Our results suggest high effectiveness of the proposed system in a health-care setting. The proposed framework converges quickly to the optimal solution and provides two sets of answers: a) The best surgical team over all the generations, and b) The best population which consists of different teams that can be used as an alternative solution. This increases the flexibility of the system as a complementary decision support tool. DisTeam is a decision support tool for assisting in surgical team selection. 
It can facilitate the job of scheduling personnel in the hospital which involves an overwhelming number of factors pertaining to patients, individual team members, and team dynamics and can be used to compose patient-personalized surgical teams with minimum (potential) surgical complications. Copyright © 2017 Elsevier B.V. All rights reserved.
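The abstract describes DisTeam as a metaheuristic search that evolves candidate teams and returns both a single best team and a best population of alternatives. A minimal genetic-algorithm sketch of that idea follows; the member names, complication rates, and synergy penalties are invented for illustration and do not come from the paper:

```python
import itertools
import random

def team_score(team, solo_risk, pair_risk):
    """Lower is better: individual complication rates plus pairwise synergy penalties."""
    score = sum(solo_risk[m] for m in team)
    score += sum(pair_risk.get(frozenset(p), 0.0)
                 for p in itertools.combinations(team, 2))
    return score

def crossover(a, b, team_size):
    """Take the head of one parent, then fill with unseen members of the other."""
    child = a[: team_size // 2]
    for m in b + a:
        if len(child) == team_size:
            break
        if m not in child:
            child.append(m)
    return child

def evolve_teams(members, solo_risk, pair_risk, team_size=3,
                 pop_size=20, generations=30, seed=0):
    """Genetic search for the lowest-risk team; returns the best team and
    the final ranked population (the 'alternative solutions')."""
    rng = random.Random(seed)
    key = lambda t: team_score(t, solo_risk, pair_risk)
    population = [rng.sample(members, team_size) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=key)
        survivors = population[: pop_size // 2]      # elitism: keep the best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = crossover(a, b, team_size)
            if rng.random() < 0.2:                   # mutation: swap in an outsider
                i = rng.randrange(team_size)
                child[i] = rng.choice([m for m in members if m not in child])
            children.append(child)
        population = survivors + children
    population.sort(key=key)
    return population[0], population
```

Returning the whole ranked population, not just the winner, mirrors the paper's "two sets of answers": scheduling staff can fall back on the next-best team when the optimal one is unavailable.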
Hunter, Sarah B.; Ebener, Patricia; Paddock, Susan M.; Stillman, Lindsey; Imm, Pamela; Wandersman, Abraham
2010-01-01
Communities are increasingly being required by state and federal funders to achieve outcomes and be accountable, yet are often not provided the guidance or the tools needed to successfully meet this challenge. To improve the likelihood of achieving positive outcomes, the Getting To Outcomes (GTO) intervention (manual, training, technical assistance) is designed to provide the necessary guidance and tools, tailored to community needs, in order to build individual capacity and program performance. GTO is an example of a Prevention Support System intervention, which, as conceptualized by the Interactive Systems Framework, plays a key role in bridging the gap between prevention science (Prevention Synthesis and Translation System) and prevention practice (Prevention Delivery System). We evaluated the impact of GTO on individual capacity and program performance using survey- and interview-based methods. We tracked the implementation of GTO and gathered user feedback about its utility and acceptability. The evaluation of GTO suggests that it can build individual capacity and program performance and as such demonstrates that the Prevention Support System can successfully fulfill its intended role. Lessons learned from the implementation of GTO relevant to illuminating the framework are discussed. PMID:18278551
A comparative assessment of tools for ecosystem services quantification and valuation
Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert
2013-01-01
To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each of the tools' intended uses, services modeled, analytical approaches, data requirements, and outputs, as well as the time requirements to run seven tools in a first comparative concurrent application of multiple tools to a common location – the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools' current 'readiness' for widespread application within both public- and private-sector decision making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which are essential to facilitate their more widespread use in environmental decision making.
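Rating tools against a fixed set of evaluative criteria reduces to a weighted scoring matrix. A small sketch follows, with hypothetical tool names, criteria, and ratings rather than the paper's actual eight criteria or its scores:

```python
def readiness_scores(ratings, weights=None):
    """Score each tool 0-1 against evaluative criteria; higher means more ready.

    ratings: {tool: {criterion: rating in [0, 1]}}
    weights: optional {criterion: weight}; defaults to equal weighting.
    """
    scored = {}
    for tool, crits in ratings.items():
        w = weights or {c: 1.0 for c in crits}
        total = sum(w[c] for c in crits)
        scored[tool] = sum(crits[c] * w[c] for c in crits) / total
    # Return tools ranked from most to least 'ready'.
    return dict(sorted(scored.items(), key=lambda kv: -kv[1]))
```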
Smith, Madison; Wilder, David A
2018-06-01
The Performance Diagnostic Checklist-Human Services (PDC-HS) is an informant-based tool designed to identify the variables responsible for performance problems. To date, the PDC-HS has not been examined with individuals with intellectual disabilities. In the current study, two supervisors with intellectual disabilities completed the PDC-HS to assess the productivity of two supervisees with disabilities who performed a pricing task in a thrift store. The PDC-HS suggested that performance deficits were due to a lack of training; a PDC-HS-indicated intervention was effective in increasing accurate pricing. • The PDC-HS is an informant-based tool designed to identify the variables responsible for employee performance problems in human service settings. • The PDC-HS can be completed by some individuals with intellectual disabilities in a supervisory position to identify the variables responsible for problematic job performance among their supervisees. • A PDC-HS-indicated intervention was demonstrated to be effective in improving the job performance of individuals with disabilities. • The PDC-HS may be a useful tool to support performance improvement and job maintenance among individuals with intellectual disabilities.
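Scoring an informant-based checklist like this typically means tallying deficit-indicating answers per domain and intervening where deficits concentrate. The sketch below uses illustrative domain names and a simple "count the no's" rule; the real PDC-HS domains and scoring procedure may differ:

```python
def checklist_summary(answers):
    """Tally 'no' (deficit-indicating) answers per domain and return the
    domain with the most indicated deficits, plus the full tally."""
    deficits = {}
    for domain, responses in answers.items():
        deficits[domain] = sum(1 for r in responses if r == "no")
    top_domain = max(deficits, key=deficits.get)
    return top_domain, deficits
```

In the study's terms, a result pointing at a training-like domain would suggest a training-based intervention, as the PDC-HS indicated there.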
USDA-ARS's Scientific Manuscript database
In conventional and most IPM programs, application of insecticides continues to be the most important responsive pest control tactic. For both immediate and long-term optimization and sustainability of insecticide applications, it is paramount to study the factors affecting the performance of insect...
The SEA of the Future: Building Agency Capacity for Evidence-Based Policymaking. Volume 5
ERIC Educational Resources Information Center
Gross, Betheny, Ed.; Jochim, Ashley, Ed.
2015-01-01
"The SEA of the Future" is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This volume, the fifth in the…
The SEA of the Future: Maximizing Opportunities under ESSA. Volume 6
ERIC Educational Resources Information Center
Jochim, Ashley, Ed.; Gross, Betheny, Ed.
2016-01-01
"The SEA of the Future" is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This volume, the sixth in the…
The SEA of the Future: Building the Productivity Infrastructure. Volume 3
ERIC Educational Resources Information Center
Gross, Betheny, Ed.; Jochim, Ashley, Ed.
2014-01-01
"The SEA of the Future" is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This volume, the third in the…
ERIC Educational Resources Information Center
Amoatemaa, Abena Serwaa; Kyeremeh, Dorcas Darkoah
2016-01-01
Many organisations are increasingly making use of employee recognition to motivate employees to achieve high performance and productivity. Research has shown that effective recognition occurs in organisations that have strong supportive culture, understand the psychology of praising employees for their good work, and apply the principles of…
Mobile Formative Assessment Tool Based on Data Mining Techniques for Supporting Web-Based Learning
ERIC Educational Resources Information Center
Chen, Chih-Ming; Chen, Ming-Chuan
2009-01-01
Current trends clearly indicate that online learning has become an important learning mode. However, no effective assessment mechanism for learning performance yet exists for e-learning systems. Learning performance assessment aims to evaluate what learners learned during the learning process. Traditional summative evaluation only considers final…
OISI dynamic end-to-end modeling tool
NASA Astrophysics Data System (ADS)
Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo
2000-07-01
The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments, such as optical stellar interferometers. 'End-to-end modeling' denotes that the overall model comprises not only optical sub-models but also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that system-level instrument performance under disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.
Performance Monitoring of Chilled-Water Distribution Systems Using HVAC-Cx
Ferretti, Natascha Milesi; Galler, Michael A.; Bushby, Steven T.
2017-01-01
In this research we develop, test, and demonstrate the newest extension of the software HVAC-Cx (NIST and CSTB 2014), an automated commissioning tool for detecting common mechanical faults and control errors in chilled-water distribution systems (loops). The commissioning process can improve occupant comfort, ensure the persistence of correct system operation, and reduce energy consumption. Automated tools support the process by decreasing the time and the skill level required to carry out necessary quality assurance measures, and as a result they enable more thorough testing of building heating, ventilating, and air-conditioning (HVAC) systems. This paper describes the algorithm, developed by the National Institute of Standards and Technology (NIST), to analyze chilled-water loops and presents the results of a passive monitoring investigation using field data obtained from BACnet® (ASHRAE 2016) controllers, along with field validation of the findings. The tool was successful in detecting faults in system operation in its first field implementation, supporting the investigation phase through performance monitoring. Its findings led to a full energy retrocommissioning of the field site. PMID:29167584
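Passive-monitoring fault detection of the kind described can be sketched as a threshold rule applied to trend data. The rule below (flag a low supply/return temperature difference while the loop is running) and its 4 °C threshold are illustrative assumptions, not the HVAC-Cx algorithm:

```python
def flag_low_delta_t(samples, min_delta_t=4.0):
    """Flag chilled-water samples where the loop is running but the
    return-minus-supply temperature difference stays below a threshold
    (a classic 'low delta-T' symptom). Returns the indices of faulty samples."""
    faults = []
    for i, s in enumerate(samples):
        if s["pump_on"] and (s["return_t"] - s["supply_t"]) < min_delta_t:
            faults.append(i)
    return faults
```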
On Services for Collaborative Project Management
NASA Astrophysics Data System (ADS)
Ollus, Martin; Jansson, Kim; Karvonen, Iris; Uoti, Mikko; Riikonen, Heli
This paper presents an approach for collaborative project management. The focus is on the support of collaboration, communication and trust. Several project management tools exist for monitoring and controlling the performance of project tasks; however, support for important intangible assets is more difficult to find. In the paper, a leadership approach is identified as a management means, and the use of new IT technology, especially social media, to support leadership in project management is discussed.
Work readiness tools for young adults with chronic conditions.
Metzinger, Courtney; Berg, Christine
2015-01-01
Young adults with chronic health conditions can experience barriers to work performance, ability, and their present and future worker roles. Work readiness resources can expand individuals' work skills, abilities, and interests. Five work readiness tools are presented: (1) building an occupational profile, (2) generating environmental strategies, (3) on-the-job strategy use, and two online tools, (4) O*NET® and (5) the O*NET® Interest Profiler, along with two theories (Knowles's andragogy and Lawton's ecological model) to guide tool use. Use of these tools can assist young adults to better manage their health and expand their vocational identities for success at work. These approaches and tools support health professionals, community partners, and vocational organizations in their efforts to help young adults with chronic conditions.
Student Evaluations of Teaching Are an Inadequate Assessment Tool for Evaluating Faculty Performance
ERIC Educational Resources Information Center
Hornstein, Henry A.
2017-01-01
Literature is examined to support the contention that student evaluations of teaching (SET) should not be used for summative evaluation of university faculty. Recommendations for alternatives to SET are provided.
Modeling and performance analysis of QoS data
NASA Astrophysics Data System (ADS)
Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.
2016-09-01
The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested that take into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. Traffic-shaping mechanisms based on Priority Queuing are studied, with both an integrated data source and various generated data sources. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, are designed and built, and simulation tools are used to verify the traffic-shaping mechanisms under the applied queuing algorithms. It is shown that temporal Petri net models can be effectively used in modeling and analyzing the performance of computer networks.
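Strict Priority Queuing of the kind studied here always serves the highest-priority class first, preserving FIFO order within a class. A minimal sketch using a heap keyed on (priority, arrival order):

```python
import heapq

class PriorityScheduler:
    """Strict priority queuing: lower priority number is served first;
    an arrival counter keeps FIFO order within the same class."""

    def __init__(self):
        self._heap = []
        self._arrivals = 0

    def enqueue(self, packet, priority):
        heapq.heappush(self._heap, (priority, self._arrivals, packet))
        self._arrivals += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]
```

The known drawback this models is starvation: as long as high-priority packets keep arriving, lower classes never get served, which is one reason such mechanisms need careful analysis.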
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
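One way to quantify step-ordering recall against a baseline process is the fraction of step pairs whose relative order the participant reproduced (a Kendall-tau-style measure). This sketch is an illustration of that general idea, not a method taken from the paper:

```python
def ordering_agreement(baseline, recalled):
    """Fraction of step pairs whose relative order in the recalled sequence
    matches the baseline process (1.0 = perfect order recall).
    Steps missing from the recall are simply excluded from the comparison."""
    pos = {step: i for i, step in enumerate(recalled)}
    common = [s for s in baseline if s in pos]
    pairs = concordant = 0
    for i in range(len(common)):
        for j in range(i + 1, len(common)):
            pairs += 1
            if pos[common[i]] < pos[common[j]]:
                concordant += 1
    return concordant / pairs if pairs else 1.0
```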
Integration of g4tools in Geant4
NASA Astrophysics Data System (ADS)
Hřivnáčová, Ivana
2014-06-01
g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples, and write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes in response to user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.
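The create/fill/write workflow described above can be illustrated with a minimal Python analogue of a 1-D histogram written to CSV (one of the output formats g4tools supports). This is a conceptual sketch, not the g4tools C++ API:

```python
import csv
import io

class H1:
    """Minimal 1-D fixed-binning histogram: book it, fill it, write it out."""

    def __init__(self, title, nbins, xmin, xmax):
        self.title = title
        self.nbins, self.xmin, self.xmax = nbins, xmin, xmax
        self.counts = [0] * nbins

    def fill(self, x, weight=1):
        # Values outside [xmin, xmax) are silently dropped (no overflow bins here).
        if self.xmin <= x < self.xmax:
            i = int((x - self.xmin) / (self.xmax - self.xmin) * self.nbins)
            self.counts[i] += weight

    def write_csv(self, stream):
        w = csv.writer(stream)
        w.writerow(["bin", "count"])
        for i, c in enumerate(self.counts):
            w.writerow([i, c])
```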
Heumann, Frederick K.; Wilkinson, Jay C.; Wooding, David R.
1997-01-01
A remote appliance for supporting a tool for performing work at a worksite on a substantially circular bore of a workpiece and for providing video signals of the worksite to a remote monitor comprising: a baseplate having an inner face and an outer face; a plurality of rollers, wherein each roller is rotatably and adjustably attached to the inner face of the baseplate and positioned to roll against the bore of the workpiece when the baseplate is positioned against the mouth of the bore such that the appliance may be rotated about the bore in a plane substantially parallel to the baseplate; a tool holding means for supporting the tool, the tool holding means being adjustably attached to the outer face of the baseplate such that the working end of the tool is positioned on the inner face side of the baseplate; a camera for providing video signals of the worksite to the remote monitor; and a camera holding means for supporting the camera on the inner face side of the baseplate, the camera holding means being adjustably attached to the outer face of the baseplate. In a preferred embodiment, roller guards are provided to protect the rollers from debris and a bore guard is provided to protect the bore from wear by the rollers and damage from debris.
DspaceOgre 3D Graphics Visualization Tool
NASA Technical Reports Server (NTRS)
Jain, Abhinandan; Myin, Steven; Pomerantz, Marc I.
2011-01-01
This general-purpose 3D graphics visualization C++ tool is designed for visualization of simulation and analysis data for articulated mechanisms. Examples of such systems are vehicles, robotic arms, biomechanics models, and biomolecular structures. DspaceOgre builds upon the open-source Ogre3D graphics visualization library. It provides additional classes to support the management of complex scenes involving multiple viewpoints and different scene groups, and can be used as a remote graphics server. This software provides improved support for adding programs at the graphics processing unit (GPU) level for improved performance. It also improves upon the messaging interface it exposes for use as a visualization server.
The Establishment of a New Friction Stir Welding Process Development Facility at NASA/MSFC
NASA Technical Reports Server (NTRS)
Carter, Robert W.
2009-01-01
Full-scale weld process development is being performed at MSFC to develop the tools, fixtures, and facilities necessary for Ares I production. Full scale development in-house at MSFC fosters technical acuity within the NASA engineering community, and allows engineers to identify and correct tooling and equipment shortcomings before they become problems on the production floor. Finally, while the new weld process development facility is currently being outfitted in support of Ares I development, it has been established to support all future Constellation Program needs. In particular, both the RWT and VWT were sized with the larger Ares V hardware in mind.
Beyond consumer-driven health care: purchasers' expectations of all plans.
Lee, Peter V; Hoo, Emma
2006-01-01
Skyrocketing health care costs and quality deficits can only be addressed through a broad approach of quality-based benefit design. Consumer-directed health plans that are built around better consumer information tools and support hold the promise of consumer engagement, but purchasers expect these features in all types of health plans. Regardless of plan type, simply shifting costs to consumers is a threat to access and adherence to evidence-based medicine. Comparative and interactive consumer information tools, coupled with provider performance transparency and payment reform, are needed to advance accountability and support consumers in getting the right care at the right time.
Braido, Fulvio; Santus, Pierachille; Corsico, Angelo Guido; Di Marco, Fabiano; Melioli, Giovanni; Scichilone, Nicola; Solidoro, Paolo
2018-01-01
The purpose of this study was the development and validation of an expert system (ES) aimed at supporting the diagnosis of chronic obstructive lung disease (COLD). A questionnaire and a WebFlex code were developed and validated in silico. An expert panel pilot validation on 60 cases and a clinical validation on 241 cases were performed. The developed questionnaire and code validated in silico resulted in a suitable tool to support the medical diagnosis. The clinical validation of the ES was performed in an academic setting that included six different reference centers for respiratory diseases. The results of the ES, expressed as a score associated with the risk of suffering from COLD, were matched and compared with the final clinical diagnoses. A set of 60 patients was evaluated by a pilot expert panel validation with the aim of calculating the sample size for the clinical validation study. The concordance analysis between these preliminary ES scores and diagnoses performed by the experts indicated that the accuracy was 94.7% when both experts and the system confirmed the COLD diagnosis and 86.3% when COLD was excluded. Based on these results, the sample size of the validation set was established at 240 patients. The clinical validation, performed on 241 patients, resulted in ES accuracy of 97.5%, with confirmed COLD diagnosis in 53.6% of the cases and excluded COLD diagnosis in 32% of the cases. In 11.2% of cases, a diagnosis of COLD was made by the experts, although the imaging results showed a potential concomitant disorder. The ES presented here (COLD-ES) is a safe and robust supporting tool for COLD diagnosis in primary care settings.
Developing an SSAC Self-Assessment Tool for Operators and Regulators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frazar, Sarah L.; Innes-Jones, Gemma; Hamilton, Ian
Enabling an SSAC to understand why it is performing inefficiently can help it allocate resources more effectively to better support IAEA safeguards implementation. In collaboration with the international consulting firm Environmental Resources Management (ERM) and a U.S.-based nuclear fuel cycle facility, the Pacific Northwest National Laboratory (PNNL) has been developing a framework for a future self-assessment tool for nuclear operators and regulators. This paper describes the effort to date, with particular emphasis on the steps the team took to align the framework with relevant IAEA self-assessment tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake, Richard R.
Vvtools is a suite of testing tools with a focus on reproducible verification and validation. They are written in pure Python and contain a test harness and an automated process management tool. Users of vvtools can develop suites of verification and validation tests and run them on small to large high-performance computing resources in an automated and reproducible way. The test harness enables complex processes to be performed in each test and even supports a one-level parent/child dependency between tests. It includes a built-in capability to manage workloads requiring multiple processors and platforms that use batch queueing systems.
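The core of a test harness like the one described — run each test, record pass or fail, report in a stable order — can be sketched in a few lines. This is an illustration of the concept, not the vvtools implementation:

```python
import traceback

def run_suite(tests):
    """Run each named test callable and return a reproducible pass/fail report.

    tests: {name: zero-argument callable that raises on failure}
    """
    report = {}
    for name, fn in sorted(tests.items()):   # sorted names: stable, reproducible order
        try:
            fn()
            report[name] = "pass"
        except Exception:
            # Keep only the final exception line as the failure summary.
            report[name] = "fail: " + traceback.format_exc().strip().splitlines()[-1]
    return report
```

A one-level parent/child dependency, as the abstract mentions, could be layered on top by skipping a child whenever its parent's entry in the report is not "pass".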
ERIC Educational Resources Information Center
Mattison, Michelle L. A.; Dando, Coral J.; Ormerod, Thomas C.
2015-01-01
Deficits in episodic free-recall memory performance have been reported in children with autism spectrum disorder (ASD), yet best practice dictates that child witness/victim interviews commence with a free-recall account. No "tools" exist to support children with ASD to freely recall episodic information. Here, the efficacy of a novel…
Compilation of Theses Abstracts
2005-06-01
Survey of Human Systems Integration (HSI) Tools for USCG Acquisitions
2009-04-01
an IMPRINT HPM. IMPRINT uses task network modeling to represent human performance. As the name implies, task networks use a flowchart-type format...tools; and built-in tutoring support for beginners. A perceptual/motor layer extending ACT-R's theory of cognition to perception and action is also...In information flow analysis, a flowchart of the information and decisions
Implementation of a Parameterization Framework for Cybersecurity Laboratories
2017-03-01
designer of laboratory exercises with tools to parameterize labs for each student, and automate some aspects of the grading of laboratory exercises. ...support might assist the designer of laboratory exercises to achieve the following? 1. Verify that students performed lab exercises, with some
NASA Technical Reports Server (NTRS)
Tahmasebi, Farhad; Pearce, Robert
2016-01-01
Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
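Task 3 above — finding possible paths from hazard sources to vulnerable entities — is a graph-search problem over the system-software model. A sketch enumerating simple paths in a directed graph follows; the node names are invented for illustration:

```python
def hazard_paths(edges, sources, targets):
    """Enumerate simple (cycle-free) paths from hazard-source nodes to
    vulnerable target nodes in a directed interaction graph."""
    graph = {}
    for u, v in edges:
        graph.setdefault(u, []).append(v)
    found = []
    stack = [(s, [s]) for s in sources]     # depth-first search, path carried along
    while stack:
        node, path = stack.pop()
        if node in targets:
            found.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:              # simple paths only: no revisiting
                stack.append((nxt, path + [nxt]))
    return found
```

Each returned path is a candidate propagation route to examine in simulation (task 4) and a candidate scenario for integration testing (task 5).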
Micro-Vibration Performance Prediction of SEPTA24 Using SMeSim (RUAG Space Mechanism Simulator Tool)
NASA Astrophysics Data System (ADS)
Omiciuolo, Manolo; Lang, Andreas; Wismer, Stefan; Barth, Stephan; Szekely, Gerhard
2013-09-01
Scientific space missions are currently challenging the performance of their payloads. Performance can be dramatically restricted by micro-vibration loads generated by any moving parts of the satellite, including Solar Array Drive Assemblies. Micro-vibration prediction of SADAs is therefore very important to support their design and optimization in the early stages of a programme. The Space Mechanism Simulator (SMeSim) tool, developed by RUAG, enhances the capability of analysing the micro-vibration emissivity of a Solar Array Drive Assembly (SADA) under a specified set of boundary conditions. The tool is developed in the Matlab/Simulink® environment through a library of blocks simulating the different components a SADA is made of. The modular architecture of the blocks, assembled by the user, and the set-up of the boundary conditions allow time-domain and frequency-domain analyses of a rigid multi-body model with concentrated flexibilities and coupled electronic control of the mechanism. SMeSim is used to model the SEPTA24 Solar Array Drive Mechanism and predict its micro-vibration emissivity. SMeSim and the return of experience earned throughout its development and use can now support activities like verification by analysis of micro-vibration emissivity requirements and/or design optimization to minimize the micro-vibration emissivity of a SADA.
NASA Astrophysics Data System (ADS)
Fasel, Markus
2016-10-01
High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.
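The job-submission side of such a tool typically renders a batch script for the site scheduler. The sketch below emits standard sbatch directives (`--job-name`, `--nodes`, `--time`); the overall layout and function are illustrative assumptions, not the tool's own code:

```python
def slurm_script(job_name, command, nodes=1, walltime="02:00:00"):
    """Render a minimal SLURM batch script for the given command."""
    lines = [
        "#!/bin/bash",
        f"#SBATCH --job-name={job_name}",
        f"#SBATCH --nodes={nodes}",
        f"#SBATCH --time={walltime}",
        "",
        command,
    ]
    return "\n".join(lines)
```

Keeping script generation behind a function like this is what makes the framework "easily adaptable": supporting a different scheduler means swapping the renderer, not the workflow around it.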
XWeB: The XML Warehouse Benchmark
NASA Astrophysics Data System (ADS)
Mahboubi, Hadj; Darmont, Jérôme
With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
Impacts of Lateral Boundary Conditions on US Ozone ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
FY17 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Jung, Y. S.; Smith, M. A.
2017-09-30
Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.
Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca
2016-10-01
There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, clinicians have reported an increased bureaucratization of clinical practice and redundancies in the measures collected. Redundancies in clinical assessments negatively affect both patients and nurses. The aim was to validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r = 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three emergent factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as with the original tools. © 2016 John Wiley & Sons, Ltd.
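The inter-tool correlation step reported above (Pearson's r between tool scores) can be sketched in a few lines of Python; the per-patient scores below are invented for illustration, not study data:

```python
# Toy illustration of an inter-tool correlation (Pearson's r);
# the scores below are hypothetical, not from the study.
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-patient scores on two assessment tools:
barthel = [35, 60, 80, 90, 100]
conley = [9, 7, 5, 3, 1]
print(round(pearson_r(barthel, conley), 3))
```

A strongly negative r here would suggest the two tools capture overlapping information, the kind of redundancy the meta-tool is designed to eliminate.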
A decision support tool for adaptive management of native prairie ecosystems
Hunt, Victoria M.; Jacobi, Sarah; Gannon, Jill J.; Zorn, Jennifer E.; Moore, Clinton; Lonsdorf, Eric V.
2016-01-01
The Native Prairie Adaptive Management initiative is a decision support framework that provides cooperators with management-action recommendations to help them conserve native species and suppress invasive species on prairie lands. We developed a Web-based decision support tool (DST) for the U.S. Fish and Wildlife Service and the U.S. Geological Survey initiative. The DST facilitates cross-organizational data sharing, performs analyses to improve conservation delivery, and requires no technical expertise to operate. Each year since 2012, the DST has used monitoring data to update ecological knowledge that it translates into situation-specific management-action recommendations (e.g., controlled burn or prescribed graze). The DST provides annual recommendations for more than 10,000 acres on 20 refuge complexes in four U.S. states. We describe how the DST promotes the long-term implementation of the program for which it was designed and may facilitate decision support and improve ecological outcomes of other conservation efforts.
Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A
2014-01-01
The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.
NASA Astrophysics Data System (ADS)
Kim, Woojin; Boonn, William
2010-03-01
Data mining of existing radiology and pathology reports within an enterprise health system can be used for clinical decision support, research, education, as well as operational analyses. In our health system, the database of radiology and pathology reports exceeds 13 million entries combined. We are building a web-based tool to allow search and data analysis of these combined databases using freely available and open source tools. This presentation will compare performance of an open source full-text indexing tool to MySQL's full-text indexing and searching and describe implementation procedures to incorporate these capabilities into a radiology-pathology search engine.
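A minimal sketch of the full-text indexing idea behind such a search engine, assuming a toy in-memory corpus rather than the authors' 13-million-entry database:

```python
from collections import defaultdict

# Hypothetical mini-corpus of report text keyed by report id:
reports = {
    1: "chest xray shows pneumonia in right lower lobe",
    2: "pathology consistent with adenocarcinoma",
    3: "no evidence of pneumonia on followup xray",
}

def build_index(docs):
    """Inverted index: map each token to the set of report ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token].add(doc_id)
    return index

def search(index, *terms):
    """AND-style full-text search: ids containing every query term."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

idx = build_index(reports)
print(sorted(search(idx, "pneumonia")))          # [1, 3]
print(sorted(search(idx, "pneumonia", "xray")))  # [1, 3]
```

Production tools (MySQL's `MATCH ... AGAINST`, or an open source indexer such as Lucene) add stemming, ranking, and on-disk index structures, but the core data structure is this same token-to-documents mapping.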
Interventions to Modify Health Care Provider Adherence to Asthma Guidelines: A Systematic Review
Okelo, Sande O.; Butz, Arlene M.; Sharma, Ritu; Diette, Gregory B.; Pitts, Samantha I.; King, Tracy M.; Linn, Shauna T.; Reuben, Manisha; Chelladurai, Yohalakshmi
2013-01-01
BACKGROUND AND OBJECTIVE: Health care provider adherence to asthma guidelines is poor. The objective of this study was to assess the effect of interventions to improve health care providers’ adherence to asthma guidelines on health care process and clinical outcomes. METHODS: Data sources included Medline, Embase, Cochrane CENTRAL Register of Controlled Trials, Cumulative Index to Nursing and Allied Health Literature, Educational Resources Information Center, PsycINFO, and Research and Development Resource Base in Continuing Medical Education up to July 2012. Paired investigators independently assessed study eligibility. Investigators abstracted data sequentially and independently graded the evidence. RESULTS: Sixty-eight eligible studies were classified by intervention: decision support, organizational change, feedback and audit, clinical pharmacy support, education only, quality improvement/pay-for-performance, multicomponent, and information only. Half were randomized trials (n = 35). There was moderate evidence for increased prescriptions of controller medications for decision support, feedback and audit, and clinical pharmacy support and low-grade evidence for organizational change and multicomponent interventions. Moderate evidence supports the use of decision support and clinical pharmacy interventions to increase provision of patient self-education/asthma action plans. Moderate evidence supports use of decision support tools to reduce emergency department visits, and low-grade evidence suggests there is no benefit for this outcome with organizational change, education only, and quality improvement/pay-for-performance. CONCLUSIONS: Decision support tools, feedback and audit, and clinical pharmacy support were most likely to improve provider adherence to asthma guidelines, as measured through health care process outcomes. There is a need to evaluate health care provider-targeted interventions with standardized outcomes. PMID:23979092
SMARTe is a web-based decision support tool intended to help revitalization practitioners find information, perform data analysis, communicate, and evaluate future reuse options for a site or area. A tutorial was developed to help users navigate SMARTe. This tutorial is approxima...
Software Maintenance of the Subway Environment Simulation Computer Program
DOT National Transportation Integrated Search
1980-12-01
This document summarizes the software maintenance activities performed to support the Subway Environment Simulation (SES) Computer Program. The SES computer program is a design-oriented analytic tool developed during a recent five-year research proje...
Automation Bias: Decision Making and Performance in High-Tech Cockpits
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
CWA 15793 2011 Planning and Implementation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Alan; Nail, George
This software, built on an open source platform called Electron (runs on Chromium and Node.js), is designed to assist organizations in the implementation of a biorisk management system consistent with the requirements of the international, publicly available guidance document CEN Workshop Agreement 15793:2011 (CWA 15793). The software includes tools for conducting organizational gap analysis against CWA 15793 requirements, planning tools to support the implementation of CWA 15793 requirements, and performance monitoring support. The gap analysis questions are based on the text of CWA 15793 and its associated guidance document, CEN Workshop Agreement 16393:2012. The authors have secured permission from the publisher of CWA 15793, the European Committee for Standardization (CEN), to use language from the document in the software, with the understanding that the software will be made available freely, without charge.
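A hedged sketch of what a gap-analysis scoring pass might look like; the question identifiers and 0-2 scoring scale below are hypothetical, not taken from CWA 15793:

```python
# Hypothetical gap-analysis responses: question id -> conformance score
# (0 = not addressed, 1 = partial, 2 = fully implemented). The real tool's
# questions come from CWA 15793 / CWA 16393; these ids are made up.
RESPONSES = {
    "risk_assessment": 2,
    "incident_reporting": 1,
    "training_records": 0,
}

def gap_report(responses, max_score=2):
    """Return percent conformance and the sorted list of questions with gaps."""
    pct = 100.0 * sum(responses.values()) / (max_score * len(responses))
    gaps = sorted(q for q, s in responses.items() if s < max_score)
    return pct, gaps

pct, gaps = gap_report(RESPONSES)
print(round(pct, 1), gaps)  # 50.0 ['incident_reporting', 'training_records']
```

The gap list is exactly what a planning module would consume to generate implementation tasks, which is how the gap-analysis and planning tools in such a system fit together.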
Validation of an explanatory tool for data-fused displays for high-technology future aircraft
NASA Astrophysics Data System (ADS)
Fletcher, Georgina C. L.; Shanks, Craig R.; Selcon, Stephen J.
1996-05-01
As the number of sensor and data sources in the military cockpit increases, pilots will suffer high levels of workload which could result in reduced performance and the loss of situational awareness. A DRA research program has been investigating the use of data-fused displays in decision support and has developed and laboratory-tested an explanatory tool for displaying information in air combat scenarios. The tool has been designed to provide pictorial explanations of data that maintain situational awareness by involving the pilot in the hostile aircraft threat assessment task. This paper reports a study carried out to validate the success of the explanatory tool in a realistic flight simulation facility. Aircrew were asked to perform a threat assessment task, either with or without the explanatory tool providing information in the form of missile launch success zone envelopes, while concurrently flying a waypoint course within set flight parameters. The results showed that there was a significant improvement (p less than 0.01) in threat assessment accuracy of 30% when using the explanatory tool. This threat assessment performance advantage was achieved without a trade-off with flying task performance. Situational awareness measures showed no general differences between the explanatory and control conditions, but significant learning effects suggested that the explanatory tool makes the task initially more intuitive and hence less demanding on the pilots' attentional resources. The paper concludes that DRA's data-fused explanatory tool is successful at improving threat assessment accuracy in a realistic simulated flying environment, and briefly discusses the requirements for further research in the area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G.P.; Burns, H.H.; Langton, C.
2013-07-01
The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste and Nuclear Materials Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to 100 years and longer for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox has produced tangible benefits to the DOE Performance Assessment (PA) community. A review of prior DOE PAs has provided a list of potential opportunities for improving cementitious barrier performance predictions through the use of the CBP software tools. These opportunities include: 1) impact of atmospheric exposure to concrete and grout before closure, such as accelerated slag and Tc-99 oxidation, 2) prediction of changes in Kd/mobility as a function of time that result from changing pH and redox conditions, 3) concrete degradation from rebar corrosion due to carbonation, 4) early age cracking from drying and/or thermal shrinkage and 5) degradation due to sulfate attack. The CBP has already had the opportunity to provide near-term, tangible support to ongoing DOE-EM PAs such as the Savannah River Saltstone Disposal Facility (SDF) by providing a sulfate attack analysis that predicts the extent and damage that sulfate ingress will have on the concrete vaults over extended time (i.e., > 1000 years).
This analysis is one of the many technical opportunities in cementitious barrier performance that can be addressed by the DOE-EM sponsored CBP software tools. Modification of the existing tools can provide many opportunities to bring defense in depth in prediction of the performance of cementitious barriers over time. (authors)
Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology
Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron
2010-01-01
Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
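The reachability analysis of metabolic networks mentioned above can be sketched as a fixpoint computation over reactions; the toy network below uses invented metabolite and reaction names, not actual Pathway Tools identifiers:

```python
# Toy reaction network: each reaction consumes substrates, yields products.
# Names are illustrative, not real Pathway Tools/PGDB identifiers.
REACTIONS = [
    ({"glucose"}, {"g6p"}),
    ({"g6p"}, {"f6p"}),
    ({"f6p", "atp"}, {"f16bp"}),
]

def reachable_metabolites(seeds, reactions):
    """Forward reachability: grow the available metabolite set to a fixpoint,
    firing any reaction whose substrates are all available."""
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if substrates <= available and not products <= available:
                available |= products
                changed = True
    return available

print(sorted(reachable_metabolites({"glucose", "atp"}, REACTIONS)))
```

Note that `f16bp` is reachable only when `atp` is among the seed metabolites; this sensitivity to the seed set is what makes reachability analysis useful for spotting gaps in a reconstructed metabolic network.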
ERIC Educational Resources Information Center
Clifford, Matthew; Hansen, Ulcca Joshni; Wraight, Sara
2014-01-01
Across the country, states and districts are designing principal evaluation systems as a means of improving leadership, learning, and school performance. Principal evaluation systems hold potential for supporting leaders' learning and sense of accountability for instructional excellence and student performance. Principal evaluation also is an…
The SEA of the Future: Uncovering the Productivity Promise of Rural Education. Volume 4
ERIC Educational Resources Information Center
Gross, Betheny, Ed.; Jochim, Ashley, Ed.
2015-01-01
"The SEA of the Future" is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This is the fourth volume in the…
ERIC Educational Resources Information Center
Heric, Matthew; Carter, Jenn
2011-01-01
Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…
ERIC Educational Resources Information Center
Clifford, Matthew; Hansen, Ulcca Joshni; Wraight, Sara
2012-01-01
Across the country, states and districts are designing principal evaluation systems as a means of improving leadership, learning, and school performance. Principal evaluation systems hold potential for supporting leaders' learning and sense of accountability for instructional excellence and student performance. Principal evaluation is also an…
NASA Astrophysics Data System (ADS)
Brennan-Tonetta, Margaret
This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool is applicable to other states and regions (with location specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. 
Instead, it operates one level below that, with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that would enable the analysis of broader energy policy issues such as those mentioned above is recommended for future research efforts.
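The kWh comparisons across feedstock/technology combinations described above can be sketched as a simple table computation; all figures and names below are illustrative assumptions, not values from the New Jersey inventory:

```python
# Illustrative figures only; not from the dissertation's inventory.
FEEDSTOCKS = {          # tons/year available
    "food_waste": 100_000,
    "wood_residue": 250_000,
}
ENERGY_YIELD = {        # kWh per ton for a given conversion technology
    ("food_waste", "anaerobic_digestion"): 300,
    ("wood_residue", "gasification"): 1_100,
}

def energy_potential(feedstocks, yields):
    """kWh/year for each feedstock-technology pair, plus the overall total."""
    rows = {pair: feedstocks[pair[0]] * kwh for pair, kwh in yields.items()}
    return rows, sum(rows.values())

rows, total = energy_potential(FEEDSTOCKS, ENERGY_YIELD)
print(f"total kWh/year: {total}")  # total kWh/year: 305000000
```

Scaling this pattern up per county and per technology is essentially what the decision support database does, which is why a consistent feedstock inventory methodology matters so much.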
Visiting Vehicle Ground Trajectory Tool
NASA Technical Reports Server (NTRS)
Hamm, Dustin
2013-01-01
The International Space Station (ISS) Visiting Vehicle Group needed a targeting tool for vehicles that rendezvous with the ISS. The Visiting Vehicle Ground Trajectory targeting tool provides the ability to perform both realtime and planning operations for the Visiting Vehicle Group. This tool provides a highly reconfigurable base, which allows the Visiting Vehicle Group to perform their work. The application is composed of a telemetry processing function, a relative motion function, a targeting function, a vector view, and 2D/3D world map type graphics. The software tool provides the ability to plan a rendezvous trajectory for vehicles that visit the ISS. It models these relative trajectories using planned and realtime data from the vehicle. The tool monitors ongoing rendezvous trajectory relative motion, and ensures visiting vehicles stay within agreed corridors. The software provides the ability to update or re-plan a rendezvous to support contingency operations. Previously, new parameters could not be added and incorporated into the system on the fly; if an unanticipated need was not discovered until the vehicle was flying, there was no way to update the tool.
Mabey, David C.; Chaudhri, Simran; Brown Epstein, Helen-Ann; Lawn, Stephen D.
2017-01-01
Abstract Primary health care workers (HCWs) in low- and middle-income settings (LMIC) often work in challenging conditions in remote, rural areas, in isolation from the rest of the health system and particularly specialist care. Much attention has been given to implementation of interventions to support quality and performance improvement for workers in such settings. However, little is known about the design of such initiatives and which approaches predominate, let alone those that are most effective. We aimed for a broad understanding of what distinguishes different approaches to primary HCW support and performance improvement and to clarify the existing evidence as well as gaps in evidence in order to inform decision-making and design of programs intended to support and improve the performance of health workers in these settings. We systematically searched the literature for articles addressing this topic, and undertook a comparative review to document the principal approaches to performance and quality improvement for primary HCWs in LMIC settings. We identified 40 eligible papers reporting on interventions that we categorized into five different approaches: (1) supervision and supportive supervision; (2) mentoring; (3) tools and aids; (4) quality improvement methods, and (5) coaching. The variety of study designs and quality/performance indicators precluded a formal quantitative data synthesis. The most extensive literature was on supervision, but there was little clarity on what defines the most effective approach to the supervision activities themselves, let alone the design and implementation of supervision programs. The mentoring literature was limited, and largely focused on clinical skills building and educational strategies. Further research on how best to incorporate mentorship into pre-service clinical training, while maintaining its function within the routine health system, is needed. 
There is insufficient evidence to draw conclusions about coaching in this setting; however, a review of the corporate and business school literature is warranted to identify transferable approaches. A substantial literature exists on tools, but significant variation in approaches makes comparison challenging. We found examples of effective individual projects and designs in specific settings, but there was a lack of comparative research on tools across approaches or across settings, and no systematic analysis within specific approaches to provide evidence with clear generalizability. Future research should prioritize comparative intervention trials to establish clear global standards for performance and quality improvement initiatives. Such standards will be critical to creating and sustaining a well-functioning health workforce and for global initiatives such as universal health coverage. PMID:27993961
Uranus: a rapid prototyping tool for FPGA embedded computer vision
NASA Astrophysics Data System (ADS)
Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.
2007-01-01
The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms, with support to migrate them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain, and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.
Using computer graphics to enhance astronaut and systems safety
NASA Technical Reports Server (NTRS)
Brown, J. W.
1985-01-01
Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient and economical analyses for man-machine integration, flight operations development and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding data base of Space Shuttle elements, various payloads, experiments, crew equipment and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration and crew training. As OSDS is being applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab) and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of the current and future space programs for efficient, economical analyses.
A Five-Year CMAQ Model Performance for Wildfires and ...
Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5 year set of CMAQ model simulations (2008-2012) in which we have simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and Ozone contribution from prescribed fires and wildfires. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?
NASA Technical Reports Server (NTRS)
Scott, David W.
2010-01-01
The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA s Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use
Internet and information technology use in treatment of diabetes.
Kaufman, N
2010-02-01
This chapter contains clinical studies and reviews of the state-of-the-art regarding how information technology can help improve outcomes for patients with diabetes through enhanced education and support. With the increasing sophistication of diabetes treatment protocols and diabetes-related devices, this new modality offers a remarkable opportunity for clinicians and patients. For the first time, with online tools clinicians are in a position to have a major impact on diabetes outcomes by providing robust and affordable just-in-time support to large numbers of patients who want to improve their diabetes outcomes through enhanced self-management of the complex behaviours so essential for good outcomes. Patients with diabetes often need a complex set of services and support ranging from glucose monitoring, insulin and other medication management, psychotherapy and social support, to physical activity promotion, nutrition counselling and more. Integrating these supports into a patient's therapeutic regimen presents challenges that need to be addressed through a variety of strategies. Patient self-management of diabetes enabled by information technology is becoming an important factor in the way providers deliver healthcare. Approaches using information technology to support clinical services are being dramatically altered by the confluence of several trends. * Patients want an active role in managing their own health and a collaborative relationship with their healthcare providers. * Widespread, low-cost internet access is erasing existing geographic, economic and demographic barriers to obtaining health information online, and with advanced Web 2.0 technologies high levels of interactivity can engage the patient. * Clinicians and researchers now have a deeper understanding of how people learn and respond online, and that knowledge can be crafted into solutions that produce effective, long-term behaviour change.
Technology-enabled approaches that show great promise to improve outcomes use new models of service provision in which technology-enabled self-management support (SMS) provides patients with * just-in-time delivery of tailored messages and experience that speak to each person based on their unique characteristics, their performance on key behaviours and their needs at that moment in time; * ways to easily and accurately keep track of their performance and use that knowledge to plan and implement new approaches to reaching their goals; * ways to link directly to family and friends for critical support, and to link to their many providers to help integrate medical care with everyday life. Online tools can extend health practices and provide this support through cost-effective programmes that help clinicians guide their patients to better manage their diabetes. The best internet self-management education and support programmes are rich in pertinent content, provide engaging interactive elements, and offer a tailored, personalised learning experience. They contain self-assessment tools and ways for the individual to monitor performance and changes in biological measurements such as blood sugar, insulin dosage, physical activity, weight, blood pressure and mood. The patient can access their information, input their data, and receive support 24 h a day - at a time and place most convenient for them, and not limited to clinicians' office hours. Web-based learning and support technology benefits both clinician and patient; patients learn to overcome barriers and to self-document activities and interactions, permitting clinician review and feedback at any time. In addition to automating much of the educational content, this time-shifting element is one of the keys to making the process efficient and low cost.
The ability to perform an automated review of the patient's activities and performance also provides the clinician with a valuable tool that increases both effectiveness and efficiency. As with online intervention, a 'virtual coach' can provide individualised guidance and support based on readily available analyses of each patient's characteristics and performance. In addition, the clinician can communicate frequently and efficiently, offering personalised email support to each patient without requiring in-person meetings, as well as monitor 'virtual support groups' where patients interact with others online via informational chat rooms and blogs. By incorporating web-based patient self-management and support into traditional treatment methods, one clinician can effectively support many patients - one patient at a time.
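The just-in-time tailored messaging described above can be sketched as a simple rule engine. The thresholds, wording, and function name below are invented for illustration only and are not clinical guidance:

```python
def tailored_message(glucose_mg_dl, days_logged_in_row):
    """Sketch of a just-in-time tailored feedback rule in the spirit of
    technology-enabled SMS. All thresholds and messages are invented
    for illustration; a real system would encode clinical protocols."""
    if glucose_mg_dl < 70:
        return "Your reading is low - please follow your hypoglycemia plan."
    if glucose_mg_dl > 180:
        return "Your reading is high - consider reviewing today's meals."
    if days_logged_in_row >= 7:
        return "Great job - a full week of on-target readings and logging!"
    return "Reading in range - keep logging to build your streak."
```

In a deployed system, rules like these would be authored and validated by clinicians, with the automation handling only the time-shifted delivery.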
Human Factors Evaluations of Two-Dimensional Spacecraft Conceptual Layouts
NASA Technical Reports Server (NTRS)
Kennedy, Kriss J.; Toups, Larry D.; Rudisill, Marianne
2010-01-01
Much of the human factors work done in support of the NASA Constellation lunar program has been performed with low-fidelity mockups. These volumetric replicas of future lunar spacecraft allow researchers to insert test subjects from the engineering and astronaut population and evaluate the vehicle design as the test subjects perform simulations of various operational tasks. However, lunar outpost designs must be evaluated without the use of mockups, creating a need for evaluation methods that can be applied to two-dimensional conceptual spacecraft layouts, such as floor plans. A tool based on the Cooper-Harper scale was developed and applied to one lunar scenario, enabling engineers to select between two competing floor plan layouts. Keywords: Constellation, human factors, tools, processes, habitat, outpost, Net Habitable Volume, Cooper-Harper.
Myer, Gregory D.; Kushner, Adam M.; Brent, Jensen L.; Schoenfeld, Brad J.; Hugentobler, Jason; Lloyd, Rhodri S.; Vermeil, Al; Chu, Donald A.; Harbin, Jason; McGill, Stuart M.
2014-01-01
Fundamental movement competency is essential for participation in physical activity and for mitigating the risk of injury, which are both key elements of health throughout life. The squat movement pattern is arguably one of the most primal and critical fundamental movements necessary to improve sport performance, to reduce injury risk and to support lifelong physical activity. Based on current evidence, this first (1 of 2) report deconstructs the technical performance of the back squat as a foundation training exercise and presents a novel dynamic screening tool that incorporates identification techniques for functional deficits that limit squat performance and injury resilience. The follow-up report will outline targeted corrective methodology for each of the functional deficits presented in the assessment tool. PMID:25506270
How Can You Support RIDM/CRM/RM Through the Use of PRA
NASA Technical Reports Server (NTRS)
DoVemto. Tpmu
2011-01-01
Probabilistic Risk Assessment (PRA) is one of the key Risk Informed Decision Making (RIDM) tools. It is a scenario-based methodology aimed at identifying and assessing Safety and Technical Performance risks in complex technological systems.
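The scenario-based character of PRA can be illustrated with a minimal event-tree calculation: an initiating-event frequency is propagated through conditional branch probabilities to end-state frequencies. All scenario names and numbers below are hypothetical:

```python
def scenario_probabilities(init_freq, branches):
    """Event-tree sketch of PRA scenario quantification. init_freq is
    the initiating-event frequency (per mission or per year); branches
    maps each end state to its chain of conditional branch
    probabilities. Returns end-state frequencies."""
    out = {}
    for name, probs in branches.items():
        p = init_freq
        for pb in probs:        # multiply along the scenario path
            p *= pb
        out[name] = p
    return out

# Hypothetical example: a 1e-3 initiator, mitigated 90% of the time
end_states = scenario_probabilities(
    1e-3,
    {"nominal": [0.9], "abort": [0.1, 0.6], "loss": [0.1, 0.4]})
```

A full PRA would derive the branch probabilities themselves from fault trees and uncertainty distributions; this sketch shows only the scenario arithmetic.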
DOT National Transportation Integrated Search
1982-01-01
The Detailed Station Model (DSM) provides operational and performance measures of alternative station configurations and management policies with respect to vehicle and passenger capabilities. It provides an analytic tool to support tradeoff studies ...
Development of decision-making support tools for early right-of-way acquisitions.
DOT National Transportation Integrated Search
2010-01-01
This report documents the work performed during phase two of Project 0-5534, Asset Management Texas Style. This phase included gathering historical Texas Department of Transportation (TxDOT) right-of-way acquisition information, analyzi...
DOT National Transportation Integrated Search
2011-01-01
To support improved analysis of the environmental impacts of proposed global aircraft operational changes, the United States Federal Aviation Administration recently worked : with European academic partners to update the airport terminal area fuel co...
Low-Level Analytical Methodology Updates to Support Decontaminant Performance Evaluations
2011-06-01
from EPDM and tire rubber coupon materials that were spiked with a known amount of the chemical agent VX, treated with bleach decontaminant, and... to evaluate the performance of bleach decontaminant on EPDM and tire rubber coupons. Dose-confirmation or Tool samples were collected by delivering... components • an aging or damaged analytical column • a dirty detector • other factors related to general instrument and/or sample analysis performance
Mohanta, Paritosh Kumar; Regnet, Fabian; Jörissen, Ludwig
2018-05-28
Stability of the cathode catalyst support material is one of the big challenges of polymer electrolyte membrane fuel cells (PEMFC) for long-term applications. Traditional carbon black (CB) supports are not stable enough to prevent oxidation to CO₂ under fuel cell operating conditions. The feasibility of a graphitized carbon (GC) as a cathode catalyst support for low temperature PEMFC is investigated herein. GC- and CB-supported Pt electrocatalysts were prepared via an already developed polyol process. The physical characterization of the prepared catalysts was performed using transmission electron microscopy (TEM), X-ray powder diffraction (XRD) and inductively coupled plasma optical emission spectrometry (ICP-OES) analysis, and their electrochemical characterization was conducted via cyclic voltammetry (CV), rotating disk electrode (RDE) measurements and potential cycling; eventually, the catalysts were processed into membrane electrode assemblies (MEAs) for single cell performance tests. Electrochemical impedance spectroscopy (EIS) and scanning electron microscopy (SEM) were used as MEA diagnostic tools. GC showed superior stability over CB in acid electrolyte under potential cycling conditions. Single cell MEA performance of the GC-supported catalyst is comparable with the CB-supported catalyst. A correlation of MEA performance of the supported catalysts of different Brunauer-Emmett-Teller (BET) surface areas with the ionomer content was also established. GC was identified as a promising candidate for catalyst support in terms of both stability and fuel cell performance.
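As a small aside on the CV characterization mentioned above, the electrochemically active surface area (ECSA) of a Pt catalyst is conventionally estimated from the hydrogen desorption charge using roughly 0.21 mC/cm² for polycrystalline Pt. A one-line sketch (the numeric inputs in the example are illustrative, not from this study):

```python
def ecsa_cm2_per_mg(q_h_mC, pt_loading_mg):
    """ECSA (cm^2 per mg Pt) from the hydrogen desorption charge q_h
    (mC) of a cyclic voltammogram, using the conventional charge
    density of 0.21 mC/cm^2 for a polycrystalline Pt monolayer."""
    return q_h_mC / (0.21 * pt_loading_mg)

# Illustrative numbers: 4.2 mC desorption charge on a 0.02 mg Pt electrode
area = ecsa_cm2_per_mg(4.2, 0.02)   # ~1000 cm^2/mg, i.e. ~100 m^2/g
```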
Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria
2017-01-01
Objective To explore healthcare staffs’ and managers’ perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Design Two focus group discussions were performed. Setting Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Participants Healthcare staff and managers (n=13) from the two settings. Interventions Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Results Categories from the content analysis are presented according to the research questions: how and when can simulation modelling assist healthcare improvement? Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Conclusions Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement renders a broader application and value of simulation modelling than previously reported. PMID:28588107
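For readers unfamiliar with the technique, a discrete event simulation can be sketched in a few lines: entities arrive over time, queue for a scarce resource, and waiting statistics are collected. The clinic parameters below are invented, not taken from the study's settings:

```python
import random

def simulate_clinic(n_patients=200, mean_interarrival=10.0,
                    mean_service=8.0, seed=42):
    """Minimal discrete event simulation of a single-server clinic
    queue with exponential interarrival and service times. Returns the
    mean patient waiting time in minutes. All parameters are
    illustrative; real models are fitted to department data."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    server_free_at = 0.0
    total_wait = 0.0
    for arr in arrivals:
        start = max(arr, server_free_at)   # queue if the clinician is busy
        total_wait += start - arr
        server_free_at = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_patients
```

Running the model with different service times is exactly the kind of "what-if" exploration the focus groups discussed: faster service (or an extra server) visibly shrinks the simulated waits before any real-world change is made.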
Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F
2014-06-01
To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in managing urban areas from a sound standpoint should be the evaluation of the soundscape in the area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, and that it provides a basis for designing or adapting the soundscape to match people's expectations as well. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to serve as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
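The max-margin idea behind both classifiers can be sketched compactly. The snippet below trains a linear SVM with a Pegasos-style stochastic subgradient optimiser (a stand-in for SMO, which solves the same objective via its dual) on two invented soundscape features; the feature names and data are illustrative only:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=300, seed=0):
    """Pegasos-style training of a linear SVM: minimise hinge loss plus
    an L2 regulariser by stochastic subgradient steps. X is a list of
    feature vectors, y the labels in {-1, +1}."""
    rng = random.Random(seed)
    n = len(X)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(n), n):          # one random-order pass
            t += 1
            eta = 1.0 / (lam * t)                  # decaying step size
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1.0 - eta * lam) * wj for wj in w]   # regulariser shrink
            if margin < 1.0:                       # hinge loss is active
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Toy "soundscapes": [loudness, natural-sound fraction] (hypothetical
# descriptors, not the paper's feature set)
X = [[0.9, 0.1], [0.8, 0.2], [0.7, 0.1],   # traffic-dominated: +1
     [0.1, 0.9], [0.2, 0.8], [0.1, 0.7]]   # nature-dominated: -1
y = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(X, y)
```

The real models additionally used kernels and many more acoustical/perceptual descriptors; the sketch only shows the optimisation principle the SVM/SMO comparison rests on.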
Nano-Launcher Technologies, Approaches, and Life Cycle Assessment. Phase II
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2014-01-01
Assist in understanding NASA technology and investment approaches, and other driving factors, necessary for enabling dedicated nano-launchers by industry at a cost and flight rate that (1) could support and be supported by an emerging nano-satellite market and (2) would benefit NASAs needs. Develop life-cycle cost, performance and other NASA analysis tools or models required to understand issues, drivers and challenges.
A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools
1991-04-01
designs in terms of their contributions toward forced mission termination and vehicle or function loss. Includes the ability to treat failure modes of... ABSTRACT: Inputs: MTBFs, MTTRs, support equipment costs, equipment weights and costs, available targets, military occupational specialty skill level and... US Army CECOM NAME: SPARECOST ABSTRACT: Calculates expected number of failures and performs spares holding optimization based on cost, weight, or
Supporting Scientific Analysis within Collaborative Problem Solving Environments
NASA Technical Reports Server (NTRS)
Watson, Velvin R.; Kwak, Dochan (Technical Monitor)
2000-01-01
Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions, "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high resolution graphics. To illustrate that the specified design criteria can provide a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that contains the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses.
The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to the responsiveness of the tool in a stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1K bit/second even though the scientists at all sites are viewing high resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.
Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana
2016-01-01
The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with respect to currently available results.
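A hierarchical qualitative model of this kind can be sketched as follows. The real DEX method aggregates child attributes through expert-defined utility tables at each node; this sketch substitutes a simple weighted-mean qualitative-to-quantitative mapping, and the criteria names are invented examples:

```python
SCALE = {"poor": 0, "fair": 1, "good": 2}          # ordinal value scale
LABEL = {v: k for k, v in SCALE.items()}

def aggregate(values, weights=None):
    """QQ-style aggregation sketch: map ordinal labels to numbers,
    take a weighted mean, and map the result back to a label.
    Returns (label, numeric score)."""
    if weights is None:
        weights = [1.0] * len(values)
    score = (sum(SCALE[v] * w for v, w in zip(values, weights))
             / sum(weights))
    return LABEL[round(score)], score

def rank_tools(tools):
    """tools: {name: {criterion: label}}. Returns tool names sorted
    best-first by their aggregated numeric score."""
    scored = {name: aggregate(list(crit.values()))[1]
              for name, crit in tools.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

A DEX node would instead look the child labels up in a rule table ("if visual is good and simulation is at least fair then ..."), which keeps the aggregation fully qualitative and auditable; the numeric shortcut above only illustrates the hierarchy-and-ranking idea.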
Instrumentation, performance visualization, and debugging tools for multiprocessors
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.
1991-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging and tuning parallel programs become intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.
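The automatic-instrumentation idea can be sketched with a decorator that records per-function call counts and cumulative wall time, a toy analogue of what a monitor like AIMS does at source level (the relaxation kernel below is a stand-in, not AIMS code):

```python
import functools
import time
from collections import defaultdict

TRACE = defaultdict(lambda: {"calls": 0, "total_s": 0.0})

def instrument(fn):
    """Record call counts and cumulative wall time per function name,
    a minimal analogue of automatic source-level instrumentation."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            rec = TRACE[fn.__name__]
            rec["calls"] += 1
            rec["total_s"] += time.perf_counter() - start
    return wrapper

@instrument
def relax(grid):
    """One Jacobi-style smoothing sweep, standing in for a CFD kernel."""
    return [(grid[i - 1] + grid[i + 1]) / 2.0 if 0 < i < len(grid) - 1
            else grid[i] for i in range(len(grid))]
```

A real tool additionally timestamps message sends/receives across processors so the collected events can be replayed graphically; the decorator only shows the data-gathering half of that pipeline.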
Short-term Forecasting Tools for Agricultural Nutrient Management.
Easton, Zachary M; Kleinman, Peter J A; Buda, Anthony R; Goering, Dustin; Emberston, Nichole; Reed, Seann; Drohan, Patrick J; Walter, M Todd; Guinan, Pat; Lory, John A; Sommerlot, Andrew R; Sharpley, Andrew
2017-11-01
The advent of real-time, short-term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high-performance computing and hydrologic or climate modeling have enabled rapid dissemination of real-time information that can assist landowners and conservation personnel with short-term management planning. This paper reviews short-term decision support tools for agriculture that are under various stages of development and implementation in the United States: (i) Wisconsin's Runoff Risk Advisory Forecast (RRAF) System, (ii) New York's Hydrologically Sensitive Area Prediction Tool, (iii) Virginia's Saturated Area Forecast Model, (iv) Pennsylvania's Fertilizer Forecaster, (v) Washington's Application Risk Management (ARM) System, and (vi) Missouri's Design Storm Notification System. Although these decision support tools differ in their underlying model structure, the resolution at which they are applied, and the hydroclimates to which they are relevant, all provide forecasts (range 24-120 h) of runoff risk or soil moisture saturation derived from National Weather Service Forecast models. Although this review highlights the need for further development of robust and well-supported short-term nutrient management tools, their potential for adoption and ultimate utility requires an understanding of the appropriate context of application, the strategic and operational needs of managers, access to weather forecasts, scales of application (e.g., regional vs. field level), data requirements, and outreach communication structure. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
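The decision logic of such forecast tools can be sketched as a simple rating over forecast rainfall and soil saturation. The thresholds below are invented for illustration and do not come from any of the reviewed systems:

```python
def runoff_risk(soil_saturation, forecast_rain_mm):
    """Toy short-term runoff-risk rating in the spirit of the reviewed
    tools. soil_saturation is a 0-1 fraction; forecast_rain_mm is the
    24-120 h forecast total. Thresholds are illustrative only."""
    if soil_saturation >= 0.9 or forecast_rain_mm >= 25:
        return "high"        # saturated soils or heavy rain: do not apply
    if soil_saturation >= 0.7 and forecast_rain_mm >= 10:
        return "moderate"    # marginal conditions: delay if possible
    return "low"             # application window likely acceptable
```

Operational systems derive the saturation and rainfall inputs from hydrologic models driven by National Weather Service forecasts; the rating step itself is deliberately simple so that landowners can act on it.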
NASA Technical Reports Server (NTRS)
Wilson, D. A.
1976-01-01
Specific requirements for a wash/rinse capability to support Spacelab biological experimentation and to identify various concepts for achieving this capability were determined. This included the examination of current state-of-the-art and emerging technology designs that would meet the wash/rinse requirements. Once several concepts were identified, including the disposable utensils, tools and gloves or other possible alternatives, a tradeoff analysis involving system cost, weight, volume utilization, functional performance, maintainability, reliability, power utilization, safety, complexity, etc., was performed so as to determine an optimum approach for achieving a wash/rinse capability to support future space flights. Missions of varying crew size and durations were considered.
APMS: An Integrated Set of Tools for Measuring Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Reynard, William D. (Technical Monitor)
1996-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. 
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
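The special-event detection that existing flight-data software performs can be sketched as exceedance counting over a recorded parameter; the statistical fleet-level analyses APMS targets then aggregate such counts over many flights. The parameter, limit, and persistence threshold below are invented:

```python
def count_exceedances(series, limit, min_run=3):
    """Count special events in a sampled flight parameter: runs where
    the value stays above `limit` for at least `min_run` consecutive
    samples (a persistence filter against sensor blips)."""
    events = 0
    run = 0
    for v in series:
        if v > limit:
            run += 1
            if run == min_run:   # count each qualifying run exactly once
                events += 1
        else:
            run = 0
    return events
```

Statistical evaluation across a fleet would then compare event rates per flight phase or per carrier, rather than inspecting single flights one at a time.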
Hough transform for clustered microcalcifications detection in full-field digital mammograms
NASA Astrophysics Data System (ADS)
Fanizzi, A.; Basile, T. M. A.; Losurdo, L.; Amoroso, N.; Bellotti, R.; Bottigli, U.; Dentamaro, R.; Didonna, V.; Fausto, A.; Massafra, R.; Moschetta, M.; Tamborra, P.; Tangaro, S.; La Forgia, D.
2017-09-01
Many screening programs use mammography as the principal diagnostic tool for detecting breast cancer at a very early stage. Despite the efficacy of mammograms in highlighting breast diseases, the detection of some lesions remains difficult for radiologists. In particular, the extremely minute and elongated salt-like particles of microcalcifications are sometimes no larger than 0.1 mm, yet represent approximately half of all cancers detected by means of mammograms. Hence the need for automatic tools able to support radiologists in their work. Here, we propose a computer-assisted diagnostic tool to support radiologists in identifying microcalcifications in full (native) digital mammographic images. The proposed CAD system consists of a pre-processing step, which improves contrast and reduces noise by applying the Sobel edge detection algorithm and a Gaussian filter, followed by a microcalcification detection step performed by exploiting the circular Hough transform. The procedure's performance was tested on 200 images from the Breast Cancer Digital Repository (BCDR), a publicly available database. The automatically detected clusters of microcalcifications were evaluated by skilled radiologists, who assessed the validity of the correctly identified regions of interest as well as the system error in cases of missed clustered microcalcifications. The system performance was evaluated in terms of sensitivity and false positives per image (FPi) rate. The proposed model was able to accurately predict the microcalcification clusters, obtaining performance (sensitivity = 91.78% and FPi rate = 3.99) that compares favorably with other state-of-the-art approaches.
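The circular Hough transform at the core of the detection step can be sketched in a few lines: every edge pixel votes for all candidate centres lying at a fixed radius from it, and genuine circle centres accumulate votes from the whole circle. The example below uses a synthetic edge map, not mammographic data:

```python
import math

def hough_circle_centers(edges, radius, top=1):
    """Circular Hough transform for a fixed radius. edges is a set of
    (x, y) edge-pixel coordinates; returns the `top` accumulator cells
    with the most votes, i.e. the most likely circle centres."""
    acc = {}
    for (x, y) in edges:
        for deg in range(0, 360, 5):   # candidate centres around the pixel
            a = round(x - radius * math.cos(math.radians(deg)))
            b = round(y - radius * math.sin(math.radians(deg)))
            acc[(a, b)] = acc.get((a, b), 0) + 1
    return sorted(acc, key=acc.get, reverse=True)[:top]

# Synthetic edge map: pixels on a circle of radius 5 centred at (20, 20)
edges = {(round(20 + 5 * math.cos(math.radians(t))),
          round(20 + 5 * math.sin(math.radians(t))))
         for t in range(0, 360, 10)}
cx, cy = hough_circle_centers(edges, 5)[0]
```

A CAD pipeline would run this after edge detection and noise filtering, sweep over a range of radii matched to microcalcification sizes, and threshold the accumulator rather than take a single peak.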
Electronic Health Record Tools to Care for At-Risk Older Drivers: A Quality Improvement Project.
Casey, Colleen M; Salinas, Katherine; Eckstrom, Elizabeth
2015-06-01
Evaluating driving safety of older adults is an important health topic, but primary care providers (PCP) face multiple barriers in addressing this issue. The study's objectives were to develop an electronic health record (EHR)-based Driving Clinical Support Tool, train PCPs to perform driving assessments utilizing the tool, and systematize documentation of assessment and management of driving safety issues via the tool. The intervention included development of an evidence-based Driving Clinical Support Tool within the EHR, followed by training of internal medicine providers in the tool's content and use. Pre- and postintervention provider surveys and chart review of driving-related patient visits were conducted. Surveys included self-report of preparedness and knowledge to evaluate at-risk older drivers and were analyzed using paired t-test. A chart review of driving-related office visits compared documentation pre- and postintervention including: completeness of appropriate focused history and exam, identification of deficits, patient education, and reporting to appropriate authorities when indicated. Data from 86 providers were analyzed. Pre- and postintervention surveys showed significantly increased self-assessed preparedness (p < .001) and increased driving-related knowledge (p < .001). Postintervention charts showed improved documentation of correct cognitive testing, more referrals/consults, increased patient education about community resources, and appropriate regulatory reporting when deficits were identified. Focused training and an EHR-based clinical support tool improved provider self-reported preparedness and knowledge of how to evaluate at-risk older drivers. The tool improved documentation of driving-related issues and led to improved access to interdisciplinary care coordination. Published by Oxford University Press on behalf of the Gerontological Society of America 2015.
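The pre/post survey comparison reported above rests on a paired t-test, which can be sketched directly from matched scores. The Likert-style data below are invented, not the study's:

```python
import math

def paired_t(pre, post):
    """Paired t-statistic for matched pre/post scores. Returns (t, df);
    a t well above ~2 with df around 10+ corresponds to p < .05."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# Hypothetical 1-5 self-assessed preparedness scores for ten providers
pre  = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3]
post = [4, 4, 3, 5, 4, 4, 3, 5, 4, 4]
t_stat, df = paired_t(pre, post)
```

The pairing matters: each provider serves as their own control, so the test is on the per-provider change rather than on two independent group means.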
NASA Astrophysics Data System (ADS)
Nomaguchi, Yutaka; Fujita, Kikuo
This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording design snapshots across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations across design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of future issues.
Heumann, F.K.; Wilkinson, J.C.; Wooding, D.R.
1997-12-16
A remote appliance for supporting a tool for performing work at a work site on a substantially circular bore of a work piece and for providing video signals of the work site to a remote monitor comprises: a base plate having an inner face and an outer face; a plurality of rollers, wherein each roller is rotatably and adjustably attached to the inner face of the base plate and positioned to roll against the bore of the work piece when the base plate is positioned against the mouth of the bore such that the appliance may be rotated about the bore in a plane substantially parallel to the base plate; a tool holding means for supporting the tool, the tool holding means being adjustably attached to the outer face of the base plate such that the working end of the tool is positioned on the inner face side of the base plate; a camera for providing video signals of the work site to the remote monitor; and a camera holding means for supporting the camera on the inner face side of the base plate, the camera holding means being adjustably attached to the outer face of the base plate. In a preferred embodiment, roller guards are provided to protect the rollers from debris and a bore guard is provided to protect the bore from wear by the rollers and damage from debris. 5 figs.
DOT National Transportation Integrated Search
2007-01-01
The focus of the surface transportation community has been steadily shifting over the past decade, from one of capital construction and maintenance toward system operations. To support this new focus, new monitoring tools are necessary. The Virginia ...
Information Technologies and Workplace Learning.
ERIC Educational Resources Information Center
Roth, Gene L.
1995-01-01
Information technologies are important tools for individual, team, and organizational learning. Developments in virtual reality and the Internet, performance support systems that increase the efficiency of individuals and groups, and other innovations have the potential to enhance the relationship between work and learning. (SK)
Integrative Genomics Viewer (IGV) | Informatics Technology for Cancer Research (ITCR)
The Integrative Genomics Viewer (IGV) is a high-performance visualization tool for interactive exploration of large, integrated genomic datasets. It supports a wide variety of data types, including array-based and next-generation sequence data, and genomic annotations.
GIS tools for strategic SB375 planning and program participation
DOT National Transportation Integrated Search
2010-12-02
The just-completed (2009-2010) phase of this project corresponds to the second year of an envisioned three-year initiative on integrated transportation and land use planning supported by the Leonard Transportation Center (LTC) and USDOT, and performe...
41 CFR 102-192.100 - How do we submit our annual mail management report to GSA?
Code of Federal Regulations, 2012 CFR
2012-01-01
... ADMINISTRATIVE PROGRAMS 192-MAIL MANAGEMENT Reporting Requirements § 102-192.100 How do we submit our annual mail... annual reports using the GSA web-based Electronic Performance Support Tool (EPST). Agency mail managers...
41 CFR 102-192.100 - How do we submit our annual mail management report to GSA?
Code of Federal Regulations, 2013 CFR
2013-07-01
... ADMINISTRATIVE PROGRAMS 192-MAIL MANAGEMENT Reporting Requirements § 102-192.100 How do we submit our annual mail... annual reports using the GSA web-based Electronic Performance Support Tool (EPST). Agency mail managers...
41 CFR 102-192.100 - How do we submit our annual mail management report to GSA?
Code of Federal Regulations, 2011 CFR
2011-01-01
... ADMINISTRATIVE PROGRAMS 192-MAIL MANAGEMENT Reporting Requirements § 102-192.100 How do we submit our annual mail... annual reports using the GSA web-based Electronic Performance Support Tool (EPST). Agency mail managers...
41 CFR 102-192.100 - How do we submit our annual mail management report to GSA?
Code of Federal Regulations, 2014 CFR
2014-01-01
... ADMINISTRATIVE PROGRAMS 192-MAIL MANAGEMENT Reporting Requirements § 102-192.100 How do we submit our annual mail... annual reports using the GSA web-based Electronic Performance Support Tool (EPST). Agency mail managers...
DOT National Transportation Integrated Search
2016-09-02
Public transportation agencies can obtain large amounts of information regarding timeliness, efficiency, cleanliness, ridership, and other : performance measures. However, these metrics are based on the interests of these agencies and do not necessar...
NASA Technical Reports Server (NTRS)
2002-01-01
MarketMiner(R) Products, a line of automated marketing analysis tools manufactured by MarketMiner, Inc., can benefit organizations that perform significant amounts of direct marketing. MarketMiner received a Small Business Innovation Research (SBIR) contract from NASA's Johnson Space Center to develop the software as a data modeling tool for space mission applications. The technology was then built into the company's current products to provide decision support for business and marketing applications. With the tool, users gain valuable information about customers and prospects from existing data in order to increase sales and profitability. MarketMiner(R) is a registered trademark of MarketMiner, Inc.
Spectacle and SpecViz: New Spectral Analysis and Visualization Tools
NASA Astrophysics Data System (ADS)
Earl, Nicholas; Peeples, Molly; JDADF Developers
2018-01-01
A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data, with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line-generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full width at half maximum, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines.
It also features robust user extensions such as custom data loaders and support for user-created plugins that add new functionality.This work was supported in part by HST AR #13919, HST GO #14268, and HST AR #14560.
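One of the non-parametric measurements named above, equivalent width, is straightforward to illustrate. The sketch below is not Spectacle's actual API; it measures the equivalent width of a synthetic Gaussian absorption line on a unit continuum using only the standard library:

```python
import math

# Illustrative line parameters (not Spectacle's API): a Gaussian absorption
# line of fractional depth 0.6 on a normalized continuum of 1.
depth, center, sigma = 0.6, 5000.0, 0.5   # wavelength units: Angstroms

def flux(lam):
    return 1.0 - depth * math.exp(-0.5 * ((lam - center) / sigma) ** 2)

# Equivalent width: EW = integral of (1 - F/F_c) dlambda,
# approximated here by a fine Riemann sum over +/- 10 sigma.
step = 0.01
lams = [center - 5 + i * step for i in range(1001)]
ew = sum((1 - flux(l)) * step for l in lams)
print(round(ew, 3))
```

For a Gaussian line the analytic answer is depth * sigma * sqrt(2*pi) ≈ 0.752, so the numerical estimate can be checked directly.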
Marshall Space Flight Center's Virtual Reality Applications Program 1993
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1993-01-01
A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies.
These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.
ERIC Educational Resources Information Center
Mintrop, Rick; Ordenes, Miguel; Coghlan, Erin; Pryor, Laura; Madero, Cristobal
2018-01-01
Purpose: The study examines why the logic of a performance management system, supported by the federal Teacher Incentive Fund, might be faulty. It does this by exploring the nuances of the interplay between teaching evaluations as formative and summative, the use of procedures, tools, and artifacts obligated by the local Teacher Incentive Fund…
NASA Astrophysics Data System (ADS)
Donà, G.; Faletra, M.
2015-09-01
This paper presents the TT&C performance simulator toolkit developed internally at Thales Alenia Space Italia (TAS-I) to support the design of TT&C subsystems for space exploration and scientific satellites. The simulator has a modular architecture and has been designed with a model-based approach using standard engineering tools such as MATLAB/SIMULINK and mission analysis tools (e.g. STK). The simulator is easily reconfigurable to fit different types of satellites, different mission requirements and different scenario parameters. This paper provides a brief description of the simulator architecture together with two examples of applications used to demonstrate some of the simulator's capabilities.
HPC in a HEP lab: lessons learned from setting up cost-effective HPC clusters
NASA Astrophysics Data System (ADS)
Husejko, Michal; Agtzidis, Ioannis; Baehler, Pierre; Dul, Tadeusz; Evans, John; Himyr, Nils; Meinhard, Helge
2015-12-01
In this paper we present our findings gathered during the evaluation and testing of Windows Server High-Performance Computing (Windows HPC) in view of potentially using it as a production HPC system for engineering applications. The Windows HPC package, an extension of Microsoft's Windows Server product, provides all essential interfaces, utilities and management functionality for creating, operating and monitoring a Windows-based HPC cluster infrastructure. The evaluation and test phase was focused on verifying the functionalities of Windows HPC, its performance, support of commercial tools and the integration with the users' work environment. We describe constraints imposed by the way the CERN Data Centre is operated, licensing for engineering tools, and the scalability and behaviour of the HPC engineering applications used at CERN. We will present an initial set of requirements, which were created based on the above constraints and requests from the CERN engineering user community. We will explain how we have configured Windows HPC clusters to provide job scheduling functionalities required to support the CERN engineering user community, quality of service, user- and project-based priorities, and fair access to limited resources. Finally, we will present several performance tests we carried out to verify Windows HPC performance and scalability.
FY16 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2016-09-30
The goal of the NEAMS neutronics effort is to develop a neutronics toolkit for use on sodium-cooled fast reactors (SFRs) which can be extended to other reactor types. The neutronics toolkit includes the high-fidelity deterministic neutron transport code PROTEUS and many supporting tools such as the cross section generation code MC2-3, a cross section library generation code, alternative cross section generation tools, mesh generation and conversion utilities, and an automated regression test tool. The FY16 effort for NEAMS neutronics focused on supporting the release of the SHARP toolkit and existing and new users, continuing to develop PROTEUS functions necessary for performance improvement as well as the SHARP release, verifying PROTEUS against available existing benchmark problems, and developing new benchmark problems as needed. The FY16 research effort was focused on further updates of PROTEUS-SN and PROTEUS-MOCEX and cross section generation capabilities as needed.
Automation bias: decision making and performance in high-tech cockpits.
Mosier, K L; Skitka, L J; Heers, S; Burdick, M
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
Science Opportunity Analyzer (SOA): Not Just Another Pretty Face
NASA Technical Reports Server (NTRS)
Polanskey, Carol A.; Streiffert, Barbara; O'Reilly, Taifun
2004-01-01
This viewgraph presentation reviews the Science Opportunity Analyzer (SOA). For the first time at JPL, the Cassini mission to Saturn is using distributed science operations for sequence generation. This means that scientists at other institutions have more responsibility for building the spacecraft sequence. Tools are required to support the sequence development. JPL tools required a complete configuration behind a firewall, and the tools that the user community had developed did not interface with the JPL tools. Therefore, the SOA was created to bridge the gap between the remote scientists and the JPL operations teams. The presentation reviews the development of the SOA and what was required of the system. The presentation reviews the functions that the SOA performed.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands require redesigning VA tools for performance and reliability in the analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
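The two atomic operators the abstract names, selection and aggregation, can be sketched on a toy attribute-labeled graph. The encoding below is illustrative, not the paper's formalism:

```python
# Toy graph: node -> group attribute, plus an undirected edge set.
nodes = {1: "A", 2: "A", 3: "B", 4: "B"}
edges = {(1, 3), (2, 3), (3, 4), (1, 2)}

def select(nodes, edges, pred):
    """Selection: keep only nodes satisfying pred, and edges among them."""
    keep = {n for n in nodes if pred(n)}
    return ({n: nodes[n] for n in keep},
            {(u, v) for (u, v) in edges if u in keep and v in keep})

def aggregate(nodes, edges):
    """Aggregation: collapse nodes sharing an attribute into super-nodes,
    keeping one edge per connected pair of distinct groups."""
    super_edges = {(nodes[u], nodes[v]) for (u, v) in edges
                   if nodes[u] != nodes[v]}
    return set(nodes.values()), super_edges

print(aggregate(nodes, edges))
```

Composing such operators (e.g. select a subgraph, then aggregate it) is what gives an algebra of this kind its expressive power for scalable exploration.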
Leeman, Jennifer; Myers, Allison; Grant, Jennifer C; Wangen, Mary; Queen, Tara L
2017-09-01
The US tobacco industry spends $8.2 billion annually on marketing at the point of sale (POS), a practice known to increase tobacco use. Evidence-based policy interventions (EBPIs) are available to reduce exposure to POS marketing, and nationwide, states are funding community-based tobacco control partnerships to promote local enactment of these EBPIs. Little is known, however, about what implementation strategies best support community partnerships' success enacting EBPI. Guided by Kingdon's theory of policy change, Counter Tools provides tools, training, and other implementation strategies to support community partnerships' performance of five core policy change processes: document local problem, formulate policy solutions, engage partners, raise awareness of problems and solutions, and persuade decision makers to enact new policy. We assessed Counter Tools' impact at 1 year on (1) partnership coordinators' self-efficacy, (2) partnerships' performance of core policy change processes, (3) community progress toward EBPI enactment, and (4) salient contextual factors. Counter Tools provided implementation strategies to 30 partnerships. Data on self-efficacy were collected using a pre-post survey. Structured interviews assessed performance of core policy change processes. Data also were collected on progress toward EBPI enactment and contextual factors. Analysis included descriptive and bivariate statistics and content analysis. Following 1-year exposure to implementation strategies, coordinators' self-efficacy increased significantly. Partnerships completed the greatest proportion of activities within the "engage partners" and "document local problem" core processes. Communities made only limited progress toward policy enactment. Findings can inform delivery of implementation strategies and tests of their effects on community-level efforts to enact EBPIs.
Jimeno Yepes, Antonio; Verspoor, Karin
2014-01-01
As the cost of genomic sequencing continues to fall, the amount of data being collected and studied for the purpose of understanding the genetic basis of disease is increasing dramatically. Much of the source information relevant to such efforts is available only from unstructured sources such as the scientific literature, and significant resources are expended in manually curating and structuring the information in the literature. As such, there have been a number of systems developed to target automatic extraction of mutations and other genetic variation from the literature using text mining tools. We have performed a broad survey of the existing publicly available tools for extraction of genetic variants from the scientific literature. We consider not just one tool but a number of different tools, individually and in combination, and apply the tools in two scenarios. First, they are compared in an intrinsic evaluation context, where the tools are tested for their ability to identify specific mentions of genetic variants in a corpus of manually annotated papers, the Variome corpus. Second, they are compared in an extrinsic evaluation context based on our previous study of text mining support for curation of the COSMIC and InSiGHT databases. Our results demonstrate that no single tool covers the full range of genetic variants mentioned in the literature. Rather, several tools have complementary coverage and can be used together effectively. In the intrinsic evaluation on the Variome corpus, the combined performance is above 0.95 in F-measure, while in the extrinsic evaluation the combined recall performance is above 0.71 for COSMIC and above 0.62 for InSiGHT, a substantial improvement over the performance of any individual tool. Based on the analysis of these results, we suggest several directions for the improvement of text mining tools for genetic variant extraction from the literature. PMID:25285203
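The complementary-coverage finding above amounts to combining each tool's predicted variant set before scoring. The sketch below uses invented toy variant mentions (not the Variome, COSMIC, or InSiGHT data) to show how a union raises recall at some cost to precision:

```python
# Toy gold-standard variant mentions and two hypothetical tools' outputs.
gold   = {"V600E", "G12D", "R175H", "Q61K"}
tool_a = {"V600E", "G12D"}
tool_b = {"G12D", "R175H", "E7fs"}   # E7fs is a false positive

def prf(pred, gold):
    """Precision, recall, and F-measure of a predicted mention set."""
    tp = len(pred & gold)
    p = tp / len(pred) if pred else 0.0
    r = tp / len(gold)
    f = 2 * p * r / (p + r) if p + r else 0.0
    return round(p, 2), round(r, 2), round(f, 2)

print("A alone:", prf(tool_a, gold))          # high precision, low recall
print("union  :", prf(tool_a | tool_b, gold)) # recall improves
```

Intersecting the sets instead (`tool_a & tool_b`) would move the trade-off the other way, favoring precision.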
Weingart, Saul N; Yaghi, Omar; Wetherell, Matthew; Sweeney, Megan
2018-04-10
To examine the composition and concordance of existing instruments used to assess medical teams' performance. A trained observer joined 20 internal medicine housestaff teams for morning work rounds at Tufts Medical Center, a 415-bed Boston teaching hospital, from October through December 2015. The observer rated each team's performance using 9 teamwork observation instruments that examined domains including team structure, leadership, situation monitoring, mutual support, and communication. Observations recorded on paper forms were stored electronically. Scores were normalized from 1 (low) to 5 (high) to account for different rating scales. Overall mean scores were calculated and graphed; weighted scores adjusted for the number of items in each teamwork domain. Teamwork scores were analyzed using t-tests, pair-wise correlations, and the Kruskal-Wallis statistic, and team performance was compared across instruments by domain. The 9 tools incorporated 5 major domains, with 5-35 items per instrument for a total of 161 items per observation session. In weighted and unweighted analyses, the overall teamwork performance score for a given team on a given day varied by instrument. While all of the tools identified the same low outlier, high performers on some instruments were low performers on others. Inconsistent scores for a given team across instruments persisted in domain-level analyses. There was substantial variation in the rating of individual teams assessed concurrently by a single observer using multiple instruments. Since existing teamwork observation tools do not yield concordant assessments, researchers should create better tools for measuring teamwork performance.
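The normalization step described above, mapping instruments with different rating ranges onto a common 1 (low) to 5 (high) scale, can be sketched with a simple linear rescaling (the instrument ranges below are illustrative):

```python
def normalize(score, lo, hi):
    """Linearly map a raw score from the instrument range [lo, hi]
    onto the common comparison scale [1, 5]."""
    return 1 + 4 * (score - lo) / (hi - lo)

print(normalize(7, 1, 10))   # a hypothetical 1-10 instrument
print(normalize(3, 0, 4))    # a hypothetical 0-4 instrument
```

A weighted overall score would then average these normalized values, weighting each teamwork domain by (or adjusting for) its number of items.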
Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations
NASA Technical Reports Server (NTRS)
Brasil, Connie; Lee, Paul; Mainini, Matthew; Homola, Jeffrey; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy
2011-01-01
This paper reviews three Next Generation Air Transportation System (NextGen) based high fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high altitude environment utilizing high altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.
Views of general practitioners on the use of STOPP&START in primary care: a qualitative study.
Dalleur, O; Feron, J-M; Spinewine, A
2014-08-01
STOPP (Screening Tool of Older Person's Prescriptions) and START (Screening Tool to Alert Doctors to Right Treatment) criteria aim at detecting potentially inappropriate prescribing in older people. The objective was to explore general practitioners' (GPs) perceptions regarding the use of the STOPP&START tool in their practice. We conducted three convenience-sampled focus groups. Vignettes with clinical cases were provided for discussion as well as a full version of the STOPP&START tool. Knowledge, strengths and weaknesses of the tool and its implementation were discussed. Two researchers independently performed content analysis, classifying quotes and creating new categories for emerging themes. Discussions highlighted incentives (e.g. systematic procedure for medication review) and barriers (e.g. time-consuming application) influencing the use of STOPP&START in primary care. Usefulness, comprehensiveness, and relevance of the tool were also questioned. Another important category emerging from the content analysis was the projected use of the tool. The GPs imagined key elements for the implementation in daily practice: computerized clinical decision support system, education, and multidisciplinary collaborations, especially at care transitions and in nursing homes. Despite variable views on the usefulness, comprehensiveness, and relevance of STOPP&START, GPs suggest the implementation of this tool in primary care within computerized clinical decision support systems, through education, and as part of multidisciplinary collaborations.
A software communication tool for the tele-ICU.
Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto
2013-01-01
The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls, especially during the evening and night hours, through the tele-ICU hubs. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.
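Routing messages "for intervention by priority," as the abstract describes, is commonly built on a priority queue. The sketch below is a generic illustration (the class, priority levels, and message texts are invented, not the tool's actual design):

```python
import heapq

class MessageRouter:
    """Minimal priority-ordered message queue; lower number = more urgent."""
    def __init__(self):
        self._heap = []
        self._seq = 0   # tie-breaker preserves FIFO order within a priority

    def submit(self, priority, text):
        heapq.heappush(self._heap, (priority, self._seq, text))
        self._seq += 1

    def next_message(self):
        return heapq.heappop(self._heap)[2]

router = MessageRouter()
router.submit(3, "routine lab result")
router.submit(1, "ventilator alarm")       # most urgent
router.submit(2, "sedation order query")
print(router.next_message())               # highest-priority message first
```

A production system would of course add the provider field, persistence to the database, and escalation timers on top of this ordering.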
2017-06-01
The Naval Postgraduate School has developed a competency model for the systems engineering profession and is implementing a tool to support high-stakes human resource functions for the U.S. Army. A systems engineering career competency model (SECCM), recently developed by the Navy and verified by the Office of Personnel Management (OPM), defines the critical competencies for successful performance as a systems engineer at each general schedule
Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning
NASA Astrophysics Data System (ADS)
Peters, Vanessa L.; Songer, Nancy Butler
2013-10-01
This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school students. To evaluate usability, we analyzed students' task performance and task completion time as they worked on an activity with the repurposed modeling technology. In stage 1, we conducted remote testing of an early modeling prototype with urban middle school students (n = 84). In stages 2 and 3, we used screencasting software to record students' mouse and keyboard movements during collaborative think-alouds (n = 22) and conducted a qualitative analysis of their peer discussions. Taken together, the study findings revealed two kinds of usability issues that interfered with students' productive use of the tool: issues related to the use of data and information, and issues related to the use of the modeling technology. The study findings resulted in design improvements that led to stronger usability outcomes and higher task performance among students. In this paper, we describe our methods for usability testing, our research findings, and our design solutions for supporting students' use of the modeling technology and use of data. The paper concludes with implications for the design and study of modeling technologies for science learning.
System Specification for ADA Integrated Environment Type A AIE(1).
1982-11-12
includes program library support tools and the linker. The program library is the means by which the AIE supports independent, modular program development...KAPSE TOOL COMMUNICATION Package, KAPSE-KAPSE COMMUNICATION (KAPSE.RTS) (most of it, except the "Language-Defined Packages" CPC). The overall...including classification of errors by severity; 5. perform optimizations for timing and/or space, without changing the functional meaning of a program
NASA Technical Reports Server (NTRS)
Searcy, Brittani
2017-01-01
Using virtual environments to assess complex large scale human tasks provides timely and cost effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance to human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine if a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis will help identify any tools or other accommodations that may help complete the task. These assessments are critical for the safety of ground support engineers and keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame by frame basis, while virtual reality gives the actor (person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need of human factors for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.
New technologies for supporting real-time on-board software development
NASA Astrophysics Data System (ADS)
Kerridge, D.
1995-03-01
The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation-hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly support the process of hard real-time analysis for software targeting the MA31750 processor. With further development, this HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.
DECISION-SUPPORT TOOLS FOR MANAGING WASTEWATER PIPELINE PERFORMANCE IMPROVEMENTS
Wastewater collection systems are an extensive part of the nation's infrastructure. In the US approximately 150 million people are served by about 19,000 municipal wastewater collection systems representing about 500,000 miles of sewer pipe (not including privately owned service ...
SMARTE 2007 TUTORIAL - JANUARY 2007 REVISION
SMARTe 2007 is a web-based decision support tool intended to help revitalization practitioners find information, perform data analysis, communicate, and evaluate future reuse options for a site or area. This tutorial CD was developed to help users navigate SMARTe 2007. It is appr...
Content Analysis in Systems Engineering Acquisition Activities
2016-04-30
Acquisition Activities Karen Holness, Assistant Professor, NPS Update on the Department of the Navy Systems Engineering Career Competency Model Clifford...systems engineering toolkit. Having a common analysis tool that is easy to use would support the feedback of observed system performance trends from the
Shahjehan, Khurram; Li, Guangxi; Dhokarh, Rajanigandha; Kashyap, Rahul; Janish, Christopher; Alsara, Anas; Jaffe, Allan S.; Hubmayr, Rolf D.; Gajic, Ognjen
2012-01-01
Background: At the onset of acute hypoxic respiratory failure, critically ill patients with acute lung injury (ALI) may be difficult to distinguish from those with cardiogenic pulmonary edema (CPE). No single clinical parameter provides satisfying prediction. We hypothesized that a combination of these parameters would facilitate early differential diagnosis. Methods: In a population-based retrospective development cohort, validated electronic surveillance identified critically ill adult patients with acute pulmonary edema. Recursive partitioning and logistic regression were used to develop a decision support tool based on routine clinical information to differentiate ALI from CPE. Performance of the score was validated in an independent cohort of referral patients. Blinded post hoc expert review served as the gold standard. Results: Of 332 patients in the development cohort, expert reviewers (κ, 0.86) classified 156 as having ALI and 176 as having CPE. The validation cohort had 161 patients (ALI = 113, CPE = 48). The score was based on risk factors for ALI and CPE, age, alcohol abuse, chemotherapy, and peripheral oxygen saturation/Fio2 ratio. It demonstrated good discrimination (area under curve [AUC] = 0.81; 95% CI, 0.77-0.86) and calibration (Hosmer-Lemeshow [HL] P = .16). Similar performance was obtained in the validation cohort (AUC = 0.80; 95% CI, 0.72-0.88; HL P = .13). Conclusions: A simple decision support tool accurately classifies acute pulmonary edema, reserving advanced testing for a subset of patients in whom satisfying prediction cannot be made. This novel tool may facilitate early inclusion of patients with ALI and CPE into research studies as well as improve and rationalize clinical management and resource use. PMID:22030803
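The abstract above describes a decision score built with logistic regression over routine clinical variables. A minimal sketch of how such a score yields an ALI-vs-CPE probability is shown below; the coefficients and variable scaling are hypothetical placeholders for illustration, not the published model's weights.

```python
import math

# Hypothetical coefficients -- the actual weights of the published score
# are not given in the abstract.
COEFFS = {
    "intercept": -1.0,
    "age_per_decade": -0.15,    # older age shifts toward CPE in this sketch
    "alcohol_abuse": 1.2,       # ALI risk factor
    "chemotherapy": 1.0,        # ALI risk factor
    "spo2_fio2_per_100": -0.4,  # higher SpO2/FiO2 lowers ALI odds
}

def ali_probability(age, alcohol_abuse, chemotherapy, spo2_fio2):
    """Logistic decision score: probability that acute pulmonary edema
    reflects ALI rather than CPE, given routine clinical inputs."""
    z = (COEFFS["intercept"]
         + COEFFS["age_per_decade"] * (age / 10)
         + COEFFS["alcohol_abuse"] * alcohol_abuse
         + COEFFS["chemotherapy"] * chemotherapy
         + COEFFS["spo2_fio2_per_100"] * (spo2_fio2 / 100))
    return 1 / (1 + math.exp(-z))
```

With placeholder weights like these, adding an ALI risk factor (e.g., alcohol abuse) raises the predicted ALI probability, mirroring the structure of the published score.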
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
Huang, Wen-Yen; Hung, Weiteng; Vu, Chi Thanh; Chen, Wei-Ting; Lai, Jhih-Wei; Lin, Chitsan
2016-11-01
Taiwan has a large number of poorly managed contaminated sites in need of remediation. This study proposes a framework, a set of standards, and a spreadsheet-based evaluation tool for implementing green and sustainable principles into remediation projects and evaluating the projects from this perspective. We performed a case study to understand how the framework would be applied. For the case study, we used a spreadsheet-based evaluation tool (SEFA) and performed field scale cultivation tests on a site contaminated with total petroleum hydrocarbons (TPHs). The site was divided into two lots: one treated by chemical oxidation and the other by bioremediation. We evaluated five core elements of green and sustainable remediation (GSR): energy, air, water resources, materials and wastes, and land and ecosystem. The proposed evaluation tool and field scale cultivation test were found to efficiently assess the effectiveness of the two remediation alternatives. The framework and related tools proposed herein can potentially be used to support decisions about the remediation of contaminated sites taking into account engineering management, cost effectiveness, and social reconciliation.
A decision support tool for landfill methane generation and gas collection.
Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart
2015-09-01
This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date there is no such tool available to provide landfill decision makers with clear and simplified information to evaluate biochemical processes within a landfill site, to assess performance of gas production and to identify potential remedies to any issues. The current lack in understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytical hierarchical process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one-year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site. Copyright © 2015 Elsevier Ltd. All rights reserved.
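The indicator scoring described above combines per-indicator deviations from ideal ranges into a weighted total. The sketch below illustrates that idea; the ideal ranges, weights, and linear penalty are assumed placeholders (the paper derives its weights via an analytical hierarchy process, whose details are not in the abstract).

```python
# Illustrative ideal ranges and weights only -- not the paper's values.
IDEAL_RANGES = {
    "pH": (6.8, 7.4),
    "temperature_C": (35.0, 45.0),
    "moisture_pct": (40.0, 65.0),
}
WEIGHTS = {"pH": 0.5, "temperature_C": 0.3, "moisture_pct": 0.2}

def indicator_score(name, value):
    """Score 1.0 inside the ideal range, decaying linearly with the
    relative deviation outside it (floored at 0)."""
    lo, hi = IDEAL_RANGES[name]
    if lo <= value <= hi:
        return 1.0
    deviation = (lo - value) / lo if value < lo else (value - hi) / hi
    return max(0.0, 1.0 - deviation)

def total_gas_indicator_score(readings):
    """Multi-criteria total: weighted sum of per-indicator scores
    (weights assumed to sum to 1)."""
    return sum(WEIGHTS[k] * indicator_score(k, v) for k, v in readings.items())
```

A site whose readings all sit inside the assumed ideal ranges scores 1.0; deviations in any indicator pull the total down in proportion to its weight.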
Composing Data Parallel Code for a SPARQL Graph Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste
Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility to perform in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
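The basic graph pattern matching that the tool above compiles into parallel operations can be illustrated sequentially in a few lines. This is a toy single-threaded sketch of conjunctive triple-pattern matching over an in-memory store, not the paper's generated OpenMP code; the store contents and `?var` convention are illustrative.

```python
# Toy RDF store: a set of (subject, predicate, object) triples.
TRIPLES = {
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "age", "30"),
}

def match_pattern(pattern, binding):
    """Match one triple pattern (terms starting with '?' are variables)
    against the store, extending the given variable binding."""
    for triple in TRIPLES:
        new = dict(binding)
        ok = True
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if new.get(term, value) != value:
                    ok = False
                    break
                new[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            yield new

def basic_graph_match(patterns):
    """Conjunctive query: crawl patterns left to right, joining the
    candidate bindings produced so far with each new pattern."""
    bindings = [{}]
    for p in patterns:
        bindings = [b2 for b in bindings for b2 in match_pattern(p, b)]
    return bindings
```

For example, the two-hop query `?x knows ?y . ?y knows ?z` binds `?x` to alice, `?y` to bob, and `?z` to carol against the toy store. The inner loop over candidate bindings is the join the paper parallelizes.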
SSSFD manipulator engineering using statistical experiment design techniques
NASA Technical Reports Server (NTRS)
Barnes, John
1991-01-01
The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6 degree of freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost efficient tools were combined to perform a study of robot manipulator design parameters. These tools are graphical computer simulations and Taguchi Design of Experiment methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.
Flinn, Sharon R.; Pease, William S.; Freimer, Miriam L.
2013-01-01
OBJECTIVE We investigated the psychometric properties of the Flinn Performance Screening Tool (FPST) for people referred with symptoms of carpal tunnel syndrome (CTS). METHOD An occupational therapist collected data from 46 participants who completed the Functional Status Scale (FSS) and FPST after the participants’ nerve conduction velocity study to test convergent and contrasted-group validity. RESULTS Seventy-four percent of the participants had abnormal nerve conduction studies. Cronbach’s α coefficients for subscale and total scores of the FPST ranged from .96 to .98. Intrarater reliability for six shared items of the FSS and the FPST was supported by high agreement (71%) and a fair κ statistic (.36). Strong to moderate positive relationships were found between the FSS and FPST scores. Functional status differed significantly among severe, mild, and negative CTS severity groups. CONCLUSION The FPST shows adequate psychometric properties as a client-centered screening tool for occupational performance of people referred for symptoms of CTS. PMID:22549598
Peiris, David; Usherwood, Tim; Weeramanthri, Tarun; Cass, Alan; Patel, Anushka
2011-11-01
This article explores Australian general practitioners' (GPs) views on a novel electronic decision support (EDS) tool being developed for cardiovascular disease management. We use Timmermans and Berg's technology-in-practice approach to examine how technologies influence and are influenced by the social networks in which they are placed. In all, 21 general practitioners who piloted the tool were interviewed. The tool occupied an ill-defined middle ground in a dialectical relationship between GPs' routine care and factors promoting best practice. Drawing on Lipsky's concept of 'street-level bureaucrats', the tool's ability to process workloads expeditiously was of greatest appeal to GPs. This feature of the tool gave it the potential to alter the structure, process and content of healthcare encounters. The credibility of EDS tools appears to be mediated by fluid notions of best practice, based on an expert scrutiny of the evidence, synthesis via authoritative guidelines and dissemination through trusted and often informal networks. Balanced against this is the importance of 'soft' forms of knowledge such as intuition and timing in everyday decision-making. This resonates with Aristotle's theory of phronesis (practical wisdom) and may render EDS tools inconsequential if they merely process biomedical data. While EDS tools show promise in improving health practitioner performance, the socio-technical dimensions of their implementation warrant careful consideration. © 2011 The Authors. Sociology of Health & Illness © 2011 Foundation for the Sociology of Health & Illness/Blackwell Publishing Ltd.
Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calyam, Prasad
2014-09-15
The next-generation of high-performance networks being developed in DOE communities are critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.
WFIRST: Data/Instrument Simulation Support at IPAC
NASA Astrophysics Data System (ADS)
Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin
2018-01-01
As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies with WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides access for the science community to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations and sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.
Interface methods for using intranet portal organizational memory information system.
Ji, Yong Gu; Salvendy, Gavriel
2004-12-01
In this paper, an intranet portal is considered as an information infrastructure (organizational memory information system, OMIS) supporting organizational learning. The properties and the hierarchical structure of information and knowledge in an intranet portal OMIS were identified as a problem for navigation tools of an intranet portal interface. The problem relates to navigation and retrieval functions of intranet portal OMIS and is expected to adversely affect user performance, satisfaction, and usefulness. To solve the problem, a conceptual model for navigation tools of an intranet portal interface was proposed and an experiment using a crossover design was conducted with 10 participants. In the experiment, a separate access method (tabbed tree tool) was compared to a unified access method (single tree tool). The results indicate that each information/knowledge repository for which a user has a different structural knowledge should be handled separately with separate access to increase user satisfaction and the usefulness of the OMIS and to improve user performance in navigation.
Integrating reliability and maintainability into a concurrent engineering environment
NASA Astrophysics Data System (ADS)
Phillips, Clifton B.; Peterson, Robert R.
1993-02-01
This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry considered the study important and provided the university access to innovative tools under a cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance during operation.
Classification Algorithms for Big Data Analysis, a Map Reduce Approach
NASA Astrophysics Data System (ADS)
Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.
2015-03-01
For many years, the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
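The MapReduce classification workflow described above (label each data split independently in the map step, then aggregate results in the reduce step) can be sketched with plain Python map/reduce as a stand-in for Hadoop. The `classify` rule is a toy placeholder for a trained WEKA model, which in the real tool would be distributed to every mapper.

```python
from collections import Counter
from functools import reduce

def classify(sample):
    """Toy decision rule standing in for a trained classifier (e.g., SVM)."""
    return "vegetation" if sample > 0.5 else "soil"

def map_partition(partition):
    """Map step: label every sample in one data split independently and
    return a per-partition label histogram."""
    return Counter(classify(x) for x in partition)

def reduce_counts(a, b):
    """Reduce step: merge per-partition label histograms."""
    return a + b

def classify_big_data(partitions):
    """Run the map step over all splits, then fold the results together."""
    return reduce(reduce_counts, map(map_partition, partitions))
```

Because each partition is classified independently, the map step parallelizes trivially across cluster nodes; only the cheap histogram merge is serialized.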
Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J
2017-07-14
In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Using competences and competence tools in workforce development.
Green, Tess; Dickerson, Claire; Blass, Eddie
The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs.
Reversible micromachining locator
Salzer, Leander J.; Foreman, Larry R.
2002-01-01
A locator with a part support is used to hold a part onto the kinematic mount of a tooling machine so that the part can be held in or replaced in exactly the same position relative to the cutting tool for machining different surfaces of the part or for performing different machining operations on the same or different surfaces of the part. The locator has disposed therein a plurality of steel balls placed at equidistant positions around the planar surface of the locator and the kinematic mount has a plurality of magnets which alternate with grooves which accommodate the portions of the steel balls projecting from the locator. The part support holds the part to be machined securely in place in the locator. The locator can be easily detached from the kinematic mount, turned over, and replaced onto the same kinematic mount or another kinematic mount on another tooling machine without removing the part to be machined from the locator so that there is no need to touch or reposition the part within the locator, thereby assuring exact replication of the position of the part in relation to the cutting tool on the tooling machine for each machining operation on the part.
Isupov, Inga; McInnes, Matthew D F; Hamstra, Stan J; Doherty, Geoffrey; Gupta, Ashish; Peddle, Susan; Jibri, Zaid; Rakhra, Kawan; Hibbert, Rebecca M
2017-04-01
The purpose of this study is to develop a tool to assess the procedural competence of radiology trainees, with sources of evidence gathered from five categories to support the construct validity of the tool: content, response process, internal structure, relations to other variables, and consequences. A pilot form for assessing procedural competence among radiology residents, known as the RAD-Score tool, was developed by evaluating published literature and using a modified Delphi procedure involving a group of local content experts. The pilot version of the tool was tested by seven radiology department faculty members who evaluated procedures performed by 25 residents at one institution between October 2014 and June 2015. Residents were evaluated while performing multiple procedures in both clinical and simulation settings. The main outcome measure was the percentage of residents who were considered ready to perform procedures independently, with testing conducted to determine differences between levels of training. A total of 105 forms (for 52 procedures performed in a clinical setting and 53 procedures performed in a simulation setting) were collected for a variety of procedures (eight vascular or interventional, 42 body, 12 musculoskeletal, 23 chest, and 20 breast procedures). A statistically significant difference was noted in the percentage of trainees who were rated as being ready to perform a procedure independently (in postgraduate year [PGY] 2, 12% of residents; in PGY3, 61%; in PGY4, 85%; and in PGY5, 88%; p < 0.05); this difference persisted in the clinical and simulation settings. User feedback and psychometric analysis were used to create a final version of the form. This prospective study describes the successful development of a tool for assessing the procedural competence of radiology trainees with high levels of construct validity in multiple domains.
Implementation of the tool in the radiology residency curriculum is planned and can play an instrumental role in the transition to competency-based radiology training.
Self-regulation workshop and Occupational Performance Coaching with teachers: A pilot study.
Hui, Caroline; Snider, Laurie; Couture, Mélanie
2016-04-01
Teachers' occupational role and performance can be undermined when working with students with disruptive classroom behaviours. This pilot study aimed to explore the impact of school-based occupational therapy intervention on teachers' classroom management self-efficacy and perceived performance/satisfaction in their management of students with disruptive behaviours. This pilot study used a multiple-case replication study design. A cohort of regular classroom elementary school teachers (n = 11) participated in a 1-day workshop on sensorimotor strategies for supporting student self-regulation followed by eight individual sessions of Occupational Performance Coaching (OPC). Measurement tools were the Canadian Occupational Performance Measure, Goal Attainment Scaling (GAS), and Teachers' Self-Efficacy Scale-Classroom Management. Improvement in teachers' perception of performance, satisfaction, and classroom management was seen. GAS showed clinically significant improvement. Improvements were sustained at 7 weeks follow-up. Preliminary results support the use of sensorimotor education combined with OPC to enable teachers' occupational performance. © CAOT 2016.
ERIC Educational Resources Information Center
Hwang, Gwo-Jen; Sung, Han-Yu; Chang, Hsuan
2017-01-01
Researchers have pointed out that interactive e-books have rich content and interactive features which can promote students' learning interest. However, researchers have also indicated the need to integrate effective learning supports or tools to help students organize what they have learned so as to increase their learning performance, in…
2017-02-01
must evaluate compliance with reporting requirements frequently so they can readily identify delinquent past performance reports. The FAR also...problems the contractor recovered from without impact to the contract/order. There should have been no significant weaknesses identified. A...contractor had trouble overcoming and state how it impacted the Government. A Marginal rating should be supported by referencing the management tool
ERIC Educational Resources Information Center
Prew, Martin; Quaigrain, Kenneth
2010-01-01
This article looks at a school management tool that allows school managers and education district offices to review the performance of their schools and use the broad-based data to undertake orchestrated planning with districts planning delivery based on the needs of schools and in support of school improvement plans. The review process also…
The Business Value of Superior Energy Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKane, Aimee; Scheihing, Paul; Evans, Tracy
Industrial facilities participating in the U.S. Department of Energy’s (US DOE) Superior Energy Performance (SEP) program are finding that it provides them with significant business value. This value starts with the implementation of the ISO 50001 energy management system standard, which provides an internationally-relevant framework for integration of energy management into an organization’s business processes. The resulting structure emphasizes effective use of available data and supports continual improvement of energy performance. International relevance is particularly important for companies with a global presence or trading interests, providing them with access to supporting ISO standards and a growing body of certified companies representing the collective knowledge of communities of practice. This paper examines the business value of SEP, a voluntary program that builds on ISO 50001, inviting industry to demonstrate an even greater commitment through third-party verification of energy performance improvement to a specified level of achievement. Information from 28 facilities that have already achieved SEP certification illustrates key findings concerning both the value and the challenges of SEP/ISO 50001 implementation. These include the facilities’ experience with implementation; the internal and external value of third-party verification of energy performance improvement; attractive payback periods; and the importance of SEP tools and guidance. US DOE is working to bring the program to scale, including the Enterprise-Wide Accelerator (SEP for multiple facilities in a company), the Ratepayer-Funded Program Accelerator (supporting tools for utilities and program administrators to include SEP in their program offerings), and expansion of the program to other sectors and industry supply chains.
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and one quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
Talamo, Alessandra; Mellini, Barbara; Barbieri, Barbara
2017-01-01
This paper aims to describe how nurses' planning and coordination work is performed through the use of locally designed tools (i.e., diaries, planners, reminders, and organizers). These tools are investigated as the materialization of organizational work, thus offering a complementary perspective on nursing practice to that proposed by the professional mandate and supported by official artifacts in use. Ethnographic study. By analyzing locally designed artifacts, the rationale that enables nurses to make the flow of activities work is highlighted and explained. Evidence is provided by a description of how nurses' tacit knowledge is reified and embedded into objects produced by the nurses themselves. Implications for the design of digital systems supporting nursing practice are discussed. The analysis of these artifacts has allowed an understanding of practices used by the nurses to manage the workflow in the wards.
A decision support tool for synchronizing technology advances with strategic mission objectives
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Willoughby, John K.
1992-01-01
Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a data-driven decision-support tool called MisTec for helping strategic planners determine technology development alternatives and synchronize technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for manned Mars missions, but can be adapted to support other high-technology, long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives that make the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model
2016-01-01
The omnipresent need for optimisation requires constant improvement of companies’ business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually achieved by simulating the newly developed BP under various initial conditions and “what-if” scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
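The DEX/QQ idea, qualitative ratings mapped to numbers and aggregated up a criteria hierarchy, can be illustrated with a toy two-level model. The criteria names, qualitative scale, and weights below are invented for illustration and are not taken from the paper's model:

```python
# Hypothetical sketch of a two-level hierarchical scoring model in the spirit
# of DEX/QQ: qualitative leaf ratings are mapped to numeric scores (the QQ
# step) and aggregated bottom-up with weights. All names/weights are made up.

QUAL = {"poor": 1.0, "fair": 2.0, "good": 3.0, "excellent": 4.0}

# Leaf criteria grouped under two intermediate nodes of the hierarchy.
HIERARCHY = {
    "visual_quality": {"weight": 0.4, "leaves": ["diagramming", "animation"]},
    "analytics":      {"weight": 0.6, "leaves": ["statistics", "reporting"]},
}

def score(tool_ratings):
    """Aggregate qualitative leaf ratings into a single numeric score."""
    total = 0.0
    for node in HIERARCHY.values():
        leaves = node["leaves"]
        node_score = sum(QUAL[tool_ratings[leaf]] for leaf in leaves) / len(leaves)
        total += node["weight"] * node_score
    return total

tools = {
    "ToolA": {"diagramming": "excellent", "animation": "good",
              "statistics": "fair", "reporting": "good"},
    "ToolB": {"diagramming": "fair", "animation": "fair",
              "statistics": "excellent", "reporting": "excellent"},
}

ranking = sorted(tools, key=lambda t: score(tools[t]), reverse=True)
print(ranking)  # tool strongest on the heavier-weighted subtree ranks first
```

Extending the hierarchy, the paper's first claimed contribution, amounts here to adding another entry to `HIERARCHY`.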
ERIC Educational Resources Information Center
1998
This document contains four papers from a symposium on technology in human resource development (HRD). "COBRA, an Electronic Performance Support System for the Analysis of Jobs and Tasks" (Theo J. Bastiaens) is described as an integrated computerized environment that provides tools, information, advice, and training to help employees do…
Water and wastewater infrastructure systems represent a major capital investment; utilities must ensure they are getting the highest yield possible on their investment, both in terms of dollars and water quality. Accurate information related to equipment, pipe characteristics, lo...
Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support
NASA Technical Reports Server (NTRS)
Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun
2012-01-01
This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, the tool utilizes Monte Carlo simulation of a series of space missions to assess resource requirements and expected return on investment. Interactions between simulated missions, such as competition for the launch site manifest, are incorporated to capture unexpected and non-linear system behaviors. A novel value model provides an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool's utility.
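A campaign-level Monte Carlo roll-up of this kind can be sketched in a few lines. The mission names, triangular cost distributions, success probabilities, and the simple additive value model below are all invented placeholders, not the Langley tool's actual inputs or formulation:

```python
import random

# Illustrative Monte Carlo sketch of campaign cost/value assessment.
# Every number below is a made-up placeholder.

MISSIONS = [
    # (name, cost low/mode/high in $M, launch success probability, science value)
    ("Mission-1", (400, 500, 700), 0.95, 10.0),
    ("Mission-2", (600, 800, 1200), 0.92, 14.0),
    ("Mission-3", (300, 350, 500), 0.97, 6.0),
]

def simulate_campaign(rng):
    """One draw of total campaign cost and realized science value."""
    cost = value = 0.0
    for _name, (lo, mode, hi), p_success, sci in MISSIONS:
        cost += rng.triangular(lo, hi, mode)  # uncertain life-cycle cost
        if rng.random() < p_success:          # launch/on-orbit success
            value += sci
    return cost, value

def expected_roi(n_trials=20000, seed=1):
    """Mean value-per-cost ratio over many simulated campaigns."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_trials):
        cost, value = simulate_campaign(rng)
        ratios.append(value / cost)
    return sum(ratios) / n_trials

print(expected_roi())  # expected science value per $M of campaign cost
```

Mission interactions such as manifest competition would enter `simulate_campaign` as dependencies between draws, which is where non-linear behaviors emerge.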
The CPAT 2.0.2 Domain Model - How CPAT 2.0.2 "Thinks" From an Analyst Perspective.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Melander, Darryl J.
To help effectively plan the management and modernization of their large and diverse fleets of vehicles, the Program Executive Office Ground Combat Systems (PEO GCS) and the Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet, respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This report contains a description of the organizational fleet structure and a thorough explanation of the business rules that the CPAT formulation follows involving performance, scheduling, production, and budgets. This report, which is an update to the original CPAT domain model published in 2015 (SAND2015-4009), covers important new CPAT features.
Supporting creativity and appreciation of uncertainty in exploring geo-coded public health data.
Thew, S L; Sutcliffe, A; de Bruijn, O; McNaught, J; Procter, R; Jarvis, Paul; Buchan, I
2011-01-01
We present a prototype visualisation tool, ADVISES (Adaptive Visualization for e-Science), designed to support epidemiologists and public health practitioners in exploring geo-coded datasets and generating spatial epidemiological hypotheses. The tool is designed to support creative thinking while providing the means for the user to evaluate the validity of the visualization in terms of statistical uncertainty. We present an overview of the application and the results of an evaluation exploring public health researchers' responses to maps as a new way of viewing familiar data, in particular the use of thematic maps with adjoining descriptive statistics and forest plots to support the generation and evaluation of new hypotheses. A series of qualitative evaluations involved one experienced researcher asking 21 volunteers to interact with the system to perform a series of relatively complex, realistic map-building and exploration tasks, using a 'think aloud' protocol, followed by a semi-structured interview. The volunteers were academic epidemiologists and UK National Health Service analysts. All users quickly and confidently created maps, and went on to spend substantial amounts of time exploring and interacting with the system, generating hypotheses about their maps. Our findings suggest that the tool is able to support creativity and statistical appreciation among public health professionals and epidemiologists building thematic maps. Software such as this, introduced appropriately, could increase the capability of existing personnel for generating public health intelligence.
A computer aided engineering tool for ECLS systems
NASA Technical Reports Server (NTRS)
Bangham, Michal E.; Reuter, James L.
1987-01-01
The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analyst interface used is graphics-based, and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.
Wacker, Michael A.
2010-01-01
Borehole geophysical logs were obtained from selected exploratory coreholes in the vicinity of the Florida Power and Light Company Turkey Point Power Plant. The geophysical logging tools used and logging sequences performed during this project are summarized herein to include borehole logging methods, descriptions of the properties measured, types of data obtained, and calibration information.
A workshop will be conducted to demonstrate and focus on two decision support tools developed at EPA/ORD: 1. Community-scale MARKAL model: an energy-water technology evaluation tool and 2. Municipal Solid Waste Decision Support Tool (MSW DST). The Workshop will be part of Southea...
Vasan, Ashwin; Mabey, David C; Chaudhri, Simran; Brown Epstein, Helen-Ann; Lawn, Stephen D
2017-04-01
Primary health care workers (HCWs) in low- and middle-income settings (LMIC) often work in challenging conditions in remote, rural areas, in isolation from the rest of the health system and particularly specialist care. Much attention has been given to implementation of interventions to support quality and performance improvement for workers in such settings. However, little is known about the design of such initiatives and which approaches predominate, let alone those that are most effective. We aimed for a broad understanding of what distinguishes different approaches to primary HCW support and performance improvement and to clarify the existing evidence as well as gaps in evidence in order to inform decision-making and design of programs intended to support and improve the performance of health workers in these settings. We systematically searched the literature for articles addressing this topic, and undertook a comparative review to document the principal approaches to performance and quality improvement for primary HCWs in LMIC settings. We identified 40 eligible papers reporting on interventions that we categorized into five different approaches: (1) supervision and supportive supervision; (2) mentoring; (3) tools and aids; (4) quality improvement methods, and (5) coaching. The variety of study designs and quality/performance indicators precluded a formal quantitative data synthesis. The most extensive literature was on supervision, but there was little clarity on what defines the most effective approach to the supervision activities themselves, let alone the design and implementation of supervision programs. The mentoring literature was limited, and largely focused on clinical skills building and educational strategies. Further research on how best to incorporate mentorship into pre-service clinical training, while maintaining its function within the routine health system, is needed. 
There is insufficient evidence to draw conclusions about coaching in this setting; however, a review of the corporate and business school literature is warranted to identify transferable approaches. A substantial literature exists on tools, but significant variation in approaches makes comparison challenging. We found examples of effective individual projects and designs in specific settings, but there was a lack of comparative research on tools across approaches or across settings, and no systematic analysis within specific approaches to provide evidence with clear generalizability. Future research should prioritize comparative intervention trials to establish clear global standards for performance and quality improvement initiatives. Such standards will be critical to creating and sustaining a well-functioning health workforce and for global initiatives such as universal health coverage. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Community detection in complex networks using proximate support vector clustering
NASA Astrophysics Data System (ADS)
Wang, Feifan; Zhang, Baihai; Chai, Senchun; Xia, Yuanqing
2018-03-01
Community structure, one of the most attention-attracting properties of complex networks, has been a cornerstone of advances in various scientific branches. A number of tools have been involved in recent studies concentrating on community detection algorithms. In this paper, we propose a support vector clustering method based on a proximity graph, owing to which the introduced algorithm surpasses the traditional support vector approach in both accuracy and complexity. Results of extensive experiments on computer-generated networks and real-world data sets show competitive performance in comparison with its counterparts.
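The proximity-graph step can be illustrated without the full support vector machinery: connect each point to its k nearest neighbours and read community candidates off as connected components. The 2-D points below are made up, and this sketch deliberately omits the support vector clustering stage the paper layers on top:

```python
import math

# Minimal proximity-graph sketch: k-nearest-neighbour edges plus a
# union-find pass to count connected components. Toy data only.

def knn_edges(points, k=2):
    """Connect each point to its k nearest neighbours."""
    edges = []
    for i, p in enumerate(points):
        dists = sorted(
            (math.dist(p, q), j) for j, q in enumerate(points) if j != i
        )
        edges.extend((i, j) for _, j in dists[:k])
    return edges

def components(n, edges):
    """Count connected components with path-halving union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    return len({find(i) for i in range(n)})

pts = [(0, 0), (0, 1), (1, 0),        # one tight cluster
       (10, 10), (10, 11), (11, 10)]  # another, far away
print(components(len(pts), knn_edges(pts)))  # two communities
```

Because neighbour queries dominate the cost, the proximity graph is also where the complexity advantage over all-pairs support vector clustering would come from.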
Statistical methods for identifying and bounding a UXO target area or minefield
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinstry, Craig A.; Pulsipher, Brent A.; Gilbert, Richard O.
2003-09-18
The sampling unit for minefield or UXO area characterization is typically represented by a geographical block or transect swath that lends itself to characterization by geophysical instrumentation such as mobile sensor arrays. New spatially based statistical survey methods and tools, more appropriate for these unique sampling units, have been developed and implemented at PNNL (Visual Sample Plan software, ver. 2.0) with support from the US Department of Defense. Though originally developed to support UXO detection and removal efforts, these tools may also be used in current form or adapted to support demining efforts and aid in the development of new sensors and detection technologies by explicitly incorporating both sampling and detection error in performance assessments. These tools may be used to (1) determine transect designs for detecting and bounding target areas of critical size, shape, and density of detectable items of interest with a specified confidence probability, (2) evaluate the probability that target areas of a specified size, shape, and density have not been missed by a systematic or meandering transect survey, and (3) support post-removal verification by calculating the number of transects required to achieve a specified confidence probability that no UXO or mines have been missed.
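The first two capabilities rest on a simple geometric idea: parallel transects of swath width w, spaced S apart, cross a circular target area of radius r (at uniform random position) with probability (w + 2r)/S. The sketch below shows only that geometry; it ignores the detection-error component the tools also model, and the numbers are illustrative, not from Visual Sample Plan:

```python
# Back-of-the-envelope transect-survey geometry (sampling error only;
# detection error is deliberately ignored in this sketch).

def traversal_probability(swath_w, spacing, radius):
    """P(at least one parallel transect crosses a circular target area)."""
    return min(1.0, (swath_w + 2.0 * radius) / spacing)

def required_spacing(swath_w, radius, confidence):
    """Widest spacing that still achieves the desired traversal probability."""
    return (swath_w + 2.0 * radius) / confidence

# A 2 m sensor swath, a 50 m diameter target area, 95% confidence:
print(round(required_spacing(2.0, 25.0, 0.95), 1))  # ~54.7 m between transects
```

Capability (2), the chance a completed survey missed a target, is just the complement of `traversal_probability` for the spacing actually flown.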
Loads produced by a suited subject performing tool tasks without the use of foot restraints
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar L.; Poliner, Jeffrey; Klute, Glenn K.
1993-01-01
With an increase in the frequency of extravehicular activities (EVA's) aboard the Space Shuttle, NASA is interested in determining the capabilities of suited astronauts while performing manual tasks during an EVA, in particular the situations in which portable foot restraints are not used to stabilize the astronauts. Efforts were made to document the forces that are transmitted to spacecraft while pushing and pulling an object as well as while operating a standard wrench and an automatic power tool. The six subjects studied aboard the KC-135 reduced gravity aircraft were asked to exert a maximum torque and to maintain a constant level of torque with a wrench, to push and pull an EVA handrail, and to operate a Hubble Space Telescope (HST) power tool. The results give an estimate of the forces and moments that an operator will transmit to the handrail as well as to the supporting structure. In general, it was more effective to use the tool inwardly toward the body rather than away from the body. There were no differences in terms of strength capabilities between right and left hands. The power tool was difficult to use. It is suggested that ergonomic redesigning of the power tool may increase the efficiency of power tool use.
Data management system advanced development
NASA Technical Reports Server (NTRS)
Douglas, Katherine; Humphries, Terry
1990-01-01
The Data Management System (DMS) Advanced Development task provides for the development of concepts, new tools, DMS services, and for the testing of the Space Station DMS hardware and software. It also provides for the development of techniques capable of determining the effects of system changes/enhancements, additions of new technology, and/or hardware and software growth on system performance. This paper will address the built-in characteristics which will support network monitoring requirements in the design of the evolving DMS network implementation, functional and performance requirements for a real-time, multiprogramming, multiprocessor operating system, and the possible use of advanced development techniques such as expert systems and artificial intelligence tools in the DMS design.
Aeroelastic Optimization of Generalized Tube and Wing Aircraft Concepts Using HCDstruct Version 2.0
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
Major enhancements were made to the Higher-fidelity Conceptual Design and structural optimization (HCDstruct) tool developed at NASA Langley Research Center (LaRC). Whereas previous versions were limited to hybrid wing body (HWB) configurations, the current version of HCDstruct now supports the analysis of generalized tube and wing (TW) aircraft concepts. Along with significantly enhanced user input options for all aircraft configurations, these enhancements represent HCDstruct version 2.0. Validation was performed using a Boeing 737-200 aircraft model, for which primary structure weight estimates agreed well with available data. Additionally, preliminary analysis of the NASA D8 (ND8) aircraft concept was performed, highlighting several new features of the tool.
A tool to convert CAD models for importation into Geant4
NASA Astrophysics Data System (ADS)
Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration
2017-10-01
The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.
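To see why standard geometric shapes beat tessellation, consider the kind of output such a converter produces for a simple part. The sketch below emits a minimal GDML fragment for a single box; the schema is abbreviated, the material reference is an assumption, and this is illustrative rather than SW2GDML's actual output:

```python
import xml.etree.ElementTree as ET

# Minimal GDML-style fragment for one box-shaped part: a single <box>
# primitive instead of the thousands of tessellated facets a STEP-based
# conversion would typically produce. Schema abbreviated; material assumed.

def box_gdml(name, x_mm, y_mm, z_mm):
    gdml = ET.Element("gdml")
    solids = ET.SubElement(gdml, "solids")
    ET.SubElement(solids, "box", name=f"{name}_solid", lunit="mm",
                  x=str(x_mm), y=str(y_mm), z=str(z_mm))
    structure = ET.SubElement(gdml, "structure")
    vol = ET.SubElement(structure, "volume", name=f"{name}_vol")
    ET.SubElement(vol, "materialref", ref="G4_Al")  # assumed material
    ET.SubElement(vol, "solidref", ref=f"{name}_solid")
    setup = ET.SubElement(gdml, "setup", name="Default", version="1.0")
    ET.SubElement(setup, "world", ref=f"{name}_vol")
    return ET.tostring(gdml, encoding="unicode")

print(box_gdml("bracket", 100, 40, 5))
```

A tessellated export of the same bracket would instead carry one vertex list and facet per triangle, which is the size and tracking-performance penalty the paper describes.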
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lutes, Robert G.; Neubauer, Casey C.; Haack, Jereme N.
2015-03-31
The Department of Energy’s (DOE’s) Building Technologies Office (BTO) is supporting the development of an open-source software tool for analyzing building energy and operational data: OpenEIS (open energy information system). This tool addresses the problems of both owners of building data and developers of tools to analyze those data. Building owners and managers have data but lack the tools to analyze it, while tool developers lack data in a common format to ease development of reusable data analysis tools. This document is intended for developers of applications and explains the mechanisms for building analysis applications, accessing data, and displaying data using a visualization from the included library. A brief introduction to the visualizations can be used as a jumping-off point for developers familiar with JavaScript to produce their own. Several example applications are included which can be used along with this document to implement algorithms for performing energy data analysis.
Mueller, Martina; Wagner, Carol L; Annibale, David J; Knapp, Rebecca G; Hulsey, Thomas C; Almeida, Jonas S
2006-03-01
Approximately 30% of intubated preterm infants with respiratory distress syndrome (RDS) will fail attempted extubation, requiring reintubation and mechanical ventilation. Although ventilator technology and monitoring of premature infants have improved over time, optimal extubation remains challenging. Furthermore, extubation decisions for premature infants require complex informational processing, techniques implicitly learned through clinical practice. Computer-aided decision-support tools would benefit inexperienced clinicians, especially during peak neonatal intensive care unit (NICU) census. A five-step procedure was developed to identify predictive variables. Clinical expert (CE) thought processes comprised one model. Variables from that model were used to develop two mathematical models for the decision-support tool: an artificial neural network (ANN) and a multivariate logistic regression model (MLR). The ranking of the variables in the three models was compared using the Wilcoxon Signed Rank Test. The best-performing model was used in a web-based decision-support tool, with a user interface implemented in Hypertext Markup Language (HTML) and the mathematical model employing the ANN. CEs identified 51 potentially predictive variables for extubation decisions for an infant on mechanical ventilation. Comparisons of the three models showed a significant difference between the ANN and the CE (p = 0.0006). Of the original 51 potentially predictive variables, the 13 most predictive were used to develop the ANN for the web-based decision-support tool. The ANN processes user-provided data and returns a prediction score between 0 and 1 along with a novelty index. The user then selects the most appropriate threshold for categorizing the prediction as a success or failure.
Furthermore, the novelty index, indicating the similarity of the test case to the training cases, allows the user to assess the confidence level of the prediction with regard to how much the new data differ from the data originally used for the development of the prediction tool. State-of-the-art machine-learning methods can be employed for the development of sophisticated tools to aid clinicians' decisions. We identified numerous variables considered relevant for extubation decisions for mechanically ventilated premature infants with RDS. We then developed a web-based decision-support tool for clinicians which can be made widely available and potentially improve patient care worldwide.
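The two outputs described, a 0-1 prediction score and a novelty index, can be sketched with a toy stand-in model. The features, weights, and training cases below are fabricated for illustration and bear no relation to the paper's 13 clinical variables or its trained network:

```python
import math

# Toy stand-in for the two outputs: a sigmoid prediction score in (0, 1)
# and a novelty index = distance to the nearest training case (large means
# the new case is unlike the training data, so confidence is lower).

TRAIN = [  # (features, extubation success) -- fabricated toy data
    ([0.2, 0.9], 1), ([0.3, 0.8], 1), ([0.8, 0.2], 0), ([0.9, 0.3], 0),
]
WEIGHTS, BIAS = [-4.0, 4.0], 0.0  # assumed, not fitted here

def predict(x):
    """Prediction score in (0, 1); the user picks the success threshold."""
    z = sum(w * xi for w, xi in zip(WEIGHTS, x)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))

def novelty(x):
    """Distance to the nearest training case."""
    return min(math.dist(x, feats) for feats, _ in TRAIN)

case = [0.25, 0.85]  # resembles the "success" training cases
print(round(predict(case), 3), round(novelty(case), 3))
```

The pairing matters: a high score with a high novelty index is exactly the situation where the abstract says the clinician should discount the tool's prediction.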
Ewing, Gail; Austin, Lynn; Grande, Gunn
2016-04-01
The importance of supporting family carers is well recognised in healthcare policy. The Carer Support Needs Assessment Tool is an evidence-based, comprehensive measure of carer support needs to facilitate carer support in palliative home care. To examine practitioner perspectives of the role of the Carer Support Needs Assessment Tool intervention in palliative home care to identify its impact and mechanisms of action. A qualitative study: practitioner accounts of implementation (interviews, focus groups, reflective audio diaries) plus researcher field notes. A total of 29 staff members from two hospice home-care services with contrasting geographical locations, different service sizes and staff composition. A thematic analysis was conducted. Existing approaches to identification of carer needs were informal and unstructured. Practitioners expressed some concerns, pre-implementation, about negative impacts of the Carer Support Needs Assessment Tool on carers and expectations raised about support available. In contrast, post-implementation, the Carer Support Needs Assessment Tool provided positive impacts when used as part of a carer-led assessment and support process: it made support needs visible, legitimised support for carers and opened up different conversations with carers. The mechanisms of action that enabled the Carer Support Needs Assessment Tool to make a difference were creating space for the separate needs of carers, providing an opportunity for carers to express support needs and responding to carers' self-defined priorities. The Carer Support Needs Assessment Tool delivered benefits through a change in practice to an identifiable, separate assessment process for carers, facilitated by practitioners but carer-led. Used routinely with all carers, the Carer Support Needs Assessment Tool has the potential to normalise carer assessment and support, facilitate delivery of carer-identified support and enable effective targeting of resources. © The Author(s) 2015.
Matheson, Heath E; Buxbaum, Laurel J; Thompson-Schill, Sharon L
2017-11-01
Our use of tools is situated in different contexts. Prior evidence suggests that diverse regions within the ventral and dorsal streams represent information supporting common tool use. However, given the flexibility of object concepts, these regions may be tuned to different types of information when generating novel or uncommon uses of tools. To investigate this, we collected fMRI data from participants who reported common or uncommon tool uses in response to visually presented familiar objects. We performed a pattern dissimilarity analysis in which we correlated cortical patterns with behavioral measures of visual, action, and category information. The results showed that evoked cortical patterns within the dorsal tool use network reflected action and visual information to a greater extent in the uncommon use group, whereas evoked neural patterns within the ventral tool use network reflected categorical information more strongly in the common use group. These results reveal the flexibility of cortical representations of tool use and the situated nature of cortical representations more generally.
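The pattern dissimilarity analysis described here follows a standard recipe: vectorize the upper triangle of a neural dissimilarity matrix and of a behavioral model matrix, then correlate the two vectors. A minimal sketch with toy 3-condition matrices, using Pearson correlation rather than whatever specific statistic the authors used:

```python
# Sketch of pattern-dissimilarity (representational similarity) logic:
# correlate the upper triangles of a neural and a model dissimilarity
# matrix. The matrices below are toy values, not data from the study.

def upper_tri(m):
    """Flatten the strict upper triangle of a square matrix."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Dissimilarity between evoked patterns for three tool-use conditions,
# and a behavioral model of their action dissimilarity (toy numbers).
neural_rdm = [[0.0, 0.2, 0.9], [0.2, 0.0, 0.8], [0.9, 0.8, 0.0]]
action_rdm = [[0.0, 0.1, 1.0], [0.1, 0.0, 0.7], [1.0, 0.7, 0.0]]

r = pearson(upper_tri(neural_rdm), upper_tri(action_rdm))
print(round(r, 2))  # high r: this region's patterns track action information
```

The group comparison in the study amounts to computing such an r per region and information model (visual, action, category) and contrasting it between the common-use and uncommon-use conditions.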
Orbiter Return-To-Flight Entry Aeroheating
NASA Technical Reports Server (NTRS)
Campbell, Charles H.; Anderson, Brian; Bourland, Gary; Bouslog, Stan; Cassady, Amy; Horvath, Tom; Berry, Scott A.; Gnoffo, Peter; Wood, Bill; Reuther, James;
2006-01-01
The Columbia accident on February 1, 2003 began an unprecedented level of effort within the hypersonic aerothermodynamic community to support the Space Shuttle Program. During the approximately six-month time frame of the primary Columbia Accident Investigation Board (CAIB) activity, many technical disciplines were involved in a concerted effort to reconstruct the last moments of the Columbia and her crew, and to understand the critical events that led to that loss. Significant contributions to the CAIB activity were made by the hypersonic aerothermodynamic community (REF CAIB) in understanding the re-entry environments that led to the propagation of ascent foam-induced wing leading edge damage to a subsequent breach of the wing spar of Columbia, and the subsequent breakup of the vehicle. A core of the NASA hypersonic aerothermodynamics team that was involved in the CAIB investigation has been combined with the United Space Alliance and Boeing Orbiter engineering team in order to position the Space Shuttle Program with a process to perform in-flight Thermal Protection System damage assessments. This damage assessment process is now part of the baselined plan for Shuttle support, and is a direct outgrowth of the Columbia accident and NASA's response. Multiple re-entry aeroheating tools are involved in this damage assessment process, many of which were developed during the Return To Flight activity. In addition, because these aeroheating tools are part of an overall damage assessment process that also involves the thermal and stress analysis communities, in addition to a much broader mission support team, an integrated process for performing the damage assessment activities has been developed by the Space Shuttle Program and the Orbiter engineering community. Several subsets of activity in the Orbiter aeroheating community's support to the Return To Flight effort have been described in previous publications (CFD?, Cavity Heating? Any BLT? Grid Generation?).
This work provides a description of the integrated process utilized to perform Orbiter tile damage assessment, and in particular of the integrated aeroheating tools utilized to perform these assessments. Individual aeroheating tools are described which provide the nominal re-entry heating environment characterization for the Orbiter, the heating environments for tile damage, the heating effects due to exposed Thermal Protection System substrates, the application of Computational Fluid Dynamics to the description of tile cavity heating, and boundary layer transition prediction. This paper is meant to provide an overall view of the integrated aeroheating assessment process for tile damage assessment, as one of a sequence of papers on the development of the boundary layer transition prediction capability in support of Space Shuttle Return To Flight efforts.
Wilson, Kenneth L; Doswell, Jayfus T; Fashola, Olatokunbo S; Debeatham, Wayne; Darko, Nii; Walker, Travelyan M; Danner, Omar K; Matthews, Leslie R; Weaver, William L
2013-09-01
The aim of this study was to extrapolate potential roles of augmented reality goggles as a clinical support tool assisting in the reduction of preventable causes of death on the battlefield. Our pilot study was designed to improve medic performance in accurately placing a large-bore catheter to release tension pneumothorax (prehospital setting) while using augmented reality goggles. Thirty-four preclinical medical students recruited from Morehouse School of Medicine performed needle decompressions on human cadaver models after hearing a brief training lecture on tension pneumothorax management. Clinical vignettes identifying cadavers as having life-threatening tension pneumothoraces as a consequence of improvised explosive device attacks were used. The study group (n = 13) performed needle decompression using augmented reality goggles, whereas the control group (n = 21) relied solely on memory from the lecture. The two groups were compared according to their ability to accurately complete the steps required to decompress a tension pneumothorax. The medical students using augmented reality goggle support were able to treat the tension pneumothorax on the human cadaver models more accurately than the students relying on their memory (p < 0.008). Although the augmented reality group required more time to complete the needle decompression intervention (p = 0.0684), this difference did not reach statistical significance. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
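The abstract reports only a between-group accuracy comparison (p < 0.008) without naming the test used. For a 2 × 2 success/failure table of this kind, a one-sided Fisher exact test is one common choice; the sketch below (pure Python, with Fisher's classic illustrative counts rather than the study's data) shows how such a p-value is computed:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    the probability, under the hypergeometric null with fixed margins,
    of a table at least as extreme as observed (cell a or larger)."""
    row1, col1, n = a + b, a + c, a + b + c + d
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    return p

# Fisher's "lady tasting tea" example: 3 of 4 cups correctly identified
p = fisher_exact_one_sided(3, 1, 1, 3)
print(round(p, 6))  # 17/70 ≈ 0.242857
```

With the study's actual per-step success counts in place of the toy table, the same computation yields the kind of p-value quoted in the abstract.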
Mulder, Sanne; de Rooy, Diederik
2018-01-01
In the last 35 yr, 17 commercial aviation accidents and incidents, with 576 fatalities, could likely be attributed to mental disease of a pilot. Screening tools for mental health risks in airline pilots are needed. There is growing interest in pilot peer-support programs and how to incorporate them in a just culture, meaning that pilots can report mental health complaints without a risk of job or income loss. We combined findings from aviation accidents and incidents with a search of the scientific literature to provide data-based recommendations for screening, peer support, and a just culture approach to mental health problems. Commercial aviation accidents and incidents in which a mental disorder of a pilot was thought to play a role were reviewed. Subsequently, PubMed and PsycINFO literature searches were performed on peer-support programs, just culture human resource management, and the risk of negative life events on developing suicidal ideation and behavior in comparable professional groups. Lethal accidents were mostly related to impaired coping with negative life events. Negative life events are clearly related to suicidal thoughts, attempts, and completed suicide. A protective effect of peer-support programs on mental health problems has not been established, although peer-support programs are generally appreciated by those involved. We did not find relevant literature on just culture. Negative life events are likely a useful screening tool for mental health risks. There is still a lack of evidence on how peer-support groups should be designed and how management of mental health risks can be implemented in a just culture. Mulder S, de Rooy D. Pilot mental health, negative life events, and improving safety with peer support and a just culture. Aerosp Med Hum Perform. 2018; 89(1):41-51.
COMBINE*: An integrated opto-mechanical tool for laser performance modeling
NASA Astrophysics Data System (ADS)
Rehak, M.; Di Nicola, J. M.
2015-02-01
Accurate modeling of thermal, mechanical and optical processes is important for achieving reliable, high-performance high energy lasers such as those at the National Ignition Facility (NIF) [1]. The need for this capability is even more critical for high average power, high repetition rate applications. Modeling the effects of stresses and temperature fields on optical properties allows for optimal design of optical components and, more generally, of the architecture of the laser system itself. Stresses change the indices of refraction and induce inhomogeneities and anisotropy. We present a modern, integrated analysis tool that efficiently produces reliable results that are used in our laser propagation tools such as VBL [5]. COMBINE is built on and supplants the existing legacy tools developed for the previous generations of lasers at LLNL, and also uses the commercially available mechanical finite element codes ANSYS or COMSOL (including computational fluid dynamics). The COMBINE code computes birefringence and wave front distortions due to mechanical stresses on lenses and slabs of arbitrary geometry. The stresses calculated typically originate from mounting support, vacuum load, gravity, heat absorption and/or attendant cooling. Of particular importance are the depolarization and detuning effects of nonlinear crystals due to thermal loading. Results are given in the form of Jones matrices, depolarization maps and wave front distributions. An incremental evaluation of Jones matrices and ray propagation in a 3D mesh with a stress and temperature field is performed. Wavefront and depolarization maps are available at the optical aperture and at slices within the optical element. The suite is validated, user friendly, supported, documented and amenable to collaborative development. * COMBINE stands for Code for Opto-Mechanical Birefringence Integrated Numerical Evaluations.
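The abstract says COMBINE reports its results as Jones matrices and depolarization maps. As an illustrative sketch (not the COMBINE implementation), stress-induced birefringence at a point can be modeled as a linear retarder whose Jones matrix is rotated to the local fast-axis angle; the retardance and angle values below are hypothetical:

```python
import cmath
import math

def matmul2(A, B):
    """Product of two 2x2 complex matrices."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def rotation(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def waveplate(delta, theta):
    """Jones matrix of a linear retarder (retardance delta) with its fast
    axis at angle theta: R(theta) @ diag(e^{-i d/2}, e^{+i d/2}) @ R(-theta)."""
    D = [[cmath.exp(-1j * delta / 2), 0], [0, cmath.exp(1j * delta / 2)]]
    return matmul2(rotation(theta), matmul2(D, rotation(-theta)))

# Horizontally polarized light through a stressed element, then a crossed
# (vertical) analyzer: the leaked intensity is the classic depolarization
# term sin^2(2*theta) * sin^2(delta/2).
delta, theta = 0.3, math.radians(20)   # hypothetical retardance and axis angle
J = waveplate(delta, theta)
E_in = [1.0, 0.0]                      # horizontal polarization
E_out = [J[0][0]*E_in[0] + J[0][1]*E_in[1],
         J[1][0]*E_in[0] + J[1][1]*E_in[1]]
leak = abs(E_out[1]) ** 2              # intensity passed by the vertical analyzer
analytic = math.sin(2 * theta) ** 2 * math.sin(delta / 2) ** 2
```

Evaluating such a matrix at every node of a stressed 3D mesh, with delta and theta derived from the local stress tensor via the photoelastic effect, yields the depolarization maps the abstract describes.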
Validation of the Virtual MET as an assessment tool for executive functions.
Rand, Debbie; Basha-Abu Rukan, Soraya; Weiss, Patrice L Tamar; Katz, Noomi
2009-08-01
The purpose of this study was to establish ecological validity and initial construct validity of a Virtual Multiple Errands Test (VMET) as an assessment tool for executive functions. It was implemented within the Virtual Mall (VMall), a novel functional video-capture virtual shopping environment. The main objectives were (1) to examine the relationships between the performance of three groups of participants in the Multiple Errands Test (MET) carried out in a real shopping mall and their performance in the VMET, (2) to assess the relationships between post-stroke participants' MET and VMET performance and their level of executive functioning and independence in instrumental activities of daily living, and (3) to compare the performance of post-stroke participants to those of healthy young and older controls in both the MET and VMET. The study population included three groups: post-stroke participants (n = 9), healthy young participants (n = 20), and healthy older participants (n = 20). The VMET was able to differentiate between two age groups of healthy participants and between healthy and post-stroke participants, thus demonstrating that it is sensitive to brain injury and ageing and supporting construct validity between known groups. In addition, significant correlations were found between the MET and the VMET for both the post-stroke participants and older healthy participants. This provides initial support for the ecological validity of the VMET as an assessment tool of executive functions. However, further psychometric data on temporal stability are needed, namely test-retest reliability and responsiveness, before it is ready for clinical application. Further research using the VMET as an assessment tool within the VMall with larger groups and in additional populations is also recommended.
FDA's Activities Supporting Regulatory Application of "Next Gen" Sequencing Technologies.
Wilson, Carolyn A; Simonyan, Vahan
2014-01-01
Applications of next-generation sequencing (NGS) technologies require availability and access to an information technology (IT) infrastructure and bioinformatics tools for large amounts of data storage and analyses. The U.S. Food and Drug Administration (FDA) anticipates that the use of NGS data to support regulatory submissions will continue to increase as the scientific and clinical communities become more familiar with the technologies and identify more ways to apply these advanced methods to support development and evaluation of new biomedical products. FDA laboratories are conducting research on different NGS platforms and developing the IT infrastructure and bioinformatics tools needed to enable regulatory evaluation of the technologies and the data sponsors will submit. A High-performance Integrated Virtual Environment, or HIVE, has been launched, and development and refinement continues as a collaborative effort between the FDA and George Washington University to provide the tools to support these needs. The use of a highly parallelized environment facilitated by use of distributed cloud storage and computation has resulted in a platform that is both rapid and responsive to changing scientific needs. The FDA plans to further develop in-house capacity in this area, while also supporting engagement by the external community, by sponsoring an open, public workshop to discuss NGS technologies and data formats standardization, and to promote the adoption of interoperability protocols in September 2014. Next-generation sequencing (NGS) technologies are enabling breakthroughs in how the biomedical community is developing and evaluating medical products. One example is the potential application of this method to the detection and identification of microbial contaminants in biologic products. In order for the U.S. 
Food and Drug Administration (FDA) to be able to evaluate the utility of this technology, we need to have the information technology infrastructure and bioinformatics tools to be able to store and analyze large amounts of data. To address this need, we have developed the High-performance Integrated Virtual Environment, or HIVE. HIVE uses a combination of distributed cloud storage and distributed cloud computations to provide a platform that is both rapid and responsive to support the growing and increasingly diverse scientific and regulatory needs of FDA scientists in their evaluation of NGS in research and ultimately for evaluation of NGS data in regulatory submissions. © PDA, Inc. 2014.
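The abstract describes HIVE's platform only at a high level: distributed cloud storage plus highly parallelized computation. A toy scatter/gather sketch of that pattern is shown below, computing per-read GC content in parallel and then aggregating; the function names and reads are illustrative, not HIVE's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

def gc_fraction(read):
    """Fraction of G/C bases in a single sequencing read."""
    return sum(base in "GC" for base in read) / len(read)

def parallel_gc(reads, workers=4):
    """Scatter the per-read work across a worker pool, then gather and
    aggregate: the map/reduce shape common to distributed NGS pipelines."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        fractions = list(pool.map(gc_fraction, reads))
    return sum(fractions) / len(fractions)

reads = ["ACGT", "GGCC", "ATAT", "GCGC"]
print(parallel_gc(reads))  # mean of 0.5, 1.0, 0.0, 1.0 = 0.625
```

In a real platform the "scatter" step distributes chunks of files held in cloud storage to many machines rather than threads, but the decomposition into independent per-chunk analyses followed by a single aggregation step is the same.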
The Web Resource Collaboration Center
ERIC Educational Resources Information Center
Dunlap, Joanna C.
2004-01-01
The Web Resource Collaboration Center (WRCC) is a web-based tool developed to help software engineers build their own web-based learning and performance support systems. Designed using various online communication and collaboration technologies, the WRCC enables people to: (1) build a learning and professional development resource that provides…
CBE Workplace Performance Webinar Series
Webinar topics include: Speech Privacy; Task-Ambient Conditioning; Team Space Design Study; Thermal Comfort; and Automotive Research, Design and Evaluation. Theory, tools and strategies to help professionals create and support successful workplaces. "Design With Science" (view slide presentation, PDF): Janice Barnes, PhD, LEED AP, Principal and Global
Pilot testing of SHRP 2 reliability data and analytical products: Washington. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
The Washington site used the reliability guide from Project L02, analysis tools for forecasting reliability and estimating impacts from Project L07, Project L08, and Project C11 as well as the guide on reliability performance measures from the Projec...
Supporting Problem-Solving Performance Through the Construction of Knowledge Maps
ERIC Educational Resources Information Center
Lee, Youngmin; Baylor, Amy L.; Nelson, David W.
2005-01-01
The purpose of this article is to provide five empirically-derived guidelines for knowledge map construction tools that facilitate problem solving. First, the combinational representation principle proposes that conceptual and corresponding procedural knowledge should be represented together (rather than separately) within the knowledge map.…
High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
Influence of Wake Models on Calculated Tiltrotor Aerodynamics
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2001-01-01
The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.
Requirements Document for Development of a Livermore Tomography Tools Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seetho, I. M.
In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of a LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for a LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT’s poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.
Gowing, Jeremy R; Walker, Kim N; Elmer, Shandell L; Cummings, Elizabeth A
2017-06-01
Introduction: It is important that health professionals and support staff are prepared for disasters to safeguard themselves and the community during disasters. There has been a significantly heightened focus on disasters since the terrorist attacks of September 11, 2001 in New York (USA); however, despite this, it is evident that health professionals and support staff may not be adequately prepared for disasters. Report: An integrative literature review was performed based on a keyword search of the major health databases for primary research evaluating preparedness of health professionals and support staff. The literature was quality appraised using a mixed-methods appraisal tool (MMAT), and a thematic analysis was completed to identify current knowledge and gaps. Discussion: The main themes identified were: health professionals and support staff may not be fully prepared for disasters; the most effective content and methods for disaster preparedness is unknown; and the willingness of health professionals and support staff to attend work and perform during disasters needs further evaluation. Gaps were identified to guide further research and the creation of new knowledge to best prepare for disasters. These included the need for: high-quality research to evaluate the best content and methods of disaster preparedness; inclusion of the multi-disciplinary health care team as participants; preparation for internal disasters; the development of validated competencies for preparedness; validated tools for measurement; and the importance of performance in actual disasters to evaluate preparation. The literature identified that all types of disaster preparedness activities lead to improvements in knowledge, skills, or attitude preparedness for disasters. Most studies focused on external disasters and the preparedness of medical, nursing, public health, or paramedic professionals. 
There needs to be a greater focus on the whole health care team, including allied health professionals and support staff, for both internal and external disasters. Evaluation during real disasters and the use of validated competencies and tools to deliver and evaluate disaster preparedness will enhance knowledge of best practice preparedness. However, of the 36 research articles included in this review, only five were rated at 100% using the MMAT. Due to methodological weakness of the research reviewed, the findings cannot be generalized, nor can the most effective method be determined. Gowing JR, Walker KN, Elmer SL, Cummings EA. Disaster preparedness among health professionals and support staff: what is effective? An integrative literature review. Prehosp Disaster Med. 2017;32(3):321-328.
Visualising biological data: a semantic approach to tool and database integration
Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K
2009-01-01
Motivation In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customised for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. Methods To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. 
Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. Results The toolkit, named Utopia, is freely available from . PMID:19534744
Visualising biological data: a semantic approach to tool and database integration.
Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K
2009-06-16
In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customized for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. 
Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. The toolkit, named Utopia, is freely available from http://utopia.cs.man.ac.uk/.
Stress management standards: a warning indicator for employee health.
Kazi, A; Haslam, C O
2013-07-01
Psychological stress is a major cause of lost working days in the UK. The Health and Safety Executive (HSE) has developed management standards (MS) to help organizations to assess work-related stress. The aim was to investigate the relationships between the MS indicator tool and employee health, job attitudes, work performance and environmental outcomes. The first phase involved a survey employing the MS indicator tool, General Health Questionnaire-12 (GHQ-12), job attitudes, work performance and environmental measures in a call centre from a large utility company. The second phase comprised six focus groups to investigate what employees believed contributed to their perceived stress. Three hundred and four call centre employees responded, a response rate of 85%. Significant negative correlations were found between GHQ-12 and two MS dimensions: demands (Rho = -0.211, P < 0.001) and relationships (Rho = -0.134, P < 0.05). Other dimensions showed no significant relationship with GHQ-12. Higher levels of stress were associated with reduced job performance and job motivation and increased intention to quit, but low stress levels were associated with reduced job satisfaction. Lack of management support, recognition and development opportunities were identified as sources of stress. The findings support the utility of the MS as a measure of employee attitudes and performance.
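The correlations quoted in the abstract (Rho) are Spearman rank correlations. A minimal pure-Python sketch of the statistic, computed as the Pearson correlation of average ranks, is given below; the scores are hypothetical, not the study's data:

```python
def ranks(xs):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, converted to 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical demands scores vs. GHQ-12 scores for five employees
demands = [1, 2, 3, 4, 5]
ghq12 = [2, 1, 4, 3, 5]
rho = spearman_rho(demands, ghq12)
print(round(rho, 3))  # 0.8
```

Because it operates on ranks rather than raw scores, Spearman's rho is the usual choice for ordinal questionnaire data such as the MS indicator tool and GHQ-12.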
Factors Affecting Job Motivation among Health Workers: A Study From Iran
Daneshkohan, Abbas; Zarei, Ehsan; Mansouri, Tahere; Maajani, Khadije; Ghasemi, Mehri Siyahat; Rezaeian, Mohsen
2015-01-01
Objective: Human resources are the most vital resource of any organization, determining how other resources are used to accomplish organizational goals. This research aimed to identify factors affecting health workers’ motivation in Shahid Beheshti University of Medical Sciences (SBUMS). Method: This is a cross-sectional survey conducted with participation of 212 health workers of Tehran health centers in November and December 2011. The data collection tool was a researcher-developed questionnaire that included 17 motivating factors and 6 demotivating factors and 8 questions to assess the current status of some factors. Validity and reliability of the tool were confirmed. Data were analyzed with descriptive and analytical statistical tests. Results: The main motivating factors for health workers were good management, supervisors and managers’ support and good working relationship with colleagues. On the other hand, unfair treatment, poor management and lack of appreciation were the main demotivating factors. Furthermore, 47.2% of health workers believed that existing schemes for supervision were unhelpful in improving their performance. Conclusion: Strengthening management capacities in health services can increase job motivation and improve health workers’ performance. The findings suggest that special attention should be paid to some aspects such as management competencies, social support in the workplace, treating employees fairly and performance management practices, especially supervision and performance appraisal. PMID:25948438
Factors affecting job motivation among health workers: a study from Iran.
Daneshkohan, Abbas; Zarei, Ehsan; Mansouri, Tahere; Maajani, Khadije; Ghasemi, Mehri Siyahat; Rezaeian, Mohsen
2014-11-26
Human resources are the most vital resource of any organization, determining how other resources are used to accomplish organizational goals. This research aimed to identify factors affecting health workers' motivation in Shahid Beheshti University of Medical Sciences (SBUMS). This is a cross-sectional survey conducted with participation of 212 health workers of Tehran health centers in November and December 2011. The data collection tool was a researcher-developed questionnaire that included 17 motivating factors and 6 demotivating factors and 8 questions to assess the current status of some factors. Validity and reliability of the tool were confirmed. Data were analyzed with descriptive and analytical statistical tests. The main motivating factors for health workers were good management, supervisors and managers' support and good working relationship with colleagues. On the other hand, unfair treatment, poor management and lack of appreciation were the main demotivating factors. Furthermore, 47.2% of health workers believed that existing schemes for supervision were unhelpful in improving their performance. Strengthening management capacities in health services can increase job motivation and improve health workers' performance. The findings suggest that special attention should be paid to some aspects such as management competencies, social support in the workplace, treating employees fairly and performance management practices, especially supervision and performance appraisal.
Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun
2007-01-01
Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.
How to Sustain Change and Support Continuous Quality Improvement
McQuillan, Rory; Harel, Ziv; Weizman, Adam V.; Thomas, Alison; Nesrallah, Gihad; Bell, Chaim M.; Chan, Christopher T.; Chertow, Glenn M.
2016-01-01
To achieve sustainable change, quality improvement initiatives must become the new way of working rather than something added on to routine clinical care. However, most organizational change is not maintained. In this next article in this Moving Points in Nephrology feature on quality improvement, we provide health care professionals with strategies to sustain and support quality improvement. Threats to sustainability may be identified both at the beginning of a project and when it is ready for implementation. The National Health Service Sustainability Model is reviewed as one example to help identify issues that affect long-term success of quality improvement projects. Tools to help sustain improvement include process control boards, performance boards, standard work, and improvement huddles. Process control and performance boards are methods to communicate improvement results to staff and leadership. Standard work is a written or visual outline of current best practices for a task and provides a framework to ensure that changes that have improved patient care are consistently and reliably applied to every patient encounter. Improvement huddles are short, regular meetings among staff to anticipate problems, review performance, and support a culture of improvement. Many of these tools rely on principles of visual management, which keeps systems transparent and simple so that every staff member can rapidly distinguish normal from abnormal working conditions. Even when quality improvement methods are properly applied, the success of a project still depends on contextual factors. Context refers to aspects of the local setting in which the project operates. Context affects resources, leadership support, data infrastructure, team motivation, and team performance. For these reasons, the same project may thrive in a supportive context and fail in a different context. 
To demonstrate the practical applications of these quality improvement principles, these principles are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). PMID:27016498
Case and Administrative Support Tools
Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool, used to help reduce office administrative labor costs while increasing employee effectiveness. CAST supports business functions that rely on and store Privacy Act-sensitive data (PII). Specific business processes included in CAST (and the respective PII) are: -Civil Rights Case Tracking (name, partial medical history, summary of case, and case correspondence). -Employment Law Case Tracking (name, summary of case). -Federal Tort Claims Act Incident Tracking (name, summary of incidents). -Ethics Program Support Tools and Tracking (name, partial financial history). -Summer Honors Application Tracking (name, home address, telephone number, employment history). -Workforce Flexibility Initiative Support Tools (name, alternative workplace phone number). -Resource and Personnel Management Support Tools (name, partial employment and financial history).
Advancements in Large-Scale Data/Metadata Management for Scientific Data.
NASA Astrophysics Data System (ADS)
Guntupally, K.; Devarakonda, R.; Palanisamy, G.; Frame, M. T.
2017-12-01
Scientific data often comes with complex and diverse metadata, which are critical for data discovery and for users. The Online Metadata Editor (OME) tool, developed by an Oak Ridge National Laboratory team, effectively manages diverse scientific datasets across several federal data centers, such as DOE's Atmospheric Radiation Measurement (ARM) Data Center and USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L) project. This presentation will focus mainly on recent developments and future strategies for refining the OME tool within these centers. The ARM OME is a standards-based tool (https://www.archive.arm.gov/armome) that allows scientists to create and maintain metadata about their data products. The tool has been improved with new workflows that help metadata coordinators and submitting investigators submit and review their data more efficiently. The ARM Data Center's newly upgraded Data Discovery Tool (http://www.archive.arm.gov/discovery) uses rich metadata generated by the OME to enable search and discovery of thousands of datasets, while also providing a citation generator and modern order-delivery techniques like Globus (using GridFTP), Dropbox and THREDDS. The Data Discovery Tool also supports incremental indexing, which allows users to find new data as and when they are added. The USGS CSAS&L search catalog employs a custom version of the OME (https://www1.usgs.gov/csas/ome), which has been upgraded with high-level Federal Geographic Data Committee (FGDC) validations and the ability to reserve and mint Digital Object Identifiers (DOIs). The USGS's Science Data Catalog (SDC) (https://data.usgs.gov/datacatalog) allows users to discover a myriad of science data holdings through a web portal. Recent major upgrades to the SDC and ARM Data Discovery Tool include improved harvesting performance and migration to new search software, such as Apache Solr 6.0, for serving up data/metadata to scientific communities.
Our presentation will highlight the future enhancements of these tools which enable users to retrieve fast search results, along with parallelizing the retrieval process from online and High Performance Storage Systems. In addition, these improvements to the tools will support additional metadata formats like the Large-Eddy Simulation (LES) ARM Symbiotic and Observation (LASSO) bundle data.
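The incremental-indexing behaviour described above can be illustrated with a short sketch: rather than re-harvesting the whole catalog, each run indexes only records whose modification time is newer than the previous run. This is a deliberate simplification (the record fields and dataset names are invented, and the real tools index into Apache Solr rather than a Python list):

```python
from datetime import datetime

def incremental_batch(records, last_run):
    """Select only records modified since the previous indexing run
    (the 'modified' field name here is hypothetical)."""
    return [r for r in records if r["modified"] > last_run]

records = [
    {"id": "dataset-a", "modified": datetime(2017, 6, 1)},
    {"id": "dataset-b", "modified": datetime(2017, 9, 15)},
]
batch = incremental_batch(records, last_run=datetime(2017, 7, 1))
print([r["id"] for r in batch])  # → ['dataset-b']
```

Only the changed record is re-sent to the search index, which is what keeps newly added data discoverable without a full re-harvest.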
Developing Quality Indicators for Family Support Services in Community Team-Based Mental Health Care
Olin, S. Serene; Kutash, Krista; Pollock, Michele; Burns, Barbara J.; Kuppinger, Anne; Craig, Nancy; Purdy, Frances; Armusewicz, Kelsey; Wisdom, Jennifer; Hoagwood, Kimberly E.
2013-01-01
Quality indicators for programs integrating parent-delivered family support services for children’s mental health have not been systematically developed. Increasing emphasis on accountability under the Affordable Care Act highlights the importance of quality-benchmarking efforts. Using a modified Delphi approach, quality indicators were developed for both program level and family support specialist level practices. These indicators were pilot tested with 21 community-based mental health programs. Psychometric properties of these indicators are reported; variations in program and family support specialist performance suggest the utility of these indicators as tools to guide policies and practices in organizations that integrate parent-delivered family support service components. PMID:23709287
A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment
NASA Astrophysics Data System (ADS)
Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong
Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition. LS-SVM has excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) and the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained when solving high-dimensional, large-scale databases. A U.S. commercial credit card database is used to test the efficiency of our method, and the results proved satisfactory.
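The computational appeal of LS-SVM is that training reduces to solving a linear system rather than a quadratic program. The sketch below is a bias-free, classification-only simplification, solving (K + I/C)α = y with an RBF kernel and plain Gaussian elimination; it omits the paper's KPCA layer and linear-programming formulation:

```python
import math

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_fit(X, y, C=10.0):
    """Training is one linear solve: (K + I/C) alpha = y (bias omitted)."""
    n = len(X)
    K = [[rbf(X[i], X[j]) + (1.0 / C if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    return solve(K, [float(t) for t in y])

def lssvm_predict(X, alpha, x):
    s = sum(a * rbf(xi, x) for a, xi in zip(alpha, X))
    return 1 if s >= 0 else -1

X = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0), (0.9, 1.1)]  # toy 2-class data
y = [-1, -1, 1, 1]
alpha = lssvm_fit(X, y)
print(lssvm_predict(X, alpha, (0.1, 0.0)), lssvm_predict(X, alpha, (1.0, 0.9)))  # → -1 1
```

Because every training point gets a nonzero α, plain LS-SVM loses the sparseness of standard SVM, which is exactly what the paper's two-layer construction aims to recover.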
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
Employing Cognitive Tools within Interactive Multimedia Applications.
ERIC Educational Resources Information Center
Hedberg, John; And Others
This paper describes research into the use of cognitive tools in the classroom using "Exploring the Nardoo", an information landscape designed to support student investigation. Simulations and support tools which allow multimedia reporting are embedded in the package and are supported by several metacognitive tools for the writing…
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-01-01
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, and lacking support for ranking of primers, for TaqMan probes, and for simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the best (top-1) primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272
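The "stringent filtering constraints" such servers apply can be illustrated with a toy single-primer filter on length, GC content, and melting temperature (using the simple Wallace rule). The thresholds below are typical defaults, not MRPrimerW's actual settings, and the crucial genome-wide homology test is omitted entirely:

```python
def gc_content(primer):
    return sum(primer.count(b) for b in "GC") / len(primer)

def wallace_tm(primer):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C); a rough rule for short oligos."""
    at = sum(primer.count(b) for b in "AT")
    gc = sum(primer.count(b) for b in "GC")
    return 2 * at + 4 * gc

def passes_filters(primer, length=(19, 23), gc=(0.4, 0.6), tm=(55, 65)):
    """Single-primer constraint check; thresholds are illustrative defaults,
    and no off-target homology testing is attempted here."""
    return (length[0] <= len(primer) <= length[1]
            and gc[0] <= gc_content(primer) <= gc[1]
            and tm[0] <= wallace_tm(primer) <= tm[1])

print(passes_filters("ATGCATGCATGCATGCATGC"))  # → True
print(passes_filters("ATATATATATATATATATAT"))  # → False (GC = 0, Tm = 40)
```

A server like MRPrimerW applies filters of this kind to billions of candidates and then ranks the survivors, which is why the MapReduce-scale preprocessing matters.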
Drilling Machines: Vocational Machine Shop.
ERIC Educational Resources Information Center
Thomas, John C.
The lessons and supportive information in this field tested instructional block provide a guide for teachers in developing a machine shop course of study in drilling. The document is comprised of operation sheets, information sheets, and transparency masters for 23 lessons. Each lesson plan includes a performance objective, material and tools,…
ERIC Educational Resources Information Center
Slof, B.; Erkens, G.; Kirschner, P. A.; Janssen, J.; Jaspers, J. G. M.
2012-01-01
This study investigated whether and how scripting learners' use of representational tools in a computer supported collaborative learning (CSCL)-environment fostered their collaborative performance on a complex business-economics task. Scripting the problem-solving process sequenced and made its phase-related part-task demands explicit, namely…
ERIC Educational Resources Information Center
Passerini, Katia
2007-01-01
Understanding the impact of different technological media on the achievement of instructional goals enables the delivery of a subject matter more effectively. Among the various instructional technologies that advance learning, educators and practitioners recurrently identify interactive multimedia as a very powerful tool for instruction and…
Military-Veteran Students' Perceptions of College Transition and Support Systems
ERIC Educational Resources Information Center
Pamphile, Murielle F.
2013-01-01
Military veterans preparing for new careers in the civilian world are pursuing higher educational degrees to fulfill career goals. The real-life experiences of veterans in the military are beneficial tools that can effectively enhance student veterans' academic performance and success. As veterans' enrollment continues to rise, veteran's academic…
DOT National Transportation Integrated Search
1998-10-01
The overall objectives of this study were (1) to provide basic performance evaluation of asphalt overlays on rigid pavements and (2) to provide a design tool for supporting a long-range rehabilitation plan for the US 59 corridor in the Lufkin Distr...
Handwriting Skills in Children with Spina Bifida: Assessment, Monitoring and Measurement.
ERIC Educational Resources Information Center
Hancock, Julie; Alston, Jean
1986-01-01
Case studies of three students with spina bifida (ages 8-11) illustrate an individualized six-week handwriting intervention program which stressed assessment, monitoring, and measurement of changes in writing performance. Appropriate changes in physical support (sitting position, writing surface, and choice of writing tool) are recommended. (JW)
A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study ...
Assisting Instructional Assessment of Undergraduate Collaborative Wiki and SVN Activities
ERIC Educational Resources Information Center
Kim, Jihie; Shaw, Erin; Xu, Hao; Adarsh, G. V.
2012-01-01
In this paper we examine the collaborative performance of undergraduate engineering students who used shared project documents (Wikis, Google documents) and a software version control system (SVN) to support project collaboration. We present an initial implementation of TeamAnalytics, an instructional tool that facilitates the analyses of the…
Swan: A tool for porting CUDA programs to OpenCL
NASA Astrophysics Data System (ADS)
Harvey, M. J.; De Fabritiis, G.
2011-04-01
The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware-independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance.
Program summary
Program title: Swan
Catalogue identifier: AEIH_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU Public License version 2
No. of lines in distributed program, including test data, etc.: 17 736
No. of bytes in distributed program, including test data, etc.: 131 177
Distribution format: tar.gz
Programming language: C
Computer: PC
Operating system: Linux
RAM: 256 Mbytes
Classification: 6.5
External routines: NVIDIA CUDA, OpenCL
Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit.
An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion.
Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time.
Restrictions: No support for CUDA C++ features
Running time: Nominal
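The flavour of such source-to-source translation can be seen in a toy substitution pass over a CUDA kernel string. The mapping table below covers only a few well-known CUDA/OpenCL correspondences; a real tool like Swan must also parse declarations, qualify pointer arguments with address spaces, and generate host-side entry points:

```python
import re

# A few well-known CUDA -> OpenCL correspondences (far from Swan's full rules)
MAPPING = [
    (r"__global__", "__kernel"),
    (r"__shared__", "__local"),
    (r"threadIdx\.x", "get_local_id(0)"),
    (r"blockIdx\.x", "get_group_id(0)"),
    (r"blockDim\.x", "get_local_size(0)"),
    (r"__syncthreads\(\)", "barrier(CLK_LOCAL_MEM_FENCE)"),
]

def cuda_to_opencl(src):
    """Naive token-level translation of CUDA kernel source to OpenCL."""
    for pattern, replacement in MAPPING:
        src = re.sub(pattern, replacement, src)
    return src

kernel = "__global__ void add(float *a) { a[blockIdx.x * blockDim.x + threadIdx.x] += 1.0f; }"
print(cuda_to_opencl(kernel))
```

Note that valid OpenCL would additionally require the pointer argument to carry a `__global` address-space qualifier, one of the structural rewrites a text substitution alone cannot do.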
Mikalsen, Marius; Walderhaug, Ståle
2009-01-01
The objective of the study presented here was to perform an empirical investigation of the factors affecting healthcare workers' acceptance and utilisation of e-learning in post-school healthcare education. E-learning benefits are realised when key features of e-learning are not only applied, but deemed useful, compatible with the learning process, and supportive of the overall goals of the learning process. We conducted a survey of 14 state-enrolled nurses and skilled workers within the field of healthcare in Norway. The results show that perceived compatibility and subjective norm explain system usage of the e-learning tool amongst the students. We found that the students' perception of the e-learning as compatible with the course in question had a positive effect on e-learning tool usage. We also found support for the idea that facilitating conditions and ease of use lead to the e-learning tool being considered useful.
NASA Technical Reports Server (NTRS)
Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray
1992-01-01
Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
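The container/converter/gate decomposition can be sketched in a few lines: containers hold substance quantities, and a converter moves fixed amounts between them each simulated day. The photosynthesis stoichiometry below is purely illustrative, not OCAM's actual coefficients:

```python
def step(containers, converter):
    """Apply one daily conversion: draw inputs from and add outputs to
    the named containers."""
    for (cont, substance), qty in converter["consumes"].items():
        containers[cont][substance] -= qty
    for (cont, substance), qty in converter["produces"].items():
        containers[cont][substance] += qty

containers = {"cabin": {"CO2": 10.0, "O2": 5.0}, "crops": {"biomass": 0.0}}
# Illustrative (not OCAM's) daily crop-growth stoichiometry
photosynthesis = {
    "consumes": {("cabin", "CO2"): 1.0},
    "produces": {("cabin", "O2"): 0.73, ("crops", "biomass"): 0.3},
}
for day in range(5):
    step(containers, photosynthesis)
print(containers["cabin"])  # CO2 drawn down to 5.0, O2 rises to ~8.65
```

Chaining several such converters (crop growth, combustion, digestion, fish growth) and summing container states per day gives exactly the kind of time-history mass balance the abstract describes.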
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Henry, Stephen Michael
In order to effectively plan the management and modernization of their large and diverse fleets of vehicles, Program Executive Office Ground Combat Systems (PEO GCS) and Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet, respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This paper contains a thorough documentation of the terminology, parameters, variables, and constraints that comprise the fleet management mixed integer linear programming (MILP) mathematical formulation. This paper, which is an update to the original CPAT formulation document published in 2015 (SAND2015-3487), covers the formulation of important new CPAT features.
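At its core, this kind of portfolio planning is a constrained selection problem. As a vastly simplified stand-in for CPAT's MILP (a single budget constraint, no scheduling, and invented option names and numbers), a 0/1 knapsack chooses the set of upgrades that maximizes fleet performance gain:

```python
def plan_upgrades(options, budget):
    """Choose the subset of upgrades maximizing total performance gain
    within a budget: a 0/1 knapsack, a toy stand-in for CPAT's MILP."""
    best = {0: (0.0, ())}  # spent -> (gain, chosen option names)
    for name, cost, gain in options:
        for spent, (g, chosen) in list(best.items()):
            s = spent + cost
            if s <= budget and (s not in best or best[s][0] < g + gain):
                best[s] = (g + gain, chosen + (name,))
    return max(best.values())

# Hypothetical options: (name, cost in budget units, performance gain)
options = [("tank-upgrade", 4, 7.0), ("ifv-replace", 3, 4.0), ("truck-buy", 2, 3.5)]
print(plan_upgrades(options, budget=6))  # → (10.5, ('tank-upgrade', 'truck-buy'))
```

The real formulation adds time-phased budgets, industrial-base and testing constraints, and many more decision variables, which is why a general MILP solver is required rather than this dynamic program.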
Introducing a design exigency to promote student learning through assessment: A case study.
Grealish, Laurie A; Shaw, Julie M
2018-02-01
Assessment technologies are often used to classify student and newly qualified nurse performance as 'pass' or 'fail', with little attention to how these decisions are achieved. Examining the design exigencies of classification technologies, such as performance assessment technologies, provides opportunities to explore flexibility and change in the process of using those technologies. Evaluate an established assessment technology for nursing performance as a classification system. A case study analysis that is focused on the assessment approach and a priori design exigencies of performance assessment technology, in this case the Australian Nursing Standards Assessment Tool 2016. Nurse assessors are required to draw upon their expertise to judge performance, but that judgement is described as a source of bias, creating confusion. The definition of satisfactory performance is 'ready to enter practice'. To pass, the performance on each criterion must be at least satisfactory, indicating to the student that no further improvement is required. The Australian Nursing Standards Assessment Tool 2016 does not have a third 'other' category, which is usually found in classification systems. Introducing a 'not yet competent' category and creating a two-part, mixed methods assessment process can improve the Australian Nursing Standards Assessment Tool 2016 assessment technology. Using a standards approach in the first part, judgement is valued and can generate learning opportunities across a program. Using a measurement approach in the second part, student performance can be 'not yet competent' but still meet criteria for year level performance and a graded pass. Subjecting the Australian Nursing Standards Assessment Tool 2016 assessment technology to analysis as a classification system provides opportunities for innovation in design. 
This design innovation has the potential to support students who move between programs and clinicians who assess students from different universities. Copyright © 2017 Elsevier Ltd. All rights reserved.
LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.
2017-08-01
MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
Vega-Constellation Tools to Analyze Hyperspectral Images
NASA Astrophysics Data System (ADS)
Savorskiy, V.; Loupian, E.; Balashov, I.; Kashnitskii, A.; Konstantinova, A.; Tolpin, V.; Uvarov, I.; Kuznetsov, O.; Maklakov, S.; Panova, O.; Savchenko, E.
2016-06-01
Creating high-performance means of managing massive hyperspectral data (HSD) arrays is a pressing challenge, particularly when dealing with disparate information resources. To address this problem, the present work develops tools for working with HSD in a distributed information infrastructure, i.e. primarily for use in remote access mode. The main feature of the presented approach is the development of remotely accessed services that allow users both to conduct search and retrieval procedures on HSD sets and to analyze and process HSD in remote mode. These services were implemented within the VEGA-Constellation family of information systems, which were extended by adding tools oriented to support the study of certain classes of natural objects by exploring their HSD. The developed tools provide capabilities to conduct analysis of such objects as vegetation canopies (forest and agriculture), open soils, forest fires, and areas of thermal anomalies. The developed software tools were successfully tested on Hyperion data sets.
Multidisciplinary Optimization for Aerospace Using Genetic Optimization
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Hahn, Edward E.; Herrera, Claudia Y.
2007-01-01
In support of the ARMD guidelines, NASA's Dryden Flight Research Center is developing a multidisciplinary design and optimization tool. This tool will leverage existing tools and practices, and allow the easy integration and adoption of new state-of-the-art software. Optimization has made its way into many mainstream applications. For example, NASTRAN(TradeMark) has its solution sequence 200 for Design Optimization, and MATLAB(TradeMark) has an Optimization Toolbox. Other packages, such as the ZAERO(TradeMark) aeroelastic panel code and the CFL3D(TradeMark) Navier-Stokes solver, have no built-in optimizer. The goal of the tool development is to generate a central executive capable of using disparate software packages in a cross-platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. A provided figure (Figure 1) shows a typical set of tools and their relation to the central executive. Optimization can take place within each individual tool, or in a loop between the executive and the tool, or both.
Students' Use of Electronic Support Tools in Mathematics
ERIC Educational Resources Information Center
Crawford, Lindy; Higgins, Kristina N.; Huscroft-D'Angelo, Jacqueline N.; Hall, Lindsay
2016-01-01
This study investigated students' use of electronic support tools within a computer-based mathematics program. Electronic support tools are tools, such as hyperlinks or calculators, available within many computer-based instructional programs. A convenience sample of 73 students in grades 4-6 was selected to participate in the study. Students…
NASA Astrophysics Data System (ADS)
Buck, J. A.; Underhill, P. R.; Morelli, J.; Krause, T. W.
2017-02-01
Degradation of nuclear steam generator (SG) tubes and support structures can result in a loss of reactor efficiency. Regular in-service inspection, by conventional eddy current testing (ECT), permits detection of cracks, measurement of wall loss, and identification of other SG tube degradation modes. However, ECT is challenged by overlapping degradation modes, such as might occur for SG tube fretting accompanied by tube offset within a corroding ferromagnetic support structure. Pulsed eddy current (PEC) is an emerging technology examined here for inspection of Alloy-800 SG tubes and associated carbon steel drilled support structures. Support structure hole size was varied to simulate uniform corrosion, while the SG tube was offset relative to the hole axis. PEC measurements were performed using a single driver with an 8 pick-up coil configuration in the presence of flat-bottom rectangular frets as an overlapping degradation mode. A modified principal component analysis (MPCA) was performed on the time-voltage data in order to reduce data dimensionality. The MPCA scores were then used to train a support vector machine (SVM) that simultaneously targeted four independent parameters: support structure hole size, tube off-centering in two dimensions, and fret depth. The support vector machine was trained, tested, and validated on experimental data. Results were compared with a previously developed artificial neural network (ANN) trained on the same data. Estimates of tube position showed comparable results between the two machine learning tools. However, the ANN produced better estimates of hole inner diameter and fret depth. The better results from the ANN analysis were attributed to challenges associated with the SVM when non-constant variance is present in the data.
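The dimensionality-reduction step can be illustrated with the leading principal component computed by power iteration on mean-centered data. This is a minimal stand-in for the paper's modified PCA, which retains multiple components before feeding the scores to the SVM:

```python
def first_pc(data, iters=100):
    """Leading principal component via power iteration on mean-centered
    data (a simplified stand-in for the paper's modified PCA)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    v = [1.0] * d
    for _ in range(iters):
        # w = X^T (X v): covariance-vector product without forming the matrix
        Xv = [sum(xi[j] * v[j] for j in range(d)) for xi in X]
        w = [sum(X[i][j] * Xv[i] for i in range(n)) for j in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v, means

def pc_score(row, v, means):
    """Project one measurement onto the leading component."""
    return sum((x - m) * c for x, m, c in zip(row, means, v))

# Hypothetical time-voltage features varying mainly along one direction
data = [[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [3.0, 0.3]]
v, means = first_pc(data)
```

Each measurement collapses to a handful of scores like `pc_score`, and it is these low-dimensional scores, not the raw time-voltage traces, that train the SVM or ANN.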
New decision support tool for acute lymphoblastic leukemia classification
NASA Astrophysics Data System (ADS)
Madhukar, Monica; Agaian, Sos; Chronopoulos, Anthony T.
2012-03-01
In this paper, we develop a new decision support tool to improve treatment intensity choice in childhood ALL. The developed system includes different methods to accurately measure cell properties in microscope blood film images. The blood images undergo a series of pre-processing steps, including color correlation and contrast enhancement. By performing K-means clustering on the resultant images, the nuclei of the cells under consideration are obtained. Shape features and texture features are then extracted for classification. The system is further tested on the classification of spectra measured from the cell nuclei in blood samples in order to distinguish normal cells from those affected by acute lymphoblastic leukemia. The results show that the proposed system robustly segments and classifies acute lymphoblastic leukemia based on complete microscopic blood images.
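The K-means segmentation step works by clustering pixel values so that dark nuclei separate from the brighter background. A minimal one-dimensional Lloyd's-algorithm sketch (the intensity values are hypothetical; the paper clusters color-corrected, contrast-enhanced images):

```python
def kmeans(points, k=2, iters=20):
    """Plain Lloyd's k-means on scalar pixel intensities (illustrative)."""
    centers = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[idx].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Hypothetical grayscale intensities: dark nuclei pixels vs bright background
pixels = [30, 35, 32, 28, 200, 210, 205, 198, 31, 207]
print(sorted(kmeans(pixels)))  # → [31.2, 204.0]
```

Pixels assigned to the darker cluster form the nucleus mask, from which the shape and texture features are then extracted.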
NASA Technical Reports Server (NTRS)
Smith, Philip J.; Mccoy, C. Elaine
1991-01-01
The goals of this research were to develop design concepts to support the task of enroute flight planning and, within this context, to explore and evaluate general design concepts and principles to guide the development of cooperative problem solving systems. A detailed model of the cognitive processes involved in flight planning is to be developed. Included in this model will be the identification of individual differences among subjects; of particular interest will be differences between pilots and dispatchers. The effect on performance of tools that support planning at different levels of abstraction will also be studied. In order to conduct this research, the Flight Planning Testbed (FPT) was developed: a fully functional testbed environment for studying advanced design concepts for tools to aid in flight planning.
HNS-MS : Improving Member States preparedness to face an HNS pollution of the Marine System
NASA Astrophysics Data System (ADS)
Legrand, Sebastien; Le Floch, Stéphane; Aprin, Laurent; Parthenay, Valérie; Donnay, Eric; Parmentier, Koen; Ovidio, Fabrice; Schallier, Ronny; Poncet, Florence; Chataing, Sophie; Poupon, Emmanuelle; Hellouvry, Yann-Hervé
2016-04-01
When dealing with a HNS pollution incident, one of the priority requirements is the identification of the hazard and an assessment of the risk posed to the public and responder safety, the environment and socioeconomic assets upon which a state or coastal community depend. The primary factors which determine the safety, environmental and socioeconomic impact of the released substance(s) relate to their physico-chemical properties and fate in the environment. Until now, preparedness actions at various levels have primarily aimed at classifying the general environmental or public health hazard of an HNS, or at performing a risk analysis of HNS transported in European marine regions. Operational datasheets have been (MIDSIS-TROCS) or are being (MAR-CIS) developed, collating detailed, substance-specific information for responders and covering information needs at the first stage of an incident. However, contrary to oil pollution preparedness and response tools, only a few decision-support tools used by Member State authorities (Coastguard agencies or other) integrate 3D models that are able to simulate the drift, fate and behaviour of HNS spills in the marine environment. When they do, they usually consider simplified or steady-state environmental conditions. Moreover, the above-mentioned available HNS information is currently not sufficiently detailed or not suitably classified to be used as an input for an advanced HNS support decision tool. HNS-MS aims at developing a 'one-stop shop' integrated HNS decision-support tool that is able to predict the drift, behaviour and fate of HNS spills under realistic environmental conditions, and at providing key product information - drawing upon and in complement to existing studies and databases - to improve the understanding and evaluation of a HNS spill situation in the field and the environmental and safety-related issues at stake.
The 3D HNS drift and fate model and decision-support tool will also be useful at the preparedness stage. The expected results will be an operational HNS decision-support tool (prototype) for the Bonn Agreement area that can also be viewed as a demonstrator tool for other European marine regions. The developed tool will have a similar operational level as OSERIT, the Belgian oil spill drift model. The HNS decision-support tool will integrate the following features: 1. A database containing the physico-chemical parameters needed to compute the behaviour in the marine environment of 100+ relevant HNS; 2. A database of environmental and socioeconomic HNS-sensitive features; 3. A three-dimensional HNS spill drift and fate model able to simulate HNS behaviour in the marine environment (including floaters, sinkers, evaporators and dissolvers); 4. A user-friendly web-based interface allowing Coastguard stations to launch a HNS drift simulation and visualize post-processed results in support of an incident evaluation and decision-making process. In this contribution, we will present the methodology followed to develop these four features.
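The drift component of such a model is typically Lagrangian: each particle is advected by the current and perturbed by a random-walk term representing turbulent diffusion. A minimal 2-D sketch (the velocities, diffusivity, and time step are arbitrary illustrative values, not HNS-MS parameters):

```python
import random

def drift(particles, current, dt, diffusivity, steps, seed=42):
    """2-D Lagrangian drift: advection by a constant (u, v) current plus a
    Gaussian random walk for turbulent diffusion (illustrative only)."""
    rng = random.Random(seed)
    sigma = (2 * diffusivity * dt) ** 0.5  # random-walk step scale
    for _ in range(steps):
        particles = [(x + current[0] * dt + rng.gauss(0, sigma),
                      y + current[1] * dt + rng.gauss(0, sigma))
                     for x, y in particles]
    return particles

spill = [(0.0, 0.0)] * 100  # point release of 100 particles
cloud = drift(spill, current=(0.5, 0.1), dt=60.0, diffusivity=1.0, steps=10)
cx = sum(x for x, _ in cloud) / len(cloud)  # cloud centre, ~300 m downstream
```

A full HNS model layers evaporation, dissolution, and density-driven sinking on top of this transport step, using the substance database to select the behaviour class (floater, sinker, evaporator, dissolver).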
Boscardin, Christy; Fergus, Kirkpatrick B; Hellevig, Bonnie; Hauer, Karen E
2017-11-09
Easily accessible and interpretable performance data constitute critical feedback for learners that facilitates informed self-assessment and learning planning. To provide this feedback, there has been a proliferation of educational dashboards in recent years. An educational (learner) dashboard systematically delivers timely and continuous feedback on performance and can provide easily visualized and interpreted performance data. In this paper, we provide practical tips for developing a functional, user-friendly individual learner performance dashboard, along with a review of the literature on dashboard development, assessment theory, and users' perspectives. Considering key design principles and maximizing current technological advances in data visualization techniques can increase dashboard utility and enhance the user experience. By bridging current technology with assessment strategies that support learning, educators can continue to improve the field of learning analytics and the design of information management tools such as dashboards in support of improved learning outcomes.
UTOPIA-User-Friendly Tools for Operating Informatics Applications.
Pettifer, S R; Sinnott, J R; Attwood, T K
2004-01-01
Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements.
Figueiro, Ana Claudia; de Araújo Oliveira, Sydia Rosana; Hartz, Zulmira; Couturier, Yves; Bernier, Jocelyne; do Socorro Machado Freire, Maria; Samico, Isabella; Medina, Maria Guadalupe; de Sa, Ronice Franco; Potvin, Louise
2017-03-01
Public health interventions are increasingly represented as complex systems. Research tools for capturing the dynamics of intervention processes, however, are practically non-existent. This paper describes the development and proof-of-concept process of an analytical tool, the critical event card (CEC), which supports the representation and analysis of complex interventions' evolution based on critical events. Drawing on actor-network theory (ANT), we developed and field-tested the tool using three innovative health interventions in northeastern Brazil. The interventions aimed to promote health equity through intersectoral approaches, were engaged in participatory evaluation and were linked to professional training programs. The development of the CEC involved practitioners and researchers from the projects. Proof of concept was based on document analysis, face-to-face interviews and focus groups. The CEC's analytical categories allow critical events to be identified and described as milestones in the evolution of complex interventions. The categories are (1) event description; (2) actants (human and non-human) involved; (3) interactions between actants; (4) mediations performed; (5) actions performed; (6) inscriptions produced; and (7) consequences for the intervention. The CEC provides a tool to analyze and represent intersectoral interventions' complex and dynamic evolution.
Using component technologies for web based wavelet enhanced mammographic image visualization.
Sakellaropoulos, P; Costaridou, L; Panayiotakis, G
2000-01-01
The poor contrast detectability of mammography can be dealt with by domain-specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that, at present, web browsers do not support domain-specific medical image visualization. To address desktop client access, the tool was redesigned by exploring component technologies, enabling the integration of stand-alone, domain-specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real-time wavelet-based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real-time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
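The wavelet enhancement step can be sketched with a single-level 2D Haar transform standing in for the tool's (unspecified) fast wavelet transform; the threshold and gain values below are illustrative assumptions, not the tool's parameters.

```python
import numpy as np

def haar2d(img):
    """One level of a 2D Haar wavelet transform (image sides must be even)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0      # approximation band
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0      # horizontal detail
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0      # vertical detail
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0      # diagonal detail
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    h, w = ll.shape
    a = np.empty((h, 2 * w)); d = np.empty((h, 2 * w))
    a[:, 0::2] = ll + lh; a[:, 1::2] = ll - lh
    d[:, 0::2] = hl + hh; d[:, 1::2] = hl - hh
    img = np.empty((2 * h, 2 * w))
    img[0::2, :] = a + d; img[1::2, :] = a - d
    return img

def soft(x, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise(img, t=0.1, gain=1.5):
    """Suppress small detail coefficients (noise), amplify the rest (contrast)."""
    ll, lh, hl, hh = haar2d(img)
    return ihaar2d(ll, gain * soft(lh, t), gain * soft(hl, t), gain * soft(hh, t))
```

Soft-thresholding the detail bands suppresses noise, while the gain amplifies the surviving detail, which is the wavelet analogue of contrast enhancement for subtle mammographic features.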
NASA Technical Reports Server (NTRS)
Edwards, Daryl A.
2008-01-01
Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed in the early 1960s to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving Earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Exhaust system performance, including understanding the present facility capabilities, is the primary focus of this work. A variety of approaches and analytical tools are being employed to gain this understanding. This presentation discusses some of the challenges in applying these tools to this project and the expected facility configuration to support the varying customer needs.
Decision Support for Resilient Communities: EPA’s Watershed Management Optimization Support Tool
The U.S. EPA Atlantic Ecology Division is releasing version 3 of the Watershed Management Optimization Support Tool (WMOST v3) in February 2018. WMOST is a decision-support tool that facilitates integrated water resources management (IWRM) by communities and watershed organizati...
Challenges and strategies in applying performance measurement to federal public health programs.
DeGroff, Amy; Schooley, Michael; Chapel, Thomas; Poister, Theodore H
2010-11-01
Performance measurement is widely accepted in public health as an important management tool supporting program improvement and accountability. However, several challenges impede developing and implementing performance measurement systems at the federal level, including the complexity of public health problems that reflect multiple determinants and involve outcomes that may take years to achieve, the decentralized and networked nature of public health program implementation, and the lack of reliable and consistent data sources and other issues related to measurement. All three of these challenges hinder the ability to attribute program results to specific public health program efforts. The purpose of this paper is to explore these issues in detail and offer potential solutions that support the development of robust and practical performance measures to meet the needs for program improvement and accountability. Adapting performance measurement to public health programs is both an evolving science and art. Through the strategies presented here, appropriate systems can be developed and monitored to support the production of meaningful data that will inform effective decision making at multiple levels. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
An effective risk assessment system is needed to address the threat posed by an active or passive insider who, acting alone or in collusion, could attempt diversion or theft of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) is a self-assessment or inspection tool utilizing probabilistic risk assessment (PRA) methodology to calculate the system effectiveness of a nuclear facility's material protection, control, and accountability (MPC&A) system. The MSET process is divided into four distinct and separate parts: (1) Completion of the questionnaire that assembles information about the operations of every aspect of the MPC&A system; (2) Conversion of questionnaire data into numeric values associated with risk; (3) Analysis of the numeric data utilizing the MPC&A fault tree and the SAPHIRE computer software; and (4) Self-assessment using the MSET reports to perform the effectiveness evaluation of the facility's MPC&A system. The process should lead to confirmation that mitigating features of the system effectively minimize the threat, or it could lead to the conclusion that system improvements or upgrades are necessary to achieve acceptable protection against the threat. If the need for system improvements or upgrades is indicated when the system is analyzed, MSET provides the capability to evaluate potential or actual system improvements or upgrades. A facility's MC&A system can be evaluated at a point in time. The system can be reevaluated after upgrades are implemented or after other system changes occur. The total system or specific subareas within the system can be evaluated. Areas of potential system improvement can be assessed to determine where the most beneficial and cost-effective improvements should be made. Analyses of risk importance factors show that sustainability is essential for optimal performance and reveals where performance degradation has the greatest impact on total system risk.
The risk importance factors show the amount of risk reduction achievable with potential upgrades and the amount of risk reduction achieved after upgrades are completed. Applying the risk assessment tool gives support to budget prioritization by showing where budget support levels must be sustained for MC&A functions most important to risk. Results of the risk assessment are also useful in supporting funding justifications for system improvements that significantly reduce system risk. The functional model, the system risk assessment tool, and the facility evaluation questionnaire are valuable educational tools for MPC&A personnel. These educational tools provide a framework for ongoing dialogue between organizations regarding the design, development, implementation, operation, assessment, and sustainability of MPC&A systems. An organization considering the use of MSET as an analytical tool for evaluating the effectiveness of its MPC&A system will benefit from conducting a complete MSET exercise at an existing nuclear facility.
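The AND/OR fault-tree arithmetic underlying such a PRA can be sketched as follows. The gate structure and probabilities are invented for illustration; SAPHIRE evaluates full MPC&A fault trees with cut-set methods rather than this toy closed form.

```python
def and_gate(*probs):
    """All inputs must fail (independent events)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """At least one input fails (independent events)."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

def top_event(p_acct_miss, p_inv_miss, p_path_a, p_path_b):
    """Hypothetical top event: a diversion attempt starts down either
    insider path AND both detection layers (accounting check, physical
    inventory) miss it."""
    return and_gate(or_gate(p_path_a, p_path_b),
                    and_gate(p_acct_miss, p_inv_miss))

# Birnbaum risk importance of the accounting layer: the spread between the
# top-event probability with that layer always failing vs never failing.
b_acct = (top_event(1.0, 0.2, 0.01, 0.005)
          - top_event(0.0, 0.2, 0.01, 0.005))
```

Ranking basic events by importance measures like this is what points budget support at the MC&A functions "most important to risk."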
Sea Level Rise Decision Support Tools for Adaptation Planning in Vulnerable Coastal Communities
NASA Astrophysics Data System (ADS)
Rozum, J. S.; Marcy, D.
2015-12-01
NOAA is involved in a myriad of climate related research and projects that help decision makers and the public understand climate science as well as climate change impacts. The NOAA Office for Coastal Management (OCM) provides data, tools, trainings and technical assistance to coastal resource managers. Beginning in 2011, NOAA OCM began developing a sea level rise and coastal flooding impacts viewer which provides nationally consistent data sets and analyses to help communities with coastal management goals such as: understanding and communicating coastal flood hazards, performing vulnerability assessments and increasing coastal resilience, and prioritizing actions for different inundation/flooding scenarios. The Viewer is available on NOAA's Digital Coast platform (coast.noaa.gov/digitalcoast/tools/slr). In this presentation we will share the lessons learned from our work with coastal decision-makers on the role of coastal flood risk data and tools in helping to shape future land use decisions and policies. We will also focus on a recent effort in California to help users understand the similarities and differences of a growing array of sea level rise decision support tools. NOAA staff and other partners convened a workshop entitled, "Lifting the Fog: Bringing Clarity to Sea Level Rise and Shoreline Change Models and Tools," which was attended by tool developers, science translators and coastal managers with the goal to create a collaborative communication framework to help California coastal decision-makers navigate the range of available sea level rise planning tools, and to inform tool developers of future planning needs. A sea level rise tools comparison matrix will be demonstrated. This matrix was developed as part of this effort and has been expanded to many other states via a partnership with NOAA, Climate Central, and The Nature Conservancy.
Boundary Layer Transition Results From STS-114
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Horvath, Thomas J.; Cassady, Amy M.; Kirk, Benjamin S.; Wang, K. C.; Hyatt, Andrew J.
2006-01-01
The tool for predicting the onset of boundary layer transition from damage to and/or repair of the thermal protection system developed in support of Shuttle Return to Flight is compared to the STS-114 flight results. The Boundary Layer Transition (BLT) Tool is part of a suite of tools that analyze the aerothermodynamic environment of the local thermal protection system to allow informed disposition of damage for making recommendations to fly as is or to repair. Using mission specific trajectory information and details of each damage site or repair, the expected time of transition onset is predicted to help determine the proper aerothermodynamic environment to use in the subsequent thermal and stress analysis of the local structure. The boundary layer transition criteria utilized for the tool were developed from ground-based measurements to account for the effect of both protuberances and cavities and have been calibrated against flight data. Computed local boundary layer edge conditions provided the means to correlate the experimental results and then to extrapolate to flight. During STS-114, the BLT Tool was utilized and was part of the decision making process to perform an extravehicular activity to remove the large gap fillers. The role of the BLT Tool during this mission, along with the supporting information that was acquired for the on-orbit analysis, is reviewed. Once the large gap fillers were removed, all remaining damage sites were cleared for reentry as is. Post-flight analysis of the transition onset time revealed excellent agreement with BLT Tool predictions.
Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2
NASA Technical Reports Server (NTRS)
Debrunner, Linda S.
1994-01-01
The Engineering Analysis and Data System II (EADS II) (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased the computing capabilities over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System (VMCS), a Common File System (CFS), and a Common Output System (COS), as well as an Image Processing Station, Mini Super Computers, and Intelligent Workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray YMP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet the science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tools to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization. The user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on the system's performance were observed. In this paper, the PerfStat tool is described, then its use with EADS II is outlined briefly. Next, the evaluation of the VPCS, as well as the modifications made to the system, are described. Finally, conclusions are drawn and recommendations for future work are outlined.
Heuer, Herbert; Hegele, Mathias
2010-12-01
Mechanical tools are transparent in the sense that their input-output relations can be derived from their perceptible characteristics. Modern technology creates more and more tools that lack mechanical transparency, such as in the control of the position of a cursor by means of a computer mouse or some other input device. We inquired whether an enhancement of transparency by means of presenting the shaft of a virtual sliding lever, which governed the transformation of hand position into cursor position, supports performance of aimed cursor movement and the acquisition of an internal model of the transformation in both younger and older adults. Enhanced transparency resulted in an improvement of visual closed-loop control in terms of movement time and curvature of cursor paths. The movement-time improvement was more pronounced at older working age than at younger working age, so that the enhancement of transparency can serve as a means to mitigate age-related declines in performance. Benefits for the acquisition of an internal model of the transformation and of explicit knowledge were absent. Thus, open-loop control in this task did not profit from enhanced mechanical transparency. These findings strongly suggest that environmental support of transparency of the effects of input devices on controlled systems might be a powerful tool to support older users. Enhanced transparency may also improve simulator-based training by increasing motivation, even if training benefits do not transfer to situations without enhanced transparency. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Gross, Douglas P; Zhang, Jing; Steenstra, Ivan; Barnsley, Susan; Haws, Calvin; Amell, Tyler; McIntosh, Greg; Cooper, Juliette; Zaiane, Osmar
2013-12-01
To develop a classification algorithm and accompanying computer-based clinical decision support tool to help categorize injured workers toward optimal rehabilitation interventions based on unique worker characteristics. Population-based historical cohort design. Data were extracted from a Canadian provincial workers' compensation database on all claimants undergoing work assessment between December 2009 and January 2011. Data were available on: (1) numerous personal, clinical, occupational, and social variables; (2) type of rehabilitation undertaken; and (3) outcomes following rehabilitation (receiving time loss benefits or undergoing repeat programs). Machine learning, concerned with the design of algorithms to discriminate between classes based on empirical data, was the foundation of our approach to build a classification system with multiple independent and dependent variables. The population included 8,611 unique claimants. Subjects were predominantly employed (85 %) males (64 %) with diagnoses of sprain/strain (44 %). Baseline clinician classification accuracy was high (ROC = 0.86) for selecting programs that lead to successful return-to-work. Classification performance for machine learning techniques outperformed the clinician baseline classification (ROC = 0.94). The final classifiers were multifactorial and included the variables: injury duration, occupation, job attachment status, work status, modified work availability, pain intensity rating, self-rated occupational disability, and 9 items from the SF-36 Health Survey. The use of machine learning classification techniques appears to have resulted in classification performance better than clinician decision-making. The final algorithm has been integrated into a computer-based clinical decision support tool that requires additional validation in a clinical sample.
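The ROC comparison reported above can be sketched with a stdlib-only AUC in its Mann-Whitney form; the synthetic labels and the two noise levels standing in for clinician-like vs model-like discrimination are invented for illustration, not derived from the study data.

```python
import random

def auc(labels, scores):
    """ROC AUC as the probability that a randomly chosen positive case is
    scored above a randomly chosen negative one (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

rng = random.Random(1)
labels = [rng.randint(0, 1) for _ in range(2000)]
# noisier scores mimic weaker discrimination (cf. ROC 0.86 vs 0.94)
clinician_like = [y + rng.gauss(0, 1.2) for y in labels]
model_like     = [y + rng.gauss(0, 0.5) for y in labels]
```

With less score noise the model-like scorer separates the classes more cleanly and its AUC comes out higher, which is the shape of the 0.86-vs-0.94 result the study reports.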
Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria
2017-06-06
To explore healthcare staff's and managers' perceptions of how and when discrete event simulation modelling can be used as decision support in improvement efforts. Two focus group discussions were performed. Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Healthcare staff and managers (n=13) from the two settings. Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Categories from the content analysis are presented according to the research questions of how and when simulation modelling can assist healthcare improvement. Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement suggests a broader application and value of simulation modelling than previously reported. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
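A discrete event simulation of the kind discussed can be sketched in a few lines of stdlib Python; the clinic queue, the arrival and service rates, and the staffing levels below are invented for illustration, not taken from the study's models.

```python
import heapq
import random

def simulate_clinic(n_patients, n_doctors, mean_arrival, mean_service, seed=0):
    """Minimal discrete event simulation of a clinic queue: patients arrive,
    wait for the first free doctor, are served; returns the mean wait."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_arrival)   # exponential inter-arrivals
        arrivals.append(t)
    # each doctor is represented by the time at which they next become free
    free_at = [0.0] * n_doctors
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrive in arrivals:
        doctor_free = heapq.heappop(free_at)       # earliest-free doctor
        start = max(arrive, doctor_free)
        total_wait += start - arrive
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return total_wait / n_patients

# a "what if" question: is a third doctor worth it? (minutes)
wait_2 = simulate_clinic(5000, 2, mean_arrival=5.0, mean_service=8.0)
wait_3 = simulate_clinic(5000, 3, mean_arrival=5.0, mean_service=8.0)
```

This is the visualisation-and-testing role the participants describe: staff can probe staffing or scheduling changes by re-running the model instead of experimenting on the real clinic.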
The dark side of the immunohistochemical moon: industry.
Kalyuzhny, Alexander E
2009-12-01
Modern biological research is dependent on tools developed and provided by commercial suppliers, and antibodies for immunohistochemistry are among the most frequently used of these tools. Not all commercial antibodies perform as expected, however; this problem leads researchers to waste time and money when using antibodies that perform inadequately. Different commercial suppliers offer antibodies of varying degrees of quality and, in some cases, are unable to provide expert technical support for the immunohistochemical use of their antibodies. This article briefly describes the production of commercial antibodies from the manufacturer's perspective and presents some guidelines for choosing appropriate commercial antibodies for immunohistochemistry. Additionally, the article suggests steps to establish mutually beneficial relationships between commercial antibody suppliers and researchers who use them.
Cementitious Barriers Partnership FY2013 End-Year Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G. P.; Langton, C. A.; Burns, H. H.
2013-11-01
In FY2013, the Cementitious Barriers Partnership (CBP) demonstrated continued tangible progress toward fulfilling the objective of developing a set of software tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In November 2012, the CBP released “Version 1.0” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. In addition, the CBP completed development of new software for the “Version 2.0” Toolbox to be released in early FY2014 and demonstrated use of the Version 1.0 Toolbox on DOE applications. The current primary software components in both Versions 1.0 and 2.0 are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. The CBP Software Toolbox Version 1.0 supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. Version 2.0 includes the additional analysis of chloride attack and dual regime flow and contaminant migration in fractured and non-fractured cementitious material. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.).
THAMES is a planned future CBP Toolbox component focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. Two CBP software demonstrations were conducted in FY2013, one to support the Saltstone Disposal Facility (SDF) at SRS and the other on a representative Hanford high-level waste tank. The CBP Toolbox demonstration on the SDF provided analysis of the most probable degradation mechanisms of the cementitious vault enclosure caused by sulfate and carbonation ingress. This analysis was documented and resulted in the issuance of an SDF Performance Assessment Special Analysis by Liquid Waste Operations this fiscal year. The two new software tools supporting chloride attack and dual-regime flow will provide additional degradation tools to better evaluate the performance of DOE and commercial cementitious barriers. The CBP SRNL experimental program produced two patent applications and field data that will be used in the development and calibration of CBP software tools being developed in FY2014. The CBP software and simulation tools differ from other efforts in that all the tools are based upon specific and relevant experimental research on cementitious materials utilized in DOE applications. The CBP FY2013 program involved continuing research to improve and enhance the simulation tools as well as developing new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are also ongoing and will continue into FY2014 to quantify and reduce the uncertainty associated with performance assessments. This end-year report summarizes FY2013 software development efforts and the various experimental programs that are providing data for calibration and validation of the CBP-developed software.
Kim, Seung; Lee, Eun Hye; Yang, Hye Ran
2018-06-01
The prevalence of malnutrition among hospitalized children ranges between 12% and 24%. Although the consequences of hospital malnutrition are enormous, it is often unrecognized and untreated. The aim of this study was to identify the current status of in-hospital nutrition support for children in South Korea by carrying out a nationwide hospital-based survey. Out of 345 general and tertiary hospitals in South Korea, a total of 53 institutes with pediatric gastroenterologists and more than 10 pediatric inpatients were selected. A questionnaire was developed by the nutrition committee of the Korean Society of Pediatric Gastroenterology, Hepatology and Nutrition. The questionnaires were sent to pediatric gastroenterologists in each hospital, and the survey was conducted by e-mail. Forty hospitals (75.5%) responded to the survey; 23 of them were tertiary hospitals, and 17 of them were general hospitals. Only 21 hospitals (52.5%) had all the required nutritional support personnel (including pediatrician, nutritionist, pharmacist, and nurse) assigned to pediatric patients. Routine nutritional screening was performed in 22 (55.0%) hospitals on admission, which was lower than that in adult patients (65.8%). Nutritional screening tools varied among hospitals; 33 of 40 (82.5%) hospitals used their own screening tools. The most frequently used nutritional assessment parameters were weight, height, hemoglobin, and serum albumin levels. In our nationwide hospital-based survey, the most frequently reported barriers to nutritional support in hospitals were lack of manpower and excessive workload, followed by insufficient knowledge and experience. Although this nationwide hospital-based survey targeted general and tertiary hospitals with pediatric gastroenterologists, manpower and medical resources for nutritional support were still insufficient for hospitalized children, and nutritional screening was not routinely performed in many hospitals.
More attention to hospital malnutrition and additional national policies for nutritional support in hospitals are required to ensure appropriate nutritional management of hospitalized pediatric patients.
NASA Enterprise Visual Analysis
NASA Technical Reports Server (NTRS)
Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck
2007-01-01
NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roger Lew; Ronald L. Boring; Thomas A. Ulrich
Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.
Graphite fiber reinforced structure for supporting machine tools
Knight, Jr., Charles E.; Kovach, Louis; Hurst, John S.
1978-01-01
Machine tools utilized in precision machine operations require tool support structures which exhibit minimal deflection, thermal expansion and vibration characteristics. The tool support structure of the present invention is a graphite fiber reinforced composite in which layers of the graphite fibers or yarn are disposed in a 0°/90° pattern and bonded together with an epoxy resin. The finished composite possesses a low coefficient of thermal expansion and a substantially greater elastic modulus, stiffness-to-weight ratio, and damping factor than a conventional steel tool support utilized in similar machining operations.
Research into software executives for space operations support
NASA Technical Reports Server (NTRS)
Collier, Mark D.
1990-01-01
Research concepts pertaining to a software (workstation) executive which will support a distributed processing command and control system characterized by high-performance graphics workstations used as computing nodes are presented. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. In order to solve these problems, allow the environment to function as an integrated system, and present a functional development environment to application programmers, it is necessary to develop an additional layer of software. This 'executive' software integrates the system, provides real-time capabilities, and provides the tools necessary to support the application requirements.
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1993-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs which were designed and added to the collection of tools which comprise the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues for these two tools are discussed; the discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.
Faye, Alexandrine; Jacquin-Courtois, Sophie; Osiurak, François
2018-03-01
The purpose of this study was to deepen our understanding of the cognitive bases of human tool use based on the technical reasoning hypothesis (i.e., the reasoning-based approach). This approach assumes that tool use is supported by the ability to reason about an object's physical properties (e.g., length, weight, strength, etc.) to perform mechanical actions (e.g., lever). In this framework, an important issue is to understand whether left-brain-damaged (LBD) individuals with tool-use deficits are still able to estimate the physical properties of objects necessary to use a tool. Eleven LBD patients and 12 control participants performed 3 original experimental tasks: Use-Length (visual evaluation of the length of a stick to bring down a target), Visual-Length (visually comparing objects of different lengths) and Addition-Length (visually comparing added lengths). Participants were also tested on conventional tasks: Familiar Tool Use and Mechanical Problem-Solving (novel tools). LBD patients had more difficulties than controls on both conventional tasks. No significant differences were observed for the 3 experimental tasks. These results extend the reasoning-based approach, stressing that it might not be the representation of length that is impaired in LBD patients, but rather the ability to generate mechanical actions based on physical object properties.
Cluster tool solution for fabrication and qualification of advanced photomasks
NASA Astrophysics Data System (ADS)
Schaetz, Thomas; Hartmann, Hans; Peter, Kai; Lalanne, Frederic P.; Maurin, Olivier; Baracchi, Emanuele; Miramond, Corinne; Brueck, Hans-Juergen; Scheuring, Gerd; Engel, Thomas; Eran, Yair; Sommer, Karl
2000-07-01
The reduction of wavelength in optical lithography, together with phase-shift technology and optical proximity correction (OPC), requires a rapid increase in cost-effective qualification of photomasks. Knowledge about CD variation, loss of pattern fidelity (especially for OPC patterns) and mask defects, in terms of their impact at wafer level, is becoming a key issue for mask quality assessment. As part of the European Community supported ESPRIT project 'Q-CAP', a new cluster concept has been developed which allows the combination of hardware tools as well as software tools via network communication. It is designed to be open for any tool manufacturer and mask house. The bi-directional network access allows the exchange of all relevant mask data, including grayscale images, measurement results, lithography parameters, defect coordinates, layout data, process data, etc., and its storage in a SQL database. The system uses SEMI format descriptions as well as standard network hardware and software components for the client-server communication. Each tool is used mainly to perform its specific application without spending expensive time on optional analysis, but the availability of the database allows each component to share the full data set gathered by all components. Therefore, the cluster can be considered as one single virtual tool. The paper shows the advantage of the cluster approach, the benefits of the tools already linked together, and a vision of a mask house in the near future.
Design and usability of heuristic-based deliberation tools for women facing amniocentesis.
Durand, Marie-Anne; Wegwarth, Odette; Boivin, Jacky; Elwyn, Glyn
2012-03-01
Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. The aim was to examine whether it is possible to design deliberation components in decision support interventions using simple models (fast and frugal heuristics). The 'Take The Best' heuristic (i.e., selection of a 'most important reason') and the 'Tallying' integration algorithm (i.e., unit weighing of pros and cons) were used to develop two deliberation components embedded in a Web-based decision support intervention for women facing amniocentesis testing. Ten researchers (recruited from 15), nine health-care providers (recruited from 28) and ten pregnant women (recruited from 14) who had recently been offered amniocentesis testing appraised evolving versions of 'your most important reason' (Take The Best) and 'weighing it up' (Tallying). Most researchers found the tools useful in facilitating decision making, although they emphasized the need for simple instructions and clear layouts. Health-care providers, however, expressed concerns regarding the usability and clarity of the tools. By contrast, 7 out of 10 pregnant women found the tools useful in weighing up the pros and cons of each option, and helpful in structuring and clarifying their thoughts and visualizing their decision efforts. Several pregnant women felt that 'weighing it up' and 'your most important reason' were not appropriate when facing such a difficult and emotional decision. Theoretical approaches based on fast and frugal heuristics can be used to develop deliberation tools that provide helpful support to patients facing real-world decisions about amniocentesis.
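The two decision rules named in this abstract have compact generic forms. The Python sketch below is an illustrative rendering of textbook 'Take The Best' (decide on the single most valid cue that discriminates) and 'Tallying' (unit-weight count of pros minus cons); it is not the authors' Web-based intervention, and the cue names in the usage example are hypothetical.

```python
def take_the_best(option_a, option_b, cue_order):
    """Choose between two options using only the first cue, in order of
    validity, that discriminates between them ('Take The Best')."""
    for cue in cue_order:                      # cues sorted best-first
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:                             # first discriminating cue decides
            return "A" if a > b else "B"
    return None                                # no cue discriminates: must guess

def tally(pros, cons):
    """Unit-weight integration ('Tallying'): every reason counts equally."""
    score = len(pros) - len(cons)
    if score > 0:
        return "accept"
    if score < 0:
        return "decline"
    return "undecided"

# Hypothetical usage: two options described by binary cues, cues ordered by validity
choice = take_the_best({"risk": 1, "age": 0}, {"risk": 0, "age": 1}, ["risk", "age"])
verdict = tally(["reassurance"], ["miscarriage risk", "anxiety"])
```

Note that 'Take The Best' ignores all cues after the first discriminating one, which is exactly what makes it "frugal"; 'Tallying' uses all reasons but weighs them identically.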
Radio-science performance analysis software
NASA Astrophysics Data System (ADS)
Morabito, D. D.; Asmar, S. W.
1995-02-01
The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
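The frequency-stability measure mentioned here, the Allan deviation, has a compact definition: for fractional-frequency data averaged over intervals of duration tau, it is the root-mean-square of successive differences divided by the square root of two. The minimal non-overlapping estimator below is a generic illustration; the abstract does not describe STBLTY's actual implementation.

```python
import numpy as np

def allan_deviation(y, tau_m):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    at an averaging factor tau_m (tau expressed in samples)."""
    n = len(y) // tau_m
    # average the data in contiguous blocks of tau_m samples
    y_bar = np.mean(np.reshape(y[:n * tau_m], (n, tau_m)), axis=1)
    # sigma_y(tau) = sqrt( (1/2) * mean of squared successive differences )
    return np.sqrt(0.5 * np.mean(np.diff(y_bar) ** 2))
```

Evaluating this over a range of tau values yields the stability curve (Allan deviation versus averaging time) typically reported for an ultrastable oscillator.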
NASA Technical Reports Server (NTRS)
Keys, Andrew S.; Adams, James H.; Darty, Ronald C.; Patrick, Marshall C.; Johnson, Michael A.; Cressler, John D.
2008-01-01
Primary objective: 1) a computational tool to accurately predict electronics performance in the presence of space radiation in support of spacecraft design: a) total dose; b) single-event effects; and c) mean time between failure (developed as a successor to CREME96). Secondary objective: 2) to provide a detailed description of the natural radiation environment in support of radiation health and instrument design: a) in deep space; b) inside the magnetosphere; and c) behind shielding.
Process Damping and Cutting Tool Geometry in Machining
NASA Astrophysics Data System (ADS)
Taylor, C. M.; Sims, N. D.; Turner, S.
2011-12-01
Regenerative vibration, or chatter, limits the performance of machining processes. Consequences of chatter include tool wear and poor machined surface finish. Process damping by tool-workpiece contact can reduce chatter effects and improve productivity. Process damping occurs when the flank (also known as the relief face) of the cutting tool makes contact with waves on the workpiece surface created by chatter motion. Tool edge features can act to increase the damping effect. This paper examines how a tool's edge condition combines with the relief angle to affect process damping. An analytical model of cutting with chatter leads to a two-section curve describing how process-damped vibration amplitude changes with surface speed for radiussed tools. The tool edge dominates the process damping effect at the lowest surface speeds, with the flank dominating at higher speeds. A similar curve is then proposed for tools with worn edges. Experimental data supports the notion of the two-section curve. A rule of thumb is proposed, regarding tool wear and process damping, which could be useful to machine operators. The question is addressed of whether a tool of a given geometry, used for a given application, should be considered as sharp, radiussed or worn with regard to process damping.
Perceptions of organizational support and its impact on nurses' job outcomes.
Labrague, Leodoro J; McEnroe Petitte, Denise M; Leocadio, Michael C; Van Bogaert, Peter; Tsaras, Konstantinos
2018-04-25
Strong organizational support can promote a sense of well-being and positive work behaviors in nurses. However, despite the importance of organizational support in nursing, this topic remains unexplored in the Philippines. The aim of this study was to examine the impact of organizational support perceptions on nurses' work outcomes (organizational commitment, work autonomy, work performance, job satisfaction, job stress, and turnover intention). A descriptive, cross-sectional research design was adopted to collect data from 180 nurses in the Philippines from September to December 2015. Seven standardized tools were used: the Job Satisfaction Index, the Job Stress Scale, the Burnout Measure Scale, the Work Autonomy Scale, the Six Dimension Scale of Nursing Performance, the Turnover Intention Inventory Scale, and the Perception of Organizational Support Scale. Nurses employed in government-owned hospitals perceived lower levels of organizational support than those in private hospitals. Significant correlations were identified between perceived organizational support (POS), hospital bed capacity, and nurses' work status. No significant correlations were found between perceived organizational support and the six outcomes perceived by nurses in the Philippines (organizational commitment, work performance, job autonomy, job satisfaction, job stress, and turnover intention). Perceptions of organizational support were low in Filipino nurses compared to findings in other international studies. Perceived organizational support did not influence job outcomes in nurses.
Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design
NASA Technical Reports Server (NTRS)
Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael
2016-01-01
Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. Many simulation tools have been developed at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete, and as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high-fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration compared with mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is two-fold. First, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements. Second, given a set of communications link requirements for a proposed satellite architecture, to determine the optimal configuration for a phased array antenna. A variety of tools is available that can be used to model phased array antennas.
To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off the modeling fidelity of each tool against its simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, the practicality of performance becomes a significant requirement. In either case, a minimum simulation fidelity must be met, regardless of performance considerations, which will be discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools, such that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
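At the lowest level, any phased-array model of the kind discussed here must capture the far-field array factor of an electronically steered aperture. The sketch below is a standard textbook calculation for an ideal uniform linear array of isotropic elements; it is not part of the SCaN toolset, and the element count, spacing, and steering angle in the usage note are arbitrary example parameters.

```python
import numpy as np

def array_factor_db(n_elem, d_lambda, steer_deg, theta_deg):
    """Normalized array factor (dB) of an ideal uniform linear array:
    n_elem isotropic elements at spacing d_lambda (wavelengths),
    electronically steered to steer_deg, evaluated at angles theta_deg."""
    theta = np.radians(np.atleast_1d(theta_deg))
    # inter-element phase difference between look angle and steering angle
    psi = 2.0 * np.pi * d_lambda * (np.sin(theta) - np.sin(np.radians(steer_deg)))
    # coherent sum of element phasors, normalized to the peak (broadside) gain
    af = np.abs(np.sum(np.exp(1j * np.outer(np.arange(n_elem), psi)), axis=0)) / n_elem
    return 20.0 * np.log10(np.maximum(af, 1e-12))
```

For example, an 8-element array with half-wavelength spacing steered to 20 degrees gives 0 dB at the steering angle and a steep roll-off into the sidelobes; sweeping `theta_deg` over -90 to 90 degrees traces the full beam pattern.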
Greased Lightning (GL-10) Performance Flight Research: Flight Data Report
NASA Technical Reports Server (NTRS)
McSwain, Robert G.; Glaab, Louis J.; Theodore, Colin R.; Rhew, Ray D. (Editor); North, David D. (Editor)
2017-01-01
Modern aircraft design methods have produced acceptable designs for large conventional aircraft. With revolutionary electric propulsion technologies fueled by the growth in the small UAS (Unmanned Aerial Systems) industry, these same prediction models are being applied to new smaller, experimental design concepts requiring a VTOL (Vertical Take Off and Landing) capability for ODM (On Demand Mobility). A 50% sub-scale GL-10 flight model was built and tested to demonstrate the transition from hover to forward flight utilizing DEP (Distributed Electric Propulsion)[1][2]. In 2016, plans were put in place to conduct performance flight testing on the 50% sub-scale GL-10 flight model to support a NASA project called DELIVER (Design Environment for Novel Vertical Lift Vehicles). DELIVER was investigating the feasibility of including smaller and more experimental aircraft configurations in a NASA design tool called NDARC (NASA Design and Analysis of Rotorcraft)[3]. This report covers the performance flight data collected during flight testing of the GL-10 50% sub-scale flight model conducted at Beaver Dam Airpark, VA. Overall, the flight test data provides great insight into how well our existing conceptual design tools predict the performance of small-scale experimental DEP concepts. Low-fidelity conceptual design tools estimated the (L/D)max of the GL-10 50% sub-scale flight model to be 16. The experimentally measured (L/D)max for the GL-10 50% scale flight model was 7.2. The gap between predicted and measured aerodynamic performance highlights the complexity of wing and nacelle interactions, which is not currently accounted for in existing low-fidelity tools.
Verification and Validation of NASA-Supported Enhancements to PECAD's Decision Support Tools
NASA Technical Reports Server (NTRS)
McKellipo, Rodney; Ross, Kenton W.
2006-01-01
The NASA Applied Sciences Directorate (ASD), part of the Earth-Sun System Division of NASA's Science Mission Directorate, has partnered with the U.S. Department of Agriculture (USDA) to enhance decision support in the area of agricultural efficiency-an application of national importance. The ASD integrated the results of NASA Earth science research into USDA decision support tools employed by the USDA Foreign Agricultural Service (FAS) Production Estimates and Crop Assessment Division (PECAD), which supports national decision making by gathering, analyzing, and disseminating global crop intelligence. Verification and validation of the following enhancements are summarized: 1) Near-real-time Moderate Resolution Imaging Spectroradiometer (MODIS) products through PECAD's MODIS Image Gallery; 2) MODIS Normalized Difference Vegetation Index (NDVI) time series data through the USDA-FAS MODIS NDVI Database; and 3) Jason-1 and TOPEX/Poseidon lake level estimates through PECAD's Global Reservoir and Lake Monitor. Where possible, each enhanced product was characterized for accuracy, timeliness, and coverage, and the characterized performance was compared to PECAD operational requirements. The MODIS Image Gallery and the GRLM are more mature and have achieved a semi-operational status, whereas the USDA-FAS MODIS NDVI Database is still evolving and should be considered
Experience Gained From Launch and Early Orbit Support of the Rossi X-Ray Timing Explorer (RXTE)
NASA Technical Reports Server (NTRS)
Fink, D. R.; Chapman, K. B.; Davis, W. S.; Hashmall, J. A.; Shulman, S. E.; Underwood, S. C.; Zsoldos, J. M.; Harman, R. R.
1996-01-01
This paper reports the results to date of early mission support provided by the personnel of the Goddard Space Flight Center Flight Dynamics Division (FDD) for the Rossi X-Ray Timing Explorer (RXTE) spacecraft. For this mission, the FDD supports onboard attitude determination and ephemeris propagation by supplying ground-based orbit and attitude solutions and calibration results. The first phase of that support was to provide launch window analyses. As the launch window was determined, acquisition attitudes were calculated and calibration slews were planned. Postlaunch, these slews provided the basis for ground-determined calibration. Ground-determined calibration results are used to improve the accuracy of onboard solutions. The FDD is applying new calibration tools designed to facilitate use of the simultaneous, high-accuracy star observations from the two RXTE star trackers for ground attitude determination and calibration. An evaluation of the performance of these tools is presented. The FDD provides updates to the onboard star catalog based on preflight analysis and analysis of flight data. The in-flight results of the mission support in each area are summarized and compared with pre-mission expectations.
A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community
NASA Astrophysics Data System (ADS)
Merchant, B. J.; Chael, E. P.; Young, C. J.
2013-12-01
Network simulations have long been used to assess the performance of monitoring networks in detecting events, for such purposes as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: coded in a modern, multi-platform language; utilizes modern computing performance (e.g. multi-core processors); incorporates monitoring technologies other than seismic; and includes a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first; seismic location and infrasound and hydroacoustic detection plugins will follow. As an open-release package, NetMOD can hopefully provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
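The core calculation such network-assessment tools automate can be illustrated with a toy Monte Carlo estimate: given station positions and detection thresholds, what is the probability that an event is seen by enough stations to be located? The sketch below is purely illustrative; the amplitude-decay and noise parameters are invented for demonstration and are not NetMOD's (or NetSim's) phenomenology.

```python
import math
import random

def detection_probability(stations, event, n_trials=20000, min_stations=3, seed=1):
    """Monte Carlo estimate of the probability that at least `min_stations`
    stations detect an event. Each station is (x, y, threshold); the event
    is (x, y, magnitude). Signal decays with log10(distance) and station
    noise is Gaussian -- a deliberately simplistic, hypothetical model."""
    rng = random.Random(seed)
    ex, ey, mag = event
    hits = 0
    for _ in range(n_trials):
        n_det = 0
        for sx, sy, thresh in stations:
            dist = math.hypot(sx - ex, sy - ey)
            # toy amplitude model: magnitude minus geometric spreading, plus noise
            snr = mag - math.log10(max(dist, 1.0)) + rng.gauss(0.0, 0.2)
            if snr >= thresh:
                n_det += 1
        if n_det >= min_stations:
            hits += 1
    return hits / n_trials
```

A real tool replaces the toy amplitude model with validated phenomenology per technology (seismic, infrasound, hydroacoustic) and per station, which is precisely what a plugin architecture makes swappable.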
NASA Technical Reports Server (NTRS)
Monell, D.; Mathias, D.; Reuther, J.; Garn, M.
2003-01-01
A new engineering environment constructed for the purposes of analyzing and designing Reusable Launch Vehicles (RLVs) is presented. The new environment has been developed to allow NASA to perform independent analysis and design of emerging RLV architectures and technologies. The new Advanced Engineering Environment (AEE) is both collaborative and distributed. It facilitates integration of the analyses by both vehicle performance disciplines and life-cycle disciplines. Current performance disciplines supported include: weights and sizing, aerodynamics, trajectories, propulsion, structural loads, and CAD-based geometries. Current life-cycle disciplines supported include: DDT&E cost, production costs, operations costs, flight rates, safety and reliability, and system economics. Involving six NASA centers (ARC, LaRC, MSFC, KSC, GRC and JSC), AEE has been tailored to serve as a web-accessed agency-wide source for all of NASA's future launch vehicle systems engineering functions. Thus, it is configured to facilitate (a) data management, (b) automated tool/process integration and execution, and (c) data visualization and presentation. The core components of the integrated framework are a customized PTC Windchill product data management server, a set of RLV analysis and design tools integrated using Phoenix Integration's Model Center, and an XML-based data capture and transfer protocol. The AEE system has seen production use during the Initial Architecture and Technology Review for the NASA 2nd Generation RLV program, and it continues to undergo development and enhancements in support of its current main customer, the NASA Next Generation Launch Technology (NGLT) program.
Watershed Management Optimization Support Tool (WMOST) Workshop.
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green i...
ERIC Educational Resources Information Center
Elias, Mohd Syahrizad; Mohamad Ali, Ahmad Zamzuri
2016-01-01
Simulation-aided learning has the capability to improve students' learning performance. However, the positive effect of simulation-aided learning is still being debated, as it has at times not played the purported role expected. To address these problems, Multimedia Instructional Message (MIM) appeared to be an essential supporting tool in ensuring…
Creating an Online Assessment Test for Heritage Learners of Russian
ERIC Educational Resources Information Center
Titus, Julia
2012-01-01
This paper examines the differences between second-language learners and heritage learners of Russian in terms of their linguistic performance, a finding supported by current research (Andrews, 2001; Kagan & Dillon, 2001/2003), examines the implications of these differences for the creation of testing tools, and offers a sample of a test designed…
NASA Technical Reports Server (NTRS)
Perchonok, Michele
2014-01-01
The goal of HRP is to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. The presentation discusses (1) Bone Health: Vitamin D, Fish Consumption and Exercise (2) Medical Support in Remote Areas (3) ISS Ultrasound (4) Dry Electrode EKG System (5) Environmental Factors and Psychological Health.
ERIC Educational Resources Information Center
Hartley, Michael T.
2012-01-01
This study examined the assessment of resilience in undergraduate college students. Multigroup comparisons of the Connor-Davidson Resilience Scale (CD-RISC; Connor & Davidson, 2003) were performed on general population students and students recruited from campus mental health offices offering college counseling, psychiatric-support, and…
Developing New Literacies Using Commercial Videogames as Educational Tools
ERIC Educational Resources Information Center
Lacasa, Pilar; Martinez, Rut; Mendez, Laura
2008-01-01
The general aim of this paper is to examine how videogames, supported by conversations and theatrical performances in the classroom, can contribute to the development of narrative thought as present in written compositions in various contexts. Given that one of the primary ecological influences on children is the mass media, we discuss how media…
The Portable Usability Testing Lab: A Flexible Research Tool.
ERIC Educational Resources Information Center
Hale, Michael E.; And Others
A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was obtaining a portable usability lab for software testing, so the facility obtained the "Luggage Lab 2000." The lab is transportable to…
Point Cloud-Based Automatic Assessment of 3D Computer Animation Courseworks
ERIC Educational Resources Information Center
Paravati, Gianluca; Lamberti, Fabrizio; Gatteschi, Valentina; Demartini, Claudio; Montuschi, Paolo
2017-01-01
Computer-supported assessment tools can bring significant benefits to both students and teachers. When integrated in traditional education workflows, they may help to reduce the time required to perform the evaluation and consolidate the perception of fairness of the overall process. When integrated within on-line intelligent tutoring systems,…
Design and Implementation of a Learning Analytics Toolkit for Teachers
ERIC Educational Resources Information Center
Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik
2012-01-01
Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…
ERIC Educational Resources Information Center
Smith, Rachel S., Ed.
2008-01-01
The conference proceedings include the following papers: (1) Digital Storytelling: An Alternative Instructional Approach (Ruben R. Puentedura); (2) Digital Storytelling: Old Ways, New Tools (Laurie Burruss); (3) The Adding Machine: Remote Digital Storytelling and Performance (George Brown and James Ferolo); (4) Building and Supporting a…
Can Teacher Evaluation Improve Teaching?
ERIC Educational Resources Information Center
Principal Leadership, 2013
2013-01-01
The answer to the question "Can evaluation improve teaching?" is a qualified yes. Teacher evaluation has changed, and the role of the principal has changed as well; the focus now is on evidence, not merely good judgment. With the right tools, systems, and support, it should be possible to help improve teaching performance and student learning…
The Embedded Librarian Online or Face-to-Face: American University's Experiences
ERIC Educational Resources Information Center
Matos, Michael A.; Matsuoka-Motley, Nobue; Mayer, William
2010-01-01
This article examines the role online communication and tools play in embedded librarianship at American University. Two embedded models of user engagement, traditional and hybrid, are discussed. The librarians operating in each mode share their experiences providing tailored support to the departments of music/performing arts and business. The…
Internet Use among College Students: Tool or Toy?
ERIC Educational Resources Information Center
Englander, Fred; Terregrossa, Ralph A.; Wang, Zhaobo
2010-01-01
The purpose of this study is to analyze the relationship between the grade performance of 128 students in an introductory microeconomics course and the average number of hours per week these students report spending on the Internet. The literature review offers "a priori" arguments supporting both positive and negative relationships.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williamson, Richard L.; Kochunas, Brendan; Adams, Brian M.
The Virtual Environment for Reactor Applications components in this distribution comprise selected computational tools and supporting infrastructure that solve neutronics, thermal-hydraulics, fuel performance, and coupled neutronics-thermal-hydraulics problems. The infrastructure components provide a simplified common user input capability and provide for the physics integration with data transfer and coupled-physics iterative solution algorithms.
Effects of Computer-Based Visual Representation on Mathematics Learning and Cognitive Load
ERIC Educational Resources Information Center
Yung, Hsin I.; Paas, Fred
2015-01-01
Visual representation has been recognized as a powerful learning tool in many learning domains. Based on the assumption that visual representations can support deeper understanding, we examined the effects of visual representations on learning performance and cognitive load in the domain of mathematics. An experimental condition with visual…
ERIC Educational Resources Information Center
Rosadi, Kemas Imron
2015-01-01
Development of education in Indonesia is based on three aspects, namely equity and expansion, quality and relevance, as well as good governance. Quality education is influenced by several factors related to quality education managerial leaders, limited funds, facilities, educational facilities, media, learning resources, tools and training…
Improving Primary Care Provider Practices in Youth Concussion Management.
Arbogast, Kristy B; Curry, Allison E; Metzger, Kristina B; Kessler, Ronni S; Bell, Jeneita M; Haarbauer-Krupa, Juliet; Zonfrillo, Mark R; Breiding, Matthew J; Master, Christina L
2017-08-01
Primary care providers are increasingly providing youth concussion care but report insufficient time and training, limiting adoption of best practices. We implemented a primary care-based intervention including an electronic health record-based clinical decision support tool ("SmartSet") and in-person training. We evaluated consequent improvement in 2 key concussion management practices: (1) performance of a vestibular oculomotor examination and (2) discussion of return-to-learn/return-to-play (RTL/RTP) guidelines. Data were included from 7284 primary care patients aged 0 to 17 years with initial concussion visits between July 2010 and June 2014. We compared proportions of visits pre- and post-intervention in which the examination was performed or RTL/RTP guidelines provided. Examinations and RTL/RTP were documented for 1.8% and 19.0% of visits pre-intervention, respectively, compared with 71.1% and 72.9% post-intervention. A total of 95% of post-intervention examinations were documented within the SmartSet. An electronic clinical decision support tool, plus in-person training, may be key to changing primary care provider behavior around concussion care.
Acceptability and validity of older driver screening with the DrivingHealth Inventory.
Edwards, Jerri D; Leonard, Kathleen M; Lunsman, Melissa; Dodson, Joan; Bradley, Stacy; Myers, Charlsie A; Hubble, Bridgette
2008-05-01
Research has indicated that technology can be effectively used to identify high-risk older drivers. However, adoption of such technology has been limited. Researchers debate whether older drivers represent a safety problem as well as whether they should be screened for driving fitness. The present study examined how drivers feel regarding technological screening and mandatory state testing. The validity and acceptability of a new technological screening battery for identifying high-risk drivers, the DrivingHealth Inventory (DHI), was also evaluated. In a sample of 258 Alabama drivers aged 18-87, older drivers performed significantly worse than younger drivers on sensory, cognitive, and physical subtests of the DHI, and older drivers with a crash history performed worse than older drivers without crashes. Regardless of age, 90% of participants supported states requiring screening for older drivers' license renewal. The majority of the participants (72%) supported use of technological screening batteries such as the DHI as a driver screening tool. Considering the acceptability and potential efficacy of the DHI, it may be a useful tool in evaluating driving fitness among older adults.
NASA Astrophysics Data System (ADS)
Hilliard, Antony
Energy Monitoring and Targeting (M&T) is a well-established business process that develops information about utility energy consumption in a business or institution. While M&T has persisted as a worthwhile energy-conservation support activity, it has not been widely adopted. This dissertation explains M&T challenges in terms of diagnosing and controlling energy consumption, informed by a naturalistic field study of M&T work. A Cognitive Work Analysis of M&T identifies structures that diagnosis can search, information flows unsupported in canonical support tools, and opportunities to extend the most popular tool for M&T: Cumulative Sum of Residuals (CUSUM) charts. A design application outlines how CUSUM charts were augmented with a more contemporary statistical change-detection strategy, Recursive Parameter Estimates, modified to better suit the M&T task using Representation Aiding principles. The design was experimentally evaluated in a controlled M&T synthetic task and was shown to significantly improve diagnosis performance.
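The CUSUM-of-residuals chart at the heart of canonical M&T practice is simple to sketch. The following Python example (synthetic data and illustrative variable names, not drawn from the dissertation) fits a degree-day baseline over a known-good period and then accumulates residuals, so that a sustained change in consumption appears as a persistent slope in the trace:

```python
import numpy as np

def cusum_residuals(driver, energy, baseline_slice):
    """CUSUM-of-residuals chart for energy Monitoring and Targeting.

    Fit a straight-line baseline (energy vs. a consumption driver,
    e.g. weekly degree-days) on a reference period, then cumulatively
    sum the residuals of all observations against that baseline.
    """
    x = np.asarray(driver, dtype=float)
    y = np.asarray(energy, dtype=float)
    # Least-squares baseline fit on the reference period only.
    slope, intercept = np.polyfit(x[baseline_slice], y[baseline_slice], 1)
    residuals = y - (slope * x + intercept)
    return np.cumsum(residuals)

# Synthetic example: 20 baseline weeks, then a +50 kWh/week fault.
rng = np.random.default_rng(0)
dd = rng.uniform(10, 30, size=40)            # degree-days per week
kwh = 100 + 5 * dd + rng.normal(0, 2, 40)    # true consumption model
kwh[20:] += 50                               # fault from week 20 onward
trace = cusum_residuals(dd, kwh, slice(0, 20))
```

Over the fitted baseline period the residuals sum to approximately zero, so the trace hovers near the axis; after the fault it climbs steadily, which is exactly the visual cue an M&T analyst searches for.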
PyCoTools: A Python Toolbox for COPASI.
Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P
2018-05-22
COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface, but it is often desirable to access COPASI features programmatically through a high-level interface. PyCoTools is a Python package aimed at providing a high-level interface to COPASI tasks, with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks including time courses, parameter scans and parameter estimations. Additional 'composite' tasks, which use COPASI tasks as building blocks, are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation is available at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics online.
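PyCoTools' own API is not reproduced here; the sketch below instead illustrates the underlying model-selection idea with generic NumPy code: fit competing hypotheses of increasing complexity to the same data, then rank them with the Akaike Information Criterion so that extra parameters must pay for themselves in fit quality.

```python
import numpy as np

def aic_rss(rss, n_points, n_params):
    """Akaike Information Criterion from a least-squares fit
    (Gaussian-error form based on the residual sum of squares)."""
    return n_points * np.log(rss / n_points) + 2 * n_params

# Synthetic "experimental" time course generated from a quadratic law.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 50)
data = 2.0 + 0.5 * t + 0.3 * t**2 + rng.normal(0, 0.5, t.size)

# Three competing hypotheses of increasing complexity.
scores = {}
for degree in (1, 2, 5):
    coeffs = np.polyfit(t, data, degree)
    rss = np.sum((data - np.polyval(coeffs, t)) ** 2)
    scores[degree] = aic_rss(rss, t.size, degree + 1)

best = min(scores, key=scores.get)  # lowest AIC wins
```

The underfitting linear model is penalised by its large residuals, while the over-parameterised quintic gains little fit for its extra terms; this trade-off is the same reasoning a PyCoTools model-selection workflow automates against COPASI parameter estimations.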
Brain activity underlying tool-related and imitative skills after major left hemisphere stroke.
Martin, Markus; Nitschke, Kai; Beume, Lena; Dressing, Andrea; Bühler, Laura E; Ludwig, Vera M; Mader, Irina; Rijntjes, Michel; Kaller, Christoph P; Weiller, Cornelius
2016-05-01
Apraxia is a debilitating cognitive motor disorder that frequently occurs after left hemisphere stroke and affects tool-associated and imitative skills. However, the severity of the apraxic deficits varies even across patients with similar lesions. This variability raises the question whether regions outside the left hemisphere network typically associated with cognitive motor tasks in healthy subjects are of additional functional relevance. To investigate this hypothesis, we explored regions where functional magnetic resonance imaging activity is associated with better cognitive motor performance in patients with left hemisphere ischaemic stroke. Thirty-six patients with chronic (>6 months) large left hemisphere infarcts (age ± standard deviation, 60 ± 12 years, 29 male) and 29 control subjects (age ± standard deviation, 72 ± 7, 15 male) were first assessed behaviourally outside the scanner with tests for actual tool use, pantomime and imitation of tool-use gestures, as well as for meaningless gesture imitation. Second, functional magnetic resonance imaging activity was registered during the passive observation of videos showing tool-associated actions. Voxel-wise linear regression analyses were used to identify areas where behavioural performance was correlated with functional magnetic resonance imaging activity. Furthermore, lesions were delineated on the magnetic resonance imaging scans for voxel-based lesion-symptom mapping. The analyses revealed two sets of regions where functional magnetic resonance imaging activity was associated with better performance in the clinical tasks. First, activity in left hemisphere areas thought to mediate cognitive motor functions in healthy individuals (i.e. activity within the putative 'healthy' network) was correlated with better scores. 
Within this network, tool-associated tasks were mainly related to activity in supramarginal gyrus and ventral premotor cortex, while meaningless gesture imitation depended more on the anterior intraparietal sulcus and superior parietal lobule. Second, repeating the regression analyses with total left hemisphere lesion volume as additional covariate demonstrated that tool-related skills were further supported by right premotor, right inferior frontal and left anterior temporal areas, while meaningless gesture imitation was also driven by the left dorso-lateral prefrontal cortex. In summary, tool-related and imitative skills in left hemisphere stroke patients depend on the activation of spared left hemisphere regions that support these abilities in healthy individuals. In addition, cognitive motor functions rely on the activation of ipsi- and contralesional areas that are situated outside this 'healthy' network. This activity may explain why some patients perform surprisingly well despite large left brain lesions, while others are severely impaired.
Using intervention mapping to develop a work-related guidance tool for those affected by cancer.
Munir, Fehmidah; Kalawsky, Katryna; Wallis, Deborah J; Donaldson-Feilder, Emma
2013-01-05
Working-aged individuals diagnosed and treated for cancer require support and assistance to make decisions regarding work. However, healthcare professionals do not consider the work-related needs of patients and employers do not understand the full impact cancer can have upon the employee and their work. We therefore developed a work-related guidance tool for those diagnosed with cancer that enables them to take the lead in stimulating discussion with a range of different healthcare professionals, employers, employment agencies and support services. The tool facilitates discussions through a set of questions individuals can utilise to find solutions and minimise the impact cancer diagnosis, prognosis and treatment may have on their employment, sick leave and return to work outcomes. The objective of the present article is to describe the systematic development and content of the tool using Intervention Mapping Protocol (IMP). The study used the first five steps of the intervention mapping process to guide the development of the tool. A needs assessment identified the 'gaps' in information/advice received from healthcare professionals and other stakeholders. The intended outcomes and performance objectives for the tool were then identified followed by theory-based methods and an implementation plan. A draft of the tool was developed and subjected to a two-stage Delphi process with various stakeholders. The final tool was piloted with 38 individuals at various stages of the cancer journey. The tool was designed to be a self-led tool that can be used by any person with a cancer diagnosis and working for most types of employers. The pilot study indicated that the tool was relevant and much needed. Intervention Mapping is a valuable protocol for designing complex guidance tools. The process and design of this particular tool can lend itself to other situations both occupational and more health-care based.
Spronk, Inge; Burgers, Jako S; Schellevis, François G; van Vliet, Liesbeth M; Korevaar, Joke C
2018-05-11
Shared decision-making (SDM) in the management of metastatic breast cancer care is associated with positive patient outcomes. In daily clinical practice, however, SDM is not fully integrated yet. Initiatives to improve the implementation of SDM would be helpful. The aim of this review was to assess the availability and effectiveness of tools supporting SDM in metastatic breast cancer care. Literature databases were systematically searched for articles published since 2006 focusing on the development or evaluation of tools to improve information-provision and to support decision-making in metastatic breast cancer care. Internet searches and experts identified additional tools. Data from included tools were extracted and the evaluation of tools was appraised using the GRADE grading system. The literature search yielded five instruments. In addition, two tools were identified via internet searches and consultation of experts. Four tools were specifically developed for supporting SDM in metastatic breast cancer, the other three tools focused on metastatic cancer in general. Tools were mainly applicable across the care process, and usable for decisions on supportive care with or without chemotherapy. All tools were designed for patients to be used before a consultation with the physician. Effects on patient outcomes were generally weakly positive although most tools were not studied in well-designed studies. Despite its recognized importance, only two tools were positively evaluated on effectiveness and are available to support patients with metastatic breast cancer in SDM. These tools show promising results in pilot studies and focus on different aspects of care. However, their effectiveness should be confirmed in well-designed studies before implementation in clinical practice. Innovation and development of SDM tools targeting clinicians as well as patients during a clinical encounter is recommended.
Abraham, Mark James; Murtola, Teemu; Schulz, Roland; ...
2015-07-15
GROMACS is one of the most widely used open-source and free software codes in chemistry, used primarily for dynamical simulations of biomolecules. It provides a rich set of calculation types, preparation and analysis tools. Several advanced techniques for free-energy calculations are supported. In version 5, it reaches new performance heights through several new and enhanced parallelization algorithms. These work on every level: SIMD registers inside cores, multithreading, heterogeneous CPU–GPU acceleration, state-of-the-art 3D domain decomposition, and ensemble-level parallelization through built-in replica exchange and the separate Copernicus framework. Finally, the latest best-in-class compressed trajectory storage format is supported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R R; Brugger, E; Cook, R
The Data group provides data analysis and visualization support to its customers. This consists primarily of the development and support of VisIt, a data analysis and visualization tool. Support ranges from answering questions about the tool, providing classes on how to use the tool, and performing data analysis and visualization for customers. The Information Management and Graphics Group supports and develops tools that enhance our ability to access, display, and understand large, complex data sets. Activities include applying visualization software for large-scale data exploration; running video production labs on two networks; supporting graphics libraries and tools for end users; maintaining PowerWalls and assorted other displays; and developing software for searching and managing scientific data. Researchers in the Center for Applied Scientific Computing (CASC) work on various projects including the development of visualization techniques for large-scale data exploration that are funded by the ASC program, among others. The researchers also have LDRD projects and collaborations with other lab researchers, academia, and industry. The IMG group is located in the Terascale Simulation Facility, home to Dawn, Atlas, BGL, and others, which includes both classified and unclassified visualization theaters, a visualization computer floor and deployment workshop, and video production labs. We continued to provide the traditional graphics group consulting and video production support. We maintained five PowerWalls and many other displays. We deployed a 576-node Opteron/IB cluster with 72 TB of memory providing a visualization production server on our classified network. We continue to support a 128-node Opteron/IB cluster providing a visualization production server for our unclassified systems and an older 256-node Opteron/IB cluster for the classified systems, as well as several smaller clusters to drive the PowerWalls.
The visualization production systems include NFS servers to provide dedicated storage for data analysis and visualization. The ASC projects have delivered new versions of visualization and scientific data management tools to end users and continue to refine them. VisIt had four releases during the past year, ending with VisIt 2.0. We released version 2.4 of Hopper, a Java application for managing and transferring files. This release included a graphical disk usage view, which works on all types of connections, and an aggregated copy feature for transferring massive datasets quickly and efficiently to HPSS. We continue to use and develop Blockbuster and Telepath. Both the VisIt and IMG teams were engaged in a variety of movie production efforts during the past year in addition to the development tasks.
Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael
This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: (1) develop software tools to support code verification analysis; (2) document standard definitions of code verification test problems; and (3) perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
Watershed Management Optimization Support Tool v3
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
STILTS -- Starlink Tables Infrastructure Library Tool Set
NASA Astrophysics Data System (ADS)
Taylor, Mark
STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.
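STILTS itself is written in Java on top of STIL, but the streaming design described above can be sketched in a few lines of Python: rows flow through a filter one at a time, so memory use stays constant regardless of table size. This is an illustration of the principle only, with invented toy data, not STILTS code:

```python
def stream_select(rows, predicate):
    """Lazily filter table rows one at a time.

    Because rows are consumed and emitted individually (a generator
    in Python terms), the size of table that can be processed is not
    limited by available memory -- the property the STILTS tools aim
    for by accepting streamed data where possible.
    """
    for row in rows:
        if predicate(row):
            yield row

# Toy source catalogue streamed as dicts, one row at a time.
def catalogue():
    for i in range(1_000_000):
        yield {"id": i, "mag": 10 + (i % 100) / 10.0}

bright = stream_select(catalogue(), lambda r: r["mag"] < 10.5)
first_five = [r["id"] for _, r in zip(range(5), bright)]
```

Only the rows actually requested are ever materialised; the million-row source is never held in memory at once.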
Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors
NASA Technical Reports Server (NTRS)
Boussalis, Dhemetrios; Bayard, David S.
2013-01-01
G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, mascons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams).
G-CAT is a standalone MATLAB- based tool intended to run on any engineer's desktop computer.
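The contrast between covariance propagation and Monte Carlo can be made concrete with a toy example. The sketch below uses a 2-state constant-velocity system (purely illustrative, not G-CAT's 120-state 6-DOF formulation): the covariance is propagated directly through the state-transition matrix in one pass, then cross-checked against a sampled ensemble.

```python
import numpy as np

# Constant-velocity dynamics over one time step.
dt = 1.0
F = np.array([[1.0, dt],
              [0.0, 1.0]])
P0 = np.diag([1.0, 0.04])   # initial knowledge covariance (pos, vel)

# Covariance analysis: one run, P_{k+1} = F P_k F^T per step.
P = P0.copy()
for _ in range(10):
    P = F @ P @ F.T

# Monte Carlo check: propagate many sampled trajectories instead.
rng = np.random.default_rng(2)
samples = rng.multivariate_normal([0.0, 0.0], P0, size=200_000)
for _ in range(10):
    samples = samples @ F.T
P_mc = np.cov(samples.T)
```

The single covariance run reproduces, to sampling error, what 200,000 simulated trajectories deliver, which is precisely why a linearized covariance tool gives the "performance envelope directly from a single run."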
The Launch Systems Operations Cost Model
NASA Technical Reports Server (NTRS)
Prince, Frank A.; Hamaker, Joseph W. (Technical Monitor)
2001-01-01
One of NASA's primary missions is to reduce the cost of access to space while simultaneously increasing safety. A key component, and one of the least understood, is the recurring operations and support cost for reusable launch systems. In order to predict these costs, NASA, under the leadership of the Independent Program Assessment Office (IPAO), has commissioned the development of a Launch Systems Operations Cost Model (LSOCM). LSOCM is a tool to predict the operations & support (O&S) cost of new and modified reusable (and partially reusable) launch systems. The requirements are to predict the non-recurring cost for the ground infrastructure and the recurring cost of maintaining that infrastructure, performing vehicle logistics, and performing the O&S actions to return the vehicle to flight. In addition, the model must estimate the time required to cycle the vehicle through all of the ground processing activities. The current version of LSOCM is an amalgamation of existing tools, leveraging our understanding of shuttle operations cost with a means of predicting how the maintenance burden will change as the vehicle becomes more aircraft like. The use of the Conceptual Operations Manpower Estimating Tool/Operations Cost Model (COMET/OCM) provides a solid point of departure based on shuttle and expendable launch vehicle (ELV) experience. The incorporation of the Reliability and Maintainability Analysis Tool (RMAT) as expressed by a set of response surface model equations gives a method for estimating how changing launch system characteristics affects cost and cycle time as compared to today's shuttle system. Plans are being made to improve the model. The development team will be spending the next few months devising a structured methodology that will enable verified and validated algorithms to give accurate cost estimates. 
To assist in this endeavor the LSOCM team is part of an Agency wide effort to combine resources with other cost and operations professionals to support models, databases, and operations assessments.
Automated Cache Performance Analysis And Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohror, Kathryn
While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand," requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses.
Prior to the introduction of PEBS counters, cache behavior could only be measured reliably in the aggregate across tens or hundreds of thousands of instructions. With the newest iteration of PEBS technology, cache events can be tied to a tuple of instruction pointer, target address (for both loads and stores), memory hierarchy level, and observed latency. With this information we can now begin asking questions about the efficiency not only of regions of code, but of how those regions interact with particular data structures and how these interactions evolve over time. In the short term, this information will be vital for performance analysts understanding and optimizing the behavior of their codes for the memory hierarchy. In the future, we can begin to ask how data layouts might be changed to improve performance and, for a particular application, what the theoretical optimal performance might be. The overall benefit to be produced by this effort was a commercial-quality, easy-to-use, and scalable performance tool that allows both beginner and experienced parallel programmers to automatically tune their applications for optimal cache usage. Effective use of such a tool can literally save weeks of performance tuning effort. Easy to use: with the proposed innovations, finding and fixing memory performance issues would be largely automated, hiding most if not all of the required performance engineering expertise "under the hood" of the Open|SpeedShop performance tool. One of the biggest public benefits of the proposed innovations is that they make performance analysis usable by a larger group of application developers. Intuitive reporting of results: the Open|SpeedShop performance analysis tool has a rich set of intuitive yet detailed reports for presenting performance results to application developers. Our goal was to leverage this existing technology to present the results from our memory performance addition to Open|SpeedShop. Suitable for experts as well as novices.
Application performance is becoming more difficult to measure as the hardware platforms applications run on become more complicated. This makes life difficult for application developers, who need to know more about the hardware platform, including the memory system hierarchy, in order to understand the performance of their applications. Some application developers are comfortable in that scenario, while others want to do their scientific research without having to understand all the nuances of the hardware platform they are running on. Our proposed innovations were aimed at supporting both expert and novice performance analysts. Useful in many markets: the enhancement to Open|SpeedShop would appeal to a broader market space, as it will be useful in scientific, commercial, and cloud computing environments. Our goal was to combine technology developed initially at Lawrence Livermore National Laboratory with the development and commercial software experience of Argo Navis Technologies, LLC (ANT) to deliver these objectives.
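As a rough illustration of how the per-address latency information described above can be attributed to data structures, the following sketch aggregates hypothetical PEBS-style samples (instruction pointer, target address, memory hierarchy level, observed latency) by address range. All addresses, ranges, and names are invented for illustration; a real tool would obtain the samples from the hardware counters and the address ranges from debug information.

```python
from collections import defaultdict

# Hypothetical PEBS-style samples: (instruction_pointer, target_address,
# memory_hierarchy_level, observed_latency_in_cycles)
samples = [
    (0x400a10, 0x7f0000001000, "L1", 4),
    (0x400a10, 0x7f0000002040, "LLC", 40),
    (0x400b20, 0x7f0000100008, "DRAM", 220),
    (0x400b20, 0x7f0000100010, "DRAM", 260),
]

# Hypothetical address ranges of application data structures
# (half-open [lo, hi) intervals, e.g. recovered from debug info).
data_structures = {
    "matrix_a": (0x7f0000001000, 0x7f0000003000),
    "index_buf": (0x7f0000100000, 0x7f0000101000),
}

def attribute_latency(samples, data_structures):
    """Attribute each sample's latency to the data structure whose
    address range contains the target address; return mean latency
    per structure."""
    totals = defaultdict(lambda: [0, 0])  # name -> [latency_sum, count]
    for ip, addr, level, latency in samples:
        for name, (lo, hi) in data_structures.items():
            if lo <= addr < hi:
                totals[name][0] += latency
                totals[name][1] += 1
                break
    return {name: s / n for name, (s, n) in totals.items()}

print(attribute_latency(samples, data_structures))
```

A report sorted by mean latency immediately points the analyst at the structure with the worst memory behavior (here, the DRAM-resident `index_buf`).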
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
Ryan, Gery W; Farmer, Carrie M; Adamson, David M; Weinick, Robin M
2014-01-01
Between 2001 and 2011, the U.S. Department of Defense implemented numerous programs to support service members and their families in coping with the stressors of a decade of conflict in Iraq and Afghanistan. These programs, which address both psychological health and traumatic brain injury (TBI), number in the hundreds and vary in size, scope, and target population. To ensure that resources are wisely invested and that the benefits of such programs are maximized, RAND developed a tool to help those responsible for managing or implementing programs assess program performance, consider options for improvement, implement solutions, and then assess whether the changes worked. The tool is intended to provide practical guidance for program improvement and continuous quality improvement across all such programs.
Experimental Performance Evaluation of a Supersonic Turbine for Rocket Engine Applications
NASA Technical Reports Server (NTRS)
Snellgrove, Lauren M.; Griffin, Lisa W.; Sieja, James P.; Huber, Frank W.
2003-01-01
In order to mitigate the risk of rocket propulsion development, efficient, accurate, and detailed fluid dynamics analysis and testing of the turbomachinery is necessary. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. These tools were applied to optimize a supersonic turbine design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain increased efficiency. The goal of the demonstration was to increase the total-to-static efficiency of the turbine by eight points over the baseline design. A sub-scale, cold-flow test article modeling the final optimized turbine was designed, manufactured, and tested in air at MSFC's Turbine Airflow Facility. Extensive on- and off-design point performance data, steady-state data, and unsteady blade loading data were collected during testing.
Supersonic civil airplane study and design: Performance and sonic boom
NASA Technical Reports Server (NTRS)
Cheung, Samson
1995-01-01
Since aircraft configuration plays an important role in aerodynamic performance and sonic boom shape, the configuration of the next-generation supersonic civil transport has to be tailored to meet both high aerodynamic performance and low sonic boom requirements. Computational fluid dynamics (CFD) can be used to design airplanes to meet these dual objectives. The work and results in this report support NASA's High Speed Research Program (HSRP). CFD tools and techniques have been developed for general use in sonic boom propagation study and aerodynamic design. In parallel with the research effort on sonic boom extrapolation, CFD flow solvers have been coupled with a numerical optimization tool to form a design package for aircraft configuration. This CFD optimization package has been applied to configuration design of a low-boom concept and an oblique all-wing concept. A nonlinear unconstrained optimizer for the Parallel Virtual Machine has been developed for aerodynamic design and study.
NASA Technical Reports Server (NTRS)
Williams, Jacob; Stewart, Shaun M.; Lee, David E.; Davis, Elizabeth C.; Condon, Gerald L.; Senent, Juan
2010-01-01
The National Aeronautics and Space Administration's (NASA) Constellation Program paves the way for a series of lunar missions leading to a sustained human presence on the Moon. The proposed mission design includes an Earth Departure Stage (EDS), a Crew Exploration Vehicle (Orion), and a lunar lander (Altair), which support the transfer to and from the lunar surface. This report addresses the design, development, and implementation of a new mission scan tool called the Mission Assessment Post Processor (MAPP) and its use to provide insight into the integrated (i.e., EDS-, Orion-, and Altair-based) mission cost as a function of various mission parameters and constraints. The Constellation architecture calls for semiannual launches to the Moon and will support a number of missions, beginning with 7-day sortie missions and culminating in a lunar outpost at a specified location. The operational lifetime of the Constellation Program can cover a period of decades, over which the Earth-Moon geometry (particularly the lunar inclination) will go through a complete cycle (i.e., the lunar nodal cycle lasting 18.6 years). This geometry variation, along with other parameters such as flight time, landing site location, and mission-related constraints, affects the outbound (Earth to Moon) and inbound (Moon to Earth) translational performance cost. The mission designer must determine the ability of the vehicles to perform lunar missions as a function of this complex set of interdependent parameters. Trade-offs among these parameters provide essential insights for properly assessing the ability of a mission architecture to meet desired goals and objectives. These trades also aid in determining the overall usable propellant required to support nominal and off-nominal missions over the entire operational lifetime of the program, and thus they support vehicle sizing.
Web-Based Tools for Data Visualization and Decision Support for South Asia
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.
2017-12-01
The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth-observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate Earth observations and in situ data to facilitate the deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes them an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, problems that are exacerbated at many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools that can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the derived information actionable. We also take advantage of emerging standards for sharing water information across the web, using the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs), so that other projects can both consume and extend the tools and data developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
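The "extract a time series at a selected point" step mentioned above can be sketched as a nearest-grid-cell lookup. This is an illustrative stand-in using plain Python lists; in practice the gridded variable would be read from a netCDF file (e.g., with the netCDF4 or xarray libraries), and all coordinate and data values here are made up.

```python
# Minimal sketch: extract a time series at a requested (lat, lon) from a
# gridded time-varying variable by picking the nearest grid cell.

lats = [10.0, 10.5, 11.0]          # grid latitudes (degrees)
lons = [76.0, 76.5, 77.0]          # grid longitudes (degrees)
# precip[time][lat_index][lon_index]; hypothetical values in mm/day
precip = [
    [[1.0, 2.0, 0.5], [0.0, 3.0, 1.5], [2.5, 0.0, 0.0]],
    [[0.5, 1.0, 0.0], [4.0, 2.0, 1.0], [0.0, 0.5, 3.0]],
]

def nearest_index(grid, value):
    """Index of the grid coordinate closest to the requested value."""
    return min(range(len(grid)), key=lambda i: abs(grid[i] - value))

def point_series(var, lats, lons, lat, lon):
    """Time series of the grid cell nearest to (lat, lon)."""
    i, j = nearest_index(lats, lat), nearest_index(lons, lon)
    return [frame[i][j] for frame in var]

print(point_series(precip, lats, lons, 10.6, 76.4))  # nearest cell is (10.5, 76.5)
```

A web app would run this lookup server-side on the netCDF variable and return the series as JSON for plotting in the browser.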
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1993-01-01
PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications; tools and performance models for the analysis, evaluation, and measurement of real-time systems; and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentation with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Daily, Jeffrey A.; Vishnu, Abhinav
2015-11-01
Global Arrays (GA) is a distributed-memory programming model that combines shared-memory-style programming with one-sided communication to create a set of tools that offer high performance with ease of use. GA exposes a relatively straightforward programming abstraction while supporting fully distributed data structures, locality of reference, and high-performance communication. GA was originally formulated in the early 1990s to provide a communication layer for the Northwest Chemistry (NWChem) suite of chemistry modeling codes that was being developed concurrently.
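The locality bookkeeping behind a GA-style distributed array can be sketched as a mapping from a global index to an owning rank and a local offset. This is an illustrative model of the idea only, not GA's actual API or distribution scheme; the simple 1-D block distribution and all function names are assumptions.

```python
# Sketch of the index arithmetic behind a block-distributed 1-D array:
# n global elements spread over nprocs ranks, with the first n % nprocs
# ranks holding one extra element.

def block_bounds(n, nprocs, rank):
    """Half-open [lo, hi) range of global indices owned by `rank`."""
    base, extra = divmod(n, nprocs)
    lo = rank * base + min(rank, extra)
    hi = lo + base + (1 if rank < extra else 0)
    return lo, hi

def owner(n, nprocs, gidx):
    """Rank owning global index `gidx`, plus the local offset there."""
    for rank in range(nprocs):
        lo, hi = block_bounds(n, nprocs, rank)
        if lo <= gidx < hi:
            return rank, gidx - lo
    raise IndexError(gidx)

# A one-sided get(gidx) resolves ownership locally like this, then issues
# a remote read from the owner's buffer without involving the owner's CPU.
print(owner(10, 3, 7))  # 10 elements over 3 ranks: blocks [0,4), [4,7), [7,10) -> (2, 0)
```

Because every rank can compute `owner` independently, no coordination message is needed before a one-sided get or put, which is a key source of GA's performance.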
Watershed Management Optimization Support Tool (WMOST) v3: User Guide
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, that is, accou...
Watershed Management Optimization Support Tool (WMOST) v3: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, accounting fo...
Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Measuring, managing and maximizing performance of mineral processing plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bascur, O.A.; Kennedy, J.P.
1995-12-31
The implementation of continuous quality improvement is the confluence of Total Quality Management, People Empowerment, Performance Indicators, and Information Engineering. The supporting information technologies allow a mineral processor to narrow the gap between management business objectives and the process control level. One of the most important contributors is the user-friendliness and flexibility of the personal computer in a client/server environment. This synergistic combination, when used for real-time performance monitoring, translates into production cost savings, improved communications, and enhanced decision support. Other savings come from reduced time to collect data and perform tedious calculations, the ability to act quickly on fresh data, and the ability to generate and validate data to be used by others. This paper presents an integrated view of plant management. The selection of the proper tools for continuous quality improvement is described. The process of selecting critical performance monitoring indices for improved plant performance is discussed. The importance of well-balanced technological improvement, personnel empowerment, total quality management, and organizational assets is stressed.
Human Factors Report: TMA Operational Evaluations 1996 and 1998
NASA Technical Reports Server (NTRS)
Lee, Katharine K.; Quinn, Cheryl M.; Hoang, Ty; Sanford, Beverly D.
2000-01-01
The Traffic Management Advisor (TMA) is a component of the Center-TRACON Automation System (CTAS), a suite of decision-support tools for the air traffic control (ATC) environment which is being developed at NASA Ames Research Center. TMA has been operational at the ATC facilities in Dallas/Fort Worth, Texas, since an operational field evaluation in 1996. The Operational Evaluation demonstrated significant benefits, including an approximately 5 percent increase in airport capacity. This report describes the human factors results from the 1996 Operational Evaluation and an investigation of TMA usage performed two years later, during the 1998 TMA Daily Use Field Survey. The results described are instructive for CTAS focused development, and provide valuable lessons for future research in ATC decision-support tools where it is critical to merge a well-defined, complex work environment with advanced automation.
Designing and evaluating a STEM teacher learning opportunity in the research university.
Hardré, Patricia L; Ling, Chen; Shehab, Randa L; Herron, Jason; Nanny, Mark A; Nollert, Matthias U; Refai, Hazem; Ramseyer, Christopher; Wollega, Ebisa D
2014-04-01
This study examines the design and evaluation strategies for a year-long teacher learning and development experience, including their effectiveness and efficiency, and offers recommendations for strategic redesign. Design characteristics include programmatic features and outcomes: cognitive, affective, and motivational processes; interpersonal and social development; and performance activities. Program participants were secondary math and science teachers, partnered with engineering faculty mentors, in a research university-based education and support program. Data from multiple sources demonstrated strengths and weaknesses in the design of the program's learning environment, including face-to-face and digital-tool interaction, on-site and distance community interactions, and strategic evaluation tools and systems. Implications are considered for the strategic design and evaluation of similar grant-funded research experiences intended to support teacher learning, development, and transfer.
Designing Real-time Decision Support for Trauma Resuscitations
Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.
2016-01-01
Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts will rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. 
A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among 26 emergency physicians (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it better than the CPOE tool on nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale in 84% of 50 expert–video pairs and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010
Rodríguez, Daniela C; Peterson, Lauren A
2016-05-06
Factors that influence the performance of community health workers (CHWs) delivering health services are not well understood. A recent logic model proposed categories of support from both the health sector and communities that influence CHW performance and program outcomes. This logic model has been used to review a growth monitoring program delivered by CHWs in Honduras, known as Atención Integral a la Niñez en la Comunidad (AIN-C). A retrospective review of AIN-C was conducted through a document desk review, supplemented with in-depth interviews. Documents were systematically coded using the categories from the logic model, and gaps were addressed through interviews. The authors reviewed the coded data for each category to analyze program details and outcomes and to identify potential issues and gaps in the logic model. Categories from the logic model were inconsistently represented, with more information available for the health sector than for communities. Context and input activities were not well documented. Information on health sector systems-level activities was available for governance but limited for other categories, while little was found on community systems-level activities. Most available information focused on program-level activities, with substantial data on technical support. Output, outcome, and impact data were drawn from various resources and suggest mixed results of AIN-C on indicators of interest. Assessing CHW performance through a desk review left gaps that could not be addressed regarding the relationship between activities and performance. There were critical characteristics of program design that made it contextually appropriate; however, it was difficult to identify clear links between AIN-C and malnutrition indicators. Regarding the logic model, several categories were too broad (e.g., technical support, context), and some aspects of AIN-C did not fit neatly into the logic model categories (e.g., political commitment, equity, flexibility in implementation).
The CHW performance logic model has potential as a tool for program planning and evaluation but would benefit from additional supporting tools and materials to facilitate and operationalize its use.
Sittig, Dean F; Ash, Joan S; Feblowitz, Joshua; Meltzer, Seth; McMullen, Carmit; Guappone, Ken; Carpenter, Jim; Richardson, Joshua; Simonaitis, Linas; Evans, R Scott; Nichol, W Paul; Middleton, Blackford
2011-01-01
Background Clinical decision support (CDS) is a valuable tool for improving healthcare quality and lowering costs. However, there is no comprehensive taxonomy of types of CDS and there has been limited research on the availability of various CDS tools across current electronic health record (EHR) systems. Objective To develop and validate a taxonomy of front-end CDS tools and to assess support for these tools in major commercial and internally developed EHRs. Study design and methods We used a modified Delphi approach with a panel of 11 decision support experts to develop a taxonomy of 53 front-end CDS tools. Based on this taxonomy, a survey on CDS tools was sent to a purposive sample of commercial EHR vendors (n=9) and leading healthcare institutions with internally developed state-of-the-art EHRs (n=4). Results Responses were received from all healthcare institutions and 7 of 9 EHR vendors (response rate: 85%). All 53 types of CDS tools identified in the taxonomy were found in at least one surveyed EHR system, but only 8 functions were present in all EHRs. Medication dosing support and order facilitators were the most commonly available classes of decision support, while expert systems (eg, diagnostic decision support, ventilator management suggestions) were the least common. Conclusion We developed and validated a comprehensive taxonomy of front-end CDS tools. A subsequent survey of commercial EHR vendors and leading healthcare institutions revealed a small core set of common CDS tools, but identified significant variability in the remainder of clinical decision support content. PMID:21415065
Support vector machine firefly algorithm based optimization of lens system.
Shamshirband, Shahaboddin; Petković, Dalibor; Pavlović, Nenad T; Ch, Sudheer; Altameem, Torki A; Gani, Abdullah
2015-01-01
Lens system design is an important factor in image quality. The main aspect of the lens system design methodology is the optimization procedure. Since optimization is a complex, nonlinear task, soft computing optimization algorithms can be used. There are many tools that can be employed to measure optical performance, but the spot diagram is the most useful; it gives an indication of the image of a point object. In this paper, the spot-size radius is considered the optimization criterion. An intelligent soft computing scheme, support vector machines (SVMs) coupled with the firefly algorithm (FFA), is implemented. The performance of the proposed estimators is confirmed by the simulation results. The proposed SVM-FFA model has been compared with support vector regression (SVR), artificial neural networks, and genetic programming methods. The results show that the SVM-FFA model performs more accurately than the other methodologies. Therefore, SVM-FFA can be used as an efficient soft computing technique in the optimization of lens system designs.
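A minimal sketch of the firefly algorithm applied to a spot-radius-style objective is shown below. The quadratic objective is a hypothetical stand-in for the ray-traced spot radius (which in the paper comes from optical simulation rather than a closed-form expression), and all parameter values are illustrative.

```python
import math
import random

random.seed(42)

def spot_radius(x):
    # Hypothetical stand-in for the spot-size radius as a function of two
    # coded lens parameters; minimum 0.05 at x = (1.2, 0.4).
    return (x[0] - 1.2) ** 2 + (x[1] - 0.4) ** 2 + 0.05

def firefly_minimize(f, dim=2, n=15, iters=80,
                     alpha=0.25, beta0=1.0, gamma=1.0):
    """Basic firefly algorithm: each firefly moves toward every brighter
    (lower-objective) firefly with distance-damped attractiveness
    beta0 * exp(-gamma * r^2), plus a shrinking random walk."""
    pop = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        vals = [f(x) for x in pop]
        for i in range(n):
            for j in range(n):
                if vals[j] < vals[i]:   # j is brighter, so i moves toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a)
                              + alpha * (random.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    vals[i] = f(pop[i])
        alpha *= 0.97                   # damp the random walk over time
    return min(pop, key=f)

best = firefly_minimize(spot_radius)
print(best, spot_radius(best))
```

In the paper's setting, the SVM would serve as a fast surrogate for the expensive optical evaluation; here the objective is called directly for brevity.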
NASA Astrophysics Data System (ADS)
Babanova, Sofia; Artyushkova, Kateryna; Ulyanova, Yevgenia; Singhal, Sameer; Atanassov, Plamen
2014-01-01
Two statistical methods, design of experiments (DOE) and principal component analysis (PCA), are employed to investigate and improve the performance of air-breathing gas-diffusional enzymatic electrodes. DOE is utilized as a tool for the systematic organization and evaluation of the various factors affecting the performance of the composite system. Based on the results from the DOE, an improved cathode is constructed. The current density generated using the improved cathode (755 ± 39 μA cm⁻² at 0.3 V vs. Ag/AgCl) is 2-5 times higher than the highest current density previously achieved. Three major factors contributing to cathode performance are identified: the amount of enzyme, the volume of phosphate buffer used to immobilize the enzyme, and the thickness of the gas-diffusion layer (GDL). PCA is applied as an independent confirmation tool to support the conclusions drawn from DOE and to visualize the contribution of each factor in individual cathode configurations.
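The main-effects calculation at the heart of such a two-level factorial DOE can be sketched as follows. The 2^3 design, factor names, and response values below are all hypothetical; the paper's actual design and data are not reproduced here.

```python
# Sketch of DOE main effects for a 2^3 full factorial: three factors
# (enzyme amount, buffer volume, GDL thickness) coded -1 (low) / +1 (high).
# Responses are made-up current densities, not the paper's measurements.

from itertools import product

runs = list(product([-1, 1], repeat=3))               # 8 level combinations
responses = [310, 420, 350, 540, 300, 450, 380, 610]  # hypothetical uA/cm^2

def main_effect(factor, runs, responses):
    """Mean response at the factor's high level minus mean at its low
    level: the factor's average influence across all other settings."""
    high = [y for x, y in zip(runs, responses) if x[factor] == 1]
    low = [y for x, y in zip(runs, responses) if x[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

for name, k in [("enzyme", 0), ("buffer", 1), ("GDL", 2)]:
    print(name, main_effect(k, runs, responses))
```

Ranking the absolute main effects is what identifies the dominant factors; with these invented responses, GDL thickness would dominate (effect 170), followed by buffer volume (100) and enzyme amount (30).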
Psychological and Behavioral Health Issues of Long-Duration Space Missions
NASA Technical Reports Server (NTRS)
Eksuzian, Daniel J.
1998-01-01
It will be the responsibility of the long-duration spaceflight crew to take the actions necessary to maintain their health and well-being and to cope with medical emergencies without direct assistance from support personnel, including maintaining mental health and managing physiological and psychological changes that may impair decision making and performance. The Behavior and Performance Integrated Product Team at Johnson Space Center, working within the Space Medicine, Monitoring, and Countermeasures Program, has identified critical questions pertaining to long-duration space crew behavioral health, psychological adaptation, human factors and habitability, and sleep and circadian rhythms. Among the projects addressing these questions are: the development of tools to assess cognitive function during space missions; the development of a model of psychological adaptation in isolated and confined environments; tools and methods for selecting individuals and teams well suited for long-duration missions; identification of mission-critical tasks and performance evaluation; and measures of sleep quality and their correlation with mission performance.
Toward High-Performance Communications Interfaces for Science Problem Solving
NASA Astrophysics Data System (ADS)
Oviatt, Sharon L.; Cohen, Adrienne O.
2010-12-01
From a theoretical viewpoint, educational interfaces that facilitate communicative actions involving representations central to a domain can maximize the effort students devote to constructing new schemas. In addition, interfaces that minimize the working memory demands of the interface itself, for example by mimicking existing non-digital work practice, can preserve students' attentional focus on their learning task. In this research, we asked: what type of interface input capabilities best support science problem solving in both low- and high-performing students? High school students' ability to solve a diverse range of biology problems was compared over longitudinal sessions while they used: (1) hardcopy paper and pencil, (2) a digital paper and pen interface, (3) a pen tablet interface, and (4) a graphical tablet interface. Post-test evaluations revealed that time to solve problems, meta-cognitive control, solution correctness, and memory were all significantly enhanced when using the digital pen and paper interface, compared with the tablet interfaces. The tangible pen and paper interface also was the only alternative that significantly facilitated skill acquisition in low-performing students. Paradoxically, all students nonetheless believed that the tablet interfaces provided the best support for their performance, revealing a lack of self-awareness about how to use computational tools to best advantage. Implications are discussed for how pen interfaces can be optimized for future educational purposes, and for establishing technology fluency curricula to improve students' awareness of the impact of digital tools on their performance.
Evaluation of the Terminal Precision Scheduling and Spacing System for Near-Term NAS Application
NASA Technical Reports Server (NTRS)
Thipphavong, Jane; Martin, Lynne Hazel; Swenson, Harry N.; Lin, Paul; Nguyen, Jimmy
2012-01-01
NASA has developed a capability for terminal area precision scheduling and spacing (TAPSS) to provide higher capacity and more efficiently manage arrivals during peak demand periods. This advanced technology is NASA's vision for the NextGen terminal metering capability. A set of human-in-the-loop experiments was conducted to evaluate the performance of the TAPSS system for near-term implementation. The experiments evaluated the TAPSS system under the current terminal routing infrastructure to validate operational feasibility. A second goal of the study was to measure the benefit of the Center and TRACON advisory tools to help prioritize the requirements for controller radar display enhancements. Simulation results indicate that using the TAPSS system provides benefits under current operations, supporting a 10% increase in airport throughput. Enhancements to Center decision support tools had limited impact on improving the efficiency of terminal operations, but did provide more fuel-efficient advisories to achieve scheduling conformance within 20 seconds. The TRACON controller decision support tools were found to provide the most benefit, by improving the precision in schedule conformance to within 20 seconds, reducing the number of arrivals having lateral path deviations by 50% and lowering subjective controller workload. Overall, the TAPSS system was found to successfully develop an achievable terminal arrival metering plan that was sustainable under heavy traffic demand levels and reduce the complexity of terminal operations when coupled with the use of the terminal controller advisory tools.
Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo
2016-07-08
Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including reliance on BLAST-like tools for homology tests and lack of support for primer ranking, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the top-ranked primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
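To give a flavor of the "stringent filtering constraints" and ranking mentioned above, here is a minimal sketch. It is not the MRPrimerW or MRPrimer algorithm (which also performs large-scale homology testing via MapReduce); it only shows the kind of single-primer filters commonly applied in qPCR design, such as length, GC content, and a rough melting temperature, followed by a simple ranking. All thresholds and function names are illustrative assumptions.

```python
# Illustrative primer filtering and ranking sketch (not MRPrimerW's algorithm).

def gc_content(seq):
    """Fraction of G/C bases in the primer sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Rough melting temperature via the Wallace rule: 2*(A+T) + 4*(G+C)."""
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def passes_filters(seq, min_len=18, max_len=24,
                   min_gc=0.4, max_gc=0.6, min_tm=55, max_tm=65):
    """Apply simple length, GC-content, and Tm constraints."""
    return (min_len <= len(seq) <= max_len
            and min_gc <= gc_content(seq) <= max_gc
            and min_tm <= wallace_tm(seq) <= max_tm)

def rank_candidates(candidates, target_tm=60):
    """Keep passing candidates, ranked by distance from a target Tm."""
    kept = [s for s in candidates if passes_filters(s)]
    return sorted(kept, key=lambda s: abs(wallace_tm(s) - target_tm))

primers = ["ATGCGTACGTTAGCCTAGCA", "GGGGGGGGGGGGGGGGGG", "ATATATATATATATATATAT"]
print(rank_candidates(primers))  # only the balanced-GC candidate survives
```

A production tool would additionally screen for self-dimers, hairpins, 3'-end stability, and, as in MRPrimerW, off-target homology across the whole genome.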
Case Report: SPECT/CT as the New Diagnostic Tool for Specific Wrist Pathology.
Linde, Musters; Ten Broek, M; Kraan, G A
2017-01-01
Single photon emission computed tomography (SPECT) has been regarded as a promising diagnostic tool in orthopaedic pathology since the early 1990s. Computed tomography, when combined with SPECT, gives insight into the specific site of wrist pathology. The literature already supports the introduction of SPECT/CT in wrist pathology, but clinical application is lagging. A 40-year-old patient first presented in 2004 with persisting pain after a right distal radius fracture. Several diagnostic and operative interventions were performed, all unsuccessful. Because of the persisting pain, a SPECT/CT was performed, which showed a cyst in the hamate bone; the cyst was successfully enucleated. The patient was finally pain free at recent follow-up, with a QuickDASH score of 43 and a PRWHE-DLV score of 58/150. In this case report, SPECT/CT proved a very sensitive diagnostic tool for specific pathology of the wrist. It offered precise localisation, thereby confirming the clinically suspected diagnosis and enabling successful treatment.
Lynch, Abigail J.; Taylor, William W.; McCright, Aaron M.
2016-01-01
Decision support tools can aid decision making by systematically incorporating information, accounting for uncertainties, and facilitating evaluation of alternatives. Without user buy-in, however, decision support tools can fail to influence decision-making processes. We surveyed fishery researchers, managers, and fishers affiliated with the Lake Whitefish Coregonus clupeaformis fishery in the 1836 Treaty Waters of Lakes Huron, Michigan, and Superior to assess opinions about current and future management needs and to identify barriers to, and opportunities for, developing a decision support tool based on Lake Whitefish recruitment projections under climate change. Approximately 64% of 39 respondents were satisfied with current management, and nearly 85% agreed that science was well integrated into management programs. Though decision support tools can facilitate science integration into management, respondents suggested that such tools face significant implementation barriers, including lack of political will to change management and perceived uncertainty in decision support outputs. Recommendations from this survey can inform development of decision support tools for fishery management in the Great Lakes and other regions.