DOE Office of Scientific and Technical Information (OSTI.GOV)
Draeger, Erik W.
This report documents the completion of the work in creating a strategic plan and beginning customer engagements. The milestone description is: The newly formed Advanced Architecture and Portability Specialists (AAPS) team will develop a strategic plan to meet the goals of 1) sharing knowledge and experience with code teams to ensure that ASC codes run well on new architectures, and 2) supplying skilled computational scientists to put the strategy into practice. The plan will be delivered to ASC management in the first quarter. By the fourth quarter, the team will identify their first customers within PEM and IC, perform an initial assessment of scalability and performance bottlenecks for next-generation architectures, and embed AAPS team members with customer code teams to assist with initial portability development within standalone kernels or proxy applications.
Team interaction during surgery: a systematic review of communication coding schemes.
Tiferes, Judith; Bisantz, Ann M; Guru, Khurshid A
2015-05-15
Communication problems have been systematically linked to human errors in surgery and a deep understanding of the underlying processes is essential. Although a number of tools exist to assess nontechnical skills, methods to study communication and other team-related processes are far from being standardized, making comparisons challenging. We conducted a systematic review to analyze methods used to study events in the operating room (OR) and to develop a synthesized coding scheme for OR team communication. Six electronic databases were accessed to search for articles that collected individual events during surgery and included detailed coding schemes. Additional articles were added based on cross-referencing. That collection was then classified based on type of events collected, environment type (real or simulated), number of procedures, type of surgical task, team characteristics, method of data collection, and coding scheme characteristics. All dimensions within each coding scheme were grouped based on emergent content similarity. Categories drawn from articles, which focused on communication events, were further analyzed and synthesized into one common coding scheme. A total of 34 of 949 articles met the inclusion criteria. The methodological characteristics and coding dimensions of the articles were summarized. A priori coding was used in nine studies. The synthesized coding scheme for OR communication included six dimensions as follows: information flow, period, statement type, topic, communication breakdown, and effects of communication breakdown. The coding scheme provides a standardized coding method for OR communication, which can be used to develop a priori codes for future studies especially in comparative effectiveness research. Copyright © 2015 Elsevier Inc. All rights reserved.
Improving the Performance of Online Learning Teams--A Discourse Analysis
ERIC Educational Resources Information Center
Liu, Ying Chieh; Burn, Janice M.
2007-01-01
This paper compares the processes of Face-To-Face (FTF) teams and Online Learning Teams (OLTs) and proposes methods to improve the performance of OLTs. An empirical study reviewed the performance of fifteen FTF teams and OLTs and their communication patterns were coded by the TEMPO system developed by Futoran et al. (1989) in order to develop a…
Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken
2017-10-16
Pediatric code blue activations are infrequent events with a high mortality rate despite the best effort of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in percentage of sessions adhering to ventilation rate and chest compression rate and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.
Ways with Data: Understanding Coding as Writing
ERIC Educational Resources Information Center
Lindgren, Chris
2017-01-01
In this dissertation, I report findings from an exploratory case-study about Ray, a web developer, who works on a data-driven news team that finds and tells compelling stories with large sets of data. I implicate this case of Ray's coding on a data team in a writing studies epistemology, which is guided by the following question: "What might…
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Huber, Frank W.
1992-01-01
The current status of the activities and future plans of the Turbine Technology Team of the Consortium for Computational Fluid Dynamics is reviewed. The activities of the Turbine Team focus on developing and enhancing codes and models, obtaining data for code validation and general understanding of flows through turbines, and developing and analyzing the aerodynamic designs of turbines suitable for use in the Space Transportation Main Engine fuel and oxidizer turbopumps. Future work will include the experimental evaluation of the oxidizer turbine configuration, the development, analysis, and experimental verification of concepts to control secondary and tip losses, and the aerodynamic design, analysis, and experimental evaluation of turbine volutes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draeger, E. W.
The Advanced Architecture and Portability Specialists (AAPS) team worked with a select set of LLNL application teams to develop and/or implement a portability strategy for next-generation architectures. The team also investigated new and updated programming models and helped develop programming abstractions targeting maintainability and performance portability. Significant progress was made on both fronts in FY17, leaving multiple applications significantly better prepared for the next-generation machines than before.
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on developing better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and on interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.
Fischer, Ute; McDonnell, Lori; Orasanu, Judith
2007-05-01
Approaches to mitigating the likelihood of psychosocial problems during space missions emphasize preflight measures such as team training and team composition. Additionally, it may be necessary to monitor team interactions during missions for signs of interpersonal stress. The present research was conducted to identify features in team members' communications indicative of team functioning. Team interactions were studied in the context of six computer-simulated search and rescue missions. There were 12 teams of 4 U.S. men who participated; however, the present analyses contrast the top two teams with the two least successful teams. Communications between team members were analyzed using linguistic analysis software and a coding scheme developed to characterize task-related and social dimensions of team interactions. Coding reliability was established by having two raters independently code three transcripts. Between-rater agreement ranged from 78.1 to 97.9%. Team performance was significantly associated with team members' task-related communications, specifically with the extent to which task-critical information was shared. Successful and unsuccessful teams also showed different interactive patterns, in particular concerning the frequencies of elaborations and no-responses. Moreover, task success was negatively correlated with variability in team members' word count, and positively correlated with the number of positive emotion words and the frequency of assenting relative to dissenting responses. Analyses isolated certain task-related and social features of team communication related to team functioning. Team success was associated with the extent to which team members shared task-critical information, equally participated and built on each other's contributions, showed agreement, and positive affect.
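Between-rater agreement figures like those reported above (78.1 to 97.9%) are commonly computed as simple percent agreement over independently coded units. A minimal sketch of that calculation, with entirely hypothetical codes and utterances:

```python
def percent_agreement(coder_a, coder_b):
    """Percentage of units on which two raters assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("rating lists must be the same length")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical codes assigned by two raters to ten utterances
a = ["info", "ack", "info", "elab", "none", "info", "ack", "elab", "info", "ack"]
b = ["info", "ack", "elab", "elab", "none", "info", "ack", "elab", "info", "none"]
print(percent_agreement(a, b))  # 80.0
```

Note that percent agreement does not correct for chance; studies of this kind often also report a chance-corrected statistic such as Cohen's kappa.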
Re-engineering NASA's space communications to remain viable in a constrained fiscal environment
NASA Astrophysics Data System (ADS)
Hornstein, Rhoda Shaller; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.
1994-11-01
Along with the Red and Blue Teams commissioned by the NASA Administrator in 1992, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo, including current work processes, functional distinctions, interfaces, and information flow, as well as traditional management and system development practices. The Blue Team's unconstrained, non-parochial, and imaginative look at NASA's space communications program produced a simplified representation of the space communications infrastructure that transcends organizational and functional boundaries, in addition to existing systems and facilities. Further, the Blue Team adapted the 'faster, better, cheaper' charter to be relevant to the multi-mission, continuous nature of the space communications program and to serve as a gauge for improving customer services concurrent with achieving more efficient operations and infrastructure life cycle economies. This simplified representation, together with the adapted metrics, offers a future view and process model for reengineering NASA's space communications to remain viable in a constrained fiscal environment. Code O remains firm in its commitment to improve productivity, effectiveness, and efficiency. In October 1992, the Associate Administrator reconstituted the Blue Team as the Code O Success Team (COST) to serve as a catalyst for change. In this paper, the COST presents the chronicle and significance of the simplified representation and adapted metrics, and their application during the FY 1993-1994 activities.
Developing patient-centered teams: The role of sharing stories about patients and patient care.
Bennett, Ariana H; Hassinger, Jane A; Martin, Lisa A; Harris, Lisa H; Gold, Marji
2015-09-01
Research indicates that health care teams are good for staff, patients, and organizations. The characteristics that make teams effective include shared objectives, mutual respect, clarity of roles, communication, trust, and collaboration. We were interested in examining how teams develop these positive characteristics. This paper explores the role of sharing stories about patients in developing patient-centered teams. Data for this paper came from 1 primary care clinic as part of a larger Providers Share Workshop study conducted by the University of Michigan. Each workshop included 5 facilitated group sessions in which staff met to talk about their work. This paper analyzes qualitative data from the workshops. Through an iterative process, research team members identified major themes, developed a coding scheme, and coded transcripts for qualitative data analysis. One of the most powerful ways group members connected was through sharing stories about their patients. Sharing clinical cases and stories helped participants bond around their shared mission of patient-centered care, build supportive relationships, enhance compassion for patients, communicate and resolve conflict, better understand workflows and job roles, develop trust, and increase morale. These attributes highlighted by participants correspond to those documented in the literature as important elements of teambuilding and key indicators of team effectiveness. The sharing of stories about patients seems to be a promising tool for positive team development in a primary care clinical setting and should be investigated further. (c) 2015 APA, all rights reserved.
2017-04-19
In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's second annual Swarmathon, 20 teams representing 22 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.
2018-04-18
In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.
2018-04-17
In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.
New Tool Released for Engine-Airframe Blade-Out Structural Simulations
NASA Technical Reports Server (NTRS)
Lawrence, Charles
2004-01-01
Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to bladeoff events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the bladeout test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications with manufacturers exchanging models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor so that the team members would be relieved of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. 
Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.
Allan, Catherine K; Thiagarajan, Ravi R; Beke, Dorothy; Imprescia, Annette; Kappus, Liana J; Garden, Alexander; Hayes, Gavin; Laussen, Peter C; Bacha, Emile; Weinstock, Peter H
2010-09-01
Resuscitation of pediatric cardiac patients involves unique and complex physiology, requiring multidisciplinary collaboration and teamwork. To optimize team performance, we created a multidisciplinary Crisis Resource Management training course that addressed both teamwork and technical skill needs for the pediatric cardiac intensive care unit. We sought to determine whether participation improved caregiver comfort and confidence levels regarding future resuscitation events. We developed a simulation-based, in situ Crisis Resource Management curriculum using pediatric cardiac intensive care unit scenarios and unit-specific resuscitation equipment, including an extracorporeal membrane oxygenation circuit. Participants replicated the composition of a clinical team. Extensive video-based debriefing followed each scenario, focusing on teamwork principles and technical resuscitation skills. Pre- and postparticipation questionnaires were used to determine the effects on participants' comfort and confidence regarding participation in future resuscitations. A total of 182 providers (127 nurses, 50 physicians, 2 respiratory therapists, 3 nurse practitioners) participated in the course. All participants scored the usefulness of the program and scenarios as 4 of 5 or higher (5 = most useful). There was significant improvement in participants' perceived ability to function as a code team member and confidence in a code (P < .001). Participants reported they were significantly more likely to raise concerns about inappropriate management to the code leader (P < .001). We developed a Crisis Resource Management training program in a pediatric cardiac intensive care unit to teach technical resuscitation skills and improve team function. Participants found the experience useful and reported improved ability to function in a code. 
Further work is needed to determine whether participation in the Crisis Resource Management program objectively improves team function during real resuscitations. 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
2017-04-19
A sign at the Kennedy Space Center Visitor Complex announces the second annual Swarmathon competition. Students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of cubes with AprilTags, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's second annual Swarmathon, 20 teams representing 22 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.
Pritt, Stacy L; Mackta, Jayne
2010-05-01
Business models for transnational organizations include linking different geographies through common codes of conduct, policies, and virtual teams. Global companies with laboratory animal science activities (whether outsourced or performed in-house) often see the need for these business activities in relation to animal-based research and benefit from them. Global biomedical research organizations can learn how to better foster worldwide cooperation and teamwork by understanding and working with sociocultural differences in ethics and by knowing how to facilitate appropriate virtual team actions. Associated practices include implementing codes and policies that transcend cultural, ethnic, or other boundaries, and equipping virtual teams with the technology, support, and rewards needed to ensure timely and productive work that ultimately promotes good science and patient safety in drug development.
Increasing Flexibility in Energy Code Compliance: Performance Packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Rosenberg, Michael I.
Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated and more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, following the performance path still imposes significant design-team overhead, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models that feed a parametric decision analysis to determine a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.
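The selection step described above, choosing packages whose modeled energy use matches a target, can be illustrated with a small filter over parametric-run results. The package names, energy use intensity (EUI) values, and tolerance below are hypothetical, not drawn from the paper:

```python
def equivalent_packages(candidates, baseline_eui, tolerance=0.02):
    """Select measure packages whose modeled energy use intensity (EUI)
    is at or below the baseline target, within a small tolerance."""
    return [name for name, eui in candidates.items()
            if eui <= baseline_eui * (1 + tolerance)]

# Hypothetical modeled EUIs (kBtu/ft2-yr) from prototype-building runs
runs = {"pkg_A": 38.1, "pkg_B": 39.5, "pkg_C": 37.5, "pkg_D": 41.0}
print(equivalent_packages(runs, baseline_eui=39.0))  # ['pkg_A', 'pkg_B', 'pkg_C']
```

In practice each candidate EUI would come from a full building-energy simulation of a prototype model, and the tolerance would be set by the code-development process.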
2018-04-17
Students from Montgomery College in Rockville, Maryland, follow the progress of their Swarmie robots during the Swarmathon competition at the Kennedy Space Center Visitor Complex. Students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.
2018-04-18
In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. To add to the challenge, obstacles in the form of simulated rocks were placed in the competition arena. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.
Pan, Yi-Ling; Hwang, Ai-Wen; Simeonsson, Rune J; Lu, Lu; Liao, Hua-Fang
2015-01-01
Comprehensive description of functioning is important in providing early intervention services for infants with developmental delay/disabilities (DD). A code set of the International Classification of Functioning, Disability and Health: Children and Youth Version (ICF-CY) could facilitate the practical use of the ICF-CY in team evaluation. The purpose of this study was to derive an ICF-CY code set for infants under three years of age with early delay and disabilities (EDD Code Set) for initial team evaluation. The EDD Code Set based on the ICF-CY was developed on the basis of a Delphi survey of international professionals experienced in implementing the ICF-CY and professionals in early intervention service system in Taiwan. Twenty-five professionals completed the Delphi survey. A total of 82 ICF-CY second-level categories were identified for the EDD Code Set, including 28 categories from the domain Activities and Participation, 29 from body functions, 10 from body structures and 15 from environmental factors. The EDD Code Set of 82 ICF-CY categories could be useful in multidisciplinary team evaluations to describe functioning of infants younger than three years of age with DD, in a holistic manner. Future validation of the EDD Code Set and examination of its clinical utility are needed. The EDD Code Set with 82 essential ICF-CY categories could be useful in the initial team evaluation as a common language to describe functioning of infants less than three years of age with developmental delay/disabilities, with a more holistic view. The EDD Code Set including essential categories in activities and participation, body functions, body structures and environmental factors could be used to create a functional profile for each infant with special needs and to clarify the interaction of child and environment accounting for the child's functioning.
Code Blue Emergencies: A Team Task Analysis and Educational Initiative.
Price, James W; Applegarth, Oliver; Vu, Mark; Price, John R
2012-01-01
The objective of this study was to identify factors that have a positive or negative influence on resuscitation team performance during emergencies in the operating room (OR) and post-operative recovery unit (PAR) at a major Canadian teaching hospital. This information was then used to implement a team training program for code blue emergencies. In 2009/10, all OR and PAR nurses and 19 anesthesiologists at Vancouver General Hospital (VGH) were invited to complete an anonymous, 10-minute written questionnaire regarding their code blue experience. Survey questions were devised by 10 recovery room and operating room nurses as well as 5 anesthesiologists representing 4 different hospitals in British Columbia. Three iterations of the survey were reviewed by a pilot group of nurses and anesthesiologists, and their feedback was integrated into the final version of the survey. Both nursing staff (n = 49) and anesthesiologists (n = 19) supported code blue training and believed that team training would improve patient outcome. Nurses noted that it was often difficult to identify the leader of the resuscitation team. Both nursing staff and anesthesiologists strongly agreed that too many people attending the code blue with no assigned role hindered team performance. Identifiable leadership and clear communication of roles were identified as keys to resuscitation team functioning. Decreasing the number of people attending code blue emergencies with no specific role, increased access to mock code blue training, and debriefing after crises were all identified as areas requiring improvement. Initial team training exercises have been well received by staff.
Life Sciences Centrifuge Facility review
NASA Technical Reports Server (NTRS)
Young, Laurence R.
1994-01-01
The Centrifuge Facility Project at ARC was reviewed by a Code U team to determine its appropriateness and adequacy for the ISSA. This report presents the findings of one consultant to that team and concentrates on scientific and technical risks. The report supports continuation of the project to the next phase of development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, J.N.; Holderness, J.H.; James, D.W.
1992-12-01
Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses, or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured cobalt-60 (Co-60) and cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10CFR61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
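A scaling factor of the kind described above is, at its core, the ratio of a hard-to-measure nuclide's activity to that of an easily measured key nuclide such as Co-60 or Cs-137. A minimal sketch of the ratio (the nuclide choice and activity values are hypothetical, not taken from the RADSOURCE models):

```python
def scaling_factor(nuclide_activity, key_activity):
    """Ratio of a hard-to-measure nuclide's activity to an easily
    measured key nuclide (e.g. Co-60 or Cs-137) in the same stream."""
    return nuclide_activity / key_activity

# Hypothetical activities in a waste stream sample (uCi/g)
co60 = 4.0e-2   # measured by gamma spectroscopy
ni63 = 1.2e-1   # hard-to-measure beta emitter, hypothetical value
sf = scaling_factor(ni63, co60)
print(round(sf, 6))  # 3.0, i.e. Ni-63 estimated as 3.0 x measured Co-60
```

Once established, such a factor lets routine gamma-isotopic measurements of the key nuclide stand in for direct assay of the hard-to-measure nuclides when classifying waste.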
NASA Technical Reports Server (NTRS)
Campbell, David; Wysong, Ingrid; Kaplan, Carolyn; Mott, David; Wadsworth, Dean; VanGilder, Douglas
2000-01-01
An AFRL/NRL team has recently been selected to develop a scalable, parallel, reacting, multidimensional (SUPREM) Direct Simulation Monte Carlo (DSMC) code for the DoD user community under the High Performance Computing Modernization Office (HPCMO) Common High Performance Computing Software Support Initiative (CHSSI). This paper will introduce the JANNAF Exhaust Plume community to this three-year development effort and present the overall goals, schedule, and current status of this new code.
NASA Astrophysics Data System (ADS)
Brewer, Denise
The air transport industry (ATI) is a dynamic, communal, international, and intercultural environment in which the daily operations of airlines, airports, and service providers are dependent on information technology (IT). Many of the IT legacy systems are more than 30 years old, and current regulations and the globally distributed workplace have brought profound changes to the way the ATI community interacts. The purpose of the study was to identify the areas of resistance to change in the ATI community and the corresponding factors in change management requirements that minimize product development delays and lead to a successful and timely shift from legacy to open web-based systems in upgrading ATI operations. The research questions centered on product development team processes as well as the members' perceived need for acceptance of change. A qualitative case study approach rooted in complexity theory was employed using a single case of an intercultural product development team dispersed globally. Qualitative data gathered from questionnaires were organized using NVivo software, which coded the words and themes. Once coded, themes emerged identifying the areas of resistance within the product development team. Results of follow-up interviews with team members suggest that intercultural relationship building prior to and during project execution, focus on common team goals, and development of relationships to enhance interpersonal respect, understanding, and overall communication help overcome resistance to change. Positive social change, in the form of intercultural group effectiveness evidenced in increased team functioning during major project transitions, is likely to result when global managers devote time to cultural understanding.
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Improving the accuracy of operation coding in surgical discharge summaries
Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine
2014-01-01
Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286
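The departmental coding table described in this abstract can be modeled as a simple lookup that fails loudly when a procedure is missing, prompting the discharge-summary author to consult the coding team. The procedure names and OPCS codes below are placeholders for illustration, not the actual table used by the department in the study.

```python
# Illustrative sketch of a departmental OPCS coding table as a lookup.
# Procedure names and codes here are placeholders, not the department's
# real table; a production table would come from the clinical coding team.

OPCS_TABLE = {
    "wide local excision": "B28.3",   # placeholder code
    "mastectomy": "B27.1",            # placeholder code
    "sentinel node biopsy": "T87.3",  # placeholder code
}

def code_for(procedure):
    """Return the OPCS code for a procedure, or raise if it is not listed,
    so the discharge-summary author checks with the coding team instead of
    guessing a tariff-bearing code."""
    try:
        return OPCS_TABLE[procedure.lower()]
    except KeyError:
        raise KeyError(
            f"'{procedure}' not in department coding table; "
            "confirm with the clinical coding team"
        )

print(code_for("Mastectomy"))  # B27.1
```

Failing on unknown procedures, rather than defaulting, mirrors the audit's finding that incorrect codes (not missing ones) drove the lost tariff income.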
A proto-code of ethics and conduct for European nurse directors.
Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena
2012-03-01
The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning, and multi-sectorial working.
Current Status of Japan's Activity for GPM/DPR and Global Rainfall Map algorithm development
NASA Astrophysics Data System (ADS)
Kachi, M.; Kubota, T.; Yoshida, N.; Kida, S.; Oki, R.; Iguchi, T.; Nakamura, K.
2012-04-01
The Global Precipitation Measurement (GPM) mission is composed of two categories of satellites: 1) a Tropical Rainfall Measuring Mission (TRMM)-like non-sun-synchronous orbit satellite (the GPM Core Observatory); and 2) a constellation of satellites carrying microwave radiometer instruments. The GPM Core Observatory carries the Dual-frequency Precipitation Radar (DPR), which is being developed by the Japan Aerospace Exploration Agency (JAXA) and the National Institute of Information and Communications Technology (NICT), and a microwave radiometer provided by the National Aeronautics and Space Administration (NASA). The GPM Core Observatory will be launched in February 2014, and development of algorithms is underway. The DPR Level 1 algorithm, which produces the DPR L1B product including received power, is being developed by JAXA. The first version was submitted in March 2011. Development of the second version of the DPR L1B algorithm (Version 2) will be completed in March 2012. The Version 2 algorithm includes all basic functions, a preliminary database, the HDF5 I/F, and minimum error handling. The pre-launch code will be developed by the end of October 2012. The DPR Level 2 algorithm is being developed by the DPR Algorithm Team led by Japan, which falls under the NASA-JAXA Joint Algorithm Team. The first version of the GPM/DPR Level-2 Algorithm Theoretical Basis Document was completed in November 2010. The second version, the "Baseline code", was completed in January 2012. The Baseline code includes the main module and eight basic sub-modules (Preparation, Vertical Profile, Classification, SRT, DSD, Solver, Input, and Output modules). The Level-2 algorithms will provide KuPR-only products, KaPR-only products, and Dual-frequency Precipitation products, with estimated precipitation rate, radar reflectivity, and precipitation information such as drop size distribution and bright-band height.
It is important to develop an algorithm applicable to both TRMM/PR and KuPR in order to produce a long-term continuous data set. The pre-launch code will be developed by autumn 2012. The Global Rainfall Map algorithm has been developed by the Global Rainfall Map Algorithm Development Team in Japan. The algorithm builds on the heritage of the Global Satellite Mapping of Precipitation (GSMaP) project conducted between 2002 and 2007 and of the near-real-time version operating at JAXA since 2007. The "Baseline code" uses the current operational GSMaP code (V5.222), and its development was completed in January 2012. The pre-launch code will be developed by autumn 2012, including updates of the databases for rain-type classification and rain/no-rain classification, and the introduction of rain-gauge correction.
Job Analysis Results for Malicious-Code Reverse Engineers: A Case Study
2014-05-01
Because the ability to predict the development of expertise is important, job analysis of this workforce is needed. Currently, job analysis research on teams of malicious-code reverse engineers is lacking.
A systems engineering initiative for NASA's space communications
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Hei, Donald J., Jr.; Kelly, Angelita C.; Lightfoot, Patricia C.; Bell, Holland T.; Cureton-Snead, Izeller E.; Hurd, William J.; Scales, Charles H.
1993-01-01
In addition to but separate from the Red and Blue Teams commissioned by the NASA Administrator, NASA's Associate Administrator for Space Communications commissioned a Blue Team to review the Office of Space Communications (Code O) Core Program and determine how the program could be conducted faster, better, and cheaper, without compromising safety. Since there was no corresponding Red Team for the Code O Blue Team, the Blue Team assumed a Red Team independent attitude and challenged the status quo. The Blue Team process and results are summarized. The Associate Administrator for Space Communications subsequently convened a special management session to discuss the significance and implications of the Blue Team's report and to lay the groundwork and teamwork for the next steps, including the transition from engineering systems to systems engineering. The methodology and progress toward realizing the Code O Family vision and accomplishing the systems engineering initiative for NASA's space communications are presented.
Cole, Kenneth D; Waite, Martha S; Nichols, Linda O
2003-01-01
For a nationwide Geriatric Interdisciplinary Team Training (GITT) program evaluation of 8 sites and 26 teams, team evaluators developed a quantitative and qualitative team observation scale (TOS), examining structure, process, and outcome, with specific focus on the training function. Qualitative data provided an important expansion of quantitative data, highlighting positive effects that were not statistically significant, such as role modeling and training occurring within the clinical team. Qualitative data could also identify "too much" of a coded variable, such as time spent in individual team members' assessments and treatment plans. As healthcare organizations have increasing demands for productivity and changing reimbursement, traditional models of teamwork, with large teams and structured meetings, may no longer be as functional as they once were. To meet these constraints and to train students in teamwork, teams of the future will have to make choices, from developing and setting specific models to increasing the use of information technology to create virtual teams. Both quantitative and qualitative data will be needed to evaluate these new types of teams and the important outcomes they produce.
Digital Controller For Emergency Beacon
NASA Technical Reports Server (NTRS)
Ivancic, William D.
1990-01-01
Prototype digital controller intended for use in 406-MHz emergency beacon. Undergoing development according to international specifications, 406-MHz emergency beacon system includes satellites providing worldwide monitoring of beacons, with Doppler tracking to locate each beacon within 5 km. Controller turns beacon on and off and generates binary codes identifying source (e.g., ship, aircraft, person, or vehicle on land). Codes transmitted by phase modulation. Knowing code, monitor attempts to communicate with user and uses code information to dispatch rescue team appropriate to type and location of carrier.
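The idea of transmitting a binary identification code by phase modulation can be sketched as a simple bit-to-phase mapping. This is a minimal illustration only: the phase deviation, message format, and encoding of real 406-MHz beacons are fixed by the international specifications the article mentions, and the values below are assumptions for demonstration.

```python
# Minimal sketch of sending a binary ID code by binary phase modulation:
# each bit selects one of two carrier phase offsets. The actual 406-MHz
# beacon waveform and message format are defined by international
# specifications; the phase value here is an arbitrary placeholder.

def phase_sequence(bits, phase_deg=60):
    """Map a bit string to a list of carrier phase offsets in degrees
    (+phase_deg for '1', -phase_deg for '0')."""
    return [phase_deg if b == "1" else -phase_deg for b in bits]

beacon_id = "1011"                 # invented ID fragment
print(phase_sequence(beacon_id))   # [60, -60, 60, 60]
```

A ground-station monitor performing the inverse mapping recovers the ID bits, which is what lets it identify the carrier type and dispatch an appropriate rescue team.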
Software ``Best'' Practices: Agile Deconstructed
NASA Astrophysics Data System (ADS)
Fraser, Steven
This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.
Data Management for a Climate Data Record in an Evolving Technical Landscape
NASA Astrophysics Data System (ADS)
Moore, K. D.; Walter, J.; Gleason, J. L.
2017-12-01
For nearly twenty years, NASA Langley Research Center's Clouds and the Earth's Radiant Energy System (CERES) Science Team has been producing a suite of data products that forms a persistent climate data record of the Earth's radiant energy budget. Many of the team's physical scientists and key research contributors have been with the team since the launch of the first CERES instrument in 1997. This institutional knowledge is irreplaceable and its longevity and continuity are among the reasons that the team has been so productive. Such legacy involvement, however, can also be a limiting factor. Some CERES scientists-cum-coders might possess skills that were state-of-the-field when they were emerging scientists but may now be outdated with respect to developments in software development best practices and supporting technologies. Both programming languages and processing frameworks have evolved significantly in the past twenty years, and updating one of these factors warrants consideration of updating the other. With the imminent launch of a final CERES instrument and the good health of those in flight, the CERES data record stands to continue far into the future. The CERES Science Team is, therefore, undergoing a re-architecture of its codebase to maintain compatibility with newer data processing platforms and technologies and to leverage modern software development best practices. This necessitates training our staff and consequently presents several challenges, including: Development continues immediately on the next "edition" of research algorithms upon release of the previous edition. How can code be rewritten at the same time that the science algorithms are being updated and integrated? With limited time to devote to training, how can we update the staff's existing skillset without slowing progress or introducing new errors? The CERES Science Team is large and complex, much like the current state of its codebase. 
How can we identify, in a breadth-wise manner, areas for code improvement across multiple research groups that maintain code with varying semantics but common concepts? In this work, we discuss the successes and pitfalls of this major re-architecture effort and share how we will sustain improvement into the future.
Extreme Programming: Maestro Style
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark
2009-01-01
"Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. 
However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme-programming practices. The single most influential of these factors is that continuous interaction between customers and programmers is not feasible.
Warp-X: A new exascale computing platform for beam–plasma simulations
Vay, J. -L.; Almgren, A.; Bell, J.; ...
2018-01-31
Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes, such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code in the new WarpX software. Lastly, the code structure, status, early examples of applications, and plans are discussed.
User systems guidelines for software projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrahamson, L.
1986-04-01
This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)
Conveying empathy to hospice family caregivers: Team responses to caregiver empathic communication
Wittenberg-Lyles, Elaine; Oliver, Debra Parker; Demiris, George; Rankin, Anna; Shaunfield, Sara; Kruse, Robin L.
2012-01-01
Objective The goal of this study was to explore empathic communication opportunities presented by family caregivers and responses from interdisciplinary hospice team members. Methods Empathic opportunities and hospice team responses were analyzed from biweekly web-based videoconferences between family caregivers and hospice teams. The authors coded the data using the Empathic Communication Coding System (ECCS) and identified themes within and among the coded data. Results Data analysis identified 270 empathic opportunity-team response sequences. Caregivers expressed statements of emotion and decline most frequently. Two-thirds of the hospice team responses were implicit acknowledgments of caregiver statements and only one-third of the team responses were explicit recognitions of caregiver empathic opportunities. Conclusion Although hospice team members frequently express emotional concerns with family caregivers during one-on-one visits, there is a need for more empathic communication during team meetings that involve caregivers. Practice implications Hospice clinicians should devote more time to discussing emotional issues with patients and their families to enhance patient-centered hospice care. Further consideration should be given to training clinicians to empathize with patients and family caregivers. PMID:22554387
Developing protocols for obstetric emergencies.
Roth, Cheryl K; Parfitt, Sheryl E; Hering, Sandra L; Dent, Sarah A
2014-01-01
There is potential for important steps to be missed in emergency situations, even in the presence of many health care team members. Developing a clear plan of response for common emergencies can ensure that no tasks are redundant or omitted, and can create a more controlled environment that promotes positive health outcomes. A multidisciplinary team was assembled in a large community hospital to create protocols that would help ensure optimum care and continuity of practice in cases of postpartum hemorrhage, shoulder dystocia, emergency cesarean surgical birth, eclamptic seizure and maternal code. Assignment of team roles and responsibilities led to the evolution of standardized protocols for each emergency situation. © 2014 AWHONN.
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long-term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. First, PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy-to-use web-based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed from two different geographic locations, and continues to be used. The knowledge gained from the collaboration and hosting of this repository, in conjunction with PNNL software development and systems engineering capabilities, was used in the selection of a package for the implementation of the software component library on behalf of NA-42 TI.
The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI document their development activities. When sufficiently completed, the questionnaire illustrates that the recorded software development activities incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams, and revised versions are distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID/AMS-specific questionnaire are being used as the initial content of the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and install the software needed to host the component library. Additionally, it defines the process by which users request access to contribute and retrieve library content.
An experiment to assess the cost-benefits of code inspections in large scale software development
NASA Technical Reports Server (NTRS)
Porter, A.; Siy, H.; Toman, C. A.; Votta, L. G.
1994-01-01
This experiment (currently in progress) is designed to measure costs and benefits of different code inspection methods. It is being performed with a real development team writing software for a commercial product. The dependent variables for each code unit's inspection are the elapsed time and the number of defects detected. We manipulate the method of inspection by randomly assigning reviewers, varying the number of reviewers and the number of teams, and, when using more than one team, randomly assigning author repair and non-repair of detected defects between code inspections. After collecting and analyzing the first 17 percent of the data, we have discovered several interesting facts about reviewers, about the defects recorded during reviewer preparation and during the inspection collection meeting, and about the repairs that are eventually made. (1) Only 17 percent of the defects that reviewers record in their preparations are true defects that are later repaired. (2) Defects recorded at the inspection meetings fall into three categories: 18 percent false positives requiring no author repair, 57 percent soft maintenance where the author makes changes only for readability or code standard enforcement, and 25 percent true defects requiring repair. (3) The median elapsed calendar time for code inspections is 10 working days - 8 working days before the collection meeting and 2 after. (4) In the collection meetings, 31 percent of the defects discovered by reviewers during preparation are suppressed. (5) Finally, 33 percent of the true defects recorded are discovered at the collection meetings and not during any reviewer's preparation. The results to date suggest that inspections with two sessions (two different teams) of two reviewers per session (2sX2p) are the most effective. These two-session inspections may be performed with author repair or with no author repair between the two sessions. 
We are finding that the two-session, two-person with repair (2sX2pR) inspections are the most expensive, taking 15 working days of calendar time from the time the code is ready for review until author repair is complete, whereas two-session, two-person with no repair (2sX2pN) inspections take only 10 working days, but find about 10 percent fewer defects.
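The trade-off reported for the two inspection configurations can be put in rough cost-per-defect terms using only the figures quoted in the abstract: 2sX2pR takes 15 working days of calendar time, while 2sX2pN takes 10 working days and finds about 10 percent fewer defects. This is a back-of-envelope sketch; defect counts are normalized (2sX2pR finds 1.0), since absolute counts are not given in the abstract.

```python
# Back-of-envelope comparison of the two inspection configurations,
# using only the numbers quoted in the abstract. Defect counts are
# normalized so that 2sX2pR (with repair) finds 1.0 defect-units;
# 2sX2pN (no repair) finds about 10% fewer.

def days_per_defect(calendar_days, defects_found):
    """Calendar cost per (normalized) defect found."""
    return calendar_days / defects_found

with_repair = days_per_defect(15, 1.0)   # 15.0 days per defect-unit
no_repair = days_per_defect(10, 0.9)     # ~11.1 days per defect-unit

print(round(with_repair, 1), round(no_repair, 1))  # 15.0 11.1
```

On this crude measure the no-repair variant is cheaper per defect found, which frames the choice the authors describe: whether the extra ~10 percent of defects justifies five additional working days of calendar time.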
Mun, Eluned; Umbarger, Lillian; Ceria-Ulep, Clementina; Nakatsuka, Craig
2018-01-01
Palliative Care Teams have been shown to be instrumental in the early identification of multiple aspects of advanced care planning. Despite an increased number of services to meet the rising consultation demand, it is conceivable that the number of palliative care consultations generated from an ICU alone could overwhelm an existing palliative care team. The goal was to improve end-of-life care in the ICU by incorporating basic palliative care processes into the daily routine ICU workflow, thereby reserving the palliative care team for refractory situations. A structured palliative care quality-improvement program was implemented and evaluated in the ICU at Kaiser Permanente Medical Center in Hawaii. This included selecting trigger criteria and a care model, forming guidelines, and developing evaluation criteria. The evaluation criteria included the early identification of the multiple features of advanced care planning, the numbers of proactive ICU and palliative care family meetings, and changes in code status and treatment upon completion of either meeting. Early identification of Goals-of-Care, advance directives, and code status by the ICU staff led to a proactive ICU family meeting, with resultant increases in changes in code status and treatment. The number of palliative care consultations also rose, but not significantly. Palliative care processes could be incorporated into a daily ICU workflow, allowing aspects of advanced care planning to be identified in a systematic and proactive manner. This reserved the palliative care team for situations in which palliative care efforts performed by the ICU staff were ineffective.
2017-04-20
In the second annual Swarmathon competition at NASA's Kennedy Space Center in Florida, students were asked to develop computer code for the small robots called "Swarmies." The students programmed the robots to look for "resources" in the form of cubes with AprilTags, similar to barcodes. A team from Southwestern Indian Polytechnic Institute (SIPI) in Albuquerque, New Mexico, captured first place and a $5,000 cash prize. SIPI team members, from the left, are: students Emery Sutherland, Ty Shurley, Christian Martinez, SIPI engineering professor Dr. Nader Vadiee who was the team's faculty advisor, and student Schulte Cooke.
NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems
NASA Technical Reports Server (NTRS)
Liu, Nan-Suey; Quealy, Angela
1999-01-01
A multidisciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address designers' requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team comprises Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W), and operates under the guidance of the NCC steering committee. "Unstructured mesh" capability and "parallel computing" have been fundamental features of NCC from its inception. The NCC system is composed of a set of "elements": a grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source, multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.
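The "swappable element" integration idea in this abstract can be illustrated with a minimal plugin registry. The sketch below is hypothetical and is not NCC's actual interface; it uses the standard k-epsilon eddy-viscosity relation (C_mu = 0.09) only as an example of an interchangeable module.

```python
# Hypothetical sketch of element-based integration: each element registers
# under a role, and a solver is assembled from the implementations the
# designer selects. Names are illustrative, not NCC's API.
REGISTRY = {}

def element(role, name):
    """Class decorator that registers an implementation under a role."""
    def wrap(cls):
        REGISTRY.setdefault(role, {})[name] = cls
        return cls
    return wrap

@element("turbulence", "k-epsilon")
class KEpsilon:
    """Standard k-epsilon closure: mu_t = C_mu * rho * k^2 / eps."""
    C_MU = 0.09

    def eddy_viscosity(self, rho, k, eps):
        return self.C_MU * rho * k * k / eps

def build(role, name):
    """Instantiate the selected implementation for a given role."""
    return REGISTRY[role][name]()
```

Swapping turbulence models then becomes a one-line configuration change rather than an edit to the solver itself, which is the flexibility in module selection the abstract describes.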
Moura, Felipe Arruda; van Emmerik, Richard E A; Santana, Juliana Exel; Martins, Luiz Eduardo Barreto; Barros, Ricardo Machado Leite de; Cunha, Sergio Augusto
2016-12-01
The purpose of this study was to investigate the coordination between the two teams' spread during football matches using cross-correlation and vector coding techniques. Using a video-based tracking system, we obtained the trajectories of 257 players during 10 matches. Each team's spread was calculated as a function of time. For a general description of coordination, we calculated the cross-correlation between the signals. Vector coding was used to identify the coordination patterns between teams during offensive sequences that ended in shots on goal or defensive tackles. Cross-correlation showed that opposing teams tend to present in-phase coordination with a short time lag. During offensive sequences, vector coding results showed that, although in-phase coordination dominated, other patterns were also observed. We verified that, during their early stages, offensive sequences ending in shots on goal present greater anti-phase and attacking-team-phase periods than sequences ending in tackles. The results suggest that the attacking team may seek to behave contrary to its opponent (or may lead the opponent's behaviour) at the beginning of the attacking play, with regard to the distribution strategy, to increase the chances of a shot on goal. The techniques allowed the coordination patterns between teams to be detected, providing additional information about football dynamics and players' interaction.
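The two techniques named in this abstract can be sketched in a few lines. The sketch below is illustrative only, assuming each team's spread is available as a 1-D time series; function names are hypothetical and not the authors' software.

```python
import numpy as np

def cross_correlation(a, b, max_lag):
    """Normalized cross-correlation between two team-spread series
    for integer lags in [-max_lag, max_lag]."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    lags = list(range(-max_lag, max_lag + 1))
    out = []
    for lag in lags:
        if lag < 0:
            r = np.mean(a[:lag] * b[-lag:])
        elif lag > 0:
            r = np.mean(a[lag:] * b[:-lag])
        else:
            r = np.mean(a * b)
        out.append(r)
    return lags, out

def coupling_angles(a, b):
    """Vector-coding coupling angle (degrees) from the frame-to-frame
    changes of the two series: ~45 deg indicates in-phase coordination,
    ~135/315 deg anti-phase, ~0/180 deg one-team-phase."""
    da, db = np.diff(a), np.diff(b)
    return np.degrees(np.arctan2(db, da)) % 360
```

The lag at which the cross-correlation peaks indicates which team leads, while the distribution of coupling angles over an offensive sequence gives the in-phase/anti-phase/team-phase breakdown the study reports.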
Automated JPSS VIIRS GEO code change testing by using Chain Run Scripts
NASA Astrophysics Data System (ADS)
Chen, W.; Wang, W.; Zhao, Q.; Das, B.; Mikles, V. J.; Sprietzer, K.; Tsidulko, M.; Zhao, Y.; Dharmawardane, V.; Wolf, W.
2015-12-01
The Joint Polar Satellite System (JPSS) is the next-generation polar-orbiting operational environmental satellite system. The first satellite in the JPSS series, J-1, is scheduled to launch in early 2017. J-1 will carry similar versions of the instruments on board the Suomi National Polar-orbiting Partnership (S-NPP) satellite, which was launched on October 28, 2011. The Center for Satellite Applications and Research Algorithm Integration Team (STAR AIT) uses the Algorithm Development Library (ADL) to run S-NPP and pre-J1 algorithms in a development and test mode. The ADL is an offline test system developed by Raytheon to mimic the operational system while enabling a development environment for plug-and-play algorithms. Perl Chain Run Scripts have been developed by STAR AIT to automate the staging and processing of multiple JPSS Sensor Data Record (SDR) and Environmental Data Record (EDR) products. Based on prelaunch testing, the JPSS J-1 VIIRS Day Night Band (DNB) has an anomalous nonlinear response at high scan angles. The flight project has proposed multiple mitigation options through onboard aggregation, and Option 21 has been suggested by the VIIRS SDR team as the baseline aggregation mode. VIIRS geolocation (GEO) code analysis results show that the J-1 DNB GEO product cannot be generated correctly without a software update. The modified code will support both Op21 and Op21/26 and is backward compatible with S-NPP. The J-1 GEO code change version 0 delivery package is under development for the current change request. In this presentation, we discuss how to use the Chain Run Scripts to verify the code change and Lookup Table (LUT) updates in ADL Block 2.
Recent and Planned Developments in the CARI Program
2013-04-01
software are available from the Radiobiology Research Team Website. The source code is available upon request. CARI-6 is based on the last major... Research Team at its newly founded Civil Aeromedical Research Institute (now called the Civil Aerospace Medical Institute, i.e., CAMI) to investigate... Administration, Office of Aerospace Medicine. Report DOT/FAA/AM-11/09, 2011. Online at: www.faa.gov/data_research/research/med_humanfacs/oamtechreports
Chargemaster maintenance: think 'spring cleaning' all year round.
Barton, Shawn; Lancaster, Dani; Bieker, Mike
2008-11-01
Steps toward maintaining a standardized chargemaster include: Building a corporate chargemaster maintenance team. Developing a core research function. Designating hospital liaisons. Publishing timely reports on facility compliance. Using system codes to identify charges. Selecting chargemaster maintenance software. Developing a standard chargemaster data repository. Educating staff.
Complex collaborative problem-solving processes in mission control.
Fiore, Stephen M; Wiltshire, Travis J; Oglesby, James M; O'Keefe, William S; Salas, Eduardo
2014-04-01
NASA's Mission Control Center (MCC) is responsible for control of the International Space Station (ISS), which includes responding to problems that obstruct the functioning of the ISS and that may pose a threat to the health and well-being of the flight crew. These problems are often complex, requiring individuals, teams, and multiteam systems to work collaboratively. Research is warranted to examine individual and collaborative problem-solving processes in this context. Specifically, focus is placed on how Mission Control personnel, each with their own skills and responsibilities, exchange information to gain a shared understanding of the problem. The Macrocognition in Teams Model describes the processes that individuals and teams undertake in order to solve problems and may be applicable to Mission Control teams. Semistructured interviews centering on a recent complex problem were conducted with seven MCC professionals. In order to compare collaborative problem-solving processes in the MCC with those predicted by the Macrocognition in Teams Model, a coding scheme was developed to analyze the interview transcriptions. Findings are supported with excerpts from participant transcriptions and suggest that team knowledge-building processes accounted for approximately 50% of all coded data and are essential for successful collaborative problem solving in mission control. Support for internalized and externalized team knowledge was also found (19% and 20% of coded data, respectively). The Macrocognition in Teams Model was shown to be a useful depiction of collaborative problem solving in mission control, and further research with it as a guiding framework is warranted.
Saad, Tony; Sutherland, James C.
2016-05-04
To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment: an effort that spans an interdisciplinary team of engineers and computer scientists.
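A minimal sketch of the DAG-based runtime algorithm generation described above: tasks declare the fields they consume and produce, the dependency graph is inferred from those declarations, and a topological sort yields a valid execution order at runtime. The class and field names below are illustrative, not the authors' DSL.

```python
# Illustrative sketch: a task graph where execution order is derived
# at runtime from consumed/produced fields (Kahn's topological sort).
from collections import defaultdict, deque

class TaskGraph:
    def __init__(self):
        self.tasks = {}  # name -> (consumes, produces, fn)

    def add(self, name, consumes, produces, fn):
        self.tasks[name] = (set(consumes), set(produces), fn)

    def schedule(self):
        """Infer dependencies from field declarations and return a
        topologically sorted execution order."""
        producer = {f: n for n, (_, prod, _) in self.tasks.items() for f in prod}
        deps = {n: {producer[f] for f in cons if f in producer}
                for n, (cons, _, _) in self.tasks.items()}
        indeg = {n: len(d) for n, d in deps.items()}
        rdeps = defaultdict(set)
        for n, d in deps.items():
            for p in d:
                rdeps[p].add(n)
        ready = deque(n for n, k in indeg.items() if k == 0)
        order = []
        while ready:
            n = ready.popleft()
            order.append(n)
            for m in rdeps[n]:
                indeg[m] -= 1
                if indeg[m] == 0:
                    ready.append(m)
        return order

    def run(self, fields):
        """Execute every task in dependency order on a shared field store."""
        for name in self.schedule():
            self.tasks[name][2](fields)
```

Because the order is recomputed from the declarations each time, adding or swapping a physics task requires no hand-written orchestration code, which is the portability benefit the abstract claims for the DSL/DAG combination.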
Hogan, Timothy P; Luger, Tana M; Volkman, Julie E; Rocheleau, Mary; Mueller, Nora; Barker, Anna M; Nazi, Kim M; Houston, Thomas K; Bokhour, Barbara G
2018-03-08
As information and communication technology is becoming more widely implemented across health care organizations, patient-provider email or asynchronous electronic secure messaging has the potential to support patient-centered communication. Within the medical home model of the Veterans Health Administration (VA), secure messaging is envisioned as a means to enhance access and strengthen the relationships between veterans and their health care team members. However, despite previous studies that have examined the content of electronic messages exchanged between patients and health care providers, less research has focused on the socioemotional aspects of the communication enacted through those messages. Recognizing the potential of secure messaging to facilitate the goals of patient-centered care, the objectives of this analysis were to not only understand why patients and health care team members exchange secure messages but also to examine the socioemotional tone engendered in these messages. We conducted a cross-sectional coding evaluation of a corpus of secure messages exchanged between patients and health care team members over 6 months at 8 VA facilities. We identified patients whose medical records showed secure messaging threads containing at least 2 messages and compiled a random sample of these threads. Drawing on previous literature regarding the analysis of asynchronous, patient-provider electronic communication, we developed a coding scheme comprising a series of a priori patient and health care team member codes. Three team members tested the scheme on a subset of the messages and then independently coded the sample of messaging threads. Of the 711 messages coded from the 384 messaging threads, 52.5% (373/711) were sent by patients and 47.5% (338/711) by health care team members. 
Patient and health care team member messages included logistical content (82.6%, 308/373 vs 89.1%, 301/338), were neutral in tone (70.2%, 262/373 vs 82.0%, 277/338), and respectful in nature (25.7%, 96/373 vs 33.4%, 113/338). Secure messages from health care team members sometimes appeared hurried (25.4%, 86/338) but also displayed friendliness or warmth (18.9%, 64/338) and reassurance or encouragement (18.6%, 63/338). Most patient messages involved either providing or seeking information; however, the majority of health care team member messages involved information provision in response to patient questions. This evaluation is an important step toward understanding the content and socioemotional tone that is part of the secure messaging exchanges between patients and health care team members. Our findings were encouraging; however, there are opportunities for improvement. As health care organizations seek to supplement traditional encounters with virtual care, they must reexamine their use of secure messaging, including the patient centeredness of the communication, and the potential for more proactive use by health care team members. ©Timothy P Hogan, Tana M Luger, Julie E Volkman, Mary Rocheleau, Nora Mueller, Anna M Barker, Kim M Nazi, Thomas K Houston, Barbara G Bokhour. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 08.03.2018.
Managing Communication among Geographically Distributed Teams: A Brazilian Case
NASA Astrophysics Data System (ADS)
Almeida, Ana Carina M.; de Farias Junior, Ivaldir H.; de S. Carneiro, Pedro Jorge
The growing demand for qualified professionals is leading software companies to opt for distributed software development (DSD). At project conception, communication and synchronization of information are critical success factors. However, problems such as time-zone differences between teams, culture, language, and different development processes among sites can hinder communication among teams. The main goal of this paper is therefore to describe the solution adopted by a Brazilian team to improve communication in a multisite project environment. The proposed solution was based on the best practices described in the literature, and the communication plan was created based on the infrastructure needed by the project. The aim of this work is to minimize the impact of communication issues in multisite projects, increasing productivity and mutual understanding and avoiding rework on code and document writing.
Integrating advanced practice providers into medical critical care teams.
McCarthy, Christine; O'Rourke, Nancy C; Madison, J Mark
2013-03-01
Because there is increasing demand for critical care providers in the United States, many medical ICUs for adults have begun to integrate nurse practitioners and physician assistants into their medical teams. Studies suggest that such advanced practice providers (APPs), when appropriately trained in acute care, can be highly effective in helping to deliver high-quality medical critical care and can be important elements of teams with multiple providers, including those with medical house staff. One aspect of building an integrated team is a practice model that features appropriate coding and billing of services by all providers. Therefore, it is important to understand an APP's scope of practice, when they are qualified for reimbursement, and how they may appropriately coordinate coding and billing with other team providers. In particular, understanding when and how to appropriately code for critical care services (Current Procedural Terminology [CPT] code 99291, critical care, evaluation and management of the critically ill or critically injured patient, first 30-74 min; CPT code 99292, critical care, each additional 30 min) and procedures is vital for creating a sustainable program. Because APPs will likely play a growing role in medical critical care units in the future, more studies are needed to compare different practice models and to determine the best way to deploy this talent in specific ICU settings.
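The time thresholds quoted above (CPT 99291 for the first 30-74 minutes of critical care; 99292 for each additional 30 minutes) imply a simple mapping from documented critical care time to code units. The helper below is an illustrative sketch of that mapping only, not billing guidance; payer-specific rules vary.

```python
def critical_care_codes(minutes):
    """Map total documented critical care time to CPT code units,
    per the thresholds cited in the text:
      < 30 min        -> no critical care code (report an E/M code instead)
      30-74 min       -> 99291 x 1
      >= 75 min       -> 99291 x 1, plus 99292 for each additional 30 min
    """
    if minutes < 30:
        return {}
    codes = {"99291": 1}
    if minutes >= 75:
        # 75-104 min -> 1 unit, 105-134 -> 2 units, etc.
        codes["99292"] = (minutes - 45) // 30
    return codes
```

For example, 104 minutes of documented critical care maps to one unit of 99291 plus one unit of 99292, and each further half hour adds another 99292 unit.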
Sweeney, Angela; Greenwood, Kathryn E; Williams, Sally; Wykes, Til; Rose, Diana S
2013-12-01
Health research is frequently conducted in multi-disciplinary teams, with these teams increasingly including service user researchers. Whilst it is common for service user researchers to be involved in data collection--most typically interviewing other service users--it is less common for service user researchers to be involved in data analysis and interpretation. This means that a unique and significant perspective on the data is absent. This study aims to use an empirical report of a study on Cognitive Behavioural Therapy for psychosis (CBTp) to demonstrate the value of multiple coding in enabling service users voices to be heard in team-based qualitative data analysis. The CBTp study employed multiple coding to analyse service users' discussions of CBT for psychosis (CBTp) from the perspectives of a service user researcher, clinical researcher and psychology assistant. Multiple coding was selected to enable multiple perspectives to analyse and interpret data, to understand and explore differences and to build multi-disciplinary consensus. Multiple coding enabled the team to understand where our views were commensurate and incommensurate and to discuss and debate differences. Through the process of multiple coding, we were able to build strong consensus about the data from multiple perspectives, including that of the service user researcher. Multiple coding is an important method for understanding and exploring multiple perspectives on data and building team consensus. This can be contrasted with inter-rater reliability which is only appropriate in limited circumstances. We conclude that multiple coding is an appropriate and important means of hearing service users' voices in qualitative data analysis. © 2012 John Wiley & Sons Ltd.
Designing and maintaining an effective chargemaster.
Abbey, D C
2001-03-01
The chargemaster is the central repository of charges and associated coding information used to develop claims. But this simple description belies the chargemaster's true complexity. The chargemaster's role in the coding process differs from department to department, and not all codes provided on a claim form are necessarily included in the chargemaster, as codes for complex services may need to be developed and reviewed by coding staff. In addition, with the rise of managed care, the chargemaster increasingly is being used to track utilization of supplies and services. To ensure that the chargemaster performs all of its functions effectively, hospitals should appoint a chargemaster coordinator, supported by a chargemaster review team, to oversee the design and maintenance of the chargemaster. Important design issues that should be considered include the principle of "form follows function," static versus dynamic coding, how modifiers should be treated, how charges should be developed, how to incorporate physician fee schedules into the chargemaster, the interface between the chargemaster and cost reports, and how to include statistical information for tracking utilization.
SAFETY ON UNTRUSTED NETWORK DEVICES (SOUND)
2017-10-10
in the Cyber & Communication Technologies Group, but not on the SOUND project, would review the code, design and perform attacks against a live... 3.5 Red Team As part of our testing, we planned to conduct Red Team assessments. In these assessments, a group of engineers from BAE who worked... developed under the DARPA CRASH program and SOUND were designed to be companion projects. SAFE focused on the processor and the host, SOUND focused on
An Overview of the JPSS Ground Project Algorithm Integration Process
NASA Astrophysics Data System (ADS)
Vicente, G. A.; Williams, R.; Dorman, T. J.; Williamson, R. C.; Shaw, F. J.; Thomas, W. M.; Hung, L.; Griffin, A.; Meade, P.; Steadley, R. S.; Cember, R. P.
2015-12-01
The smooth transition, implementation, and operationalization of scientific software from the National Oceanic and Atmospheric Administration (NOAA) development teams to the Joint Polar Satellite System (JPSS) Ground Segment requires a variety of experience and expertise. This task has been accomplished by a dedicated group of scientists and engineers working in close collaboration with the NOAA Satellite and Information Services (NESDIS) Center for Satellite Applications and Research (STAR) science teams for the JPSS/Suomi National Polar-orbiting Partnership (S-NPP) Advanced Technology Microwave Sounder (ATMS), Cross-track Infrared Sounder (CrIS), Visible Infrared Imaging Radiometer Suite (VIIRS), and Ozone Mapping and Profiler Suite (OMPS) instruments. The purpose of this presentation is to describe the JPSS project process for algorithm implementation, from the earliest delivery stages by the science teams to full operationalization in the Interface Data Processing Segment (IDPS), the processing system that provides Environmental Data Records (EDRs) to NOAA. Special focus is given to the NASA Data Products Engineering and Services (DPES) Algorithm Integration Team (AIT) functional and regression test activities. In the functional testing phase, the AIT uses one or a few specific chunks of data (granules), selected by the NOAA STAR Calibration and Validation (cal/val) teams, to demonstrate that a small change in the code performs properly and does not disrupt the rest of the algorithm chain. In the regression testing phase, the modified code is placed into the Government Resources for Algorithm Verification, Integration, Test and Evaluation (GRAVITE) Algorithm Development Area (ADA), a simulated and smaller version of the operational IDPS. Baseline files are swapped out, not edited, and the whole code package runs on one full orbit of Sensor Data Records (SDRs) using Calibration Lookup Tables (Cal LUTs) for the time of the orbit.
The purpose of the regression test is to identify unintended outcomes. Overall, the presentation provides a general and easy-to-follow overview of the JPSS Algorithm Change Process (ACP) and is intended to facilitate the audience's understanding of a very extensive and complex process.
Tan, Edwin T.; Martin, Sarah R.; Fortier, Michelle A.; Kain, Zeev N.
2012-01-01
Objective To develop and validate a behavioral coding measure, the Children's Behavior Coding System-PACU (CBCS-P), for children's distress and nondistress behaviors while in the postanesthesia care unit (PACU). Methods A multidisciplinary team examined videotapes of children in the PACU and developed a coding scheme that subsequently underwent a refinement process (CBCS-P). To examine the reliability and validity of the coding system, 121 children and their parents were videotaped during their stay in the PACU. Participants were healthy children undergoing elective, outpatient surgery and general anesthesia. The CBCS-P was utilized, and objective data (analgesic consumption and pain scores) were extracted from medical charts to establish validity. Results Kappa values indicated good-to-excellent (κ > .65) interrater reliability of the individual codes. The CBCS-P had good criterion validity when compared with children's analgesic consumption and pain scores. Conclusions The CBCS-P is a reliable observational coding method that captures children's distress and nondistress postoperative behaviors. These findings highlight the importance of considering context in both the development and application of observational coding schemes. PMID:22167123
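Interrater reliability of the kind reported in this abstract is commonly quantified with Cohen's kappa, which corrects observed agreement for agreement expected by chance. Below is a minimal illustrative computation for two raters' nominal codes; it is a generic sketch, not the authors' analysis software.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning nominal codes:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from the raters' marginal code rates."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Perfect agreement yields kappa = 1, chance-level agreement yields 0, so the κ > .65 values reported above indicate agreement well beyond chance.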
Developing high-quality educational software.
Johnson, Lynn A; Schleyer, Titus K L
2003-11-01
The development of effective educational software requires a systematic process executed by a skilled development team. This article describes the core skills required of the development team members for the six phases of successful educational software development. During analysis, the foundation of product development is laid including defining the audience and program goals, determining hardware and software constraints, identifying content resources, and developing management tools. The design phase creates the specifications that describe the user interface, the sequence of events, and the details of the content to be displayed. During development, the pieces of the educational program are assembled. Graphics and other media are created, video and audio scripts written and recorded, the program code created, and support documentation produced. Extensive testing by the development team (alpha testing) and with students (beta testing) is conducted. Carefully planned implementation is most likely to result in a flawless delivery of the educational software and maintenance ensures up-to-date content and software. Due to the importance of the sixth phase, evaluation, we have written a companion article on it that follows this one. The development of a CD-ROM product is described including the development team, a detailed description of the development phases, and the lessons learned from the project.
Rudigoz, René-Charles; Huissoud, Cyril; Delecour, Lisa; Thevenet, Simone; Dupont, Corinne
2014-06-01
The medical team of the Croix Rousse teaching hospital maternity unit has developed, over the last ten years, a set of procedures designed to respond to various emergency situations necessitating Caesarean section. Using the Lucas classification, we have defined as precisely as possible the degree of urgency of Caesarean sections. We have established specific protocols for the implementation of urgent and very urgent Caesarean sections and have chosen a simple means to convey the degree of urgency to all team members, namely a color code system (red, orange and green). We have set time goals from decision to delivery: 15 minutes for the red code and 30 minutes for the orange code. The results seem very positive: the frequency of urgent and very urgent Caesareans has fallen over time, from 6.1% to 1.6% in 2013. The average time from decision to delivery is 11 minutes for code red Caesareans and 21 minutes for code orange Caesareans. These time goals are now achieved in 95% of cases. Organizational and anesthetic difficulties are the main causes of delays. The indications for red and orange code Caesareans are appropriate more than two times out of three. Perinatal outcomes are generally favorable, code red Caesareans being life-saving in 15% of cases. No increase in maternal complications has been observed. In sum, each obstetric department should have its own protocols for handling urgent and very urgent Caesarean sections. Continuous monitoring of their implementation, relevance and results should be conducted. Management of extreme urgency must be integrated into the management of patients with identified risks (scarred uterus and twin pregnancies, for example), and also in structures without medical facilities (birthing centers). Obstetric teams must keep in mind that implementation of these protocols in no way dispenses with close monitoring of labour.
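The color code protocol above reduces to a small lookup of decision-to-delivery targets (15 minutes for red, 30 minutes for orange, no target for green). A hypothetical sketch of how such targets might be audited against recorded times, illustrative only and not the unit's actual monitoring tool:

```python
# Decision-to-delivery targets in minutes, per the protocol described
# above; "green" carries no time target.
GOAL_MINUTES = {"red": 15, "orange": 30}

def goal_met(code, decision_to_delivery_min):
    """True if a case met its color-code time target (green always passes)."""
    target = GOAL_MINUTES.get(code)
    return target is None or decision_to_delivery_min <= target
```

Running such a check over a year of cases is one way the 95% target-achievement figure quoted in the abstract could be monitored continuously.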
National Combustion Code Parallel Performance Enhancements
NASA Technical Reports Server (NTRS)
Quealy, Angela; Benyo, Theresa (Technical Monitor)
2002-01-01
The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.
Data Parallel Line Relaxation (DPLR) Code User Manual: Acadia - Version 4.01.1
NASA Technical Reports Server (NTRS)
Wright, Michael J.; White, Todd; Mangini, Nancy
2009-01-01
Data-Parallel Line Relaxation (DPLR) code is a computational fluid dynamic (CFD) solver that was developed at NASA Ames Research Center to help mission support teams generate high-value predictive solutions for hypersonic flow field problems. The DPLR Code Package is an MPI-based, parallel, full three-dimensional Navier-Stokes CFD solver with generalized models for finite-rate reaction kinetics, thermal and chemical non-equilibrium, accurate high-temperature transport coefficients, and ionized flow physics incorporated into the code. DPLR also includes a large selection of generalized realistic surface boundary conditions and links to enable loose coupling with external thermal protection system (TPS) material response and shock layer radiation codes.
Cryptanalysis in World War II--and Mathematics Education.
ERIC Educational Resources Information Center
Hilton, Peter
1984-01-01
Hilton describes the team of cryptanalysts who tried to decipher German and Japanese codes during the Second World War. The work of Turing, essentially developing the computer, is reported, as well as inferences about pure and applied mathematics. (MNS)
Guła, Przemysław; Wejnarski, Arkadiusz; Moryto, Remigiusz; Gałazkowski, Robert; Swiezewski, Stanisław
2014-01-01
The Polish Emergency Medical Services (EMS) system is based on two types of medical rescue teams (MRT): specialist (S), staffed with system doctors, and basic (B), staffed only with paramedics. The aim of this study is to assess the reasonability of dividing medical rescue teams into specialist and basic. We retrospectively analysed the medical cards of rescue activities from 21,896 interventions by medical rescue teams, 15,877 of them by basic medical rescue teams (B MRT) and 6,019 by specialist medical rescue teams (S MRT). The procedures executed by both types of teams were compared. In the analysed group of dispatches, 56.4% were unrelated to medical emergencies. Simultaneously, 52.7% of code 1 interventions and 59.2% of code 2 interventions did not result in transporting the patient to the hospital. The qualification of S teams' dispatches is characterised by a higher number of assigned codes 1 (53.2% vs. 15.9%). It is worth emphasising that the procedures that can be applied exclusively by system doctors do not exceed 1% of interventions. Moreover, the number of actions performed in medical emergencies in the secured region by an S team dispatched first is comparable to that performed by B teams. The low need of B teams for S teams' aid (0.92% of interventions) was also noted. This study points to the necessity to discuss the implementation of straightforward principles of call qualification and the optimisation of the system doctors' role in prehospital activities.
Swarmathon 2017 - Students Develop Computer Code to Support Exploration at Kennedy
2017-04-19
Students from colleges and universities from across the nation recently participated in a robotic programming competition at NASA's Kennedy Space Center in Florida. Their research may lead to technology which will help astronauts find needed resources when exploring the moon or Mars. In the spaceport's second annual Swarmathon competition, aspiring engineers from 20 teams representing 22 minority serving universities and community colleges were invited to develop software code to operate innovative robots called "Swarmies." The event took place April 18-20, 2017, at the Kennedy Space Center Visitor Complex.
Validation Data and Model Development for Fuel Assembly Response to Seismic Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardet, Philippe; Ricciardi, Guillaume
2016-01-31
Vibrations are inherently present in nuclear reactors, especially in cores and steam generators of pressurized water reactors (PWR). They can have significant effects on local heat transfer and on wear and tear in the reactor, and often set safety margins. The simulation of these multiphysics phenomena from first principles requires the coupling of several codes, which is one of the most challenging tasks in modern computer simulation. Here an ambitious multiphysics, multidisciplinary validation campaign is conducted. It relied on an integrated team of experimentalists and code developers to acquire benchmark and validation data for fluid-structure interaction codes. Data are focused on PWR fuel bundle behavior during seismic transients.
Implementation of a Post-Code Pause: Extending Post-Event Debriefing to Include Silence.
Copeland, Darcy; Liska, Heather
2016-01-01
This project arose out of a need to address two issues at our hospital: we lacked a formal debriefing process for code/trauma events, and the emergency department wanted to address the psychological and spiritual needs of code/trauma responders. We developed a debriefing process for code/trauma events that intentionally included mechanisms to facilitate recognition, acknowledgment, and, when needed, responses to the psychological and spiritual needs of responders. A post-code pause process was implemented in the emergency department with the aims of standardizing a debriefing process, encouraging a supportive team-based culture, improving transition back to "normal" activities after responding to code/trauma events, and providing responders an opportunity to express reverence for patients involved in code/trauma events. The post-code pause process incorporates a moment of silence and the addition of two simple questions to a traditional operational debrief. Implementation of post-code pauses was feasible despite the fast-paced nature of the department. At the end of the 1-year pilot period, staff members reported increases in feeling supported by peers and leaders, in their ability to pay homage to patients, and in having time to regroup prior to returning to their assignment. There was a decrease in the number of respondents reporting having thoughts or feelings associated with the event within 24 hr. The pauses create a mechanism for operational team debriefing, provide an opportunity for staff members to honor their work and their patients, and support an environment in which the psychological and spiritual effects of responding to code/trauma events can be acknowledged.
Catts, Stanley V; Frost, Aaron D J; O'Toole, Brian I; Carr, Vaughan J; Lewin, Terry; Neil, Amanda L; Harris, Meredith G; Evans, Russell W; Crissman, Belinda R; Eadie, Kathy
2011-01-01
Clinical practice improvement carried out in a quality assurance framework relies on routinely collected data using clinical indicators. Herein we describe the development, minimum training requirements, and inter-rater agreement of indicators that were used in an Australian multi-site evaluation of the effectiveness of early psychosis (EP) teams. Surveys of clinician opinion and face-to-face consensus-building meetings were used to select and conceptually define indicators. Operationalization of definitions was achieved by iterative refinement until clinicians could be quickly trained to code indicators reliably. Calculation of percentage agreement with expert consensus coding was based on ratings of paper-based clinical vignettes embedded in a 2-h clinician training package. Consensually agreed upon conceptual definitions for seven clinical indicators judged most relevant to evaluating EP teams were operationalized for ease-of-training. Brief training enabled typical clinicians to code indicators with acceptable percentage agreement (60% to 86%). For indicators of suicide risk, psychosocial function, and family functioning this level of agreement was only possible with less precise 'broad range' expert consensus scores. Estimated kappa values indicated fair to good inter-rater reliability (kappa > 0.65). Inspection of contingency tables (coding category by health service) and modal scores across services suggested consistent, unbiased coding across services. Clinicians are able to agree upon what information is essential to routinely evaluate clinical practice. Simple indicators of this information can be designed and coding rules can be reliably applied to written vignettes after brief training. The real world feasibility of the indicators remains to be tested in field trials.
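The kappa values cited above (kappa > 0.65) correct raw percentage agreement for the agreement expected by chance. A standard computation of Cohen's kappa from two raters' codes (the generic two-rater formula, not the study's exact estimator) is:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters coding the same sequence of items."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: fraction of items both raters coded identically.
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: from each rater's marginal code frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)
```

Perfect agreement yields 1.0, and agreement no better than chance yields 0.0 (the formula is undefined when both raters assign a single identical code throughout).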
MDSplus quality improvement project
Fredian, Thomas W.; Stillerman, Joshua; Manduchi, Gabriele; ...
2016-05-31
MDSplus is a data acquisition and analysis system used worldwide, predominantly in the fusion research community. Development began 29 years ago on the OpenVMS operating system. Since that time many new features have been added and the code has been ported to many different operating systems. The fusion community has contributed to MDSplus development in the way of feature suggestions, feature implementations, documentation, and porting to different operating systems. The bulk of the development and support of MDSplus, however, has been provided by a relatively small core developer group of three or four members. Given the size of the development team and the large number of users, much more effort was focused on providing new features for the community than on keeping the underlying code and documentation up to date with evolving software development standards. To ensure that MDSplus will continue to meet the needs of the community in the future, the MDSplus development team, along with other members of the MDSplus user community, has commenced a major quality improvement project. The planned improvements include changes to software build scripts to better use GNU Autoconf and Automake tools, refactoring many of the source code modules using new language features available in modern compilers, using GNU MinGW-w64 to create MS Windows distributions, migrating to a more modern source code management system, improvement of source documentation as well as improvements to the www.mdsplus.org web site documentation and layout, and the addition of more comprehensive test suites to apply to MDSplus code builds prior to releasing installation kits to the community. This work should lead to a much more robust product and establish a framework for maintaining stability as more enhancements and features are added. This paper describes these efforts, which are either in progress or planned for the near future.
Mujika, Iñigo; Burke, Louise M
2010-01-01
Team sports are based on intermittent high-intensity activity patterns, but the exact characteristics vary between and within codes, and from one game to the next. Despite the challenge of predicting exact game demands, performance in team sports is often dependent on nutritional factors. Chronic issues include achieving ideal levels of muscle mass and body fat, and supporting the nutrient needs of the training program. Acute issues, both for training and in games, include strategies that allow the player to be well fuelled and hydrated over the duration of exercise. Each player should develop a plan of consuming fluid and carbohydrate according to the needs of their activity patterns, within the breaks that are provided in their sport. In seasonal fixtures, competition varies from a weekly game in some codes to 2-3 games over a weekend road trip in others, and a tournament fixture usually involves 1-3 days between matches. Recovery between events is a major priority, involving rehydration, refuelling and repair/adaptation activities. Some sports supplements may be of value to the team athlete. Sports drinks, gels and liquid meals may be valuable in allowing nutritional goals to be met, while caffeine, creatine and buffering agents may directly enhance performance. Copyright © 2011 S. Karger AG, Basel.
An evaluation of an interprofessional practice-based learning environment using student reflections.
Housley, Cora L; Neill, Kathryn K; White, Lanita S; Tedder, Andrea T; Castleberry, Ashley N
2018-01-01
The 12th Street Health and Wellness Center is an interprofessional, student-led, community-based clinic. Students from all University of Arkansas for Medical Sciences colleges work together to provide healthcare services for residents of an underserved community. Interprofessional student teams assess patients and present to an interprofessional preceptor team. At the conclusion of clinic, teams reflect on their experience. The objective of this study is to generate key themes from the end of clinic reflections to describe learning outcomes in an interprofessional practice environment. Student teams were asked to reflect on what they learned about patient care and interprofessional practice while volunteering at the clinic. Three hundred eighty reflection statements were assessed using the constant comparative approach with open coding by three researchers who identified and categorised themes by selecting key phrases from reflections. Eight themes emerged from this process which illuminated students' self-perceived development during practice-based learning and interprofessional collaboration. Key phrases were also coded to the four core Interprofessional Education Collaborative competency domains. These results suggest learners' perception that the Center is a practice-based environment that provides an opportunity to learn, integrate, and apply interprofessional curricular content.
2017-04-20
In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of cubes with AprilTags, similar to barcodes. Teams developed search algorithms for innovative robots known as "Swarmies" to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's second annual Swarmathon, 20 teams representing 22 minority serving universities and community colleges were invited to participate. Similar robots could help find resources when astronauts explore distant locations, such as the moon or Mars.
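The foraging behavior described here, random search until a tagged resource is found, can be reduced to a very small site-discovery rule. The sketch below is purely illustrative (it is not the competition's actual Swarmie code; the grid, step budget, and function names are assumptions) and shows one robot's random walk over a grid of resource cells:

```python
import random

def forage(resources, start, steps, rng):
    """Random-walk search on an integer grid.
    resources: set of (x, y) cells containing cubes; returns the first
    resource cell reached within `steps`, else None."""
    x, y = start
    for _ in range(steps):
        if (x, y) in resources:
            return (x, y)  # found a cube; a real Swarmie would carry it home
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
    return None
```

A swarm would run many such walkers concurrently and share discovered locations, which is the "ants foraging" behavior the caption describes.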
Wright, Bruce; Lockyer, Jocelyn; Fidler, Herta; Hofmeister, Marianna
2007-11-01
To examine the beliefs and attitudes of FPs and health care professionals (HCPs) regarding FPs' roles and responsibilities on interdisciplinary geriatric health care teams. Qualitative study using focus groups. Calgary Health Region. Seventeen FPs and 22 HCPs working on geriatric health care teams. Four 90-minute focus groups were conducted with FPs, followed by 2 additional 90-minute focus groups with HCPs. The FP focus groups discussed 4 vignettes of typical teamwork scenarios. Discussions were transcribed and the 4 researchers analyzed and coded themes and subthemes and developed the HCP focus group questions. These questions asked about HCPs' expectations of FPs on teams, experiences with FPs on teams, and perspectives on optimal roles on teams. Several meetings were held to determine themes and subthemes. Family physicians identified patient centredness, role delineation for team members, team dynamics, and team structure as critical to team success. Both FPs and HCPs had a continuum of beliefs about the role FPs should play on teams, including whether FPs should be autonomous or collaborative decision makers, the extent to which FPs should work within or outside teams, whether FPs should be leaders or simply members of teams, and the level of responsibility implied or explicit in their roles. Comments from FPs and HCPs identified intraprofessional and interprofessional tensions that could affect team practice and impede the development of high-functioning teams. It will be important, as primary care reform continues, to help FPs and HCPs learn how to work together effectively on teams so that patients receive the best possible care.
Managing MDO Software Development Projects
NASA Technical Reports Server (NTRS)
Townsend, J. C.; Salas, A. O.
2002-01-01
Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.
NASA Technical Reports Server (NTRS)
Weed, Richard Allen; Sankar, L. N.
1994-01-01
An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has motivated research into procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.
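The master-worker pattern this describes, where a head node partitions the grid, workers compute their pieces in parallel, and results are gathered, can be sketched with Python's standard library standing in for PVM. All names below are illustrative, and the per-partition "residual" is a toy stand-in for the flow computation:

```python
from concurrent.futures import ThreadPoolExecutor

def local_residual(cells):
    """Stand-in for the per-partition flow computation on one worker."""
    return sum(c * c for c in cells)

def parallel_solve(cells, n_workers):
    """Partition the cell list, farm partitions out, gather a global residual."""
    chunks = [cells[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(local_residual, chunks))
    return sum(partials)
```

The round-robin partitioning here is the simplest possible load-balancing choice; a real flow solver would instead partition the mesh to minimize the boundary data exchanged between workers each iteration.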
78 FR 39531 - Mine Rescue Teams
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... Administration 30 CFR Part 49 Mine Rescue Teams; CFR Correction Federal Register / Vol. 78, No. 126... Health Administration 30 CFR Part 49 Mine Rescue Teams CFR Correction In Title 30 of the Code of Federal... Teams Type of mine rescue team Requirement Mine-site Composite Contract State-sponsored...
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Signe K.; Purohit, Sumit; Boyd, Lauren W.
The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through browser-based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results.
The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.
76 FR 30229 - Shipping Coordinating Committee; Notice of Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
.... --Strategy and planning. --Organizational reforms. --Resource management. --Technical Co-operation Fund--biennial allocation to support the ITCP Programme for 2012-2013. --Results-based budget. --Voluntary IMO... FSS Code for communication equipment for fire-fighting teams. --Development of guidelines for use of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Pointer, William David; Sieger, Matt
2016-04-01
The goal of this review is to enable application of codes or software packages for safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors have focused on two objectives. First, the authors have focused on identification of requirements for software QA that must be satisfied to enable the application of software to future safety analyses. Second, the authors have collected best practices applied by other code development teams to minimize cost and time of initial code qualification activities and to recommend a path to the stated goal.
Baucom, Brian R W; Leo, Karena; Adamo, Colin; Georgiou, Panayiotis; Baucom, Katherine J W
2017-12-01
Observational behavioral coding methods are widely used for the study of relational phenomena. There are numerous guidelines for the development and implementation of these methods that include principles for creating new and adapting existing coding systems as well as principles for creating coding teams. While these principles have been successfully implemented in research on relational phenomena, the ever expanding array of phenomena being investigated with observational methods calls for a similar expansion of these principles. Specifically, guidelines are needed for decisions that arise in current areas of emphasis in couple research including observational investigation of related outcomes (e.g., relationship distress and psychological symptoms), the study of change in behavior over time, and the study of group similarities and differences in the enactment and perception of behavior. This article describes conceptual and statistical considerations involved in these 3 areas of research and presents principle- and empirically based rationale for design decisions related to these issues. A unifying principle underlying these guidelines is the need for careful consideration of fit between theory, research questions, selection of coding systems, and creation of coding teams. Implications of (mis)fit for the advancement of theory are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam
2012-02-01
To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. Paediatric residency program at BC Children's Hospital, Vancouver, British Columbia. The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes.
Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica
2017-12-28
The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility.
Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems, countries are able to lay the foundation for interoperability and ensure a harmonized language between global health stakeholders. © Hara et al.
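Global-standards bar codes of the kind piloted here carry product data as GS1 application identifiers (AIs), for example AI 01 for the GTIN, AI 17 for the expiry date, and AI 10 for the batch/lot. The simplified parser below illustrates that data format only; it is not the pilots' actual application, it handles just these three AIs, and the sample values are made up:

```python
GS = "\x1d"  # FNC1/group separator terminating variable-length fields

def parse_gs1(data):
    """Parse a GS1 element string covering AIs 01 (GTIN), 17 (expiry), 10 (lot)."""
    fixed = {"01": 14, "17": 6}  # AI -> fixed field length in characters
    out = {}
    i = 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in fixed:
            out[ai] = data[i:i + fixed[ai]]
            i += fixed[ai]
        elif ai == "10":  # variable length: ends at GS or end of string
            end = data.find(GS, i)
            end = len(data) if end == -1 else end
            out[ai] = data[i:end]
            i = end + 1
        else:
            raise ValueError("unsupported AI: " + ai)
    return out
```

For instance, `parse_gs1("01095011015300031725073110AB12")` splits into GTIN `09501101530003`, expiry `250731`, and lot `AB12`, which is the kind of structured record a scanning application would push to a data mart.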
Open Technology Development: Roadmap Plan
2006-04-01
RECOMMENDATION 1: APPROVE AND FUND AN OTD STRIKE TEAM... Senior Leadership... negotiated, rather than an innate property of the product. Software's replicability also means it can be incorporated into other software systems without... to leverage an open code development model, DoD would provide the market incentives to increase the agility and competitiveness of the industrial
Lotrecchiano, Gaetano R
2013-08-01
Since the concept of team science gained recognition among biomedical researchers, social scientists have been challenged with investigating evidence of team mechanisms and functional dynamics within transdisciplinary teams. Identification of these mechanisms has lacked substantial research using grounded theory models to adequately describe their dynamical qualities. Research trends continue to favor the measurement of teams by isolating occurrences of production over relational mechanistic team tendencies. This study uses a social constructionist-grounded multilevel mixed methods approach to identify social dynamics and mechanisms within a transdisciplinary team. A National Institutes of Health-funded research team served as a sample. Data from observations, interviews, and focus groups were qualitatively coded to generate micro/meso level analyses. Social mechanisms operative within this biomedical scientific team were identified. Dynamics that support such mechanisms were documented and explored. Through theoretical and emergent coding, four social mechanisms dominated in the analysis-change, kinship, tension, and heritage. Each contains relational social dynamics. This micro/meso level study suggests such mechanisms and dynamics are key features of team science and as such can inform problems of integration, praxis, and engagement in teams. © 2013 Wiley Periodicals, Inc.
ToxPredictor: a Toxicity Estimation Software Tool
The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...
A parallel and modular deformable cell Car-Parrinello code
NASA Astrophysics Data System (ADS)
Cavazzoni, Carlo; Chiarotti, Guido L.
1999-12-01
We have developed a modular parallel code implementing the Car-Parrinello [Phys. Rev. Lett. 55 (1985) 2471] algorithm including the variable cell dynamics [Europhys. Lett. 36 (1994) 345; J. Phys. Chem. Solids 56 (1995) 510]. Our code is written in Fortran 90 and makes use of programming concepts such as encapsulation, data abstraction and data hiding. The code has a multi-layer hierarchical structure with tree-like dependences among modules. The modules include not only the variables but also the methods acting on them, in an object-oriented fashion. The modular structure simplifies code maintenance, development and debugging, and is well suited to a developer team. The layer structure permits high portability. The code displays an almost linear speed-up over a wide range of processor counts, independently of the architecture. Super-linear speed-up is obtained with a "smart" Fast Fourier Transform (FFT) that uses the available memory on the single node (which, for a fixed problem, grows with the number of processing elements) as a temporary buffer to store wave-function transforms. This code has been used to simulate water and ammonia at giant-planet conditions for systems as large as 64 molecules for ~50 ps.
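The "smart" FFT's super-linear speed-up comes from trading spare per-node memory for recomputation: once a wave-function transform is computed, it is kept in a local buffer so that repeated transforms of an unchanged state become lookups. A minimal sketch of that caching idea (in Python rather than the paper's Fortran 90; the naive DFT and all names are illustrative, not the actual code):

```python
import cmath

def dft(samples):
    """Naive discrete Fourier transform (stands in for the production FFT)."""
    n = len(samples)
    return tuple(
        sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        for k in range(n)
    )

class TransformBuffer:
    """Per-node cache: spare local memory holds wave-function transforms,
    so re-transforming an unchanged state is a dictionary lookup."""
    def __init__(self):
        self._cache = {}
        self.misses = 0

    def transform(self, wavefunction):
        key = tuple(wavefunction)      # must be hashable to serve as a cache key
        if key not in self._cache:
            self.misses += 1
            self._cache[key] = dft(key)
        return self._cache[key]

buf = TransformBuffer()
psi = [1.0, 0.0, -1.0, 0.0]
first = buf.transform(psi)
second = buf.transform(psi)            # served from cache, no recomputation
assert first == second and buf.misses == 1
```

Because the memory available per node grows (for a fixed problem) with the number of processing elements, larger machines can cache more transforms, which is what pushes the measured speed-up past linear.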
Dual leadership in a hospital practice.
Thude, Bettina Ravnborg; Thomsen, Svend Erik; Stenager, Egon; Hollnagel, Erik
2017-02-06
Purpose Despite the practice of dual leadership in many organizations, there is relatively little research on the topic. Dual leadership means two leaders share the leadership task and are held jointly accountable for the results of the unit. To better understand how dual leadership works, this study aims to analyse three different dual leadership pairs at a Danish hospital. Furthermore, this study develops a tool for distinguishing dual leadership teams from each other. Design/methodology/approach This is a qualitative study using semi-structured interviews. Six leaders were interviewed to clarify how dual leadership works in a hospital context. All interviews were transcribed and coded. Coding focused on the nine principles found in the literature; an additional principle emerged from themes common to all six interviews. Findings Results indicate that power balance, personal relations and decision processes are important factors for creating efficient dual leaderships. The study develops a categorizing tool that can be used in further research or by organizations to describe and analyse dual leaderships. Originality/value The study describes dual leadership in the hospital context and develops a categorizing tool for distinguishing dual leadership teams from each other. It is important to reveal whether there are any indicators that can be used for optimising dual leadership teams in the health-care sector and in other organisations.
2004-10-01
Top-Level Process for Identification and Analysis of Safety-Related Requirements 4.4 Collaborators The primary SEI team members were Don Firesmith...Graff, M. & van Wyk, K. Secure Coding Principles & Practices. O'Reilly, 2003. • Hoglund, G. & McGraw, G. Exploiting Software: How to Break Code. Addison...Eisenecker, U.; Glück, R.; Vandevoorde, D.; & Veldhuizen, T. "Generative Programming and Active Libraries (Extended Abstract)" <osl.iu.edu/~tveldhui/papers
Health Services Assistant. Revised. Instructor Guide.
ERIC Educational Resources Information Center
Missouri Univ., Columbia. Instructional Materials Lab.
This color-coded curriculum guide was developed to help health services educators prepare students for health services occupations. The curriculum is organized in 20 units that cover the following topics: interpersonal relationships and the health care team; communication and observation skills; safety considerations; microbiology; the body as a…
Introduction to the Navigation Team: Johnson Space Center EG6 Internship
NASA Technical Reports Server (NTRS)
Gualdoni, Matthew
2017-01-01
The EG6 navigation team at NASA Johnson Space Center, like any team of engineers, interacts with the engineering process from beginning to end: from exploring solutions to a problem, to prototyping and studying the implementations, all the way to polishing and verifying a final flight-ready design. This summer, I was privileged enough to gain exposure to each of these processes, while also getting to truly experience working within a team of engineers. My summer can be broken up into three projects: i) Initial study and prototyping: investigating a manual navigation method that can be utilized onboard Orion in the event of catastrophic failure of navigation systems; ii) Finalizing and verifying code: altering a software routine to improve its robustness and reliability, as well as designing unit tests to verify its performance; and iii) Development of testing equipment: assisting in the development and integration of a high-fidelity testbed to verify the performance of software and hardware.
Experimental Products Development Team (EPDT) Supporting New AWIPS, Part 2: Capabilities
NASA Technical Reports Server (NTRS)
Burks, Jason E.
2015-01-01
In 2012, the Experimental Products Development Team (EPDT) was formed within NASA's Short-term Prediction Research and Transition (SPoRT) Center to create training for development of plug-ins to extend the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) version 2. Because the broader atmospheric science community needed the AWIPS II development training being created at SPoRT, EPDT was expanded to include other groups looking for training. Since its expansion, EPDT has provided AWIPS II development training to over thirty participants from a wide variety of groups, such as the NWS Systems Engineering Center, the NWS Meteorological Development Laboratory, and several NOAA Cooperative Institutes. Participants within EPDT solidify their learning through hands-on exercises and by participating in a "code sprint" in which they troubleshoot existing plug-ins and develop new ones. The hands-on learning workshop is instructor-led, with participants completing exercises within the AWIPS II Development Environment. During the code sprints, EPDT groups work on projects important to the community and have built various plug-ins, such as an RGB image recipe creation tool and an mPing (crowd-sourced precipitation-type reporting system) ingest and display. EPDT has developed a well-defined training regime that prepares participants to fully develop plug-ins for the extensible AWIPS II architecture, from ingest to the display of new data. SPoRT has hosted two learning workshops and one code sprint over the last two years, and continues to build and shape the EPDT group based on feedback from previous workshops. The presentation will provide an overview of EPDT's current and future activities and best practices developed within EPDT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rawls, G.; Newhouse, N.; Rana, M.
2010-04-13
The Boiler and Pressure Vessel Project Team on Hydrogen Tanks was formed in 2004 to develop Code rules to address the various needs that had been identified for the design and construction of hydrogen storage vessels up to 15,000 psi. One of these needs was the development of Code rules for high-pressure composite vessels with non-load-sharing liners for stationary applications. In 2009, ASME approved a new Appendix 8 for the Section X Code, which contains the rules for these vessels. These vessels are designated as Class III vessels, with design pressure ranging from 20.7 MPa (3,000 psi) to 103.4 MPa (15,000 psi) and a maximum allowable outside liner diameter of 2.54 m (100 inches). The maximum design life of these vessels is limited to 20 years. Design, fabrication, and examination requirements have been specified, including Acoustic Emission testing at time of manufacture. The Code rules include the design qualification testing of prototype vessels. Qualification includes proof, expansion, burst, cyclic fatigue, creep, flaw, permeability, torque, penetration, and environmental testing.
Ramirez, Magaly; Wu, Shinyi; Ryan, Gery; Towfighi, Amytis; Vickrey, Barbara G
2017-05-23
Beta versions of health information technology tools are needed in service delivery models with health care and community partnerships to confirm the key components and to assess the performance of the tools and their impact on users. We developed a care management technology (CMT) for use by community health workers (CHWs) and care managers (CMs) working collaboratively to improve risk factor control among recent stroke survivors. The CMT was expected to enhance the efficiency and effectiveness of the CHW-CM team. The primary objective was to describe the Secondary Stroke Prevention by Uniting Community and Chronic Care Model Teams Early to End Disparities (SUCCEED) CMT and investigate CM and CHW perceptions of the CMT's usefulness and challenges for team-based care management. We conducted qualitative interviews with all users of the beta-version SUCCEED CMT, namely two CMs and three CHWs. They were asked to demonstrate and describe their perceptions of the CMT's ease of use and usefulness for completing predefined key care management activities. They were also probed about their general perceptions of the CMT's information quality, ease of use, usefulness, and impact on CM and CHW roles. Interview transcripts were coded using a priori codes. Coded excerpts were grouped into broader themes and then related in a conceptual model of how the CMT facilitated care management. We also conducted a survey with 14 patients to obtain their perspective on CHW tablet use during CHW-patient interactions. Care managers and community health workers expressed that the CMT helped them keep track of patient interactions and plan their work. It guided CMs in developing and sharing care plans with CHWs. For CHWs, the CMT enabled electronic collection of clinical assessment data, provided decision support, and provided remote access to patients' risk factor values. Long loading times and downtimes due to outages were the most significant challenges encountered. 
Additional issues included extensive use of free-text responses and manual data transfer from the electronic medical record. Despite these challenges, patients overall did not perceive the tablet as interfering with CHW-patient interactions. Our findings suggest useful functionalities of CMTs supporting health care and community partners in collaborative chronic care management. However, usability issues need to be addressed during the development process. The SUCCEED CMT is an initial step toward the development of effective health information technology tools to support collaborative, team-based models of care and will need to be modified as the evidence base grows. Future research should assess the CMT's effects on team performance. ©Magaly Ramirez, Shinyi Wu, Gery Ryan, Amytis Towfighi, Barbara G Vickrey. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 23.05.2017.
NASA Technical Reports Server (NTRS)
Liu, Nan-Suey
2001-01-01
A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between the then NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designers' requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team comprises Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Glenn Research Center (LeRC), and Pratt & Whitney (P&W). The "unstructured mesh" capability and "parallel computing" have been fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes a grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source, multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration. The development of the NCC beta version was essentially completed in June 1998. Technical details of the NCC elements are given in the Reference List. Elements such as the baseline flow solver, turbulence module, and chemistry module have been extensively validated, and their parallel performance on large-scale parallel systems has been evaluated and optimized.
However, the scalar PDF module and the spray module, as well as their coupling with the baseline flow solver, were developed in a small-scale distributed computing environment. As a result, the validation of the NCC beta version as a whole was quite limited. Current effort has been focused on the validation of the integrated code and the evaluation and optimization of its overall performance on large-scale parallel systems.
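The integration pattern the abstract describes, where independently developed elements plug into one system, can be selected per run, and exchange data through a common interface, can be sketched generically. This is a hypothetical illustration of the integration style only, not NCC's actual API; all class and field names are invented:

```python
class Element:
    """Common interface every element (solver, spray, radiation, ...) implements."""
    name = "base"

    def advance(self, state):
        raise NotImplementedError

class FlowSolver(Element):
    name = "flow"

    def advance(self, state):
        # Toy physics: accelerate the flow each step.
        state["velocity"] = state.get("velocity", 0.0) + 1.0
        return state

class TurbulenceModule(Element):
    name = "turbulence"

    def advance(self, state):
        # Toy closure: eddy viscosity proportional to velocity.
        state["viscosity"] = 0.1 * state.get("velocity", 0.0)
        return state

class Integrator:
    """Registry that lets the user select which modules to couple; a shared
    state dict passed between them is the inter-module data communication."""
    def __init__(self):
        self._registry = {}

    def register(self, element):
        self._registry[element.name] = element

    def run(self, selected, state, steps=1):
        for _ in range(steps):
            for name in selected:
                state = self._registry[name].advance(state)
        return state

sim = Integrator()
sim.register(FlowSolver())
sim.register(TurbulenceModule())
result = sim.run(["flow", "turbulence"], {}, steps=2)
```

Swapping a module (say, a different turbulence closure) then only requires registering another `Element`, which is the flexibility-in-module-selection property the NCC team needed.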
Chameleon Changes: An Exploration of Racial Identity Themes of Multiracial People
ERIC Educational Resources Information Center
Miville, Marie L.; Constantine, Madonna G.; Baysden, Matthew F.; So-Lloyd, Gloria
2005-01-01
The current study explored essential themes of racial identity development among 10 self-identified multiracial adults from a variety of racial backgrounds. Participants were interviewed using a semistructured protocol, and the interviews were recorded, transcribed, and then coded for themes by research team members. Four primary themes were…
The Legacy of Apollo: Assessed and Appreciated.
ERIC Educational Resources Information Center
Griffin, Richard A.; Griffin, Ann D.
1997-01-01
The real-life drama 25 years ago when Apollo 13 was rescued through a collaborative team of colleagues provides a model for changes in many public schools. In Texas, the state code specifies that site-based decision making address planning, budgeting, curriculum staffing patterns, staff development, and school organization. (MLF)
Extensive Air Showers in the Classroom
ERIC Educational Resources Information Center
Badala, A.; Blanco, F.; La Rocca, P.; Pappalardo, G. S.; Pulvirenti, A.; Riggi, F.
2007-01-01
The basic properties of extensive air showers of particles produced in the interaction of a high-energy primary cosmic ray in the Earth's atmosphere are discussed in the context of educational cosmic ray projects involving undergraduate students and high-school teams. Simulation results produced by an air shower development code were made…
Ontario's emergency department process improvement program: the experience of implementation.
Rotteau, Leahora; Webster, Fiona; Salkeld, Erin; Hellings, Chelsea; Guttmann, Astrid; Vermeulen, Marian J; Bell, Robert S; Zwarenstein, Merrick; Rowe, Brian H; Nigam, Amit; Schull, Michael J
2015-06-01
In recent years, Lean manufacturing principles have been applied to health care quality improvement efforts to improve wait times. In Ontario, an emergency department (ED) process improvement program based on Lean principles was introduced by the Ministry of Health and Long-Term Care as part of a strategy to reduce ED length of stay (LOS) and to improve patient flow. This article aims to describe the hospital-based teams' experiences during the ED process improvement program implementation and the teams' perceptions of the key factors that influenced the program's success or failure. A qualitative evaluation was conducted based on semistructured interviews with hospital implementation team members, such as team leads, medical leads, and executive sponsors, at 10 purposively selected hospitals in Ontario, Canada. Sites were selected based, in part, on their changes in median ED LOS following the implementation period. A thematic framework approach was used for the interviews, and a standard thematic coding framework was developed. Twenty-four interviews were coded and analyzed. The results are organized according to participants' experience and are grouped into four themes that were identified as significantly affecting the implementation experience: local contextual factors, relationship between improvement team and support players, staff engagement, and success and sustainability. The results demonstrate the importance of the context of implementation, establishing strong relationships and communication strategies, and preparing for implementation and sustainability prior to the start of the project. Several key factors were identified as important to the success of the program, such as preparing for implementation, ensuring strong executive support, creating implementation teams based on the tasks and outcomes of the initiative, and using multiple communication strategies throughout the implementation process.
Explicit incorporation of these factors into the development and implementation of future similar interventions in health care settings could be useful. © 2015 by the Society for Academic Emergency Medicine.
Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam
2012-01-01
OBJECTIVE: To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. DESIGN: Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. SETTING: Paediatric residency program at BC Children’s Hospital, Vancouver, British Columbia. INTERVENTIONS: The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. RESULTS: A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. CONCLUSIONS: A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes. PMID:23372405
The Integration of COTS/GOTS within NASA's HST Command and Control System
NASA Technical Reports Server (NTRS)
Pfarr, Thomas; Reis, James E.; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
NASA's mission-critical Hubble Space Telescope (HST) command and control system has been re-engineered with COTS/GOTS and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes 31 COTS/GOTS products with an additional 12 million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while significantly reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection process, evaluation process, and integration process. The advantages, disadvantages, issues, concerns, and lessons learned from integrating COTS/GOTS into NASA's mission-critical HST CCS will be examined in detail. Command and control systems designed with traditional custom code development efforts will be compared with command and control systems designed with new development techniques relying heavily on COTS/GOTS integration. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.
An Approach to Verification and Validation of a Reliable Multicasting Protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1994-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
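The model-based testing loop described above can be sketched in miniature: drive a formal state model and the implementation with the same event sequence and flag any divergence. Everything here, including the toy protocol, its state table, and the deliberate bug, is invented for illustration and is not taken from RMP:

```python
# Formal state model of a toy delivery protocol: (state, event) -> next state.
MODEL = {
    ("idle", "send"): "awaiting_ack",
    ("awaiting_ack", "ack"): "idle",
    ("awaiting_ack", "timeout"): "awaiting_ack",  # model: retransmit and wait
}

class Implementation:
    """Hypothetical stand-in for the protocol implementation under test.
    Deliberately buggy: on timeout it gives up instead of retransmitting."""
    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        if self.state == "idle" and event == "send":
            self.state = "awaiting_ack"
        elif self.state == "awaiting_ack" and event == "ack":
            self.state = "idle"
        elif self.state == "awaiting_ack" and event == "timeout":
            self.state = "idle"          # bug: diverges from the model

def run_test(events):
    """Drive model and implementation with the same (possibly off-nominal)
    event sequence; any divergence flags an inconsistency to investigate."""
    model_state, impl = "idle", Implementation()
    for ev in events:
        model_state = MODEL.get((model_state, ev), model_state)
        impl.handle(ev)
        if impl.state != model_state:
            return ("diverged", ev, model_state, impl.state)
    return ("consistent",)

nominal = run_test(["send", "ack"])          # passes: both machines agree
off_nominal = run_test(["send", "timeout"])  # exposes the timeout bug
```

As in the paper's process, the nominal sequence passes while the off-nominal one exposes the inconsistency, and it is then a human judgment whether the model or the implementation is the one to fix.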
An approach to verification and validation of a reliable multicasting protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test was different between the model and implementation, then the differences helped identify inconsistencies between the model and implementation. The dialogue between both teams drove the co-evolution of the model and implementation. Testing served as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP.
Developing of a New Atmospheric Ionizing Radiation (AIR) Model
NASA Technical Reports Server (NTRS)
Clem, John M.; deAngelis, Giovanni; Goldhagen, Paul; Wilson, John W.
2003-01-01
As a result of the research leading to the 1998 AIR workshop and the subsequent analysis, the neutron issues posed by Foelsche et al. and further analyzed by Hajnal have been adequately resolved. We are now engaged in developing a new atmospheric ionizing radiation (AIR) model for use in epidemiological studies and air transportation safety assessment. A team was formed to examine a promising code using the basic FLUKA software but with modifications to allow multiple charged ion breakup effects. A limited dataset of the ER-2 measurements and other cosmic ray data will be used to evaluate the use of this code.
On transform coding tools under development for VP10
NASA Astrophysics Data System (ADS)
Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao
2016-09-01
Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools has already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
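The idea behind transform flexibility can be illustrated with a toy example: try several candidate transforms on a residue block and keep whichever compacts the energy into the fewest significant coefficients. This sketch is illustrative only; VP10's actual transform set, block sizes, and rate-distortion search are far more elaborate:

```python
import math

def dct_ii(x):
    """Unnormalized 1-D DCT-II, the workhorse transform of block-based codecs."""
    n = len(x)
    return [
        sum(x[t] * math.cos(math.pi * k * (2 * t + 1) / (2 * n)) for t in range(n))
        for k in range(n)
    ]

def identity(x):
    """Identity transform: useful when the residue is already sparse."""
    return list(x)

def significant(coeffs, tol=1e-6):
    """Crude proxy for coding cost: count of non-negligible coefficients."""
    return sum(1 for c in coeffs if abs(c) > tol)

def pick_transform(residue):
    """Toy transform flexibility: choose the transform that leaves the
    fewest significant coefficients for this residue block."""
    candidates = {"dct": dct_ii, "identity": identity}
    return min(candidates, key=lambda name: significant(candidates[name](residue)))

smooth_residue = [4.0, 4.0, 4.0, 4.0]  # flat: DCT packs it into one DC coefficient
spiky_residue = [0.0, 9.0, 0.0, 0.0]   # single spike: identity is already sparse
```

A smooth residue is best served by the DCT, while an isolated spike is cheaper to code untransformed, which is exactly why a codec benefits from being able to switch transforms per block.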
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atamturktur, Sez; Unal, Cetin; Hemez, Francois
The project proposed to provide a Predictive Maturity Framework with its companion metrics that (1) introduce a formalized, quantitative means to communicate information between interested parties, (2) provide scientifically dependable means to claim completion of Validation and Uncertainty Quantification (VU) activities, and (3) guide decision makers in the allocation of Nuclear Energy's resources for code development and physical experiments. The project team proposed to develop this framework based on two complementary criteria: (1) the extent of experimental evidence available for the calibration of simulation models and (2) the sophistication of the physics incorporated in simulation models. The proposed framework is capable of quantifying the interaction between the required number of physical experiments and the degree of physics sophistication. The project team has developed this framework and implemented it with a multi-scale model for simulating creep of a reactor core cladding. The multi-scale model is composed of the viscoplastic self-consistent (VPSC) code at the meso-scale, which represents the visco-plastic behavior and changing properties of a highly anisotropic material, and a Finite Element (FE) code at the macro-scale to represent the elastic behavior and apply the loading. The framework takes advantage of the transparency provided by partitioned analysis, where independent constituent codes are coupled in an iterative manner. This transparency allows model developers to better understand and remedy the source of biases and uncertainties, whether they stem from the constituents or the coupling interface, by exploiting separate-effect experiments conducted within the constituent domain and integral-effect experiments conducted within the full-system domain. The project team has implemented this procedure with the multi-scale VPSC-FE model and demonstrated its ability to improve the predictive capability of the model.
Within this framework, the project team has focused on optimizing resource allocation for improving numerical models through further code development and experimentation. Related to further code development, we have developed a code prioritization index (CPI) for coupled numerical models. CPI is implemented to effectively improve the predictive capability of the coupled model by increasing the sophistication of constituent codes. In relation to designing new experiments, we investigated the information gained by the addition of each new experiment used for calibration and bias correction of a simulation model. Additionally, the variability of 'information gain' through the design domain has been investigated in order to identify the experiment settings where maximum information gain occurs and thus guide experimenters in the selection of experiment settings. Extending this idea, we evaluated how the information gain from each experiment can be improved by intelligently selecting experiments, leading to the development of the Batch Sequential Design (BSD) technique. Additionally, we evaluated the importance of sufficiently exploring the domain of applicability in experiment-based validation of high-consequence modeling and simulation by developing a new metric to quantify coverage. This metric has also been incorporated into the design of new experiments. Finally, we have proposed a data-aware calibration approach for the calibration of numerical models. This new method considers the complexity of a numerical model (the number of parameters to be calibrated, parameter uncertainty, and the form of the model) and seeks to identify the number of experiments necessary to calibrate the model based on the level of sophistication of the physics. The final component in the project team's work to improve model calibration and validation methods is the incorporation of robustness to non-probabilistic uncertainty in the input parameters.
This is an improvement to model validation and uncertainty quantification extending beyond the originally proposed scope of the project. We have introduced a new metric for incorporating the concept of robustness into experiment-based validation of numerical models. This project has supported the graduation of two Ph.D. students (Kendra Van Buren and Josh Hegenderfer) and two M.S. students (Matthew Egeberg and Parker Shields). One of the doctoral students is now working in the nuclear engineering field, and the other is a post-doctoral fellow at the Los Alamos National Laboratory. Additionally, two more Ph.D. students (Garrison Stevens and Tunc Kulaksiz) who are working towards graduation have been supported by this project.
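A rough illustration of the coverage and batch-sequential-design ideas (hypothetical code, not the project's implementation): select experiment settings greedily so that each new experiment lands as far as possible from those already chosen, and score a design by the worst-case distance from any candidate setting to its nearest selected experiment.

```python
def min_distance(point, selected):
    """Distance from a candidate setting to its nearest selected experiment."""
    return min(abs(point - s) for s in selected)

def batch_sequential_design(candidates, n_experiments, seed_point):
    """Greedy sketch of sequential design: starting from one seed experiment,
    repeatedly pick the candidate setting farthest from everything already
    selected, so each new experiment maximizes exploration of the domain."""
    selected = [seed_point]
    remaining = [c for c in candidates if c != seed_point]
    while len(selected) < n_experiments and remaining:
        best = max(remaining, key=lambda c: min_distance(c, selected))
        selected.append(best)
        remaining.remove(best)
    return selected

def coverage(selected, domain):
    """Crude coverage metric: worst-case distance from any point in the
    domain of applicability to its nearest experiment (smaller is better)."""
    return max(min_distance(p, selected) for p in domain)

domain = [x / 10.0 for x in range(11)]      # candidate settings from 0.0 to 1.0
picks = batch_sequential_design(domain, 3, 0.0)
```

A real BSD procedure would rank candidates by expected information gain from the calibrated model rather than by raw distance, but the resource-allocation logic, spending each new experiment where it teaches the most, is the same.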
Overview of NASA Multi-dimensional Stirling Convertor Code Development and Validation Effort
NASA Technical Reports Server (NTRS)
Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David
2002-01-01
A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code with the goals of improving loss predictions and identifying component areas for improvements. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU and validation of the code against test data are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experiences have demonstrated large performance gains by varying manifolds and heat exchanger designs to improve flow distributions in the heat exchangers. 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, sensitivity of performance to slight changes in internal geometry, and, in general, the understanding of various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors, and moving boundary capability. Preliminary attempts at validation of CFD-ACE models of MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in development of regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this effort.
Development of Fuel Shuffling Module for PHISICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allan Mabe; Andrea Alfonsi; Cristian Rabiti
2013-06-01
The PHISICS (Parallel and Highly Innovative Simulation for the INL Code System) [4] code toolkit has been in development at the Idaho National Laboratory. This package is intended to provide a modern analysis tool for reactor physics investigation. It is designed with the mindset to maximize accuracy for a given availability of computational resources and to give state-of-the-art tools to the modern nuclear engineer. This is obtained by implementing several different algorithms and meshing approaches among which the user will be able to choose, in order to balance computational resources and accuracy needs. The software is completely modular in order to simplify the independent development of modules by different teams and future maintenance. The package is coupled with the thermal-hydraulic code RELAP5-3D [3]. In the following, the structure of the different PHISICS modules is briefly recalled, focusing on the new shuffling module (SHUFFLE), the subject of this paper.
C++ Coding Standards for the AMP Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Thomas M; Clarno, Kevin T
2009-09-01
This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project and a part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].
Quality improvement utilizing in-situ simulation for a dual-hospital pediatric code response team.
Yager, Phoebe; Collins, Corey; Blais, Carlene; O'Connor, Kathy; Donovan, Patricia; Martinez, Maureen; Cummings, Brian; Hartnick, Christopher; Noviski, Natan
2016-09-01
Given the rarity of in-hospital pediatric emergency events, identification of gaps and inefficiencies in the code response can be difficult. In-situ, simulation-based medical education programs can identify unrecognized systems-based challenges. We hypothesized that developing an in-situ, simulation-based pediatric emergency response program would identify latent inefficiencies in a complex, dual-hospital pediatric code response system and allow rapid intervention testing to improve performance before implementation at an institutional level. Pediatric leadership from two hospitals with a shared pediatric code response team employed the Institute for Healthcare Improvement's (IHI) Breakthrough Model for Collaborative Improvement to design a program consisting of Plan-Do-Study-Act cycles occurring in a simulated environment. The objectives of the program were to 1) identify inefficiencies in our pediatric code response; 2) correlate to current workflow; 3) employ an iterative process to test quality improvement interventions in a safe environment; and 4) measure performance before actual implementation at the institutional level. Twelve dual-hospital, in-situ, simulated, pediatric emergencies occurred over one year. The initial simulated event allowed identification of inefficiencies including delayed provider response, delayed initiation of cardiopulmonary resuscitation (CPR), and delayed vascular access. These gaps were linked to process issues including unreliable code pager activation, slow elevator response, and lack of responder familiarity with layout and contents of code cart. From first to last simulation with multiple simulated process improvements, code response time for secondary providers coming from the second hospital decreased from 29 to 7 min, time to CPR initiation decreased from 90 to 15 s, and vascular access obtainment decreased from 15 to 3 min. 
Some of these simulated process improvements were adopted into the institutional response while others continue to be trended over time for evidence that observed changes represent a true new state of control. Utilizing the IHI's Breakthrough Model, we developed a simulation-based program to 1) successfully identify gaps and inefficiencies in a complex, dual-hospital, pediatric code response system and 2) provide an environment in which to safely test quality improvement interventions before institutional dissemination. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
PHANTOM: Practical Oblivious Computation in a Secure Processor
2014-05-16
[Fragmentary front-matter excerpt: table-of-contents entries for "Utilizing Multiple FPGAs" and "Implementation on the HC-2ex," including integration of the ORAM controller with a RISC-V processor, and acknowledgments crediting contributors to the RISC-V processor that PHANTOM uses.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krasheninnikov, Sergei I.; Angus, Justin; Lee, Wonjae
The goal of the Edge Simulation Laboratory (ESL) multi-institutional project is to advance scientific understanding of the edge plasma region of magnetic fusion devices via a coordinated effort utilizing modern computing resources, advanced algorithms, and ongoing theoretical development. The UCSD team was involved in the development of the COGENT code for kinetic studies across a magnetic separatrix. This work included a kinetic treatment of electrons and multiple ion species (impurities) and accurate collision operators.
NASA Technical Reports Server (NTRS)
2004-01-01
In early 1995, NASA's Glenn Research Center (then Lewis Research Center) formed an industry-government team with several jet engine companies to develop the National Combustion Code (NCC), which would help aerospace engineers solve complex aerodynamics and combustion problems in gas turbine, rocket, and hypersonic engines. The original development team consisted of Allison Engine Company (now Rolls-Royce Allison), CFD Research Corporation, GE Aircraft Engines, Pratt and Whitney, and NASA. After the baseline beta version was established in July 1998, the team focused its efforts on consolidation, streamlining, and integration, as well as enhancement, evaluation, validation, and application. These activities, mainly conducted at NASA Glenn, led to the completion of NCC version 1.0 in October 2000. NCC version 1.0 features high-fidelity representation of complex geometry, advanced models for two-phase turbulent combustion, and massively parallel computing. Researchers and engineers at Glenn have been using NCC to provide analysis and design support for various aerospace propulsion technology development projects. NASA transfers NCC technology to external customers using non-exclusive Space Act Agreements. Glenn researchers also communicate research and development results derived from NCC's further development through publications and special sessions at technical conferences.
Scheduling System Assessment, and Development and Enhancement of Re-engineered Version of GPSS
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah; Thomas, Bushrod; Passonno, Nicole
1996-01-01
The objective of this project is two-fold. First, to provide an evaluation of a commercially developed version of the ground processing scheduling system (GPSS) for its applicability to the Kennedy Space Center (KSC) ground processing problem. Second, to work with the KSC GPSS development team and provide enhancements to the existing software. Systems reengineering is required to provide a sustainable system for the users and the software maintenance group. Using the LISP profile prototype code developed by the GPSS reverse reengineering groups as a building block, we have implemented the resource deconfliction portion of GPSS in Common Lisp using its object-oriented features. The prototype corrects and extends some of the deficiencies of the current production version, plus it uses and builds on the classes from the development team's profile prototype.
Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions
2012-07-01
[Fragmentary front-matter excerpt: service-mark notes for Team Software Process (TSP) and Capability Maturity Model Integration (CMMI) [Davis 2009]; acronym-list entries including STP (Software Test Plan), TEP (Test and Evaluation Plan), and V&V (verification and validation); and the title page crediting Tim Morrow and Robert Seacord of the Software Engineering Institute.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foucar, James G.
The version of STK (Sierra ToolKit) that has long been provided with Trilinos is no longer supported by the core development team. With the introduction of the new STK library into Trilinos, the old STK has been renamed to stk classic. This document contains a rough guide of how to port a stk classic code to STK.
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
1994-03-01
[Fragmentary excerpt: report documentation page boilerplate; table-of-contents entries including Project Officer, Program Management, and User/Systems Development Team Relationship; and body text noting that in the 1950s and '60s, when programming and systems development was in its infancy, virtually all software was custom-made, and the programmer designed and coded…]
NASA Technical Reports Server (NTRS)
Lavelle, Tom
2003-01-01
The objective is to increase the usability of the current NPSS code/architecture by incorporating an advanced space transportation propulsion system capability into the existing NPSS code, to begin defining advanced capabilities for NPSS, and to provide an enhancement for the NPSS code/architecture.
Rapid Prototyping of an Aircraft Model in an Object-Oriented Simulation
NASA Technical Reports Server (NTRS)
Kenney, P. Sean
2003-01-01
A team was created to participate in the Mars Scout Opportunity. Trade studies determined that an aircraft provided the best opportunity to complete the science objectives of the team. A high fidelity six degree of freedom flight simulation was required to provide credible evidence that the aircraft design fulfilled mission objectives and to support the aircraft design process by providing performance evaluations. The team created the simulation using the Langley Standard Real-Time Simulation in C++ (LaSRS++) application framework. A rapid prototyping approach was necessary because the team had only three months to both develop the aircraft simulation model and evaluate aircraft performance as the design and mission parameters matured. The design of LaSRS++ enabled rapid-prototyping in several ways. First, the framework allowed component models to be designed, implemented, unit-tested, and integrated quickly. Next, the framework provides a highly reusable infrastructure that allowed developers to maximize code reuse while concentrating on aircraft and mission specific features. Finally, the framework reduces risk by providing reusable components that allow developers to build a quality product with a compressed testing cycle that relies heavily on unit testing of new components.
Implementing newborn mock codes.
Blakely, Teresa Gail
2007-01-01
This article describes the implementation of a newborn mock code program. Although the Neonatal Resuscitation Program (NRP) is one of the most widely used health education programs in the world and is required for most hospital providers who attend deliveries, research tells us that retention of NRP skills deteriorates rapidly after completion of the course. NRP requires coordination and cooperation among all providers; however, a lack of leadership and teamwork during resuscitation (often associated with a lack of confidence) has been noted. Implementation of newborn mock code scenarios can encourage teamwork, communication, skills building, and increased confidence levels of providers. Mock codes can help providers become strong team members and team leaders by helping them be better prepared for serious situations in the delivery room. Implementation of newborn mock codes can be effectively accomplished with appropriate planning and consideration for adult learning behaviors.
Purple L1 Milestone Review Panel TotalView Debugger Functionality and Performance for ASC Purple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, M
2006-12-12
ASC code teams require a robust software debugging tool to help developers quickly find bugs in their codes and get their codes running. Development debugging commonly runs up to 512 processes. Production jobs run up to full ASC Purple scale, and at times require introspection while running. Developers want a debugger that runs on all their development and production platforms and that works with all compilers and runtimes used with ASC codes. The TotalView Multiprocess Debugger made by Etnus was specified for ASC Purple to address this needed capability. The ASC Purple environment builds on the environment seen by TotalView on ASCI White. The debugger must now operate with the Power5 CPU, Federation switch, AIX 5.3 operating system including large pages, IBM compilers 7 and 9, POE 4.2 parallel environment, and rs6000 SLURM resource manager. Users require robust, basic debugger functionality with acceptable performance at development debugging scale. A TotalView installation must be provided at the beginning of the early user access period that meets these requirements. A functional enhancement, fast conditional data watchpoints, and a scalability enhancement, capability up to 8192 processes, are to be demonstrated.
The ZPIC educational code suite
NASA Astrophysics Data System (ADS)
Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.
2017-10-01
Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to be run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems that will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.
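As a flavor of what the 1D electrostatic member of such a suite computes, here is a minimal single-step PIC sketch in Python. It is a toy with nearest-grid-point deposition, an FFT Poisson solve, and a leapfrog push; the normalizations and function name are illustrative assumptions and do not reflect ZPIC's actual C implementation or API:

```python
import numpy as np

def pic_step(x, v, ng, L, dt, qm=-1.0):
    """One toy electrostatic PIC step for electrons on a periodic 1D grid,
    against a fixed neutralizing ion background (normalized units)."""
    dx = L / ng
    # 1) deposit charge density with nearest-grid-point weighting,
    #    normalized so a uniform plasma gives rho = 0
    idx = np.floor(x / dx + 0.5).astype(int) % ng
    rho = 1.0 - np.bincount(idx, minlength=ng) * (ng / x.size)
    # 2) solve Poisson's equation in Fourier space: phi_k = rho_k / k^2
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    k[0] = 1.0                 # placeholder; the k = 0 mode is zeroed below
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx
    # 3) gather the field at particle cells and push (leapfrog)
    v = v + qm * E[idx] * dt
    x = (x + v * dt) % L
    return x, v
```

With a uniform "quiet start" the deposited charge cancels the background exactly and the particles feel no force; perturbing the positions seeds the familiar plasma oscillation that such example problems are built around.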
The (mis)use of subjective process measures in software engineering
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.
1993-01-01
A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example, effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed, e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
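As a concrete illustration of the scaling problem described above, here is one possible way to reduce raw team data to an ordinal experience scale. The weights and thresholds are entirely invented for the example, not anything the SEL used; the point is only that the mapping from raw data to scale is itself a subjective design decision:

```python
def team_experience(years_per_member, domain_projects_per_member):
    """Map raw team data onto an ordinal high/medium/low scale.
    Weights domain-relevant project work above raw years, reflecting
    the observation that 'years' do not directly translate into
    'experience'. (Weights and cutoffs are illustrative only.)"""
    score = sum(0.5 * y + 1.5 * p
                for y, p in zip(years_per_member, domain_projects_per_member))
    score /= len(years_per_member)   # average per team member
    if score >= 6.0:
        return "high"
    if score >= 3.0:
        return "medium"
    return "low"
```

Two teams with identical total years can land in different categories once domain-relevant work is weighted, which is exactly the kind of judgment a subjective measure encodes.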
Fostering Team Awareness in Earth System Modeling Communities
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Lawson, A.; Strong, S.
2009-12-01
Existing Global Climate Models are typically managed and controlled at a single site, with varied levels of participation by scientists outside the core lab. As these models evolve to encompass a wider set of earth systems, this central control of the modeling effort becomes a bottleneck. But such models cannot evolve to become fully distributed open source projects unless they address the imbalance in the availability of communication channels: scientists at the core site have access to regular face-to-face communication with one another, while those at remote sites have access to only a subset of these conversations, e.g., formally scheduled teleconferences and user meetings. Because of this imbalance, critical decision making can be hidden from many participants, their code contributions can interact in unanticipated ways, and the community loses awareness of who knows what. We have documented some of these problems in a field study at one climate modeling centre, and started to develop tools to overcome these problems. We report on one such tool, TracSNAP, which analyzes the social network of the scientists contributing code to the model by extracting the data in an existing project code repository. The tool presents the results of this analysis to modelers and model users in a number of ways: recommendations for who has expertise on particular code modules, suggestions for code sections that are related to files being worked on, and visualizations of team communication patterns. The tool is currently available as a plugin for the Trac bug tracking system.
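The kind of repository analysis TracSNAP performs can be sketched in a few lines: link contributors who modified the same files, and record which files each contributor knows. The commit format and function name here are invented stand-ins; TracSNAP's actual interface to the project code repository is not described in the abstract:

```python
from collections import defaultdict
from itertools import combinations

def expertise_and_network(commits):
    """commits: iterable of (author, [files touched]) pairs.
    Returns an expertise map (author -> files) and an implicit social
    network (author pair -> number of shared files)."""
    touched = defaultdict(set)      # file -> authors who modified it
    expertise = defaultdict(set)    # author -> files they have worked on
    for author, files in commits:
        for f in files:
            touched[f].add(author)
            expertise[author].add(f)
    edges = defaultdict(int)        # (author_a, author_b) -> shared files
    for authors in touched.values():
        for a, b in combinations(sorted(authors), 2):
            edges[(a, b)] += 1
    return dict(expertise), dict(edges)
```

From this, "who has expertise on module X" is a lookup in the expertise map, and two scientists who repeatedly touch the same files but never appear in the same meetings are exactly the hidden coordination need such a tool surfaces.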
Virtual Engineering and Science Team - Reusable Autonomy for Spacecraft Subsystems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Johnson, Michael A.; Rilee, Michael L.; Truszkowski, Walt; Thompson, Bryan; Day, John H. (Technical Monitor)
2002-01-01
In this paper we address the design, development, and evaluation of the Virtual Engineering and Science Team (VEST) tool - a revolutionary way to achieve onboard subsystem/instrument autonomy. VEST directly addresses the technology needed for advanced autonomy enablers for spacecraft subsystems. It will significantly support the efficient and cost-effective realization of on-board autonomy and contribute directly to realizing the concept of an intelligent autonomous spacecraft. VEST will support the evolution of a subsystem/instrument model that is provably correct and, from that model, the automatic generation of the code needed to support the autonomous operation of what was modeled. VEST will directly support the integration of the efforts of engineers, scientists, and software technologists. This integration of efforts will be a significant advancement over the way things are currently accomplished. The model, developed through the use of VEST, will be the basis for the physical construction of the subsystem/instrument, and the generated code will support its autonomous operation once in space. The close coupling between the model and the code, in the same tool environment, will help ensure that correct and reliable operational control of the subsystem/instrument is achieved. VEST will provide a thoroughly modern interface that will allow users to easily and intuitively input subsystem/instrument requirements and visually get back the system's reaction to the correctness and compatibility of the inputs as the model evolves. User interface/interaction, logic, theorem proving, rule-based and model-based reasoning, and automatic code generation are some of the basic technologies that will be brought into play in realizing VEST.
Big Software for SmallSats: Adapting CFS to CubeSat Missions
NASA Technical Reports Server (NTRS)
Cudmore, Alan P.; Crum, Gary; Sheikh, Salman; Marshall, James
2015-01-01
Expanding capabilities and mission objectives for SmallSats and CubeSats is driving the need for reliable, reusable, and robust flight software. While missions are becoming more complicated and the scientific goals more ambitious, the level of acceptable risk has decreased. Design challenges are further compounded by budget and schedule constraints that have not kept pace. NASA's Core Flight Software System (cFS) is an open source solution which enables teams to build flagship satellite level flight software within a CubeSat schedule and budget. NASA originally developed cFS to reduce mission and schedule risk for flagship satellite missions by increasing code reuse and reliability. The Lunar Reconnaissance Orbiter, which launched in 2009, was the first of a growing list of Class B rated missions to use cFS. Large parts of cFS are now open source, which has spurred adoption outside of NASA. This paper reports on the experiences of two teams using cFS for current CubeSat missions. The performance overheads of cFS are quantified, and the reusability of code between missions is discussed. The analysis shows that cFS is well suited to use on CubeSats and demonstrates the portability and modularity of cFS code.
Quick Response codes for surgical safety: a prospective pilot study.
Dixon, Jennifer L; Smythe, William Roy; Momsen, Lara S; Jupiter, Daniel; Papaconstantinou, Harry T
2013-09-01
Surgical safety programs have been shown to reduce patient harm; however, there is variable compliance. The purpose of this study is to determine if innovative technology such as Quick Response (QR) codes can facilitate surgical safety initiatives. We prospectively evaluated the use of QR codes during the surgical time-out for 40 operations. Feasibility and accuracy were assessed. Perceptions of the current time-out process and the QR code application were evaluated through surveys using a 5-point Likert scale and binomial yes or no questions. At baseline (n = 53), survey results from the surgical team agreed or strongly agreed that the current time-out process was efficient (64%), easy to use (77%), and provided clear information (89%). However, 65% of surgeons felt that process improvements were needed. Thirty-seven of 40 (92.5%) QR codes scanned successfully, of which 100% were accurate. Three scan failures resulted from excessive curvature or wrinkling of the QR code label on the body. Follow-up survey results (n = 33) showed that the surgical team agreed or strongly agreed that the QR program was clearer (70%), easier to use (57%), and more accurate (84%). Seventy-four percent preferred the QR system to the current time-out process. QR codes accurately transmit patient information during the time-out procedure and are preferred to the current process by surgical team members. The novel application of this technology may improve compliance, accuracy, and outcomes. Copyright © 2013 Elsevier Inc. All rights reserved.
Clean Cities Technical Assistance Project (Tiger Teams)
DOE Office of Scientific and Technical Information (OSTI.GOV)
This two-page fact sheet describes Clean Cities' technical assistance (Tiger Teams) capabilities and projects, both completed and ongoing. Tiger Teams are a critical element of the Clean Cities program, providing on-the-ground consultation to help inform program strategies. The knowledge Tiger Team experts gain from these experiences often helps inform other alternative fuels activities, such as needed research, codes and standards revisions, and new training resources.
Development of the ICD-10 simplified version and field test.
Paoin, Wansa; Yuenyongsuwan, Maliwan; Yokobori, Yukiko; Endo, Hiroyoshi; Kim, Sukil
2018-05-01
The International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) has been used in various Asia-Pacific countries for more than 20 years. Although ICD-10 is a powerful tool, clinical coding processes are complex; therefore, many developing countries have not been able to implement ICD-10-based health statistics (WHO-FIC APN, 2007). This study aimed to simplify ICD-10 clinical coding processes, to modify index terms to facilitate computer searching, and to provide a simplified version of ICD-10 for use in developing countries. The World Health Organization Family of International Classifications Asia-Pacific Network (APN) developed a simplified version of the ICD-10 and conducted field testing in Cambodia during February and March 2016. Ten hospitals were selected to participate. Each hospital sent a team to join a training workshop before using the ICD-10 simplified version to code 100 cases. All hospitals subsequently sent their coded records to the researchers. Overall, there were 1038 coded records with a total of 1099 ICD clinical codes assigned. The average accuracy rate was calculated as 80.71% (66.67-93.41%). Three types of clinical coding errors were found: errors relating to the coder (14.56%), those resulting from the physician documentation (1.27%), and those considered system errors (3.46%). The field trial results demonstrated that the APN ICD-10 simplified version is feasible to implement and an effective tool for ICD-10 clinical coding in hospitals. Developing countries may consider adopting the APN ICD-10 simplified version for ICD-10 code assignment in hospitals and health care centres. The simplified version can be viewed as an introductory tool which leads to the implementation of the full ICD-10 and may support subsequent ICD-11 adoption.
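The reported figures are internally consistent: the three error categories account exactly for the records that were not coded accurately. A quick arithmetic check:

```python
# Reported ICD-10 simplified-version field-test figures (percent)
coder_err, physician_err, system_err = 14.56, 1.27, 3.46
reported_accuracy = 80.71

# The error categories plus the accuracy rate should sum to 100%
total_err = coder_err + physician_err + system_err   # 19.29
assert abs(total_err + reported_accuracy - 100.0) < 0.01
```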
WEC-SIM Validation Testing Plan FY14 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruehl, Kelley Michelle
2016-02-01
The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will continue through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and has working ideas on the testing details such as scaling, device design, and test conditions.
Agile Methods: Selected DoD Management and Acquisition Concerns
2011-10-01
Using a Sociomatrix to Evaluate the Effectiveness of Small-Group Teaching to Residents.
ERIC Educational Resources Information Center
Toffler, William L.; And Others
1990-01-01
A team-developed sociomatrix coding sheet was used to classify behaviors seen on videotaped small-group teaching sessions held at the end of patient-care activities. Although the sociomatrix appears to have potential to relate specific leader behavior to residents' feedback on subjective learning outcomes, the number of observations was too small…
Development of N-version software samples for an experiment in software fault tolerance
NASA Technical Reports Server (NTRS)
Lauterbach, L.
1987-01-01
The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.
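N-version fault tolerance of the kind these twenty versions support typically runs the independently developed versions on the same input and votes on their outputs. A minimal, illustrative majority-vote sketch (not taken from the report; names and thresholds are hypothetical):

```python
from collections import Counter

def majority_vote(outputs, min_agreement=2):
    """Return the output produced by at least min_agreement versions,
    or None if no value reaches the agreement threshold."""
    tally = Counter(outputs)
    value, count = tally.most_common(1)[0]
    return value if count >= min_agreement else None

# Three independently developed versions process the same input;
# one version is faulty, so the voter masks its output.
assert majority_vote([42, 42, 41]) == 42
# No majority: every version disagrees, so the voter reports failure.
assert majority_vote([1, 2, 3]) is None
```

With 20 versions, the threshold would normally be a strict majority (11), and any disagreement would be logged for later analysis in the fault-tolerance experiment rather than silently discarded.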
Riordan, Rick
2013-01-01
Background/Aims With the implementation of ICD-10-CM and ICD-10-PCS less than two years away, there are still unanswered questions as to how research teams will effectively translate or use ICD-10 codes in research. Approximately 84% of ICD-10 codes have only approximate matches between ICD-9 and ICD-10, 10% have multiple matches, and only 5% have exact one-to-one matches. With the number of codes increasing five-fold, this offers additional opportunities and risks when pulling data. Methods Besides looking at the General Equivalency Mappings (GEMs) and other tools that are used to translate ICD-9 codes to ICD-10 codes, we will examine some common research areas where only approximate matches between ICD-9 and ICD-10 exist. We will also discuss how the finer level of detail that ICD-10 gives allows research teams to pinpoint exactly what type of asthma, Crohn’s disease, and diabetic retinopathy they wish to study without including other cases that do not meet their research criteria. Results There are significant ambiguities and irregularities in several common areas such as diabetes, mental health, asthma, and gastroenterology due to approximate, multiple, or combination matches. Even where an exact match exists, such as an old myocardial infarction, the definition of when a myocardial infarction becomes “old” differs between the two systems. Conclusions ICD-10 offers a finer level of detail and a higher level of specificity, thereby allowing research teams to be more targeted when pulling data. On the other hand, research teams need to exercise caution when using GEMs and other tools to translate ICD-9 codes into ICD-10 codes and vice versa, especially if they are looking at data that overlaps the implementation date of October 1, 2014.
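The GEM lookups discussed above are essentially one-to-many mappings whose match type determines how much caution a research team needs when pulling data. A toy sketch of such a lookup (the mapping entries below are illustrative only, not authoritative GEM content):

```python
# Toy general-equivalency mapping: ICD-9 code -> candidate ICD-10 codes.
# Entries are illustrative examples, not an excerpt of the real GEM files.
gem = {
    "412":    ["I25.2"],               # old myocardial infarction (one-to-one)
    "493.90": ["J45.909", "J45.998"],  # asthma, unspecified (multiple candidates)
}

def classify_match(icd9_code):
    """Report whether a code translates one-to-one, to multiple candidates,
    or not at all, so a research team knows where manual review is needed."""
    candidates = gem.get(icd9_code, [])
    if not candidates:
        return "no match"
    return "one-to-one" if len(candidates) == 1 else "multiple"

assert classify_match("412") == "one-to-one"
assert classify_match("493.90") == "multiple"
assert classify_match("250.00") == "no match"
```

A real workflow would load the full GEM files and also record the approximate and combination flags; the point here is only that multiple-candidate entries are exactly where blindly translated queries can pull the wrong cases.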
NASA Technical Reports Server (NTRS)
Cowen, Benjamin
2011-01-01
Simulations are essential for engineering design. These virtual realities provide characteristic data to scientists and engineers so they can understand the details and complications of the desired mission. A standard simulation development package known as Trick is used to develop source code that models a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for a federation execution, to develop source code for communication between federates, and to foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, relying on simulation interoperability. Participants communicated throughout the semester to work out how to create this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot and Mobile ISRU Plant) as well as transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from participating universities will work on and engineer their own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper will focus on the Lunar Rover federate.
ERIC Educational Resources Information Center
Brunner, Marjorie L., Ed.; Casto, R. Michael, Ed.
The following are among the 40 papers included in this proceedings: "Code of Ethics for Interdisciplinary Care" (Thomasma); "Training Model for Increasing Team Excellence and Efficiency" (Clayton, Lund); "Organizational Structures of Health Care Teams" (Farrell, Schmitt, Heinemann); "Nutrition Support Practice" (Johnson); "Dividing up the Work on…
NASA Astrophysics Data System (ADS)
Yasar, Senay
Collaborative teamwork is a common practice in both science and engineering schools and workplaces. This study, using a mixed-methods approach, was designed to identify which team discourse characteristics are correlated with changes in student self-efficacy and achievement. Bandura's self-efficacy theory constitutes the theoretical framework. Seven teams, consisting of first-year engineering students, took the pre- and post-surveys and were video- and audio-recorded during a semester-long Introduction to Engineering Design course. Three instruments were developed: a self-efficacy survey, a team interaction observation protocol, and a team interaction self-report survey. The reliability and validity of these instruments were established. An iterative process of code development and refinement led to thirty-five discourse types, grouped under six discourse categories: task-oriented, response-oriented, learning-oriented, support-oriented, challenge-oriented, and disruptive. The results of the quantitative data analysis showed that achievement and gain in self-efficacy were significantly correlated (r = .55, p < .01). There was also a positive correlation between support-oriented discourse and post self-efficacy scores (r = .43, p < .05). Negative correlations were observed between disruptive discourse behaviors and post self-efficacy (r = -.48, p < .05). Neither being challenged by peers nor receiving negative feedback revealed significant correlations with student self-efficacy. In addition, no direct correlations between the team discourse characteristics and achievement were found. These findings suggest that collaborative teamwork can lead to achievement to the extent that it supports self-efficacy. They also suggest that interactions such as receiving positive or negative feedback have less impact on self-efficacy than does the overall constructive behavior of the group.
The qualitative component of the study, which focused on three case studies, presents how supportive and disruptive interactions occurred during team discourse. Discussion includes recommendations for educators on how to help teams build supportive environments as well as what to look for when forming teams and evaluating student team interactions.
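The r values reported above are Pearson product-moment correlations over team-level scores, and the computation itself is straightforward. A self-contained sketch with invented data (the numbers below are not from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented scores: self-efficacy gain vs. achievement for five teams.
gain = [0.2, 0.5, 0.1, 0.8, 0.4]
achievement = [70, 82, 65, 91, 78]
r = pearson_r(gain, achievement)
assert 0.9 < r < 1.0  # strongly positively correlated in this toy data
```

Significance testing (the reported p values) would additionally compare r against a t distribution with n - 2 degrees of freedom, which is omitted here for brevity.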
Understanding Lustre Internals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Feiyi; Oral, H Sarp; Shipman, Galen M
2009-04-01
Lustre was initiated and funded, almost a decade ago, by the U.S. Department of Energy (DOE) Office of Science and National Nuclear Security Administration laboratories to address the need for an open-source, highly scalable, high-performance parallel filesystem on then-present and future supercomputing platforms. Throughout the last decade, it was deployed over numerous medium- to large-scale supercomputing platforms and clusters, and it performed well and met the expectations of the Lustre user community. At the time of writing, according to the Top500 list, 15 of the top 30 supercomputers in the world use the Lustre filesystem. This report aims to present a streamlined overview of how Lustre works internally, in reasonable detail, including the relevant data structures, APIs, protocols and algorithms involved, for the Lustre version 1.6 source code base. More importantly, it tries to explain how the various components interconnect with each other and function as a system. Portions of this report are based on discussions with Oak Ridge National Laboratory Lustre Center of Excellence team members, and portions of it are based on our own understanding of how the code works. We, the author team, bear all responsibility for errors and omissions in this document. We can only hope it helps current and future Lustre users and Lustre code developers as much as it helped us understand the Lustre source code and its internal workings.
NASA Astrophysics Data System (ADS)
Hannah, M. A.; Simeone, M.
2017-12-01
On interdisciplinary teams, expertise is varied, as is evidenced by differences in team members' language use. Developing strategies to combine that expertise and bridge differentiated language practices is especially difficult between geoscience subdisciplines, because researchers assume they use a shared language: vocabulary, jargon, codes, linguistic styles. In our paper, we discuss a network-based approach used to identify varied expertise and language practices between geoscientists (n=29) on an NSF team funded to study how deep and surface Earth processes worked together to give rise to the Great Oxygenation Event. We describe how we modeled the team's expertise from a language corpus consisting of 220 oxygen-related terms frequently used by team members and then compared their understanding of the terms to develop interventions to bridge the team's expertise. Corpus terms were identified via team member interviews, observations of members' interactions at research meetings, and discourse analysis of members' publications. Comparisons of members' language use were based on a Likert-scale survey that asked members to assess how they understood a term; how frequently they used a term; and whether they conceptualized a term as an object or a process. Rather than use our method as a communication audit tool (Zwijze-Koning & de Jong, 2015), teams can proactively use it in a project's early stages to assess the contours of the team's differentiated expertise and show where specialized knowledge resides in the team, where latent or non-obvious expertise exists, where expertise overlaps, and where gaps are in the team's knowledge. With this information, teams can make evidence-based recommendations to advance their work, such as allocating resources; identifying and empowering members to serve as connectors and lead cross-functional project initiatives; and developing strategies to avoid communication barriers.
The method also generates models for teaching language sensitivity to subdisciplinary colleagues by making visible the nuanced ways they use language to organize and communicate their research. Ultimately, understanding the impact of differentiated language use is an unmet need in Earth science research, and our method offers a unique way to visualize and understand how such use impacts team communication.
Rosenman, Elizabeth D; Dixon, Aurora J; Webb, Jessica M; Brolliar, Sarah; Golden, Simon J; Jones, Kerin A; Shah, Sachita; Grand, James A; Kozlowski, Steve W J; Chao, Georgia T; Fernandez, Rosemarie
2018-02-01
Team situational awareness (TSA) is critical for effective teamwork and supports dynamic decision making in unpredictable, time-pressured situations. Simulation provides a platform for developing and assessing TSA, but these efforts are limited by suboptimal measurement approaches. The objective of this study was to develop and evaluate a novel approach to TSA measurement in interprofessional emergency medicine (EM) teams. We performed a multicenter, prospective, simulation-based observational study to evaluate an approach to TSA measurement. Interprofessional emergency medical teams, consisting of EM resident physicians, nurses, and medical students, were recruited from the University of Washington (Seattle, WA) and Wayne State University (Detroit, MI). Each team completed a simulated emergency resuscitation scenario. Immediately following the simulation, team members completed a TSA measure, a team perception of shared understanding measure, and a team leader effectiveness measure. Subject matter expert reviews and pilot testing of the TSA measure provided evidence of content and response process validity. Simulations were recorded and independently coded for team performance using a previously validated measure. The relationships between the TSA measure and other variables (team clinical performance, team perception of shared understanding, team leader effectiveness, and team experience) were explored. The TSA agreement metric was indexed by averaging the pairwise agreement for each dyad on a team and then averaging across dyads to yield agreement at the team level. For the team perception of shared understanding and team leadership effectiveness measures, individual team member scores were aggregated within a team to create a single team score. We computed descriptive statistics for all outcomes. We calculated Pearson's product-moment correlations to determine bivariate correlations between outcome variables with two-tailed significance testing (p < 0.05). 
A total of 123 participants were recruited and formed three-person teams (n = 41 teams). All teams completed the assessment scenario and postsimulation measures. TSA agreement ranged from 0.19 to 0.9 and had a mean (±SD) of 0.61 (±0.17). TSA correlated with team clinical performance (p < 0.05) but did not correlate with team perception of shared understanding, team leader effectiveness, or team experience. Team situational awareness supports adaptive teams and is critical for high reliability organizations such as healthcare systems. Simulation can provide a platform for research aimed at understanding and measuring TSA. This study provides a feasible method for simulation-based assessment of TSA in interdisciplinary teams that addresses prior measure limitations and is appropriate for use in highly dynamic, uncertain situations commonly encountered in emergency department systems. Future research is needed to understand the development of and interactions between individual-, team-, and system (distributed)-level cognitive processes. © 2017 by the Society for Academic Emergency Medicine.
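The agreement metric described above, pairwise agreement averaged within each dyad and then across a team's dyads, can be sketched directly. The per-dyad agreement function below is a placeholder assumption (fraction of identical item responses), not the study's actual scoring rule, and the data are invented:

```python
from itertools import combinations

def dyad_agreement(a, b):
    """Placeholder pairwise agreement: fraction of identical item responses."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def team_agreement(responses):
    """Average pairwise (dyad) agreement across all member pairs on a team."""
    dyads = list(combinations(responses, 2))
    return sum(dyad_agreement(a, b) for a, b in dyads) / len(dyads)

# Three-person team, four situational-awareness items each (invented data).
team = [
    ["bleed", "shock", "airway", "fluids"],
    ["bleed", "shock", "airway", "pressors"],
    ["bleed", "arrest", "airway", "fluids"],
]
score = team_agreement(team)
assert abs(score - 2 / 3) < 1e-9  # dyad agreements 0.75, 0.75, 0.5 average to 2/3
```

The resulting team-level score sits in [0, 1], matching the reported range of 0.19 to 0.9, and would then be correlated against performance and perception measures as in the study.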
Role Allocation and Team Structure in Command and Control Teams
2014-06-01
Understanding Treatment of Mild Traumatic Brain Injury in the Military Health System
2016-04-18
A new approach for instrument software at Gemini
NASA Astrophysics Data System (ADS)
Gillies, Kim; Nunez, Arturo; Dunn, Jennifer
2008-07-01
Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments, together with current hardware and software technology, to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach, which couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.
Lawrence, Daphne
2009-01-01
CIOs should act as a team with HIM and Finance to prepare for RAC audits. CIOs can take the lead in looking at improved coding systems, and can be involved in creating policies and procedures for the hospital's RAC team. RAC is an opportunity to improve documentation, coding and data analysis. RAC appeals will become more common as states share lessons learned. Follow the money and check on claims that are frequently returned.
Lamers, Audri; van Nieuwenhuizen, Chijs; Twisk, Jos; de Koning, Erica; Vermeiren, Robert
2016-01-01
In a semi-residential setting where children switch daily between treatment and home, establishing a strong parent-team alliance can be a challenge. The alliance with parents might be strengthened, and the child's symptoms reduced, by a structured investment from treatment team members. Participants were caregivers and treatment team members of 46 children (6-12 years) who received semi-residential psychiatric treatment. An A-B design was applied, in which the first 22 children were assigned to the comparison group receiving treatment as usual and the next 24 to the experimental group, where treatment team members used additional alliance-building strategies. Alliance and symptom questionnaires were filled out at three-month intervals during both treatment conditions. Parent-treatment team interactions, recorded on DVD, were coded according to members' adherence to these strategies. Multilevel analyses (using MLwiN) showed that, based on reports of primary caregivers and a case manager, the alliance-building strategies had a statistically significant effect on the strength of the therapeutic alliance between treatment team members and parents. In addition, primary caregivers in the experimental group reported significantly fewer hyperactivity symptoms in their children. Despite the methodological challenge of examining therapeutic processes in this complex treatment setting, this study supports the benefits of structured investment in the parent-team alliance.
ERIC Educational Resources Information Center
Stenton, Anthony
2013-01-01
The CNRS-financed authoring system SWANS (Synchronised Web Authoring Notation System), now used in several CercleS centres, was developed by teams from four laboratories as a personalised learning tool for the purpose of making available knowledge about lexical stress patterns and mother-tongue interference in L2 speech production--helping…
USL/DBMS NASA/PC R and D project C programming standards
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Moreau, Dennis R.
1984-01-01
A set of programming standards intended to promote reliability, readability, and portability of C programs written for PC research and development projects is established. These standards must be adhered to except where reasons for deviation are clearly identified and approved by the PC team. Any approved deviation from these standards must also be clearly documented in the pertinent source code.
Modeling and Diagnostic Software for Liquefying-Fuel Rockets
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann
2005-01-01
A report presents a study of five modeling and diagnostic computer programs considered for use in an integrated vehicle health management (IVHM) system during testing of liquefying-fuel hybrid rocket engines in the Hybrid Combustion Facility (HCF) at NASA Ames Research Center. Three of the programs -- TEAMS, L2, and RODON -- are model-based reasoning (or diagnostic) programs. The other two programs -- ICS and IMS -- do not attempt to isolate the causes of failures but can be used for detecting faults. In the study, qualitative models (in TEAMS and L2) and quantitative models (in RODON) having varying scope and completeness were created. Each of the models captured the structure and behavior of the HCF as a physical system. It was noted that in the cases of the qualitative models, the temporal aspects of the behavior of the HCF and the abstraction of sensor data are handled outside of the models, and it is necessary to develop additional code for this purpose. A need for additional code was also noted in the case of the quantitative model, though the amount of development effort needed was found to be less than that for the qualitative models.
2013 R&D 100 Award: DNATrax could revolutionize air quality detection and tracking
Farquar, George
2018-01-16
A team of LLNL scientists and engineers has developed a safe and versatile material, known as DNA Tagged Reagents for Aerosol Experiments (DNATrax), that can be used to reliably and rapidly diagnose airflow patterns and problems in both indoor and outdoor venues. Until DNATrax particles were developed, no rapid or safe way existed to validate air transport models with realistic particles in the range of 1-10 microns. Successful DNATrax testing was conducted at the Pentagon in November 2012 in conjunction with the Pentagon Force Protection Agency. This study enhanced the team's understanding of indoor ventilation environments created by heating, ventilation and air conditioning (HVAC) systems. DNATrax are particles comprised of sugar and synthetic DNA that serve as a bar code for the particle. The potential for creating unique bar-coded particles is virtually unlimited, thus allowing for simultaneous and repeated releases, which dramatically reduces the costs associated with conducting tests for contaminants. Among the applications for the new material are indoor air quality detection, for homes, offices, ships and airplanes; urban particulate tracking, for subway stations, train stations, and convention centers; environmental release tracking; and oil and gas uses, including fracking, to better track fluid flow.
Why and how Mastering an Incremental and Iterative Software Development Process
NASA Astrophysics Data System (ADS)
Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe
2004-06-01
One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process, which must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are going to be presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages: - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed in priority, validating the architecture concept very early without the details. - A software prototype is available very quickly. It improves the communication between system and software teams, as it enables checking very early and efficiently the common understanding of the system requirements. - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development.
In any case, it greatly improves the learning curve of the software team. These advantages seem very attractive, but mastering an iterative development process efficiently is not so easy and raises difficulties such as: - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable? - How to distinguish stable/unstable and dimensioning/standard requirements? - How to plan the development of each increment? - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc. Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and a technological point of view: - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way. - How the CMM approach can help by better formalizing Requirements Management and Planning processes. - How Automatic Code Generation with "certified" tools (SCADE) can still dramatically shorten the development cycle. The presentation will conclude with an evaluation of the cost and planning reduction based on a pilot application, comparing figures from two similar projects: one with the classical waterfall process, the other with an iterative and incremental approach.
Development of the Orion Crew Module Static Aerodynamic Database. Part 1; Hypersonic
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Walker, Eric L.; Robinson, Philip E.
2011-01-01
The Orion aerodynamic database provides force and moment coefficients given the velocity, attitude, configuration, etc. of the Crew Exploration Vehicle (CEV). The database is developed and maintained by the NASA CEV Aerosciences Project team from computational and experimental aerodynamic simulations. The database is used primarily by the Guidance, Navigation, and Control (GNC) team to design vehicle trajectories and assess flight performance. The initial hypersonic re-entry portion of the Crew Module (CM) database was developed in 2006. Updates incorporating additional data and improvements to the database formulation and uncertainty methodologies have been made since then. This paper details the process used to develop the CM database, including nominal values and uncertainties, for Mach numbers greater than 8 and angles of attack between 140deg and 180deg. The primary available data are more than 1000 viscous, reacting gas chemistry computational simulations using both the LAURA and DPLR codes, over a range of Mach numbers from 2 to 37 and a range of angles of attack from 147deg to 172deg. Uncertainties were based on grid convergence, laminar-turbulent solution variations, combined altitude and code-to-code variations, and expected heatshield asymmetry. A radial basis function response surface tool, NEAR-RS, was used to fit the coefficient data smoothly in a velocity-angle-of-attack space. The resulting database is presented and includes some data comparisons and a discussion of the predicted variation of trim angle of attack and lift-to-drag ratio. The database provides a variation in trim angle of attack on the order of +/-2deg, and a range in lift-to-drag ratio of +/-0.035 for typical vehicle flight conditions.
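A radial basis function response surface of the kind NEAR-RS provides can be illustrated in miniature. The sketch below is a generic Gaussian-RBF interpolant over invented (Mach, angle-of-attack) samples, not the actual NEAR-RS formulation or Orion data:

```python
import math

def dist2(p, q):
    """Squared Euclidean distance between two sample points."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def solve(a, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(b)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(a[r][c] * w[c] for c in range(r + 1, n))) / a[r][r]
    return w

def rbf_fit(points, values, eps):
    """Fit Gaussian RBF weights by solving the square system Phi w = f."""
    n = len(points)
    phi = [[math.exp(-eps * dist2(points[i], points[j])) for j in range(n)]
           for i in range(n)]
    return solve(phi, list(values))

def rbf_eval(points, weights, x, eps):
    """Evaluate the fitted response surface at a new point x."""
    return sum(w * math.exp(-eps * dist2(x, p)) for w, p in zip(weights, points))

# Toy "aero database": a coefficient sampled at (Mach, alpha) points (invented).
pts = [(8, 150), (8, 170), (20, 150), (20, 170)]
vals = [0.30, 0.35, 0.28, 0.33]
w = rbf_fit(pts, vals, eps=1e-3)
# The interpolant reproduces the sampled values at the data points.
assert all(abs(rbf_eval(pts, w, p, eps=1e-3) - v) < 1e-6 for p, v in zip(pts, vals))
```

A production tool like NEAR-RS additionally handles basis selection, smoothing, and uncertainty propagation; this sketch shows only the interpolation core that makes the fitted surface pass smoothly through the computed coefficients.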
NASA Astrophysics Data System (ADS)
Kulas, M.; Borelli, Jose Luis; Gässler, Wolfgang; Peter, Diethard; Rabien, Sebastian; Orban de Xivry, Gilles; Busoni, Lorenzo; Bonaglia, Marco; Mazzoni, Tommaso; Rahmer, Gustavo
2014-07-01
Commissioning time for an instrument at an observatory is precious, especially the night time. Whenever astronomers come up with a software feature request or point out a software defect, the software engineers have the task of finding a solution and implementing it as fast as possible. In this project phase, the software engineers work under time pressure and stress to deliver functional instrument control software (ICS). The shortness of development time during commissioning is a constraint for software engineering teams and applies to the ARGOS project as well. The goal of the ARGOS (Advanced Rayleigh guided Ground layer adaptive Optics System) project is the upgrade of the Large Binocular Telescope (LBT) with an adaptive optics (AO) system consisting of six Rayleigh laser guide stars and wavefront sensors. For developing the ICS, we used the technique Test-Driven Development (TDD), whose main rule demands that the programmer write test code before production code. Thereby, TDD can yield a software system that grows without defects and eases maintenance. Having applied TDD in a calm and relaxed environment like the office and laboratory, the ARGOS team has profited from the benefits of TDD. Before the commissioning, we were worried that the time pressure in that tough project phase would force us to drop TDD, because we would spend more time writing test code than it would be worth. Despite this concern, we were able to keep TDD most of the time in this project phase as well. This report describes the practical application and performance of TDD, including its benefits, limitations and problems, during the ARGOS commissioning. Furthermore, it covers our experience with pair programming and continuous integration at the telescope.
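TDD's test-first rule means a failing test exists before the production code it exercises. A generic sketch of the cycle (the class and behaviour are hypothetical illustrations, unrelated to the actual ARGOS ICS code):

```python
import unittest

# Step 1, test first: this test is (notionally) written before the
# production class below exists, so at that point it fails to run.
class TestLaserShutter(unittest.TestCase):
    def test_opens_only_when_interlock_clear(self):
        self.assertTrue(LaserShutter(interlock_clear=True).open())
        self.assertFalse(LaserShutter(interlock_clear=False).open())

# Step 2, production code: the minimal implementation that makes the
# test pass; refactoring then proceeds with the test as a safety net.
class LaserShutter:
    def __init__(self, interlock_clear):
        self.interlock_clear = interlock_clear
        self.is_open = False

    def open(self):
        if self.interlock_clear:
            self.is_open = True
        return self.is_open

# Run the test suite explicitly; a green run closes the TDD cycle.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestLaserShutter)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Under commissioning pressure the temptation is to skip step 1; the report's point is that keeping it pays off because every fix arrives with its own regression test.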
NASA Astrophysics Data System (ADS)
Adler, D. S.
2000-12-01
The Science Planning and Scheduling Team (SPST) of the Space Telescope Science Institute (STScI) has historically operated exclusively under VMS. Due to diminished support for VMS-based platforms at STScI, SPST is in the process of transitioning to Unix operations. In the summer of 1999, SPST selected Python as the primary scripting language for the operational tools and began translation of the VMS DCL code. As of October 2000, SPST has installed a utility library of 16 modules consisting of 8000 lines of code and 80 Python tools consisting of 13000 lines of code. All tasks related to calendar generation have been switched to Unix operations. Current work focuses on translating the tools used to generate the Science Mission Specifications (SMS). The software required to generate the Mission Schedule and Command Loads (PASS), maintained by another team at STScI, will take longer to translate than the rest of the SPST operational code. SPST is planning on creating tools to access PASS from Unix in the short term. We are on schedule to complete the work needed to fully transition SPST to Unix operations (while remotely accessing PASS on VMS) by the fall of 2001.
MacNaughton, Kate; Chreim, Samia; Bourgeault, Ivy Lynn
2013-11-24
Role construction and boundaries in interprofessional primary health care teams: a qualitative study
2013-01-01
Background The move towards enhancing teamwork and interprofessional collaboration in health care raises issues regarding the management of professional boundaries and the relationship among health care providers. This qualitative study explores how roles are constructed within interprofessional health care teams. It focuses on elucidating the different types of role boundaries, the influences on role construction and the implications for professionals and patients. Methods A comparative case study was conducted to examine the dynamics of role construction on two interprofessional primary health care teams. The data collection included interviews and non-participant observation of team meetings. Thematic content analysis was used to code and analyze the data and a conceptual model was developed to represent the emergent findings. Results The findings indicate that role boundaries can be organized around interprofessional interactions (giving rise to autonomous or collaborative roles) as well as the distribution of tasks (giving rise to interchangeable or differentiated roles). Different influences on role construction were identified. They are categorized as structural (characteristics of the workplace), interpersonal (dynamics between team members such as trust and leadership) and individual dynamics (personal attributes). The implications of role construction were found to include professional satisfaction and more favourable wait times for patients. A model that integrates these different elements was developed. Conclusions Based on the results of this study, we argue that autonomy may be an important element of interprofessional team functioning. Counter-intuitive as this may sound, we found that empowering team members to develop autonomy can enhance collaborative interactions. 
We also argue that while more interchangeable roles could help to lessen the workloads of team members, they could also increase the potential for power struggles because the roles of various professions would become less differentiated. We consider the conceptual and practical implications of our findings and we address the transferability of our model to other interprofessional teams. PMID:24267663
Using Qualitative Methods to Evaluate a Family Behavioral Intervention for Type 1 Diabetes
Herbert, Linda Jones; Sweenie, Rachel; Kelly, Katherine Patterson; Holmes, Clarissa; Streisand, Randi
2013-01-01
Introduction The objectives of this study were to qualitatively evaluate a dyadic adolescent-parent type 1 diabetes (T1D) program developed to prevent deterioration in diabetes care among adolescents with T1D and provide recommendations for program refinement. Method Thirteen adolescent-parent dyads who participated in the larger RCT, the TeamWork Project, were interviewed regarding their perceptions of their participation in the program and current T1D challenges. Interviews were transcribed and coded to establish broad themes. Results Adolescents and parents thought the TeamWork Project sessions were helpful and taught them new information. Five themes catalog findings from the qualitative interviews: TeamWork content, TeamWork structure, transition of responsibility, current and future challenges, and future intervention considerations. Discussion Addressing T1D challenges as a parent-adolescent dyad via a behavioral clinic program is helpful to families during adolescence. Findings highlight the utility of qualitative evaluation to tailor interventions for the unique challenges related to pediatric chronic illness. PMID:24269281
National Combustion Code: Parallel Implementation and Performance
NASA Technical Reports Server (NTRS)
Quealy, A.; Ryder, R.; Norris, A.; Liu, N.-S.
2000-01-01
The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. CORSAIR-CCD is the current baseline reacting flow solver for NCC. This is a parallel, unstructured grid code which uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC flow solver to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This paper describes the parallel implementation of the NCC flow solver and summarizes its current parallel performance on an SGI Origin 2000. Earlier parallel performance results on an IBM SP-2 are also included. The performance improvements which have enabled a turnaround of less than 15 hours for a 1.3 million element fully reacting combustion simulation are described.
When study participants are vulnerable: getting and keeping the right team.
Hill, Nikki L; Mogle, Jacqueline; Wion, Rachel; Kolanowski, Ann M; Fick, Donna; Behrens, Liza; Muhall, Paula; McDowell, Jane
2017-09-19
Research assistants (RAs) are critical members of all research teams. When a study involves vulnerable populations, it is particularly important to have the right team members. To describe the motivations, personal characteristics and team characteristics that promoted the job satisfaction of RAs who worked on two multi-year, randomised clinical trials involving older adults with dementia. A survey was conducted with 41 community members who worked as RAs for up to five years. Measures included demographics, work engagement, personality and characteristics of effective teams, as well as open-ended questions about respondents' experiences of the study. Quantitative analyses and coding of open-ended responses were used to summarise results. Almost all the RAs surveyed joined the team because of previous experiences of interacting with cognitively impaired older people. The RA respondents scored higher in 'dedication to work', 'extraversion', 'agreeableness' and 'conscientiousness' than average. An important aspect of their job satisfaction was team culture, including positive interpersonal interaction and the development of supportive team relationships. A positive work culture provides RAs with an opportunity to work with a study population that they are personally driven to help, and promotes motivation and satisfaction in team members. Results from this study can guide the recruitment, screening and retention of team members for studies that include vulnerable populations. ©2012 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
Multiphysics Code Demonstrated for Propulsion Applications
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Melis, Matthew E.
1998-01-01
The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.
Cultural and Technological Issues and Solutions for Geodynamics Software Citation
NASA Astrophysics Data System (ADS)
Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.
2014-12-01
Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.
The role of the primary care team in the rapid response system.
O'Horo, John C; Sevilla Berrios, Ronaldo A; Elmer, Jennifer L; Velagapudi, Venu; Caples, Sean M; Kashyap, Rahul; Jensen, Jeffrey B
2015-04-01
The purpose of the study is to evaluate the impact of primary service involvement on rapid response team (RRT) evaluations. The study is a combination of retrospective chart review and prospective survey-based evaluation. Data included when and where the activations occurred and the patient's code status, primary service, and ultimate disposition. These data were correlated with survey data from each event. A prospective survey evaluated the primary team's involvement in decision making and the overall subjective quality of the interaction with primary service through a visual analog scale. We analyzed 4408 RRTs retrospectively and an additional 135 prospectively. The primary team's involvement by telephone or in person was associated with significantly more transfers to higher care levels in retrospective (P < .01) and prospective data sets. Code status was addressed more frequently in primary team involvement, with more frequent changes seen in the retrospective analysis (P = .01). Subjective ratings of communication by the RRT leader were significantly higher when the primary service was involved (P < .001). Active primary team involvement influences RRT activation processes of care. The RRT role should be an adjunct to, but not a substitute for, an engaged and present primary care team. Copyright © 2014 Elsevier Inc. All rights reserved.
Pawnee Nation Energy Option Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matlock, M.; Kersey, K.; Riding In, C.
2009-07-31
In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. At the request of Pawnee Nation’s Energy Task Force, the research team, consisting of Tribal personnel and Summit Blue Consulting, focused on a review of renewable energy resource development potential, funding sources, and utility organization options, along with energy savings options. Elements of the energy demand forecasting and characterization and demand side options review remained in the scope of work, but were only addressed at a high level. Description of Activities Performed: Renewable Energy Resource Development Potential. The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Energy Efficiency Options. While this was not a major focus of the project, the research team highlighted common strategies for reducing energy use in buildings.
The team also discussed the benefits of adopting a building energy code and introduced two model energy codes that Pawnee Nation should consider for adoption. Summary of Current and Expected Future Electricity Usage. The research team provided a summary overview of electricity usage patterns in current buildings and included discussion of known plans for new construction. Utility Options Review. Pawnee Nation electric utility options were analyzed through a four-phase process, which included: 1) summarizing the relevant utility background information; 2) gathering relevant utility assessment data; 3) developing a set of realistic Pawnee electric utility service options; and 4) analyzing the various Pawnee electric utility service options for the Pawnee Energy Team’s consideration. III. Findings and Recommendations. Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor market developments in the bio-energy industry and establish contacts with research institutions with which the tribe could potentially partner in grant-funded research initiatives. In addition, a substantial effort by the Kaw and Cherokee tribes is underway to pursue wind development at the Chilocco School Site in northern Oklahoma, where Pawnee is a joint landowner.
Pawnee Nation representatives should become actively involved in these development discussions and should explore the potential for joint investment in wind development at the Chilocco site.
Implementing a rapid response team: factors influencing success.
Murray, Theresa; Kleinpell, Ruth
2006-12-01
Rapid response teams (RRTs), or medical emergency teams, focus on preventing a patient crisis by addressing changes in patient status before a cardiopulmonary arrest occurs. In responding to acute changes, RRTs and medical emergency teams are similar to "code" teams; the exception, however, is that they step into action before a patient arrests. Although RRTs are acknowledged as an important initiative, implementation can present many challenges. This article reports on the implementation and ongoing use of an RRT at a community health care setting, highlighting important considerations and strategies for success.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Pengcheng; Mcclure, Mark; Shiozawa, Sogo
A series of experiments performed at the Fenton Hill hot dry rock site after stage 2 drilling of the Phase I reservoir provided intriguing field observations on the reservoir’s responses to injection and venting under various conditions. Two teams participating in the US DOE Geothermal Technologies Office (GTO)’s Code Comparison Study (CCS) used different numerical codes to model these five experiments with the objective of inferring the hydraulic stimulation mechanism involved. The codes used by the two teams are based on different numerical principles, and the assumptions made were also different, due to intrinsic limitations in the codes and the modelers’ personal interpretations of the field observations. Both sets of models were able to reproduce the most important field observations, and both found that it was the combination of the vertical gradient of the fracture opening pressure, injection volume, and the use/absence of proppant that yielded the different outcomes of the five experiments.
Status of the Space Radiation Monte Carlo Simulation Based on FLUKA and ROOT
NASA Technical Reports Server (NTRS)
Andersen, Victor; Carminati, Federico; Empl, Anton; Ferrari, Alfredo; Pinsky, Lawrence; Sala, Paola; Wilson, Thomas L.
2002-01-01
The NASA-funded project reported on at the first IWSSRR in Arona to develop a Monte Carlo simulation program for use in simulating the space radiation environment, based on the FLUKA and ROOT codes, is well into its second year of development, and considerable progress has been made. The general tasks required to achieve the final goals include the addition of heavy-ion interactions into the FLUKA code and the provision of a ROOT-based interface to FLUKA. The most significant progress to date includes the incorporation of the DPMJET event generator code within FLUKA to handle heavy-ion interactions for incident projectile energies greater than 3 GeV/A. The ongoing effort intends to extend the treatment of these interactions down to 10 MeV, and at present two alternative approaches are being explored. The ROOT interface is being pursued in conjunction with the CERN LHC ALICE software team through an adaptation of their existing AliROOT software. As a check on the validity of the code, a simulation of the recent data taken by the ATIC experiment is underway.
Team Leader Structuring for Team Effectiveness and Team Learning in Command-and-Control Teams.
van der Haar, Selma; Koeslag-Kreunen, Mieke; Euwe, Eline; Segers, Mien
2017-04-01
Due to their crucial and highly consequential task, it is of utmost importance to understand the levers leading to effectiveness of multidisciplinary emergency management command-and-control (EMCC) teams. We argue that the formal EMCC team leader needs to initiate structure in the team meetings to support organizing the work as well as to facilitate team learning, especially the team learning process of constructive conflict. In a sample of 17 EMCC teams performing a realistic EMCC exercise, including one or two team meetings (28 in sum), we coded the team leaders' verbal structuring behaviors (1,704 events), had constructive conflict rated by external experts, and had team effectiveness rated by field experts. Results show that leaders of effective teams use structuring behaviors more often (except asking procedural questions), but decreasingly so over time. They support constructive conflict by clarifying and by making summaries that conclude in a command or decision, with decreasing frequency over time.
Job coding (PCS 2003): feedback from a study conducted in an Occupational Health Service
Henrotin, Jean-Bernard; Vaissière, Monique; Etaix, Maryline; Malard, Stéphane; Dziurla, Mathieu; Lafon, Dominique
2016-10-19
Aim: To examine the quality of manual job coding carried out by occupational health teams with access to a software application that provides assistance in job and business sector coding (CAPS). Methods: Data from a study conducted in an Occupational Health Service were used to examine the first-level coding of 1,495 jobs by occupational health teams according to the French job classification entitled “PCS – Professions and socio-professional categories” (INSEE, 2003 version). A second level of coding was also performed by an experienced coder and the first- and second-level codes were compared. Agreement between the two coding systems was studied using the kappa coefficient (κ) and frequencies were compared by chi-square tests. Results: Missing data or incorrect codes were observed for 14.5% of social groups (1 digit) and 25.7% of job codes (4 digits). While agreement between the first two levels of PCS 2003 appeared to be satisfactory (κ=0.73 and κ=0.75), imbalances in reassignment flows were nevertheless noted. The divergent job code rate was 48.2%. Variation in the frequency of socio-occupational variables was as high as 8.6% after correcting for missing data and divergent codes. Conclusions: Compared with other studies, the use of the CAPS tool appeared to provide effective coding assistance. However, our results indicate that job coding based on PCS 2003 should be conducted using ancillary data by personnel trained in the use of this tool.
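The agreement statistic used in the study above, Cohen's kappa, corrects raw per cent agreement for the agreement expected by chance given each coder's marginal code frequencies. A minimal implementation (illustrative only; not the study's actual analysis code, and the sample codes below are invented) is:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two raters coding the same items."""
    if len(coder1) != len(coder2):
        raise ValueError("raters must code the same items")
    n = len(coder1)
    # Observed agreement: fraction of items coded identically.
    p_obs = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement: from each rater's marginal code frequencies.
    c1, c2 = Counter(coder1), Counter(coder2)
    p_exp = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Two coders assigning 1-digit social-group codes to the same eight jobs.
first  = ["3", "3", "4", "5", "5", "6", "6", "6"]
second = ["3", "3", "4", "5", "6", "6", "6", "5"]
print(round(cohens_kappa(first, second), 2))  # → 0.65
```

A kappa of 0.73 to 0.75, as reported in the study, is conventionally read as substantial agreement, while still leaving room for the reassignment-flow imbalances the authors observed.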
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, F.T.; Young, M.L.; Miller, L.A.
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.
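The validation step described above, sampling from input uncertainty distributions and propagating the samples through a Gaussian plume model, can be sketched as follows. The lognormal distributions and all parameter values are illustrative placeholders, not the elicited NRC/CEC distributions, and the plume expression is the standard ground-level centerline form rather than the MACCS or COSYMA implementation.

```python
import numpy as np

def gaussian_plume(Q, u, sigma_y, sigma_z, H):
    """Ground-level centerline concentration, standard Gaussian plume form.

    Q: source strength, u: wind speed, sigma_y/sigma_z: lateral and
    vertical dispersion parameters, H: effective release height.
    """
    return Q / (np.pi * u * sigma_y * sigma_z) * np.exp(-H**2 / (2 * sigma_z**2))

rng = np.random.default_rng(42)
n = 10_000
# Sample the uncertain dispersion parameters (illustrative lognormals).
sigma_y = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=n)   # metres
sigma_z = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n)   # metres

# Propagate the samples through the plume model and summarize the
# resulting output uncertainty distribution.
conc = gaussian_plume(Q=1.0, u=5.0, sigma_y=sigma_y, sigma_z=sigma_z, H=30.0)
p5, p50, p95 = np.percentile(conc, [5, 50, 95])
```

Comparing percentiles of the propagated output distribution against the aggregated elicited distributions is exactly the kind of closure check the project used to validate the wet deposition inputs.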
Lingard, Lorelei; Reznick, Richard; Espin, Sherry; Regehr, Glenn; DeVito, Isabella
2002-03-01
Although the communication that occurs within health care teams is important to both team function and the socialization of novices, the nature of team communication and its educational influence are not well documented. This study explored the nature of communications among operating room (OR) team members from surgery, nursing, and anesthesia to identify common communicative patterns, sites of tension, and their impact on novices. Paired researchers observed 128 hours of OR interactions during 35 procedures from four surgical divisions at one teaching hospital. Brief, unstructured interviews were conducted following each observation. Field notes were independently read by each researcher and coded for emergent themes in the grounded theory tradition. Coding consensus was achieved via regular discussion. Findings were returned to insider "experts" for their assessment of authenticity and adequacy. Patterns of communication were complex and socially motivated. Dominant themes were time, safety and sterility, resources, roles, and situation. Communicative tension arose regularly in relation to these themes. Each procedure had one to four "higher-tension" events, which often had a ripple effect, spreading tension to other participants and contexts. Surgical trainees responded to tension by withdrawing from the communication or mimicking the senior staff surgeon. Both responses had negative implications for their own team relations. Team communications in the OR follow observable patterns and are influenced by recurrent themes that suggest sites of team tension. Tension in team communication affects novices, who respond with behaviors that may intensify rather than resolve interprofessional conflict.
Evaluation of Agency Non-Code Layered Pressure Vessels (LPVs) . Volume 2; Appendices
NASA Technical Reports Server (NTRS)
Prosser, William H.
2014-01-01
In coordination with the Office of Safety and Mission Assurance and the respective Center Pressure System Managers (PSMs), the NASA Engineering and Safety Center (NESC) was requested to formulate a consensus draft proposal for the development of additional testing and analysis methods to establish the technical validity, and any limitation thereof, for the continued safe operation of facility non-code layered pressure vessels. The PSMs from each NASA Center were asked to participate as part of the assessment team by providing, collecting, and reviewing data regarding current operations of these vessels. This document contains the appendices to the main report.
Adjoint-Based Methodology for Time-Dependent Optimal Control (AMTOC)
NASA Technical Reports Server (NTRS)
Yamaleev, Nail; Diskin, Boris; Nishikawa, Hiroaki
2012-01-01
During the five years of this project, the AMTOC team developed an adjoint-based methodology for design and optimization of complex time-dependent flows, implemented AMTOC in a testbed environment, directly assisted in the implementation of this methodology in NASA's state-of-the-art unstructured CFD code FUN3D, and successfully demonstrated applications of this methodology to large-scale optimization of several supersonic and other aerodynamic systems, such as fighter jet, subsonic aircraft, rotorcraft, high-lift, wind-turbine, and flapping-wing configurations. In the course of this project, the AMTOC team published 13 refereed journal articles, 21 refereed conference papers, and 2 NIA reports. The AMTOC team presented the results of this research at 36 international and national conferences, meetings and seminars, including the International Conference on CFD and numerous AIAA conferences and meetings. Selected publications that include the major results of the AMTOC project are enclosed in this report.
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. 
For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion, while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the Community Land Model (CLM), part of the open-source Community Earth System Model (CESM) for climate. In this presentation, the advantages and disadvantages of open-source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
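The extensible object-oriented design described in the abstract (top-level data structures refactored as classes so new process models can be added without touching the driver) can be illustrated with a minimal sketch. It is written here in Python rather than Fortran, and all class and field names are hypothetical, not PFLOTRAN's actual API:

```python
# Hedged sketch of an extensible process-model hierarchy: a base type that
# new process models extend, so the driver loop needs no changes when a
# model is added. Names and toy residuals are illustrative only.
from abc import ABC, abstractmethod

class ProcessModel(ABC):
    """Base class: each coupled process evaluates its own residual."""
    @abstractmethod
    def residual(self, state: dict) -> float:
        ...

class FlowModel(ProcessModel):
    def residual(self, state):
        # toy mass-balance residual
        return state["pressure"] - 1.0

class TransportModel(ProcessModel):
    def residual(self, state):
        # toy solute-balance residual
        return state["concentration"] * 0.5

def total_residual(models, state):
    # the driver iterates over whatever models were registered, so
    # coupling in a new process model requires no driver changes
    return sum(abs(m.residual(state)) for m in models)

models = [FlowModel(), TransportModel()]
state = {"pressure": 1.5, "concentration": 2.0}
print(total_residual(models, state))  # → 1.5
```

The same pattern maps onto Fortran 2003 type extension (`type, extends(...)`) with type-bound procedures, which is what the abstract refers to as extensible derived types.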
Identification of Barriers to Pediatric Care in Limited-Resource Settings: A Simulation Study.
Shilkofski, Nicole; Hunt, Elizabeth A
2015-12-01
Eighty percent of the 10 million annual deaths in children aged <5 years in developing countries are estimated to be avoidable, with improvements in education for pediatric emergency management being a key factor. Education must take into account cultural considerations to be effective. Study objectives were: (1) to use simulation to identify factors posing barriers to patient care in limited resource settings (LRS); and (2) to understand how simulations in LRS can affect communication and decision-making processes. A qualitative study was conducted at 17 different sites in 12 developing countries in Asia, Latin America, and Africa. Data from observations of 68 in situ simulated pediatric emergencies were coded for thematic analysis. Sixty-two different "key informants" were interviewed regarding perceived benefit of simulations. Coding of observations and interviews yielded common themes: impact of culture on team hierarchy, impact of communication and language barriers on situational awareness, systematic emergency procedures, role delineation, shared cognition and resource awareness through simulation, logistic barriers to patient care, and use of recognition-primed decision-making by experienced clinicians. Changes in clinical environments were implemented as a result of simulations. Ad hoc teams in LRS face challenges in caring safely for patients; these include language and cultural barriers, as well as environmental and resource constraints. Engaging teams in simulations may promote improved communication, identification of systems issues and latent threats to target for remediation. There may be a role for training novices in use of recognition-primed or algorithmic decision-making strategies to improve rapidity and efficiency of decisions in LRS. Copyright © 2015 by the American Academy of Pediatrics.
NASA Astrophysics Data System (ADS)
Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf
2016-04-01
A complex software project with high code-quality standards requires automated tooling to relieve developers of repetitive and tedious tasks such as compiling on different platforms and configurations, running unit and end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited; testing developed code on more than the developer's PC is a task that is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked by the CI system (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, differences in simulation results between changes, etc.), which automatically responds to the pull request or by email on success or failure with detailed reports, requesting improvements to the modifications where needed. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps entry barriers for getting involved in the project low and permits an agile development process that concentrates on feature additions rather than software maintenance procedures.
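The gate described above (a pull request is merged only when every check passes) can be sketched in a few lines. The check names and their trivial stand-in implementations are illustrative; they are not the actual OpenGeoSys Jenkins/Travis configuration:

```python
# Minimal sketch of a CI gate: run a sequence of named checks against a
# proposed change and allow merging only if all of them pass.
from typing import Callable, List, Tuple

def run_checks(checks: List[Tuple[str, Callable[[], bool]]]):
    report = []
    for name, check in checks:
        ok = check()
        report.append((name, "passed" if ok else "FAILED"))
    # merge is allowed only when every check reported success
    return all(status == "passed" for _, status in report), report

checks = [
    ("compile",     lambda: True),
    ("unit tests",  lambda: True),
    ("end-to-end",  lambda: True),
    ("result diff", lambda: True),
]
ok, report = run_checks(checks)
print("merge allowed" if ok else "request changes")  # → merge allowed
```

In the real workflow each lambda would be replaced by an invocation of the build system, test runner, or result-comparison tool, with the report posted back to the pull request.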
How to secure your servers, code and data
Lopienski, Sebastian
2018-04-30
Oral presentation in English, slides in English. Advice and best practices regarding the security of your servers, code and data will be presented. We will also describe how the Computer Security Team can help you reduce the risks.
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process, however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
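The bounding idea (scoping the impact analysis to only the parts of the code base the development context marks as relevant) can be sketched as a reachability computation over a dependence graph. The graph, entity names, and "of interest" set here are hypothetical; this illustrates the scoping concept, not the actual iDiSE/DeCAF implementation:

```python
# Hedged sketch of context-bounded change impact analysis: compute the
# set of entities reachable from the changed set along dependence edges,
# but never leave the scoped region of interest.
from collections import deque

def impacted(dep_graph, changed, of_interest):
    """dep_graph: node -> set of nodes that depend on it."""
    seen, queue = set(), deque(changed)
    while queue:
        node = queue.popleft()
        if node in seen or node not in of_interest:
            continue  # bounding: skip anything outside the scoped context
        seen.add(node)
        queue.extend(dep_graph.get(node, ()))
    return seen

dep_graph = {"parse": {"eval"}, "eval": {"repl"}, "log": {"repl"}}
print(sorted(impacted(dep_graph, {"parse"}, {"parse", "eval"})))
# → ['eval', 'parse'] -- "repl" is outside the bound and never analyzed
```

Bounding the traversal this way is what makes it feasible to run the heavier dependence and symbolic-execution analyses only where a developer's change can actually matter.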
Efficient, Multi-Scale Designs Take Flight
NASA Technical Reports Server (NTRS)
2003-01-01
Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.
Novel inter and intra prediction tools under consideration for the emerging AV1 video codec
NASA Astrophysics Data System (ADS)
Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil
2017-09-01
Google started the WebM Project in 2010 to develop open-source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, AV1, within a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.
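The combined inter-intra prediction mentioned above can be illustrated with a toy example: an intra predictor (built from neighboring reconstructed pixels) is blended with an inter predictor (motion-compensated from a reference frame) using per-pixel weights. The weights and pixel values below are illustrative, not AV1's actual mode definitions:

```python
# Toy sketch of compound inter-intra prediction: per-pixel weighted
# combination w*intra + (1-w)*inter, with the intra weight typically
# decaying away from the block edge that supplied the intra samples.
def blend(inter_pred, intra_pred, weights):
    """weights w in [0, 1]: 1 means pure intra, 0 means pure inter."""
    return [round(w * a + (1 - w) * b)
            for a, b, w in zip(intra_pred, inter_pred, weights)]

intra = [100, 100, 100, 100]   # smooth prediction from neighbors
inter = [80, 90, 110, 120]     # motion-compensated prediction
w =     [0.75, 0.5, 0.25, 0.0] # decaying weight away from the block edge
print(blend(inter, intra, w))  # → [95, 95, 108, 120]
```

The encoder would then transform and entropy-code only the residual between the source block and this blended prediction.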
Lessons Learned for Collaborative Clinical Content Development
Collins, S.A.; Bavuso, K.; Zuccotti, G.; Rocha, R.A.
2013-01-01
Background Site-specific content configuration of vendor-based Electronic Health Records (EHRs) is a vital step in the development of standardized and interoperable content that can be used for clinical decision support, reporting, care coordination, and information exchange. The multi-site, multi-stakeholder Acute Care Documentation (ACD) project at Partners Healthcare Systems (PHS) aimed to develop highly structured clinical content with adequate breadth and depth to meet the needs of all types of acute care clinicians at two academic medical centers. The Knowledge Management (KM) team at PHS led the informatics and knowledge management effort for the project. Objectives We aimed to evaluate the role, governance, and project management processes and resources for the KM team's effort as part of the standardized clinical content creation. Methods We employed the Centers for Disease Control and Prevention's six-step Program Evaluation Framework to guide our evaluation steps. We administered a forty-four-question, open-ended, semi-structured voluntary survey to gather focused, credible evidence from members of the KM team. Qualitative open coding was performed to identify themes for lessons learned and concluding recommendations. Results Six surveys were completed. Qualitative data analysis informed five lessons learned and thirty specific recommendations associated with the lessons learned. The five lessons learned are: 1) Assess and meet knowledge needs and set expectations at the start of the project; 2) Define an accountable decision-making process; 3) Increase team meeting moderation skills; 4) Ensure adequate resources and competency training with online asynchronous collaboration tools; 5) Develop focused, goal-oriented teams and supportive, consultative service-based teams.
Conclusions Knowledge management requirements for the development of standardized clinical content within a vendor-based EHR among multi-stakeholder teams and sites include: 1) assessing and meeting informatics knowledge needs, 2) setting expectations and standardizing the process for decision-making, and 3) ensuring the availability of adequate resources and competency training. PMID:23874366
Bar-Eli, Michael; Tenenbaum, Gershon; Geister, Sabine
2006-10-01
This study documents the effect of players' dismissals on team performance in professional soccer. Our aim was to determine whether the punishment meted out for unacceptable player behaviour results in reduced team performance. The official web site of the German Soccer Association was used for coding data from games played in the first Bundesliga between the 1963-64 and 2003-04 seasons (n = 41 seasons). A sample of 743 games in which at least one red card was issued was used to test hypotheses derived from crisis theory (Bar-Eli & Tenenbaum, 1989a). Players' dismissals weaken a sanctioned team in terms of the goals and final score following the punishment. The chances of a sanctioned team scoring or winning were substantially reduced following the sanction. Most cards were issued in the later stages of matches. The statistics pertaining to outcome results as a function of game standing, game location, and time phases all strongly support the view that teams can be considered conceptually similar to individuals regarding the link between stress and performance. To further develop the concept of team and individual psychological performance crisis in competition, it is recommended that reversal theory (Apter, 1982) and self-monitoring and distraction theories (Baumeister, 1984) be included in the design of future investigations pertaining to choking under pressure.
Neighboring block based disparity vector derivation for multiview compatible 3D-AVC
NASA Astrophysics Data System (ADS)
Kang, Jewon; Chen, Ying; Zhang, Li; Zhao, Xin; Karczewicz, Marta
2013-09-01
3D-AVC, being developed under the Joint Collaborative Team on 3D Video Coding (JCT-3V), significantly outperforms Multiview Video Coding plus Depth (MVC+D), which simultaneously encodes texture views and depth views with the multiview extension of H.264/AVC (MVC). However, when 3D-AVC is configured to support multiview compatibility, in which texture views are decoded without depth information, the coding performance degrades significantly. The reason is that the advanced coding tools incorporated into 3D-AVC do not perform well without a disparity vector converted from the depth information. In this paper, we propose a disparity vector derivation method utilizing only the information of texture views. Motion information of neighboring blocks is used to determine a disparity vector for a macroblock, so that the derived disparity vector can be used efficiently by the coding tools in 3D-AVC. The proposed method significantly improves the coding gain of 3D-AVC in the multiview compatible mode, with about 20% BD-rate saving in the coded views and 26% BD-rate saving in the synthesized views on average.
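The core idea (derive a macroblock's disparity vector from the motion information of its spatial neighbors instead of from depth) can be sketched as a priority-ordered scan. The block layout and field names below are illustrative, not the 3D-AVC specification:

```python
# Hedged sketch of neighbor-based disparity vector derivation: scan
# spatially neighboring blocks in a fixed priority order and reuse the
# first available inter-view motion vector as the disparity vector.
def derive_disparity(neighbors):
    """neighbors: blocks in scan order (e.g. left, above, above-right),
    each with an 'inter_view' flag and a motion vector 'mv'."""
    for blk in neighbors:
        if blk.get("inter_view"):
            return blk["mv"]   # neighbor's inter-view MV is a disparity
    return (0, 0)              # fallback: zero disparity

neighbors = [
    {"inter_view": False, "mv": (1, 0)},   # left: temporal MV, skip
    {"inter_view": True,  "mv": (-7, 0)},  # above: inter-view MV, use it
]
print(derive_disparity(neighbors))  # → (-7, 0)
```

Because the scan uses only texture-view motion data, a decoder can run it even when depth views are not decoded, which is exactly the multiview compatible case.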
Temporal motifs reveal collaboration patterns in online task-oriented networks
NASA Astrophysics Data System (ADS)
Xuan, Qi; Fang, Huiting; Fu, Chenbo; Filkov, Vladimir
2015-05-01
Real networks feature layers of interactions and complexity. In them, different types of nodes can interact with each other via a variety of events. Examples of this complexity are task-oriented social networks (TOSNs), where teams of people share tasks towards creating a quality artifact, such as academic research papers or software development in commercial or open source environments. Accomplishing those tasks involves both work, e.g., writing the papers or code, and communication, to discuss and coordinate. Taking into account the different types of activities and how they alternate over time can result in much more precise understanding of the TOSNs behaviors and outcomes. That calls for modeling techniques that can accommodate both node and link heterogeneity as well as temporal change. In this paper, we report on methodology for finding temporal motifs in TOSNs, limited to a system of two people and an artifact. We apply the methods to publicly available data of TOSNs from 31 Open Source Software projects. We find that these temporal motifs are enriched in the observed data. When applied to software development outcome, temporal motifs reveal a distinct dependency between collaboration and communication in the code writing process. Moreover, we show that models based on temporal motifs can be used to more precisely relate both individual developer centrality and team cohesion to programmer productivity than models based on aggregated TOSNs.
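In the two-person-plus-artifact system described above, a temporal motif is a short time-ordered pattern of event types. A minimal way to count such patterns is to slide a fixed-length window over the event sequence; the event encoding below (W for work on the artifact, C for communication between the two people) is hypothetical:

```python
# Illustrative sketch of temporal motif counting: count every length-k
# run of consecutive event types in a time-ordered event sequence.
from collections import Counter

def count_motifs(events, k=3):
    """events: time-ordered event-type labels; returns counts of each
    length-k consecutive pattern (a simple stand-in for temporal motifs)."""
    return Counter(tuple(events[i:i + k]) for i in range(len(events) - k + 1))

# W = work on the artifact, C = communication between the two people
events = ["W", "C", "W", "C", "W", "W"]
motifs = count_motifs(events)
print(motifs[("W", "C", "W")])  # → 2
```

Enrichment of a motif is then assessed by comparing its observed count against counts in randomized (e.g. time-shuffled) versions of the same event data.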
STAR-CCM+ Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David
2016-09-30
The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general-purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and of models implemented within the code by the Consortium for Advanced Simulation of Light Water Reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multiphase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multiphase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.
Soones, Tacara N; O'Brien, Bridget C; Julian, Katherine A
2015-09-01
In order to teach residents how to work in interprofessional teams, educators in graduate medical education are implementing team-based care models in resident continuity clinics. However, little is known about the impact of interprofessional teams on residents' education in the ambulatory setting. To identify factors affecting residents' experience of team-based care within continuity clinics and the impact of these teams on residents' education. This was a qualitative study of focus groups with internal medicine residents. Seventy-seven internal medicine residents at the University of California San Francisco at three continuity clinic sites participated in the study. Qualitative interviews were audiotaped and transcribed. The authors used a general inductive approach with sensitizing concepts in four frames (structural, human resources, political and symbolic) to develop codes and identify themes. Residents believed that team-based care improves continuity and quality of care. Factors in four frames affected their ability to achieve these goals. Structural factors included communication through the electronic medical record, consistent schedules and regular team meetings. Human resources factors included the presence of stable teams and clear roles. Political and symbolic factors negatively impacted team-based care, and included low staffing ratios and a culture of ultimate resident responsibility, respectively. Regardless of the presence of these factors or resident perceptions of their teams, residents did not see the practice of interprofessional team-based care as intrinsically educational. Residents' experiences practicing team-based care are influenced by many principles described in the interprofessional teamwork literature, including understanding team members' roles, good communication and sufficient staffing. However, these attributes are not correlated with residents' perceptions of the educational value of team-based care. 
Including residents in interprofessional teams in their clinic may not be sufficient to teach residents how team-based care can enhance their overall learning and future practice.
Foam on Tile Impact Modeling for the Space Shuttle Program
NASA Technical Reports Server (NTRS)
Stellingwerf, R. F.; Robinson, J. H.; Richardson, S.; Evans, S. W.; Stallworth, R.; Hovater, M.
2003-01-01
Following the breakup of the Space Shuttle Columbia during reentry, a NASA-wide investigation team was formed to examine the probable damage inflicted on Orbiter Thermal Protection System (TPS) elements by impact of External Tank insulating foam projectiles. Our team was to apply rigorous, physics-based analysis techniques to help determine parameters of interest for an experimental test program, utilize validated codes to investigate the full range of impact scenarios, and use analysis-derived models to predict aero-thermal-structural responses to entry conditions. We were to operate on a non-interference basis with the investigation team, and were to supply significant findings to that team and to the Orbiter Vehicle Engineering Working Group, being responsive to any solicitations for support from these entities. The authors formed a working sub-group within the larger team to apply the Smooth Particle Hydrodynamics code SPHC to the damage estimation problem. Numerical models of the LI-900 TPS tiles and of the BX-250 foam were constructed and used as inputs to the code. Material properties needed to properly model the tiles and foam were obtained from other working sub-groups who performed tests on these items for this purpose. Two- and three-dimensional models of the tiles were constructed, including the glass outer layer, the densified lower layer of LI-900 insulation, the Nomex felt Strain Isolation Pad (SIP) mounting layer, and the underlying aluminum 2024 vehicle skin. A model for the BX-250 foam including porous compression, elastic rebound, and surface erosion was developed. Code results for the tile damage and foam behavior were extensively validated through comparison with the Southwest Research Institute (SwRI) foam-on-tile impact experiments carried out in 1999. These tests involved small projectiles striking individual tiles and small tile arrays.
Following code and model validation, we simulated impacts of larger ET foam projectiles on the TPS tile systems used on the wings of the Orbiter. Tiles used on the Wing Acreage, the Main Landing Gear Door, and the Carrier Panels near the front edge of the wing were modeled. Foam impacts shot for the CAIB investigation were modeled, as well as impacts at larger angles, including rapid rotation of the projectile, and with varying foam properties. General results suggest that foam impacts on tiles at about 500 mph could cause appreciable damage if the impact angle is greater than about 20 degrees. Some variations of the foam properties, such as increased brittleness or increased density, could increase damage in some cases. Rapid (17 rps) rotation failed to increase the damage for the two cases considered. This does not rule out other cases in which the rotational energy might lead to an increase in tile damage, but it suggests that in most cases rotation will not be an important factor. Similar models will be applied for other impacting materials, other velocities, and other geometries as part of the Return to Flight process.
Berkowitz, Seth A; Eisenstat, Stephanie A; Barnard, Lily S; Wexler, Deborah J
2018-06-01
To explore the patient perspective on coordinated multidisciplinary diabetes team care among a socioeconomically diverse group of adults with type 2 diabetes. Qualitative research design using 8 focus groups (n=53). We randomly sampled primary care patients with type 2 diabetes and conducted focus groups at their primary care clinic. Discussion prompts queried current perceptions of team care. Each focus group was audio recorded, transcribed verbatim, and independently coded by three reviewers. Coding used an iterative process. Thematic saturation was achieved. Data were analyzed using content analysis. Most participants believed that coordinated multidisciplinary diabetes team care was a good approach, feeling that diabetes was too complicated for any one care team member to manage. Primary care physicians were seen as too busy to manage diabetes alone, and participants were content to be treated by other care team members, especially if there was a single point of contact and the care was coordinated. Participants suggested that an ideal multidisciplinary approach would additionally include support for exercise and managing socioeconomic challenges, components perceived to be missing from the existing approach to diabetes care. Coordinated, multidisciplinary diabetes team care is understood by and acceptable to patients with type 2 diabetes. Copyright © 2018 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
Multiple Access Schemes for Lunar Missions
NASA Technical Reports Server (NTRS)
Deutsch, Leslie; Hamkins, Jon; Stocklin, Frank J.
2010-01-01
Two years ago, the NASA Coding, Modulation, and Link Protocol (CMLP) study was completed. The study, led by the authors of this paper, recommended codes, modulation schemes, and desired attributes of link protocols for all space communication links in NASA's future space architecture. Portions of the NASA CMLP team were reassembled to resolve one open issue: the use of multiple access (MA) communication from the lunar surface. The CMLP-MA team analyzed and simulated two candidate multiple access schemes that were identified in the original CMLP study: Code Division MA (CDMA) and Frequency Division MA (FDMA) based on a bandwidth-efficient Continuous Phase Modulation (CPM) with a superimposed Pseudo-Noise (PN) ranging signal (CPM/PN). This paper summarizes the results of the analysis and simulation of the CMLP-MA study and describes the final recommendations.
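The CDMA scheme evaluated in the study rests on a simple principle: each user spreads its data with a code, the signals add on the shared channel, and a receiver recovers its own user's bits by correlating against that user's code. The toy sketch below uses two orthogonal codes and illustrative bit values; it is a conceptual demonstration, not the CMLP-MA signal design:

```python
# Toy sketch of Code Division Multiple Access: spread, sum on the
# channel, then despread by correlation with each user's own code.
def spread(bits, code):
    # replace each data bit with bit * code (one chip per code element)
    return [b * c for b in bits for c in code]

def despread(signal, code, n_bits):
    L, out = len(code), []
    for i in range(n_bits):
        chunk = signal[i * L:(i + 1) * L]
        corr = sum(s * c for s, c in zip(chunk, code))
        out.append(1 if corr > 0 else -1)  # sign of the correlation
    return out

code_a, code_b = [1, 1, -1, -1], [1, -1, 1, -1]  # orthogonal codes
tx_a, tx_b = [1, -1], [-1, -1]                   # each user's data bits
channel = [a + b for a, b in zip(spread(tx_a, code_a), spread(tx_b, code_b))]
print(despread(channel, code_a, 2))  # → [1, -1]
print(despread(channel, code_b, 2))  # → [-1, -1]
```

Because the codes are orthogonal, each correlation cancels the other user's contribution exactly; FDMA achieves the same separation by assigning users disjoint frequency bands instead of codes.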
Benchmark Problems of the Geothermal Technologies Office Code Comparison Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas some others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995.
The problems involved two phases of research: stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Mark
This report summarizes activity conducted by the Institute for Market Transformation and a team of American and Chinese partners in the development of a new building energy-efficiency code for the transitional climate zone in the People's Republic of China.
Functions within the Naval Air Training Command
1984-02-15
Resource Management (HRM), Leadership and Management Education and Training (LMET), Officer and Enlisted Career Counseling and Retention programs...as division officer for assigned HRM and career counselor personnel. Resource Management Specialist N13-5 Advises and assists Code...13 in the development, evaluation and standardization of NATRACOM HRM programs. N13-6 Responsible for monitoring NAVAVSCOLSCOM HRM Support Team
Ahmed, Fathima
2018-03-07
The ever-evolving nature of nursing requires professionals to keep their knowledge up to date and uphold the Nursing and Midwifery Council (NMC) Code by engaging in ongoing personal and professional development (PPD). This article aims to highlight the importance of good leadership and management in healthcare and to explore the literature surrounding leadership and management, such as the current NHS healthcare leadership model (NHS Leadership Academy 2013), the Leading Change, Adding Value framework underpinned by the 10 commitments and 6Cs (NHS England 2016) and the NMC Code (NMC 2015a) in relation to PPD. It examines how nurses can be supported in their PPD by their team leaders and/or managers, using examples experienced in a clinical setting while caring for children and young people (CYP). Furthermore, the importance of team working and group processes in the context of leadership will be deliberated, using examples of formative group work to illustrate principles described in the literature. Finally, reflections will be discussed on how learning from this experience can influence future practice when caring for CYP. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
Practical strategies for developing the business case for hospital glycemic control teams.
Magee, Michelle F; Beck, Adam
2008-09-01
Many business models may be used to make the business case for support of a multidisciplinary team to implement targeted glucose control in the hospital. Models may be hospital-supported or self-supporting. In the former, the hospital provides financial support based on improved documentation opportunities, reduction in length of stay, and improved resource utilization. In the latter, clinical revenues for diabetes management offset the costs of salary, fringe benefits, and overheads. A combination of these strategies may also be used. The business plan presented to administration must justify return on investment. It is imperative to involve hospital administration, particularly representatives from coding, billing, and finance, in the development of the business plan. The business case for hospital support will be based on opportunities related to improving accuracy of documentation and coding for diabetes-related diagnoses, including level of control and complications present, on reduction in length of stay, and on optimization of resource utilization through reduction in morbidity and mortality (cost aversion). The case for revenue generation through billing for clinical services will be based on opportunities to increase the provision of glycemic management services in the hospital. Examples from the literature and of analyses to support each of these models are presented. (c) 2008 Society of Hospital Medicine.
Emotional Intelligence in Library Disaster Response Assistance Teams: Which Competencies Emerged?
ERIC Educational Resources Information Center
Wilkinson, Frances C.
2015-01-01
This qualitative study examines the relationship between emotional intelligence competencies and the personal attributes of library disaster response assistance team (DRAT) members. Using appreciative inquiry protocol to conduct interviews at two academic libraries, the study presents findings from emergent thematic coding of interview…
Team Leader Structuring for Team Effectiveness and Team Learning in Command-and-Control Teams
van der Haar, Selma; Koeslag-Kreunen, Mieke; Euwe, Eline; Segers, Mien
2017-01-01
Due to their crucial and highly consequential task, it is of utmost importance to understand the levers leading to effectiveness of multidisciplinary emergency management command-and-control (EMCC) teams. We argue that the formal EMCC team leader needs to initiate structure in the team meetings to support organizing the work as well as facilitate team learning, especially the team learning process of constructive conflict. In a sample of 17 EMCC teams performing a realistic EMCC exercise, including one or two team meetings (28 in sum), we coded the team leader’s verbal structuring behaviors (1,704 events), rated constructive conflict by external experts, and rated team effectiveness by field experts. Results show that leaders of effective teams use structuring behaviors more often (except asking procedural questions) but decreasingly over time. They support constructive conflict by clarifying and by making summaries that conclude in a command or decision in a decreasing frequency over time. PMID:28490856
SciDAC Center for Gyrokinetic Particle Simulation of Turbulent Transport in Burning Plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Zhihong
2013-12-18
During the first year of the SciDAC gyrokinetic particle simulation (GPS) project, the GPS team (Zhihong Lin, Liu Chen, Yasutaro Nishimura, and Igor Holod) at the University of California, Irvine (UCI) studied tokamak electron transport driven by electron temperature gradient (ETG) turbulence, and by trapped electron mode (TEM) turbulence and ion temperature gradient (ITG) turbulence with kinetic electron effects, and extended studies of ITG turbulence spreading to core-edge coupling. We have developed and optimized an elliptic solver using the finite element method (FEM), which enables the implementation of advanced kinetic electron models (split-weight scheme and hybrid model) in the SciDAC GPS production code GTC. The GTC code has been ported and optimized on both scalar and vector parallel computer architectures, and is being transformed into object-oriented style to facilitate collaborative code development. During this period, the UCI team members presented 11 invited talks at major national and international conferences, and published 22 papers in peer-reviewed journals and 10 papers in conference proceedings. UCI hosted the annual SciDAC Workshop on Plasma Turbulence sponsored by the GPS Center, 2005-2007. The workshop was attended by about fifty US and foreign researchers, and financially sponsored several graduate students from MIT, Princeton University, Germany, Switzerland, and Finland. A new SciDAC postdoc, Igor Holod, has arrived at UCI to initiate global particle simulation of magnetohydrodynamic turbulence driven by energetic particle modes. The PI, Z. Lin, has been promoted to Associate Professor with tenure at UCI.
What determines successful implementation of inpatient information technology systems?
Spetz, Joanne; Burgess, James F; Phibbs, Ciaran S
2012-03-01
To identify the factors and strategies that were associated with successful implementation of hospital-based information technology (IT) systems in US Department of Veterans Affairs (VA) hospitals, and how these might apply to other hospitals. Qualitative analysis of 118 interviews conducted at 7 VA hospitals. The study focused on the inpatient setting, where nurses are the main patient-care providers; thus, the research emphasized the impact of the Computerized Patient Record System and Bar Code Medication Administration on nurses. Hospitals were selected to represent a range of IT implementation dates, facility sizes, and geography. The subjects included nurses, pharmacists, physicians, IT staff, and managers. Interviews were guided by a semi-structured interview protocol, and a thematic analysis was conducted, with initial codes drawn from the content of the interview guides. Additional themes were proposed as the coding was conducted. Five broad themes arose as factors that affected the process and success of implementation: (1) organizational stability and implementation team leadership, (2) implementation timelines, (3) equipment availability and reliability, (4) staff training, and (5) changes in work flow. Overall, IT implementation success in the VA depended on: (1) whether there was support for change from both leaders and staff, (2) development of a gradual and flexible implementation approach, (3) allocation of adequate resources for equipment and infrastructure, hands-on support, and deployment of additional staff, and (4) how the implementation team planned for setbacks and continued the process to achieve success. Problems that developed in the early stages of implementation tended to become persistent, and poor implementation can lead to patient harm.
Development of a web service for analysis in a distributed network.
Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila
2014-01-01
We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We identified solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
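The aggregation idea behind GLORE can be illustrated with a short sketch: each site computes the gradient and Hessian of its local logistic log-likelihood and shares only those aggregates with a coordinating server, which performs Newton-Raphson updates. This is a minimal sketch of the general technique; the function names and data layout are illustrative assumptions, not the GLORE project's actual API.

```python
import numpy as np

def local_stats(X, y, beta):
    """Per-site contribution: gradient and Hessian of the logistic
    log-likelihood at the current coefficients (no raw data is shared)."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)            # local gradient
    W = p * (1.0 - p)
    hess = (X * W[:, None]).T @ X   # local Hessian
    return grad, hess

def federated_fit(sites, n_features, iters=25):
    """Server side: sum the partial estimates from all sites each round
    and take a Newton-Raphson step, mirroring the idea behind GLORE."""
    beta = np.zeros(n_features)
    for _ in range(iters):
        grads, hessians = zip(*(local_stats(X, y, beta) for X, y in sites))
        beta = beta + np.linalg.solve(sum(hessians), sum(grads))
    return beta
```

Because the pooled gradient and Hessian are exact sums of the per-site terms, the federated fit matches a fit on the pooled data to numerical precision, which is what makes the approach attractive for privacy-preserving analysis.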
A description of the new 3D electron gun and collector modeling tool: MICHELLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petillo, J.; Mondelli, A.; Krueger, W.
1999-07-01
A new 3D finite element gun and collector modeling code is under development at SAIC in collaboration with industrial partners and national laboratories. This development program has been designed specifically to address the shortcomings of current simulation and modeling tools. In particular, although 3D gun codes exist today, their ability to address fine-scale features is somewhat limited in 3D due to the disparate length scales of certain classes of devices. Additionally, features like advanced emission rules, including a thermionic Child's law and comprehensive secondary emission models, also need attention. The program specifically targets problem classes including gridded guns, sheet-beam guns, multi-beam devices, and anisotropic collectors. The presentation will provide an overview of the program objectives, the approach to be taken by the development team, and the status of the project.
Effect of focused debriefing on team communication skills.
Nwokorie, Ndidi; Svoboda, Deborah; Rovito, Debra K; Krugman, Scott D
2012-10-01
Community hospitals often lack tertiary care support such as pediatric intensivists and anesthesiologists. Resuscitation of critically ill and injured children in community hospitals requires a well-coordinated team effort, because good team performance improves quality of care. The lack of subspecialty support makes team coordination and communication more imperative yet much more challenging. This study sought to determine whether the addition of a defined, focused post-mock-code debriefing session improved communication skills among team members in a community pediatric emergency department. Twenty-two volunteer members of the pediatric emergency and respiratory therapy departments at Medstar Franklin Square Medical Center took part in monthly simulated resuscitations for 3 consecutive months. After each simulation, participants answered an 18-item survey on observed communication among their team members. Members then participated in a 30-minute debriefing session in which they reflected on their own communication skills. A video recording of the resuscitation was later scored by one of the investigators using a rubric designed by the investigators. Descriptive statistics were calculated for both the participant survey and the team communication indicator scores. Paired-sample Wilcoxon signed rank tests examined the differences in scores between each of the 3 sessions. The mean scores from investigator-scored video recordings of the teams' mock resuscitations showed that overall team communication improved between sessions 1 and 3 for all communication areas (P = .03), with significant improvement in 4 of 9 communication areas by the third session. All team members improved communication skills as well, with the greatest improvement by the clinical multifunctional technicians.
Communication skills improve with the addition of focused debriefing sessions after mock codes as perceived by participants during debriefing sessions and evidenced by investigator-scored video recordings of resuscitations.
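The paired-sample Wilcoxon signed rank test used to compare session scores is straightforward to run; the sketch below uses invented placeholder scores for eight teams, not the study's data.

```python
from scipy.stats import wilcoxon

# Hypothetical per-team communication scores (placeholders, not the study's data):
session1 = [4.0, 5.5, 3.0, 6.0, 4.5, 5.0, 3.5, 6.5]
session3 = [6.0, 7.0, 4.5, 7.5, 5.0, 6.5, 5.5, 8.0]

# Paired, two-sided test on the within-team score differences.
stat, p = wilcoxon(session1, session3)
print(f"W = {stat}, p = {p:.3f}")
```

With every team improving, the sum of negative-direction ranks is zero and the test rejects the null of no change at conventional significance levels.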
General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft
NASA Technical Reports Server (NTRS)
Dove, Edwin; Hughes, Steve
2007-01-01
The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed under an approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government-funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Carbogno, Christian; Levchenko, Sergey; Mohamed, Fawzi; Huhs, Georg; Lüders, Martin; Oliveira, Micael; Scheffler, Matthias
2017-11-01
With big-data driven materials research, the new paradigm of materials science, sharing and wide accessibility of data are becoming crucial aspects. Obviously, a prerequisite for data exchange and big-data analytics is standardization, which means using consistent and unique conventions for, e.g., units, zero baselines, and file formats. There are two main strategies to achieve this goal. One accepts the heterogeneous nature of the community, which comprises scientists from physics, chemistry, bio-physics, and materials science, by complying with the diverse ecosystem of computer codes, and thus develops "converters" for the input and output files of all important codes. These converters then translate the data of each code into a standardized, code-independent format. The other strategy is to provide standardized open libraries that code developers can adopt for shaping their inputs, outputs, and restart files directly into the same code-independent format. In this perspective paper, we present both strategies and argue that they can and should be regarded as complementary, if not even synergetic. The format and conventions presented here were agreed upon by two teams, the Electronic Structure Library (ESL) of the European Center for Atomic and Molecular Computations (CECAM) and the NOvel MAterials Discovery (NOMAD) Laboratory, a European Centre of Excellence (CoE). A key element of this work is the definition of hierarchical metadata describing state-of-the-art electronic-structure calculations.
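The "converter" strategy described above can be illustrated with a small, hypothetical example: a code-specific output record is mapped onto a code-independent schema with consistent units. The key names and record layout here are invented for illustration; real NOMAD/ESL parsers define a much richer hierarchical metadata structure.

```python
HARTREE_TO_EV = 27.211386245988  # CODATA energy conversion factor

def convert_total_energy(raw: dict) -> dict:
    """Map one code's native record onto a shared, code-independent
    schema with a unique convention (energies in eV)."""
    unit = raw["energy_unit"].lower()
    value = raw["total_energy"]
    if unit == "hartree":
        value *= HARTREE_TO_EV
    elif unit != "ev":
        raise ValueError(f"unknown unit: {unit}")
    return {
        "code_name": raw["code"],      # provenance is preserved
        "total_energy_eV": value,      # standardized unit
    }
```

Records produced by two different codes, one reporting in Hartree and one in eV, become directly comparable after conversion, which is the point of the code-independent format.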
NASA Astrophysics Data System (ADS)
Barba, M.; Rains, C.; von Dassow, W.; Parker, J. W.; Glasscoe, M. T.
2013-12-01
Knowing the location and behavior of active faults is essential for earthquake hazard assessment and disaster response. In Interferometric Synthetic Aperture Radar (InSAR) images, faults are revealed as linear discontinuities. Currently, interferograms are manually inspected to locate faults. During the summer of 2013, the NASA-JPL DEVELOP California Disasters team contributed to the development of a method to expedite fault detection in California using remote-sensing technology. The team utilized InSAR images created from polarimetric L-band data from NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) project. A computer-vision technique known as 'edge-detection' was used to automate the fault-identification process. We tested and refined an edge-detection algorithm under development through NASA's Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) project. To optimize the algorithm we used both UAVSAR interferograms and synthetic interferograms generated through Disloc, a web-based modeling program available through NASA's QuakeSim project. The edge-detection algorithm detected seismic, aseismic, and co-seismic slip along faults that were identified and compared with databases of known fault systems. Our optimization process was the first step toward integration of the edge-detection code into E-DECIDER to provide decision support for earthquake preparation and disaster management. E-DECIDER partners that will use the edge-detection code include the California Earthquake Clearinghouse and the US Department of Homeland Security through delivery of products using the Unified Incident Command and Decision Support (UICDS) service. Through these partnerships, researchers, earthquake disaster response teams, and policy-makers will be able to use this new methodology to examine the details of ground and fault motions for moderate to large earthquakes. 
Following an earthquake, the newly discovered faults can be paired with infrastructure overlays, allowing emergency response teams to identify sites that may have been exposed to damage. The faults will also be incorporated into a database for future integration into fault models and earthquake simulations, improving future earthquake hazard assessment. As new faults are mapped, they will further understanding of the complex fault systems and earthquake hazards within the seismically dynamic state of California.
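The edge-detection step can be demonstrated in miniature. The sketch below assumes a plain Sobel gradient-magnitude detector applied to a synthetic, interferogram-like phase ramp with a fault-like step; it is an illustrative stand-in, not the actual E-DECIDER algorithm.

```python
import numpy as np

def sobel_edges(img, thresh):
    """Gradient-magnitude edge map via 3x3 Sobel kernels (valid region only).
    Linear discontinuities such as fault steps show up as strong edges."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            patch = img[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy) > thresh

# Synthetic "interferogram": smooth deformation ramp plus a fault-like phase step.
y, x = np.mgrid[0:64, 0:64]
phase = 0.02 * x + 1.5 * (x > 32)
edges = sobel_edges(phase, thresh=1.0)
```

The detector flags the columns straddling the step and nothing in the smoothly varying background, which is the behavior one wants when turning interferogram discontinuities into candidate fault traces.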
ERIC Educational Resources Information Center
Bailey, James; Sass, Mary; Swiercz, Paul M.; Seal, Craig; Kayes, D. Christopher
2005-01-01
Modern organizations prize teamwork. Management schools have responded to this reality by integrating teamwork into the curriculum. Two important challenges associated with integrating teams in the management classroom include (a) designing teamwork assignments that achieve multiple, sophisticated learning outcomes and (b) instruction in, and…
Evaluation of Agency Non-Code Layered Pressure Vessels (LPVs)
NASA Technical Reports Server (NTRS)
Prosser, William H.
2014-01-01
In coordination with the Office of Safety and Mission Assurance and the respective Center Pressure System Managers (PSMs), the NASA Engineering and Safety Center (NESC) was requested to formulate a consensus draft proposal for the development of additional testing and analysis methods to establish the technical validity, and any limitation thereof, for the continued safe operation of facility non-code layered pressure vessels. The PSMs from each NASA Center were asked to participate as part of the assessment team by providing, collecting, and reviewing data regarding current operations of these vessels. This report contains the outcome of the assessment and the findings, observations, and NESC recommendations to the Agency and individual NASA Centers.
New Web Server - the Java Version of Tempest - Produced
NASA Technical Reports Server (NTRS)
York, David W.; Ponyik, Joseph G.
2000-01-01
A new software design and development effort has produced a Java (Sun Microsystems, Inc.) version of the award-winning Tempest software (refs. 1 and 2). In 1999, the Embedded Web Technology (EWT) team received a prestigious R&D 100 Award for Tempest, Java Version. In this article, "Tempest" will refer to the Java version of Tempest, a World Wide Web server for desktop or embedded systems. Tempest was designed at the NASA Glenn Research Center at Lewis Field to run on any platform for which a Java Virtual Machine (JVM, Sun Microsystems, Inc.) exists. The JVM acts as a translator between the native code of the platform and the byte code of Tempest, which is compiled in Java. These byte code files are Java executables with a ".class" extension. Multiple byte code files can be zipped together as a "*.jar" file for more efficient transmission over the Internet. Today's popular browsers, such as Netscape (Netscape Communications Corporation) and Internet Explorer (Microsoft Corporation) have built-in Virtual Machines to display Java applets.
U. S. Naval Forces, Vietnam Monthly Historical Supplement for April 1968
1968-10-11
CHNAVVLAT (Code 04)(2) COMSEVENTHFLT (Hist. Team) Pres, NAVWARCOL ... COMPHIBLANT COMPHIBPAC COMCBPAC COMBLANT COMNAVFACENGCOM U...ganda Team with an Australian Advisor, about 31 miles north of Vinh Long. PBR 87 landed the troops embarked and the Australian Advisor, but when PBR...responded to the situation and apprehended one man who was turned over to the Vietnamese National Police. The Royal Australian diving team at Vung Tau
Mediating the gap between the white coat ceremony and the ethics and professionalism curriculum.
Cohn, Felicia; Lie, Désirée
2002-11-01
Like many other medical schools, the University of California, Irvine annually conducts a White Coat ceremony in which incoming students take a professional oath of ethical conduct.(1) We report a new educational activity to connect the values expressed in the oath taken to the Ethics and Professionalism (EP) curriculum for first-year medical students(2) and its potential impact on physician training. Following the White Coat ceremony, students participated in the Patient Doctor Society course that integrates diverse curricular topics centered on physician-patient communication. During this course, the students were introduced to EP content through a collaborative peer exercise. With the assistance of background readings on professional values and ethics concepts, small groups of students were asked to construct their own codes of ethics. The process of working in a group became part of the learning. After developing a code of ethics, each group was asked to identify primary values embodied in its code; primary obligations to patients and their families, other members of the health care team, and the community; key factors influencing code development; and likely effects of the code on the conduct of medical students and physicians. The goals of the session were to recognize formally both individual values and the values to which students commit themselves during the White Coat ceremony, to facilitate understanding of those values, and to begin to reconcile differences between personal and professional values. The small groups convened to report their findings in a three-hour session. Common values expressed by the students included patient autonomy, respect, beneficence, and professionalism. The delivery of quality health care, communication, education, and the equitable distribution of health care were among the most often listed obligations. 
The students reported that culture, societal values, family, experience, religion, education, and assigned readings were the key sources of the values in their codes. Most of the students enjoyed and learned from the exercise, believing that a code of ethics will serve as a helpful educational guide while they are students and as an action guide in their future practices. Student evaluations, narrative feedback, and faculty observation indicated that the students appreciated the opportunity to work in teams and to explore professional values. The students' most common suggestion for improvement involved incorporating analysis of clinical cases in which questions about professional values arise. Medical educators suggest that students' values and professional behaviors change throughout medical school, but such change is difficult to assess. The code-development exercise established a baseline of values at entry to medical school. We plan to track this cohort of students by reintroducing this exercise in their fourth year and will compare the codes developed in their first and fourth years to identify changes in values and to suggest what the students have learned about values during medical school. The comparison will be used to inform further development of the EP curriculum toward the goal of shaping and supporting the positive professional growth of our student-physicians.
Current Status of Japanese Global Precipitation Measurement (GPM) Research Project
NASA Astrophysics Data System (ADS)
Kachi, Misako; Oki, Riko; Kubota, Takuji; Masaki, Takeshi; Kida, Satoshi; Iguchi, Toshio; Nakamura, Kenji; Takayabu, Yukari N.
2013-04-01
The Global Precipitation Measurement (GPM) mission is led by the Japan Aerospace Exploration Agency (JAXA) and the National Aeronautics and Space Administration (NASA) in collaboration with many international partners, who will provide a constellation of satellites carrying microwave radiometer instruments. The GPM Core Observatory carries the Dual-frequency Precipitation Radar (DPR), developed by JAXA and the National Institute of Information and Communications Technology (NICT), and the GPM Microwave Imager (GMI), developed by NASA. The GPM Core Observatory is scheduled to be launched in early 2014. JAXA also provides the Global Change Observation Mission 1st - Water (GCOM-W1) satellite, named "SHIZUKU," as one of the constellation satellites. The SHIZUKU satellite was launched on 18 May 2012 from JAXA's Tanegashima Space Center, and public data release of the Advanced Microwave Scanning Radiometer 2 (AMSR2) on board the SHIZUKU satellite was planned as follows: Level 1 products in January 2013, and Level 2 products, including precipitation, in May 2013. The Japanese GPM research project conducts scientific activities on algorithm development, ground validation, and application research, including production of research products. In addition, we promote collaborative studies in Japan and Asian countries, and public relations activities to extend the pool of potential users of satellite precipitation products. In the pre-launch phase, most of our activities are focused on algorithm development and the ground validation related to it. As the GPM standard products, JAXA develops the DPR Level 1 algorithm, and the NASA-JAXA Joint Algorithm Team develops the DPR Level 2 and the DPR-GMI combined Level 2 algorithms. JAXA also develops the Global Rainfall Map product as a national product to distribute an hourly, 0.1-degree horizontal resolution rainfall map.
All standard algorithms, including the Japan-US joint algorithms, will be reviewed by the Japan-US Joint Precipitation Measuring Mission (PMM) Science Team (JPST) before release. The DPR Level 2 algorithm is being developed by the DPR Algorithm Team led by Japan, which operates under the NASA-JAXA Joint Algorithm Team. The Level 2 algorithms will provide KuPR-only products, KaPR-only products, and dual-frequency precipitation products, with estimated precipitation rate, radar reflectivity, and precipitation information such as drop size distribution and bright band height. At-launch code was developed in December 2012. In addition, JAXA and NASA have provided synthetic DPR L1 data, and tests have been performed using them. The Japanese Global Rainfall Map algorithm for the GPM mission has been developed by the Global Rainfall Map Algorithm Development Team in Japan. The algorithm builds on the heritage of the Global Satellite Mapping of Precipitation (GSMaP) project, which was sponsored by the Japan Science and Technology Agency (JST) under the Core Research for Evolutional Science and Technology (CREST) framework between 2002 and 2007. The GSMaP near-real-time and reanalysis versions have been in operation at JAXA, with browse images and binary data available at the GSMaP web site (http://sharaku.eorc.jaxa.jp/GSMaP/). The GSMaP algorithm for GPM is developed in collaboration with the AMSR2 standard algorithm for the precipitation product, and their validation studies are closely related. As the JAXA GPM product, we will provide a 0.1-degree grid, hourly product for standard and near-real-time processing. Outputs will include hourly rainfall, gauge-calibrated hourly rainfall, and several quality-information fields (satellite information flag, time information flag, and gauge quality information) over global areas from 60°S to 60°N. The at-launch code of GSMaP for GPM is under development, and will be delivered to the JAXA GPM Mission Operation System by April 2013.
At-launch code will include several updates of microwave imager and sounder algorithms and databases, and introduction of rain-gauge correction.
NASA Astrophysics Data System (ADS)
Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald
2017-09-01
In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach an efficiency similar to that of other mature engineering disciplines, such as finite element analysis (e.g., structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been achieved. Computation times are drastically reduced compared to a few years ago, thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much prompter support to nuclear projects dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.
First results of coupled IPS/NIMROD/GENRAY simulations
NASA Astrophysics Data System (ADS)
Jenkins, Thomas; Kruger, S. E.; Held, E. D.; Harvey, R. W.; Elwasif, W. R.; Schnack, D. D.
2010-11-01
The Integrated Plasma Simulator (IPS) framework, developed by the SWIM Project Team, facilitates self-consistent simulations of complicated plasma behavior via the coupling of various codes modeling different spatial/temporal scales in the plasma. Here, we apply this capability to investigate the stabilization of tearing modes by ECCD. Under IPS control, the NIMROD code (MHD) evolves fluid equations to model bulk plasma behavior, while the GENRAY code (RF) calculates the self-consistent propagation and deposition of RF power in the resulting plasma profiles. GENRAY data is then used to construct moments of the quasilinear diffusion tensor (induced by the RF) which influence the dynamics of momentum/energy evolution in NIMROD's equations. We present initial results from these coupled simulations and demonstrate that they correctly capture the physics of magnetic island stabilization [Jenkins et al, PoP 17, 012502 (2010)] in the low-beta limit. We also discuss the process of code verification in these simulations, demonstrating good agreement between NIMROD and GENRAY predictions for the flux-surface-averaged, RF-induced currents. An overview of ongoing model development (synthetic diagnostics/plasma control systems; neoclassical effects; etc.) is also presented. Funded by US DoE.
A Process and Programming Design to Develop Virtual Patients for Medical Education
McGee, James B.; Wu, Martha
1999-01-01
Changes in the financing and delivery of healthcare in our nation's teaching hospitals have diminished the variety and quality of a medical student's clinical training. The Virtual Patient Project is a series of computer-based, multimedia clinical simulations designed to fill this gap. After developing a successful prototype and obtaining funding for a series of 16 cases, we created a method to write and produce many virtual patients. Case authors now meet with our production team to write and edit a movie-like script. This script is converted into a design document that specifies the clinical aspects, teaching points, media production, and interactivity of each case. The program's code was modularized, using object-oriented techniques, to allow for variations between cases and for team programming. All of the clinical and teaching content is stored in a database that allows for faster and easier editing by many persons simultaneously.
Aeras: A next generation global atmosphere model
Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...
2015-06-01
Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.
Cost/Performance Ratio Achieved by Using a Commodity-Based Cluster
NASA Technical Reports Server (NTRS)
Lopez, Isaac
2001-01-01
Researchers at the NASA Glenn Research Center acquired a commodity cluster based on Intel Corporation processors to compare its performance with that of a traditional UNIX cluster in the execution of aeropropulsion applications. Since the cost differential of the clusters was significant, a cost/performance ratio was calculated. After executing a propulsion application on both clusters, the researchers demonstrated a 9.4 cost/performance ratio in favor of the Intel-based cluster. These researchers use the Aeroshark cluster as one of the primary testbeds for developing NPSS parallel application codes and system software. The Aeroshark cluster provides 64 Intel Pentium II 400-MHz processors, housed in 32 nodes. Recently, APNASA, a code developed by a Government/industry team for the design and analysis of turbomachinery systems, was used for a simulation on Glenn's Aeroshark cluster.
An approach to verification and validation of a reliable multicasting protocol: Extended Abstract
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.
1995-01-01
This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. 
We have found that this interactive, iterative approach to development allows software designers to focus on delivering nominal functionality while the V&V team focuses on analyzing off-nominal cases. Testing serves as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model and (2) three example problems found during the development of RMP. Although RMP has provided our research effort with a rich set of test cases, it also has practical applications within NASA. For example, RMP is being considered for use in the NASA EOSDIS project due to its significant performance benefits in applications that need to replicate large amounts of data to many network sites.
NASA Technical Reports Server (NTRS)
Quaranto, Kristy
2014-01-01
This internship provided an opportunity to work with NASA's Ground Support Equipment (GSE) for the Spaceport Command and Control System (SCCS) at Kennedy Space Center as a remote display developer, under NASA technical mentor Kurt Leucht. The main focus was on creating remote displays and applications for the hypergolic and high-pressure helium subsystem team to help control the filling of the respective tanks. As a remote display and application developer for this team, the intern was responsible for creating and testing graphical remote displays and applications to be used in the Launch Control Center (LCC) on the Firing Room's computers. To become familiar with the subsystem, the intern attended multiple project meetings and gathered the team's specific requirements for the software. After receiving the requirements, the next step was to create displays with both visual appeal and logical order using the Display Editor on the Virtual Machine (VM). In doing so, all Compact Unique Identifiers (CUIs), which are associated with specific components within the subsystem, needed to be included in each respective display for the system to run properly. Once a display was created, it was tested with the Test Driver, also found on the VM, to ensure that it ran as intended. The Test Driver is an application that checks that all the CUIs in the display are running properly and returning the correct form of information. After creation and local testing, each display went through further testing and evaluation before being deemed suitable for actual use. For the remote applications, the intern was responsible for creating a project that focused on channelizing each component included in each display. 
The core of the application code was created by setting up spreadsheets and having an auto test generator produce the complete code structure. This application code was then loaded and run in a testing environment to ensure that it behaved as anticipated. By the end of the semester-long experience at NASA's Kennedy Space Center, the intern had gained substantial knowledge and experience in various areas of both display and application development and testing. This was demonstrated by creating multiple successful remote displays that will one day be used by the hypergolic and high-pressure helium subsystem team in the LCC's firing rooms to service the new Orion spacecraft. The completed display channelization application will be used to receive verification from NASA quality engineers.
Proceedings of the Toronto TEAM/ACES workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, L.R.
The third TEAM Workshop of the third round was held at Ontario Hydro in Toronto, 25-26 October 1990, immediately following the Conference on Electromagnetic Field Computation. This was the first joint workshop with ACES (Applied Computational Electromagnetics Society), whose goals are similar to TEAM's but whose members tend to work at higher frequencies (antennas, propagation, and scattering). A fusion problem, the eddy current heating of the case of the Euratom Large Coil Project coil, was adopted as Problem 14 at the Oxford Workshop, and a solution to that problem was presented at Toronto by Oskar Biro of the Graz (Austria) University of Technology. Individual solutions were also presented for Problems 8 (Flaw in a Plate) and 9 (Moving Coil inside a Pipe). Five new solutions were presented to Problem 13 (DC Coil in a Ferromagnetic Yoke), and Koji Fujiwara of Okayama University summarized these solutions along with the similar number presented at Oxford. The solutions agreed well in the air but disagreed in the steel. Codes with a formulation in magnetic field strength or scalar potential underestimated the flux density in the steel, and codes based on flux density or vector potential overestimated it. Codes with edge elements appeared to do better than codes with nodal elements. These results stimulated considerable discussion; in my view that was the most valuable result of the workshop.
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing the negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies that ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed include how we have leveraged standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. 
We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
A novel use of photovoice methodology in a leadership APPE and pharmacy leadership elective.
Wilson, Jane E; Smith, Michael J; Lambert, Tammy L; George, David L; Bulkley, Christina
2017-11-01
The purpose of this article is to describe and assess the effectiveness of an innovative teaching approach in an advanced pharmacy practice experience (APPE) and leadership elective. Three cohorts of students [(2014: n = 14), (2015: n = 17), (2016: n = 19)] were introduced to the photovoice (PV) method in their leadership APPE. PV required students to take, present, and discuss photographs within their cohorts. PV was used as a teaching method with the intention that the process would compel students to be involved in leadership development throughout experiential rotations, participate in discussions related to leadership development, and engage in creative activity. Group discussions from the class of 2014 were recorded and transcribed. Students from all cohorts were asked to participate in an electronic survey containing items based on PV learning objectives. All students were asked to participate in semi-structured interviews about PV. The inductive coding method was used to identify themes from discussion transcripts. Analysis of themes revealed 51.5% of the PV photographs related to emotional intelligence. Development of others and strong teams were themes represented in 44.3% of photographs. Survey data indicated all respondents agreed PV was a valuable method to describe learning in leadership. Interview coding revealed themes related to emotional intelligence and development of teams. The PV method was an effective teaching tool in a leadership APPE and elective course. PV is a teaching method to be utilized in a variety of experiential learning environments to better enhance the professional development of pharmacy students. Copyright © 2017 Elsevier Inc. All rights reserved.
Brocklehurst, P; Nomura, M; Ozaki, T; Ferguson, J; Matsuda, R
2013-11-01
Leadership has been argued to be a key component in the transformation of services in the United Kingdom and in Japan. In the UK, local professional networks have developed to provide clinician led care in dentistry; working to develop local plans to deliver improvements in the quality of care for patients. In Japan, the remuneration model for dental care has been revised with the aim to improve the service and tackle the current challenges of population health there. The aim of this study was to use semi-structured interviews and thematic analysis to explore general dental practitioners' (GDPs) understanding of the term 'leadership' and determine whether its meaning is culturally bound. Twelve participants were sampled purposively by the research team; identifying GDPs involved in leadership roles from across Greater Manchester and Tokyo. A set of open-ended questions was developed for semi-structured interviews a priori and the interviews continued until saturation. Interviews were recorded, transcribed verbatim and codes were developed into a coding frame for thematic analysis. Representative quotations are provided in the results. Fourteen codes were identified according to the aims of the study and organised into five overarching themes. 'Leadership as the relationship' was more pronounced among Japanese GDPs, while 'leadership as the individual' was common in GDPs from Greater Manchester. Differences were also found in respect of education and training in leadership. Training was also considered to be important by the GDPs from Japan, while UK GDPs felt leaders were more likely to be influenced by innate qualities. The interdependence of leadership and entrepreneurship was raised by both sets of GDPs. The concept of leadership was considered to be important by GDPs from both Greater Manchester and Tokyo; leadership was seen as providing strategy and direction for a clinical team. However, cultural influences were evident in how this was conceptualised.
Performance evaluation of the intra compression in the video coding standards
NASA Astrophysics Data System (ADS)
Abramowski, Andrzej
2015-09-01
The article presents a comparison of the intra prediction algorithms in the current state-of-the-art video coding standards, including MJPEG 2000, VP8, VP9, H.264/AVC and H.265/HEVC. The effectiveness of the techniques employed by each standard is evaluated in terms of compression efficiency and average encoding time. The compression efficiency is measured using the BD-PSNR and BD-RATE metrics, with H.265/HEVC results as an anchor. Tests are performed on a set of video sequences composed of sequences gathered by the Joint Collaborative Team on Video Coding during the development of the H.265/HEVC standard and 4K sequences provided by the Ultra Video Group. According to the results, H.265/HEVC provides significant bit-rate savings at the expense of computational complexity, while VP9 may be regarded as a compromise between efficiency and required encoding time.
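The BD-PSNR metric used above follows Bjøntegaard's method: fit PSNR as a cubic polynomial of the logarithm of bitrate for each codec, then average the difference of the two fits over the overlapping bitrate range. The sketch below is a minimal illustration of that calculation, not the article's code; it assumes NumPy is available and that each codec supplies at least four rate/PSNR points:

```python
import numpy as np

def bd_psnr(rates_ref, psnr_ref, rates_test, psnr_test):
    """Bjontegaard delta PSNR: mean PSNR difference (test minus reference)
    over the overlapping bitrate range, after fitting PSNR as a cubic
    polynomial of log10(bitrate) for each codec."""
    lr_ref = np.log10(np.asarray(rates_ref, dtype=float))
    lr_test = np.log10(np.asarray(rates_test, dtype=float))
    # Cubic fit of PSNR versus log-rate for each codec.
    p_ref = np.polyfit(lr_ref, psnr_ref, 3)
    p_test = np.polyfit(lr_test, psnr_test, 3)
    # Overlapping log-rate interval common to both fits.
    lo = max(lr_ref.min(), lr_test.min())
    hi = min(lr_ref.max(), lr_test.max())
    # Integrate each polynomial over [lo, hi] and average the difference.
    P_ref, P_test = np.polyint(p_ref), np.polyint(p_test)
    int_ref = np.polyval(P_ref, hi) - np.polyval(P_ref, lo)
    int_test = np.polyval(P_test, hi) - np.polyval(P_test, lo)
    return (int_test - int_ref) / (hi - lo)
```

A test codec whose curve sits uniformly 2 dB above the reference yields a BD-PSNR of +2 dB; BD-RATE is computed analogously with the roles of rate and PSNR exchanged.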
CASL Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mousseau, Vincent Andrew; Dinh, Nam
2016-06-30
This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, and MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.
Problem-Solving Phase Transitions During Team Collaboration.
Wiltshire, Travis J; Butner, Jonathan E; Fiore, Stephen M
2018-01-01
Multiple theories of problem-solving hypothesize that there are distinct qualitative phases exhibited during effective problem-solving. However, limited research has attempted to identify when transitions between phases occur. We integrate theory on collaborative problem-solving (CPS) with dynamical systems theory suggesting that when a system is undergoing a phase transition it should exhibit a peak in entropy and that entropy levels should also relate to team performance. Communications from 40 teams that collaborated on a complex problem were coded for occurrence of problem-solving processes. We applied a sliding window entropy technique to each team's communications and specified criteria for (a) identifying data points that qualify as peaks and (b) determining which peaks were robust. We used multilevel modeling, and provide a qualitative example, to evaluate whether phases exhibit distinct distributions of communication processes. We also tested whether there was a relationship between entropy values at transition points and CPS performance. We found that a proportion of entropy peaks was robust and that the relative occurrence of communication codes varied significantly across phases. Peaks in entropy thus corresponded to qualitative shifts in teams' CPS communications, providing empirical evidence that teams exhibit phase transitions during CPS. Also, lower average levels of entropy at the phase transition points predicted better CPS performance. We specify future directions to improve understanding of phase transitions during CPS, and collaborative cognition, more broadly. Copyright © 2017 Cognitive Science Society, Inc.
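The sliding-window entropy technique described above can be sketched in a few lines: compute the Shannon entropy of the communication codes inside each window, then flag local maxima as candidate phase transitions. This is a minimal illustration under assumed choices (window size, strict-neighbor peak criterion), not the authors' analysis code:

```python
from collections import Counter
from math import log2

def shannon_entropy(codes):
    """Shannon entropy (bits) of a sequence of categorical codes."""
    n = len(codes)
    counts = Counter(codes)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def sliding_window_entropy(codes, window=5):
    """Entropy of each consecutive window of coded communications."""
    return [shannon_entropy(codes[i:i + window])
            for i in range(len(codes) - window + 1)]

def entropy_peaks(series):
    """Indices strictly greater than both neighbors: candidate transitions."""
    return [i for i in range(1, len(series) - 1)
            if series[i] > series[i - 1] and series[i] > series[i + 1]]
```

For a team whose communications shift from one dominant process code to another, the entropy series is low inside each phase and rises where the codes mix, so peaks mark the transition region. (Identifying which peaks are robust, as the study does, requires an additional criterion such as a surrogate or permutation baseline.)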
A recent Cleanroom success story: The Redwing project
NASA Technical Reports Server (NTRS)
Hausler, Philip A.
1992-01-01
Redwing is the largest completed Cleanroom software engineering project in IBM, both in terms of lines of code and project staffing. The product provides a decision-support facility that utilizes artificial intelligence (AI) technology for predicting and preventing complex operating problems in an MVS environment. The project used the Cleanroom process for development and realized a defect rate of 2.6 errors/KLOC, measured from first execution. This represents the total amount of errors that were found in testing and installation at three field test sites. Development productivity was 486 LOC/PM, which included all development labor expended in design specification through completion of incremental testing. In short, the Redwing team produced a complex systems software product with an extraordinarily low error rate, while maintaining high productivity. All of this was accomplished by a project team using Cleanroom for the first time. An 'introductory implementation' of Cleanroom was defined and used on Redwing. This paper describes the quality and productivity results, the Redwing project, and how Cleanroom was implemented.
78 FR 14912 - International Aviation Safety Assessment (IASA) Program Change
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
...; and Public Expectations of IASA Categories Removal of Inactive Countries Under the IASA program, the... can put a U.S. carrier code on its flights. Public Expectations of IASA Category Ratings Members of... by a team consisting of a team leader and at least one expert in operations, maintenance, and...
Edge Simulation Laboratory Progress and Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohen, R
The Edge Simulation Laboratory (ESL) is a project to develop a gyrokinetic code for MFE edge plasmas based on continuum (Eulerian) techniques. ESL is a base-program activity of OFES, with an allied algorithm research activity funded by the OASCR base math program. ESL OFES funds directly support about 0.8 FTE of career staff at LLNL, a postdoc and a small fraction of an FTE at GA, and a graduate student at UCSD. In addition, the allied OASCR program funds about 1/2 FTE each in the computation directorates at LBNL and LLNL. OFES ESL funding for LLNL and UCSD began in fall 2005, while funding for GA and the math team began about a year ago. ESL's continuum approach is a complement to the PIC-based methods of the CPES Project, and was selected (1) because of concerns about noise issues associated with PIC in the high-density-contrast environment of the edge pedestal, (2) to be able to exploit advanced numerical methods developed for fluid codes, and (3) to build upon the successes of core continuum gyrokinetic codes such as GYRO, GS2 and GENE. The ESL project presently has three components: TEMPEST, a full-f, full-geometry (single-null divertor, or arbitrary-shape closed flux surfaces) code in E, {mu} (energy, magnetic-moment) coordinates; EGK, a simple-geometry rapid-prototype code; and the math component, which is developing and implementing algorithms for a next-generation code. Progress would be accelerated if we could find funding for a fourth, computer science, component, which would develop software infrastructure, provide user support, and address needs for data handling and analysis. We summarize the status and plans for the three funded activities.
Development and feasibility testing of the Pediatric Emergency Discharge Interaction Coding Scheme.
Curran, Janet A; Taylor, Alexandra; Chorney, Jill; Porter, Stephen; Murphy, Andrea; MacPhee, Shannon; Bishop, Andrea; Haworth, Rebecca
2017-08-01
Discharge communication is an important aspect of high-quality emergency care. This study addresses the gap in knowledge on how to describe discharge communication in a paediatric emergency department (ED). The objective of this feasibility study was to develop and test a coding scheme to characterize discharge communication between health-care providers (HCPs) and caregivers who visit the ED with their children. The Pediatric Emergency Discharge Interaction Coding Scheme (PEDICS) and coding manual were developed following a review of the literature and an iterative refinement process involving HCP observations, inter-rater assessments and team consensus. The coding scheme was pilot-tested through observations of HCPs across a range of shifts in one urban paediatric ED. Overall, 329 patient observations were carried out across 50 observational shifts. Inter-rater reliability was evaluated in 16% of the observations. The final version of the PEDICS contained 41 communication elements. Kappa scores were greater than .60 for the majority of communication elements. The most frequently observed communication elements were under the Introduction node and the least frequently observed were under the Social Concerns node. HCPs initiated the majority of the communication. The PEDICS addresses an important gap in the discharge communication literature. The tool is useful for mapping patterns of discharge communication between HCPs and caregivers. Results from our pilot test identified deficits in specific areas of discharge communication that could impact adherence to discharge instructions. The PEDICS would benefit from further testing with a different sample of HCPs. © 2017 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
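The kappa scores reported above are a chance-corrected measure of inter-rater agreement (Cohen's kappa): observed agreement minus the agreement expected if both raters coded independently at their observed marginal rates. A minimal sketch of the two-rater calculation, for illustration only (not the study's analysis code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who each assigned one categorical code per observation."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of observations coded identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement from the raters' marginal code frequencies.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa is 1.0 for perfect agreement and 0 for chance-level agreement; the study's threshold of .60 is a common cutoff for "substantial" agreement.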
Fernandez, Rosemarie; Pearce, Marina; Grand, James A; Rench, Tara A; Jones, Kerin A; Chao, Georgia T; Kozlowski, Steve W J
2013-11-01
To determine the impact of a low-resource-demand, easily disseminated computer-based teamwork process training intervention on teamwork behaviors and patient care performance in code teams. A randomized comparison trial of computer-based teamwork training versus placebo training was conducted from August 2010 through March 2011. This study was conducted at the simulation suite within the Kado Family Clinical Skills Center, Wayne State University School of Medicine. Participants (n = 231) were fourth-year medical students and first-, second-, and third-year emergency medicine residents at Wayne State University. Each participant was assigned to a team of four to six members (n = 45 teams). Teams were randomly assigned to receive either a 25-minute computer-based training module targeting appropriate resuscitation teamwork behaviors or a placebo training module. Teamwork behaviors and patient care behaviors were video recorded during high-fidelity simulated patient resuscitations and coded by trained raters blinded to condition assignment and study hypotheses. Teamwork behavior items (e.g., "chest radiograph findings communicated to team" and "team member assists with intubation preparation") were standardized before combining to create overall teamwork scores. Similarly, patient care items ("chest radiograph correctly interpreted"; "time to start of compressions") were standardized before combining to create overall patient care scores. Subject matter expert reviews and pilot testing of scenario content, teamwork items, and patient care items provided evidence of content validity. When controlling for team members' medically relevant experience, teams in the training condition demonstrated better teamwork (F [1, 42] = 4.81, p < 0.05; ηp² = 10%) and patient care (F [1, 42] = 4.66, p < 0.05; ηp² = 10%) than did teams in the placebo condition. Computer-based team training positively impacts teamwork and patient care during simulated patient resuscitations. 
This low-resource team training intervention may help to address the dissemination and sustainability issues associated with larger, more costly team training programs.
Operative team communication during simulated emergencies: Too busy to respond?
Davis, W Austin; Jones, Seth; Crowell-Kuhnberg, Adrianna M; O'Keeffe, Dara; Boyle, Kelly M; Klainer, Suzanne B; Smink, Douglas S; Yule, Steven
2017-05-01
Ineffective communication among members of a multidisciplinary team is associated with operative error and failure to rescue. We sought to measure operative team communication in a simulated emergency using an established communication framework called "closed loop communication." We hypothesized that communication directed at a specific recipient would be more likely to elicit a check back or closed loop response and that this relationship would vary with changes in patients' clinical status. We used the closed loop communication framework to code retrospectively the communication behavior of 7 operative teams (each comprising 2 surgeons, anesthesiologists, and nurses) during response to a simulated, postanesthesia care unit "code blue." We identified call outs, check backs, and closed loop episodes and applied descriptive statistics and a mixed-effects negative binomial regression to describe characteristics of communication in individuals and in different specialties. We coded a total of 662 call outs. The frequency and type of initiation and receipt of communication events varied between clinical specialties (P < .001). Surgeons and nurses initiated fewer and received more communication events than anesthesiologists. For the average participant, directed communication increased the likelihood of check back by at least 50% (P = .021) in periods preceding acute changes in the clinical setting, and exerted no significant effect in periods after acute changes in the clinical situation. Communication patterns vary by specialty during a simulated operative emergency, and the effect of directed communication in eliciting a response depends on the clinical status of the patient. Operative training programs should emphasize the importance of quality communication in the period immediately after an acute change in the clinical setting of a patient and recognize that communication patterns and needs vary between members of multidisciplinary operative teams. 
Copyright © 2016 Elsevier Inc. All rights reserved.
Curran, Vernon; Fleet, Lisa; Greene, Melanie
2012-01-01
Resuscitation and life support skills training comprises a significant proportion of continuing education programming for health professionals. The purpose of this study was to explore the perceptions and attitudes of certified resuscitation providers toward the retention of resuscitation skills, regular skills updating, and methods for enhancing retention. A mixed-methods, explanatory study design was undertaken utilizing focus groups and an online survey-questionnaire of rural and urban health care providers. Rural providers reported less experience with real codes and lower abilities across a variety of resuscitation areas. Mock codes, practice with an instructor and a team, self-practice with a mannequin, and e-learning were popular methods for skills updating. Aspects of team performance that were felt to influence resuscitation performance included: discrepancies in skill levels, lack of communication, and team leaders not up to date on their skills. Confidence in resuscitation abilities was greatest after one had recently practiced or participated in an update or an effective debriefing session. Lowest confidence was reported when team members did not work well together, there was no clear leader of the resuscitation code, or if team members did not communicate. The study findings highlight the importance of access to update methods for improving providers' confidence and abilities, and the need for emphasis on teamwork training in resuscitation. An eclectic approach combining methods may be the best strategy for addressing the needs of health professionals across various clinical departments and geographic locales. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.
Meyer, Emily M; Zapatka, Susan; Brienza, Rebecca S
2015-06-01
The United States Department of Veterans Affairs Connecticut Healthcare System (VACHS) is one of five Centers of Excellence in Primary Care Education (CoEPCE) pilot sites. The overall goal of the CoEPCE program, which is funded by the Office of Academic Affiliations, is to develop and implement innovative approaches for training future health care providers in postgraduate education programs to function effectively in teams to provide exceptional patient care. This longitudinal study employs theoretically grounded qualitative methods to understand the effect of a combined nursing and medical training model on professional identity and team development at the VACHS CoEPCE site. The authors used qualitative approaches to understand trainees' experiences, expectations, and impressions of the program. From September 2011 to August 2012, they conducted 28 interviews of 18 trainees (internal medicine [IM] residents and nurse practitioners [NPs]) and subjected data to three stages of open, iterative coding. Major themes illuminate both the evolution of individual professional identity within both types of trainees and the dynamic process of group identity development. Results suggest that initially IM residents struggled to understand NPs' roles and responsibilities, whereas NP trainees doubted their ability to work alongside physicians. At the end of one academic year, these uncertainties disappeared, and what was originally artificial had transformed into an organic interprofessional team of health providers who shared a strong sense of understanding and trust. This study provides early evidence of successful interprofessional collaboration among NPs and IM residents in a primary care training program.
Performance study of a data flow architecture
NASA Technical Reports Server (NTRS)
Adams, George
1985-01-01
Teams of scientists studied data flow concepts, static data flow machine architecture, and the VAL language. Each team mapped its application onto the machine and coded it in VAL. The principal findings of the study were: (1) Five of the seven applications used the full power of the target machine. The galactic simulation and multigrid fluid flow teams found that a significantly smaller version of the machine (16 processing elements) would suffice. (2) A number of machine design parameters including processing element (PE) function unit numbers, array memory size and bandwidth, and routing network capability were found to be crucial for optimal machine performance. (3) The study participants readily acquired VAL programming skills. (4) Participants learned that application-based performance evaluation is a sound method of evaluating new computer architectures, even those that are not fully specified. During the course of the study, participants developed models for using computers to solve numerical problems and for evaluating new architectures. These models form the bases for future evaluation studies.
Horvath, Keith J; Ecklund, Alexandra M; Hunt, Shanda L; Nelson, Toben F; Toomey, Traci L
2015-01-23
Researchers and practitioners interested in developing online health interventions most often rely on Web-based and print resources to guide them through the process of online intervention development. Although useful for understanding many aspects of best practices for website development, missing from these resources are concrete examples of experiences in online intervention development for health apps from the perspective of those conducting online health interventions. This study aims to serve as a series of case studies in the development of online health interventions to provide insights for researchers and practitioners who are considering technology-based interventional or programmatic approaches. A convenience sample of six study coordinators and five principal investigators at a large, US-based land grant university were interviewed about the process of developing online interventions in the areas of alcohol policy, adolescent health, medication adherence, and human immunodeficiency virus prevention in transgender persons and in men who have sex with men. Participants were asked questions that broadly addressed each of the four phases of the User-Centered Design Process Map from the US Department of Health and Human Services' Research-Based Web Design & Usability Guidelines. Interviews were audio recorded and transcribed. Qualitative codes were developed using line-by-line open coding for all transcripts, and all transcripts were coded independently by at least 2 authors. Differences among coders were resolved with discussion. 
We identified the following seven themes: (1) hire a strong (or at least the right) research team, (2) take time to plan before beginning the design process, (3) recognize that vendors and researchers have differing values, objectives, and language, (4) develop a detailed contract, (5) document all decisions and development activities, (6) use a content management system, and (7) allow extra time for testing and debugging your intervention. Each of these areas is discussed in detail, with supporting quotations from principal investigators and study coordinators. The values held by members of each participating organization involved in the development of the online intervention or program, as well as the objectives the website is intended to meet, must be considered. These defined values and objectives should prompt an open and explicit discussion about the scope of work, budget, and other needs from the perspectives of each organization. Because of the complexity of developing online interventions, researchers and practitioners should become familiar with the process and how it may differ from the development and implementation of in-person interventions or programs. To assist with this, the intervention team should consider expanding the team to include experts in computer science or learning technologies, as well as taking advantage of institutional resources that will be needed for successful completion of the project. Finally, we describe the tradeoff between funds available for online intervention or program development and the complexity of the project.
Simulation trainer for practicing emergent open thoracotomy procedures.
Hamilton, Allan J; Prescher, Hannes; Biffar, David E; Poston, Robert S
2015-07-01
An emergent open thoracotomy (OT) is a high-risk, low-frequency procedure uniquely suited for simulation training. We developed a cost-effective Cardiothoracic (CT) Surgery trainer and assessed its potential for improving technical and interprofessional skills during an emergent simulated OT. We modified a commercially available mannequin torso with artificial tissue models to create a custom CT Surgery trainer. The trainer's feasibility for simulating emergent OT was tested using a multidisciplinary CT team in three consecutive in situ simulations. Five discretely observable milestones were identified as requisite steps in carrying out an emergent OT; namely (1) diagnosis and declaration of a code situation, (2) arrival of the code cart, (3) arrival of the thoracotomy tray, (4) initiation of the thoracotomy incision, and (5) defibrillation of a simulated heart. The time required for a team to achieve each discrete step was measured by an independent observer over the course of each OT simulation trial and compared. Over the course of the three OT simulation trials conducted in the coronary care unit, there was an average reduction of 29.5% (P < 0.05) in the times required to achieve the five critical milestones. The time required to complete the whole OT procedure improved by 7 min and 31 s from the initial to the final trial, an overall improvement of 40%. In our preliminary evaluation, the CT Surgery trainer appears to be useful for improving team performance during a simulated emergent bedside OT in the coronary care unit. Copyright © 2015 Elsevier Inc. All rights reserved.
The benefits of flexible team interaction during crises.
Stachowski, Alicia A; Kaplan, Seth A; Waller, Mary J
2009-11-01
Organizations increasingly rely on teams to respond to crises. While research on team effectiveness during nonroutine events is growing, naturalistic studies examining team behaviors during crises are relatively scarce. Furthermore, the relevant literature offers competing theoretical rationales concerning effective team response to crises. In this article, the authors investigate whether high- versus average-performing teams can be distinguished on the basis of the number and complexity of their interaction patterns. Using behavioral observation methodology, the authors coded the discrete verbal and nonverbal behaviors of 14 nuclear power plant control room crews as they responded to a simulated crisis. Pattern detection software revealed systematic differences among crews in their patterns of interaction. Mean comparisons and discriminant function analysis indicated that higher performing crews exhibited fewer, shorter, and less complex interaction patterns. These results illustrate the limitations of standardized response patterns and highlight the importance of team adaptability. Implications for future research and for team training are included.
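The pattern-detection step described above can be sketched in miniature: if each observed verbal or nonverbal behavior is coded as a symbol, recurring interaction patterns are contiguous subsequences (n-grams) that repeat. The sketch below is a toy stand-in for dedicated pattern-detection software, with hypothetical behavior codes; it is not the tooling used in the study.

```python
from collections import Counter

def interaction_patterns(sequence, min_len=2, max_len=4, min_count=2):
    """Count recurring contiguous patterns (n-grams) in a coded behavior
    sequence and keep only those that occur at least min_count times."""
    counts = Counter()
    for n in range(min_len, max_len + 1):
        for i in range(len(sequence) - n + 1):
            counts[tuple(sequence[i:i + n])] += 1
    return {p: c for p, c in counts.items() if c >= min_count}

# Hypothetical coded transcript: each symbol is one observed behavior.
crew = ["ask", "ack", "cmd", "ask", "ack", "cmd", "note"]
patterns = interaction_patterns(crew)
# The recurring "ask -> ack -> cmd" sequence surfaces as a pattern;
# one-off behaviors such as "note" do not.
```

Comparing crews would then amount to comparing the number, length, and complexity of the patterns each crew exhibits, which is the contrast the study draws between higher- and average-performing crews.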
PS1-41: Just Add Data: Implementing an Event-Based Data Model for Clinical Trial Tracking
Fuller, Sharon; Carrell, David; Pardee, Roy
2012-01-01
Background/Aims Clinical research trials often have similar fundamental tracking needs, despite being quite variable in their specific logic and activities. A model tracking database that can be quickly adapted by a variety of studies has the potential to achieve significant efficiencies in database development and maintenance. Methods Over the course of several different clinical trials, we have developed a database model that is highly adaptable to a variety of projects. Rather than hard-coding each specific event that might occur in a trial, along with its logical consequences, this model considers each event and its parameters to be a data record in its own right. Each event may have related variables (metadata) describing its prerequisites, subsequent events due, associated mailings, or events that it overrides. The metadata for each event is stored in the same record with the event name. When changes are made to the study protocol, no structural changes to the database are needed. One has only to add or edit events and their metadata. Changes in the event metadata automatically determine any related logic changes. In addition to streamlining application code, this model simplifies communication between the programmer and other team members. Database requirements can be phrased as changes to the underlying data, rather than to the application code. The project team can review a single report of events and metadata and easily see where changes might be needed. In addition to benefiting from streamlined code, the front end database application can also implement useful standard features such as automated mail merges and to-do lists. Results The event-based data model has proven itself to be robust, adaptable and user-friendly in a variety of study contexts. We have chosen to implement it as a SQL Server back end and distributed Access front end. Interested readers may request a copy of the Access front end and scripts for creating the back end database.
Discussion An event-based database with a consistent, robust set of features has the potential to significantly reduce development time and maintenance expense for clinical trial tracking databases.
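A minimal sketch of the event-as-data model described above, using an in-memory SQLite table in place of the SQL Server back end; the table layout, column names, and events are illustrative, not taken from the actual tracking database.

```python
import sqlite3

# Each event row carries its own metadata (prerequisite, follow-up,
# mailing), so a protocol change is a data edit, not a schema or code
# change. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE event_def (
        name TEXT PRIMARY KEY,
        prerequisite TEXT,          -- event that must precede this one
        next_event TEXT,            -- event automatically due afterwards
        days_until_next INTEGER,    -- delay before the follow-up is due
        mailing TEXT                -- letter template sent at this event
    )""")
conn.executemany(
    "INSERT INTO event_def VALUES (?, ?, ?, ?, ?)",
    [("consent",   None,       "baseline",  7,    "welcome_letter"),
     ("baseline",  "consent",  "followup1", 90,   None),
     ("followup1", "baseline", None,        None, "reminder_letter")])

def next_due(completed_event):
    """Look up the follow-up event and its delay from metadata alone;
    the application code never hard-codes study-specific logic."""
    return conn.execute(
        "SELECT next_event, days_until_next FROM event_def WHERE name = ?",
        (completed_event,)).fetchone()
```

Adding a new visit to the protocol would then mean inserting one row, with no change to `next_due` or to the database schema, which is the adaptability the abstract describes.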
Computational strategies for three-dimensional flow simulations on distributed computer systems
NASA Technical Reports Server (NTRS)
Sankar, Lakshmi N.; Weed, Richard A.
1995-01-01
This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing distributed computing in a device-independent fashion and for load balancing. A flow solver called TEAM, presently in use at Lockheed Aeronautical Systems Company, was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms, including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented; specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated; specifically, static load balancing, task queue balancing, and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. (5) The implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
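The static versus task-queue load balancing contrast above can be illustrated with a toy simulation: static assignment fixes blocks to workers up front, while a task queue lets each worker pull the next block as it finishes, so uneven block costs spread out. The block costs and worker count below are made up for illustration and bear no relation to the actual TEAM benchmark cases.

```python
def static_makespan(costs, workers):
    """Assign blocks round-robin up front; makespan is the load of the
    slowest worker, which suffers when costs are uneven."""
    loads = [0] * workers
    for i, c in enumerate(costs):
        loads[i % workers] += c
    return max(loads)

def task_queue_makespan(costs, workers):
    """Greedy simulation of a shared task queue: each block goes to the
    currently least-loaded worker, mimicking workers pulling work as
    they become free."""
    loads = [0] * workers
    for c in costs:
        loads[loads.index(min(loads))] += c
    return max(loads)

block_costs = [9, 1, 1, 1, 9, 1, 1, 1]   # hypothetical uneven zone costs
static = static_makespan(block_costs, 2)       # both big blocks land on worker 0
dynamic = task_queue_makespan(block_costs, 2)  # queue spreads the load
```

With these costs the round-robin schedule puts both expensive blocks on the same worker, while the task queue balances them, which is the motivation for dynamic strategies on heterogeneous workstation clusters.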
Factors associated with delay in trauma team activation and impact on patient outcomes.
Connolly, Rory; Woo, Michael Y; Lampron, Jacinthe; Perry, Jeffrey J
2017-09-05
Trauma code activation is initiated by emergency physicians using physiological and anatomical criteria, mechanism of injury, and patient demographic factors. Our objective was to identify factors associated with delayed trauma team activation. We assessed consecutive cases from a regional trauma database from January 2008 to March 2014. We defined a delay in trauma code activation as a time greater than 30 minutes from the time of arrival. We conducted univariate analysis for factors potentially influencing trauma team activation, and we subsequently used multiple logistic regression analysis models for delayed activation in relation to mortality, length of stay, and time to operative management. A total of 846 patients were included in our analysis; 4.1% (35/846) of trauma codes were activated after 30 minutes. Mean age was 40.8 years in the early group versus 49.2 years in the delayed group (p=0.01). Patients were over age 70 years in 7.6% of the early activation group versus 17.1% of the delayed group (p=0.04). There was no significant difference in sex, type of injury, injury severity, or time from injury between the two groups. There was no significant difference in mortality, median length of stay, or median time to operative management. Delayed activation is associated with increasing age, with no clear link to increased mortality. Given the severe injuries in the delayed cohort that required trauma team activation, further emphasis should be placed on older trauma patients, along with interventions to recognize this vulnerable population.
Olson, Kristian R; Walsh, Madeline; Garg, Priya; Steel, Alexis; Mehta, Sahil; Data, Santorino; Petersen, Rebecca; Guarino, Anthony J; Bailey, Elizabeth; Bangsberg, David R
2017-02-01
Healthcare-focused hackathons are 48-hour platforms intended to accelerate novel medical technology. However, debate exists about how much they contribute to medical technology innovation. The Consortium for Affordable Medical Technologies (CAMTech) has developed a three-pronged model to maximise their effectiveness. To gauge the success of this model, we examined follow-up outcomes. Outcomes of 12 hackathons from 2012 to 2015 in India, Uganda and the USA were measured using emailed surveys. To minimise response bias, non-responding teams were coded as having made no progress. 331 individuals provided information on 196 of 356 projects (55.1% response rate), with no difference in responses from teams participating in different countries (Cramer's V=0.09, p=0.17). 30.3% of projects had made progress after a mean of 12.2 months. 88 (24.7%) teams had initiated pilot testing, with 42 (11.8%) piloting with care providers and 24 (6.7%) with patients. Overall, 97 teams (8.1 per hackathon) drafted business plans, 22 (1.8 per hackathon) had filed patents on their innovations and 15 (1.3 per hackathon) had formed new companies. Teams raised US$64.08 million in funding (average US$5.34 million per hackathon; median award size of $1800). In addition, 108 teams (30.3%) reported at least one member working on additional technologies with people they met at a hackathon. Individual confidence to address medical technology challenges was significantly increased after attending (t(1282)=192.77, p < 0.001). CAMTech healthcare hackathons lead to consistent output with respect to medical technology innovation, including clinical trials, business plan development, securing investment capital/funding and new company formation.
Kelm, Diana J; Ridgeway, Jennifer L; Gas, Becca L; Mohan, Monali; Cook, David A; Nelson, Darlene R; Benzo, Roberto P
2018-05-18
Mindfulness training includes mindfulness meditation, which has been shown to improve both attention and self-awareness. Medical providers in the intensive care unit often deal with difficult situations with strong emotions, life-and-death decisions, and both interpersonal and interprofessional conflicts. The effect of mindfulness meditation training on healthcare providers during acute care tasks such as cardiopulmonary resuscitation remains unknown. Mindfulness meditation has the potential to improve provider well-being and reduce stress in individuals involved in resuscitation teams, which could then translate into better team communication and delivery of care under stress. A better understanding of this process could lead to more effective training approaches, improved team performance, and better patient outcomes. All participants were instructed to use a mindfulness meditation device (Muse™ headband) at home for 7 min twice a day or 14 min daily over the 4-week training period. This device uses brainwave sensors to monitor active versus relaxing brain activity and provides real-time feedback. We conducted a single-group pretest-posttest convergent mixed-methods study. We enrolled 24 healthcare providers, comprising 4 interprofessional code teams, including physicians, nurses, respiratory therapists, and pharmacists. Each team participated in a simulation session immediately before and after the mindfulness training period. Each session consisted of two simulated cardiopulmonary arrest scenarios. Both quantitative and qualitative outcomes were assessed. The median proportion of participants who used the device as prescribed was 85%. Emotional balance, as measured by the critical positivity ratio, improved significantly from pretraining to posttraining (p = .02). Qualitative findings showed that mindfulness meditation changed how participants responded to work-related stress, including stress in real-code situations. 
Participants described the value of time for self-guided practice with feedback from the device, which then helped them develop individual approaches to meditation not reliant on the technology. Time measures during the simulated scenarios improved; specifically, time to epinephrine in Scenario 1 (p = .03) and time to defibrillation in Scenario 2 (p = .02). In addition, team performance, such as teamwork (p = .04), task management (p = .01), and overall performance (p = .04), improved significantly after mindfulness meditation training. Physiologic stress (skin conductance) improved but did not reach statistical significance (p = .11). Mindfulness meditation practice may improve individual well-being and team function in high-stress clinical environments. Our results may represent a foundation to design larger confirmatory studies.
Size Matters: How Big Should a Military Design Team Be?
2010-05-21
Jeff Bezos, the CEO of Amazon, recommends his "two-pizza" rule, which provides further support for the applicability of Miller's cognitive limitation of five to nine items to interpersonal interactions. Bezos' rule states that a team that cannot be fed with two pizzas is too large.
An Open Source Tool to Test Interoperability
NASA Astrophysics Data System (ADS)
Bermudez, L. E.
2012-12-01
Scientists interact with information at many levels, from gathering raw observed data to accessing portrayed, quality-controlled data products. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination, and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages, and handling of errors. Testing these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software decreases. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is an open source Java facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container, or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services, and XML instances against schemas and Schematron-based assertions for any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. These assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses.
This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC testing web site, and a description of how to perform local tests. It will also explain how to participate in the open source development of TEAM Engine.
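To give a flavor of the kinds of checks such a compliance test makes, the sketch below asserts on HTTP status, content type, XML well-formedness, and a service-exception condition against a canned response. It is a simplified illustration in Python, not CTL or TestNG, and the response payload is hypothetical rather than a real WFS reply.

```python
import xml.etree.ElementTree as ET

# Canned stand-in for a service response; a real test would issue an
# HTTP request and capture these fields from the reply.
canned_response = {
    "status": 200,
    "content_type": "text/xml",
    "body": '<ExceptionReport xmlns="http://www.opengis.net/ows">'
            '<Exception exceptionCode="InvalidParameterValue"/>'
            "</ExceptionReport>",
}

def check_response(resp):
    """Return a list of assertion failures for one response."""
    failures = []
    if resp["status"] != 200:
        failures.append("unexpected HTTP status")
    if "xml" not in resp["content_type"]:
        failures.append("response is not XML")
    try:
        root = ET.fromstring(resp["body"])
    except ET.ParseError:
        failures.append("body is not well-formed XML")
    else:
        # A well-formed reply can still signal a service-level error.
        if root.tag.endswith("ExceptionReport"):
            failures.append("service returned an exception report")
    return failures
```

A test suite such as the WFS 1.0.0 one chains hundreds of assertions of this shape, additionally validating the body against schemas and Schematron rules.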
Identifying Human Factors Issues in Aircraft Maintenance Operations
NASA Technical Reports Server (NTRS)
Veinott, Elizabeth S.; Kanki, Barbara G.; Shafto, Michael G. (Technical Monitor)
1995-01-01
Maintenance operations incidents submitted to the Aviation Safety Reporting System (ASRS) between 1986 and 1992 were systematically analyzed in order to identify issues relevant to human factors and crew coordination. This exploratory analysis involved 95 ASRS reports which represented a wide range of maintenance incidents. The reports were coded and analyzed according to the type of error (e.g., wrong part, procedural error, non-procedural error), contributing factors (e.g., individual, within-team, cross-team, procedure, tools), result of the error (e.g., aircraft damage or not), as well as the operational impact (e.g., aircraft flown to destination, air return, delay at gate). The main findings indicate that procedural errors were most common (48.4%) and that individual and team actions contributed to the errors in more than 50% of the cases. As for operational results, most errors were either corrected after landing at the destination (51.6%) or required the flight crew to stop en route (29.5%). Interactions among these variables are also discussed. This analysis is a first step toward developing a taxonomy of crew coordination problems in maintenance. By understanding which variables are important and how they are interrelated, we may develop intervention strategies that are better tailored to the human factors issues involved.
Continuous Energy Photon Transport Implementation in MCATK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Terry R.; Trahan, Travis John; Sweezy, Jeremy Ed
2016-10-31
The Monte Carlo Application ToolKit (MCATK) code development team has implemented Monte Carlo photon transport into the MCATK software suite. The current particle transport capabilities in MCATK, which process the tracking and collision physics, have been extended to enable tracking of photons using the same continuous energy approximation. We describe the four photoatomic processes implemented, which are coherent scattering, incoherent scattering, pair-production, and photoelectric absorption. The accompanying background, implementation, and verification of these processes will be presented.
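The process-selection step at the heart of such a photon tracker can be illustrated with a toy Monte Carlo sketch. The cross-section values below are invented for illustration, and the sampling routine is a generic discrete-inversion scheme, not MCATK's actual implementation:

```python
# Illustrative sketch of how a Monte Carlo photon tracker selects which
# photoatomic process occurs at a collision: each process is chosen with
# probability proportional to its cross section at the photon's energy.
# The cross sections here are made up; a real code like MCATK interpolates
# them from continuous-energy data libraries.
import random

def sample_process(cross_sections, rng):
    """Pick a process name with probability proportional to its cross section."""
    total = sum(cross_sections.values())
    xi = rng.random() * total          # uniform point on [0, total)
    running = 0.0
    for process, sigma in cross_sections.items():
        running += sigma
        if xi < running:
            return process
    return process  # guard against floating-point round-off

# Hypothetical cross sections (barns) for a mid-energy photon.
xs = {
    "coherent": 0.5,
    "incoherent": 2.0,
    "pair_production": 0.0,   # below the 1.022 MeV threshold at this energy
    "photoelectric": 0.8,
}

rng = random.Random(42)
counts = {p: 0 for p in xs}
for _ in range(100_000):
    counts[sample_process(xs, rng)] += 1
# Empirical frequencies should approach sigma_i / sigma_total.
print({p: round(c / 100_000, 3) for p, c in counts.items()})
```

With these numbers, incoherent scattering should be sampled about 61% of the time (2.0/3.3), and pair production never, since its cross section is zero.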
Advancing medical-surgical nursing practice: improving management of the changing patient condition.
Monroe, Heidi; Plylar, Peggy; Krugman, Mary
2014-01-01
Higher patient acuities and a growing number of novice nurses on medical-surgical units have educators focused on achieving positive outcomes when patient conditions change. An educational program was developed to enhance nurses' knowledge, skill, and confidence in assessing hemodynamics, recognizing early signs of instability, and administering vasoactive medications. The program was successful, yielding significant knowledge improvement and increased use of the Medical Emergency Team while maintaining a low number of code calls.
Trust-Based Collaborative Control for Teams on Communication Networks
2012-02-11
Das, F. L. Lewis, and K. Subbarao, "Sliding Mode Approach to Control Quadrotor Using Dynamic Inversion," in Challenges and Paradigms in Applied Robust... In our work with students Draguna Vrabie and K. Vamvoudakis, cited below, we have developed new algorithms and theory for solving
NASA Technical Reports Server (NTRS)
Butler, Madeline J.; Sonneborn, George; Perkins, Dorothy C.
1994-01-01
The Mission Operations and Data Systems Directorate (MO&DSD, Code 500), the Space Sciences Directorate (Code 600), and the Flight Projects Directorate (Code 400) have developed a new approach to combine the science and mission operations for the FUSE mission. FUSE, the last of the Delta-class Explorer missions, will obtain high-resolution far-ultraviolet spectra (910 - 1220 A) of stellar and extragalactic sources to study the evolution of galaxies and conditions in the early universe. FUSE will be launched in 2000 into a 24-hour, highly eccentric orbit. Science operations will be conducted in real time for 16-18 hours per day, in a manner similar to the operations performed today for the International Ultraviolet Explorer. In a radical departure from previous missions, the operations concept combines spacecraft and science operations and data processing functions in a single facility to be housed in the Laboratory for Astronomy and Solar Physics (Code 680). A small mission operations team will provide the spacecraft control, telescope operations, and data handling functions in a facility designated as the Science and Mission Operations Center (SMOC). This approach will utilize the Transportable Payload Operations Control Center (TPOCC) architecture for both spacecraft and instrument commanding. Other concepts of integrated operations being developed by the Code 500 Renaissance Project will also be employed for the FUSE SMOC. The primary objective of this approach is to reduce development and mission operations costs. The integration of mission and science operations and the extensive use of existing hardware and software tools are expected to decrease both development and operations costs substantially. This paper describes the FUSE operations concept, discusses the systems engineering approach used for its development, and presents the software, hardware and management tools that will make its implementation feasible.
Children's implicit recall of junk food, alcohol and gambling sponsorship in Australian sport.
Bestman, Amy; Thomas, Samantha L; Randle, Melanie; Thomas, Stuart D M
2015-10-05
In Australia, sport is saturated with the promotion of junk food, alcohol and gambling products. This is particularly evident on player jerseys. The effect of this advertising on children, who are exposed to these messages while watching sport, has not been thoroughly investigated. The aim of this research study was to investigate: (1) the extent to which children implicitly recalled shirt sponsors with the correct sporting team; (2) whether children associated some types of sponsors with certain sporting codes more than others; and (3) whether the age of the children influenced correct recall of sponsoring brands and teams. This experimental study, conducted in New South Wales, Australia, used projective techniques to measure the implicit recall of team sponsorship relationships among 85 children aged 5-12 years. Participants were asked to arrange two sets of magnets, one containing sporting teams and one containing brand logos, in the manner they deemed most appropriate. Children were not given any prompts relating to sporting sponsorship relationships. Three-quarters (77%) of the children were able to identify at least one correct shirt sponsor. Children associated alcohol and gambling brands more strongly with the more popular sporting code, the National Rugby League, than with the Australian Football League. Results showed that age had an effect on the number of shirt sponsors correctly recalled, with 9-12 year olds significantly more likely than 5-8 year olds to correctly identify team sponsors. Given children's ability to implicitly recall shirt sponsors in a sporting context, Australian sporting codes should examine their current sponsorship relationships to reduce the number of unhealthy commodity shirt sponsors. While there is some regulation that protects children from the marketing of unhealthy commodity products, these findings suggest that children are still exposed to and recall these sponsorship relationships.
Results suggest that the promotion of unhealthy commodity products during sporting matches is contributing to increased awareness amongst children of unhealthy commodity brands. Further investigation is required to examine the extent and impact of marketing initiatives during televised sporting matches on children.
Why saying what you mean matters: An analysis of trauma team communication.
Jung, Hee Soo; Warner-Hillard, Charles; Thompson, Ryan; Haines, Krista; Moungey, Brooke; LeGare, Anne; Shaffer, David Williamson; Pugh, Carla; Agarwal, Suresh; Sullivan, Sarah
2018-02-01
We hypothesized that team communication with unmatched grammatical form and communicative intent (mixed mode communication) would correlate with worse trauma teamwork. Interdisciplinary trauma simulations were conducted. Team performance was rated using the TEAM tool. Team communication was coded for grammatical form and communicative intent. The rate of mixed mode communication (MMC) was calculated. MMC rates were compared to overall TEAM scores. Statements with advisement intent (attempts to guide behavior) and edification intent (objective information) were specifically examined. The rates of MMC with advisement intent (aMMC) and edification intent (eMMC) were also compared to TEAM scores. TEAM scores did not correlate with MMC or eMMC. However, aMMC rates negatively correlated with total TEAM scores (r = -0.556, p = 0.025) and with the TEAM task management component scores (r = -0.513, p = 0.042). Trauma teams with lower rates of mixed mode communication with advisement intent had better non-technical skills as measured by TEAM. Copyright © 2017 Elsevier Inc. All rights reserved.
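The two quantities the study correlates, a team's mixed mode communication rate and its TEAM score, can be sketched with invented data. The form-intent matching convention, the utterances, and the scores below are all hypothetical; `pearson_r` is the standard Pearson product-moment formula:

```python
# Sketch of the study's core computation: the fraction of utterances whose
# grammatical form does not match the form conventionally paired with their
# communicative intent, correlated against each team's TEAM score.

def mmc_rate(utterances, matched_form):
    """Fraction of (form, intent) pairs where form mismatches the intent."""
    mixed = sum(1 for form, intent in utterances if matched_form[intent] != form)
    return mixed / len(utterances)

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Assumed convention: edification is matched by a statement, advisement by
# an imperative. A question used to advise ("shall we intubate?") is mixed.
matched = {"edification": "statement", "advisement": "imperative"}

teams = [  # invented (form, intent) pairs for four simulated teams
    [("question", "advisement"), ("statement", "edification"), ("imperative", "advisement")],
    [("statement", "edification"), ("imperative", "advisement")],
    [("question", "advisement"), ("question", "advisement"),
     ("statement", "edification"), ("imperative", "advisement")],
    [("imperative", "advisement"), ("statement", "edification"),
     ("statement", "edification"), ("question", "advisement")],
]
rates = [mmc_rate(t, matched) for t in teams]
team_scores = [34.0, 41.0, 28.0, 37.0]  # hypothetical overall TEAM ratings
print(rates)
print(pearson_r(rates, team_scores))    # negative, echoing the study's finding
```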
Nouraei, S A R; Hudovsky, A; Frampton, A E; Mufti, U; White, N B; Wathen, C G; Sandhu, G S; Darzi, A
2015-06-01
Clinical coding is the translation of clinical activity into a coded language. Coded data drive hospital reimbursement and are used for audit and research, and for benchmarking and outcomes management. We undertook a 2-center audit of coding accuracy across surgery. Clinician-auditor multidisciplinary teams reviewed the coding of 30,127 patients and assessed accuracy at the primary and secondary diagnosis and procedure levels, morbidity level, complications assignment, and financial variance. Postaudit data from a randomly selected sample of 400 cases were reaudited by an independent team. At least 1 coding change occurred in 15,402 patients (51%). There were 3911 (13%) and 3620 (12%) changes to primary diagnoses and procedures, respectively. In 5183 (17%) patients, the Health Resource Grouping changed, resulting in an income variance of £3,974,544 (+6.2%). The morbidity level changed in 2116 (7%) patients (P < 0.001). The number of assigned complications rose from 2597 (8.6%) to 2979 (9.9%) (P < 0.001). Reaudit resulted in further primary diagnosis and procedure changes in 8.7% and 4.8% of patients, respectively. Coded data are a key engine for knowledge-driven health care provision. They are used, increasingly at the individual surgeon level, to benchmark performance. Surgical clinical coding is prone to subjectivity, variability, and error (SVE). A specialty-by-specialty understanding of the nature and clinical significance of informatics variability, and strategies to reduce it, are necessary to allow accurate assumptions and informed decisions to be made concerning the scope and clinical applicability of administrative data in surgical outcomes improvement.
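The headline percentages are simple proportions of the audited cohort, and the reported +6.2% income variance implies a pre-audit baseline. A quick check, using only figures quoted in the abstract:

```python
# Arithmetic behind the headline audit figures: each change rate is a
# proportion of the 30,127 audited patients, and the pre-audit income total
# is back-calculated here from the reported £3,974,544 (+6.2%) variance.
audited = 30_127

def rate(changed, total=audited):
    """Proportion of audited patients affected."""
    return changed / total

print(f"any coding change:  {rate(15_402):.0%}")   # → 51%
print(f"primary diagnoses:  {rate(3_911):.0%}")    # → 13%
print(f"primary procedures: {rate(3_620):.0%}")    # → 12%
print(f"HRG changed:        {rate(5_183):.0%}")    # → 17%

variance_gbp = 3_974_544
variance_pct = 0.062
pre_audit_income = variance_gbp / variance_pct     # implied baseline income
print(f"implied pre-audit income: £{pre_audit_income:,.0f}")
```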
Code Krishna: an innovative practice respecting death, dying and beyond.
Vaishnav, Bhalendu; Nimbalkar, Somashekhar; Desai, Sandeep; Vaishnav, Smruti
2017-01-01
In moments of grief, human beings seek solace and attempt to discover the meaning of life and death by reaching out to wider and deeper dimensions of existence that stem from their religious, cultural and spiritual beliefs. Conventional patient care fails to consider this vital aspect of our lives. Many hold the view that life and its experiences do not end with death; the body is but a sheath which holds the soul that inhabits it. The use of a protocol-based practice to create a solemn atmosphere around the departed individual can bridge the gap between the materialistic and non-materialistic perceptions of the dimensions of care. The innovative practice, "Code Krishna", is aimed at institutionalising a practice which sensitises and empowers the treating team to address the grief of the relatives of deceased patients, and respect the departed in consonance with the family's cultural, religious and spiritual beliefs. The practice entails the creation of a solemn atmosphere amidst the action-packed environment of the critical care unit at the time of the patient's death, offering of collective prayer and floral tributes, and observation of silence both by the healthcare team and family members. Code Krishna attempts to blend current care practices with spirituality, ensuring that the treating team is the first to commiserate with the grieving family, with warmth and openness. In this piece, we briefly report our first-hand experiences of practising Code Krishna in our hospital [Shree Krishna Hospital, Karamsad, Central Gujarat].
American School Counselor Association Ethical Code Changes Relevant to Family Work
ERIC Educational Resources Information Center
Bodenhorn, Nancy
2005-01-01
Professional organizations regularly review and revise their codes of ethics. The American School Counselor Association (ASCA) completed this task in June 2004. At the 2004 national conference, the leadership team, consisting of state presidents, past presidents, and presidents-elect, voted to adopt the changes. These changes, as outlined in Table…
High Speed Research Program Structural Acoustics Multi-Year Summary Report
NASA Technical Reports Server (NTRS)
Beier, Theodor H.; Bhat, Waman V.; Rizzi, Stephen A.; Silcox, Richard J.; Simpson, Myles A.
2005-01-01
This report summarizes the work conducted by the Structural Acoustics Integrated Technology Development (ITD) Team under NASA's High Speed Research (HSR) Phase II program from 1993 to 1999. It is intended to serve as a reference for future researchers by documenting the results of the interior noise and sonic fatigue technology development activities conducted during this period. For interior noise, these activities included excitation modeling, structural acoustic response modeling, development of passive treatments and active controls, and prediction of interior noise. For sonic fatigue, these activities included loads prediction, materials characterization, sonic fatigue code development, development of response reduction techniques, and generation of sonic fatigue design requirements. Also included are lessons learned and recommendations for future work.
Reducing the complexity of NASA's space communications infrastructure
NASA Technical Reports Server (NTRS)
Miller, Raymond E.; Liu, Hong; Song, Junehwa
1995-01-01
This report describes the range of activities performed during the annual reporting period in support of the NASA Code O Success Team - Lifecycle Effectiveness for Strategic Success (COST LESS) team. The overall goal of the COST LESS team is to redefine success in a constrained fiscal environment and to reduce the cost of success for end-to-end mission operations. This goal is broader than the original proposal made to NASA for reducing the complexity of NASA's space communications infrastructure. The COST LESS team's approach to reengineering the space operations infrastructure focuses on reversing the trend of engineering special solutions to similar problems.
Management of a CFD organization in support of space hardware development
NASA Technical Reports Server (NTRS)
Schutzenhofer, L. A.; Mcconnaughey, P. K.; Mcconnaughey, H. V.; Wang, T. S.
1991-01-01
The management strategy of NASA-Marshall's CFD branch in support of space hardware development and code validation implements various elements of total quality management. The strategy encompasses (1) a teaming strategy which focuses on the most pertinent problem, (2) quick-turnaround analysis, (3) the evaluation of retrofittable design options through sensitivity analysis, and (4) coordination between the chief engineer and the hardware contractors. Advanced-technology concepts are being addressed via the definition of technology-development projects whose products are transferable to hardware programs and the integration of research activities with industry, government agencies, and universities, on the basis of the 'consortium' concept.
Who's minding the charge description master?
Schaum, Kathleen D
2011-11-01
Just as it takes a team to manage chronic wounds, it takes a team to maintain the CDM. The technical staff from the wound care department should be represented on this team and should share the appropriate HCPCS codes and CPT codes, product descriptions, and costs for all procedures, services, supplies, drugs, and biologics used in their department. The billing department should ensure that the appropriate revenue codes for each payer are listed for each item on the CDM. Based on costs supplied by the wound care department, the finance department should consistently assign hospital charges to each line item on the CDM. The information technology department is responsible for making the specific changes to the CDM in the computer system. Most hospitals have a CDM coordinator. The technical staff from the wound care department should work closely with the CDM coordinator and should obtain from him/her the policies and procedures for maintaining the wound care department CDM. Most CDM coordinators will also provide a CDM Change Request Form. Use that form each year when the hospital performs its annual CDM maintenance, and throughout the year when adding procedures, services, supplies, drugs, or biologics to your wound care offerings and/or when the costs of these offerings change.
2010-09-01
Operations and Procedures • Logistics and Facilities • Training • Exercises, Evaluation and Corrective Actions • Crisis Communications ... Assessment Team; BCA: benefit-cost analysis; CEO: Chief Executive Officer; CERT: Community Emergency Response Team; CFR: Code of Federal Regulations; CHDS: Center for Homeland Defense and Security; CPG 101: Comprehensive Preparedness Guidelines 101; CPP: Community Preparedness and Participation; CPW
NATO Code of Best Practice for C2 Assessment (revised)
2002-10-01
...Based Research, Inc. for the United States Office of Naval Research. These collaboration metrics focus on individual and team cognitive/awareness, team... Troops, Time, and Civil considerations; OOTW: Operations Other Than War; PESTLE: Political, Economic, Social, Technological, Legal, and Environmental; ... Operational Analysis; OR: Operations Research
Support for Systematic Code Reviews with the SCRUB Tool
NASA Technical Reports Server (NTRS)
Holzmann, Gerald J.
2010-01-01
SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) and individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include CodeSonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface (see figure) that facilitates browsing code and reports. Improvements over existing software include significant simplification and the leveraging of a range of commercial static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access. This means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use.
The recommended code review process that is supported by the SCRUB tool consists of three phases: Code Review, Developer Response, and Closeout Resolution. In the Code Review phase, all tool-based analysis reports are generated, and specific comments from expert code reviewers are entered into the SCRUB tool. In the second phase, Developer Response, the developer is asked to respond to each comment and tool-report that was produced, either agreeing or disagreeing to provide a fix that addresses the issue that was raised. In the third phase, Closeout Resolution, all disagreements are discussed in a meeting of all parties involved, and a resolution is made for all disagreements. The first two phases generally take one week each, and the third phase is concluded in a single closeout meeting.
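The uniform interface described above amounts to normalizing heterogeneous analyzer output into one record shape and merging it per file. A minimal sketch, with invented analyzer names and warning formats (SCRUB's actual report format may differ):

```python
# Sketch of the idea behind a uniform review interface: warnings from
# several analyzers and human reviewers are normalized into one record
# format and grouped per file, so a reviewer browses a single merged list.
from collections import defaultdict

def normalize(tool, raw):
    """Map one tool-specific warning onto the shared record shape."""
    return {"tool": tool, "file": raw["file"], "line": raw["line"],
            "message": raw["msg"], "response": None}  # filled in phase 2

def merge_reports(reports_by_tool):
    """Group normalized records by file, sorted by line number."""
    merged = defaultdict(list)
    for tool, warnings in reports_by_tool.items():
        for raw in warnings:
            rec = normalize(tool, raw)
            merged[rec["file"]].append(rec)
    for recs in merged.values():
        recs.sort(key=lambda r: r["line"])
    return dict(merged)

reports = {  # invented example input
    "analyzer_a": [{"file": "nav.c", "line": 120, "msg": "possible null deref"}],
    "analyzer_b": [{"file": "nav.c", "line": 45, "msg": "unchecked return value"}],
    "reviewer":   [{"file": "nav.c", "line": 45, "msg": "rename this variable"}],
}
merged = merge_reports(reports)
print([(r["line"], r["tool"]) for r in merged["nav.c"]])
# → [(45, 'analyzer_b'), (45, 'reviewer'), (120, 'analyzer_a')]
```

The empty `response` field corresponds to the Developer Response phase: each record later carries the developer's agree/disagree decision into the closeout meeting.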
2010-10-14
non-battle injuries, and illnesses. International Classification of Diseases, Ninth Revision (ICD-9) coded patient conditions, selected by the... for a range of surgical and non-surgical injuries and illnesses, typically seen and treated by an ophthalmologist and one technician working 12-hour... receive them. The "Equipment/supplies" column identifies the items needed to complete the "Insert endo-trach tube" task at that level of capability. Not
2010-11-10
asset, including combat wounds, non-battle injuries, and illnesses. International Classification of Diseases, Ninth Revision (ICD-9) coded patient... patient conditions and the frequency at which they would present. The resulting illness and injury frequencies characterize the expected patient... The scenario is shown in Table 1 (Thoracic/Vascular Scenario), which lists each ICD-9 code, its description, and the number of patients, e.g., 903.9 "Injury arm vessel NOS", 2 patients; 904.8...
Sociotechnical Challenges of Developing an Interoperable Personal Health Record
Gaskin, G.L.; Longhurst, C.A.; Slayton, R.; Das, A.K.
2011-01-01
Objectives To analyze sociotechnical issues involved in the process of developing an interoperable commercial Personal Health Record (PHR) in a hospital setting, and to create guidelines for future PHR implementations. Methods This qualitative study utilized observational research and semi-structured interviews with 8 members of the hospital team, gathered over a 28-week period of developing and adapting a vendor-based PHR at Lucile Packard Children’s Hospital at Stanford University. A grounded theory approach was utilized to code and analyze over 100 pages of typewritten field notes and interview transcripts. This grounded analysis allowed themes to surface during the data collection process, which were subsequently explored in greater detail in the observations and interviews. Results Four major themes emerged: (1) Multidisciplinary teamwork helped team members identify crucial features of the PHR; (2) Divergent goals for the PHR existed even within the hospital team; (3) Differing organizational conceptions of the end-user between the hospital and software company differentially shaped expectations for the final product; (4) Difficulties with coordination and accountability between the hospital and software company caused major delays and expenses and strained the relationship between hospital and software vendor. Conclusions Though commercial interoperable PHRs have great potential to improve healthcare, the process of designing and developing such systems is an inherently sociotechnical process with many complex issues and barriers. This paper offers recommendations based on the lessons learned to guide future development of such PHRs. PMID:22003373
Jamoulle, Marc; Resnick, Melissa; Grosjean, Julien; Ittoo, Ashwin; Cardillo, Elena; Vander Stichele, Robert; Darmoni, Stefan; Vanmeerbeek, Marc
2018-12-01
While documentation of the clinical aspects of General Practice/Family Medicine (GP/FM) is assured by the International Classification of Primary Care (ICPC), there is no taxonomy for the professional aspects (context and management) of GP/FM. Objective: To present the development, dissemination, applications, and resulting face validity of the Q-Codes taxonomy specifically designed to describe contextual features of GP/FM, proposed as an extension to the ICPC. Methods: The Q-Codes taxonomy was developed from Lamberts' seminal idea for indexing contextual content (1987) by a multidisciplinary team of knowledge engineers, linguists, and general practitioners, through a qualitative and iterative analysis of 1702 abstracts from six GP/FM conferences using the Atlas.ti software. Results: A total of 182 concepts, called Q-Codes, representing professional aspects of GP/FM were identified and organized in a taxonomy. Dissemination: The taxonomy is published as an online terminological resource, using semantic web techniques and the Web Ontology Language (OWL) (http://www.hetop.eu/Q). Each Q-Code is identified by a unique resource identifier (URI) and provided with preferred terms and scope notes in ten languages (Portuguese, Spanish, English, French, Dutch, Korean, Vietnamese, Turkish, Georgian, German), together with search filters for MEDLINE and web searches. Applications: The taxonomy has already been used to support queries in bibliographic databases (e.g., MEDLINE), to facilitate indexing of grey literature in GP/FM such as congress abstracts, master's theses, and websites, and as an educational tool in vocational teaching. Conclusions: The rapidly growing list of practical applications provides face validity for the usefulness of this freely available new terminological resource.
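The data shape such a multilingual terminological resource exposes, one URI plus preferred terms and scope notes per code, can be sketched minimally. The code identifier, URI, and labels below are invented placeholders, not actual Q-Codes entries:

```python
# Minimal sketch of a terminological record: a unique resource identifier
# (URI) plus per-language preferred terms, with fallback lookup. All values
# here are hypothetical placeholders, not real Q-Codes data.
Q_CODES = {
    "QX1": {
        "uri": "http://example.org/q/QX1",   # placeholder URI
        "labels": {"en": "doctor-patient communication",
                   "fr": "communication médecin-patient"},
        "scope_note": {"en": "Interpersonal exchange during consultation."},
    },
}

def preferred_term(code, lang, fallback="en"):
    """Return the preferred term in `lang`, falling back to English."""
    labels = Q_CODES[code]["labels"]
    return labels.get(lang, labels[fallback])

print(preferred_term("QX1", "fr"))   # → communication médecin-patient
print(preferred_term("QX1", "de"))  # no German label: falls back to English
```

In the real resource the same structure is expressed in OWL, which additionally lets the taxonomy encode hierarchy and cross-references between codes.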
microRNA in Cerebral Spinal Fluid as Biomarkers of Alzheimer’s Disease Risk After Brain Injury
2016-08-01
protein processing is a key feature of AD. MiRNAs are small non-coding RNAs that regulate mRNA transcription, and may be a significant cause of protein dysregulation. Our investigative team has generated
Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)
NASA Technical Reports Server (NTRS)
Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.
2016-01-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify which datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, and is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge-capture solution for NEX; however, if users want to move the code to another system, whether their home institution's cluster, a laptop or the cloud, they have to find, build and install all the dependencies required to run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and to reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on the Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is the semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system to Docker, an open-source Linux container platform. Docker is available on most computer platforms, is easy to install, and is capable of seamlessly creating and/or executing any application packaged in the appropriate format.
We believe this is an important step towards seamless process deployment in heterogeneous environments that will enhance community access to NASA data and tools in a scalable way, promote software reuse, and improve reproducibility of scientific results.
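The code-to-data migration idea reduces to generating a container recipe from an existing provenance record. A minimal sketch, assuming a hypothetical record schema; the field names and packages below are invented, not NEX's actual provenance format:

```python
# Sketch of the NEX-to-Docker idea: a provenance record that already lists
# a tool's dependencies can be turned mechanically into a Dockerfile, so
# the same code runs unchanged outside the original system.
def dockerfile_from_provenance(record):
    """Emit a Dockerfile string from a dependency-listing provenance record."""
    lines = [f"FROM {record['base_image']}"]
    if record.get("apt_packages"):
        lines.append("RUN apt-get update && apt-get install -y \\")
        lines.append("    " + " ".join(record["apt_packages"]))
    if record.get("pip_packages"):
        lines.append("RUN pip install " + " ".join(record["pip_packages"]))
    lines.append(f"COPY {record['entry_point']} /app/")
    lines.append(f'CMD ["python", "/app/{record["entry_point"]}"]')
    return "\n".join(lines)

record = {   # hypothetical provenance entry for one processing step
    "base_image": "python:3.11-slim",
    "apt_packages": ["gdal-bin"],
    "pip_packages": ["numpy", "netCDF4"],
    "entry_point": "regrid.py",
}
df = dockerfile_from_provenance(record)
print(df)
```

Because the dependency list is already captured by the provenance system, the conversion needs little or no user input, which is what makes it "semi-automatic".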
Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Spaulding, R.; Becker, J. C.
2016-12-01
Exploring the Media Mix during IT-Offshore Project
NASA Astrophysics Data System (ADS)
Wende, Erik; Schwabe, Gerhard; Philip, Tom
Offshore outsourced IT projects continue to gain relevance in a globalized world. The temporal, geographical, and cultural distances between distributed team members during software development result in communication challenges. As software development involves the coding of knowledge, the management of knowledge and its transfer remain critical to the success of a project. For effective knowledge transfer between geographically dispersed teams, the ongoing selection of the communication medium, or media channel mix, becomes highly significant. Although there is an abundance of theory dealing with knowledge transfer and media channel selection during offshore outsourcing projects, the specific role of cultural differences in the media mix is often overlooked. As a first step toward rectifying this, the paper presents an explorative outsourcing case study with emphasis on the chosen media channels and the problems that arose from differences in culture. The case study is analyzed in light of several theoretical models. Finally, the paper presents the idea of extending Media Synchronicity Theory with cultural factors.
Nouraei, S A R; O'Hanlon, S; Butler, C R; Hadovsky, A; Donald, E; Benjamin, E; Sandhu, G S
2009-02-01
To audit the accuracy of otolaryngology clinical coding and identify ways of improving it. Prospective multidisciplinary audit, using the 'national standard clinical coding audit' methodology supplemented by 'double-reading and arbitration'. Teaching-hospital otolaryngology and clinical coding departments. Otolaryngology inpatient and day-surgery cases. Concordance between initial coding performed by a coder (first cycle) and final coding by a clinician-coder multidisciplinary team (MDT; second cycle) for primary and secondary diagnoses and procedures, and Health Resource Groupings (HRG) assignment. A total of 1250 randomly selected cases were studied. Coding errors occurred in 24.1% of cases (301/1250). The clinician-coder MDT reassigned 48 primary diagnoses and 186 primary procedures, and identified a further 209 initially missed secondary diagnoses and procedures. In 203 cases, the patient's initial HRG changed. Incorrect coding caused an average revenue loss of £174.90 per patient (14.7%); 60% of the total income variance was due to miscoding of eight highly complex head and neck cancer cases. This 'HRG drift' created the appearance of disproportionate resource utilisation when treating 'simple' cases. At our institution, the total cost of maintaining a clinician-coder MDT was 4.8 times lower than the income regained through the double-reading process. This large audit of otolaryngology practice identifies a large degree of error in coding on discharge. This leads to significant loss of departmental revenue and, given that the same data are used for benchmarking and for making decisions about resource allocation, it distorts the picture of clinical practice. These errors can be rectified by implementing a cost-effective clinician-coder double-reading multidisciplinary team as part of a data-assurance clinical governance framework, which we recommend should be established in hospitals.
[Global aspects of medical ethics: conditions and possibilities].
Neitzke, G
2001-01-01
A global or universal code of medical ethics seems paradoxical in the era of pluralism and postmodernism. A different conception of globalisation will be developed in terms of a "procedural universality". According to this philosophical concept, a code of medical ethics does not oblige physicians to accept certain specific, preset, universal values and rules. It rather obliges every culture and society to start a culture-sensitive, continuous, and active discourse on specific issues, mentioned in the codex. This procedure might result in regional, intra-cultural consensus, which should be presented to an inter-cultural dialogue. To exemplify this procedure, current topics of medical ethics (spiritual foundations of medicine, autonomy, definitions concerning life and death, physicians' duties, conduct within therapeutic teams) will be discussed from the point of view of western medicine.
Baldassano, Steven N; Brinkmann, Benjamin H; Ung, Hoameng; Blevins, Tyler; Conrad, Erin C; Leyde, Kent; Cook, Mark J; Khambhati, Ankit N; Wagenaar, Joost B; Worrell, Gregory A; Litt, Brian
2017-06-01
There exist significant clinical and basic research needs for accurate, automated seizure detection algorithms. These algorithms have translational potential in responsive neurostimulation devices and in automatic parsing of continuous intracranial electroencephalography data. An important barrier to developing accurate, validated algorithms for seizure detection is limited access to high-quality, expertly annotated seizure data from prolonged recordings. To overcome this, we hosted a kaggle.com competition to crowdsource the development of seizure detection algorithms using intracranial electroencephalography from canines and humans with epilepsy. The top three performing algorithms from the contest were then validated on out-of-sample patient data including standard clinical data and continuous ambulatory human data obtained over several years using the implantable NeuroVista seizure advisory system. Two hundred teams of data scientists from all over the world participated in the kaggle.com competition. The top performing teams submitted highly accurate algorithms with consistent performance in the out-of-sample validation study. The performance of these seizure detection algorithms, achieved using freely available code and data, sets a new reproducible benchmark for personalized seizure detection. We have also shared a 'plug and play' pipeline to allow other researchers to easily use these algorithms on their own datasets. The success of this competition demonstrates how sharing code and high quality data results in the creation of powerful translational tools with significant potential to impact patient care. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Space Communications Emulation Facility
NASA Technical Reports Server (NTRS)
Hill, Chante A.
2004-01-01
Establishing space communication between ground facilities and other satellites is a painstaking task that requires many precise calculations dealing with relay time, atmospheric conditions, and satellite positions, to name a few. The Space Communications Emulation Facility (SCEF) team here at NASA is developing a facility that will approximately emulate the conditions in space that impact space communication. The emulation facility comprises a 32-node distributed cluster of computers, each node representing a satellite or ground station. The objective of the satellites is to observe the topography of the Earth (water, vegetation, land, and ice) and relay this information back to the ground stations. Software originally designed by the University of Kansas, called the Emulation Manager, controls the interaction of the satellites and ground stations, as well as the recording of data. The Emulation Manager runs on a Linux operating system and uses both Java and C++ code. The emulation scenarios are written in eXtensible Markup Language (XML). XML documents are designed to store, carry, and exchange data. With XML documents, data can be exchanged between incompatible systems, which makes XML ideal for this project because Linux, Mac and Windows operating systems are all used. Unfortunately, XML documents cannot display data like HTML documents. Therefore, the SCEF team uses an XML Schema Definition (XSD), or simply a schema, to describe the structure of an XML document. Schemas are very important because they can validate the correctness of data, define restrictions on data, define data formats, and convert data between different data types, among other things. At this time, in order for the Emulation Manager to open and run an XML emulation scenario file, the user must first establish a link between the schema file and the directory under which the XML scenario files are saved.
This procedure takes place on the command line of the Linux operating system. Once this link has been established, the Emulation Manager validates all the XML files in that directory against the schema file before the actual scenario is run. Using commercial software called the Satellite Tool Kit (STK) installed on the Linux box, the Emulation Manager is able to display the data and graphics generated by the execution of an XML emulation scenario file. The Emulation Manager software is written in Java. Since the SCEF project is in the development stage, the source code is being modified to better fit the requirements of the project. Some parameters for the emulation are hard-coded, set at fixed values. Members of the SCEF team are altering the code to allow the user to choose the values of these hard-coded parameters by adding a toolbar to the preexisting GUI.
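Schema validation aside, scenario files like those described above are ordinary XML and can be checked programmatically. The sketch below, using Python's standard-library xml.etree, parses a made-up scenario fragment and flags a missing required attribute; the element and attribute names are invented, since the actual SCEF schema is not described here.

```python
# Illustrative sketch: parse a hypothetical emulation scenario and check a
# required field, in the spirit of schema validation. Element and attribute
# names are invented, not the real SCEF scenario format.
import xml.etree.ElementTree as ET

SCENARIO = """
<scenario name="earth-topography">
  <node id="sat-01" type="satellite" altitude_km="705"/>
  <node id="gs-01" type="ground-station"/>
</scenario>
"""

def check_scenario(xml_text):
    """Return a list of validation errors (empty if the scenario is valid)."""
    root = ET.fromstring(xml_text)
    errors = []
    for node in root.findall("node"):
        # A satellite node must declare its orbital altitude.
        if node.get("type") == "satellite" and node.get("altitude_km") is None:
            errors.append(f"{node.get('id')}: satellite missing altitude_km")
    return errors

print(check_scenario(SCENARIO))  # prints []
```

A real XSD expresses such constraints declaratively, but hand-written checks like this are a useful fallback on platforms where a validating parser is not available.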
The Future of ECHO: Evaluating Open Source Possibilities
NASA Astrophysics Data System (ADS)
Pilone, D.; Gilman, J.; Baynes, K.; Mitchell, A. E.
2012-12-01
NASA's Earth Observing System ClearingHOuse (ECHO) is a format agnostic metadata repository supporting over 3000 collections and 100M science granules. ECHO exposes FTP and RESTful Data Ingest APIs in addition to both SOAP and RESTful search and order capabilities. Built on top of ECHO is a human facing search and order web application named Reverb. ECHO processes hundreds of orders, tens of thousands of searches, and 1-2M ingest actions each week. As ECHO's holdings, metadata format support, and visibility have increased, the ECHO team has received requests by non-NASA entities for copies of ECHO that can be run locally against their data holdings. ESDIS and the ECHO Team have begun investigations into various deployment and Open Sourcing models that can balance the real constraints faced by the ECHO project with the benefits of providing ECHO capabilities to a broader set of users and providers. This talk will discuss several release and Open Source models being investigated by the ECHO team along with the impacts those models are expected to have on the project. We discuss: - Addressing complex deployment or setup issues for potential users - Models of vetting code contributions - Balancing external (public) user requests versus our primary partners - Preparing project code for public release, including navigating licensing issues related to leveraged libraries - Dealing with non-free project dependencies such as commercial databases - Dealing with sensitive aspects of project code such as database passwords, authentication approaches, security through obscurity, etc. - Ongoing support for the released code including increased testing demands, bug fixes, security fixes, and new features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rey, D.; Ryan, W.; Ross, M.
A method for more efficiently utilizing the frequency bandwidth allocated for data transmission is presented. Current space and range communication systems use modulation and coding schemes that transmit 0.5 to 1.0 bits per second per Hertz of radio frequency bandwidth. The goal in this LDRD project is to increase the bandwidth utilization by employing advanced digital communications techniques, with little or no increase in transmit power, which is usually very limited on airborne systems. Teaming with New Mexico State University, an implementation of trellis-coded modulation (TCM), a coding and modulation scheme pioneered by Ungerboeck, was developed for this application and simulated on a computer. TCM provides a means for reliably transmitting data while simultaneously increasing bandwidth efficiency; the penalty is increased receiver complexity. In particular, the trellis decoder requires high-speed, application-specific digital signal processing (DSP) chips. A system solution based on the QualComm Viterbi decoder and the Graychip DSP receiver chips is presented.
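As a rough textbook illustration of the idea (not the project's actual hardware design), a minimal trellis-coded 8-PSK encoder can be sketched in a few lines: one input bit passes through a rate-1/2 convolutional encoder (generators 7 and 5, octal), a second bit is sent uncoded, and the three resulting bits select an 8-PSK constellation point by set partitioning. This sends 2 information bits per symbol, the same spectral occupancy as uncoded QPSK, while buying coding gain at the cost of decoder complexity.

```python
# Toy sketch of trellis-coded 8-PSK in the spirit of Ungerboeck: one bit is
# convolutionally encoded (rate 1/2, 4-state), one bit is uncoded, and the
# three output bits label an 8-PSK point. Illustrative only.
import cmath, math

def tcm_encode(coded_bits, uncoded_bits):
    """Return the sequence of 8-PSK symbols (complex, on the unit circle)."""
    s0 = s1 = 0                          # two memory bits -> 4-state trellis
    symbols = []
    for b, u in zip(coded_bits, uncoded_bits):
        c1 = b ^ s0 ^ s1                 # generator 7 (binary 111)
        c0 = b ^ s1                      # generator 5 (binary 101)
        idx = (u << 2) | (c1 << 1) | c0  # set-partition label, 0..7
        symbols.append(cmath.exp(1j * 2 * math.pi * idx / 8))
        s1, s0 = s0, b                   # shift-register update
    return symbols

syms = tcm_encode([1, 0, 1], [0, 1, 0])
print([round(abs(s), 3) for s in syms])  # all on the unit circle: [1.0, 1.0, 1.0]
```

The decoder side is where the cost appears: a Viterbi search over the 4-state trellis per received symbol, which is why the abstract points to dedicated DSP hardware.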
The ADVANCE Code of Conduct for collaborative vaccine studies.
Kurz, Xavier; Bauchau, Vincent; Mahy, Patrick; Glismann, Steffen; van der Aa, Lieke Maria; Simondon, François
2017-04-04
Lessons learnt from the 2009 (H1N1) flu pandemic highlighted factors limiting the capacity to collect European data on vaccine exposure, safety and effectiveness, including lack of rapid access to available data sources or expertise, difficulties in establishing efficient interactions between multiple parties, lack of confidence between private and public sectors, concerns about possible or actual conflicts of interest (or perceptions thereof), and inadequate funding mechanisms. The Innovative Medicines Initiative's Accelerated Development of VAccine benefit-risk Collaboration in Europe (ADVANCE) consortium was established to create an efficient and sustainable infrastructure for rapid and integrated monitoring of the post-approval benefit-risk of vaccines, including a code of conduct and governance principles for collaborative studies. The development of the code of conduct was guided by three core and common values (best science, strengthening public health, transparency) and a review of existing guidance and relevant published articles. The ADVANCE Code of Conduct includes 45 recommendations in 10 topics (Scientific integrity, Scientific independence, Transparency, Conflicts of interest, Study protocol, Study report, Publication, Subject privacy, Sharing of study data, Research contract). Each topic includes a definition, a set of recommendations and a list of additional reading. The concept of the study team is introduced as a key component of the ADVANCE Code of Conduct, with a core set of roles and responsibilities. It is hoped that adoption of the ADVANCE Code of Conduct by all partners involved in a study will facilitate and speed up its initiation, design, conduct and reporting. Adoption of the ADVANCE Code of Conduct should be stated in the study protocol, study report and publications, and journal editors are encouraged to use it as an indication that good principles of public health, science and transparency were followed throughout the study. Copyright © 2017.
Published by Elsevier Ltd.
Exploring the Content of Shared Mental Models in Project Teams
2005-09-30
FINAL REPORT. Grant Title: Exploring the Content of Shared Mental Models in Project Teams. Office of Naval Research Award Number: N000140210535... Naval Research Laboratory, Attn: code 5227, 4555 Overlook Ave., SW, Washington, DC. 2.0 PROJECT SUMMARY: No consensus among researchers studying shared cognition exists regarding the identification of what should be
Connecting to Get Things Done: A Conceptual Model of the Process Used to Respond to Bias Incidents
ERIC Educational Resources Information Center
LePeau, Lucy A.; Morgan, Demetri L.; Zimmerman, Hilary B.; Snipes, Jeremy T.; Marcotte, Beth A.
2016-01-01
In this study, we interviewed victims of bias incidents and members of a bias response team to investigate the process the team used to respond to incidents. Incidents included acts of sexism, homophobia, and racism at a large, predominantly White research university in the Midwest. Data were analyzed using a 4-stage coding process. The emergent…
Optical systems integrated modeling
NASA Technical Reports Server (NTRS)
Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck
1992-01-01
An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.
What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components
NASA Astrophysics Data System (ADS)
Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.
2013-12-01
Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns'--problematic design characteristics of software--that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as guiding the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis' in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified--removing unnecessary looping over independent units of work--is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to identify and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. 
By developing use cases and prototypes within these teams, we have undertaken formal evaluations of software components developed by programmers with widely varying levels of expertise, and have been able to discover and characterize a number of anti-patterns. Our evaluation methodology and testbed have also enabled us to assess the efficacy of strategies to address these anti-patterns according to scientifically relevant metrics, such as the ability of algorithms to perform faster than the rate of data acquisition and the accuracy of workflow component output relative to ground truth. The set of anti-patterns and solutions we have identified augments the body of more well-known software engineering anti-patterns by addressing additional concerns that arise when a software component has to function as part of a workflow assembled out of independently developed codebases. Our experience shows that identifying and resolving these anti-patterns reduces development time and improves performance without reducing component reusability.
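The 'batch-itis' fix described above, removing unnecessary looping over independent units of work, or at minimum isolating failures per unit so one bad input does not abort the rest, can be sketched as follows; the names are illustrative, not taken from the WHOI codebases.

```python
# Sketch of failure isolation for independent units of work: a failing item
# is recorded and skipped rather than aborting the whole batch. Illustrative
# names only.

def process_all(items, process_one):
    """Apply process_one to each item; collect results and failures separately."""
    results, failures = {}, {}
    for item in items:
        try:
            results[item] = process_one(item)
        except Exception as exc:          # isolate the failure, keep going
            failures[item] = str(exc)
    return results, failures

def analyze(image):
    """Stand-in for a real image-analysis step."""
    if image == "img_002.png":
        raise ValueError("corrupt file")
    return len(image)

ok, bad = process_all(["img_001.png", "img_002.png", "img_003.png"], analyze)
print(sorted(ok))   # ['img_001.png', 'img_003.png']
print(list(bad))    # ['img_002.png']
```

Better still, as the abstract notes, is removing the loop from the component entirely and letting the workflow engine schedule one invocation per unit, which also eliminates the control-flow logic above.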
Weller, Jennifer M; Janssen, Anna L; Merry, Alan F; Robinson, Brian
2008-04-01
We placed anaesthesia teams into a stressful environment in order to explore interactions between members of different professional groups and to investigate their perspectives on the impact of these interactions on team performance. Ten anaesthetists, 5 nurses and 5 trained anaesthetic assistants each participated in 2 full-immersion simulations of critical events using a high-fidelity computerised patient simulator. Their perceptions of team interactions were explored through questionnaires and semi-structured interviews. Written questionnaire data and interview transcriptions were entered into N6 qualitative software. Data were analysed by 2 investigators for emerging themes and coded to produce reports on each theme. We found evidence of limited understanding of the roles and capabilities of team members across professional boundaries, different perceptions of appropriate roles and responsibilities for different members of the team, limited sharing of information between team members and limited team input into decision making. There was a perceived impact on task distribution and the optimal utilisation of resources within the team. Effective management of medical emergencies depends on optimal team function. We have identified important factors affecting interactions between different health professionals in the anaesthesia team, and their perceived influences on team function. This provides evidence on which to build appropriate and specific strategies for interdisciplinary team training in operating theatre staff.
Fully Employing Software Inspections Data
NASA Technical Reports Server (NTRS)
Shull, Forrest; Feldmann, Raimund L.; Seaman, Carolyn; Regardie, Myrna; Godfrey, Sally
2009-01-01
Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, and test plans. Common to all inspections is the aim of finding and fixing defects as early as possible, thereby providing cost savings by minimizing the amount of rework necessary later in the lifecycle. Measurement data, such as the number and type of defects found and the effort spent by the inspection team, not only provide direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA's use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross-project improvements. Such improvements may focus on the software development processes of the whole organization as well as improvements to the applied inspection process itself.
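As a hypothetical illustration of the kind of cross-inspection analysis described, one might aggregate per-inspection defect counts and inspected volume into defect densities and compare each inspection against an organizational baseline. The record fields below are invented, not NASA's actual inspection data schema.

```python
# Hedged sketch: compute defect density (defects per page inspected) per
# inspection and overall, as a basis for cross-project comparison. The
# record fields are hypothetical.

def defect_density(inspections):
    """Return (per-inspection densities, overall baseline density)."""
    per = {i["id"]: i["defects"] / i["pages"] for i in inspections}
    total = sum(i["defects"] for i in inspections) / sum(i["pages"] for i in inspections)
    return per, total

data = [
    {"id": "req-review-1", "defects": 12, "pages": 40},
    {"id": "design-review-1", "defects": 3, "pages": 30},
]
per, baseline = defect_density(data)
print(per["req-review-1"], round(baseline, 2))  # 0.3 0.21
```

An inspection whose density sits far above or below the baseline is a trigger for investigation: either the artifact is unusually defect-prone or the inspection process itself (team size, preparation effort, checklist) needs adjusting.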
Garitte, B.; Nguyen, T. S.; Barnichon, J. D.; ...
2017-05-09
Coupled thermal–hydrological–mechanical (THM) processes in the near field of deep geological repositories can influence several safety features of the engineered and geological barriers. Among those features are: the possibility of damage in the host rock, the time for re-saturation of the bentonite, and the perturbations in the hydraulic regime in both the rock and engineered seals. Within the international cooperative code-validation project DECOVALEX-2015, eight research teams developed models to simulate an in situ heater experiment, called HE-D, in Opalinus Clay at the Mont Terri Underground Research Laboratory in Switzerland. The models were developed from the theory of poroelasticity in order to simulate the coupled THM processes that prevailed during the experiment and thereby to characterize the in situ THM properties of Opalinus Clay. The modelling results for the evolution of temperature, pore water pressure, and deformation at different points are consistent among the research teams and compare favourably with the experimental data in terms of trends and absolute values. The models were able to reproduce the main physical processes of the experiment. In particular, most teams simulated temperature and thermally induced pore water pressure well, including spatial variations caused by inherent anisotropy due to bedding.
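The thermally induced pore water pressure that the teams modelled has a simple undrained back-of-envelope form: in a low-permeability clay, a temperature rise ΔT produces a pore pressure rise of roughly ΔP = Λ·ΔT, where Λ is the thermal pressurization coefficient. The sketch below uses an illustrative value of Λ, not a calibrated Opalinus Clay property.

```python
# Back-of-envelope estimate of undrained thermal pressurization: dP = Lambda * dT.
# The coefficient value is illustrative only, not a measured clay property.

def pore_pressure_rise(delta_T_celsius, lam_mpa_per_degC=0.1):
    """Undrained thermal pore-pressure rise estimate, in MPa."""
    return lam_mpa_per_degC * delta_T_celsius

print(pore_pressure_rise(25.0))  # prints 2.5 (MPa, for a 25 degC rise)
```

The full poroelastic models go far beyond this, coupling heat conduction, fluid flow and deformation with bedding anisotropy, but the estimate shows why even modest heating generates the pressure perturbations observed in the experiment.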
Shepherd, Annabel; Lough, Murray
2010-05-01
Although multi-source feedback (MSF) has been used in primary healthcare, the development of an MSF instrument specific to this setting in the UK has not been previously described. The aims of this study were to develop and evaluate an MSF instrument for GPs in Scotland taking part in appraisal. The members of ten primary healthcare teams in the west of Scotland were asked to provide comments in answer to the question, 'What is a good GP?'. The data were reduced and coded by two researchers and questions were devised. Following content validity testing the MSF process was evaluated with volunteers using face-to-face interviews and a postal survey. Thirty-seven statements covering the six domains of communication skills, professional values, clinical care, working with colleagues, personality issues and duties and responsibilities were accepted as relevant by ten primary healthcare teams using a standard of 80 percent agreement. The evaluation found the MSF process to be feasible and acceptable and participants provided some evidence of educational impact. An MSF instrument for GPs has been developed based on the concept of 'the good GP' as described by the primary healthcare team. The evaluation of the resultant MSF process illustrates the potential of MSF, when delivered in the supportive environment of GP appraisal, to provide feedback which has the possibility of improving working relationships between GPs and their colleagues.
Coordination of cancer care between family physicians and cancer specialists
Easley, Julie; Miedema, Baukje; Carroll, June C.; Manca, Donna P.; O’Brien, Mary Ann; Webster, Fiona; Grunfeld, Eva
2016-01-01
Objective: To explore health care provider (HCP) perspectives on the coordination of cancer care between FPs and cancer specialists. Design: Qualitative study using semistructured telephone interviews. Setting: Canada. Participants: A total of 58 HCPs, comprising 21 FPs, 15 surgeons, 12 medical oncologists, 6 radiation oncologists, and 4 GPs in oncology. Methods: This qualitative study is nested within a larger mixed-methods program of research, CanIMPACT (Canadian Team to Improve Community-Based Cancer Care along the Continuum), focused on improving the coordination of cancer care between FPs and cancer specialists. Using a constructivist grounded theory approach, telephone interviews were conducted with HCPs involved in cancer care. Invitations to participate were sent to a purposive sample of HCPs based on medical specialty, sex, province or territory, and geographic location (urban or rural). A coding schema was developed by 4 team members; subsequently, 1 team member coded the remaining transcripts. The resulting themes were reviewed by the entire team and a summary of results was mailed to participants for review. Main findings: Communication challenges emerged as the most prominent theme. Five key related subthemes were identified around this core concept, occurring at both the system and individual levels. System-level issues included delays in medical transcription, difficulties accessing patient information, and physicians not being copied on all reports. Individual-level issues included the lack of rapport between FPs and cancer specialists, and the lack of clearly defined and broadly communicated roles. Conclusion: Effective and timely communication of medical information, as well as clearly defined roles for each provider, are essential to good coordination of care along the cancer care trajectory, particularly during transitions of care between cancer specialist and FP care.
Despite advances in technology, substantial communication challenges still exist. This can lead to serious consequences that affect clinical decision making. PMID:27737996
Fernandez, Jordi; Camerino, Oleguer; Anguera, M Teresa; Jonsson, Gudberg K
2009-08-01
In the field of sports research, there is a growing need for the rigorous collection of data that provide empirical evidence about the complex reality they refer to. Although sports psychology research has advanced considerably in recent years, in both extent and quality, one area that remains relatively unexplored is the dynamics of the sports group and the influence of the group on its members (George & Feltz, 1995; Widmeyer, Brawley, & Carron, 1992). Key issues in this regard include the presence of regularities that are not detectable through visual inference or traditional methods of data analysis, the lack of standard observation instruments, and, as a priority, the need to develop powerful, computerized coding systems, all of which must form part of an approach that is suitable for natural and habitual contexts. The present study is part of a broader research project concerning ACB teams (first Spanish basketball division) and considers the interaction context before teams try to score (understood as how teams create scoring opportunities) as the core aspect that links team play. This investigation proposes a new model of analysis for studying the effectiveness and construction of offensive basketball plays in order to identify their outcomes, thus providing coaches with an important device for improving or consolidating them.
Nuclear Engine System Simulation (NESS) version 2.0
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.
1993-01-01
The topics are presented in viewgraph form and include the following; nuclear thermal propulsion (NTP) engine system analysis program development; nuclear thermal propulsion engine analysis capability requirements; team resources used to support NESS development; expanded liquid engine simulations (ELES) computer model; ELES verification examples; NESS program development evolution; past NTP ELES analysis code modifications and verifications; general NTP engine system features modeled by NESS; representative NTP expander, gas generator, and bleed engine system cycles modeled by NESS; NESS program overview; NESS program flow logic; enabler (NERVA type) nuclear thermal rocket engine; prismatic fuel elements and supports; reactor fuel and support element parameters; reactor parameters as a function of thrust level; internal shield sizing; and reactor thermal model.
Merino, Aimee M; Greiner, Ryan; Hartwig, Kristopher
2017-09-01
Patient preferences regarding cardiopulmonary resuscitation (CPR) are important, especially during hospitalization when a patient's health is changing. Yet many patients are not adequately informed or involved in the decision-making process. We examined the effect of an informational video about CPR on hospitalized patients' code status choices. This was a prospective, randomized trial conducted at the Minneapolis Veterans Affairs Health Care System in Minnesota. We enrolled 119 patients who were hospitalized on the general medicine service and at least 65 years old. The majority were men (97%) with a mean age of 75. A video described code status choices: full code (CPR and intubation if required), do not resuscitate (DNR), and do not resuscitate/do not intubate (DNR/DNI). Participants were randomized to watch the video (n = 59) or usual care (n = 60). The primary outcome was participants' code status preferences. Secondary outcomes included a questionnaire designed to evaluate participants' trust in their healthcare team and knowledge and perceptions about CPR. Participants who viewed the video were less likely to choose full code (37%) compared to participants in the usual care group (71%) and more likely to choose DNR/DNI (56% in the video group vs. 17% in the control group) (P < 0.00001). We did not see a difference in trust in their healthcare team or knowledge and perceptions about CPR as assessed by our questionnaire. Hospitalized patients who watched a video about CPR and code status choices were less likely to choose full code and more likely to choose DNR/DNI. © 2017 Society of Hospital Medicine
Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery
NASA Astrophysics Data System (ADS)
Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.
2017-05-01
In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between “sources” and “noise?” What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced H I astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy-to-use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline-specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording the insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.
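The channel-to-frame mapping described above can be sketched in a few lines. This is a hypothetical illustration, not code from the study: it simply treats each spectral channel as one movie frame at an assumed frame rate and converts a channel index into a timeline position that a timeline-coding tool could annotate.

```python
# Hypothetical sketch: map 0-indexed spectral-cube channels to movie
# timecodes so timeline-based coding tools can annotate single channels.
def channel_to_timecode(channel, fps=25):
    """Return (hours, minutes, seconds, frame) for a spectral channel,
    treating each channel as one movie frame at the given frame rate."""
    total_seconds, frame = divmod(channel, fps)
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds, frame

# Channel 1337 at 25 fps lands 53 s and 12 frames into the "movie".
print(channel_to_timecode(1337))  # (0, 0, 53, 12)
```

The frame rate is an assumption; any fixed rate works, as long as the same mapping is used when translating coded timeline events back to channel numbers.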
The Transition from VMS to Unix Operations for STScI's Science Planning and Scheduling Team
NASA Astrophysics Data System (ADS)
Adler, D. S.; Taylor, D. K.
The Science Planning and Scheduling Team of the Space Telescope Science Institute currently uses the VMS operating system. SPST began a transition to Unix-based operations in the summer of 1999. The main tasks for SPST to address in the Unix transition are: (1) converting the current SPST operational tools from DCL to Python; (2) converting our database report scripts from SQL; (3) adopting a Unix-based code management system; and (4) training the SPST staff. The goal is to fully transition the team to Unix operations by the end of 2001.
Nolan, Heather R; Fitzgerald, Michael; Howard, Brett; Jarrard, Joey; Vaughn, Danny
Procedural time-outs are widely accepted safety standards that are protocolized in nearly all hospital systems. The trauma time-out, however, has been largely unstudied in the existing literature and does not have a standard protocol outlined by any of the major trauma surgery organizations. The goal of this study was to evaluate our institution's use of the trauma time-out and assess how trauma team members viewed its effectiveness. A multiple-answer survey was sent to trauma team members at a Level I trauma center. Questions included items directed at background, experience, opinions, and write-in responses. Most responders were experienced trauma team members who regularly participated in trauma codes. All respondents noted the total time required to complete the time-out was less than 5 min, with the majority saying it took less than 1 min. Seventy-five percent agreed that trauma time-outs should continue, with 92% noting that it improved understanding of patient presentation and prehospital evaluation. Seventy-seven percent said it improved understanding of other team members' roles, and 75% stated it improved patient care. Subgroups of physicians and nurses were statistically similar; yet, physicians did note that it improved their understanding of team members' functions more frequently than nurses did. The trauma time-out can be an excellent tool to improve patient care and team understanding of the incoming trauma patient. Although used widely at multiple levels of trauma institutions, development of a documented protocol can be the next step in creating a unified safety standard.
NASA Astrophysics Data System (ADS)
Mena, Irene B.; Diefes-Dux, Heidi A.
2012-04-01
Students' perceptions of engineering have been documented through studies involving interviews, surveys, and word associations that take a direct approach to asking students about various aspects of their understanding of engineering. Research on perceptions of engineering rarely focuses on how students would portray engineering to others. First-year engineering student teams proposed a museum exhibit, targeted to middle school students, to explore the question "What is engineering?" The proposals took the form of a poster. The overarching research question focuses on how these students would portray engineering to middle school students as seen through their museum exhibit proposals. A preliminary analysis was done on 357 posters to determine the overall engineering themes for the proposed museum exhibits. Forty of these posters were selected and, using open coding, more thoroughly analyzed to learn what artifacts/objects, concepts, and skills student teams associate with engineering. These posters were also analyzed to determine if there were any differences by gender composition of the student teams. Building, designing, and teamwork are skills the first-year engineering students link to engineering. Regarding artifacts, students mentioned those related to transportation and structures most often. All-male teams were more likely to focus on the idea of space and to mention teamwork and designing as engineering skills; equal-gender teams were more likely to focus on the multidisciplinary aspect of engineering. This analysis of student teams' proposals provides baseline data, positioning instructors to develop and assess instructional interventions that stretch students' self-exploration of engineering.
Cabral, Linda; Strother, Heather; Muhr, Kathy; Sefton, Laura; Savageau, Judith
2014-01-01
Mental health peer specialists develop peer-to-peer relationships of trust with clients to improve their health and well-being, functioning in ways similar to community health workers. Although the number of peer specialists in use has been increasing, their role in care teams is less defined than that of the community health worker. This qualitative study explored how the peer specialist role is defined across different stakeholder groups, the expectations for this role and how the peer specialist is utilised and integrated across different types of mental health services. Data were collected through interviews and focus groups conducted in Massachusetts with peer specialists (N = 44), their supervisors (N = 14) and clients (N = 10) between September 2009 and January 2011. A consensus coding approach was used and all data outputs were reviewed by the entire team to identify themes. Peer specialists reported that their most important role is to develop relationships with clients and that having lived mental health experience is a key element in creating that bond. They also indicated that educating staff about the recovery model and peer role is another important function. However, they often felt a lack of clarity about their role within their organisation and care team. Supervisors valued the unique experience that peer specialists bring to an organisation. However, without a defined set of expectations for this role, they struggled with training, guiding and evaluating their peer specialist staff. Clients reported that the shared lived experience is important for the relationship and that working with a peer specialist has improved their mental health. With increasing support for person-centred integrated healthcare delivery models, the demand for mental health peer specialist services will probably increase. 
Therefore, clearer role definition, as well as workforce development focused on team orientation, is necessary for peer specialists to be fully integrated and supported in care teams. © 2013 John Wiley & Sons Ltd.
van Tongeren, Martie
2013-01-01
The INTEROCC project is a multi-centre case–control study investigating the risk of developing brain cancer due to occupational chemical and electromagnetic field exposures. To estimate chemical exposures, the Finnish Job Exposure Matrix (FINJEM) was modified to improve its performance in the INTEROCC study and to address some of its limitations, resulting in the development of the INTEROCC JEM. An international team of occupational hygienists developed a crosswalk between the Finnish occupational codes used in FINJEM and the International Standard Classification of Occupations 1968 (ISCO68). For ISCO68 codes linked to multiple Finnish codes, weighted means of the exposure estimates were calculated. Similarly, multiple ISCO68 codes linked to a single Finnish code with evidence of heterogeneous exposure were refined. One of the key time periods in FINJEM (1960–1984) was split into two periods (1960–1974 and 1975–1984). Benzene exposure estimates in early periods were modified upwards. The internal consistency of hydrocarbon exposures and exposures to engine exhaust fumes was improved. Finally, exposure to polycyclic aromatic hydrocarbon and benzo(a)pyrene was modified to include the contribution from second-hand smoke. The crosswalk ensured that the FINJEM exposure estimates could be applied to the INTEROCC study subjects. The modifications generally resulted in an increased prevalence of exposure to chemical agents. This increased prevalence of exposure was not restricted to the lowest categories of cumulative exposure, but was seen across all levels for some agents. Although this work has produced a JEM with important improvements compared to FINJEM, further improvements are possible with the expansion of agents and additional external data. PMID:23467593
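The crosswalk step described above — combining the exposure estimates of several Finnish codes linked to a single ISCO68 code as a weighted mean — can be sketched briefly. This is an illustrative reconstruction; the function name, weights, and values below are invented and do not come from the INTEROCC JEM itself.

```python
# Illustrative sketch of the JEM crosswalk step: when one ISCO68 code maps
# to several Finnish codes, combine their FINJEM exposure estimates as a
# weighted mean (all numbers below are made up for illustration).
def weighted_mean_exposure(links):
    """links: list of (exposure_estimate, weight) pairs for the Finnish
    codes mapped to a single ISCO68 code."""
    total_weight = sum(w for _, w in links)
    return sum(e * w for e, w in links) / total_weight

# Hypothetical ISCO68 code linked to three Finnish codes, weighted by
# the size of the workforce carrying each code:
print(weighted_mean_exposure([(0.2, 100), (0.5, 50), (0.0, 50)]))  # 0.225
```

In practice the weights would reflect how many study subjects or workers fall under each Finnish code, so the combined estimate is dominated by the larger source occupations.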
Rosenthal, Jennifer L; Okumura, Megumi J; Hernandez, Lenore; Li, Su-Ting T; Rehm, Roberta S
2016-01-01
Children with special health care needs often require health services that are only provided at subspecialty centers. Such children who present to nonspecialty hospitals might require a hospital-to-hospital transfer. When transitioning between medical settings, communication is an integral aspect that can affect the quality of patient care. The objectives of the study were to identify barriers and facilitators to effective interfacility pediatric transfer communication to general pediatric floors from the perspectives of referring and accepting physicians, and then develop a conceptual model for effective interfacility transfer communication. This was a single-center qualitative study using grounded theory methodology. Referring and accepting physicians of children with special health care needs were interviewed. Four researchers coded the data using ATLAS.ti (version 7, Scientific Software Development GmbH, Berlin, Germany), using a 2-step process of open coding, followed by focused coding until no new codes emerged. The research team reached consensus on the final major categories and subsequently developed a conceptual model. Eight referring and 9 accepting physicians were interviewed. Theoretical coding resulted in 3 major categories: streamlined transfer process, quality handoff and 2-way communication, and positive relationships between physicians across facilities. The conceptual model unites these categories and shows how these categories contribute to effective interfacility transfer communication. Proposed interventions involved standardizing the communication process and incorporating technology such as telemedicine during transfers. Communication is perceived to be an integral component of interfacility transfers. We recommend that transfer systems be re-engineered to make the process more streamlined, to improve the quality of the handoff and 2-way communication, and to facilitate positive relationships between physicians across facilities. 
Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Pediatric intensive care unit admission tool: a colorful approach.
Biddle, Amy
2007-12-01
This article discusses the development, implementation, and utilization of our institution's Pediatric Intensive Care Unit (PICU) Color-Coded Admission Status Tool. Rather than the historical method of identifying a maximum number of staffed beds, a tool was developed to color code the PICU's admission status. Previous methods had been ineffective and led to confusion between the PICU leadership team and the administration. The tool includes the previously missing components of staffing and acuity, which are essential in determining admission capability. The PICU tool has three colored levels: green indicates open for admissions; yellow, admission alert because of limited bed availability or because staffing is not equal to the projected patient numbers or required acuity; and red, admissions on hold because only one trauma or arrest bed is available or staffing is not equal to the projected acuity. Yellow and red designations require specific actions and the medical director's approval. The tool has been highly successful and, by including nurse staffing as an essential component in determining bed availability, has had a significant impact on nursing.
Management of the science ground segment for the Euclid mission
NASA Astrophysics Data System (ADS)
Zacchei, Andrea; Hoar, John; Pasian, Fabio; Buenadicha, Guillermo; Dabin, Christophe; Gregorio, Anna; Mansutti, Oriana; Sauvage, Marc; Vuerli, Claudio
2016-07-01
Euclid is an ESA mission aimed at understanding the nature of dark energy and dark matter by simultaneously using two probes (weak lensing and baryon acoustic oscillations). The mission will observe galaxies and clusters of galaxies out to z ~ 2, in a wide extra-galactic survey covering 15,000 deg², plus a deep survey covering an area of 40 deg². The payload is composed of two instruments, an imager in the visible domain (VIS) and an imager-spectrometer (NISP) covering the near-infrared. The launch is planned in Q4 of 2020. The elements of the Euclid Science Ground Segment (SGS) are the Science Operations Centre (SOC) operated by ESA and nine Science Data Centres (SDCs) in charge of data processing, provided by the Euclid Consortium (EC), formed by over 110 institutes spread across 15 countries. The SOC and the EC started a tight collaboration several years ago in order to design and develop a single, cost-efficient, and truly integrated SGS. The distributed nature, the size of the data set, and the needed accuracy of the results are the main challenges expected in the design and implementation of the SGS. In particular, the huge volume of data (not only Euclid data but also ground-based data) to be processed in the SDCs will require distributed storage to avoid data migration across SDCs. This paper describes the management challenges that the Euclid SGS is facing while dealing with such complexity. The main aspect is related to the organisation of a geographically distributed software development team. In principle, algorithms and code are developed in a large number of institutes, while data are actually processed at fewer centres (the national SDCs) where the operational computational infrastructures are maintained. The software produced for data handling, processing, and analysis is built within a common development environment defined by the SGS System Team, common to the SOC and the EC SGS, which has already been active for several years. 
The code is built incrementally through different levels of maturity, going from prototypes (developed mainly by scientists) to production code (engineered and tested at the SDCs). A number of incremental challenges (infrastructure, data processing and integrated) have been included in the Euclid SGS test plan to verify the correctness and accuracy of the developed systems.
Space experiment development process
NASA Technical Reports Server (NTRS)
Depauw, James F.
1987-01-01
Described is a process for developing space experiments utilizing the Space Shuttle. The role of the Principal Investigator is described as well as the Principal Investigator's relation with the project development team. Described also is the sequence of events from an early definition phase through the steps of hardware development. The major interactions between the hardware development program and the Shuttle integration and safety activities are also shown. The presentation is directed to people with limited Shuttle experiment experience. The objective is to summarize the development process, discuss the roles of major participants, and list some lessons learned. Two points should be made at the outset. First, no two projects are the same so the process varies from case to case. Second, the emphasis here is on Code EN/Microgravity Science and Applications Division (MSAD).
A Qualitative Analysis of Narrative Preclerkship Assessment Data to Evaluate Teamwork Skills.
Dolan, Brigid M; O'Brien, Celia Laird; Cameron, Kenzie A; Green, Marianne M
2018-04-16
Construct: Students entering the health professions require competency in teamwork. Although many teamwork curricula and assessments exist, studies have not demonstrated robust longitudinal assessment of preclerkship students' teamwork skills and attitudes. Assessment portfolios may serve to fill this gap, but it is unknown how narrative comments within portfolios describe student teamwork behaviors. We performed a qualitative analysis of narrative data in 15 assessment portfolios. Student portfolios were randomly selected from 3 groups stratified by quantitative ratings of teamwork performance gathered from small-group and clinical preceptor assessment forms. Narrative data included peer and faculty feedback from these same forms. Data were coded for teamwork-related behaviors using a constant comparative approach combined with an identification of the valence of the coded statements as either "positive observation" or "suggestion for improvement." Eight codes related to teamwork emerged: attitude and demeanor, information facilitation, leadership, preparation and dependability, professionalism, team orientation, values team member contributions, and nonspecific teamwork comments. The frequency of codes and valence varied across the 3 performance groups, with students in the low-performing group receiving more suggestions for improvement across all teamwork codes. Narrative data from assessment portfolios included specific descriptions of teamwork behavior, with important contributions provided by both faculty and peers. A variety of teamwork domains were represented. Such feedback as collected in an assessment portfolio can be used for longitudinal assessment of preclerkship student teamwork skills and attitudes.
An overview of aeroelasticity studies for the National Aero-Space Plane
NASA Technical Reports Server (NTRS)
Ricketts, Rodney H.; Noll, Thomas E.; Whitlow, Woodrow, Jr.; Huttsell, Lawrence J.
1993-01-01
The National Aero-Space Plane (NASP), or X-30, is a single-stage-to-orbit vehicle that is designed to take off and land on conventional runways. Research in aeroelasticity was conducted by NASA and the Wright Laboratory to support the design of a flight vehicle by the national contractor team. This research includes the development of new computational codes for predicting unsteady aerodynamic pressures. In addition, studies were conducted to determine the aerodynamic heating effects on vehicle aeroelasticity and to determine the effects of fuselage flexibility on the stability of the control systems. It also includes the testing of scale models to better understand the aeroelastic behavior of the X-30 and to obtain data for code validation and correlation. This paper presents an overview of the aeroelastic research which has been conducted to support the airframe design.
Quasi-real-time end-to-end simulations of ELT-scale adaptive optics systems on GPUs
NASA Astrophysics Data System (ADS)
Gratadour, Damien
2011-09-01
Our team has started the development of a code dedicated to GPUs for the simulation of AO systems at the E-ELT scale. It uses the CUDA toolkit and an original binding to Yorick (an open source interpreted language) to provide the user with a comprehensive interface. In this paper we present the first performance analysis of our simulation code, showing its ability to provide Shack-Hartmann (SH) images and measurements at the kHz scale for a VLT-sized AO system and in quasi-real-time (up to 70 Hz) for ELT-sized systems on a single top-end GPU. The simulation code includes multi-layer atmospheric turbulence generation, ray tracing through these layers, image formation at the focal plane of every sub-aperture of a SH sensor using either natural or laser guide stars, and centroiding on these images using various algorithms. Turbulence is generated on-the-fly, giving the ability to simulate hours of observations without the need to load extremely large phase screens into global memory. Because of its performance, this code additionally provides the unique ability to test real-time controllers for future AO systems under nominal conditions.
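The centroiding step mentioned above can be illustrated with a minimal sketch. This is not the authors' CUDA code: it is a plain NumPy center-of-gravity estimate of the spot position within a single Shack-Hartmann sub-aperture image, the simplest of the "various algorithms" an SH simulator might offer.

```python
import numpy as np

# Minimal sketch (not the GPU implementation described above) of SH
# centroiding: center-of-gravity spot position in one sub-aperture image.
def cog_centroid(img):
    """Return the (x, y) center of gravity of a 2-D sub-aperture image."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)  # pixel-coordinate grids
    return (xs * img).sum() / total, (ys * img).sum() / total

# A single bright pixel at column 3, row 1 yields centroid (3.0, 1.0).
spot = np.zeros((5, 5))
spot[1, 3] = 1.0
print(cog_centroid(spot))
```

Real AO pipelines typically refine this with thresholding or weighted variants to suppress readout noise, but the center-of-gravity form is the common baseline.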
Game changer: the topology of creativity.
de Vaan, Mathijs; Stark, David; Vedres, Balazs
2015-01-01
This article examines the sociological factors that explain why some creative teams are able to produce game changers: cultural products that stand out as distinctive while also being critically recognized as outstanding. The authors build on work pointing to structural folding, the network property of a cohesive group whose membership overlaps with that of another cohesive group. They hypothesize that the effects of structural folding on game-changing success are especially strong when overlapping groups are cognitively distant. Measuring social distance separately from cognitive distance and distinctiveness independently from critical acclaim, the authors test their hypothesis about structural folding and cognitive diversity by analyzing team reassembly for 12,422 video games and the career histories of 139,727 video game developers. When combined with cognitive distance, structural folding channels and mobilizes a productive tension of rules, roles, and codes that promotes successful innovation. In addition to serving as pipes and prisms, network ties are also the source of tools and tensions.
NASA's Information Power Grid: Large Scale Distributed Computing and Data Management
NASA Technical Reports Server (NTRS)
Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)
2001-01-01
Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.
Quality circles in a department of dietetics.
Treadwell, D D; Klein, J A
1984-06-01
Quality circles can be an excellent approach to managerial effectiveness in the 1980s. For the Department of Dietetics at Miami Valley Hospital, Dayton, Ohio, quality circles have demonstrated excellent return on investment. Their many benefits include increased productivity, improved employee satisfaction and morale, and cost savings. In order to ensure success, the team needs to be selected carefully and trained thoroughly in problem-solving techniques. Initial meetings should be directed to defining the objectives and code of conduct as well as establishing a trusting environment in which to grow and develop.
2010-11-09
Report No. 10-13M, supported by the U.S. Air Force Medical Logistics Agency, under Work Unit No. 60334. The views expressed in this article are those...recommended 917Q line list. The Unit Type Code (UTC) capabilities, operational requirements, and materiel solutions were identified, and issues of...by 22%, and cost by 4%, or $9,500. Modeling and simulating a medical system like the FFDOT, with a range of capabilities and functional areas
2010-10-14
non-battle injuries, and illnesses. International Classification of Diseases, Ninth Revision (ICD-9) coded patient conditions that have been selected...The patient stream was used to simulate the equipment and supply requirements for the range of surgical cases and non-surgical injuries and illnesses...supplies” column identifies the items needed to complete the “Insert endo-trach tube” task at that level of capability. Not shown in this figure are
NASA Astrophysics Data System (ADS)
Martin, Adrian
As the applications of mobile robotics evolve it has become increasingly less practical for researchers to design custom hardware and control systems for each problem. This research presents a new approach to control system design that looks beyond end-of-lifecycle performance and considers control system structure, flexibility, and extensibility. Toward these ends the Control ad libitum philosophy is proposed, stating that to make significant progress in the real-world application of mobile robot teams the control system must be structured such that teams can be formed in real-time from diverse components. The Control ad libitum philosophy was applied to the design of the HAA (Host, Avatar, Agent) architecture: a modular hierarchical framework built with provably correct distributed algorithms. A control system for exploration and mapping, search and deploy, and foraging was developed to evaluate the architecture in three sets of hardware-in-the-loop experiments. First, the basic functionality of the HAA architecture was studied, specifically the ability to: a) dynamically form the control system, b) dynamically form the robot team, c) dynamically form the processing network, and d) handle heterogeneous teams. Secondly, the real-time performance of the distributed algorithms was tested, and proved effective for the moderate-sized systems tested. Furthermore, the distributed Just-in-time Cooperative Simultaneous Localization and Mapping (JC-SLAM) algorithm demonstrated accuracy equal to or better than traditional approaches in resource-starved scenarios, while reducing exploration time significantly. The JC-SLAM strategies are also suitable for integration into many existing particle filter SLAM approaches, complementing their unique optimizations. Thirdly, the control system was subjected to concurrent software and hardware failures in a series of increasingly complex experiments. 
Even with unrealistically high rates of failure the control system was able to successfully complete its tasks. The HAA implementation designed following the Control ad libitum philosophy proved to be capable of dynamic team formation and extremely robust against both hardware and software failure; and, due to the modularity of the system, there is significant potential for reuse of assets and future extensibility. One future goal is to make the source code publicly available and establish a forum for the development and exchange of new agents.
NASA Technical Reports Server (NTRS)
Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam
2011-01-01
Lines of code (LOC) analysis is one of the methods used to measure programmer productivity and estimate schedules of programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and to plan development efforts. The disadvantage of using LOC as a measure of effort is that coding accounts for only 30% to 35% of the total effort of a software project [8]. Because of this disadvantage, Jamie Szafran of the System Software Branch of Control and Data Systems (NE-C3) at Kennedy Space Center developed a web application called Function Point Analysis (FPA) Depot, which uses function points instead of LOC to better estimate the development hours for each piece of software. The objective of this web application is that the LCS software architecture team can use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
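The function point counting that underlies an application like FPA Depot can be sketched in a few lines. This is a hedged illustration using the standard IFPUG complexity weights; the component counts below are invented and are not taken from the FPA Depot application or the LCS project.

```python
# Sketch of an unadjusted function point (UFP) count using the standard
# IFPUG complexity weights. All component counts are hypothetical.
WEIGHTS = {  # (low, average, high) complexity weights per component type
    "external_inputs": (3, 4, 6),
    "external_outputs": (4, 5, 7),
    "external_inquiries": (3, 4, 6),
    "internal_files": (7, 10, 15),
    "external_interfaces": (5, 7, 10),
}

def unadjusted_fp(counts):
    """counts: {component_type: (n_low, n_avg, n_high)} -> UFP total."""
    return sum(
        n * w
        for ctype, ns in counts.items()
        for n, w in zip(ns, WEIGHTS[ctype])
    )

# Hypothetical small subsystem:
counts = {
    "external_inputs": (2, 1, 0),   # 2*3 + 1*4 = 10
    "external_outputs": (1, 0, 0),  # 1*4 = 4
    "internal_files": (0, 1, 0),    # 1*10 = 10
}
print(unadjusted_fp(counts))  # 24
```

The unadjusted total would then be scaled by a value adjustment factor and multiplied by a team-specific hours-per-function-point rate to produce an effort estimate, which is the kind of calibration a domain model like the one described here evolves over time.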
Why Aren't More Primary Care Residents Going into Primary Care? A Qualitative Study.
Long, Theodore; Chaiyachati, Krisda; Bosu, Olatunde; Sircar, Sohini; Richards, Bradley; Garg, Megha; McGarry, Kelly; Solomon, Sonja; Berman, Rebecca; Curry, Leslie; Moriarty, John; Huot, Stephen
2016-12-01
Workforce projections indicate a potential shortage of up to 31,000 adult primary care providers by the year 2025. Approximately 80% of internal medicine residents and nearly two-thirds of primary care internal medicine residents do not plan to have a career in primary care or general internal medicine. We aimed to explore contextual and programmatic factors within primary care residency training environments that may influence career choices. This was a qualitative study based on semi-structured, in-person interviews. Three primary care internal medicine residency programs were purposefully selected to represent a diversity of training environments. Second- and third-year residents were interviewed. We used an interview guide developed from pilot interviews and existing literature. Three members of the research team independently coded the transcripts and developed the code structure based on the constant comparative method. The research team identified emerging themes and refined codes. ATLAS.ti was used for the analysis. We completed 24 interviews (12 second-year and 12 third-year residents). The age range was 27-39 years. Four recurrent themes characterized contextual and programmatic factors contributing to residents' decision-making: resident expectations of a career in primary care, navigation of the boundary between social needs and medical needs, mentorship and perceptions of primary care, and structural features of the training program.
Addressing aspects of training that may discourage residents from careers in primary care such as lack of diversity in outpatient experiences and resident frustration with their inability to address social needs of patients, and strengthening aspects of training that may encourage interests in careers in primary care such as mentorship and protected time away from inpatient responsibilities during primary care rotations, may increase the proportion of residents enrolled in primary care training programs who pursue a career in primary care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Tammie Renee; Tretiak, Sergei
2017-01-06
Understanding and controlling excited state dynamics lies at the heart of all our efforts to design photoactive materials with desired functionality. This tailor-design approach has become the standard for many technological applications (e.g., solar energy harvesting) including the design of organic conjugated electronic materials with applications in photovoltaic and light-emitting devices. Over the years, our team has developed efficient LANL-based codes to model the relevant photophysical processes following photoexcitation (spatial energy transfer, excitation localization/delocalization, and/or charge separation). The developed approach allows the non-radiative relaxation to be followed for up to ~10 ps for large realistic molecules (hundreds of atoms in size) in a realistic solvent dielectric environment. The Collective Electronic Oscillator (CEO) code is used to compute electronic excited states, and the Non-adiabatic Excited State Molecular Dynamics (NA-ESMD) code is used to follow the non-adiabatic dynamics on multiple coupled Born-Oppenheimer potential energy surfaces. Our preliminary NA-ESMD simulations have revealed key photoinduced mechanisms controlling competing interactions and relaxation pathways in complex materials, including organic conjugated polymer materials, and have provided a detailed understanding of photochemical products and intermediates and the internal conversion process during the initiation of energetic materials. This project will be using LANL-based CEO and NA-ESMD codes to model nonradiative relaxation in organic and energetic materials. The NA-ESMD and CEO codes belong to a class of electronic structure/quantum chemistry codes that require large memory and a “long-queue-few-core” distribution of resources in order to make useful progress. The NA-ESMD simulations are trivially parallelizable, requiring ~300 processors for up to one week of runtime to reach a meaningful restart point.
Understanding Patterns of Team Collaboration Employed To Solve Unique Problems
2008-06-01
Performing organization: Naval Postgraduate School, Code IS/Hs, 589 Dyer Road, Monterey, CA 93943. Abstract fragment: "...organizations, systems, infrastructure, and processes to create and share data, information, and knowledge that is needed for the team to plan..."
Final Report for ALCC Allocation: Predictive Simulation of Complex Flow in Wind Farms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barone, Matthew F.; Ananthan, Shreyas; Churchfield, Matt
This report documents work performed using ALCC computing resources granted under a proposal submitted in February 2016, with the resource allocation period spanning the period July 2016 through June 2017. The award allocation was 10.7 million processor-hours at the National Energy Research Scientific Computing Center. The simulations performed were in support of two projects: the Atmosphere to Electrons (A2e) project, supported by the DOE EERE office; and the Exascale Computing Project (ECP), supported by the DOE Office of Science. The project team for both efforts consists of staff scientists and postdocs from Sandia National Laboratories and the National Renewable Energy Laboratory. At the heart of these projects is the open-source computational-fluid-dynamics (CFD) code, Nalu. Nalu solves the low-Mach-number Navier-Stokes equations using an unstructured-grid discretization. Nalu leverages the open-source Trilinos solver library and the Sierra Toolkit (STK) for parallelization and I/O. This report documents baseline computational performance of the Nalu code on problems of direct relevance to the wind plant physics application - namely, Large Eddy Simulation (LES) of an atmospheric boundary layer (ABL) flow and wall-modeled LES of a flow past a static wind turbine rotor blade. Parallel performance of Nalu and its constituent solver routines residing in the Trilinos library has been assessed previously under various campaigns. However, both Nalu and Trilinos have been, and remain, in active development, and resources have not been available previously to rigorously track code performance over time. With the initiation of the ECP, it is important to establish and document baseline code performance on the problems of interest. This will allow the project team to identify and target any deficiencies in performance, as well as highlight any performance bottlenecks as we exercise the code on a greater variety of platforms and at larger scales.
The current study is rather modest in scale, examining performance on problem sizes of O(100 million) elements and core counts up to 8k. This will be expanded as more computational resources become available to the projects.
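Baseline performance results like these are conventionally reported as strong-scaling speedup and parallel efficiency relative to the smallest run; a hedged sketch of the arithmetic (the core counts and timings below are made-up placeholders, not Nalu measurements):

```python
# Strong-scaling speedup and parallel efficiency relative to the smallest run.
# Timings are illustrative placeholders, not actual Nalu results.
def scaling_table(cores, times):
    """Return (cores, time, speedup, efficiency) rows; row 0 is the baseline."""
    base_cores, base_time = cores[0], times[0]
    rows = []
    for c, t in zip(cores, times):
        speedup = base_time / t          # how much faster than the baseline
        ideal = c / base_cores           # perfect linear scaling
        efficiency = speedup / ideal     # 1.0 means ideal scaling
        rows.append((c, t, speedup, efficiency))
    return rows

cores = [512, 1024, 2048, 4096, 8192]
times = [100.0, 52.0, 28.0, 16.0, 10.0]
for c, t, s, e in scaling_table(cores, times):
    print(f"{c:5d} cores: {t:6.1f} s  speedup {s:5.2f}  efficiency {e:4.2f}")
```

Tracking the efficiency column across platforms and over time is what makes a baseline like this useful for spotting regressions in the solver stack.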
Wagner, Daniel J; Durbin, Janet; Barnsley, Jan; Ivers, Noah M
2017-12-02
Despite its popularity, the effectiveness of audit and feedback in supporting quality improvement efforts is mixed. While audit and feedback-related research efforts have investigated issues relating to feedback design and delivery, little attention has been directed towards factors which motivate interest and engagement with feedback interventions. This study explored the motivating factors that drove primary care teams to participate in a voluntary audit and feedback initiative. Interviews were conducted with leaders of primary care teams who had participated in at least one iteration of the audit and feedback program. This intervention was developed by an organization which advocates for high-quality, team-based primary care in Ontario, Canada. Interview transcripts were coded using the Consolidated Framework for Implementation Research and the resulting framework was analyzed inductively to generate key themes. Interviews were completed with 25 individuals from 18 primary care teams across Ontario. The majority were Executive Directors (14), Physician leaders (3) and support staff for Quality Improvement (4). A range of motivations for participating in the audit and feedback program beyond quality improvement were emphasized. Primarily, informants believed that the program would eventually become a best-in-class audit and feedback initiative. This reflected concerns regarding existing initiatives in terms of the intervention components and intentions as well as the perception that an initiative by primary care, for primary care would better reflect their own goals and better support desired patient outcomes. Key enablers included perceived obligations to engage and provision of support for the work involved. No teams cited an evidence base for audit and feedback as a motivating factor for participation. A range of motivating factors, beyond quality improvement, contributed to participation in the audit and feedback program.
Findings from this study highlight that efforts to understand how and when the intervention works best cannot be limited to factors within developers' control. Clinical teams may more readily engage with initiatives with the potential to address their own long-term system goals. Aligning motivations for participation with the goals of the audit and feedback initiative may facilitate both engagement and impact.
Experiences Supporting the Lunar Reconnaissance Orbiter Camera: the Devops Model
NASA Astrophysics Data System (ADS)
Licht, A.; Estes, N. M.; Bowman-Cisneros, E.; Hanger, C. D.
2013-12-01
Introduction: The Lunar Reconnaissance Orbiter Camera (LROC) Science Operations Center (SOC) is responsible for instrument targeting, product processing, and archiving [1]. The LROC SOC maintains over 1,000,000 observations with over 300 TB of released data. Processing challenges compound with the acquisition of over 400 Gbits of observations daily, creating the need for a robust, efficient, and reliable suite of specialized software. Development Environment: The LROC SOC's software development methodology has evolved over time. Today, the development team operates in close cooperation with the systems administration team in a model known in the IT industry as DevOps. The DevOps model enables a highly productive development environment that facilitates accomplishment of key goals within tight schedules [2]. The LROC SOC DevOps model incorporates industry best practices including prototyping, continuous integration, unit testing, code coverage analysis, version control, and utilizing existing open source software. Scientists and researchers at LROC often prototype algorithms and scripts in a high-level language such as MATLAB or IDL. After the prototype is functionally complete, the solution is implemented as production-ready software by the developers. Following this process ensures that all controls and requirements set by the LROC SOC DevOps team are met. The LROC SOC also strives to enhance the efficiency of the operations staff by way of weekly presentations and informal mentoring. Many small scripting tasks are assigned to the cognizant operations personnel (end users), allowing the DevOps team to focus on more complex and mission-critical tasks. In addition to leveraging open source software, the LROC SOC has also contributed to the open source community by releasing Lunaserv [3]. Findings: The DevOps software model very efficiently provides smooth software releases and maintains team momentum.
Having scientists prototype their own work has proven very efficient, as developers do not need to spend time iterating over small changes. Instead, these changes are realized in early prototypes and implemented before the task is seen by developers. The development practices followed by the LROC SOC DevOps team help facilitate a high level of software quality that is necessary for LROC SOC operations. Application to the Scientific Community: There is no replacement for having software developed by professional developers. While it is beneficial for scientists to write software, this activity should be seen as prototyping, which is then made production-ready by professional developers. When constructed properly, even a small development team has the ability to increase the rate of software development for a research group while creating more efficient, reliable, and maintainable products. This strategy allows scientists to accomplish more, focusing on their research rather than on software development, which may not be their primary focus. 1. Robinson et al. (2010) Space Sci. Rev. 150, 81-124 2. DeGrandis. (2011) Cutter IT Journal. Vol 24, No. 8, 34-39 3. Estes, N.M.; Hanger, C.D.; Licht, A.A.; Bowman-Cisneros, E.; Lunaserv Web Map Service: History, Implementation Details, Development, and Uses, http://adsabs.harvard.edu/abs/2013LPICo1719.2609E.
Rupcic, Sonia; Tamrat, Tigest; Kachnowski, Stan
2012-11-01
This study reviews the state of diabetes information technology (IT) initiatives and presents a set of recommendations for improvement based on interviews with commercial IT innovators. Semistructured interviews were conducted with 10 technology developers, representing 12 of the most successful IT companies in the world. Average interview time was approximately 45 min. Interviews were audio-recorded, transcribed, and entered into ATLAS.ti for qualitative data analysis. Themes were identified through a process of selective and open coding by three researchers. We identified two practices, common among successful IT companies, that have allowed them to avoid or surmount the challenges that confront healthcare professionals involved in diabetes IT development: (1) employing a diverse research team of software developers and engineers, statisticians, consumers, and business people and (2) conducting rigorous research and analytics on technology use and user preferences. Because of the nature of their respective fields, healthcare professionals and commercial innovators face different constraints. With these in mind we present three recommendations, informed by practices shared by successful commercial developers, for those involved in developing diabetes IT programming: (1) include software engineers on the implementation team throughout the intervention, (2) conduct more extensive baseline testing of users and monitor the usage data derived from the technology itself, and (3) pursue Institutional Review Board-exempt research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbary, Lawrence D.; Perkins, Laura L.; Serino, Roland
The team led by Dow Corning collaborated to increase the thermal performance of exterior insulation and finishing systems (EIFS) to reach R-40 performance, meeting the needs for high-efficiency insulated walls. Additionally, the project helped remove barriers to using EIFS on retrofit commercial buildings desiring highly insulated walls. The three wall systems developed within the scope of this project provide the thermal performance of R-24 to R-40 by incorporating vacuum insulation panels (VIPs) into an expanded polystyrene (EPS) encapsulated vacuum insulated sandwich element (VISE). The VISE was incorporated into an EIFS as pre-engineered insulation boards. The VISE is installed using typical EIFS details and a network of trained installers. These three wall systems were tested and engineered to be fully code compliant as an EIFS and meet all of the International Building Code structural, durability and fire test requirements for a code compliant exterior wall cladding system. This system is being commercialized under the trade name Dryvit® Outsulation® HE system. Full details, specifications, and application guidelines have been developed for the system. The system has been modeled both thermally and hygrothermally to predict condensation potential. Based on weather models for Baltimore, MD; Boston, MA; Miami, FL; Minneapolis, MN; Phoenix, AZ; and Seattle, WA; condensation and water build-up in the wall system is not a concern. Finally, the team conducted a field trial of the system on a building at the former Brunswick Naval Air Station, which is being redeveloped by the Midcoast Regional Redevelopment Authority (Brunswick, Maine). The field trial provided a retrofit R-30 wall onto a wood-frame, slab-on-grade, 1800 ft² building that was monitored over the course of a year. Simultaneous with the façade retrofit, the building’s windows were upgraded at no charge to this program.
The retrofit building used 49% less natural gas during the winter of 2012 compared to previous winters. This project achieved its goal of developing a system that is constructible, offers protection to the VIPs, and meets all performance targets established for the project.
Macrocognition in Complex Team Problem Solving
2007-06-01
Sponsoring organization: Dr. Michael Letsky, Office of Naval Research, Life Sciences Department, Code 341, Rm 1051, 875... Distribution statement: approved for public release; distribution unlimited. Supplementary note: Twelfth International Command and Control Research and Technology Symposium (12th ICCRTS), 19-21 June.
ERIC Educational Resources Information Center
Wall, Candace A.; Rafferty, Lisa A.; Camizzi, Mariya A.; Max, Caroline A.; Van Blargan, David M.
2016-01-01
Many students who struggle to acquire the alphabetic principle are at risk of being identified as having a reading disability and would benefit from additional explicit phonics instruction as a remedial measure. In this action research case study, the research team conducted two experiments to investigate the effects of a color-coded, onset-rime,…
Perspectives on the methods of a large systematic mapping of maternal health interventions.
Chersich, Matthew; Becerril-Montekio, Victor; Becerra-Posada, Francisco; Dumbaugh, Mari; Kavanagh, Josephine; Blaauw, Duane; Thwala, Siphiwe; Kern, Elinor; Penn-Kekana, Loveday; Vargas, Emily; Mlotshwa, Langelihle; Dhana, Ashar; Mannava, Priya; Portela, Anayda; Tristan, Mario; Rees, Helen; Bijlmakers, Leon
2016-08-25
Mapping studies describe a broad body of literature, and differ from classical systematic reviews, which assess more narrowly-defined questions and evaluate the quality of the studies included in the review. While the steps involved in mapping studies have been described previously, a detailed qualitative account of the methodology could inform the design of future mapping studies. Describe the perspectives of a large research team on the methods used and collaborative experiences in a study that mapped the literature published on maternal health interventions in low- and middle-income countries (2292 full text articles included, after screening 35,048 titles and abstracts in duplicate). Fifteen members of the mapping team, drawn from eight countries, provided their experiences and perspectives of the study in response to a list of questions and probes. The responses were collated and analysed thematically following a grounded theory approach. The objectives of the mapping evolved over time, posing difficulties in ensuring a uniform understanding of the purpose of the mapping among the team members. Ambiguity of some study variables and modifications in data extraction codes were the main threats to the quality of data extraction. The desire for obtaining detailed information on a few topics needed to be weighed against the benefits of collecting more superficial data on a wider range of topics. Team members acquired skills in systematic review methodology and software, and a broad knowledge of maternal health literature. Participation in analysis and dissemination was lower than during the screening of articles for eligibility and data coding. Though all respondents believed the workload involved was high, study outputs were viewed as novel and important contributions to evidence. Overall, most believed there was a favourable balance between the amount of work done and the project's outputs. 
A large mapping of literature is feasible with a committed team aiming to build their research capacity, and with a limited, simplified set of data extraction codes. In the team's view, the balance between the time spent on the review, and the outputs and skills acquired was favourable. Assessments of the value of a mapping need, however, to take into account the limitations inherent in such exercises, especially the exclusion of grey literature and of assessments of the quality of the studies identified.
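Duplicate screening and data coding of the kind described above are usually checked with an inter-rater agreement statistic such as Cohen's kappa; a minimal sketch (the labels and screening decisions are illustrative, not data from this mapping):

```python
# Cohen's kappa for agreement between two reviewers screening the same
# records in duplicate. Labels and decisions below are made-up examples.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance from each rater's marginal label frequencies.
    expected = sum(counts_a[label] * counts_b[label] for label in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["include", "exclude", "exclude", "include", "exclude", "exclude"]
b = ["include", "exclude", "include", "include", "exclude", "exclude"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

Reporting kappa per screening round would quantify the "threats to the quality of data extraction" the team describes, rather than relying on impressions alone.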
Mongoose: Creation of a Rad-Hard MIPS R3000
NASA Technical Reports Server (NTRS)
Lincoln, Dan; Smith, Brian
1993-01-01
This paper describes the development of a 32-bit, fully MIPS R3000 code-compatible Rad-Hard CPU, code-named Mongoose. Mongoose progressed from contract award, through the design cycle, to operational silicon in 12 months to meet a space mission for NASA. The goal was the creation of a fully static device capable of operation at the maximum Mil-883 derated speed, under worst-case post-rad exposure, with full operational integrity. This included consideration of features for functional enhancements relating to mission compatibility and removal of commercial practices not supported by Rad-Hard technology. Mongoose developed from an evolution of LSI Logic's MIPS-I embedded processor, the LR33000, code-named Cobra, to its Rad-Hard 'equivalent', Mongoose. The term 'equivalent' is used to indicate that the core of the processor is functionally identical, allowing the same use and optimizations of the MIPS-I Instruction Set software tool suite for compilation, software program trace, etc. This activity was started in September of 1991 under a contract from NASA-Goddard Space Flight Center (GSFC)-Flight Data Systems. The approach effected a teaming of NASA-GSFC for program development, LSI Logic for system and ASIC design coupled with the Rad-Hard process technology, and Harris (GASD) for Rad-Hard microprocessor design expertise. The program culminated with the generation of Rad-Hard Mongoose prototypes one year later.
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Rice, Simon M; Simmons, Magenta B; Bailey, Alan P; Parker, Alexandra G; Hetrick, Sarah E; Davey, Christopher G; Phelan, Mark; Blaikie, Simon; Edwards, Jane
2014-01-01
There is a lack of clear guidance regarding the management of ongoing suicidality in young people experiencing major depressive disorder. This study utilised an expert consensus approach in identifying practice principles to complement relevant clinical guidelines for the treatment of major depressive disorder in young people. The study also sought to outline a broad treatment framework for clinical intervention with young people experiencing ongoing suicidal ideation. In-depth focus groups were undertaken with a specialist multidisciplinary clinical team (the Youth Mood Clinic at Orygen Youth Health Clinical Program, Melbourne) working with young people aged 15-25 years experiencing ongoing suicidal ideation. Each focus group was audio recorded and transcribed verbatim using orthographic conventions. Principles of grounded theory and thematic analysis were used to analyse and code the resultant data. The identified codes were subsequently synthesised into eight practice principles reflecting engagement and consistency of care, ongoing risk assessment and documentation, individualised crisis planning, engaging systems of support, engendering hopefulness, development of adaptive coping, management of acute risk, and consultation and supervision. The identified practice principles provide a broad management framework, and may assist to improve treatment consistency and clinical management of young people experiencing ongoing suicidal ideation. The practice principles may be of use to health professionals working within a team-based setting involved in the provision of care, even if peripherally, to young people with ongoing suicidal ideation. Findings address the lack of treatment consistency and shared terminology and may provide containment and guidance to multidisciplinary clinicians working with this at-risk group.
Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, Martin J.
This project was part of a coordinated software development effort which the nuclear physics lattice QCD community pursues in order to ensure that lattice calculations can make optimal use of present, and forthcoming leadership-class and dedicated hardware, including those of the national laboratories, and prepares for the exploitation of future computational resources in the exascale era. The UW team improved and extended software libraries used in lattice QCD calculations related to multi-nucleon systems, enhanced production running codes related to load balancing multi-nucleon production on large-scale computing platforms, developed SQLite (addressable database) interfaces to efficiently archive and analyze multi-nucleon data, and developed a Mathematica interface for the SQLite databases.
FY 2002 Report on Software Visualization Techniques for IV and V
NASA Technical Reports Server (NTRS)
Fotta, Michael E.
2002-01-01
One of the major challenges software engineers often face in performing IV&V is developing an understanding of a system created by a development team they have not been part of. As budgets shrink and software increases in complexity, this challenge will become even greater as these software engineers face increased time and resource constraints. This research will determine which current aspects of providing this understanding (e.g., code inspections, use of control graphs, use of adjacency matrices, requirements traceability) are critical to performing IV&V and amenable to visualization techniques. We will then develop state-of-the-art software visualization techniques to facilitate the use of these aspects to understand software and perform IV&V.
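Of the comprehension aids listed above, an adjacency matrix derived from a call graph is straightforward to construct; a minimal sketch (the call graph here is a hypothetical example, not taken from any NASA system):

```python
# Build a call-graph adjacency matrix: matrix[i][j] == 1 means function i
# calls function j. The functions and calls below are a made-up example.
def adjacency_matrix(calls, names):
    index = {name: i for i, name in enumerate(names)}
    n = len(names)
    matrix = [[0] * n for _ in range(n)]
    for caller, callee in calls:
        matrix[index[caller]][index[callee]] = 1
    return matrix

names = ["main", "parse", "validate", "report"]
calls = [("main", "parse"), ("main", "report"),
         ("parse", "validate"), ("report", "validate")]
for name, row in zip(names, adjacency_matrix(calls, names)):
    print(f"{name:8s} {row}")
```

A matrix like this is the data structure behind the matrix-style visualizations the abstract mentions: dense rows reveal high fan-out functions, dense columns reveal heavily depended-upon ones.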
Perioperative cardiopulmonary arrest competencies.
Murdock, Darlene B
2013-08-01
Although basic life support skills are not often needed in the surgical setting, it is crucial that surgical team members understand their roles and are ready to intervene swiftly and effectively if necessary. Ongoing education and training are key elements to equip surgical team members with the skills and knowledge they need to handle untimely and unexpected life-threatening scenarios in the perioperative setting. Regular emergency cardiopulmonary arrest skills education, including the use of checklists, and mock codes are ways to validate that team members understand their responsibilities and are competent to help if an arrest occurs in the OR. After a mock drill, a debriefing session can help team members discuss and critique their performances and improve their knowledge and mastery of skills. Copyright © 2013 AORN, Inc. Published by Elsevier Inc. All rights reserved.
CHALLENGES OF DSD: DIVERSE PERCEPTIONS ACROSS STAKEHOLDERS
Kogan, Barry A.; Gardner, Melissa; Alpern, Adrianne N.; Cohen, Laura M.; Grimley, Mary Beth; Quittner, Alexandra L.; Sandberg, David E.
2012-01-01
Background/Aims Disorders of Sex Development (DSD) are congenital conditions in which chromosomal, gonadal, or anatomic sex development is atypical. Optimal management is patient- and family-centered and delivered by interdisciplinary teams. The present pilot study elicits concerns held by important stakeholders on issues affecting young patients with DSD and their families. Methods Content from focus groups with expert clinicians (pediatric urologists [n=7], pediatric endocrinologists [n=10], mental health professionals [n=4]), DSD patient advocates (n=4), and interviews with parents of DSD-affected children (newborn to 6 yrs; n=11) was coded and content-analyzed to identify health-related quality of life issues. Results Key stressors varied across stakeholder groups. In general, family-centered issues were noted more than child-centered. In the child-centered domain, providers worried more about physical functioning; family and advocates emphasized gender concerns and body image. In the family-centered domain, parental concerns about medication management outweighed those of providers. Advocates reported more stressors regarding communication/information than other stakeholders. Conclusion Variability exists across stakeholder groups in the key concerns affecting young children/families with DSD. Interdisciplinary DSD healthcare team development should account for varying perspectives when counseling families and planning treatment. PMID:22832323
Suomi NPP VIIRS active fire product status
NASA Astrophysics Data System (ADS)
Ellicott, E. A.; Csiszar, I. A.; Schroeder, W.; Giglio, L.; Wind, B.; Justice, C. O.
2012-12-01
We provide an overview of the evaluation and development of the Active Fires product derived from the Visible Infrared Imager Radiometer Suite (VIIRS) sensor on the Suomi National Polar-orbiting Partnership (SNPP) satellite during the first year of on-orbit data. Results from the initial evaluation of the standard SNPP Active Fires product, generated by the SNPP Interface Data Processing System (IDPS), supported the stabilization of the VIIRS Sensor Data Record (SDR) product. This activity focused in particular on the processing of the dual-gain 4 micron VIIRS M13 radiometric measurements into 750 m aggregated data, which are fundamental for active fire detection. Following the VIIRS SDR product's Beta maturity status in April 2012, correlative analysis between VIIRS and near-simultaneous fire detections from the Moderate Resolution Imaging Spectroradiometer (MODIS) on the NASA Earth Observing System Aqua satellite confirmed the expected relative detection rates driven primarily by sensor differences. The VIIRS Active Fires Product Development and Validation Team also developed a science code that is based on the latest MODIS Collection 6 algorithm and provides a full spatially explicit fire mask to replace the sparse array output of fire locations from a MODIS Collection 4 equivalent algorithm in the current IDPS product. The Algorithm Development Library (ADL) was used to support the planning for the transition of the science code into IDPS operations in the future. Product evaluation and user outreach were facilitated by a product website that provided end users access to fire data in user-friendly format over North America as well as examples of VIIRS-MODIS comparisons. The VIIRS fire team also developed an experimental product based on 375 m VIIRS Imagery band measurements and provided high-quality imagery of major fire events in the US.
By August 2012 the IDPS product achieved Beta maturity, with some known and documented shortfalls related to the processing of incorrect SDR input data and to apparent algorithm deficiencies in select observing and environmental conditions.
Predictors of individual player match performance in junior Australian football.
Tangalos, Christie; Robertson, Samuel J; Spittle, Michael; Gastin, Paul B
2015-10-01
Player match statistics in junior Australian football (AF) are not well documented, and contributors to success are poorly understood. A clearer understanding of the relationships between fitness and skill in younger players participating at the foundation level of the performance pathway in AF has implications for the development of coaching priorities (eg, physical or technical). To investigate the relationships between indices of fitness (speed, power, and endurance) and skill (coach rating) on player performance (disposals and effective disposals) in junior AF. Junior male AF players (N = 156, 10-15 y old) were recruited from 12 teams of a single amateur recreational AF club located in metropolitan Victoria. All players were tested for fitness (20-m sprint, vertical jump, 20-m shuttle run) and rated by their coach on a 6-point Likert scale for skill (within a team in comparison with their teammates). Player performance was assessed during a single match in which disposals and their effectiveness were coded from a video recording. Coach rating of skill displayed the strongest correlations and, combined with 20-m shuttle test, showed a good ability to predict the number of both disposals and effective disposals. None of the skill or fitness attributes adequately explained the percentage of effective disposals. The influence of team did not meaningfully contribute to the performance of any of the models. Skill development should be considered a high priority by coaches in junior AF.
Exploring Anthropology’s Value to Military Strategy Since 2000
2014-04-01
anthropological study of military culture; MA2: anthropological study for the military, in endeavors such as the Human Terrain System concept, where teams of...Anthropology The AAA has judged MA2 as the least ethical category of military anthropology by means of its code of ethics, CEAUSSIC reports, and...open debates on its blog. The lightning rod system most associated with MA2 is the Human Terrain Team (HTT), employed under the Human Terrain System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torcellini, Paul A; Scheib, Jennifer G; Pless, Shanti
New construction could account for more than 25% of U.S. energy consumption by 2030. Millions of square feet are built every year that will not perform as expected, despite advancing codes, rating systems, super-efficient technologies, and advanced utility programs. With retrofits of these under-performers decades away, savings potential will be lost for years to come. Only the building owner is in the driver's seat to demand, and verify, higher-performing buildings. Yet our current policy and market interventions really target the design team, not the owner. Accelerate Performance, a U.S. Department of Energy funded initiative, is changing the building procurement approach to drive deeper, verified savings in three pilot states: Illinois, Minnesota, and Connecticut. Performance-based procurement ties energy performance to design and contractor team compensation while freeing them to meet energy targets with strategies most familiar to them. The process teases out the creativity of the design and contracting teams to deliver energy performance without driving up the construction cost. The paper will share early results and lessons learned from new procurement and contract approaches in government, public, and private sector building projects. The paper provides practical guidance for building owners, facilities managers, design, and contractor teams who wish to incorporate effective performance-based procurement for deeper energy savings in their buildings.
Nouraei, S A R; Hudovsky, A; Virk, J S; Chatrath, P; Sandhu, G S
2013-12-01
To audit the accuracy of clinical coding in otolaryngology, assess the effectiveness of previously implemented interventions, and determine ways in which it can be further improved. Prospective clinician-auditor multidisciplinary audit of clinical coding accuracy. Elective and emergency ENT admissions and day-case activity. Concordance between initial coding and the clinician-auditor multidisciplinary team (MDT) coding in respect of primary and secondary diagnoses and procedures, healthcare resource groups (HRGs) and tariffs. The audit of 3131 randomly selected otolaryngology patients between 2010 and 2012 resulted in 420 changes to the primary diagnosis (13%) and 417 changes to the primary procedure (13%). In 1420 cases (44%), there was at least one change to the initial coding, and 514 HRGs (16%) changed. There was an income variance of £343,169, or £109.46 per patient. The highest rates of HRG change were observed in head and neck surgery (in particular skull-base surgery), laryngology (within that, tracheostomy) and emergency admissions (especially epistaxis management). A randomly selected sample of 235 patients from the audit was subjected to a second audit by a second clinician-auditor MDT. There were 12 further HRG changes (5%), and at least one further coding change occurred in 57 patients (24%). These changes were significantly lower than those observed in the pre-audit sample, but were also significantly greater than zero. Asking surgeons to 'code in theatre' and applying these codes without further quality assurance resulted in an HRG error rate of 45%. The full audit sample was regrouped under HRG 3.5 and compared with a previous audit of 1250 patients performed between 2007 and 2008. This comparison showed a reduction in the baseline rate of HRG change from 16% during the first audit cycle to 9% in the current audit cycle (P < 0.001). Otolaryngology coding is complex and susceptible to subjectivity, variability and error. Coding variability can be improved, but not eliminated, through regular education supported by an audit programme. © 2013 John Wiley & Sons Ltd.
High Resolution Integrated Hohlraum-Capsule Simulations for Virtual NIF Ignition Campaign
NASA Astrophysics Data System (ADS)
Jones, O. S.; Marinak, M. M.; Cerjan, C. J.; Clark, D. S.; Edwards, M. J.; Haan, S. W.; Langer, S. H.; Salmonson, J. D.
2009-11-01
We have undertaken a virtual campaign to assess the viability of the sequence of NIF experiments planned for 2010 that will experimentally tune the shock timing, symmetry, and ablator thickness of a cryogenic ignition capsule prior to the first ignition attempt. The virtual campaign consists of two teams. The ``red team'' creates realistic simulated diagnostic data for a given experiment from the output of a detailed radiation hydrodynamics calculation that has physics models that have been altered in a way that is consistent with probable physics uncertainties. The ``blue team'' executes a series of virtual experiments and interprets the simulated diagnostic data from those virtual experiments. To support this effort we have developed a capability to do very high spatial resolution integrated hohlraum-capsule simulations using the Hydra code. Surface perturbations for all ablator layer surfaces and the DT ice layer are calculated explicitly through mode 30. The effects of the fill tube, cracks in the ice layer, and defects in the ablator are included in models extracted from higher resolution calculations. Very high wave number mix is included through a mix model. We will show results from these calculations in the context of the ongoing virtual campaign.
Wenrich, Marjorie D; Jackson, Molly Blackley; Maestas, Ramoncita R; Wolfhagen, Ineke H A P; Scherpbier, Albert J J
2015-11-01
Medical students learn clinical skills at the bedside from teaching clinicians, who often learn to teach by teaching. Little is known about the process of becoming an effective clinical teacher. Understanding how teaching skills and approaches change with experience may help tailor faculty development for new teachers. Focusing on giving feedback to early learners, the authors asked: What is the developmental progression of clinician-teachers as they learn to give clinical skills feedback to medical students? This qualitative study included longitudinal interviews with clinician-teachers over five years in a new clinical skills teaching program for preclinical medical students. Techniques derived from grounded theory were used for initial analyses. The current study focused on one theme identified in initial analyses: giving feedback to students. Transcript passages were organized by interview year, coded, and discussed in year clusters; thematic codes were compared and emergent codes developed. Themes related to giving feedback demonstrated a dyadic structure: characteristic of less experienced teachers versus characteristic of experienced teachers. Seven dominant dyadic themes emerged, including teacher as cheerleader versus coach, concern about student fragility versus understanding resilience, and focus on creating a safe environment versus challenging students within a safe environment. With consistent teaching, clinical teachers demonstrated progress in giving feedback to students in multiple areas, including understanding students' developmental trajectory and needs, developing tools and strategies, and adopting a dynamic, challenging, inclusive team approach. Ongoing teaching opportunities with targeted faculty development may help improve clinician-teachers' feedback skills and approaches.
The process of implementing a rural VA wound care program for diabetic foot ulcer patients.
Reiber, Gayle E; Raugi, Gregory J; Rowberg, Donald
2007-10-01
Delivering and documenting evidence-based treatment to all Department of Veterans Affairs (VA) foot ulcer patients has wide appeal. However, primary and secondary care medical centers, where 52% of these patients receive care, are at a disadvantage given the frequent absence of trained specialists to manage diabetic foot ulcers. A retrospective review of diabetic foot ulcer patient records and a provider survey were conducted to document the foot ulcer problem and to assess practitioner needs. Results showed that of the 125 persons with foot ulcers identified through administrative data, only 21% were correctly coded. Chronic Care and Microsystem models were used to prepare a tailored intervention in a VA primary care medical center. The site Principal Investigators, a multidisciplinary site wound care team, and study investigators jointly implemented a diabetic foot ulcer program. Intervention components include wound care team education and training, standardized good wound care practices based on strong scientific evidence, and a wound care template embedded in the electronic medical record to facilitate data collection, clinical decision making, patient ordering, and coding. A strategy for delivering offloading pressure devices, regular case management support, and 24/7 emergency assistance also was developed. It took 9 months to implement the model. Patients were enrolled and followed for 1 year. Process and outcome evaluations are ongoing.
Perinatal depression: a review of US legislation and law.
Rhodes, Ann M; Segre, Lisa S
2013-08-01
Accumulating research documenting the prevalence and negative effects of perinatal depression, together with highly publicized tragic critical incidents of suicide and filicide by mothers with postpartum psychosis, have fueled a continuum of legislation. Specialists in perinatal mental health should recognize how their work influences legislative initiatives and penal codes, and take this into consideration when developing perinatal services and research. Yet, without legal expertise, the status of legislative initiatives can be confusing. To address this shortfall, we assembled an interdisciplinary team of academics specializing in law, as well as perinatal mental health, to summarize these issues. This review presents the relevant federal and state legislation and summarizes the criminal codes that governed the court decisions on cases in which a mother committed filicide because of postpartum psychosis. Moreover, the review aims to help researchers and providers who specialize in perinatal depression understand their role in this legal landscape.
An OpenMI Implementation of a Water Resources System using Simple Script Wrappers
NASA Astrophysics Data System (ADS)
Steward, D. R.; Aistrup, J. A.; Kulcsar, L.; Peterson, J. M.; Welch, S. M.; Andresen, D.; Bernard, E. A.; Staggenborg, S. A.; Bulatewicz, T.
2013-12-01
This team has developed an adaptation of the Open Modelling Interface (OpenMI) that utilizes Simple Script Wrappers. Code is made OpenMI compliant through organization within three modules that initialize, perform time steps, and finalize results. A configuration file is prepared that specifies the variables a model expects to receive as input and those it will make available as output. An example is presented for groundwater, economic, and agricultural production models in the High Plains Aquifer region of Kansas. Our models use the programming environments in Scilab and Matlab, along with legacy Fortran code, and our Simple Script Wrappers can also use Python. These models are collectively run within this interdisciplinary framework from initial conditions into the future. It will be shown that by applying constraints to one model, the impact of changes on the water resources system may be assessed.
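The three-module organization described in the abstract (initialize, perform time steps, finalize, with declared inputs and outputs) can be sketched as follows. This is a minimal illustrative wrapper around a toy groundwater-balance update; the function names, configuration keys, and numbers are hypothetical, not the actual OpenMI or GWDSS API.

```python
# Hypothetical sketch of the three-function contract a Simple Script
# Wrapper might expect a model script to expose.  All names and numbers
# are illustrative, not the published OpenMI/Kansas implementation.

state = {}

def initialize(config):
    # 'config' mirrors the configuration file: initial state plus the
    # variables the model will accept as input and expose as output.
    state["head"] = config.get("initial_head", 100.0)  # groundwater head (m)
    state["outputs"] = {"head": state["head"]}

def perform_time_step(inputs):
    # Receive this step's pumping rate from a linked (e.g. economic)
    # model, update the toy water-balance state, and publish outputs.
    pumping = inputs.get("pumping_rate", 0.0)   # m^3/day, invented units
    recharge = inputs.get("recharge", 1.0)
    state["head"] += 0.001 * (recharge - pumping)
    state["outputs"]["head"] = state["head"]
    return state["outputs"]

def finalize():
    # Write results and release resources at the end of the run.
    return {"final_head": state["head"]}

# A linking framework would drive the three phases in order:
initialize({"initial_head": 100.0})
for step in range(365):
    out = perform_time_step({"pumping_rate": 2.0, "recharge": 1.0})
result = finalize()
```

In a real coupled run, the values passed into `perform_time_step` would come from the output variables another wrapped model declared in its own configuration file.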
NASA Technical Reports Server (NTRS)
Bache, George
1993-01-01
Validation of CFD codes is a critical first step in the process of developing CFD design capability. The MSFC Pump Technology Team has recognized the importance of validation and has thus funded several experimental programs designed to obtain CFD-quality validation data. The first data set to become available is for the SSME High Pressure Fuel Turbopump Impeller. LDV data were taken at the impeller inlet (to obtain a reliable inlet boundary condition) and at three radial positions at the impeller discharge. Our CFD code, TASCflow, is used within the propulsion and commercial pump industry as a tool for pump design. The objective of this work, therefore, is to further validate TASCflow for application in pump design. TASCflow was used to predict flow at the impeller discharge for flowrates of 80, 100 and 115 percent of design flow. Comparison to data has been made with encouraging results.
Casemix Funding Optimisation: Working Together to Make the Most of Every Episode.
Uzkuraitis, Carly; Hastings, Karen; Torney, Belinda
2010-10-01
Eastern Health, a large public Victorian Healthcare network, conducted a WIES optimisation audit across the casemix-funded sites for separations in the 2009/2010 financial year. The audit was conducted using existing staff resources and resulted in a significant increase in casemix funding at a minimal cost. The audit showcased the skill set of existing staff and resulted in enormous benefits to the coding and casemix team by demonstrating the value of the combination of skills that makes clinical coders unique. The development of an internal web-based application allowed accurate and timely reporting of the audit results, providing the basis for a restructure of the coding and casemix service, along with approval for additional staffing resources and inclusion of a regular auditing program to focus on the creation of high quality data for research, health services management and financial reimbursement.
A Design for Composing and Extending Vehicle Models
NASA Technical Reports Server (NTRS)
Madden, Michael M.; Neuhaus, Jason R.
2003-01-01
The Systems Development Branch (SDB) at NASA Langley Research Center (LaRC) creates simulation software products for research. Each product consists of an aircraft model with experiment extensions. SDB treats its aircraft models as reusable components, upon which experiments can be built. SDB has evolved aircraft model design with the following goals: 1. Avoid polluting the aircraft model with experiment code. 2. Discourage the copy and tailor method of reuse. The current evolution of that architecture accomplishes these goals by reducing experiment creation to extend and compose. The architecture mechanizes the operational concerns of the model's subsystems and encapsulates them in an interface inherited by all subsystems. Generic operational code exercises the subsystems through the shared interface. An experiment is thus defined by the collection of subsystems that it creates ("compose"). Teams can modify the aircraft subsystems for the experiment using inheritance and polymorphism to create variants ("extend").
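The "extend and compose" pattern the abstract describes can be sketched as follows. Python is used here for brevity (the abstract does not state the implementation language), and all class names and numbers are hypothetical:

```python
# Illustrative sketch (names invented) of the pattern: every subsystem
# inherits one operational interface, generic code drives subsystems
# through that interface, and an experiment is just the set it composes.
from abc import ABC, abstractmethod

class Subsystem(ABC):
    """Shared operational interface inherited by all subsystems."""
    @abstractmethod
    def update(self, dt: float) -> None: ...

class Engine(Subsystem):
    def __init__(self):
        self.thrust = 0.0
    def update(self, dt):
        self.thrust = 1000.0           # baseline aircraft-model behavior

class ExperimentalEngine(Engine):      # "extend": variant via inheritance
    def update(self, dt):
        super().update(dt)
        self.thrust *= 1.1             # experiment-specific modification

class Simulation:
    """Generic operational code: exercises subsystems via the interface."""
    def __init__(self, subsystems):    # "compose": experiment = collection
        self.subsystems = subsystems
    def step(self, dt=0.01):
        for s in self.subsystems:
            s.update(dt)

baseline = Simulation([Engine()])
experiment = Simulation([ExperimentalEngine()])
baseline.step()
experiment.step()
```

The experiment code stays out of the aircraft model: the variant lives in a subclass, and swapping the composed list switches between baseline and experiment without copy-and-tailor.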
Patient Safety Center Organization
2007-06-01
Simulation scenarios by specialty include central line placement (Medicine, Surgery), lumbar puncture (Medicine), thoracentesis (Medicine), shoulder dystocia (Obstetrics & Gynecology), and a mock code for a depressed newborn. ...1) Airway, 2) Team Training (using SimMan), 3) Endoscopy, 4) Shoulder Dystocia, 5) Episiotomy, and 6) Central Line Placement. The second group is
Enhance your team-based qualitative research.
Fernald, Douglas H; Duclos, Christine W
2005-01-01
Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and the tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems arising from differing skills and styles, and from how information and files are managed. Discuss analytical preferences and biases and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.
SDTM - SYSTEM DESIGN TRADEOFF MODEL FOR SPACE STATION FREEDOM RELEASE 1.1
NASA Technical Reports Server (NTRS)
Chamberlin, R. G.
1994-01-01
Although extensive knowledge of space station design exists, the information is widely dispersed. The Space Station Freedom Program (SSFP) needs policies and procedures that ensure the use of consistent design objectives throughout its organizational hierarchy. The System Design Tradeoff Model (SDTM) produces information that can be used for this purpose. SDTM is a mathematical model of a set of possible designs for Space Station Freedom. Using the SDTM program, one can find the particular design which provides specified amounts of resources to Freedom's users at the lowest total (or life cycle) cost. One can also compare alternative design concepts by changing the set of possible designs, while holding the specified user services constant, and then comparing costs. Finally, both costs and user services can be varied simultaneously when comparing different designs. SDTM selects its solution from a set of feasible designs. Feasibility constraints include safety considerations, minimum levels of resources required for station users, budget allocation requirements, time limitations, and Congressional mandates. The total, or life cycle, cost includes all of the U.S. costs of the station: design and development, purchase of hardware and software, assembly, and operations throughout its lifetime. The SDTM development team has identified, for a variety of possible space station designs, the subsystems that produce the resources to be modeled. The team has also developed formulas for the cross consumption of resources by other resources, as functions of the amounts of resources produced. SDTM can find the values of station resources, so that subsystem designers can choose new design concepts that further reduce the station's life cycle cost. The fundamental input to SDTM is a set of formulas that describe the subsystems which make up a reference design. Most of the formulas identify how the resources required by each subsystem depend upon the size of the subsystem. 
Some of the formulas describe how the subsystem costs depend on size. The formulas can be complicated and nonlinear (if nonlinearity is needed to describe how designs change with size). SDTM's outputs are amounts of resources, life-cycle costs, and marginal costs. SDTM will run on IBM PC/XTs, ATs, and 100% compatibles with 640K of RAM and at least 3Mb of fixed-disk storage. A printer which can print in 132-column mode is also required, and a mathematics co-processor chip is highly recommended. This code is written in Turbo C 2.0. However, since the developers used a modified version of the proprietary Vitamin C source code library, the complete source code is not available. The executable is provided, along with all non-proprietary source code. This program was developed in 1989.
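The selection problem SDTM solves, finding the feasible design that supplies specified user-resource levels at minimum life-cycle cost, can be illustrated with a toy brute-force sketch. The candidate designs, resource names, and costs below are invented, and a real SDTM run evaluates nonlinear subsystem formulas rather than a fixed table:

```python
# Toy sketch of cost-minimizing design selection under resource
# constraints.  All designs, resources, and costs are invented.
candidate_designs = [
    {"name": "A", "power_kw": 75, "crew": 4, "life_cycle_cost": 30.0},
    {"name": "B", "power_kw": 90, "crew": 4, "life_cycle_cost": 34.0},
    {"name": "C", "power_kw": 90, "crew": 6, "life_cycle_cost": 41.0},
]

# Minimum levels of resources required for station users (feasibility).
requirements = {"power_kw": 80, "crew": 4}

def feasible(design):
    # A design is feasible if it meets every minimum resource level.
    return all(design[k] >= v for k, v in requirements.items())

# Among feasible designs, choose the one with lowest life-cycle cost.
best = min((d for d in candidate_designs if feasible(d)),
           key=lambda d: d["life_cycle_cost"])
```

Holding `requirements` fixed while changing the candidate set mirrors the abstract's method of comparing alternative design concepts at constant user services.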
Accuracy of clinical coding for procedures in oral and maxillofacial surgery.
Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I
2016-10-01
Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients), and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32% - 33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments will improve accuracy. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Huffhines, Lindsay; Tunno, Angela M; Cho, Bridget; Hambrick, Erin P; Campos, Ilse; Lichty, Brittany; Jackson, Yo
2016-08-01
State social service agency case files are a common mechanism for obtaining information about a child's maltreatment history, yet these documents are often challenging for researchers to access, and then to process in a manner consistent with the requirements of social science research designs. Specifically, accessing and navigating case files is an extensive undertaking, and a task that many researchers have had to maneuver with little guidance. Even after the files are in hand and the research questions and relevant variables have been clarified, case file information about a child's maltreatment exposure can be idiosyncratic, vague, inconsistent, and incomplete, making coding such information into useful variables for statistical analyses difficult. The Modified Maltreatment Classification System (MMCS) is a popular tool used to guide the process, and though comprehensive, this coding system cannot cover all idiosyncrasies found in case files. It is not clear from the literature how researchers implement this system while accounting for issues outside of the purview of the MMCS or that arise during MMCS use. Finally, a large yet reliable file coding team is essential to the process, however, the literature lacks training guidelines and methods for establishing reliability between coders. In an effort to move the field toward a common approach, the purpose of the present discussion is to detail the process used by one large-scale study of child maltreatment, the Studying Pathways to Adjustment and Resilience in Kids (SPARK) project, a longitudinal study of resilience in youth in foster care. The article addresses each phase of case file coding, from accessing case files, to identifying how to measure constructs of interest, to dealing with exceptions to the coding system, to coding variables reliably, to training large teams of coders and monitoring for fidelity. Implications for a comprehensive and efficient approach to case file coding are discussed.
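The abstract stresses establishing reliability between coders before a large team codes case files. One widely used agreement statistic for categorical codes is Cohen's kappa; the sketch below is a generic illustration on invented maltreatment codes, not the SPARK project's actual procedure or thresholds:

```python
# Minimal sketch of Cohen's kappa for two coders assigning categorical
# codes to the same case-file excerpts.  The example codes are invented.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance-expected agreement from each coder's marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["neglect", "physical", "neglect", "emotional", "neglect", "physical"]
b = ["neglect", "physical", "physical", "emotional", "neglect", "neglect"]
kappa = cohens_kappa(a, b)
```

Kappa corrects the raw percent agreement for agreement expected by chance, which matters when a few code categories dominate a file-coding scheme.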
Towards seamless workflows in agile data science
NASA Astrophysics Data System (ADS)
Klump, J. F.; Robertson, J.
2017-12-01
Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and, more recently, cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code, and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data.
At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the geosciences. We use code management that allows researchers to interact with the code through tools like Jupyter Notebooks while data are held in an object store. Our aim is an architecture allowing seamless integration of code development, data management, and data processing in virtual research environments.
The Afghan National Police: Turning a Counterinsurgency Problem into a Solution
2009-12-01
27 “Special Forces groups are organized in small teams of 12 men — a.k.a. Operational Detachment Alpha (ODA). A typical Green Beret’s Team structure...Human Progress, 98. 162 Akbar S. Ahmed and David M. Hart, Islam in Tribal Societies: From the Atlas to the Indus (London; Boston: Routledge & Kegan ...Boston: Routledge & Kegan Paul, 1984. Al Jazeera. “Taliban Issue Code of Conduct.” Al Jazeera English Central/S. Asia (July 27, 2009), http
NASA Astrophysics Data System (ADS)
Kwon, N.; Gentle, J.; Pierce, S. A.
2015-12-01
Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. 
Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.
Subjective evaluation of next-generation video compression algorithms: a case study
NASA Astrophysics Data System (ADS)
De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio
2010-08-01
This paper describes the details and the results of the subjective quality evaluation performed at EPFL, as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) for the definition of the next-generation video coding standard. The performance of 27 coding technologies has been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.
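Subjective campaigns of this kind typically summarize each coding condition by a mean opinion score (MOS) with a confidence interval over observers. The following is a minimal sketch of that computation; the ratings shown are hypothetical illustration values, not data from the EPFL campaign.

```python
# Minimal sketch of MOS with a normal-approximation 95% CI for one test
# condition, as commonly done in subjective video quality evaluations.
# The ratings below are hypothetical, not data from the actual campaign.
import math
import statistics

def mos_with_ci(scores, z=1.96):
    """Return (MOS, half-width of the ~95% confidence interval)."""
    n = len(scores)
    mos = statistics.mean(scores)
    half_width = z * statistics.stdev(scores) / math.sqrt(n)
    return mos, half_width

# Hypothetical 0-10 ratings from ten naive observers for one codec:
ratings = [7, 8, 6, 9, 7, 8, 7, 6, 8, 7]
mos, ci = mos_with_ci(ratings)
print(f"MOS = {mos:.2f} ± {ci:.2f}")  # prints: MOS = 7.30 ± 0.59
```

In practice, standardized protocols also screen observers for outliers before averaging; that step is omitted here for brevity.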
Scanning for safety: an integrated approach to improved bar-code medication administration.
Early, Cynde; Riha, Chris; Martin, Jennifer; Lowdon, Karen W; Harvey, Ellen M
2011-03-01
This is a review of lessons learned in the postimplementation evaluation of a bar-code medication administration technology implemented at a major tertiary-care hospital in 2001. In 2006, with a bar-code medication administration scan compliance rate of 82%, a near-miss sentinel event prompted review of this technology as part of an institutional recommitment to a "culture of safety." Multifaceted problems with bar-code medication administration created an environment of circumventing safeguards as demonstrated by an increase in manual overrides to ensure timely medication administration. A multiprofessional team composed of nursing, pharmacy, human resources, quality, and technical services was formed. Each step in the bar-code medication administration process was reviewed. Technology, process, and educational solutions were identified and implemented systematically. Overall compliance with bar-code medication administration rose from 82% to 97%, which resulted in a calculated cost avoidance of more than $2.8 million during this time frame of the project.
Kassie, Getnet M; Belay, Teklu; Sharma, Anjali; Feleke, Getachew
2018-01-01
Focus on improving access and quality of HIV care and treatment gained acceptance in Ethiopia through the work of the International Training and Education Center for Health. The initiative deployed mobile field-based teams and capacity building teams to mentor health care providers on clinical services and program delivery in three regions, namely Tigray, Amhara, and Afar. Transitioning of the clinical mentoring program (CMP) began in 2012 through capacity building and transfer of skills and knowledge to local health care providers and management. The initiative explored the process of transitioning a CMP on HIV care and treatment to local ownership and documented key lessons learned. A mixed qualitative design was used employing focus group discussions, individual in-depth interviews, and review of secondary data. The participants included regional focal persons, mentors, mentees, multidisciplinary team members, and International Training and Education Center for Health (I-TECH) staff. Three facilities were selected in each region. Data were collected by trained research assistants using customized guides for interviews and with data extraction format. The interviews were recorded and fully transcribed. Open Code software was used for coding and categorizing the data. A total of 16 focus group discussions and 20 individual in-depth interviews were conducted. The critical processes for transitioning a project were: establishment of a mentoring transition task force, development of a roadmap to define steps and directions for implementing the transition, and signing of a memorandum of understanding (MOU) between the respective regional health bureaus and I-TECH Ethiopia to formalize the transition. 
The elements of implementation included mentorship and capacity building, joint mentoring, supportive supervision, review meetings, and independent mentoring supported by facility-based mechanisms: multidisciplinary team meetings, case-based discussions, and catchment area meetings. The process of transitioning the CMP to local ownership involved signing an MOU, training of mentors, and building capacity of mentoring in each region. The experience shed light on how to transition donor-supported work to local country ownership, with key lessons related to strengthening the structures of regional health bureaus, and other facilities addressing critical issues and ensuring continuity of the facility-based activities.
Patient safety incidents in hospice care: observations from interdisciplinary case conferences.
Oliver, Debra Parker; Demiris, George; Wittenberg-Lyles, Elaine; Gage, Ashley; Dewsnap-Dreisinger, Mariah L; Luetkemeyer, Jamie
2013-12-01
In the home hospice environment, issues arise every day presenting challenges to the safety, care, and quality of the dying experience. The literature pertaining to the safety challenges in this environment is limited. The study explored two research questions: 1) What types of patient safety incidents occur in the home hospice setting? 2) How many of these incidents are recognized by the hospice staff and/or the patient or caregiver as a patient safety incident? Video-recordings of hospice interdisciplinary team case conferences were reviewed and coded for patient safety incidents. Patient safety incidents were defined as any event or circumstance that could have resulted or did result in unnecessary harm to the patient or caregiver, or that could have resulted or did result in a negative impact on the quality of the dying experience for the patient. Codes for categories of patient safety incidents were based on the International Classification for Patient Safety. The setting for the study included two rural hospice programs in one Midwestern state in the United States. One hospice had two separately functioning teams; the second hospice had three teams. 54 video-recordings were reviewed and coded. Patient safety incidents were identified that involved issues in clinical process, medications, falls, family or caregiving, procedural problems, documentation, psychosocial issues, administrative challenges and accidents. This study distinguishes categories of patient safety events that occur in home hospice care. Although the scope and definition of potential patient safety incidents in hospice is unique, the events observed in this study are similar to those observed in other settings. This study identifies an operating definition and a potential classification for further research on patient safety incidents in hospice. 
Further research, and consensus building on the definition and classification of patient safety incidents in this setting, is recommended.
Tactile communication, cooperation, and performance: an ethological study of the NBA.
Kraus, Michael W; Huang, Cassey; Keltner, Dacher
2010-10-01
Tactile communication, or physical touch, promotes cooperation between people, communicates distinct emotions, soothes in times of stress, and is used to make inferences of warmth and trust. Based on this conceptual analysis, we predicted that in group competition, physical touch would predict increases in both individual and group performance. In an ethological study, we coded the touch behavior of players from the National Basketball Association (NBA) during the 2008-2009 regular season. Consistent with hypotheses, early season touch predicted greater performance for individuals as well as teams later in the season. Additional analyses confirmed that touch predicted improved performance even after accounting for player status, preseason expectations, and early season performance. Moreover, coded cooperative behaviors between teammates explained the association between touch and team performance. Discussion focused on the contributions touch makes to cooperative groups and the potential implications for other group settings. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Moreira, Maria E; Hernandez, Caleb; Stevens, Allen D; Jones, Seth; Sande, Margaret; Blumen, Jason R; Hopkins, Emily; Bakes, Katherine; Haukoos, Jason S
2015-08-01
The Institute of Medicine has called on the US health care system to identify and reduce medical errors. Unfortunately, medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients when dosing requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national health care priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared with conventional medication administration, in simulated pediatric emergency department (ED) resuscitation scenarios. We performed a prospective, block-randomized, crossover study in which 10 emergency physician and nurse teams managed 2 simulated pediatric arrest scenarios in situ, using either prefilled, color-coded syringes (intervention) or conventional drug administration methods (control). The ED resuscitation room and the intravenous medication port were video recorded during the simulations. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the conventional and color-coded delivery groups was 47 seconds (95% confidence interval [CI] 40 to 53 seconds) and 19 seconds (95% CI 18 to 20 seconds), respectively (difference=27 seconds; 95% CI 21 to 33 seconds). With the conventional method, 118 doses were administered, with 20 critical dosing errors (17%); with the color-coded method, 123 doses were administered, with 0 critical dosing errors (difference=17%; 95% CI 4% to 30%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by emergency physician and nurse teams during simulated pediatric ED resuscitations. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
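The headline result above is a difference in critical dosing error proportions (20/118 conventional vs. 0/123 color-coded). A back-of-envelope check of that difference can be sketched as follows; the simple Wald interval shown is for illustration only, and the study's reported 4% to 30% CI was presumably computed with a different (e.g. score-based or cluster-adjusted) method.

```python
# Difference in two error proportions with a naive Wald 95% CI.
# Inputs mirror the counts reported in the abstract above; the Wald
# interval is an illustrative approximation, not the study's method.
import math

def prop_diff_wald(err1, n1, err2, n2, z=1.96):
    """Return (difference, CI lower bound, CI upper bound)."""
    p1, p2 = err1 / n1, err2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

diff, lo, hi = prop_diff_wald(20, 118, 0, 123)
print(f"difference = {diff:.1%}, naive 95% CI ({lo:.1%}, {hi:.1%})")
```

Note that with zero events in one arm, the Wald standard error understates the uncertainty, which is one reason score-based intervals are preferred in practice.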
McEvoy, Matthew D.; Smalley, Jeremy C.; Nietert, Paul J.; Field, Larry C.; Furse, Cory M.; Blenko, John W.; Cobb, Benjamin G.; Walters, Jenna L.; Pendarvis, Allen; Dalal, Nishita S.; Schaefer, John J.
2012-01-01
Introduction Defining valid, reliable, defensible, and generalizable standards for the evaluation of learner performance is a key issue in assessing both baseline competence and mastery in medical education. However, prior to setting these standards of performance, the reliability of the scores yielded by a grading tool must be assessed. Accordingly, the purpose of this study was to assess the reliability of scores generated from a set of grading checklists used by non-expert raters during simulations of American Heart Association (AHA) MegaCodes. Methods The reliability of scores generated from a detailed set of checklists, when used by four non-expert raters, was tested by grading team leader performance in eight MegaCode scenarios. Videos of the scenarios were reviewed and rated by trained faculty facilitators and by a group of non-expert raters. The videos were reviewed “continuously” and “with pauses.” Two content experts served as the reference standard for grading, and four non-expert raters were used to test the reliability of the checklists. Results Our results demonstrate that non-expert raters are able to produce reliable grades when using the checklists under consideration, demonstrating excellent intra-rater reliability and agreement with a reference standard. The results also demonstrate that non-expert raters can be trained in the proper use of the checklist in a short amount of time, with no discernible learning curve thereafter. Finally, our results show that a single trained rater can achieve reliable scores of team leader performance during AHA MegaCodes when using our checklist in continuous mode, as measures of agreement in total scoring were very strong (Lin’s Concordance Correlation Coefficient = 0.96; Intraclass Correlation Coefficient = 0.97). 
Discussion We have shown that our checklists can yield reliable scores, are appropriate for use by non-expert raters, and can be employed during continuous assessment of team leader performance during the review of a simulated MegaCode. This checklist may be more appropriate for use by Advanced Cardiac Life Support (ACLS) instructors during MegaCode assessments than current tools provided by the AHA. PMID:22863996
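The agreement measure cited above, Lin's concordance correlation coefficient (CCC), penalizes both poor correlation and systematic offset between two raters. A minimal sketch of its computation follows; the two score lists are hypothetical checklist totals, not data from the study.

```python
# Lin's concordance correlation coefficient between two raters:
# CCC = 2*cov(x, y) / (var_x + var_y + (mean_x - mean_y)^2)
# Population (biased) moments are used, as in Lin's original formula.
import statistics

def lins_ccc(x, y):
    n = len(x)
    mx, my = statistics.mean(x), statistics.mean(y)
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

expert = [42, 38, 45, 40, 36, 44, 39, 41]      # hypothetical reference totals
nonexpert = [41, 39, 44, 40, 35, 45, 38, 42]   # hypothetical rater totals
print(f"CCC = {lins_ccc(expert, nonexpert):.3f}")
```

Perfect agreement gives CCC = 1; a constant offset between raters lowers the coefficient even when the Pearson correlation is 1, which is why CCC is preferred over plain correlation for rater agreement.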
Economic Analysis on the Space Transportation Architecture Study (STAS) NASA Team
NASA Technical Reports Server (NTRS)
Shaw, Eric J.
1999-01-01
The National Aeronautics and Space Administration (NASA) performed the Space Transportation Architecture Study (STAS) to provide information to support end-of-the-decade decisions on possible near-term US Government (USG) investments in space transportation. To gain a clearer understanding of the costs and benefits of the broadest range of possible space transportation options, six teams, five from aerospace industry companies and one internal to NASA, were tasked to answer three primary questions: a) whether the Space Shuttle system should be replaced; b) if so, when the replacement should take place and how the transition should be implemented; and c) if not, what upgrade strategy would continue safe and affordable flight of the Space Shuttle beyond 2010. The overall goal of the Study was "to develop investment options to be considered by the Administration for the President's FY2001 budget to meet NASA's future human space flight requirements with significant reductions in costs." This emphasis on government investment, coupled with the participation by commercial firms, required an unprecedented level of economic analysis of costs and benefits from both industry and government viewpoints. This paper will discuss the economic and market models developed by the in-house NASA Team to analyze space transportation architectures, the results of those analyses, and how those results were reflected in the conclusions and recommendations of the STAS NASA Team. Copyright 1999 by the American Institute of Aeronautics and Astronautics, Inc. No copyright is asserted in the United States under Title 17, U.S. Code. The U.S. Government has a royalty-free license to exercise all rights under the copyright claimed herein for Governmental purposes. All other rights are reserved by the copyright owner.
Stevenson, K; Baker, R; Farooqi, A; Sorrie, R; Khunti, K
2001-02-01
In quality improvement activities such as audit, some general practices succeed in improving care and some do not. With audit of care likely to be one of the major tools in clinical governance, it would be helpful to establish what features of primary health care teams are associated with successful audit in general practice. The aim of the present study was to identify those features of primary health care teams that were associated with successful quality improvement during systematic audit of diabetes care. Semi-structured tape-recorded interviews were carried out with lead GPs and practice nurses in 18 general practices in Leicestershire that had the opportunity to improve their care and had completed two data collections in a multipractice audit of diabetes care. The interviewees were asked to describe their practice's approach to audit and the transcripts were coded for common features and judged for strength of feeling by blinded independent raters. Features common to practices that had, and those that had not, managed to improve diabetes care were identified. Six features were identified reliably in the transcripts by blinded independent raters. Four were significantly associated with the successful improvement of care. Success was more likely in teams in which: the GP or nurse felt personally involved in the audit; they perceived their teamwork as good; they had recognized the need for systematic plans to address obstacles to quality improvement; and their teams had a positive attitude to continued monitoring of care. A positive attitude to audit and a personal interest in the disease were not associated with improvement in care. Success in improving diabetes care is associated with certain organizational features of primary health care teams. Experimental studies are required to determine whether the development of teamwork enables practice teams to identify and overcome systematically the obstacles to improved quality of patient care that face them.
2013 R&D 100 Award: “Miniapps” Bolster High Performance Computing
Belak, Jim; Richards, David
2018-06-12
Two Livermore computer scientists served on a Sandia National Laboratories-led team that developed Mantevo Suite 1.0, the first integrated suite of small software programs, also called "miniapps," to be made available to the high performance computing (HPC) community. These miniapps facilitate the development of new HPC systems and the applications that run on them. Miniapps (miniature applications) serve as stripped down surrogates for complex, full-scale applications that can require a great deal of time and effort to port to a new HPC system because they often consist of hundreds of thousands of lines of code. Each miniapp is a prototype that contains some or all of the essentials of the real application but with many fewer lines of code, making it more versatile for experimentation. This allows researchers to more rapidly explore options and optimize system design, greatly improving the chances the full-scale application will perform successfully. These miniapps have become essential tools for exploring complex design spaces because they can reliably predict the performance of full applications.
The value of psychosocial group activity in nursing education: A qualitative analysis.
Choi, Yun-Jung
2018-05-01
Nursing faculty often struggle to find effective teaching strategies for nursing students that integrate group work into nursing students' learning activities. This study was conducted to evaluate students' experiences in a psychiatric and mental health nursing course using psychosocial group activities to develop therapeutic communication and interpersonal relationship skills, as well as to introduce psychosocial nursing interventions. A qualitative research design was used. The study explored nursing students' experiences of the course in accordance with the inductive, interpretative, and constructive approaches via focus group interviews. Participants were 17 undergraduate nursing students who registered for a psychiatric and mental health nursing course. The collected data were analyzed by qualitative content analysis. The analysis resulted in 28 codes, 14 interpretive codes, 4 themes (developing interpersonal relationships, learning problem-solving skills, practicing cooperation and altruism, and getting insight and healing), and a core theme (interdependent growth in self-confidence). The psychosocial group activity provided constructive opportunities for the students to work independently and interdependently as healthcare team members through reflective learning experiences. Copyright © 2018 Elsevier Ltd. All rights reserved.
SimZones: An Organizational Innovation for Simulation Programs and Centers.
Roussin, Christopher J; Weinstock, Peter
2017-08-01
The complexity and volume of simulation-based learning programs have increased dramatically over the last decade, presenting several major challenges for those who lead and manage simulation programs and centers. The authors present five major issues affecting the organization of simulation programs: (1) supporting both single- and double-loop learning experiences; (2) managing the training of simulation teaching faculty; (3) optimizing the participant mix, including individuals, professional groups, teams, and other role-players, to ensure learning; (4) balancing in situ, node-based, and center-based simulation delivery; and (5) organizing simulation research and measuring value. They then introduce the SimZones innovation, a system of organization for simulation-based learning, and explain how it can alleviate the problems associated with these five issues.Simulations are divided into four zones (Zones 0-3). Zone 0 simulations include autofeedback exercises typically practiced by solitary learners, often using virtual simulation technology. Zone 1 simulations include hands-on instruction of foundational clinical skills. Zone 2 simulations include acute situational instruction, such as clinical mock codes. Zone 3 simulations involve authentic, native teams of participants and facilitate team and system development.The authors also discuss the translation of debriefing methods from Zone 3 simulations to real patient care settings (Zone 4), and they illustrate how the SimZones approach can enable the development of longitudinal learning systems in both teaching and nonteaching hospitals. The SimZones approach was initially developed in the context of the Boston Children's Hospital Simulator Program, which the authors use to illustrate this innovation in action.
[Economic impact of consultation-liaison psychiatry in a French University Hospital Centre].
Yrondi, A; Petiot, D; Arbus, C; Schmitt, L
2016-02-01
In times of fiscal restraint for health structures, apart from the clinical input, it seems important to discuss the economic impact of liaison psychiatry. There are only a few studies on the economic added value provided by a liaison psychiatry team. In addition to this, only a few psychiatric pathologies are coded as they should be, hence we make the assumption of an additional development provided by a specialised team. Over a short period of 4 months, in three departments of the Toulouse University Hospital Centre, the added value to the general pricing system of liaison psychiatry was studied. The population was represented by all the consecutive requests for consultations from patients over 18 years old, men and women, hospitalised at that time. These three departments frequently request consultations with the psychiatry liaison team. They set a diagnostic, and if this is associated with a higher Homogeneous Group of Patients (HGP), it provides added value. Fifty-two patients benefited from a psychiatric consultation over 4 months. The results highlight a development of € 8630.43 for the traumatology department, € 3325.03 for the internal medicine department, and € 513.61 for the haematology department over the study period. The overall development over this period was € 12,469.07. To our knowledge, this approach is one of the first in France to highlight an economic impact of the intervention of liaison psychiatry in the claiming departments. Copyright © 2014 L’Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
EMPHASIS(TM)/Nevada UTDEM User Guide Version 2.1.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, C. David; Pasik, Michael F.; Pointon, Timothy D.
The Unstructured Time-Domain ElectroMagnetics (UTDEM) portion of the EMPHASIS suite solves Maxwell's equations using finite-element techniques on unstructured meshes. This document provides user-specific information to facilitate the use of the code for applications of interest. Acknowledgement The authors would like to thank all of those individuals who have helped to bring EMPHASIS/Nevada to the point it is today, including Bill Bohnhoff, Rich Drake, and all of the NEVADA code team.
Ramsingh, Brigit
2014-07-01
Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an International Codex Alimentarius (or 'food code') which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with the task of developing microbial hygiene standards, although it found itself embroiled in debate with the WHO over the nature these standards should take. The WHO was increasingly relying upon the input of biometricians and especially the International Commission on Microbial Specifications for Foods (ICMSF) which had developed statistical sampling plans for determining the microbial counts in the final end products. The CCFH, however, was initially more focused on a qualitative approach which looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history which could shed light upon current debates and efforts in international food safety management systems and approaches.
Wranik, W Dominika; Haydt, Susan M; Katz, Alan; Levy, Adrian R; Korchagina, Maryna; Edwards, Jeanette M; Bower, Ian
2017-05-15
Reliance on interdisciplinary teams in the delivery of primary care is on the rise. Funding bodies strive to design financial environments that support collaboration between providers. At present, the design of financial arrangements has been fragmented and not based on evidence. The root of the problem is a lack of systematic evidence demonstrating the superiority of any particular financial arrangement, or a solid understanding of options. In this study we develop a framework for the conceptualization and analysis of financial arrangements in interdisciplinary primary care teams. We use qualitative data from three sources: (i) interviews with 19 primary care decision makers representing 215 clinics in three Canadian provinces, (ii) a research roundtable with 14 primary care decision makers and/or researchers, and (iii) policy documents. Transcripts from interviews and the roundtable were coded thematically and a framework synthesis approach was applied. Our conceptual framework differentiates between team level funding and provider level remuneration, and characterizes the interplay and consonance between them. Particularly the notions of hierarchy, segregation, and dependence of provider incomes, and the link between funding and team activities are introduced as new clarifying concepts, and their implications explored. The framework is applied to the analysis of collaboration incentives, which appear strongest when provider incomes are interdependent, funding is linked to the team as a whole, and accountability does not have multiple lines. Emergent implementation issues discussed by respondents include: (i) centrality of budget negotiations; (ii) approaches to patient rostering; (iii) unclear funding sources for space and equipment; and (iv) challenges with community engagement. The creation of patient rosters is perceived as a surprisingly contentious issue, and the challenges of funding for space and equipment remain unresolved. 
The development and application of a conceptual framework is an important step to the systematic study of the best performing financial models in the context of interdisciplinary primary care. The identification of optimal financial arrangements must be contextualized in terms of feasibility and the implementation environment. In general, financial hierarchy, both overt and covert, is considered a barrier to collaboration.
Shared leadership in multiteam systems: how cockpit and cabin crews lead each other to safety.
Bienefeld, Nadine; Grote, Gudela
2014-03-01
In this study, we aimed to examine the effect of shared leadership within and across teams in multiteam systems (MTS) on team goal attainment and MTS success. Due to different and sometimes competing goals in MTS, leadership is required within and across teams. Shared leadership, the effectiveness of which has been proven in single teams, may be an effective strategy to cope with these challenges. We observed leadership in 84 cockpit and cabin crews that collaborated in the form of six-member MTS aircrews (N = 504) during standardized simulations of an in-flight emergency. Leadership was coded by three trained observers using a structured observation system. Team goal attainment was assessed by two subject matter experts using a checklist-based rating tool. MTS goal attainment was measured objectively on the basis of the outcome of the simulated flights. In successful MTS aircrews, formal leaders and team members displayed significantly more leadership behaviors, shared leadership by pursers and flight attendants predicted team goal attainment, and pursers' shared leadership across team boundaries predicted cross-team goal attainment. In cockpit crews, leadership was not shared and captains' vertical leadership predicted team goal attainment regardless of MTS success. The results indicate that in general, shared leadership positively relates to team goal attainment and MTS success, whereby boundary spanners' dual leadership role is key. Leadership training in MTS should address shared rather than merely vertical forms of leadership, and component teams in MTS should be trained together with emphasis on boundary spanners' dual leadership role. Furthermore, team members should be empowered to engage in leadership processes when required.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Cyrus; Larsen, Matt; Brugger, Eric
Strawman is a system designed to explore the in situ visualization and analysis needs of simulation code teams running multi-physics calculations on many-core HPC architectures. It provides rendering pipelines that can leverage both many-core CPUs and GPUs to render images of simulation meshes.
75 FR 26879 - Temporary Organization To Facilitate a Strategic Partnership With the Republic of Iraq
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-12
... Teams; (c) support and create a sustainable Rule of Law mission in Iraq, including the Police... States Code, unless sooner terminated by the Secretary. (Presidential Sig.) THE WHITE HOUSE, May 7, 2010...
Gawel, Marcie; Emerson, Beth; Giuliano, John S; Rosenberg, Alana; Minges, Karl E; Feder, Shelli; Violano, Pina; Morrell, Patricia; Petersen, Judy; Christison-Lagay, Emily; Auerbach, Marc
2018-02-01
Most injured children initially present to a community hospital, and many will require transfer to a regional pediatric trauma center. The purpose of this study was 1) to explore multidisciplinary providers' experiences with the process of transferring injured children and 2) to describe proposed ideas for process improvement. This qualitative study involved 26 semistructured interviews. Subjects were recruited from 6 community hospital emergency departments and the trauma and transport teams of a level I pediatric trauma center in New Haven, Conn. Participants (n = 34) included interprofessional providers from sending facilities, transport teams, and receiving facilities. Using the constant comparative method, a multidisciplinary team coded transcripts and collectively refined codes to generate recurrent themes across interviews until theoretical saturation was achieved. Participants reported that the transfer process for injured children is complex, stressful, and necessitates collaboration. The transfer process was perceived to involve numerous interrelated components, including professions, disciplines, and institutions. The 5 themes identified as areas to improve this transfer process included 1) Creation of a unified standard operating procedure that crosses institutions/teams, 2) Enhancing 'shared sense making' of all providers, 3) Improving provider confidence, expertise, and skills in caring for pediatric trauma transfer cases, 4) Addressing organization and environmental factors that may impede/delay transfer, and 5) Fostering institutional and personal relationships. Efforts to improve the transfer process for injured children should be guided by the experiences of and input from multidisciplinary frontline emergency providers.
NASA Technical Reports Server (NTRS)
Levine, Art
2002-01-01
In the early '70s the management at the American Stock Exchange wanted a set of automated displays installed on the trading floor. The purpose of the displays was to announce to the public all changes related to the trading of equities. I had exactly three months to get the work done. Because of local building codes, I was told that the displays that met the specifications would not be available until nine months after the order was placed. This was not acceptable to the Exchange's management. The project team was in a quandary. I called a meeting to discuss the situation and develop a report explaining why we needed more time. Jokingly, I suggested, 'Why not use picket signs?' The Exchange had just gone through some painful labor negotiations. To anyone who had been involved in those negotiations, the thought of a picket sign should have sounded, I thought, like gallows humor. To my surprise, the rest of the project team took the idea seriously.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pointer, William David; Shaver, Dillon; Liu, Yang
The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance and safety challenges in the development and deployment of advanced reactor technology. NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to identification and mitigation of sources of steam generator flow-induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes, integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Politecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed phase mixture modeling capabilities of the open source CFD code Nek5000.
Evans, M Blair; Allan, Veronica; Erickson, Karl; Martin, Luc J; Budziszewski, Ross; Côté, Jean
2017-02-01
Models of sport development often support the assumption that young athletes' psychosocial experiences differ as a result of seemingly minor variations in how their sport activities are designed (eg, participating in team or individual sport; sampling many sports or specialising at an early age). This review was conducted to systematically search sport literature and explore how the design of sport activities relates to psychosocial outcomes. Systematic search, followed by data extraction and synthesis. The Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines were applied and a coding sheet was used to extract article information and code for risk of bias. Academic databases and manual search of peer-reviewed journals. Search criteria determined eligibility primarily based on the sample (eg, ages 7 through 17 years) and study design (eg, measured psychosocial constructs). 35 studies were located and were classified within three categories: (1) sport types, (2) sport settings, and (3) individual patterns of sport involvement. These studies represented a wide range of scores when assessed for risk of bias and involved an array of psychosocial constructs, with the most prevalent investigations predicting outcomes such as youth development, self-esteem and depression by comparing (1) team or individual sport participants and (2) youth with varying amounts of sport involvement. As variations in sport activities impact youth sport experiences, it is vital for researchers to carefully describe and study these factors, while practitioners may use the current findings when designing youth sport programmes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
An empirically based conceptual framework for fostering meaningful patient engagement in research.
Hamilton, Clayon B; Hoens, Alison M; Backman, Catherine L; McKinnon, Annette M; McQuitty, Shanon; English, Kelly; Li, Linda C
2018-02-01
Patient engagement in research (PEIR) is promoted to improve the relevance and quality of health research, but has little conceptualization derived from empirical data. To address this issue, we sought to develop an empirically based conceptual framework for meaningful PEIR founded on a patient perspective. We conducted a qualitative secondary analysis of in-depth interviews with 18 patient research partners from a research centre-affiliated patient advisory board. Data analysis involved three phases: identifying the themes, developing a framework and confirming the framework. We coded and organized the data, and abstracted, illustrated, described and explored the emergent themes using thematic analysis. Directed content analysis was conducted to derive concepts from 18 publications related to PEIR to supplement, confirm or refute, and extend the emergent conceptual framework. The framework was reviewed by four patient research partners on our research team. Participants' experiences of working with researchers were generally positive. Eight themes emerged: procedural requirements, convenience, contributions, support, team interaction, research environment, feel valued and benefits. These themes were interconnected and formed a conceptual framework to explain the phenomenon of meaningful PEIR from a patient perspective. This framework, the PEIR Framework, was endorsed by the patient research partners on our team. The PEIR Framework provides guidance on aspects of PEIR to address for meaningful PEIR. It could be particularly useful when patient-researcher partnerships are led by researchers with little experience of engaging patients in research. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.
Hunt, Elizabeth A; Walker, Allen R; Shaffner, Donald H; Miller, Marlene R; Pronovost, Peter J
2008-01-01
Outcomes of in-hospital pediatric cardiopulmonary arrest are dismal. Recent data suggest that the quality of basic and advanced life support delivered to adults is low and contributes to poor outcomes, but few data regarding pediatric events have been reported. The objectives of this study were to (1) measure the median elapsed time to initiate important resuscitation maneuvers in simulated pediatric medical emergencies (ie, "mock codes") and (2) identify the types and frequency of errors committed during pediatric mock codes. A prospective, observational study was conducted of 34 consecutive hospital-based mock codes. A mannequin or computerized simulator was used to enact unannounced, simulated crisis situations involving children with respiratory distress or insufficiency, respiratory arrest, hemodynamic instability, and/or cardiopulmonary arrest. Assessment included time elapsed to initiation of specific resuscitation maneuvers and deviation from American Heart Association guidelines. Among the 34 mock codes, the median time to assessment of airway and breathing was 1.3 minutes, to administration of oxygen was 2.0 minutes, to assessment of circulation was 4.0 minutes, to arrival of any physician was 3.0 minutes, and to arrival of first member of code team was 6.0 minutes. Among cardiopulmonary arrest scenarios, elapsed time to initiation of compressions was 1.5 minutes and to request for defibrillator was 4.3 minutes. In 75% of mock codes, the team deviated from American Heart Association pediatric basic life support protocols, and in 100% of mock codes there was a communication error. Alarming delays and deviations occur in the major components of pediatric resuscitation. Future educational and organizational interventions should focus on improving the quality of care delivered during the first 5 minutes of resuscitation. 
Simulation of pediatric crises can identify targets for educational intervention to improve pediatric cardiopulmonary resuscitation and, ideally, outcomes.
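The median elapsed-time measures reported in the mock-code study can be computed with a small sketch. The event names and timings below are hypothetical, for illustration only; they are not data from the study.

```python
from statistics import median

def median_times(mock_codes, maneuver):
    """Median elapsed time (minutes) to a given resuscitation maneuver
    across mock codes, ignoring codes where the maneuver never occurred."""
    times = [c[maneuver] for c in mock_codes if maneuver in c]
    return median(times) if times else None

# Hypothetical elapsed times (minutes) for three simulated codes.
codes = [
    {"airway_assessed": 1.0, "oxygen_given": 1.5, "compressions": 1.2},
    {"airway_assessed": 1.3, "oxygen_given": 2.0},
    {"airway_assessed": 2.1, "oxygen_given": 2.4, "compressions": 1.8},
]

print(median_times(codes, "airway_assessed"))  # 1.3
```

Skipping codes that lack a maneuver mirrors the study design, where not every scenario (e.g. respiratory distress only) calls for compressions or a defibrillator.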
Impact and implications of disruptive behavior in the perioperative arena.
Rosenstein, Alan H; O'Daniel, Michelle
2006-07-01
There is a growing concern about the role of human factor issues and their effect on patient safety and clinical outcomes of care. Problems with disruptive behaviors negatively affect communication flow and team dynamics, which can lead to adverse events and poor quality outcomes. A 25-question survey tool was used to assess the status and significance of disruptive behaviors around perioperative services in a large metropolitan academic medical center. Results were analyzed and compared with those from a national databank to identify areas of concern and opportunities for improvement. Disruptive behaviors were a common occurrence in the perioperative setting. These types of behaviors were most prevalent in attending surgeons. Disruptive behaviors increased levels of stress and frustration, which impaired concentration, impeded communication flow, and adversely affected staff relationships and team collaboration. These events were perceived to increase the likelihood of medical errors and adverse events and to compromise patient safety and quality of care. Disruptive behaviors in the perioperative arena have a significant impact on team dynamics and communication flow, which can have a negative impact on patient care. Organizations need to recognize the prevalence and significance of disruptive behaviors and develop policies and processes to address the issue. Key areas of focus include recognition and awareness, organizational and cultural commitment, implementation of appropriate codes of behavior policies and procedures, and provision of education and training programs to discuss contributing factors and tools to build effective communication and team collaboration skills.
Global positioning systems (GPS) and microtechnology sensors in team sports: a systematic review.
Cummins, Cloe; Orr, Rhonda; O'Connor, Helen; West, Cameron
2013-10-01
Use of Global positioning system (GPS) technology in team sport permits measurement of player position, velocity, and movement patterns. GPS provides scope for better understanding of the specific and positional physiological demands of team sport and can be used to design training programs that adequately prepare athletes for competition with the aim of optimizing on-field performance. The objective of this study was to conduct a systematic review of the depth and scope of reported GPS and microtechnology measures used within individual sports in order to present the contemporary and emerging themes of GPS application within team sports. A systematic review of the application of GPS technology in team sports was conducted. We systematically searched electronic databases from earliest record to June 2012. Permutations of key words included GPS; male and female; age 12-50 years; able-bodied; and recreational to elite competitive team sports. The 35 manuscripts meeting the eligibility criteria included 1,276 participants (age 11.2-31.5 years; 95 % males; 53.8 % elite adult athletes). The majority of manuscripts reported on GPS use in various football codes: Australian football league (AFL; n = 8), soccer (n = 7), rugby union (n = 6), and rugby league (n = 6), with limited representation in other team sports: cricket (n = 3), hockey (n = 3), lacrosse (n = 1), and netball (n = 1). Of the included manuscripts, 34 (97 %) detailed work rate patterns such as distance, relative distance, speed, and accelerations, with only five (14.3 %) reporting on impact variables. Activity profiles characterizing positional play and competitive levels were also described. Work rate patterns were typically categorized into six speed zones, ranging from 0 to 36.0 km·h⁻¹, with descriptors ranging from walking to sprinting used to identify the type of activity mainly performed in each zone. 
With the exception of cricket, no standardized speed zones or definitions were observed within or between sports. Furthermore, speed zone criteria often varied widely within (e.g. zone 3 of AFL ranged from 7 to 16 km·h⁻¹) and between sports (e.g. zone 3 of soccer ranged from 3.0 to <13 km·h⁻¹). Activity descriptors for a zone also varied widely between sports (e.g. zone 4 definitions ranged from jog, run, high velocity, to high-intensity run). Most manuscripts focused on the demands of higher intensity efforts (running and sprint) required by players. Body loads and impacts, also summarized into six zones, showed small variations in descriptions, with zone criteria based upon grading systems provided by GPS manufacturers. This systematic review highlights that GPS technology has been used more often across a range of football codes than across other team sports. Work rate pattern activities are most often reported, whilst impact data, which require the use of microtechnology sensors such as accelerometers, are least reported. There is a lack of consistency in the definition of speed zones and activity descriptors, both within and across team sports, thus underscoring the difficulties encountered in meaningful comparisons of the physiological demands both within and between team sports. A consensus on definitions of speed zones and activity descriptors within sports would facilitate direct comparison of the demands within the same sport, and would also support meta-analysis from systematic review. Standardization of speed zones between sports may not be feasible due to disparities in work rate pattern activities.
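The six-zone classification discussed in the review can be sketched in a few lines. Because the review found no standardized scheme, the zone boundaries and labels below are assumptions chosen for illustration, not any sport's actual criteria.

```python
from bisect import bisect_right

# Illustrative upper bounds (km/h) for six speed zones; these thresholds
# are an assumption, since no standard exists within or between sports.
ZONE_UPPER_BOUNDS = [6.0, 12.0, 18.0, 24.0, 30.0, 36.0]
ZONE_LABELS = ["walk", "jog", "run", "fast run", "very fast run", "sprint"]

def speed_zone(speed_kmh):
    """Map an instantaneous GPS speed to a (zone number, descriptor) pair."""
    idx = bisect_right(ZONE_UPPER_BOUNDS, speed_kmh)
    idx = min(idx, len(ZONE_LABELS) - 1)  # clamp speeds above the top bound
    return idx + 1, ZONE_LABELS[idx]

print(speed_zone(25.0))  # (5, 'very fast run')
```

Binning velocity samples this way is how work rate patterns (distance per zone, time per zone) are typically derived from raw GPS traces.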
Towards Accurate Application Characterization for Exascale (APEX)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Simon David
Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines and many large capability resources, including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.
Whitehair, Leeann; Hurley, John; Provost, Steve
2018-06-12
To explore how team processes support nursing teams in hospital units during everyday work. Due to their close proximity to patients, nurses are central to the process of maintaining patient safety. Globally, changes in models of care delivery by nurses, inclusive of team nursing, are being considered. This qualitative study used purposive sampling in a single hospital, and participants were nurses employed to work on a paediatric unit. Data were collected using non-participant observation. Thematic analysis was used to analyse and code data to create themes. Three clear themes emerged. Theme 1, "We are a close knit team" (behaviours building a successful team), outlines expectations regarding how members are to behave when establishing, nurturing and managing a team. Theme 2, "Onto it" (ways of interacting with each other), identifies the expected pattern of relating within the team which contributes to shared understanding and actions. Theme 3, "No point in second guessing" (maintaining a global view of the unit), focuses on the processes for monitoring and reporting signals that team performance is on course or breaking down, and includes accepting responsibility to lead the team and team members having a widespread sensitivity to what needs to happen. Essential to successful teamwork is the interplay and mutuality of team members and team leaders. Leadership behaviours exhibited in this study provide useful insights into how informal and shared or distributed leadership of teams may be achieved. Without buy-in from team members, teams may not achieve successful desired outcomes. It is not sufficient for teams to rely on current successful outcomes, as they need to be on the look-out for new ways to ensure that they can anticipate possible risks or threats to the team before harm is done. This article is protected by copyright. All rights reserved.
Gabbett, Tim J
2014-07-01
A limitation of most rugby league time-motion studies is that researchers have examined the demands of single teams, with no investigations of all teams in an entire competition. This study investigated the activity profiles and technical and tactical performances of successful and less-successful teams throughout an entire rugby league competition. In total, 185 rugby league players representing 11 teams from a semiprofessional competition participated in this study. Global positioning system analysis was completed across the entire season. Video footage from individual matches was also coded via notational analysis for technical and tactical performance of teams. Trivial to small differences were found among Top 4, Middle 4, and Bottom 4 teams for absolute and relative total distances covered and distances covered at low speeds. Small, nonsignificant differences (P = .054, ES = 0.31) were found between groups for the distance covered sprinting, with Top 4 teams covering greater sprinting distances than Bottom 4 teams. Top 4 teams made more meters in attack and conceded fewer meters in defense than Bottom 4 teams. Bottom 4 teams had a greater percentage of slow play-the-balls in defense than Top 4 teams (74.8% ± 7.3% vs 67.2% ± 8.3%). Middle 4 teams showed the greatest reduction in high-speed running from the first to the second half (-20.4%), while Bottom 4 teams completed 14.3% more high-speed running in the second half than in the first half. These findings demonstrate that a combination of activity profiles and technical and tactical performance are associated with playing success in semiprofessional rugby league players.
ERIC Educational Resources Information Center
Caudle, Melissa
1994-01-01
School crises may be categorized as emergency situations, human-made crises, natural events, medical emergencies, and mechanical crises. Central to any successful crisis-management plan are onsite and district-level crisis response teams. Plans should specify staff responsibilities; provide for communication codes, devices, and procedures;…
CrossTalk: The Journal of Defense Software Engineering. Volume 25, Number 4, July/August 2012
2012-08-01
understand the interface between various code components. For example, consider a situation in which handwritten code produced by one team generates an... conclusively say that a division by zero will not occur. The abstract interpretation concept can be generalized as a tool set that can be used to determine... word what makes a good manager, I would say decisiveness. You can use the fanciest computers to gather the numbers, but in the end you have to set
2016-06-01
managed by teams organized by the four-digit Federal Supply Classification (FSC) code, which classifies a part by type of materiel. When the consumable... Command [NAVSUP], 2015a). The first four digits of the NSN comprise the FSC code, which categorizes the item being ordered; in the present example it... Table 3, requisitions are divided into three priority bins—high (TP 1), medium (TP 2), and low (TP 3). A mission-critical requirement almost
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.
Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and to send data to a JavaScript-based web client.
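The JSON-over-XML design choice described in the abstract can be illustrated with a minimal round-trip sketch. The field names and values below are assumptions for illustration only, not the platform's actual schema.

```python
import json

# A hypothetical JSON record for a single calculation result, in the
# spirit of structures that "support computational chemistry calculations".
calculation = {
    "molecule": {
        "elements": ["O", "H", "H"],
        "coords": [[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]],
    },
    "code": "NWChem",
    "properties": {"totalEnergy": -76.026, "units": "Hartree"},
}

# Serialize for storage/transport, then parse it back, as a JavaScript
# web client would after receiving the payload from a REST API.
payload = json.dumps(calculation)
restored = json.loads(payload)
print(restored["properties"]["totalEnergy"])  # -76.026
```

The appeal of JSON here is exactly this symmetry: the same structure parses natively in Python on the server and in JavaScript in the browser, with no schema-mapping layer in between.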
Creation of the Naturalistic Engagement in Secondary Tasks (NEST) distracted driving dataset.
Owens, Justin M; Angell, Linda; Hankey, Jonathan M; Foley, James; Ebe, Kazutoshi
2015-09-01
Distracted driving has become a topic of critical importance to driving safety research over the past several decades. Naturalistic driving data offer a unique opportunity to study how drivers engage with secondary tasks in real-world driving; however, the complexities involved with identifying and coding relevant epochs of naturalistic data have limited its accessibility to the general research community. This project was developed to help address this problem by creating an accessible dataset of driver behavior and situational factors observed during distraction-related safety-critical events and baseline driving epochs, using the Strategic Highway Research Program 2 (SHRP2) naturalistic dataset. The new NEST (Naturalistic Engagement in Secondary Tasks) dataset was created using crashes and near-crashes from the SHRP2 dataset that were identified as including secondary task engagement as a potential contributing factor. Data coding included frame-by-frame video analysis of secondary task and hands-on-wheel activity, as well as summary event information. In addition, information about each secondary task engagement within the trip prior to the crash/near-crash was coded at a higher level. Data were also coded for four baseline epochs and trips per safety-critical event. 1,180 events and baseline epochs were coded, and a dataset was constructed. The project team is currently working to determine the most useful way to allow broad public access to the dataset. We anticipate that the NEST dataset will be extraordinarily useful in allowing qualified researchers access to timely, real-world data concerning how drivers interact with secondary tasks during safety-critical events and baseline driving. The coded dataset developed for this project will allow future researchers to have access to detailed data on driver secondary task engagement in the real world. 
It will be useful for standalone research, as well as for integration with additional SHRP2 data to enable the conduct of more complex research. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
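The frame-by-frame secondary-task coding described above can be summarized programmatically. The task labels and frame codes in this sketch are illustrative assumptions, not NEST's actual coding scheme.

```python
from collections import Counter

def task_summary(frame_codes):
    """Collapse frame-by-frame secondary-task codes for one event into
    counts per task, excluding frames with no engagement."""
    return Counter(c for c in frame_codes if c != "none")

# Hypothetical per-frame codes for one near-crash event (one code per
# video frame); "none" marks frames with no secondary-task engagement.
frames = ["none", "phone_texting", "phone_texting", "none", "eating",
          "phone_texting", "none"]

print(task_summary(frames))  # Counter({'phone_texting': 3, 'eating': 1})
```

Summaries like this are what turn raw frame-level video coding into the event-level engagement variables that distraction analyses compare against baseline epochs.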
Software Engineering for Scientific Computer Simulations
NASA Astrophysics Data System (ADS)
Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.
2004-11-01
Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
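The strong-scaling claim above can be framed as a parallel-efficiency calculation: for a fixed problem size, efficiency is the measured speedup divided by the increase in processor count. The timings in this sketch are hypothetical, not measurements from the TS-AWP runs.

```python
def strong_scaling_efficiency(base_procs, base_time, procs, time):
    """Parallel efficiency of a strong-scaling run relative to a baseline
    run of the SAME problem: speedup / (processor-count ratio)."""
    speedup = base_time / time
    return speedup / (procs / base_procs)

# Hypothetical wall-clock times: 1000 s on 4,096 cores vs 130 s on 40,960.
eff = strong_scaling_efficiency(4096, 1000.0, 40960, 130.0)
print(round(eff, 3))  # 0.769
```

An efficiency near 1.0 means the code converts extra processors almost entirely into reduced runtime; values well below 1.0 signal communication or load-imbalance overheads growing with scale.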
Issues in Developing and Evaluating a Culturally Tailored Internet Cancer Support Group
Im, Eun-Ok; Ji, Xiaopeng; Zhang, Jingwen; Kim, Sangmi; Lee, Yaelim; Chee, Eunice; Chee, Wonshik; Tsai, Hsiu-Min; Nishigaki, Masakazu; Yeo, Seon Ae; Schapira, Marilyn; Mao, Jun James
2016-01-01
The purpose of this paper is to explore practical issues in developing and implementing a culturally tailored Internet cancer support group for a group of ethnic minority cancer patients: Asian American cancer patients. Throughout the research process of the original study testing the Internet cancer support group, the research team made written records of practical issues and plausible rationales for the issues. Weekly group discussion among research team members was conducted, and the discussion records were evaluated and analyzed using a content analysis (with individual words as the unit of analysis). The codes from the analysis process were categorized into idea themes, through which the issues were extracted. The issues included: (a) difficulties in using multiple languages; (b) collaboration with the IT department and technical challenges; (c) difficulties in recruitment; (d) difficulties in retention; (e) optimal timing; and (f) characteristics of the users. Based on the findings, we suggest that researchers plan a workable translation process, check technical needs in advance, use multiple strategies to recruit and retain research participants, plan the right time for data collection, and consider characteristics of the users in the study design. PMID:27379523
Joyce, Emmeline; Tai, Sara; Gebbia, Piersanti; Mansell, Warren
2017-05-01
Background Psychological interventions for bipolar disorders typically produce mixed outcomes and modest effects. The need for a more effective intervention prompted the development of a new cognitive behavioural therapy based on an integrative cognitive model ('Think Effectively About Mood Swings' [TEAMS] therapy). Unlike previous interventions, TEAMS addresses current symptoms and comorbidities and helps clients achieve long-term goals. A pilot randomized controlled trial (the TEAMS trial) of the therapy has recently concluded. This study explored participants' experiences of TEAMS, recommendations for improvement and experiences of useful changes post-therapy. Methods Fourteen TEAMS therapy participants took part in semi-structured interviews. Their accounts were analysed using interpretative thematic analysis. Two researchers coded the dataset independently. Member checks of the preliminary themes were conducted. Results Two overarching themes, 'useful elements of therapy' and 'changes from therapy', encompassed 12 emerging subthemes. Participants appreciated having opportunities to talk and described the therapy as person-centred and delivered by caring, approachable and skilled therapists. Some recommended more sessions than the 16 provided. Helpful therapeutic techniques were reported to be normalization about moods, methods to increase understanding of moods, relapse prevention, reappraisal techniques and metaphors. However, some did not find therapeutic techniques helpful. Post-therapy, many reported changes in managing mood swings more effectively and in their thinking (although some participants reported changes in neither). Many described increased acceptance of themselves and of having bipolar disorder, increased productivity and reduced anxiety in social situations. Conclusions The present study evaluates participants' therapy experiences in detail, including aspects of therapy viewed as helpful, and meaningful post-therapy outcomes.
This is the first paper to qualitatively explore people's experiences of individual psychotherapy for bipolar disorders. It highlights elements of psychotherapy described as particularly helpful or unhelpful and the clinical changes viewed as most impactful. Participants reported benefitting in a number of ways from TEAMS therapy. They valued learning to reappraise and problem-solve situations and manage moods. Participants identified TEAMS techniques as helpful, such as exploring advantages and disadvantages of moods, and building healthy self-states. Copyright © 2016 John Wiley & Sons, Ltd.
Object-oriented technologies in a multi-mission data system
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Louie, John J.
1993-01-01
The Operations Engineering Laboratory (OEL) at JPL is developing new technologies that can provide more efficient and productive ways of doing business in flight operations. Over the past three years, we have worked closely with the Multi-Mission Control Team to develop automation tools, providing technology transfer into operations and resulting in substantial cost savings and error reduction. The OEL development philosophy is characterized by object-oriented design, extensive reusability of code, and an iterative development model with active participation of the end users. Through our work, the benefits of object-oriented design became apparent for use in mission control data systems. We explain object-oriented technologies and how they can be used in a mission control center to improve efficiency and productivity. We also discuss current research and development efforts in the JPL Operations Engineering Laboratory to architect and prototype a new paradigm for mission control operations based on object-oriented concepts.
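The reuse benefit described above can be sketched with a toy class hierarchy; this is a hedged illustration of the design style, not JPL's actual code, and the class and channel names are hypothetical.

```python
class TelemetryPacket:
    """Minimal telemetry packet: a channel id plus a raw counts value."""
    def __init__(self, channel, raw):
        self.channel = channel
        self.raw = raw

class ChannelProcessor:
    """Base class; subclasses override convert() and can be reused
    unchanged across missions."""
    def convert(self, packet):
        raise NotImplementedError

class LinearProcessor(ChannelProcessor):
    """Engineering-unit conversion y = a*x + b, configured per channel."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def convert(self, packet):
        return self.a * packet.raw + self.b

# A dispatch table maps channels to processors; supporting a new mission
# means registering new processor objects, not rewriting the pipeline.
processors = {"TEMP1": LinearProcessor(0.5, -40.0)}
pkt = TelemetryPacket("TEMP1", 150)
value = processors[pkt.channel].convert(pkt)
```

The point of the pattern is that the control-center pipeline depends only on the abstract `convert()` interface, which is what makes code reusable across missions.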
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of these development processes and provides needed lifecycle guidance to the scientific software community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rutan, D.; Rose, F.; Charlock, T.P.
2005-03-18
Within the Clouds and the Earth's Radiant Energy System (CERES) science team (Wielicki et al. 1996), the Surface and Atmospheric Radiation Budget (SARB) group is tasked with calculating vertical profiles of heating rates, globally and continuously, beneath CERES footprint observations of Top of Atmosphere (TOA) fluxes. This is accomplished using a fast radiative transfer code originally developed by Qiang Fu and Kuo-Nan Liou (Fu and Liou 1993) and subsequently highly modified by the SARB team. Details on the code and its inputs can be found in Kato et al. (2005) and Rose and Charlock (2002). Among the many required inputs is characterization of the vertical column profile of aerosols beneath each footprint. To do this, SARB combines aerosol optical depth information from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument with aerosol constituents specified by the Model for Atmosphere and Chemical Transport (MATCH) of Collins et al. (2001) and aerosol properties (e.g., single-scatter albedo and asymmetry parameter) from Tegen and Lacis (1996) and OPAC (Hess et al. 1998). The publicly available files that include these flux profiles, called the Clouds and Radiative Swath (CRS) data product, are available from the Langley Atmospheric Sciences Data Center (http://eosweb.larc.nasa.gov/). As various versions of the code are completed, publishable results are named 'Editions.' After CRS Edition 2A was finalized, it was found that dust aerosols were too absorptive. Dust aerosols have subsequently been modified using a new set of properties developed by Andy Lacis, and results have been released in CRS Edition 2B. This paper discusses the effects of changing desert dust aerosol properties, which can be significant for the radiation budget in mid-ocean, a few thousand kilometers from the source regions. Resulting changes are validated via comparison with surface-observed fluxes from the Saudi Solar Village surface site (Myers et al.
1999) and the E13 site at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) central facility.
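One step the abstract describes, constraining a modeled aerosol profile with a satellite column optical depth, can be sketched as a simple rescaling. The layer values and the 0.30 column depth below are hypothetical, and the real SARB processing involves far more (constituent mixes, spectral properties).

```python
def scale_aerosol_profile(layer_tau_model, tau_obs):
    """Rescale a modeled per-layer aerosol optical depth profile so its
    column total matches an observed (e.g. satellite-retrieved) optical
    depth, preserving the model's vertical shape."""
    total = sum(layer_tau_model)
    if total == 0.0:
        return [0.0] * len(layer_tau_model)
    factor = tau_obs / total
    return [t * factor for t in layer_tau_model]

# MATCH-like vertical shape (arbitrary layers) scaled to a hypothetical
# MODIS column optical depth of 0.30
profile = scale_aerosol_profile([0.01, 0.05, 0.10, 0.04], 0.30)
```

The scaled profile sums to the observed column depth while keeping the model's relative layer-to-layer distribution, which is the essential idea behind combining MODIS optical depths with MATCH constituents.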
Clinical Complexity in Medicine: A Measurement Model of Task and Patient Complexity.
Islam, R; Weir, C; Del Fiol, G
2016-01-01
Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Three clinical infectious disease teams were observed, audio-recorded and transcribed. Each team included an infectious diseases expert, one infectious diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial set of coding processes and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds from the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare.
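The inter-rater reliability statistic used above, Cohen's kappa, follows a standard formula: observed agreement minus chance agreement, normalized by one minus chance agreement. A minimal sketch with hypothetical complexity-code labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codes:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is agreement expected by chance from each rater's marginals."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes from two independent coders of the same transcript
a = ["task", "task", "patient", "task", "patient", "patient"]
b = ["task", "patient", "patient", "task", "patient", "patient"]
kappa = cohens_kappa(a, b)
```

Values near 1 indicate agreement well beyond chance; values near 0 indicate agreement no better than chance.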
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draeger, Erik W.
The theme of this year’s meeting was “Predictivity: Now and in the Future”. After welcoming remarks, Erik Draeger gave a talk on the NNSA Labs’ history of predictive simulation and the new challenges posed by upcoming architecture changes. He described an example where the volume of analysis data produced by a set of inertial confinement fusion (ICF) simulations on the Trinity machine was too large to store or transfer, and the steps needed to reduce it to a manageable size. He also described the software re-engineering plan for LLNL’s suite of multiphysics codes and physics packages, with a new push toward common components, making collaboration with teams like the CCMSC, who already have experience architecting complex multiphysics code infrastructure on next-generation architectures, all the more important. Phil Smith then gave an overview outlining the goals of the project, namely to accelerate development of new technology in the form of high-efficiency carbon-capture pulverized coal power generation as well as to further optimize existing state-of-the-art designs. He then presented a summary of the Center’s top-down uncertainty quantification approach, in which the ultimate target predictivity informs uncertainty targets for lower-level components, and gave data on how close all the different components currently are to their targets. Most components still need an approximately two-fold reduction in uncertainty to hit the ultimate predictivity target, but the current accuracy is already rather impressive.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis on only a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
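The in situ idea, reducing data to small summaries while it is still in memory rather than writing full fields to disk, can be sketched abstractly. The toy solver update and the statistics kept below are purely illustrative, not any particular in situ library's API.

```python
def run_simulation(steps, nx, analyze_every=10):
    """Toy time loop with an in situ reduction: instead of writing the
    full field every step, compute and keep only a small summary while
    the data is still resident in memory."""
    field = [0.0] * nx
    summaries = []
    for step in range(steps):
        # stand-in for the expensive solver update
        field = [(x + 1.0) * 0.5 for x in field]
        if step % analyze_every == 0:
            # in situ analysis: a tiny statistic replaces the raw field
            summaries.append((step, max(field), sum(field) / nx))
    return summaries

stats = run_simulation(steps=100, nx=1000)
# keeps 10 small summaries instead of 100 full fields of 1000 values
```

Production in situ frameworks do the same thing at scale: the analysis runs inside (or alongside) the solver's time loop, so only the reduced products ever hit the I/O system.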
Royall, Dawna; Brauer, Paula; Atta-Konadu, Edwoba; Dwyer, John J M; Edwards, A Michelle; Hussey, Tracy; Kates, Nick
2017-09-01
Both providers and patients may have important insights to inform the development of obesity prevention and management services in Canadian primary care settings. In this formative study, insights for new obesity management services were sought from both providers and patients in 1 progressive citywide organization (150 physicians, team services, separate offices). Seven focus groups with interprofessional health providers (n = 56) and 4 focus groups with patients (n = 34) were conducted. Two clinical vignettes (adult, child) were used to focus discussion. Four analysts coded for descriptive content and interpretative themes on possible tools and care processes using NVivo. Participants identified numerous strategies for care processes, most of which could be categorized into 1 or more of 11 themes: 6 directed at clinical care of patients (raising awareness, screening, clinical care, skill building, ongoing support, and social/peer support) and 5 directed at the organization (coordination/collaboration, creating awareness among health professionals, adding new expertise to the team, marketing, and lobbying/advocacy). The approach was successful in generating an extensive list of diverse activities to be considered for implementation studies. Both patients and providers identified that multiple strategies and systems approaches will be needed to address obesity management in primary care.
The ASC Sequoia Programming Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seager, M
2008-08-06
In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operations to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era, as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector unit utilization increasing incrementally over time. The other attributes of the Cray vector era at LLNL were that we developed, supported and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it) and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed the COS and UNICOS operating systems and environments on their own. In the late 1970s and early 1980s two trends appeared that made the Cray vector programming model (described above, including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and their attendant departmental and mini-computers, and later workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE Labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments.
The other interesting advance in the period is that systems were being developed with multiple 'cores' in them, called Symmetric Multi-Processor or Shared Memory Processor (SMP) systems. The parallel revolution had begun. The Laboratory started a small 'parallel processing project' in 1983 to study the new technology and its application to scientific computing with four people: Tim Axelrod, Pete Eltgroth, Paul Dubois and Mark Seager. Two years later, Eugene Brooks joined the team. This team focused on Unix and 'killer micro' SMPs. Indeed, Eugene Brooks was credited with coining the 'Killer Micro' term. After several generations of SMP platforms (e.g., the Sequent Balance 8000 with 8 33 MHz NS32032s, the Alliant FX/8 with 8 MC68020s and FPGA-based vector units, and finally the BBN Butterfly with 128 cores), it became apparent to us that the killer micro revolution would indeed overtake the Crays and that we definitely needed a new programming and systems model. The model developed by Mark Seager and Dale Nielsen focused on both the system aspects (Slide 3) and the code development aspects (Slide 4). Although now succinctly captured in two attached slides, at the time there was tremendous ferment in the research community as to which parallel programming model would emerge, dominate and survive. In addition, we wanted a model that would provide portability between platforms of a single generation but also longevity over multiple--and hopefully many--generations. Only after we developed the 'Livermore Model' and worked it out in considerable detail did it become obvious that what we came up with was the right approach. In a nutshell, the applications programming model of the Livermore Model posited that SMP parallelism would ultimately not scale indefinitely and one would have to bite the bullet and implement MPI parallelism within the Integrated Design Code (IDC).
We also had a major emphasis on doing everything in a completely standards-based, portable methodology with POSIX/Unix as the target environment. We decided against specialized libraries like STACKLIB for performance, but kept as many general-purpose, portable math libraries as were needed by the codes. Third, we assumed that the SMPs in clusters would evolve in time to become more powerful, feature rich and, in particular, offer more cores. Thus, we focused on OpenMP and POSIX PThreads for programming SMP parallelism. These code porting efforts were led by Dale Nielsen, A-Division code group leader, and Randy Christensen, B-Division code group leader. Most of the porting effort revolved around removing 'Crayisms' in the codes: artifacts of LTSS/NLTSS, Civic compiler extensions beyond Fortran77, IO libraries and dealing with new code control languages (we switched to Perl and later to Python). Adding MPI to the codes was initially problematic and error prone because the programmers used MPI directly and sprinkled the calls throughout the code.
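The two-level parallelism of the Livermore Model, message passing across SMP nodes with OpenMP/PThreads inside each node, implies a two-level decomposition of the problem. Here is a sketch in Python of how zones might map to (rank, thread) pairs; all sizes are hypothetical and the real codes of course used MPI and OpenMP directly rather than computing such a table.

```python
def decompose(n_zones, n_nodes, threads_per_node):
    """Two-level decomposition in the spirit of MPI-across-nodes plus
    threads-within-a-node: split zones into contiguous blocks per MPI
    rank (one rank per node here), then sub-split each block across
    the node's threads."""
    per_node = -(-n_zones // n_nodes)              # ceiling division
    layout = {}
    for node in range(n_nodes):
        lo = node * per_node
        hi = min(lo + per_node, n_zones)
        block = hi - lo
        per_thread = -(-block // threads_per_node) if block else 0
        for t in range(threads_per_node):
            tlo = lo + t * per_thread
            thi = min(tlo + per_thread, hi)
            layout[(node, t)] = range(tlo, thi)    # zones owned by (rank, thread)
    return layout

layout = decompose(n_zones=1000, n_nodes=4, threads_per_node=8)
```

The key property, visible in the mapping, is that inter-node communication happens only at block boundaries (via explicit messages), while within a block the threads share memory.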
NASA Technical Reports Server (NTRS)
Abbott, Mark R.
1996-01-01
Our first activity is based on delivery of code to Bob Evans (University of Miami) for integration and eventual delivery to the MODIS Science Data Support Team. As we noted in our previous semi-annual report, coding required the development and analysis of an end-to-end model of fluorescence line height (FLH) errors and sensitivity. This model is described in a paper in press in Remote Sensing of Environment. Once the code was delivered to Miami, we continued to use this error analysis to evaluate proposed changes in MODIS sensor specifications and performance. Simply evaluating such changes on a band-by-band basis may obscure the true impacts of changes in sensor performance that are manifested in the complete algorithm. This is especially true for FLH, which is sensitive to band placement and width. The error model will be used by Howard Gordon (Miami) to evaluate the effects of absorbing aerosols on FLH algorithm performance. Presently, FLH relies only on simple corrections for atmospheric effects (viewing geometry, Rayleigh scattering) without correcting for aerosols. Our analysis suggests that aerosols should have a small impact relative to changes in the quantum yield of fluorescence in phytoplankton. However, absorbing aerosols represent a new process and will be evaluated by Gordon.
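The FLH algorithm itself is a baseline subtraction: radiance in the fluorescence band minus a linear baseline interpolated between two flanking bands, which is why band placement and width matter so much. A sketch using nominal MODIS band centers (667, 678, 748 nm) and made-up radiances:

```python
def flh(l_left, l_peak, l_right,
        w_left=667.0, w_peak=678.0, w_right=748.0):
    """Fluorescence line height: radiance at the fluorescence peak band
    minus a linear baseline through the two flanking bands. Wavelengths
    are nominal MODIS band centers in nm."""
    frac = (w_peak - w_left) / (w_right - w_left)
    baseline = l_left + (l_right - l_left) * frac
    return l_peak - baseline

# Hypothetical water-leaving radiances (arbitrary units)
h = flh(l_left=1.00, l_peak=1.05, l_right=0.80)
```

Because the baseline is a linear interpolation, a shift in any band center changes `frac` and hence the retrieved height, which is exactly the sensitivity the error model is built to quantify.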
Identifying the challenges and facilitators of implementing a COPD care bundle.
Lennox, Laura; Green, Stuart; Howe, Cathy; Musgrave, Hannah; Bell, Derek; Elkin, Sarah
2014-01-01
Care bundles have been shown to improve outcomes, reduce hospital readmissions and reduce length of hospital stay; therefore increasing the speed of uptake and delivery of care bundles should be a priority in order to deliver more timely improvements and consistent high-quality care. Previous studies have detailed the difficulties of obtaining full compliance with bundle elements, but few have described the underlying reasons for this. In order to improve future implementation, this paper investigates the challenges encountered by clinical teams implementing a chronic obstructive pulmonary disease (COPD) care bundle and describes actions taken to overcome these challenges. An initial retrospective documentary analysis of data from seven clinical implementation teams was undertaken to review the challenges faced by the clinical teams. Three focus groups with healthcare professionals and managers explored solutions to these challenges developed during the project. Documentary analysis identified 28 challenges which directly impacted implementation of the COPD care bundle within five themes: staffing, infrastructure, process, use of improvement methodology, and patient and public involvement. Focus groups revealed that the five most significant challenges for all groups were: staff too busy, staff shortages, lack of staff engagement, added workload of the bundle, and patient coding issues. The participants shared facilitating factors used to overcome these issues, including shifting perceptions to improve engagement, further education sessions to increase staff participation, and gaining buy-in from managers through payment frameworks. Maximising the impact of a care bundle relies on its successful and timely implementation. Teams implementing the COPD care bundle encountered challenges that were common to all teams and sites.
Understanding and learning from the challenges faced by previous endeavours and identifying the facilitators to overcoming these barriers provides an opportunity to mitigate issues that waste time and resources, and ensures that training can be tailored to the anticipated challenges.
Foam on Tile Impact Modeling for the STS-107 Investigation
NASA Technical Reports Server (NTRS)
Stellingwerf, R. F.; Robinson, J. H.; Richardson, S.; Evans, S. W.; Stallworth, R.; Hovater, M.
2004-01-01
Following the breakup of the Space Shuttle Columbia during reentry, a NASA/Contractor investigation team was formed to examine the probable damage inflicted on Orbiter Thermal Protection System elements by impact of External Tank insulating foam projectiles. The authors formed a working subgroup within the larger team to apply the Smooth Particle Hydrodynamics code SPHC to the damage estimation problem. Numerical models of the Orbiter's tiles and of the Tank's foam were constructed and used as inputs into the code. Material properties needed to properly model the tiles and foam were obtained from other working subgroups who performed tests on these items for this purpose. Two- and three-dimensional models of the tiles were constructed, including the glass outer layer, the main body of LI-900 insulation, the densified lower layer of LI-900, the Nomex felt mounting layer, and the Aluminum 2024 vehicle skin. A model for the BX-250 foam including porous compression, elastic rebound, and surface erosion was developed. Code results for the tile damage and foam behavior were extensively validated through comparison with Southwest Research Institute foam-on-tile impact experiments carried out in 1999. These tests involved small projectiles striking individual tiles and small tile arrays. Following code and model validation, we simulated impacts of larger foam projectiles on examples of the tile systems used on the Orbiter. Results for impacts on the main landing gear door are presented in this paper, including effects of impacts at several angles, and of rapidly rotating projectiles. General results suggest that foam impacts on tiles at about 500 mph could cause appreciable damage if the impact angle is greater than about 20 degrees. Some variations of the foam properties, such as increased brittleness or increased density, could increase damage in some cases. Rotation up to 17 rps failed to increase the damage for the two cases considered.
This does not rule out other cases in which the rotational energy might lead to an increase in tile damage, but suggests that in most cases rotation will not be an important factor.
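The finding that damage grows with impact angle is consistent with the normal component of velocity doing most of the work on the tile. A back-of-the-envelope sketch of that geometry; the 0.75 kg foam mass is hypothetical, not a value from the investigation:

```python
import math

MPH_TO_MS = 0.44704

def normal_impact_speed(speed_mph, angle_deg):
    """Component of projectile velocity normal to the tile surface,
    with the impact angle measured from the surface plane; this is the
    quantity that chiefly drives damage in oblique foam strikes."""
    return speed_mph * MPH_TO_MS * math.sin(math.radians(angle_deg))

def kinetic_energy(mass_kg, speed_ms):
    """Projectile kinetic energy, 0.5 * m * v^2, in joules."""
    return 0.5 * mass_kg * speed_ms ** 2

v_n = normal_impact_speed(500.0, 20.0)   # normal-speed component in m/s
e_n = kinetic_energy(0.75, v_n)          # hypothetical 0.75 kg foam block
```

At 500 mph and a 20-degree angle the normal component is already about 76 m/s, and since energy scales with the square of that component, damage grows rapidly as the angle steepens, consistent with the ~20-degree threshold reported above.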
Gittinger, Matthew; Brolliar, Sarah M; Grand, James A; Nichol, Graham; Fernandez, Rosemarie
2017-06-01
This pilot study used a simulation-based platform to evaluate the effect of an automated mechanical chest compression device on team communication and patient management. Four-member emergency department interprofessional teams were randomly assigned to perform manual chest compressions (control, n = 6) or automated chest compressions (intervention, n = 6) during a simulated cardiac arrest with 2 phases: phase 1 baseline (ventricular tachycardia), followed by phase 2 (ventricular fibrillation). Patient management was coded using an Advanced Cardiovascular Life Support-based checklist. Team communication was categorized in the following 4 areas: (1) teamwork focus; (2) huddle events, defined as statements focused on re-establishing situation awareness, reinforcing existing plans, and assessing the need to adjust the plan; (3) clinical focus; and (4) profession of team member. Statements were aggregated for each team. At baseline, groups were similar with respect to total communication statements and patient management. During cardiac arrest, the total number of communication statements was greater in teams performing manual compressions (median, 152.3; interquartile range [IQR], 127.6-181.0) as compared with teams using an automated compression device (median, 105; IQR, 99.5-123.9). Huddle events were more frequent in teams performing automated chest compressions (median, 4.0; IQR, 3.1-4.3 vs. 2.0; IQR, 1.4-2.6). Teams randomized to the automated compression intervention had a delay to initial defibrillation (median, 208.3 seconds; IQR, 153.3-222.1 seconds) as compared with control teams (median, 63.2 seconds; IQR, 30.1-397.2 seconds). Use of an automated compression device may impact both team communication and patient management. Simulation-based assessments offer important insights into the effect of technology on healthcare teams.
Kastner, Monika; Sayal, Radha; Oliver, Doug; Straus, Sharon E; Dolovich, Lisa
2017-08-01
Chronic diseases are a significant public health concern, particularly in older adults. To address the delivery of health care services to optimally meet the needs of older adults with multiple chronic diseases, Health TAPESTRY (Teams Advancing Patient Experience: Strengthening Quality) uses a novel approach that involves patient home visits by trained volunteers to collect and transmit relevant health information using e-health technology to inform appropriate care from an inter-professional healthcare team. Health TAPESTRY was implemented, pilot tested, and evaluated in a randomized controlled trial (analysis underway). Knowledge translation (KT) interventions such as Health TAPESTRY should involve an investigation of their sustainability and scalability determinants to inform further implementation. However, this is seldom considered in research, or considered early enough, so the objectives of this study were to assess the sustainability and scalability potential of Health TAPESTRY from the perspective of the team who developed and pilot-tested it. Our objectives were addressed using a sequential mixed-methods approach involving the administration of a validated sustainability survey developed by the National Health Service (NHS) to all members of the Health TAPESTRY team who were actively involved in the development, implementation and pilot evaluation of the intervention (Phase 1: n = 38). Mean sustainability scores were calculated to identify the best potential for improvement across sustainability factors. Phase 2 was a qualitative study of interviews with purposively selected Health TAPESTRY team members to gain a more in-depth understanding of the factors that influence the sustainability and scalability of Health TAPESTRY. Two independent reviewers coded transcribed interviews and completed a multi-step thematic analysis. Outcomes were participant perceptions of the determinants influencing the sustainability and scalability of Health TAPESTRY.
Twenty Health TAPESTRY team members (53% response rate) completed the NHS sustainability survey. The overall mean sustainability score was 64.6 (range 22.8-96.8). Important opportunities for improving sustainability were better staff involvement and training, clinical leadership engagement, and infrastructure for sustainability. Interviews with 25 participants (response rate 60%) showed that factors influencing the sustainability and scalability of Health TAPESTRY emerged across two dimensions: I) Health TAPESTRY operations (development and implementation activities undertaken by the central team); and II) the Health TAPESTRY intervention (factors specific to the intervention and its elements). Resource capacity appears to be an important factor to consider for Health TAPESTRY operations as it was identified across both sustainability and scalability factors; and perceived lack of interprofessional team and volunteer resource capacity and the need for stakeholder buy-in are important considerations for the Health TAPESTRY intervention. We used these findings to create actionable recommendations to initiate dialogue among Health TAPESTRY team members to improve the intervention. Our study identified sustainability and scalability determinants of the Health TAPESTRY intervention that can be used to optimize its potential for impact. Next steps will involve using findings to inform a guide to facilitate sustainability and scalability of Health TAPESTRY in other jurisdictions considering its adoption. Our findings build on the limited current knowledge of sustainability, and advances KT science related to the sustainability and scalability of KT interventions.
Heart Pump Design for Cleveland Clinic Foundation
NASA Technical Reports Server (NTRS)
2005-01-01
Through a Lewis CommTech Program project with the Cleveland Clinic Foundation, the NASA Lewis Research Center is playing a key role in the design and development of a permanently implantable, artificial heart pump assist device. Known as the Innovative Ventricular Assist System (IVAS), this device will take on the pumping role of the damaged left ventricle of the heart. The key part of the IVAS is a nonpulsatile (continuous flow) artificial heart pump with centrifugal impeller blades, driven by an electric motor. Lewis is part of an industry and academia team, led by the Ohio Aerospace Institute (OAI), that is working with the Cleveland Clinic Foundation to make IVAS a reality. This device has the potential to save tens of thousands of lives each year, since 80 percent of heart attack victims suffer irreversible damage to the left ventricle, the part of the heart that does most of the pumping. Impeller blade design codes and flow-modeling analytical codes will be used in the project. These codes were developed at Lewis for the aerospace industry but will be applicable to the IVAS design project. The analytical codes, which currently simulate the flow through the compressor and pump systems, will be used to simulate the flow within the blood pump in the artificial heart assist device. The Interdisciplinary Technology Office heads up Lewis' efforts in the IVAS project. With the aid of numerical modeling, the blood pump will address many design issues, including some fluid-dynamic design considerations that are unique to the properties of blood. Some of the issues that will be addressed in the design process include hemolysis, deposition, recirculation, pump efficiency, rotor thrust balance, and bearing lubrication. Optimum pumping system performance will be achieved by modeling all the interactions between the pump components. 
The interactions can be multidisciplinary and, therefore, are influenced not only by the fluid dynamics of adjacent components but also by thermal and structural effects. Lewis-developed flow-modeling codes to be used in the pump simulations will include a one-dimensional code and an incompressible three-dimensional Navier-Stokes flow code. These codes will analyze the prototype pump designed by the Cleveland Clinic Foundation. With an improved understanding of the flow phenomena within the prototype pump, design changes to improve the performance of the pump system can be verified by computer prior to fabrication in order to reduce risks. The use of Lewis flow modeling codes during the design and development process will improve pump system performance and reduce the number of prototypes built in the development phase. The first phase of the IVAS project is to fully develop the prototype in a laboratory environment that uses a water/glycerin mixture as the surrogate fluid to simulate blood. A later phase of the project will include testing in animals for final validation. Lewis will be involved in the IVAS project for 3 to 5 years.
Underworld: What we set out to do, How far did we get, What did we Learn ? (Invited)
NASA Astrophysics Data System (ADS)
Moresi, L. N.
2013-12-01
Underworld was conceived as a tool for modelling 3D lithospheric deformation coupled with the underlying / surrounding mantle flow. The challenges involved were to find a method capable of representing the complicated, non-linear, history dependent rheology of the near surface as well as being able to model mantle convection, and, simultaneously, to be able to solve the numerical system efficiently. Underworld is a hybrid particle / mesh code reminiscent of the particle-in-cell techniques from the early 1960s. The Underworld team (*) was not the first to use this approach, nor the last, but the team does have considerable experience and much has been learned along the way. The use of a finite element method as the underlying "cell" in which the Lagrangian particles are embedded considerably reduces errors associated with mapping material properties to the cells. The particles are treated as moving quadrature points in computing the stiffness matrix integrals. The decoupling of deformation markers from computation points allows the use of structured meshes, efficient parallel decompositions, and simple-to-code geometric multigrid solution methods. For a 3D code such efficiencies are very important. The elegance of the method is that it can be completely described in a couple of sentences. However, there are some limitations: it is not obvious how to retain this elegance for unstructured or adaptive meshes, arbitrary element types are not sufficiently well integrated by the simple quadrature approach, and swarms of particles representing volumes are usually an inefficient representation of surfaces. This will be discussed! (*) Although not formally constituted, my co-conspirators in this exercise are listed as the Underworld team and I will reveal their true identities on the day.
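The particle-as-quadrature-point idea described above can be sketched in a few lines. The following is a minimal illustration, not Underworld code: a 1D linear finite-element stiffness assembly in which the Lagrangian particles currently inside each element act as the quadrature points and carry the material property (here, viscosity). The mesh size, particle count, and two-material layering are all hypothetical.

```python
import numpy as np

n_el = 4                                # elements on [0, 1]
nodes = np.linspace(0.0, 1.0, n_el + 1)
K = np.zeros((n_el + 1, n_el + 1))      # global stiffness matrix

rng = np.random.default_rng(0)
for e in range(n_el):
    x0, x1 = nodes[e], nodes[e + 1]
    h = x1 - x0
    # Particles currently inside this element; each carries a viscosity value.
    xp = rng.uniform(x0, x1, size=8)
    eta = np.where(xp < 0.5, 1.0, 10.0)  # hypothetical two-material layering
    w = h / len(xp)                      # equal particle quadrature weights
    # Linear shape-function gradients are constant on the element: [-1/h, +1/h].
    dN = np.array([-1.0 / h, 1.0 / h])
    # Element integral of eta * dN_i * dN_j, summed over particle quadrature points.
    Ke = np.outer(dN, dN) * np.sum(eta) * w
    K[e:e + 2, e:e + 2] += Ke

print(K.shape)              # (5, 5)
print(np.allclose(K, K.T))  # True: the assembled matrix is symmetric
```

Because the particles advect with the flow, the quadrature points move between time steps while the mesh (and hence the sparsity structure of K) stays fixed, which is the efficiency the abstract refers to.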
Validation of Framework Code Approach to a Life Prediction System for Fiber Reinforced Composites
NASA Technical Reports Server (NTRS)
Gravett, Phillip
1997-01-01
The grant was conducted by the MMC Life Prediction Cooperative, an industry/government collaborative team; the Ohio Aerospace Institute (OAI) acted as the prime contractor on behalf of the Cooperative for this grant effort. See Figure I for the organization and responsibilities of team members. The technical effort was conducted during the period August 7, 1995 to June 30, 1996 in cooperation with Erwin Zaretsky, the LERC Program Monitor. Phil Gravett of Pratt & Whitney was the principal technical investigator. Table I documents all meeting-related coordination memos during this period. The effort under this grant was closely coordinated with an existing USAF sponsored program focused on putting into practice a life prediction system for turbine engine components made of metal matrix composites (MMC). The overall architecture of the MMC life prediction system was defined in the USAF sponsored program (prior to this grant). The efforts of this grant were focused on implementing and tailoring the life prediction system, the framework code within it, and the damage modules within it to meet the specific requirements of the Cooperative. The tailoring of the life prediction system provides the basis for pervasive and continued use of this capability by the industry/government cooperative. The outputs of this grant are: 1. Definition of the interfaces between the framework code and the analysis modules, 2. Definition of the interface between the materials database and the finite element model, and 3. Definition of the integration of the framework code into an FEM design tool.
Replacing the IRAF/PyRAF Code-base at STScI: The Advanced Camera for Surveys (ACS)
NASA Astrophysics Data System (ADS)
Lucas, Ray A.; Desjardins, Tyler D.; STScI ACS (Advanced Camera for Surveys) Team
2018-06-01
IRAF and PyRAF are no longer viable on the latest hardware often used by HST observers; therefore, STScI no longer actively supports IRAF or PyRAF for most purposes. STScI instrument teams are in the process of converting all of our data processing and analysis code from IRAF/PyRAF to Python, including our calibration reference file pipelines and data reduction software. This is exemplified by our latest ACS Data Handbook, version 9.0, which was published in February 2018. Examples of IRAF and PyRAF commands have now been replaced by code blocks in Python, with references linked to documentation on how to download and install the latest Python software via Conda and AstroConda. With the temporary exception of the ACS slitless spectroscopy tool aXe, all ACS-related software is now independent of IRAF/PyRAF. A concerted effort has been made across STScI divisions to help the astronomical community transition from IRAF/PyRAF to Python, with tools such as Python Jupyter notebooks being made to give users workable examples. In addition to our code changes, the new ACS data handbook discusses the latest developments in charge transfer efficiency (CTE) correction, bias de-striping, and updates to the creation and format of calibration reference files, among other topics.
Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K
2014-11-01
To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes. 
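The interobserver-agreement statistic reported above, Cohen's κ, corrects raw percent agreement for agreement expected by chance. A minimal sketch of the computation, using hypothetical interval codes (the study's actual data and coding software are not described here):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two observers coding the same sequence of intervals."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed proportion of intervals on which the observers agree.
    p_obs = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement from each observer's marginal code frequencies.
    ca, cb = Counter(codes_a), Counter(codes_b)
    p_exp = sum((ca[lab] / n) * (cb[lab] / n) for lab in set(ca) | set(cb))
    return (p_obs - p_exp) / (1.0 - p_exp)

# Two observers coding the same eight hypothetical 5-minute intervals:
a = ["door", "noise", "door", "teaching", "door", "noise", "tension", "door"]
b = ["door", "noise", "door", "teaching", "noise", "noise", "tension", "door"]
print(round(cohens_kappa(a, b), 3))  # 0.818
```

Values in the 0.7-1.0 range, as reported in the abstract, are conventionally read as good to excellent agreement.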
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
BOREAS RSS-4 1994 Jack Pine Leaf Biochemistry and Modeled Spectra in the SSA
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Plummer, Stephen; Lucas, Neil; Dawson, Terry
2000-01-01
The BOREAS RSS-4 team focused its efforts on deriving estimates of LAI and leaf chlorophyll and nitrogen concentrations from remotely sensed data for input into the Forest BGC model. This data set contains measurements of jack pine (Pinus banksiana) needle biochemistry from the BOREAS SSA in July and August 1994. The data contain measurements of current and year-1 needle chlorophyll, nitrogen, lignin, cellulose, and water content for the OJP flux tower and nearby auxiliary sites. The data have been used to test a needle reflectance and transmittance model, LIBERTY (Dawson et al., in press). The source code for the model and modeled needle spectra for each of the sampled tower and auxiliary sites are provided as part of this data set. The LIBERTY model was developed and the predicted spectral data generated to parameterize a canopy reflectance model (North, 1996) for comparison with AVIRIS, POLDER, and PARABOLA data. The data and model source code are stored in ASCII files.
Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia;
2011-01-01
An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.
The openEHR Java reference implementation project.
Chen, Rong; Klein, Gunnar
2007-01-01
The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems based on a dual model approach with a stable reference information model complemented by archetypes for specific clinical purposes. A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java Implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
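As a simple illustration of what such a reliability estimate involves (the handbook's actual JSC tools are not specified here), assume an exponential failure model: the probability of failure-free operation over a duration t is R(t) = exp(-λt), where the failure rate λ can be estimated from observed times between failures. All numbers below are hypothetical.

```python
import math

def failure_rate(times_between_failures):
    """Maximum-likelihood failure rate under an exponential model."""
    return len(times_between_failures) / sum(times_between_failures)

def reliability(lam, t):
    """Probability of failure-free operation over duration t: R(t) = exp(-lam*t)."""
    return math.exp(-lam * t)

# Hypothetical times (hours of test operation) between observed failures:
tbf = [120.0, 95.0, 180.0, 210.0, 150.0]
lam = failure_rate(tbf)                    # estimated failures per hour
print(round(lam, 5))                       # 0.00662
print(round(reliability(lam, 24.0), 3))    # 0.853: chance of a failure-free 24 h run
```

Tracking such estimates over successive test builds is one way a failure-rate model supports the reliability/cost/schedule tradeoffs the abstract mentions.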
Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System
NASA Technical Reports Server (NTRS)
Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan
2009-01-01
A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, as well as elaborates on the solutions that were implemented. The first is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money. Changes such as the sequence in which programs are called, or a change to a parameter value in a program being automated, should not result in code changes or redelivery.
NASA Technical Reports Server (NTRS)
Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan
2014-01-01
The ATA-002 Technical Team has successfully designed, developed, tested and assessed the SLS Pathfinder propulsion systems for the Main Base Heating Test Program. Major outcomes of the Pathfinder Test Program: reached 90% of full-scale chamber pressure; achieved all engine/motor design parameter requirements; reached steady plume flow behavior in less than 35 msec; maintained steady chamber pressure for 60 to 100 msec during engine/motor operation; demonstrated model engine/motor performance similar to the full-scale SLS system; mitigated nozzle throat and combustor thermal erosion. Test data show good agreement with numerical prediction codes. The next phase of the ATA-002 Test Program includes design and development of the SLS OML for the Main Base Heating Test, tweaking the BSRM design to optimize performance, and tweaking the CS-REM design to increase robustness. MSFC Aerosciences and CUBRC have the capability to develop sub-scale propulsion systems to meet desired performance requirements for short-duration testing.
Challenges of interprofessional team training: a qualitative analysis of residents' perceptions.
van Schaik, Sandrijn; Plant, Jennifer; O'Brien, Bridget
2015-01-01
Simulation-based interprofessional team training is thought to improve patient care. Participating teams often consist of both experienced providers and trainees, which likely impacts team dynamics, particularly when a resident leads the team. Although similar team composition is found in real-life, debriefing after simulations puts a spotlight on team interactions and in particular on residents in the role of team leader. The goal of the current study was to explore residents' perceptions of simulation-based interprofessional team training. This was a secondary analysis of a study of residents in the pediatric residency training program at the University of California, San Francisco (United States) leading interprofessional teams in simulated resuscitations, followed by facilitated debriefing. Residents participated in individual, semi-structured, audio-recorded interviews within one month of the simulation. The original study aimed to examine residents' self-assessment of leadership skills, and during analysis we encountered numerous comments regarding the interprofessional nature of the simulation training. We therefore performed a secondary analysis of the interview transcripts. We followed an iterative process to create a coding scheme, and used interprofessional learning and practice as sensitizing concepts to extract relevant themes. Sixteen residents participated in the study. Residents felt that simulated resuscitations were helpful but anxiety-provoking, largely due to interprofessional dynamics. They embraced the interprofessional training opportunity and appreciated hearing other healthcare providers' perspectives, but questioned the value of interprofessional debriefing. They identified the need to maintain positive relationships with colleagues in light of the teams' complex hierarchy as a barrier to candid feedback.
Pediatric residents in our study appreciated the opportunity to participate in interprofessional team training but were conflicted about the value of feedback and debriefing in this setting. These data indicate that the optimal approach to such interprofessional education activities deserves further study.
77 FR 11517 - Rapid Response Team for Transmission
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
...: Office of Electricity Delivery and Energy Reliability, Department of Energy, DoE. ACTION: Request for information. SUMMARY: The Department of Energy's Office of Electricity Delivery and Energy Reliability is... Electricity Delivery and Energy Reliability, Mail Code: OE-20, U.S. Department of Energy, 1000 Independence...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-05
... Samuelson, Interdisciplinary Team Leader, Tongass National Forest Minerals Program Leader, 8510 Mendenhall...'' including wetlands, habitat, and the intrinsic characteristics that warranted the Monument's initial... Doc. 2010-24907 Filed 10-4-10; 8:45 am] BILLING CODE 3410-11-P ...
Self-Identifying Emergency Radio Beacons
NASA Technical Reports Server (NTRS)
Friedman, Morton L.
1987-01-01
Rescue teams aided by knowledge of vehicle in distress. Similar to conventional emergency transmitters except contains additional timing and modulating circuits. Additions to standard emergency transmitter enable transmitter to send rescuers identifying signal in addition to conventional distress signal created by sweep generator. Data generator contains identifying code.
Aeroelastic modeling for the FIT team F/A-18 simulation
NASA Technical Reports Server (NTRS)
Zeiler, Thomas A.; Wieseman, Carol D.
1989-01-01
Some details of the aeroelastic modeling of the F/A-18 aircraft done for the Functional Integration Technology (FIT) team's research in integrated dynamics modeling and how these are combined with the FIT team's integrated dynamics model are described. Also described are mean axis corrections to elastic modes, the addition of nonlinear inertial coupling terms into the equations of motion, and the calculation of internal loads time histories using the integrated dynamics model in a batch simulation program. A video tape made of a loads time history animation was included as a part of the oral presentation. Also discussed is work done in one of the areas of unsteady aerodynamic modeling identified as needing improvement, specifically, in correction factor methodologies for improving the accuracy of stability derivatives calculated with a doublet lattice code.
Using practice development methodology to develop children's centre teams: ideas for the future.
Hemingway, Ann; Cowdell, Fiona
2009-09-01
The Children's Centre Programme is a recent development in the UK and brings together multi-agency teams to work with disadvantaged families. Practice development methods enable teams to work together in new ways. Although the term practice development remains relatively poorly defined, its key properties suggest that it embraces engagement, empowerment, evaluation and evolution. This paper introduces the Children's Centre Programme and practice development methods and aims to discuss the relevance of using this method to develop teams in children's centres through considering the findings from an evaluation of a two-year project to develop inter-agency public health teams. The evaluation showed that practice development methods can enable successful team development and showed that through effective facilitation, teams can change their practice to focus on areas of local need. The team came up with their own process to develop a strategy for their locality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paris, Mark
A team of physicists and astrophysicists at Los Alamos National Laboratory, in collaboration with leading universities around the country, are using the Laboratory's supercomputers to simulate Big Bang nucleosynthesis and the early universe to unprecedented precision. These researchers developed a code, called BURST, that describes the universe from a time of a few seconds after the Big Bang to several hundred thousand years later. BURST allows physicists to study the microscopic, quantum nature of fundamental particles, like nuclei and the ghostly, weakly interacting neutrinos, by simulating the universe at its largest, cosmological scale. BURST simultaneously describes all the particles present in the early universe as they develop, tracking their evolution, particularly the amounts of light nuclei fused in the cosmic soup.
Hurricane Katrina Wind Investigation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desjarlais, A. O.
This investigation of roof damage caused by Hurricane Katrina is a joint effort of the Roofing Industry Committee on Weather Issues, Inc. (RICOWI) and the Oak Ridge National Laboratory/U.S. Department of Energy (ORNL/DOE). The Wind Investigation Program (WIP) was initiated in 1996. Hurricane damage that met the criteria of a major windstorm event did not materialize until Hurricanes Charley and Ivan occurred in August 2004. Hurricane Katrina presented a third opportunity for a wind damage investigation on August 29, 2005. The major objectives of the WIP are as follows: (1) to investigate the field performance of roofing assemblies after major wind events; (2) to factually describe roofing assembly performance and modes of failure; and (3) to formally report results of the investigations and damage modes for substantial wind speeds. The goal of the WIP is to perform unbiased, detailed investigations by credible personnel from the roofing industry, the insurance industry, and academia. Data from these investigations will, it is hoped, lead to overall improvement in roofing products, systems, roofing application, and durability and a reduction in losses, which may lead to lower overall costs to the public. This report documents the results of an extensive and well-planned investigative effort. The following program changes were implemented as a result of the lessons learned during the Hurricane Charley and Ivan investigations: (1) A logistics team was deployed to damage areas immediately following landfall; (2) Aerial surveillance, imperative to target wind damage areas, was conducted; (3) Investigation teams were in place within 8 days; (4) Teams collected more detailed data; and (5) Teams took improved photographs and completed more detailed photo logs.
Participating associations reviewed the results and lessons learned from the previous investigations and many have taken the following actions: (1) Moved forward with recommendations for new installation procedures; (2) Updated and improved application guidelines and manuals from associations and manufacturers; (3) Launched certified product installer programs; and (4) Submitted building code changes to improve product installation. Estimated wind speeds at the damage locations came from simulated hurricane models prepared by Applied Research Associates of Raleigh, North Carolina. A dynamic hurricane wind field model was calibrated to actual wind speeds measured at 12 inland and offshore stations. The maximum estimated peak gust wind speeds in Katrina were in the 120-130 mph range. Hurricane Katrina made landfall near Grand Isle, Louisiana, and traveled almost due north across the city of New Orleans. Hurricane winds hammered the coastline from Houma, Louisiana, to Pensacola, Florida. The severe flooding problems in New Orleans made it almost impossible for the investigating teams to function inside the city. Thus the WIP investigations were all conducted in areas east of the city. The six teams covered the coastal areas from Bay Saint Louis, Mississippi, on the west to Pascagoula, Mississippi, on the east. Six teams involving a total of 25 persons documented damage to both low slope and steep slope roofing systems. The teams collected specific information on each building examined, including type of structure (use or occupancy), wall construction, roof type, roof slope, building dimensions, roof deck, insulation, construction, and method of roof attachment. In addition, the teams noted terrain exposure and the estimated wind speeds at the building site from the Katrina wind speed map. With each team member assigned a specific duty, they described the damage in detail and illustrated important features with numerous color photos. 
Where possible, the points of damage initiation were identified and damage propagation described. Because the wind speeds in Katrina at landfall, where the investigations took place, were less than code-specified design speeds, one would expect roof damage to be minimal. One team speculated that damage to all roofs in the area they examined was less than 10% when improper installation and deterioration were eliminated as causes. Roofs designed to code and installed according to manufacturers' recommendations performed very well.
A method for studying decision-making by guideline development groups.
Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan
2009-08-05
Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best-suited to capturing influences on GDG decision-making. A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.
Brooks, Joanna Veazey; Gorbenko, Ksenia; Bosk, Charles
2017-01-01
BACKGROUND Implementing quality improvement in hospitals requires a multi-faceted commitment from leaders, including financial, material, and personnel resources. However, little is known about the interactional resources needed for project implementation. The aim of this analysis was to identify the types of interactional support hospital teams sought in a surgical quality improvement project. METHODS Hospital site visits were conducted using a combination of observations, interviews, and focus groups to explore the implementation of a surgical quality improvement project. Twenty-six site visits were conducted between October 2012 and August 2014 at a total of 16 hospitals that agreed to participate. All interviews were recorded, transcribed, and coded for themes using inductive analysis. RESULTS We interviewed 321 respondents and conducted an additional 28 focus groups. Respondents reported needing the following types of interactional support during implementation of quality improvement interventions: 1) a critical outside perspective on their implementation progress; 2) opportunities to learn from peers, especially around clinical innovations; and 3) external validation to help establish visibility for and commitment to the project. CONCLUSIONS Quality improvement in hospitals is both a clinical and a social endeavor. Our findings show that teams often desire interactional resources as they implement quality improvement initiatives. In-person site visits can provide these resources while also activating emotional energy for teams, which builds momentum and sustainability for quality improvement work. IMPLICATIONS Policymakers and quality improvement leaders will benefit from developing strategies to maximize interactional learning and feedback for quality improvement teams. Further research should investigate the most effective methods for meeting these needs. PMID:28375951
Understanding antibiotic decision making in surgery-a qualitative analysis.
Charani, E; Tarrant, C; Moorthy, K; Sevdalis, N; Brennan, L; Holmes, A H
2017-10-01
To investigate the characteristics and culture of antibiotic decision making in the surgical specialty. A qualitative study including ethnographic observation and face-to-face interviews with participants from six surgical teams at a teaching hospital in London was conducted. Over a 3-month period: (a) 30 ward rounds (WRs) (100 h) were observed, (b) face-to-face follow-up interviews took place with 13 key informants, and (c) multidisciplinary meetings on the management of surgical patients and daily practice on wards were observed. Applying these methods provided rich data for characterizing antibiotic decision making in surgery and enabled cross-validation and triangulation of the findings. Data from the interview transcripts and the observational notes were coded and analysed iteratively until saturation was reached. The surgical team is in a state of constant flux, with individuals having to adjust to the context in which they work. The demands placed on the team to be in the operating room and to address the surgical needs of the patient mean that the responsibility for antibiotic decision making is uncoordinated and diffuse. Antibiotic decision making is considered by surgeons to be a secondary task, is commonly delegated to junior members of the team, and occurs in the context of disjointed communication. There is a lack of clarity around medical decision making for treating infections in surgical patients. The result is sub-optimal and uncoordinated antimicrobial management. Developing the role of a perioperative clinician may help to improve patient-level outcomes and optimize decision making. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Overcoming Codes and Standards Barriers to Innovations in Building Energy Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Pamala C.; Gilbride, Theresa L.
2015-02-15
In this journal article, the authors discuss approaches to overcoming building code barriers to energy-efficiency innovations in home construction. Building codes have been a highly motivational force for increasing the energy efficiency of new homes in the United States in recent years. But as quickly as the codes seem to be changing, new products are coming to the market at an even more rapid pace, sometimes offering approaches and construction techniques unthought of when the current code was first proposed, which might have been several years before its adoption by various jurisdictions. Due to this delay, the codes themselves can become barriers to innovations that might otherwise be helping to further increase the efficiency, comfort, health, or durability of new homes. The U.S. Department of Energy's Building America, a program dedicated to improving the energy efficiency of America's housing stock through research and education, is working with the U.S. housing industry through its research teams to help builders identify and remove code barriers to innovation in the home construction industry. The article addresses several approaches that builders use to achieve approval for innovative building techniques when code barriers appear to exist.
Sakashita, Akihiro; Kishino, Megumi; Nakazawa, Yoko; Yotani, Nobuyuki; Yamaguchi, Takashi; Kizawa, Yoshiyuki
2016-07-01
To clarify how highly active hospital palliative care teams can provide efficient and effective care despite the lack of full-time palliative care physicians. Semistructured focus group interviews were conducted, and content analysis was performed. A total of 7 physicians and 6 nurses participated. We extracted 209 codes from the transcripts and organized them into 3 themes and 21 categories, which were classified as follows: (1) tips for managing palliative care teams efficiently and effectively (7 categories); (2) ways of acquiring specialist palliative care expertise (9 categories); and (3) ways of treating symptoms that are difficult to alleviate (5 categories). The findings of this study can serve as a nautical chart for hospital-based palliative care teams (HPCTs) that lack a full-time palliative care physician. Full-time nurses who have strong management and coordination abilities play a central role in resource-limited HPCTs. © The Author(s) 2015.
Sullivan, Jennifer L.; Adjognon, Omonyêlé L.; Engle, Ryann L.; Shin, Marlena H.; Afable, Melissa K.; Rudin, Whitney; White, Bert; Shay, Kenneth; Lukas, Carol VanDeusen
2018-01-01
Background: From 2010 to 2013, the Department of Veterans Affairs (VA) funded a large pilot initiative to implement noninstitutional long-term services and supports (LTSS) programs to support aging Veterans. Our team evaluated implementation of 59 VA noninstitutional LTSS programs. Purpose: The specific objectives of this study are to (a) examine the challenges influencing program implementation, comparing active sites that remained open with inactive sites that closed during the funding period, and (b) identify ways that active sites overcame the challenges they experienced. Methodology: Key informant semistructured interviews occurred between 2011 and 2013. We conducted 217 telephone interviews over four time points. Content analysis was used to identify emergent themes. The study team met regularly to define each challenge, review all codes, and discuss discrepancies. For each follow-up interview with the sites, the list of established challenges was used as a priori themes. Emergent data were also coded. Results: The challenges affecting implementation included human resources and staffing issues, infrastructure, resource allocation and geography, referrals and marketing, leadership support, and team dynamics and processes. Programs were able to overcome challenges by communicating with team members and other areas in the organization, utilizing information technology solutions, making creative use of staff and flexible schedules, and obtaining additional resources. Discussion: This study highlights several common challenges programs can address during program implementation. The most frequently mentioned strategy was effective communication.
Strategies also targeted several components of the organization including organizational functions and processes (e.g., importance of coordination within a team and across disciplines to provide good care), infrastructure (e.g., information technology and human resources), and program fit with priorities in the organization (e.g., leadership support). Implications: Anticipating potential pitfalls of program implementation for future noninstitutional LTSS programs can improve implementation efficiency and program sustainability. Staff at multiple levels in the organization must fully support noninstitutional LTSS programs to address these challenges. PMID:28125459
Waiswa, Peter; O'Connell, Thomas; Bagenda, Danstan; Mullachery, Pricila; Mpanga, Flavia; Henriksson, Dorcus Kiwanuka; Katahoire, Anne Ruhweza; Ssegujja, Eric; Mbonye, Anthony K; Peterson, Stefan Swartling
2016-03-11
Innovative and sustainable strategies to strengthen districts and other sub-national health systems and management are urgently required to reduce child mortality. Although highly effective, evidence-based, and affordable child survival interventions are well known, at the district level a lack of data, motivation, and analytic and planning capacity often impedes prioritization, and management weaknesses impede implementation. The Community and District Empowerment for Scale-up (CODES) project is a complex management intervention designed to test whether districts, when empowered with data and management tools, can prioritize and implement evidence-based child survival interventions equitably. The CODES strategy combines management, diagnostic, and evaluation tools to identify and analyze the causes of bottlenecks to implementation, build the capacity of district management teams to implement context-specific solutions, and foster community monitoring and social accountability to increase demand for services. CODES combines UNICEF tools designed to systematize priority setting, allocation of resources, and problem solving with community dialogues based on Citizen Report Cards and U-Reports, which are used to engage and empower communities in monitoring health service provision and in demanding quality services. Implementation and all data collection will be carried out by the district teams or local community-based organizations, who will be supported by two local implementing partners. The study will be evaluated as a cluster randomized trial with eight intervention and eight comparison districts over a period of 3 years. Evaluation will focus on differences in uptake of child survival interventions and will follow an intention-to-treat analysis. We will also document and analyze experiences in implementation, including changes in management practices.
By increasing the District Health Management Teams' capacity to prioritize and implement context-specific solutions, and by empowering communities to become active partners in service delivery, coverage of child survival interventions will increase. Lessons learned on strengthening district-level managerial capacities and mechanisms for community monitoring may have implications, not only in Uganda but also in other similar settings, especially with regard to accelerating effective coverage of key child survival interventions using locally available resources. Trial registration: ISRCTN15705788, registered 24 July 2015.
2010-01-01
Background Much has been written in the educational literature on the value of communities of practice in enhancing student learning. Here, we take the experience of senior undergraduate medical students involved in short-term research as members of a team as a paradigm for learning in a community of practice. Based on feedback from experienced supervisors, we offer recommendations for initiating students into the research culture of their team. In so doing, we endeavour to create a bridge between theory and practice through disseminating advice on good supervisory practice, where the supervisor is perceived as an educator responsible for designing the research process to optimize student learning. Methods Using the questionnaire design tool SurveyMonkey and comprehensive lists of contact details of staff who had supervised research projects at the University of Edinburgh during 1995-2008, current and previous supervisors were invited to recommend procedures which they had found successful in initiating students into the research culture of a team. Text responses were then coded in the form of derivative recommendations and categorized under general themes and sub-themes. Results Using the chi-square tests of linear trend and association, evidence was found for a positive trend towards more experienced supervisors offering responses (χ2 = 16.833, p < 0.0005, n = 215), while there was a lack of evidence of bias in the gender distribution of respondents (χ2 = 0.482, p = 0.487, n = 203). A total of 126 codes were extracted from the text responses of 65 respondents. These codes were simplified to form a complete list of 52 recommendations, which were in turn categorized under seven derivative overarching themes, the most highly represented themes being Connecting the student with others and Cultivating self-efficacy in research competence.
Conclusions Through the design of a coding frame for supervisor responses, a wealth of ideas has been captured to make communities of research practice effective media for undergraduate student learning. The majority of these recommendations are underpinned by educational theory and have the potential to take the learner beyond the stage of initiation to that of integration within their community of research practice. PMID:21092088
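The chi-square test of association reported above (χ2 = 0.482 for gender distribution) is a standard Pearson statistic on a 2 × 2 contingency table. A minimal sketch follows; the counts used are invented for illustration, since the study's raw tables are not given in the abstract.

```python
# Pearson chi-square statistic for a 2x2 contingency table, the kind of
# test of association reported above. The cell counts below are invented
# for illustration; the study's raw tables are not in the abstract.

def chi2_2x2(a, b, c, d):
    """Cells (a, b) form row 1 and (c, d) row 2 of the observed table."""
    n = a + b + c + d
    row = (a + b, c + d)
    col = (a + c, b + d)
    observed = (a, b, c, d)
    expected = (row[0] * col[0] / n, row[0] * col[1] / n,
                row[1] * col[0] / n, row[1] * col[1] / n)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

print(chi2_2x2(15, 15, 15, 15))   # 0.0 -- perfectly balanced, no association
print(chi2_2x2(20, 10, 10, 20))   # ~6.667 -- excess on the diagonal
```

For a 2 × 2 table the statistic has one degree of freedom; values near zero, as in the gender comparison above, indicate no evidence of association.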
Raley, Jessica; Meenakshi, Rani; Dent, Daniel; Willis, Ross; Lawson, Karla; Duzinski, Sarah
Fatal errors due to miscommunication among members of trauma teams are 2 to 4 times more likely to occur than in other medical teams, yet most trauma team members do not receive communication effectiveness training. A needs assessment was conducted to examine trauma team members' miscommunication experiences and research scientists' evaluations of live trauma activations. The purpose of this study is to demonstrate that communication training is necessary and to highlight specific team communication competencies that trauma teams should learn to improve communication during activations. Data were collected in 2 phases. Phase 1 required participants to complete a series of surveys. Phase 2 included live observations and assessments of pediatric trauma activations using the assessment of pediatric resuscitation team assessments (APRC-TA) and assessment of pediatric resuscitation leader assessments (APRC-LA). Data were collected at a southwestern pediatric hospital. Trauma team members and leaders completed surveys at a meeting and were observed while conducting activations in the trauma bay. Trained research scientists and clinical staff used the APRC-TA and APRC-LA to measure trauma teams' medical performance and communication effectiveness. The sample included 29 healthcare providers who regularly participate in trauma activations. Additionally, 12 live trauma activations were assessed Monday to Friday from 8 AM to 5 PM. Team members indicated that communication training should focus on offering assistance, delegating duties, accepting feedback, and controlling emotional expressions. Communication scores were not significantly different from medical performance scores. None of the teams was coded as showing effective medical performance but ineffective team communication, and only 1 team was labeled as showing ineffective leader communication but effective medical performance.
Communication training may be necessary for trauma teams and offer a deeper understanding of the communication competencies that should be addressed. The APRC-TA and APRC-LA both include team communication competencies that could be used as a guide to design training for trauma team members and leaders. Researchers should also continue to examine recommendations for improved team and leader communication during activations using in-depth interviews and focus groups. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
ISS Double-Gimbaled CMG Subsystem Simulation Using the Agile Development Method
NASA Technical Reports Server (NTRS)
Inampudi, Ravi
2016-01-01
This paper presents an evolutionary approach to simulating a cluster of 4 Control Moment Gyros (CMG) on the International Space Station (ISS) using a common-sense approach (the agile development method) for concurrent mathematical modeling and simulation of the CMG subsystem. This simulation is part of the Training Systems for the 21st Century simulator, which will provide training for crew members, instructors, and flight controllers. The basic idea of how the CMGs on the space station are used for its non-propulsive attitude control is briefly explained to set up the context for simulating a CMG subsystem. Next, different reference frames and the detailed equations of motion (EOM) for multiple double-gimbal variable-speed control moment gyroscopes (DGVs) are presented. Fixing some of the terms in the EOM yields the special-case EOM for the ISS's double-gimbaled fixed-speed CMGs. CMG simulation development using the agile development method is presented, in which the customer's requirements and solutions evolve through iterative analysis, design, coding, unit testing, and acceptance testing. At the end of each iteration, the set of features implemented in that iteration is demonstrated to the flight controllers, thus creating a short feedback loop and helping create adaptive development cycles. The unified modeling language (UML) tool is used in illustrating the user stories, class designs, and sequence diagrams. This incremental approach to mathematical modeling and simulating the CMG subsystem involved the development team and the customer early on, thus improving the quality of the working CMG system in each iteration and helping the team accurately predict the cost, schedule, and delivery of the software.
Nursing from the casual pool: focus group study to explore the experiences of casual nurses.
FitzGerald, Mary; McMillan, Margaret; Maguire, Jane Margaret
2007-08-01
The use of flexible non-contract nursing staff is increasing in Australia and in other countries currently experiencing a nursing shortage. There is sparse empirical evidence relating to the experience of these nurses. This focus group study with six groups of enrolled and registered nurses in one regional health authority in New South Wales reports on the challenges and rewards of working through the casual pool. The textual data were coded and reported in themes and subthemes; the overarching theme is the balance of social and professional life, while the subthemes are social politics, nursing work, and professional performance. The results reveal that nurses who work from the casual pool have insight into the work environment and culture of clinical teams that formally goes untapped. They have little or no chance to provide clinical teams with feedback or to receive feedback on their own performance. The consequence of this study has been the development of a two-way performance intervention to promote high standards of care from nurses who work from the casual pool and to promote safe clinical environments and cultures.
Research culture in a regional allied health setting.
Borkowski, Donna; McKinstry, Carol; Cotchett, Matthew
2017-07-01
Research evidence is required to guide best practice, inform policy and improve the health of communities. Current indicators consider allied health research culture to be low. This study aimed to measure the allied health research culture and capacity in a Victorian regional health service. The Research Capacity and Culture tool was used to evaluate research capacity and culture across individual, team and organisation domains. One-way ANOVA was used to determine differences between allied health professions, whereas responses to open-ended questions were themed using open coding. One hundred thirty-six allied health professionals completed the survey. There were statistically significant differences in the organisation domain between social work, physiotherapy and occupational therapy professions; in the team domain, between social work and all other professions. Motivators for conducting research included providing a high-quality service, developing skills and increasing job satisfaction. Barriers included other work roles taking priority, a lack of time and limited research skills. Multi-layered strategies including establishing conjoint research positions are recommended to increase allied health research culture in this regional area.
Final Report on ITER Task Agreement 81-08
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard L. Moore
As part of an ITER Implementing Task Agreement (ITA) between the ITER US Participant Team (PT) and the ITER International Team (IT), the INL Fusion Safety Program was tasked to provide the ITER IT with upgrades to the fusion version of the MELCOR 1.8.5 code, including a beryllium dust oxidation model. The purpose of this model is to allow the ITER IT to investigate hydrogen production from beryllium dust layers on hot surfaces inside the ITER vacuum vessel (VV) during in-vessel loss-of-cooling accidents (LOCAs). Also included in the ITER ITA was a task to construct a RELAP5/ATHENA model of the ITER divertor cooling loop to model the draining of the loop during a large ex-vessel pipe break followed by an in-vessel divertor break, and to compare the results to a similar MELCOR model developed by the ITER IT. This report, which is the final report for this agreement, documents the completion of the work scope under this ITER TA, designated as TA 81-08.
Singer, Zachary; Fung, Kevin; Lillie, Elaine; McLeod, Jennifer; Scott, Grace; You, Peng; Helleman, Krista
2018-05-01
Interprofessional health care teams have been shown to improve patient safety and reduce medical errors, among other benefits. Introducing interprofessional concepts to students in full day events is an established model that allows students to learn together. Our group developed an academic day for first-year students devoted to an introductory interprofessional education (IPE) experience, 'IPE Day'. In total, 438 students representing medicine, dentistry, pharmacy and optometry gathered together, along with 25 facilitators, for IPE Day. Following the day's program, students completed the evaluation consisting of the Interprofessional Collaborative Competencies Attainment Survey and open-ended questions. Narrative responses were analyzed for content and coded using the Canadian Interprofessional Health Collaborative competency domains. Three hundred and eight evaluations were completed. Students reported increased self-ratings of competency across all 20 items (p < 0.05). Their comments were organized into the six domains: interprofessional communication, collaborative leadership, role clarification, patient-centred care, conflict resolution, and team functioning. Based on these findings, we suggest that this IPE activity may be useful for improving learner perceptions about their interprofessional collaborative practice competence.
Swertz, Morris A; De Brock, E O; Van Hijum, Sacha A F T; De Jong, Anne; Buist, Girbe; Baerends, Richard J S; Kok, Jan; Kuipers, Oscar P; Jansen, Ritsert C
2004-09-01
Genomic research laboratories need adequate infrastructure to support management of their data production and research workflow. But what makes infrastructure adequate? A lack of appropriate criteria makes any decision on buying or developing a system difficult. Here, we report on the decision process for the case of a molecular genetics group establishing a microarray laboratory. Five typical requirements for experimental genomics database systems were identified: (i) the ability to evolve with the fast-developing genomics field; (ii) a suitable data model to deal with local diversity; (iii) suitable storage of data files in the system; (iv) easy exchange with other software; and (v) low maintenance costs. The computer scientists and the researchers of the local microarray laboratory considered alternative solutions for these five requirements and chose the following options: (i) use of automatic code generation; (ii) a customized data model based on standards; (iii) storage of datasets as black boxes instead of decomposing them into database tables; (iv) loose linking to other programs for improved flexibility; and (v) a low-maintenance web-based user interface. Our team evaluated existing microarray databases and then decided to build a new system, the Molecular Genetics Information System (MOLGENIS), implemented using code generation in a period of three months. This case can provide valuable insights and lessons to both software developers and user communities embarking on large-scale genomic projects. http://www.molgenis.nl
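The automatic code generation chosen in option (i) can be illustrated with a toy model-driven generator: a declarative data model is turned into executable class definitions. The model dictionary and the class template below are invented for this sketch and do not reflect MOLGENIS's actual model format or templates.

```python
# Toy model-driven code generation in the spirit of choice (i) above.
# The data model and generated-class template are invented for this
# sketch; MOLGENIS's real generator and model format are not shown.

MODEL = {
    "Experiment": ["name", "date"],
    "ArraySlide": ["barcode", "layout"],
}

def generate_class(name, fields):
    """Emit Python source for a plain record class with the given fields."""
    params = ", ".join(fields)
    body = "\n".join(f"        self.{f} = {f}" for f in fields)
    return f"class {name}:\n    def __init__(self, {params}):\n{body}"

# Generate source for every entity in the model, then load it in one pass.
source = "\n\n".join(generate_class(n, fs) for n, fs in MODEL.items())
namespace = {}
exec(source, namespace)

exp = namespace["Experiment"]("pilot-01", "2004-09-01")
print(exp.name)   # pilot-01
```

The appeal of this approach, as the abstract suggests, is evolvability: when the data model changes, the classes (and, in a real system, database tables and user-interface forms) are regenerated rather than edited by hand.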
Khoshnood, Narges; Hopwood, Marie-Clare; Lokuge, Bhadra; Kurahashi, Allison; Tobin, Anastasia; Isenberg, Sarina; Husain, Amna
2018-05-15
MAiD (medical assistance in dying) allows a practitioner to administer or prescribe medication for the purpose of ending a patient's life. In 2016, Canada became the latest jurisdiction, following several European countries and American states, to legalize physician-assisted death. Although some studies report on physician attitudes towards MAiD or describe patient characteristics, few explore the professional challenges faced by physicians who provide MAiD. To explore the professional challenges faced by Canadian physicians who provide MAiD. Sixteen physicians from across Canada who provide MAiD completed in-depth, semi-structured telephone interviews. An inductive thematic analysis approach guided data collection and the iterative, interpretive analysis of interview transcripts. Three members of the research team systematically co-coded interview transcripts, and the emerging themes were developed with the broader research team. NVivo was used to manage the coded data. Participants described three challenges associated with providing MAiD: (1) their relationships with other MAiD providers were enhanced while relationships with objecting colleagues were sometimes strained; (2) they received inadequate financial compensation for their time; and (3) they experienced increased workload, resulting in sacrifices to personal time. Although these providers did not intend to stop providing MAiD at the time of the interview, they indicated concerns about whether they would be able to sustain this service over time. Physicians described relationship, financial, and workload challenges to providing MAiD. We provide several recommendations to address these challenges and help ensure the sustainability of MAiD in countries that provide this service. Copyright © 2018. Published by Elsevier Inc.
NASA. Marshall Space Flight Center Hydrostatic Bearing Activities
NASA Technical Reports Server (NTRS)
Benjamin, Theodore G.
1991-01-01
The basic approach for analyzing hydrostatic bearing flows at the Marshall Space Flight Center (MSFC) is briefly discussed. The Hydrostatic Bearing Team has responsibility for assessing and evaluating flow codes; evaluating friction, ignition, and galling effects; evaluating wear; and performing tests. The Office of Aerospace and Exploration Technology Turbomachinery Seals Tasks consist of tests and analysis. The MSFC in-house analyses utilize one-dimensional bulk-flow codes. Computational fluid dynamics (CFD) analysis is used to enhance understanding of bearing flow physics or to perform parametric analyses that are outside the bulk-flow database. As long as the bulk-flow codes are accurate enough for most needs, they will be utilized accordingly and will be supported by CFD analysis on an as-needed basis.
National Centers for Environmental Prediction
Toward International Comparability of Survey Statistics on Visual Impairment: The DISTAB Project
ERIC Educational Resources Information Center
Hendershot, Gerry E.; Crews, John E.
2006-01-01
Using data from recent national disability surveys in Australia, Canada, France, the Netherlands, South Africa, and the United States, an international team of researchers coded indicators of several types of disability using the International Classification of Functioning, Disability, and Health. This article discusses the Disability Tabulations…
Emergency Planning and Community Right-to-Know Act Section 312 Tier Two report forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, R.A.
2000-02-01
The report contains forms for the chemical description, physical and health hazards, inventory volumes, and storage codes and locations for all hazardous chemicals located at the Y-12 Plant. These can be used by local emergency response teams in case of an accident.
Baudoin, D; Krebs, S
2013-04-01
This article describes how a mobile team of palliative care and a department of neurology learned to cope with many complex end-of-life situations. After a brief introduction to inter-team cooperation, clinical work of the mobile team with patients and families and its cooperation with the neurology team are presented. The specificity of supportive care in neurology is also analyzed. Two interdisciplinary and multi-professional tools - the Palliative Care Resource Group and the Ethics Consultation Group - are described, with their activities and their goals. The Palliative Care Resource Group is a specific entity whose identity lies at the crossroads between commonly recognized organizational units: clinic staff, clinical practice, ethical or organizational analysis groups (Balint, 1960), discussion groups (Rusznievski, 1999), training groups. It has several objectives: 1) create a robust conceptual environment enabling the pursuit of palliative care practices without relying on the empty paradigm of stereotypical actions; if suffering cannot be avoided, psychic development and transformation can be promoted; 2) attempt to prevent caregiver burnout; 3) help support and strengthen the collective dimension of the team, learning a mode of care which goes beyond the execution of coded actions; 4) enhance the primary dimension of care, i.e. taking care, especially in clinical situations where conventional wisdom declares that "nothing more can be done."; 5) promote group work so new ideas arising from the different teams influence the behavior of all caregivers. The Ethics Consultation Group organizes its work in several steps. The first step is discernment, clearly identifying the question at hand with the clinical staff. This is followed by a consultation between the clinical team, the patient, the family and the referring physician to arrive at a motivated decision, respecting the competent patient's opinion. 
The final step is an evaluation of the decision and its consequences. The Ethics Consultation Group, which meets at a scheduled time at a set place, unites the different members of the neurology and palliative care teams, who come to a common decision. These specific moments have an important impact on team cohesion, creating a common culture and a convergence of individual representations about making difficult decisions. Specific clinical cases are described to illustrate some of the difficulties encountered in palliative care decision-making. These cases provide insight about the decision to create a palliative care gastrostomy for a man with progressive supranuclear palsy, the suffering experienced by a medical team caring for a young woman with Creutzfeldt-Jakob encephalopathy, and a woman's experience with the post-stroke life-and-death seesaw. Theoretical divisions, illustrated with clinical stories, can be useful touchstones for neurology teams. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
NASA Technical Reports Server (NTRS)
Kinard, Tim A.; Harris, Brenda W.; Raj, Pradeep
1995-01-01
Vortex flows on a twin-tail and a single-tail modular transonic vortex interaction (MTVI) model, representative of a generic fighter configuration, are computationally simulated in this study using the Three-dimensional Euler/Navier-Stokes Aerodynamic Method (TEAM). The primary objective is to provide an assessment of viscous effects on benign (10 deg angle of attack) and burst (35 deg angle of attack) vortex flow solutions. This study was conducted in support of a NASA project aimed at assessing the viability of using Euler technology to predict aerodynamic characteristics of aircraft configurations at moderate-to-high angles of attack in a preliminary design environment. The TEAM code solves the Euler and Reynolds-averaged Navier-Stokes equations on patched multiblock structured grids. Its algorithm is based on a cell-centered finite-volume formulation with a multistage time-stepping scheme. Viscous effects are assessed by comparing the computed inviscid and viscous solutions with each other and with experimental data. Also, results of Euler solution sensitivity to grid density and numerical dissipation are presented for the twin-tail model. The results show that proper accounting of viscous effects is necessary for detailed design and optimization, but Euler solutions can provide meaningful guidelines for preliminary design of flight vehicles which exhibit vortex flows in parts of their flight envelope.
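The cell-centered finite-volume formulation with multistage time stepping that TEAM uses can be sketched for the simplest possible case, 1D linear advection with a first-order upwind flux. The stage coefficients and flux below are illustrative assumptions for a minimal Python sketch, not TEAM's actual scheme:

```python
import numpy as np

def residual(u, dx, a=1.0):
    # Cell-centered finite-volume residual for 1D linear advection
    # (a > 0) with periodic boundaries and a first-order upwind flux:
    # du_i/dt = -(F_{i+1/2} - F_{i-1/2}) / dx, with F_{i+1/2} = a * u_i.
    F = a * u
    return -(F - np.roll(F, 1)) / dx

def multistage_step(u, dt, dx, alphas=(0.6, 0.6, 1.0)):
    # Multistage (Runge-Kutta-like) time stepping of the kind used in
    # cell-centered finite-volume flow solvers; these stage
    # coefficients are illustrative only.
    u0 = u.copy()
    for alpha in alphas:
        u = u0 + alpha * dt * residual(u, dx)
    return u

n = 100
dx = 1.0 / n
x = (np.arange(n) + 0.5) * dx        # cell-center coordinates
u = np.exp(-100.0 * (x - 0.5) ** 2)  # initial Gaussian pulse
total0 = u.sum()                     # conserved on a periodic grid
dt = 0.4 * dx                        # CFL-limited time step
for _ in range(50):
    u = multistage_step(u, dt, dx)
```

The telescoping flux differences make the scheme conservative by construction, which is the property that motivates finite-volume formulations for the Euler and Navier-Stokes equations.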
Team-Based Professional Development Interventions in Higher Education: A Systematic Review.
Gast, Inken; Schildkamp, Kim; van der Veen, Jan T
2017-08-01
Most professional development activities focus on individual teachers, such as mentoring or the use of portfolios. However, new developments in higher education require teachers to work together in teams more often. Due to these changes, there is a growing need for professional development activities focusing on teams. Therefore, this review study was conducted to provide an overview of what is known about professional development in teams in the context of higher education. A total of 18 articles were reviewed that describe the effects of professional development in teams on teacher attitudes and teacher learning. Furthermore, several factors that can either hinder or support professional development in teams are identified at the individual teacher level, at the team level, and also at the organizational level.
Toye, Francine; Seers, Kate; Allcock, Nick; Briggs, Michelle; Carr, Eloise; Andrews, JoyAnn; Barker, Karen
2013-03-21
Studies that systematically search for and synthesise qualitative research are becoming more evident in health care, and they can make an important contribution to patient care. However, there is still no agreement as to whether, or how, we should appraise studies for inclusion. We aimed to explore the intuitive processes that determined the 'quality' of qualitative research for inclusion in qualitative research syntheses. We were particularly interested to explore the way that knowledge was constructed. We used qualitative methods to explore the process of quality appraisal within a team of seven qualitative researchers funded to undertake a meta-ethnography of chronic non-malignant musculoskeletal pain. Team discussions took place monthly between October 2010 and June 2012 and were recorded and transcribed. Data were coded and organised using the constant comparative method. The development of our conceptual analysis was both iterative and collaborative. The strength of this team approach to quality came from open and honest discussion, where team members felt free to agree, disagree, or change their position within the safety of the group. We suggest two core facets of quality for inclusion in meta-ethnography: (1) conceptual clarity, i.e. how clearly the author has articulated a concept that facilitates theoretical insight; and (2) interpretive rigour, i.e., fundamentally, can the interpretation 'be trusted'? Our findings showed that three important categories help the reader to judge interpretive rigour: (i) What is the context of the interpretation? (ii) How inductive is the interpretation? (iii) Has the researcher challenged their interpretation? We highlight that methods alone do not determine the quality of research for inclusion in a meta-ethnography. The strength of a concept and its capacity to facilitate theoretical insight is integral to meta-ethnography, and arguably to the quality of research.
However, we suggest that to be judged 'good enough' there also needs to be some assurance that qualitative findings are more than simply anecdotal. Although our conceptual model was developed specifically for meta-ethnography, it may be transferable to other research methodologies.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
...), Global Product Development, Engineering Workstation Refresh Team, Working On-Site at General Motors... groups: The Non-Information Technology Business Development Team, the Engineering Application Support Team, and the Engineering Workstation Refresh Team. On February 2, 2011, the Department issued an...
RINGMesh: A programming library for developing mesh-based geomodeling applications
NASA Astrophysics Data System (ADS)
Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume
2017-07-01
RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows developers to implement new geomodeling methods and to plug in external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.
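One kind of validity check a library like this performs on surface-based structural models is verifying that a triangulated boundary surface is watertight. RINGMesh itself is C++; the function below is a hypothetical Python sketch of the idea, not RINGMesh's actual API:

```python
from collections import Counter

def surface_is_closed(triangles):
    # A triangulated surface is closed (watertight) when every edge
    # is shared by exactly two triangles. Edges are stored as sorted
    # vertex pairs so orientation does not matter.
    edges = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted(e))] += 1
    return all(count == 2 for count in edges.values())

# A tetrahedron's four faces form a closed surface...
tet = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
# ...while removing one face leaves a boundary, so the check fails.
open_surface = tet[:3]
```

Checks like this are what let a library reject invalid boundary representations before a volumetric model is built from them.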
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps to a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
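The core of such consistency checking can be sketched as comparing extracted dependencies against an allowed-dependency table derived from the design rules. The layer names and rules below are hypothetical illustrations, not the actual CFS rules:

```python
# Allowed-dependency rules, as one might derive from a developer's
# guide: each layer may only depend on itself and layers below it.
# Layer names here are hypothetical.
ALLOWED = {
    "app": {"app", "services", "os_abstraction"},
    "services": {"services", "os_abstraction"},
    "os_abstraction": {"os_abstraction"},
}

def check_rules(dependencies):
    # dependencies: (from_layer, to_layer) edges extracted from the
    # implementation. Returns the edges that violate ALLOWED, i.e.
    # architecturally significant deviations.
    return [(src, dst) for src, dst in dependencies
            if dst not in ALLOWED.get(src, set())]

deps = [
    ("app", "services"),
    ("services", "os_abstraction"),
    ("os_abstraction", "services"),  # violation: upward dependency
]
violations = check_rules(deps)
```

In practice the dependency edges would be extracted from the code base by static analysis; the comparison step itself is this simple.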
MX: A beamline control system toolkit
NASA Astrophysics Data System (ADS)
Lavender, William M.
2000-06-01
The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.
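The pattern MX builds on, device servers accepting text commands over TCP/IP, can be sketched in Python. The MOVE/POS commands below are illustrative inventions for the sketch, not MX's actual protocol:

```python
import socket
import threading

def motor_server(sock):
    # A toy device server: accepts one connection and answers
    # line-oriented text commands. MOVE shifts a simulated motor
    # position; POS reports it. Commands are illustrative only.
    position = [0.0]
    conn, _ = sock.accept()
    with conn:
        for line in conn.makefile():
            parts = line.split()
            if parts and parts[0] == "MOVE":
                position[0] += float(parts[1])
                conn.sendall(b"OK\n")
            elif parts and parts[0] == "POS":
                conn.sendall(f"{position[0]}\n".encode())

server = socket.socket()
server.bind(("127.0.0.1", 0))   # OS-assigned free port
server.listen(1)
threading.Thread(target=motor_server, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(b"MOVE 2.5\nMOVE -1.0\nPOS\n")
reader = client.makefile()
replies = [reader.readline().strip() for _ in range(3)]
```

Because the protocol is plain text over TCP/IP, the same server can back both command-line and GUI clients, which is the retargetability property the toolkit aims for.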
Command and Control Software Development Memory Management
NASA Technical Reports Server (NTRS)
Joseph, Austin Pope
2017-01-01
This internship was initially meant to cover the implementation of unit test automation for a NASA ground control project. As is often the case with large development projects, the scope and breadth of the internship changed. Instead, the internship focused on finding and correcting memory leaks and errors as reported by a COTS software product meant to track such issues. Memory leaks come in many different flavors, and some of them are more benign than others. On the extreme end, a program might be dynamically allocating memory and not correctly deallocating it when it is no longer in use. This is called a direct memory leak, and in the worst case it can consume all available memory and crash the program. If the leaks are small they may simply slow the program down, which, in a safety-critical system (a system for which a failure or design error can cause a risk to human life), is still unacceptable. The ground control system is managed in smaller sub-teams, referred to as CSCIs. The CSCI that this internship focused on is responsible for monitoring the health and status of the system. This team's software had several methods/modules that were leaking significant amounts of memory. Since most of the code in this system is safety-critical, correcting memory leaks is a necessity.
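The ground system is C/C++ and used a COTS leak detector, but the principle of allocation tracking can be illustrated with Python's standard tracemalloc module: snapshot allocations, run the suspect code, and diff. The leaky_update function is a contrived example of a module that allocates without releasing:

```python
import tracemalloc

cache = []  # simulated direct leak: references accumulate forever

def leaky_update(n):
    # Grows on every call and never frees, like a module that
    # allocates without a matching deallocation.
    cache.append(bytearray(n))

tracemalloc.start()
before = tracemalloc.take_snapshot()
for _ in range(100):
    leaky_update(10_000)
after = tracemalloc.take_snapshot()

# Diff the snapshots by source line; the top entry is the site
# with the largest allocation growth, i.e. the leak.
stats = after.compare_to(before, "lineno")
leaked_bytes = stats[0].size_diff
tracemalloc.stop()
```

The workflow mirrors what leak-tracking tools report for C/C++: a ranked list of allocation sites whose live memory keeps growing between checkpoints.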
Staff Experiences Forming and Sustaining Palliative Care Teams in Nursing Homes.
Norton, Sally A; Ladwig, Susan; Caprio, Thomas V; Quill, Timothy E; Temkin-Greener, Helena
2018-01-03
Building palliative care (PC) capacity in nursing homes (NH) is a national priority and essential to providing high quality care for residents with advanced illness. We report on NH staff experiences in developing and sustaining Palliative Care Teams (PCTeams) as part of a randomized clinical trial to "Improve Palliative Care through Teamwork" (IMPACTT). We conducted rapid ethnographic assessments of all NH (N = 14) in the intervention arm. Data included semistructured interviews with direct care and administrative staff (n = 41), field observations, and written materials. We used a phased approach to data analysis including open coding and comparative analyses within and across homes. We found four key structural themes in our analysis including: administrative support, financial considerations, turnover and staffing, and competing priorities. The development and sustainability of the nascent PCTeams were constantly threatened by competing priorities and the key factor in their success was consistent and tangible administrative support. While improving PC in NHs is a recognized priority, lack of stable infrastructure and unintended consequences of reimbursement policies created conditions which often thwarted the sustainability of the PCTeams. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Embryo transfer: a comparative biosecurity advantage in international movements of germplasm.
Thibier, M
2011-04-01
This paper uses cattle as a model to provide an overview of the hazards involved in the transfer of in vivo-derived and in vitro-produced embryos. While scientific studies in recent decades have led to the identification of pathogens that may be associated with both in vivo- and in vitro-derived embryos, those studies have also been the basis of appropriate disease control measures to reduce the risks to a negligible level. These disease control measures have been identified and assessed by the International Embryo Transfer Society's (IETS) Health and Safety Advisory Committee, the expert body that advises the World Organisation for Animal Health (OIE) on matters related to the safety of embryo transfer. Through the OIE's processes for developing and adopting international standards, the disease control measures identified by the IETS have been incorporated into the Terrestrial Animal Health Code. The basic principles rely on the crucial ethical roles of the embryo collection team and embryo transfer team, under the leadership of approved veterinarians. Decades of experience, with nearly 10 million embryos transferred, have demonstrated the very significant biosecurity advantage that embryo transfer technology has when moving germplasm internationally, provided that the international standards developed by the IETS and adopted by the OIE are strictly followed.
LIGHT WATER REACTOR ACCIDENT TOLERANT FUELS IRRADIATION TESTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmack, William Jonathan; Barrett, Kristine Eloise; Chichester, Heather Jean MacLean
2015-09-01
The purpose of Accident Tolerant Fuels (ATF) experiments is to test novel fuel and cladding concepts designed to replace the current zirconium alloy uranium dioxide (UO2) fuel system. The objective of this Research and Development (R&D) is to develop novel ATF concepts that will be able to withstand loss of active cooling in the reactor core for a considerably longer time period than the current fuel system while maintaining or improving the fuel performance during normal operations, operational transients, design basis, and beyond design basis events. It was necessary to design, analyze, and fabricate drop-in capsules to meet the requirements for testing under prototypic LWR temperatures in Idaho National Laboratory's Advanced Test Reactor (ATR). Three industry-led teams and one DOE team from Oak Ridge National Laboratory provided fuel rodlet samples for their new concepts for ATR insertion in 2015. As-built projected temperature calculations were performed on the ATF capsules using the BISON fuel performance code. BISON is an application of INL's Multiphysics Object-Oriented Simulation Environment (MOOSE), which is a massively parallel finite element based framework used to solve systems of fully coupled nonlinear partial differential equations. Both 2D and 3D models were set up to examine cladding and fuel performance.
Economic analysis of athletic team coverage by an orthopedic practice.
Lombardi, Nicholas; Freedman, Kevin; Tucker, Brad; Austin, Luke; Eck, Brandon; Pepe, Matt; Tjoumakaris, Fotios
2015-11-01
Coverage of high school football by orthopedic sports medicine specialists is considered the standard of care in many localities, but the economic viability of this endeavor has never been investigated. The primary purpose of the present investigation was to perform an economic analysis of local high school sports coverage by an orthopedic sports medicine practice. From January 2010 to June 2012, a prospective injury report database was used to collect sports injuries from five high school athletic programs covered by a single, private orthopedic sports medicine practice. Patients referred for orthopedic care were then tracked to determine the expected cost of care (potential revenue). Evaluation and management codes and Current Procedural Terminology codes were obtained to determine the value of physician visits and surgical care rendered. Overhead costs were calculated based on historical rates within our practice and incorporated to determine estimated profit. A total of 19,165 athletic trainer contacts with athletes playing all sports, including both those 'on-field' and in the training room, resulted in 473 (2.5%) physician referrals. The covering orthopedic practice handled 89 (27.9%) of the orthopedic referrals. Of orthopedic physician referrals, 26 (5.4%) required orthopedic surgical treatment. The covering team practice handled 17 of 26 (65%) surgical cases. The total revenue collected by the covering team practice was $26,226.14. The overhead cost of treatment was $9,441.41. The overall estimated profit of orthopedic visits and treatment during this period for the covering practice was $16,784.73. The covering team practice handled 28% of the orthopedic referrals and 65% of the surgical cases, and captured 59% of the potential profit. An increase in physician referrals could increase the benefit for orthopedic surgeons.
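The headline figures in the abstract can be reproduced with simple arithmetic (values taken directly from the reported counts and dollar amounts):

```python
# Estimated profit is collected revenue minus overhead.
revenue = 26_226.14
overhead = 9_441.41
profit = revenue - overhead          # $16,784.73 as reported

# Capture rates are simple ratios of the reported counts.
contacts = 19_165                    # athletic trainer contacts
referrals = 473                      # physician referrals
surgical_handled, surgical_total = 17, 26

referral_rate = referrals / contacts             # ~2.5% of contacts
surgical_share = surgical_handled / surgical_total  # ~65% of cases
```

Note that the 27.9% figure for referrals handled uses the orthopedic-referral subset as its denominator, which is not reported separately, so it cannot be recomputed from the abstract alone.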
Biehl, Michael; Sadowski, Peter; Bhanot, Gyan; Bilal, Erhan; Dayarian, Adel; Meyer, Pablo; Norel, Raquel; Rhrissorrakrai, Kahn; Zeller, Michael D.; Hormoz, Sahand
2015-01-01
Motivation: Animal models are widely used in biomedical research for reasons ranging from practical to ethical. An important issue is whether rodent models are predictive of human biology. This has been addressed recently in the framework of a series of challenges designed by the systems biology verification for Industrial Methodology for Process Verification in Research (sbv IMPROVER) initiative. In particular, one of the sub-challenges was devoted to the prediction of protein phosphorylation responses in human bronchial epithelial cells, exposed to a number of different chemical stimuli, given the responses in rat bronchial epithelial cells. Participating teams were asked to make inter-species predictions on the basis of available training examples, comprising transcriptomics and phosphoproteomics data. Results: Here, the two best performing teams present their data-driven approaches and computational methods. In addition, post hoc analyses of the datasets and challenge results were performed by the participants and challenge organizers. The challenge outcome indicates that successful prediction of protein phosphorylation status in human based on rat phosphorylation levels is feasible. However, within the limitations of the computational tools used, the inclusion of gene expression data does not improve the prediction quality. The post hoc analysis of time-specific measurements sheds light on the signaling pathways in both species. Availability and implementation: A detailed description of the dataset, challenge design and outcome is available at www.sbvimprover.com. The code used by team IGB is provided under http://github.com/uci-igb/improver2013. Implementations of the algorithms applied by team AMG are available at http://bhanot.biomaps.rutgers.edu/wiki/AMG-sc2-code.zip. Contact: meikelbiehl@gmail.com PMID:24994890
Datathons and Software to Promote Reproducible Research.
Celi, Leo Anthony; Lokhandwala, Sharukh; Montgomery, Robert; Moses, Christopher; Naumann, Tristan; Pollard, Tom; Spitz, Daniel; Stretch, Robert
2016-08-24
Datathons facilitate collaboration between clinicians, statisticians, and data scientists in order to answer important clinical questions. Previous datathons have resulted in numerous publications of interest to the critical care community and serve as a viable model for interdisciplinary collaboration. We report on open-source software called Chatto that was created by members of our group in the context of the second international Critical Care Datathon, held in September 2015. Datathon participants formed teams to discuss potential research questions and the methods required to address them. They were provided with the Chatto suite of tools to facilitate their teamwork. Each multidisciplinary team spent the next 2 days with clinicians working alongside data scientists to write code, extract and analyze data, and reformulate their queries in real time as needed. All projects were then presented on the last day of the datathon to a panel of judges that consisted of clinicians and scientists. Use of Chatto was particularly effective in the datathon setting, enabling teams to reduce the time spent configuring their research environments to just a few minutes, a process that would normally take hours to days. Chatto continued to serve as a useful research tool after the conclusion of the datathon. This suite of tools fulfills two purposes: (1) facilitation of interdisciplinary teamwork through archiving and version control of datasets, analytical code, and team discussions, and (2) advancement of research reproducibility by functioning postpublication as an online environment in which independent investigators can rerun or modify analyses with relative ease. With the introduction of Chatto, we hope to solve a variety of challenges presented by collaborative data mining projects while improving research reproducibility.
Management Guidelines for Database Developers' Teams in Software Development Projects
NASA Astrophysics Data System (ADS)
Rusu, Lazar; Lin, Yifeng; Hodosi, Georg
The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a special team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: no management guidelines exist for it. The team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during DBDs' work. Therefore, in this paper we have developed a set of management guidelines, which includes 8 fundamental tasks and 17 practices from the software development process, by using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular, Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could be very useful for other companies that use a DBD team and could contribute towards an increase in the efficiency of these teams in their work on software development projects.
Toward Real-Time Infoveillance of Twitter Health Messages.
Colditz, Jason B; Chu, Kar-Hai; Emery, Sherry L; Larkin, Chandler R; James, A Everette; Welling, Joel; Primack, Brian A
2018-06-21
There is growing interest in conducting public health research using data from social media. In particular, Twitter "infoveillance" has demonstrated utility across health contexts. However, rigorous and reproducible methodologies for using Twitter data in public health are not yet well articulated, particularly those related to content analysis, which is a highly popular approach. In 2014, we gathered an interdisciplinary team of health science researchers, computer scientists, and methodologists to begin implementing an open-source framework for real-time infoveillance of Twitter health messages (RITHM). Through this process, we documented common challenges and novel solutions to inform future work in real-time Twitter data collection and subsequent human coding. The RITHM framework allows researchers and practitioners to use well-planned and reproducible processes in retrieving, storing, filtering, subsampling, and formatting data for health topics of interest. Further considerations for human coding of Twitter data include coder selection and training, data representation, codebook development and refinement, and monitoring coding accuracy and productivity. We illustrate methodological considerations through practical examples from formative work related to hookah tobacco smoking, and we reference essential methods literature related to understanding and using Twitter data. (Am J Public Health. Published online ahead of print June 21, 2018: e1-e6. doi:10.2105/AJPH.2018.304497).
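The retrieve-filter-subsample stage described for the RITHM framework can be sketched as follows. The keywords, tweet fields, and sampling approach below are illustrative assumptions for the sketch, not RITHM's actual configuration:

```python
import random
import re

# Topic filter: keep tweets matching study keywords (here, hookah
# tobacco terms, echoing the paper's formative example).
KEYWORDS = re.compile(r"\bhookah\b|\bshisha\b", re.IGNORECASE)

def filter_tweets(tweets):
    return [t for t in tweets if KEYWORDS.search(t["text"])]

def subsample(tweets, rate, seed=42):
    # Fixed-seed random subsampling so the subset drawn for human
    # coding is reproducible across runs.
    rng = random.Random(seed)
    return [t for t in tweets if rng.random() < rate]

stream = [
    {"id": 1, "text": "Trying the new hookah lounge tonight"},
    {"id": 2, "text": "Great coffee this morning"},
    {"id": 3, "text": "Shisha with friends"},
]
relevant = filter_tweets(stream)
sample = subsample(relevant, rate=1.0)  # rate=1.0 keeps everything
```

Fixing the random seed is one concrete way to make the subsampling step reproducible, which is the methodological concern the framework is meant to address.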
Performance Analysis, Modeling and Scaling of HPC Applications and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatele, Abhinav
2016-01-13
Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC Applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.
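A scaling study of the kind listed under the first front typically reduces to computing speedup and parallel efficiency from wall-clock timings at increasing core counts. A minimal sketch, with illustrative numbers rather than PAMS measurements:

```python
def strong_scaling(timings):
    # timings: dict mapping core count -> wall-clock seconds for a
    # fixed problem size. Speedup and parallel efficiency are
    # computed relative to the smallest run.
    base_cores = min(timings)
    base_time = timings[base_cores]
    table = {}
    for cores, t in sorted(timings.items()):
        speedup = base_time / t
        efficiency = speedup * base_cores / cores
        table[cores] = (speedup, efficiency)
    return table

# Illustrative timings for a fixed-size problem.
timings = {64: 100.0, 128: 55.0, 256: 32.0}
table = strong_scaling(timings)
```

Efficiency below 1.0 at higher core counts quantifies the communication and load-imbalance overheads that performance analysis tools then help attribute to specific code regions.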
Brookman-Frazee, Lauren; Stahmer, Aubyn C
2018-05-09
The Centers for Disease Control and Prevention (2018) estimates that 1 in 59 children has autism spectrum disorder (ASD), and the annual cost of ASD in the U.S. is estimated to be $236 billion. Evidence-based interventions have been developed and demonstrate effectiveness in improving child outcomes. However, research on generalizable methods to scale up these practices in the multiple service systems caring for these children has been limited and is critical to meet this growing public health need. This project includes two coordinated studies testing the effectiveness of the Translating Evidence-based Interventions (EBI) for ASD: Multi-Level Implementation Strategy (TEAMS) model. TEAMS focuses on improving implementation leadership, organizational climate, and provider attitudes and motivation in order to improve two key implementation outcomes, provider training completion and intervention fidelity, and subsequent child outcomes. The TEAMS Leadership Institute applies implementation leadership strategies, and TEAMS Individualized Provider Strategies for training applies motivational interviewing strategies, to facilitate provider and organizational behavior change. A cluster-randomized implementation/effectiveness hybrid type 3 trial with a dismantling design will be used to understand the effectiveness of TEAMS and the mechanisms of change across settings and participants. Study #1 will test the TEAMS model with AIM HI (An Individualized Mental Health Intervention for ASD) in publicly funded mental health services. Study #2 will test TEAMS with CPRT (Classroom Pivotal Response Teaching) in education settings.
Thirty-seven mental health programs and 37 school districts will be randomized, stratified by county and study, to one of four groups (Standard Provider Training Only, Standard Provider Training + Leader Training, Enhanced Provider Training, Enhanced Provider Training + Leader Training) to test the effectiveness of combining standard, EBI-specific training with the two TEAMS modules, individually and together, on multiple implementation outcomes. Implementation outcomes, including provider training completion, fidelity (coded by observers blind to group assignment), and child behavior change, will be examined for 295 mental health providers, 295 teachers, and 590 children. This implementation intervention has the potential to increase the quality of care for ASD in publicly funded settings by improving the effectiveness of intervention implementation. The process and modules will be generalizable to multiple service systems, providers, and interventions, providing broad impact in community services. This study is registered with Clinicaltrials.gov (NCT03380078). Registered 20 December 2017, retrospectively registered.
Combat injury coding: a review and reconfiguration.
Lawnick, Mary M; Champion, Howard R; Gennarelli, Thomas; Galarneau, Michael R; D'Souza, Edwin; Vickers, Ross R; Wing, Vern; Eastridge, Brian J; Young, Lee Ann; Dye, Judy; Spott, Mary Ann; Jenkins, Donald H; Holcomb, John; Blackbourne, Lorne H; Ficke, James R; Kalin, Ellen J; Flaherty, Stephen
2013-10-01
The current civilian Abbreviated Injury Scale (AIS), designed for automobile crash injuries, yields important information about civilian injuries. It has been recognized for some time, however, that both the AIS and AIS-based scores such as the Injury Severity Score (ISS) are inadequate for describing penetrating injuries, especially those sustained in combat. Existing injury coding systems do not adequately describe (they actually exclude) combat injuries such as the devastating multi-mechanistic injuries resulting from attacks with improvised explosive devices (IEDs). After quantifying the inapplicability of current coding systems, the Military Combat Injury Scale (MCIS), which includes injury descriptors that accurately characterize combat anatomic injury, and the Military Functional Incapacity Scale (MFIS), which indicates immediate tactical functional impairment, were developed by a large tri-service military and civilian group of combat trauma subject-matter experts. Assignment of MCIS severity levels was based on urgency, level of care needed, and risk of death from each individual injury. The MFIS was developed based on the casualty's ability to shoot, move, and communicate, and comprises four levels ranging from "Able to continue mission" to "Lost to military." Separate functional impairments were identified for injuries aboard ship. Preliminary evaluation of MCIS discrimination, calibration, and casualty disposition was performed on 992 combat-injured patients using two modeling processes. Based on combat casualty data, the MCIS is a new, simpler, comprehensive severity scale with 269 codes (vs. 1999 in AIS) that specifically characterize and distinguish the many unique injuries encountered in combat. The MCIS integrates with the MFIS, which associates immediate combat functional impairment with minor and moderate-severity injuries. 
Predictive validation on combat datasets shows improved performance over AIS-based tools in addition to improved face, construct, and content validity and coding inter-rater reliability. Thus, the MCIS has greater relevance, accuracy, and precision for many military-specific applications. Over a period of several years, the Military Combat Injury Scale and Military Functional Incapacity Scale were developed, tested and validated by teams of civilian and tri-service military expertise. MCIS shows significant promise in documenting the nature, severity and complexity of modern combat injury.
Flexible knowledge repertoires: communication by leaders in trauma teams
2012-01-01
Background In emergency situations, it is important for the trauma team to efficiently communicate their observations and assessments. One common communication strategy is “closed-loop communication”, which can be described as a transmission model in which feedback is of great importance. The role of the leader is to create a shared goal in order to achieve consensus in the work for the safety of the patient. The purpose of this study was to analyze how formal leaders communicate knowledge, create consensus, and position themselves in relation to others in the team. Methods Sixteen trauma teams were audio- and video-recorded during high fidelity training in an emergency department. Each team consisted of six members: one surgeon or emergency physician (the designated team leader), one anaesthesiologist, one nurse anaesthetist, one enrolled nurse from the theatre ward, one registered nurse and one enrolled nurse from the emergency department (ED). The communication was transcribed and analyzed, inspired by discourse psychology and Strauss’ concept of “negotiated order”. The data were organized and coded in NVivo 9. Results The findings suggest that leaders use coercive, educational, discussing and negotiating strategies to work things through. The leaders in this study used different repertoires to convey their knowledge to the team, in order to create a common goal of the priorities of the work. Changes in repertoires were dependent on the urgency of the situation and the interaction between team members. When using these repertoires, the leaders positioned themselves in different ways, either on an authoritarian or a more egalitarian level. Conclusion This study indicates that communication in trauma teams is complex and consists of more than just transferring messages quickly. It also concerns what the leaders express, and even more importantly, how they speak to and involve other team members. PMID:22747848
Müller, C; Plewnia, A; Becker, S; Rundel, M; Zimmermann, L; Körner, M
2015-08-19
Interdisciplinary teamwork and team interventions are highly valued in the rehabilitation sector because they can improve outcomes of care for persons with complex health problems. However, little is known about expectations and requests regarding team interventions, especially in medical rehabilitation. This study aimed to explore how clinical managers and health professionals within multidisciplinary rehabilitation teams describe their expectations and requests regarding team-training interventions in the field of medical rehabilitation. Following a qualitative research methodology, data were collected using semi-structured interviews and focus groups at five rehabilitation clinics in Germany. We conducted face-to-face interviews with five clinical managers and 13 department heads of health care teams, as well as five focus groups with a total of 35 members of interdisciplinary rehabilitation teams. The data were then analyzed through qualitative content analysis, encompassing data coding and inductive thematic analysis. The exploration of team members' and clinical managers' descriptions showed that, to them, interdisciplinary team training programs should include a wide array of training contents. Seven common core themes emerged from the interviews, including participation of employees, leadership, communication, team meetings, team composition, coordination, and equal esteem. Additionally, 13 themes were identified by either managers or team members. The body of expectations regarding team training content in healthcare spans the continuum of changes on the team and organizational levels. On the organizational level, a number of structural factors were mentioned (e.g. improving the general conditions for team meetings, organized workshops to exchange interdisciplinary experiences, and leadership training), and on the team level, changes in procedural factors were listed (e.g. 
optimizing the consecutive planning and coordination of patient treatments, clarity with regard to roles and responsibilities of team members, and mutual esteem and appreciation between different professions). The synthesis underscores that there is meaningful heterogeneity in team training needs; training interventions should be locally adapted for each clinic in terms of training content and training strategies. Tailored team interventions are important for rehabilitation clinics. Future work should evaluate employed team training concepts over time as well as training contents, implementation strategies, and learning outcomes. This includes using robust study designs and evaluating team-training effects.
Team-Based Professional Development Interventions in Higher Education: A Systematic Review
Gast, Inken; Schildkamp, Kim; van der Veen, Jan T.
2017-01-01
Most professional development activities focus on individual teachers, such as mentoring or the use of portfolios. However, new developments in higher education require teachers to work together in teams more often. Due to these changes, there is a growing need for professional development activities focusing on teams. Therefore, this review study was conducted to provide an overview of what is known about professional development in teams in the context of higher education. A total of 18 articles were reviewed that describe the effects of professional development in teams on teacher attitudes and teacher learning. Furthermore, several factors that can either hinder or support professional development in teams are identified at the individual teacher level, at the team level, and also at the organizational level. PMID:28989192
Mark, Lynette J; Herzer, Kurt R; Cover, Renee; Pandian, Vinciya; Bhatti, Nasir I; Berkow, Lauren C; Haut, Elliott R; Hillel, Alexander T; Miller, Christina R; Feller-Kopman, David J; Schiavi, Adam J; Xie, Yanjun J; Lim, Christine; Holzmueller, Christine; Ahmad, Mueen; Thomas, Pradeep; Flint, Paul W; Mirski, Marek A
2015-07-01
Difficult airway cases can quickly become emergencies, increasing the risk of life-threatening complications or death. Emergency airway management outside the operating room is particularly challenging. We developed a quality improvement program-the Difficult Airway Response Team (DART)-to improve emergency airway management outside the operating room. DART was implemented by a team of anesthesiologists, otolaryngologists, trauma surgeons, emergency medicine physicians, and risk managers in 2005 at The Johns Hopkins Hospital in Baltimore, Maryland. The DART program had 3 core components: operations, safety, and education. The operations component focused on developing a multidisciplinary difficult airway response team, standardizing the emergency response process, and deploying difficult airway equipment carts throughout the hospital. The safety component focused on real-time monitoring of DART activations and learning from past DART events to continuously improve system-level performance. This objective entailed monitoring the paging system, reporting difficult airway events and DART activations to a Web-based registry, and using in situ simulations to identify and mitigate defects in the emergency airway management process. The educational component included development of a multispecialty difficult airway curriculum encompassing case-based lectures, simulation, and team building/communication to ensure consistency of care. Educational materials were also developed for non-DART staff and patients to inform them about the needs of patients with difficult airways and ensure continuity of care with other providers after discharge. Between July 2008 and June 2013, DART managed 360 adult difficult airway events comprising 8% of all code activations. Predisposing patient factors included body mass index >40, history of head and neck tumor, prior difficult intubation, cervical spine injury, airway edema, airway bleeding, and previous or current tracheostomy. 
Twenty-three patients (6%) required emergent surgical airways. Sixty-two patients (17%) were stabilized and transported to the operating room for definitive airway management. There were no airway management-related deaths, sentinel events, or malpractice claims in adult patients managed by DART. Five in situ simulations conducted in the first program year improved DART's teamwork, communication, and response times and increased the functionality of the difficult airway carts. Over the 5-year period, we conducted 18 airway courses, through which >200 providers were trained. DART is a comprehensive program for improving difficult airway management. Future studies will examine the comparative effectiveness of the DART program and evaluate how DART has impacted patient outcomes, operational efficiency, and costs of care.
Mark, Lynette J.; Herzer, Kurt R.; Cover, Renee; Pandian, Vinciya; Bhatti, Nasir I.; Berkow, Lauren C.; Haut, Elliott R.; Hillel, Alexander T.; Miller, Christina R.; Feller-Kopman, David J.; Schiavi, Adam J.; Xie, Yanjun J.; Lim, Christine; Holzmueller, Christine; Ahmad, Mueen; Thomas, Pradeep; Flint, Paul W.; Mirski, Marek A.
2015-01-01
Background Difficult airway cases can quickly become emergencies, increasing the risk of life-threatening complications or death. Emergency airway management outside the operating room is particularly challenging. Methods We developed a quality improvement program—the Difficult Airway Response Team (DART)—to improve emergency airway management outside the operating room. DART was implemented by a team of anesthesiologists, otolaryngologists, trauma surgeons, emergency medicine physicians, and risk managers in 2005 at The Johns Hopkins Hospital in Baltimore, Maryland. The DART program had three core components: operations, safety, and education. The operations component focused on developing a multidisciplinary difficult airway response team, standardizing the emergency response process, and deploying difficult airway equipment carts throughout the hospital. The safety component focused on real-time monitoring of DART activations and learning from past DART events to continuously improve system-level performance. This objective entailed monitoring the paging system, reporting difficult airway events and DART activations to a web-based registry, and using in situ simulations to identify and mitigate defects in the emergency airway management process. The educational component included development of a multispecialty difficult airway curriculum encompassing case-based lectures, simulation, and team building/communication to ensure consistency of care. Educational materials were also developed for non-DART staff and patients to inform them about the needs of patients with difficult airways and ensure continuity of care with other providers after discharge. Results Between July 2008 and June 2013, DART managed 360 adult difficult airway events comprising 8% of all code activations. 
Predisposing patient factors included body mass index > 40, history of head and neck tumor, prior difficult intubation, cervical spine injury, airway edema, airway bleeding, and previous or current tracheostomy. Twenty-three patients (6%) required emergent surgical airways. Sixty-two patients (17%) were stabilized and transported to the operating room for definitive airway management. There were no airway management-related deaths, sentinel events, or malpractice claims in adult patients managed by DART. Five in situ simulations conducted in the first program year improved DART's teamwork, communication, and response times and increased the functionality of the difficult airway carts. Over the 5-year period, we conducted 18 airway courses, through which more than 200 providers were trained. Conclusions DART is a comprehensive program for improving difficult airway management. Future studies will examine the comparative effectiveness of the DART program and evaluate how DART has impacted patient outcomes, operational efficiency, and costs of care. PMID:26086513
Building and expanding interprofessional teaching teams.
Darlow, Ben; McKinlay, Eileen; Gallagher, Peter; Beckingsale, Louise; Coleman, Karen; Perry, Meredith; Pullon, Sue
2017-03-01
INTRODUCTION Interprofessional education (IPE) aims to prepare learners to work in collaborative health-care teams. The University of Otago, Wellington has piloted, developed and expanded an IPE programme since 2011. An interprofessional teaching team has developed alongside this programme. AIMS This study aimed to understand the development of a university-based interprofessional teaching team over a 4-year period and generate insights to aid the development of such teams elsewhere. METHODS Two semi-structured audio-recorded educator focus groups were conducted at key times in the development of the IPE programme in 2011 and 2014. The programme focused on long-term condition management and involved students from dietetics, medicine, physiotherapy and radiation therapy. Focus group transcripts were independently analysed by two researchers using Thematic Analysis to identify broad themes. Initial themes were compared, discussed and combined to form a thematic framework. The thematic framework was verified by the education team and subsequently updated and reorganised. RESULTS Three key themes emerged: (i) development as an interprofessional educator; (ii) developing a team; and (iii) risk and reward. Teaching in an interprofessional environment was initially daunting but confidence increased with experience. Team teaching highlighted educators' disciplinary roles and skill sets and exposed educators to different teaching approaches. Educators perceived they modelled team development processes to students through their own development as a team. Interprofessional teaching was challenging to organise but participation was rewarding. Programme expansion increased the risks and complexity, but also acted as a stimulus for development and energised the teaching team. DISCUSSION Interprofessional teaching is initially challenging but ultimately enriching. Interprofessional teaching skills take time to develop and perspectives of role change over time. 
Educator team development is aided by commitment, understanding, enthusiasm, leadership and trust.
Smith, Katherine; Washington, Carmen; Brown, Jennifer; Vadnais, Alison; Kroart, Laura; Ferguson, Jacqueline; Cohen, Joanna
2015-01-01
Tobacco remains the world's leading preventable cause of death, with the majority of tobacco-caused deaths occurring in low- and middle-income countries. The first global health treaty, the Framework Convention on Tobacco Control (FCTC), outlines a set of policy initiatives that have been demonstrated as effective in reducing tobacco use. Article 11 of the FCTC focuses on using the tobacco package to communicate tobacco-caused harms; it also seeks to restrict the delivery of misleading information about the product on the pack. The objective of this study was to establish a surveillance system for tobacco packs in the 14 low- and middle-income countries with the greatest number of smokers. The Tobacco Pack Surveillance System (TPackSS) monitors whether required health warnings on tobacco packages are being implemented as intended, and identifies pack designs and appeals that might violate or detract from the communication of harm-related information and undermine the impact of a country's tobacco packaging laws. The protocol outlined is intended to be applicable or adaptable for surveillance efforts in other countries. Tobacco packs were collected in 14 countries during 2013. The intention was, to the extent possible, to construct a census of "unique" pack presentations available for purchase in each country. The TPackSS team partnered with in-country field staff to implement a standardized protocol for acquiring packs from 36 diverse neighborhoods across three cities in each country. At the time of purchase, data on price and place of acquisition of each pack was recorded. The field staff, according to a standardized protocol, then photographed packs before they were shipped to the United States for coding and archiving. Each pack was coded for compliance with the country-specific health warning label laws, as well as for key design features of the pack and appeals of the branding elements. 
The coding protocols were developed based upon prior research, expert opinion, and communication theories. Each pack was coded by two independent coders, with consistency of personnel across the project. We routinely measured intercoder reliability, and only retained variables for which a good level of reliability was achieved. Variables where reliability was too low were not included in final analyses, and any inconsistencies in coding were resolved on a daily basis. Across the 14 countries, the TPackSS team collected 3307 tobacco packs. We have established a publicly accessible, Internet archive of these packs that is intended for use by the tobacco control policy advocacy and research community.
Banna, Jinan C; Gilliland, Betsy; Keefe, Margaret; Zheng, Dongping
2016-09-26
Understanding views about what constitutes a healthy diet in diverse populations may inform design of culturally tailored behavior change interventions. The objective of this study was to describe perspectives on healthy eating among Chinese and American young adults and identify similarities and differences between these groups. Chinese (n = 55) and American (n = 57) undergraduate students in Changsha, Hunan, China and Honolulu, Hawai'i, U.S.A. composed one- to two-paragraph responses to the following prompt: "What does the phrase 'a healthy diet' mean to you?" Researchers used content analysis to identify predominant themes using Dedoose (version 5.2.0, SocioCultural Research Consultants, LLC, Los Angeles, CA, 2015). Three researchers independently coded essays and grouped codes with similar content. The team then identified themes and sorted them in discussion. Two researchers then deductively coded the entire data set using eight codes developed from the initial coding and calculated total code counts for each group of participants. Chinese students mentioned physical outcomes, such as maintaining immunity and digestive health. Timing of eating, with regular meals and greater intake during day than night, was emphasized. American students described balancing among food groups and balancing consumption with exercise, with physical activity considered essential. Students also stated that food components such as sugar, salt and fat should be avoided in large quantities. Similarities included principles such as moderation and fruits and vegetables as nutritious, and differences included foods to be restricted and meal timing. While both groups emphasized specific foods and guiding dietary principles, several distinctions in viewpoints emerged. The diverse views may reflect food-related messages to which participants are exposed both through the media and educational systems in their respective countries. 
Future studies may further examine themes that may not typically be addressed in nutrition education programs in diverse populations of young adults. Gaining greater knowledge of the ways in which healthy eating is viewed will allow for development of interventions that are sensitive to the traditional values and predominant views of health in various groups.
NASA/industry advanced turboprop technology program
NASA Technical Reports Server (NTRS)
Ziemianski, Joseph A.; Whitlow, John B., Jr.
1988-01-01
Experimental and analytical effort shows that use of advanced turboprop (propfan) propulsion instead of conventional turbofans in the older narrow-body airline fleet could reduce fuel consumption for this type of aircraft by up to 50 percent. The NASA Advanced Turboprop (ATP) program was formulated to address the key technologies required for these thin, swept-blade propeller concepts. A NASA, industry, and university team was assembled to develop and validate applicable design codes and prove by ground and flight test the viability of these propeller concepts. Some of the history of the ATP project, an overview of some of the issues, and a summary of the technology developed to make advanced propellers viable in the high-subsonic cruise speed application are presented. The ATP program was awarded the prestigious Robert J. Collier Trophy for the greatest achievement in aeronautics and astronautics in America in 1987.
Teamwork in perioperative nursing. Understanding team development, effectiveness, evaluation.
Farley, M J
1991-03-01
Teams are an essential part of perioperative nursing practice. Nurses who have a knowledge of teamwork and experience in working on teams have a greater understanding of the processes and problems involved as teams develop from new, immature teams to those that are mature and effective. This understanding will assist nurses in helping their teams achieve a higher level of productivity, and members will be more satisfied with team efforts. Team development progresses through several stages. Each stage has certain characteristics and desired outcomes. At each stage, team members and leaders have certain responsibilities. Team growth does not take place automatically and inevitably, but as a consequence of conscious and unconscious efforts of its leader and members to solve problems and satisfy needs. Building and maintaining a team is certainly work, but work that brings a great deal of satisfaction and feelings of pride in accomplishment. According to I Tenzer, RN, MS, teamwork "is not a panacea; it is a viable approach to developing a hospital's most valuable resource--people."
Xyce parallel electronic simulator design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thornquist, Heidi K.; Rankin, Eric Lamont; Mei, Ting
2010-09-01
This document is the Xyce Circuit Simulator developer guide. Xyce has been designed from the 'ground up' to be a SPICE-compatible, distributed memory parallel circuit simulator. While it is in many respects a research code, Xyce is intended to be a production simulator. As such, having software quality engineering (SQE) procedures in place to ensure a high level of code quality and robustness is essential. Version control, issue tracking, customer support, C++ style guidelines and the Xyce release process are all described. The Xyce Parallel Electronic Simulator has been under development at Sandia since 1999. Historically, Xyce has mostly been funded by ASC, and the original focus of Xyce development has primarily been related to circuits for nuclear weapons. However, this has not been the only focus and it is expected that the project will diversify. Like many ASC projects, Xyce is a group development effort, which involves a number of researchers, engineers, scientists, mathematicians and computer scientists. In addition to diversity of background, it is to be expected on long-term projects for there to be a certain amount of staff turnover, as people move on to different projects. As a result, it is very important that the project maintain high software quality standards. The point of this document is to formally document a number of the software quality practices followed by the Xyce team in one place. Also, it is hoped that this document will be a good source of information for new developers.
[Role-specific targets and teamwork in the operating room].
Hoeper, K; Kriependorf, M; Felix, C; Nyhuis, P; Tecklenburg, A
2017-12-01
The primary goal of a surgical team is the successful performance of an operation on a patient; however, this primary goal can show discrepancies from the goals of individual team members. The main causes for differences of interests can be variations in subjective preferences and organizational differences. Subjective preferences are due to the values held by those involved. These values are of an intrinsic nature and therefore difficult to change. Another reason for individual goals is that hospitals and universities are professional bureaucracies. Experts working in professional bureaucracies are known to identify themselves to a greater extent with their respective profession than with their institution; however, teams in the operating room (OR) have to work together in multidisciplinary teams. The main goal of this analysis is to document role-specific targets and motivations within teams. This was a case study at a university hospital with 40 operating rooms. The data collection resulted from the three pillars of the goal documentation instrument, which includes expert interviews, a utility analysis and card placement as a basis for communicative validation. The results were analyzed systematically using qualitative content analysis. The four-pillar success model, which maps aspects of a successful hospital, was used as a deductive coding scheme. The four pillars represent the level of medical quality (process, structure and outcome quality), economy and efficiency, client satisfaction (patients and referring physicians) and employee satisfaction. At a university hospital an additional focus is on research and teaching. In addition to the four-pillar success model as a deductive coding scheme, an inductive coding scheme was introduced. Approximately 10% of the employees from each professional group (surgeons, anesthesiologists, OR nurses, nurse anesthetists) were interviewed, resulting in 65 interviews overall. 
The interviews were conducted within a time span of 4 months. Considering the main categories quality of medical care, economy and efficiency, patient satisfaction and employee satisfaction as well as research and teaching, surgeons thought the categories of economy and efficiency (37%) and quality of medical care (34%) to be the most important. For anesthesiologists, however, the category of employee satisfaction (38%) was most important, followed by the category of economy and efficiency (31%). For the OR nurses as well as for the nurse anesthetists the category of employee satisfaction was of highest priority (61% and 57%, respectively). The results show that considering the main categories no dimension is equally important for the participating professional groups. This can result in goal conflicts. Additionally, the ad hoc teams make it impossible for team building to occur, making it difficult for the professional groups to adapt to each other and the individual goals. This presents a high potential for conflict. The difference in the perception of the importance of employee satisfaction is a crucial factor for emerging conflicts in the OR, as employee satisfaction correlates with productivity and patient satisfaction. Knowing and communicating the different goals is a first step for optimizing the OR management system.
Pearsall, Matthew J; Ellis, Aleksander P J; Bell, Bradford S
2010-01-01
The primary purpose of this study was to extend theory and research regarding the emergence of mental models and transactive memory in teams. Utilizing Kozlowski, Gully, Nason, and Smith's (1999) model of team compilation, we examined the effect of role identification behaviors and posited that such behaviors represent the initial building blocks of team cognition during the role compilation phase of team development. We then hypothesized that team mental models and transactive memory would convey the effects of these behaviors onto team performance in the team compilation phase of development. Results from 60 teams working on a command-and-control simulation supported our hypotheses. Copyright 2009 APA, all rights reserved.
SPARC: Demonstrate burst-buffer-based checkpoint/restart on ATS-1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldfield, Ron A.; Ulmer, Craig D.; Widener, Patrick
Recent high-performance computing (HPC) platforms such as the Trinity Advanced Technology System (ATS-1) feature burst buffer resources that can have a dramatic impact on an application’s I/O performance. While these non-volatile memory (NVM) resources provide a new tier in the storage hierarchy, developers must find the right way to incorporate the technology into their applications in order to reap the benefits. Similar to other laboratories, Sandia is actively investigating ways in which these resources can be incorporated into our existing libraries and workflows without burdening our application developers with excessive, platform-specific details. This FY18Q1 milestone summarizes our progress in adapting the Sandia Parallel Aerodynamics and Reentry Code (SPARC) in Sandia’s ATDM program to leverage Trinity’s burst buffers for checkpoint/restart operations. We investigated four different approaches with varying tradeoffs in this work: (1) simply updating the job script to use stage-in/stage-out burst buffer directives, (2) modifying SPARC to use LANL’s hierarchical I/O (HIO) library to store/retrieve checkpoints, (3) updating Sandia’s IOSS library to incorporate the burst buffer in all meshing I/O operations, and (4) modifying SPARC to use our Kelpie distributed memory library to store/retrieve checkpoints. Team members were successful in generating initial implementations for all four approaches, but were unable to obtain performance numbers in time for this report (initial problem sizes were not large enough to stress I/O, and a SPARC refactor will require changes to our code). When we presented our work to the SPARC team, they expressed the most interest in the second and third approaches. The HIO work was favored because it is lightweight, unobtrusive, and should be portable to ATS-2. The IOSS work is seen as a long-term solution, and is favored because all I/O work (including checkpoints) can be deferred to a single library.
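As a rough illustration of approach (1), a Trinity-class batch script can request a burst buffer allocation and stage checkpoint data in and out with DataWarp directives. This is a hedged sketch, not taken from the milestone report: the paths, allocation size, node counts, and the `--checkpoint-dir` flag on the application are all hypothetical placeholders.

```shell
#!/bin/bash
# Sketch of a Slurm + Cray DataWarp job script using stage-in/stage-out
# burst buffer directives. All paths and sizes below are illustrative.
#SBATCH --nodes=128
#SBATCH --time=04:00:00

# Request a per-job scratch allocation on the burst buffer, striped
# across DataWarp server nodes for bandwidth.
#DW jobdw type=scratch access_mode=striped capacity=1TiB

# Stage the most recent checkpoint directory into the burst buffer before
# the job starts, and drain new checkpoints back to the parallel file
# system when the job ends.
#DW stage_in  source=/lustre/proj/sparc/ckpt destination=$DW_JOB_STRIPED/ckpt type=directory
#DW stage_out source=$DW_JOB_STRIPED/ckpt destination=/lustre/proj/sparc/ckpt type=directory

# The application writes checkpoints to the burst buffer path at NVM
# speeds; DataWarp handles the asynchronous copy back to Lustre.
srun ./sparc --checkpoint-dir "$DW_JOB_STRIPED/ckpt"
```

The appeal of this approach is that the application itself is unchanged: only the job script knows about the burst buffer, which is why it is the least invasive of the four options the report lists.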
Potential Uses of Occupational Analysis Data By Air Force Management Engineering Teams.
ERIC Educational Resources Information Center
McFarland, Barry P.
Both the occupational analysis program and the management engineering program are primarily concerned with task-level descriptions of time spent to perform tasks required in the Air Force, the first being personnel specialty code oriented and the second being work center oriented. However, two separate and independent techniques have been developed…