SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
ERIC Educational Resources Information Center
European Training Foundation, Turin (Italy).
This document presents a management tool kit on training needs assessment and program design for countries in transition to a market economy. Chapter 1 describes the tool's development within the framework of the project called Strengthening of Partnership between Management Training Institutions and Companies, Ukraine-Kazakhstan-Kyrgyzstan.…
Marketing, Management and Performance: Multilingualism as Commodity in a Tourism Call Centre
ERIC Educational Resources Information Center
Duchene, Alexandre
2009-01-01
This paper focuses on the ways an institution of the new economy--a tourism call centre in Switzerland--markets, manages and performs multilingual services. In particular, it explores the ways multilingualism operates as a strategic and managerial tool within tourism call centres and how the institutional regulation of language practices…
Computer-Aided Air-Traffic Control In The Terminal Area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1995-01-01
Developmental computer-aided system for automated management and control of arrival traffic at a large airport includes three integrated subsystems: the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. A database that includes current wind measurements and mathematical models of the performance of aircraft types contributes to effective operation of the system.
Durr, W
1998-01-01
Call centers are strategically and tactically important to many industries, including the healthcare industry. They play a key role in acquiring and retaining customers, and the ability to deliver high-quality, timely customer service at modest expense underlies their proliferation and expansion. Call centers are unique blends of people and technology, where performance depends on combining appropriate technology tools with sound management practices built on key operational data. While the technology is fascinating, the people working in call centers and the skill of the management team ultimately make a difference to their companies.
Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.
2009-01-01
Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, USGS is developing a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
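The abstract mentions map-based techniques for identifying high- and low-yield areas. A minimal sketch of one such screening step, classifying catchments by percentile thresholds, is shown below; the catchment names and yield values are hypothetical, not SPARROW output.

```python
# Toy illustration of flagging high- and low-yield catchments by percentile.
# All catchment names and yield values (kg/ha/yr) are hypothetical.

def classify_yields(yields, low_pct=25, high_pct=75):
    """Label each catchment 'high', 'low', or 'mid' by percentile rank."""
    values = sorted(yields.values())

    def percentile(p):
        # simple nearest-rank percentile over the sorted yield values
        k = max(0, min(len(values) - 1, round(p / 100 * (len(values) - 1))))
        return values[k]

    lo, hi = percentile(low_pct), percentile(high_pct)
    return {name: ('high' if v >= hi else 'low' if v <= lo else 'mid')
            for name, v in yields.items()}

catchments = {'A': 2.1, 'B': 9.8, 'C': 4.4, 'D': 0.7, 'E': 6.3}
labels = classify_yields(catchments)
```

A real implementation would operate on gridded or polygon yield surfaces served through the web map stack; this sketch only shows the thresholding logic.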
NASA Technical Reports Server (NTRS)
Johnson, Paul W.
2008-01-01
ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program's or project's risk management process. This presentation briefly covers standard risk management procedures, then covers NASA's risk management tool, ePORT, in depth. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's or project's size and budget. It thoroughly covers the risk management paradigm and provides standardized evaluation criteria for common management reporting. ePORT improves Product Line, Center, and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
NASA Technical Reports Server (NTRS)
1987-01-01
Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool, originating from the space shuttle program, that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.
Chapter 14. New tools to assess nitrogen management for conservation of our biosphere
USDA-ARS?s Scientific Manuscript database
There are several tools that can be used to assess the effects of management on nitrogen (N) losses to the environment. The Nitrogen Loss and Environmental Assessment Package (NLEAP) is an improved and renamed version of the DOS program that was called the Nitrate Leaching and Economic Analysis Pack...
SMARTE: IMPROVING REVITALIZATION DECISIONS (BERLIN, GERMANY)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN)
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN) - slides
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
ERIC Educational Resources Information Center
Stahl, Lizabeth A. B.; Behnken, Lisa M.; Breitenbach, Fritz R.; Miller, Ryan P.; Nicolai, David; Gunsolus, Jeffrey L.
2016-01-01
University of Minnesota educators use an integrated pest management (IPM) survey conducted during private pesticide applicator training as an educational, needs assessment, and evaluation tool. By incorporating the IPM Assessment, as the survey is called, into a widely attended program and using TurningPoint audience response devices, Extension…
DDP - a tool for life-cycle risk management
NASA Technical Reports Server (NTRS)
Cornford, S. L.; Feather, M. S.; Hicks, K. A.
2001-01-01
At JPL we have developed and implemented a process for achieving life-cycle risk management. This process has been embodied in a software tool called Defect Detection and Prevention (DDP). The DDP process can be succinctly stated as: determine where we want to be, what could get in the way, and how we will get there.
New Tool for Benefit-Cost Analysis in Evaluating Transportation Alternatives
DOT National Transportation Integrated Search
1997-01-01
The Intermodal Surface Transportation Efficiency Act (ISTEA) emphasizes assessment of multi-modal alternatives and demand management strategies. In 1995, the Federal Highway Administration (FHWA) developed a corridor sketch planning tool called the S...
FFI: A software tool for ecological monitoring
Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...
How Can I Deal with My Asthma?
... had trouble with it and why. Use asthma management tools. Even if you're feeling absolutely fine, don't abandon tools like daily long-term control medicines (also called "controller" or "maintenance" medicines) if they're a part of your ...
SUSTAIN - A BMP Process and Placement Tool for Urban Watersheds (Poster)
To assist stormwater management professionals in planning for best management practices (BMPs) and low-impact developments (LIDs) implementation, USEPA is developing a decision support system, called the System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN). ...
Feasibility of lane closures using probe data : technical brief.
DOT National Transportation Integrated Search
2017-04-01
This study developed an on-line system analysis tool called the Work Zone Interactive Management Application - Planning (WIMAP-P), an easy-to-use and easy-to-learn tool for predicting the traffic impact caused by work zone lane closures on freewa...
National Center for Farmworker Health
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
ART-Ada design project, phase 2
NASA Technical Reports Server (NTRS)
Lee, S. Daniel; Allen, Bradley P.
1990-01-01
Interest in deploying expert systems in Ada has increased. An Ada based expert system tool is described called ART-Ada, which was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.
FFI: What it is and what it can do for you
Duncan C. Lutes; MaryBeth Keifer; Nathan C. Benson; John F. Caratti
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool (FEAT). FFI provides...
Current Capabilities and Planned Enhancements of SUSTAIN - Paper
Efforts have been under way by the U.S. Environmental Protection Agency (EPA) since 2003 to develop a decision-support tool for placement of best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment ...
RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks
2016-10-09
Robotic tasks are becoming increasingly complex, and with this also the robotic systems. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept
ERIC Educational Resources Information Center
Luan, Jing; Willett, Terrence
This paper discusses data mining--an end-to-end (ETE) data analysis tool that is used by researchers in higher education. It also relates data mining and other software programs to a brand new concept called "Knowledge Management." The paper culminates in the Tier Knowledge Management Model (TKMM), which seeks to provide a stable…
The NASA Program Management Tool: A New Vision in Business Intelligence
NASA Technical Reports Server (NTRS)
Maluf, David A.; Swanson, Keith; Putz, Peter; Bell, David G.; Gawdiak, Yuri
2006-01-01
This paper describes a novel approach to business intelligence and program management for large technology enterprises like the U.S. National Aeronautics and Space Administration (NASA). Two key distinctions of the approach are that 1) standard business documents are the user interface, and 2) a "schema-less" XML database enables flexible integration of technology information for use by both humans and machines in a highly dynamic environment. The implementation utilizes patent-pending NASA software called the NASA Program Management Tool (PMT) and its underlying "schema-less" XML database called Netmark. Initial benefits of PMT include elimination of discrepancies between business documents that use the same information and "paperwork reduction" for program and project management in the form of reducing the effort required to understand standard reporting requirements and to comply with those reporting requirements. We project that the underlying approach to business intelligence will enable significant benefits in the timeliness, integrity and depth of business information available to decision makers on all organizational levels.
Knowledge as an interactional tool in the management of client empowerment.
Moore, John
2016-06-01
To examine the way speaker and recipient knowledge is managed in interaction by a call taker at a mental-health information line, to achieve the institutional goals of information provision and client empowerment. This study utilizes conversation analysis in the analysis of a single call to the line. Analysis demonstrates the ways in which a call taker produces turns-at-talk that construct a caller as knowing what help they wanted prior to that moment in the interaction, and that invoke 'common' knowledge of sources of such help. Talk that orients to knowledge is used as an interactional resource that allows the call taker to avoid talk that may be considered advice, and to be heard to achieve the goal of client empowerment. The asymmetric identities of help-seeker and help-provider are managed in this process. Client empowerment can be seen as something interactionally achieved and managed in talk-in-interaction, while not necessarily objectively experienced by the client. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Mapping benefits as a tool for natural resource management in estuarine watersheds
Natural resource managers are often called upon to justify the value of protecting or restoring natural capital based on its perceived benefit to stakeholders. This usually takes the form of formal valuation exercises (i.e., ancillary costs) of a resource without consideration f...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
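The report's generators are not reproduced here, so as a hedged sketch of the general idea behind a tool like CREATE-SCHEMA, the toy generator below turns a table description into a SQL CREATE TABLE statement. The description format, table name, and column names are invented for illustration.

```python
# Toy sketch of schema-driven SQL generation, in the spirit of CREATE-SCHEMA.
# The input format and all table/column names below are invented.

def create_table_sql(table, columns):
    """Emit a CREATE TABLE statement from (name, type, constraint) triples;
    a constraint of None means the column has no extra constraint."""
    col_defs = []
    for name, sqltype, constraint in columns:
        parts = [name, sqltype]
        if constraint:
            parts.append(constraint)
        col_defs.append('    ' + ' '.join(parts))
    return 'CREATE TABLE {} (\n{}\n);'.format(table, ',\n'.join(col_defs))

sql = create_table_sql('sample_log', [
    ('sample_id', 'INTEGER', 'PRIMARY KEY'),
    ('taken_at', 'TIMESTAMP', 'NOT NULL'),
    ('analyst', 'VARCHAR(40)', None),
])
```

The actual tools also emitted FORTRAN declarations and precompiled SQL calls from the same descriptions; the value of the approach is that one schema definition drives every generated artifact.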
Career management: a competitive advantage in today's health care marketplace.
Bourbeau, J
2001-01-01
A valuable new tool to attract and retain new employees is being used by some of the most progressive companies in Michigan. It is called career management, and it is being used with great success by businesses of all types to give themselves a competitive advantage.
Kevin Hyde; Matthew B. Dickinson; Gil Bohrer; David Calkin; Louisa Evers; Julie Gilbertson-Day; Tessa Nicolet; Kevin Ryan; Christina Tague
2013-01-01
Wildland fire management has moved beyond a singular focus on suppression, calling for wildfire management for ecological benefit where no critical human assets are at risk. Processes causing direct effects and indirect, long-term ecosystem changes are complex and multidimensional. Robust risk-assessment tools are required that account for highly variable effects on...
Flight Awareness Collaboration Tool Development
NASA Technical Reports Server (NTRS)
Mogford, Richard
2016-01-01
This is a PowerPoint presentation covering airline operations center (AOC) research. It reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. FACT should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations.
Citizen Participation -- A Tool for Conflict Management on the Public Lands
ERIC Educational Resources Information Center
Irland, Lloyd C.
1975-01-01
The search for harmony in public land-use planning is a hopeless pursuit. A more realistic approach is a conflict management strategy that emphasizes concern for the planning process, rather than for the plan itself. The search for legitimate planning processes calls for the conscious building of citizen participation. (JG)
Utilizing the Myers-Briggs Personality Inventory in Employee Assistance Program Workplace Seminars.
ERIC Educational Resources Information Center
Aviles, Christopher B.
Social work educators are being called upon more often to deliver employee workplace seminars for community agencies on a variety of topics ranging from burnout and stress management to improving workplace communication and managing workplace conflicts. One tool that addresses workplace communication is the Myers-Briggs Type Indicator (MBTI). It…
Managing Change to a Quality Philosophy: A Partnership Perspective.
ERIC Educational Resources Information Center
Snyder, Karolyn J.; Acker-Hocevar, Michele
Within the past 5 years there has been an international movement to adapt the principles and practices of Total Quality Management work environments to school-restructuring agendas. This paper reports on the development of a model called the Educational Quality System, a benchmark assessment tool for identifying the essential elements of quality…
James M. Guldin; David Cawrse; Russell Graham; Miles Hemstrom; Linda Joyce; Steve Kessler; Ranotta McNair; George Peterson; Charles G. Shaw; Peter Stine; Mark Twery; Jeffrey Walter
2003-01-01
The paper outlines a process called the science consistency review, which can be used to evaluate the use of scientific information in land management decisions. Developed with specific reference to land management decisions in the U.S. Department of Agriculture Forest Service, the process involves assembling a team of reviewers under a review administrator to...
Wildland fire potential: A tool for assessing wildfire risk and fuels management needs
Greg Dillon; James Menakis; Frank Fay
2015-01-01
Federal wildfire managers often want to know, over large landscapes, where wildfires are likely to occur and how intense they may be. To meet this need we developed a map that we call wildland fire potential (WFP) - a raster geospatial product that can help to inform evaluations of wildfire risk or prioritization of fuels management needs across very large spatial...
Global Impact Estimation of ISO 50001 Energy Management System for Industrial and Service Sectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aghajanzadeh, Arian; Therkelsen, Peter L.; Rao, Prakash
A methodology has been developed to determine the impacts of the ISO 50001 Energy Management System (EnMS) at a region or country level. The impacts of an ISO 50001 EnMS include energy, CO2 emission, and cost savings. This internationally recognized and transparent methodology has been embodied in a user-friendly Microsoft Excel®-based tool called the ISO 50001 Impact Estimator Tool (IET 50001). Accurate and defensible results, however, depend on the tool inputs. This report documents the data sources used and the assumptions made to calculate the global impact of the ISO 50001 EnMS.
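The abstract does not reproduce the IET 50001 formulas, but the general shape of such an estimate, per-facility savings scaled by the number of certified facilities, can be sketched as below. Every figure and parameter name here is a placeholder, not data from the tool or the report.

```python
# Purely illustrative sketch of scaling per-facility annual savings to a
# sector-level impact estimate. None of these figures come from IET 50001.

def sector_impact(n_facilities, avg_energy_gj, avg_co2_t, avg_cost_usd):
    """Aggregate hypothetical per-facility annual savings across
    all certified facilities in a sector or country."""
    return {
        'energy_GJ': n_facilities * avg_energy_gj,
        'co2_tonnes': n_facilities * avg_co2_t,
        'cost_USD': n_facilities * avg_cost_usd,
    }

impact = sector_impact(n_facilities=1200, avg_energy_gj=5000,
                       avg_co2_t=300, avg_cost_usd=40000)
```

This is why the report stresses inputs: with a multiplicative structure like this, any error in the per-facility averages or adoption counts propagates directly into the national estimate.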
Multimodal visualization interface for data management, self-learning and data presentation.
Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M
2006-10-01
A multimodal visualization software package, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the visualization and modeling of various aspects of human anatomy. Numerous tools used in radiology are integrated in the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation, or tutorial preparation). The system is free and based on an open-source software development architecture, so updates of the system for custom applications are possible.
Evaluating the ecological benefits of wildfire by integrating fire and ecosystem simulation models
Robert E. Keane; Eva Karau
2010-01-01
Fire managers are now realizing that wildfires can be beneficial because they can reduce hazardous fuels and restore fire-dominated ecosystems. A software tool that assesses potential beneficial and detrimental ecological effects from wildfire would be helpful to fire management. This paper presents a simulation platform called FLEAT (Fire and Landscape Ecology...
Visual management support system
Lee Anderson; Jerry Mosier; Geoffrey Chandler
1979-01-01
The Visual Management Support System (VMSS) is an extension of an existing computer program called VIEWIT, which has been extensively used by the U.S. Forest Service. The capabilities of this program lie in the rapid manipulation of large amounts of data, specifically operating as a tool to overlay or merge one set of data with another. VMSS was conceived to...
Docster: The Future of Document Delivery?
ERIC Educational Resources Information Center
Chudnov, Daniel
2000-01-01
Considers the possibility of a bibliographic management tool that combines file storage with a Napster-like communications protocol, called docster. Explains Napster and discusses copyright issues, interlibrary loans, infrastructure, security concerns, the library's role, and online publishing. (LRW)
Decision support frameworks and tools for conservation
Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.
2018-01-01
The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.
An MIP model to schedule the call center workforce and organize the breaks
NASA Astrophysics Data System (ADS)
Türker, Turgay; Demiriz, Ayhan
2016-06-01
In modern economies, companies place a premium on managing their workforce efficiently, especially in the labor-intensive service sector, since services have become a significant portion of these economies. Tour scheduling is an important tool for minimizing overall workforce costs while satisfying minimum service level constraints. In this study, we consider the workforce management problem of an inbound call center: satisfying the call demand within short time periods at minimum cost. We propose a mixed-integer programming model to assign workers to daily shifts, to determine the weekly off-days, and to determine the timing of lunch and other daily breaks for each worker. The proposed model has been verified on weekly demand data observed at a specific call center location of a satellite TV operator. The model was run with both 15- and 10-minute demand estimation periods (planning time intervals).
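The authors' full MIP is not reproduced in the abstract. As a hedged toy version of its core coverage idea, the brute-force search below picks the cheapest assignment of workers to shifts (or a day off) that meets per-period demand; a real model would express the same constraints in an integer-programming solver. The shifts, costs, and demand figures are invented.

```python
# Brute-force toy of the shift-coverage core of a tour-scheduling model:
# choose one shift (or a day off) per worker so that staffing in every
# period meets demand at minimum total cost. All data are invented.
from itertools import product

shifts = {            # shift name -> periods it covers
    'early': (0, 1),
    'late':  (1, 2),
    'off':   (),
}
cost = {'early': 100, 'late': 110, 'off': 0}
demand = [1, 2, 1]    # required staff in each of three periods

def best_schedule(n_workers):
    """Enumerate all shift assignments and keep the cheapest feasible one."""
    best = None
    for assign in product(shifts, repeat=n_workers):
        staffing = [0] * len(demand)
        for s in assign:
            for p in shifts[s]:
                staffing[p] += 1
        if all(staffing[p] >= demand[p] for p in range(len(demand))):
            total = sum(cost[s] for s in assign)
            if best is None or total < best[0]:
                best = (total, assign)
    return best

total_cost, assignment = best_schedule(3)
```

Enumeration is exponential in the number of workers, which is exactly why the paper formulates the problem as a MIP and hands it to a solver; the toy only illustrates the feasibility and objective structure.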
Development of a knowledge management system for complex domains.
Perott, André; Schader, Nils; Bruder, Ralph; Leonhardt, Jörg
2012-01-01
Deutsche Flugsicherung GmbH (DFS), the German Air Navigation Service Provider, follows a systematic approach, called HERA, for investigating incidents. The HERA analysis shows a distinctive occurrence of incidents in German air traffic control in which the visual perception of information plays a key role. The causes can be partially traced back to workstation design, where basic ergonomic rules and principles are in some cases not sufficiently followed by the designers. In cooperation with the Institute of Ergonomics in Darmstadt, the DFS investigated possible approaches that may support designers in implementing ergonomic systems. None of the currently available tools was found to meet the identified user requirements holistically. It was therefore decided to develop an enhanced software tool called the Design Process Guide. The name indicates that this tool exceeds the classic functions of currently available knowledge management systems. It offers "design element"-based access, presents processual and content-related topics, and shows the implications of certain design decisions. Furthermore, it serves as documentation, detailing why a designer made a decision under a particular set of conditions.
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
NY TBO Research: Integrated Demand Management (IDM): IDM Concept, Tools, and Training Package
NASA Technical Reports Server (NTRS)
Smith, Nancy
2016-01-01
A series of human-in-the-loop simulation sessions was conducted in the Airspace Operations Laboratory (AOL) to evaluate a new traffic management concept called Integrated Demand Management (IDM). The simulation explored how to address the chronic equity, throughput, and delay issues associated with New York's high-volume airports by operationally integrating three current and NextGen capabilities: the Collaborative Trajectory Options Program (CTOP), Time-Based Flow Management (TBFM), and Required Time of Arrival (RTA), in order to better manage traffic demand within the National Air Traffic System. A package of presentation slides was developed to describe the concept, tools, and training materials used in the simulation sessions. The package will be used to outbrief our stakeholders, both by presenting it orally and by disseminating the materials via email.
NASA Astrophysics Data System (ADS)
Albano, Raffaele; Manfreda, Salvatore; Celano, Giuseppe
The paper introduces a minimalist water-driven crop model for sustainable irrigation management using an eco-hydrological approach. The model, called MY SIRR, uses a relatively small number of parameters and attempts to balance simplicity, accuracy, and robustness. MY SIRR is a quantitative tool for assessing water requirements and agricultural production across different climates, soil types, crops, and irrigation strategies. The MY SIRR source code is published under a copyleft license. This FOSS approach could lower the financial barriers that smallholders, especially in developing countries, face in using tools for better decision-making on short- and long-term water resource management strategies.
2009-06-06
written in Standard ML, and comprises nearly 7,000 lines of code. OpenSSL is used for all cryptographic operations. Because the front end tools are used...be managed. Macrobenchmarks. To understand the performance of PCFS in practice, we also ran two simple macrobenchmarks. The first (called OpenSSL in...the table below), untars the OpenSSL source code, compiles it and deletes it. The other (called Fuse in the table below), performs similar operations
On the design of computer-based models for integrated environmental science.
McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick
2005-06-01
The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.
Open Architecture as an Enabler for FORCEnet Cruise Missile Defense
2007-09-01
2007). Step 4 introduces another tool called the Strengths, Weaknesses, Opportunities, and Threats ( SWOT ) analysis. Once the TRO has been identified...the SWOT analysis can be used to help in the pursuit of that objective or mission objective. SWOT is defined as Strengths: attributes of the...overtime. In addition to the SCAN and SWOT , analysis processes also needed are Automated Battle Management Aids (ABMA) tools that are required to
A Computer-Aided Abstracting Tool Kit.
ERIC Educational Resources Information Center
Craven, Timothy C.
1993-01-01
Reports on the development of a prototype computerized abstractor's assistant called TEXNET, a text network management system. Features of the system discussed include semantic dependency links; displays of text structure; basic text editing; extracting; weighting methods; and listings of frequent words. (Contains 25 references.) (LRW)
Hiller, Thomas Stephan; Freytag, Antje; Breitbart, Jörg; Teismann, Tobias; Schöne, Elisabeth; Blank, Wolfgang; Schelle, Mercedes; Vollmar, Horst Christian; Margraf, Jürgen; Gensichen, Jochen
2018-04-01
Behavior therapy-oriented methods are recommended for treating anxiety disorders in primary care. The treatment of patients with long-term conditions can be improved by case management and structured clinical monitoring. The present paper describes the rationale, design and application of the 'Jena Anxiety Monitoring List' (JAMoL), a monitoring tool for the treatment of patients with panic disorder, with or without agoraphobia, in primary care. JAMoL's design was based on established clinical measures, the rationale of exposure-based anxiety treatment, and research on family practice-based case management. After piloting, the JAMoL was used in the clinical study 'Jena-PARADISE' (ISRCTN64669297), in which non-physician practice staff monitored patients with panic disorder by telephone. In semi-structured interviews conducted in concomitant studies, study participants were asked about the instrument's functionality. The JAMoL assesses the severity of anxiety symptoms (6 items) as well as the patient's adherence to therapy (4 items), and it fosters case management-related information exchange (3 items). An integrated traffic light scheme facilitates the evaluation of monitoring results. Within the clinical study, non-physician practice staff carried out a total of 1,525 JAMoL-supported monitoring calls with 177 patients from 30 primary care practices (median calls per patient: 10 [interquartile range, 9-10]). Qualitative analyses revealed that most practice teams and patients rated the JAMoL as a practicable and treatment-relevant tool. The JAMoL enables primary care practice teams to continuously monitor anxiety symptoms and treatment adherence in patients with panic disorder with or without agoraphobia. Within the behavior therapy-oriented treatment program 'Jena-PARADISE', the JAMoL constitutes an important case management tool. Copyright © 2018. Published by Elsevier GmbH.
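The integrated traffic light scheme maps monitoring results to an action level. A minimal sketch of such a mapping is shown below; the thresholds and labels are entirely hypothetical illustrations, since the abstract does not report JAMoL's actual cut-offs:

```python
# Hypothetical traffic-light mapping for a summed symptom score.
# The thresholds below are invented for illustration only; they are
# not the JAMoL's actual cut-offs.
def traffic_light(score, green_max=7, amber_max=14):
    """Map a summed anxiety-symptom score to a monitoring action colour."""
    if score <= green_max:
        return "green"   # continue treatment as planned
    if score <= amber_max:
        return "amber"   # discuss at the next practice team meeting
    return "red"         # escalate promptly to the physician

print(traffic_light(16))  # -> red
```

The value of such a scheme is that non-physician staff can act on a call result without interpreting raw scores themselves.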
Management of natural and bioterrorism induced pandemics.
Tyshenko, Michael G
2007-09-01
A recent approach to bioterrorism risk management calls for stricter regulations over biotechnology as a way to control subversion of technology that may be used to create a man-made pandemic. This approach is largely unworkable given the increasing pervasiveness of molecular techniques and tools throughout society. Emerging technology has provided the tools to design much deadlier pathogens, but concomitantly the ability to respond to emerging pandemics and reduce mortality has also improved significantly in recent decades. In its historical context, determining just how 'risky' biological weapons are is an important consideration for decision making and resource allocation. Management should attempt to increase capacity, share resources, provide accurate infectious disease reporting, deliver information transparency and improve communications to help mitigate the magnitude of future pandemics.
Joint Experiment on Scalable Parallel Processors (JESPP) Parallel Data Management
2006-05-01
management and analysis tool, called Simulation Data Grid ( SDG ). The design principles driving the design of SDG are: 1) minimize network communication...or SDG . In this report, an initial prototype implementation of this system is described. This project follows on earlier research, primarily...distributed logging system had some 2 limitations. These limitations will be described in this report, and how the SDG addresses these limitations. 3.0
David Bushman at the Mission Manager's console onboard NASA's DC-8 during the AirSAR 2004 campaign
2004-03-03
David Bushman at the Mission Manager's console onboard NASA's DC-8 during the AirSAR 2004 campaign. AirSAR 2004 is a three-week expedition by an international team of scientists that will use an all-weather imaging tool, called the Airborne Synthetic Aperture Radar (AirSAR), in a mission ranging from the tropical rain forests of Central America to frigid Antarctica.
Experiences with DCE: the pro7 communication server based on OSF-DCE functionality.
Schulte, M; Lordieck, W
1997-01-01
The pro7 communication server is a new approach to managing communication between different applications on different hardware platforms in a hospital environment. The most important features are the use of OSF/DCE for realising remote procedure calls between different platforms, the use of an SQL-92 compatible relational database, and the design of a new software development tool (called the protocol definition language compiler) for describing the interface of a new application that is to be integrated into a hospital environment.
DeMAID/GA an Enhanced Design Manager's Aid for Intelligent Decomposition
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1996-01-01
Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One such tool is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial public release of DeMAID in 1989, much research has been done in the areas of decomposition, concurrent engineering, parallel processing, and process management, and many new tools and techniques have emerged. Based on these recent research and development efforts, numerous enhancements have been added to DeMAID to further aid the design manager in saving both cost and time in a design cycle. The key enhancement, a genetic algorithm (GA), will be available in the next public release, called DeMAID/GA. The GA sequences the design processes to minimize the cost and time of converging to a solution. The major enhancements in the upgrade of DeMAID to DeMAID/GA are discussed in this paper. A sample conceptual design project is used to show how these enhancements can be applied to improve the design cycle.
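Sequencing processes with a genetic algorithm can be illustrated with a toy sketch. The dependency matrix, fitness function, and GA parameters below are hypothetical stand-ins, not DeMAID/GA's actual model; here the fitness simply counts "feedback" couplings, i.e., cases where a process is sequenced before the process whose output it needs:

```python
import random

# Hypothetical dependency matrix: DEP[i][j] = 1 means process i needs
# output from process j. A coupling is "feedback" (costly, forcing
# iteration) when provider j is sequenced *after* consumer i.
DEP = [
    [0, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 0],
]

def feedback_count(order):
    """Count couplings whose provider comes later in the sequence."""
    pos = {p: k for k, p in enumerate(order)}
    return sum(
        1
        for i, row in enumerate(DEP)
        for j, needs in enumerate(row)
        if needs and pos[j] > pos[i]
    )

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest from b."""
    head = a[: len(a) // 2]
    return head + [p for p in b if p not in head]

def mutate(order, rate=0.2):
    """Occasionally swap two positions to keep the population diverse."""
    order = order[:]
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def ga_sequence(pop_size=30, generations=100):
    """Evolve permutations of the processes toward fewer feedbacks."""
    n = len(DEP)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=feedback_count)            # fittest (fewest feedbacks) first
        survivors = pop[: pop_size // 2]
        children = [
            mutate(crossover(*random.sample(survivors, 2)))
            for _ in range(pop_size - len(survivors))
        ]
        pop = survivors + children
    return min(pop, key=feedback_count)

best = ga_sequence()
print(best, feedback_count(best))
```

Because the example matrix contains dependency cycles, at least one feedback is unavoidable; the GA's job is to find an ordering that minimizes how many remain.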
Using an interview guide to identify effective nurse managers: phase II, outcomes.
Fosbinder, D; Everson-Bates, S; Hendrix, L
2000-01-01
The investigators report validation of a survey tool called the Interview Guide to assist in the selection of nurses who will be effective as managers. Nurse administrators rated nurse managers at six months and two years after hire. The Interview Guide rated the management qualities "seeing the big picture" and potential for "rehire" as the best predictors of managerial success. After two years, a good "self-concept" or a "flexible attitude" was the best predictor. The ability to manage conflict was the most significant competency for predicting rehire at both six months and two years.
Management practices in substance abuse treatment programs.
McConnell, K John; Hoffman, Kim A; Quanbeck, Andrew; McCarty, Dennis
2009-07-01
Efforts to understand how to improve the delivery of substance abuse treatment have led to a recent call for studies on the "business of addiction treatment." This study adapts an innovative survey tool to collect baseline management practice data from 147 addiction treatment programs enrolled in the Network for the Improvement of Addiction Treatment 200 project. Measures of "good" management practice were strongly associated with days to treatment admission. Management practice scores were weakly associated with revenues per employee but were not correlated with operating margins. Better management practices were more prevalent among programs with a higher number of competitors in their catchment area.
The Effect of Age and Task Difficulty
ERIC Educational Resources Information Center
Mallo, Jason; Nordstrom, Cynthia R.; Bartels, Lynn K.; Traxler, Anthony
2007-01-01
Electronic Performance Monitoring (EPM) is a common technique used to record employee performance. EPM may include counting computer keystrokes, monitoring employees' phone calls or internet activity, or documenting time spent on work activities. Despite EPM's prevalence, no studies have examined how this management tool affects older workers--a…
Molecular Characterization of Atoxigenic Strains for Biological Control of Aflatoxins in Nigeria
USDA-ARS?s Scientific Manuscript database
Aflatoxins are highly toxic, carcinogens produced by several species in Aspergillus section Flavi. Strains of A. flavus that do not produce aflatoxins, called atoxigenic strains, have been used commercially in North America as tools for limiting aflatoxin contamination. A similar aflatoxin manage...
Memory Forensics: Review of Acquisition and Analysis Techniques
2013-11-01
Management Overview Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems...information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within
Incidence of Group Awareness Information on Students' Collaborative Learning Processes
ERIC Educational Resources Information Center
Pifarré, M.; Cobos, R.; Argelagós, E.
2014-01-01
This paper studies how the integration of group awareness tools in the knowledge management system called KnowCat (Knowledge Catalyser), which promotes collaborative knowledge construction, may both foster the students' perception about the meaningfulness of visualization of group awareness information and promote better collaborative…
Final-Approach-Spacing Subsystem For Air Traffic
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh
1992-01-01
Automation subsystem of computers, computer workstations, communication equipment, and radar helps air-traffic controllers in terminal radar approach-control (TRACON) facility manage sequence and spacing of arriving aircraft for both efficiency and safety. Called FAST (Final Approach Spacing Tool), subsystem enables controllers to choose among various levels of automation.
System engineering toolbox for design-oriented engineers
NASA Technical Reports Server (NTRS)
Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.
1994-01-01
This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.
Electronic Handbooks Simplify Process Management
NASA Technical Reports Server (NTRS)
2012-01-01
Getting a multitude of people to work together to manage processes across many organizations (for example, flight projects, research, technologies, or data centers) is not an easy task. Just ask Dr. Barry E. Jacobs, a research computer scientist at Goddard Space Flight Center. He helped NASA develop a process management solution that provided documenting tools for process developers and participants to help them quickly learn, adapt, test, and teach their views. Some of these tools included editable files for subprocess descriptions, document descriptions, role guidelines, manager worksheets, and references. First utilized for NASA's Headquarters Directives Management process, the approach led to the invention of a concept called the Electronic Handbook (EHB). This EHB concept was successfully applied to NASA's Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, among other NASA programs. Several Federal agencies showed interest in the concept, so Jacobs and his team visited these agencies to show them how their specific processes could be managed by the methodology, as well as to create mockup versions of the EHBs.
Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C
2014-06-01
Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and if this would improve information sharing and medical management. In a simulation-based randomized, blinded study, we evaluated the effect of the video-intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique where nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend to improve information-probe sharing and medical management in the intervention group, and across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis. © The Author [2014]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Airline Operations Research Center
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2016-01-01
This is a PowerPoint presentation on NASA airline operations center (AOC) research. It includes information on using IBM Watson in the AOC. It also reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. It should prove useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations with the same title.
Group decision-making techniques for natural resource management applications
Coughlan, Beth A.K.; Armour, Carl L.
1992-01-01
This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by the specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study, and it is applicable to natural resource management issues.
Piette, John D; Mendoza-Avelares, Milton O; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila
2011-06-01
Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. A single-group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients' cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for 6 weeks, with automated follow-up e-mails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a 6-week follow-up. Other measures included patients' glycemic control (HbA1c) and data from the IVR calling system. A total of 53% of participants completed at least half of their IVR calls and 23% of participants completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better medication adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean HbA1c's decreased from 10.0% at baseline to 8.9% at follow-up (p<0.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in underdeveloped countries. Published by Elsevier Inc.
Piette, John D.; Mendoza-Avelares, Milton O.; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila
2013-01-01
Background Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. Objective This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. Methods A single group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients’ cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for six weeks, with automated follow-up emails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a six week follow-up. Other measures included patients’ glycemic control (A1c) and data from the IVR calling system. Results 55% of participants completed the majority of their IVR calls and 33% completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean A1c’s decreased from 10.0% at baseline to 8.9% at follow-up (p<.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Conclusions Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in under-developed countries. PMID:21565655
Integration of e-Management, e-Development and e-Learning Technologies for Blended Course Delivery
ERIC Educational Resources Information Center
Johnson, Lynn E.; Tang, Michael
2005-01-01
This paper describes and assesses a pre-engineering curriculum development project called Foundations of Engineering, Science and Technology (FEST). FEST integrates web-based technologies into an inter-connected system to enable delivery of a blended program at multiple institutions. Tools and systems described include 1) technologies to deliver…
Electronic Mail Is One High-Tech Management Tool that Really Delivers.
ERIC Educational Resources Information Center
Parker, Donald C.
1987-01-01
Describes an electronic mail system used by the Horseheads (New York) Central School District's eight schools and central office that saves time and enhances productivity. This software calls up information from the district's computer network and sends it to other users' special files--electronic "mailboxes" set aside for messages and…
Answering the Call for Accountability: An Activity and Cost Analysis Case Study
ERIC Educational Resources Information Center
Carducci, Rozana; Kisker, Carrie B.; Chang, June; Schirmer, James
2007-01-01
This article summarizes the findings of a case study on the creation and application of an activity-based cost accounting model that links community college salary expenditures to mission-critical practices within academic divisions of a southern California community college. Although initially applied as a financial management tool in private…
Mapping a Crisis, One Text Message at a Time
ERIC Educational Resources Information Center
Bauduy, Jennifer
2010-01-01
An interactive mapping project is revolutionizing the way crises are reported and managed, and is spotlighting the value of citizen journalism. The project, called Ushahidi, which means testimony in Swahili, uses crowdsourcing (gathering information from a large number of people) to map crisis information. This crisis mapping tool has since been…
Storying energy consumption: Collective video storytelling in energy efficiency social marketing.
Gordon, Ross; Waitt, Gordon; Cooper, Paul; Butler, Katherine
2018-05-01
Despite calls for more socio-technical research on energy, there is little practical advice on how narratives collected through qualitative research may be melded with technical knowledge from the physical sciences, such as engineering, and then applied in energy efficiency social action strategies. This is despite established knowledge in the environmental management literature on domestic energy use regarding the utility of social practice theory and of narrative framings that socialise everyday consumption. Storytelling is positioned in this paper both as a focus for socio-technical energy research and as a practical tool that can arguably enhance energy efficiency interventions. We draw upon the literature on everyday social practices and storytelling to present a framework called 'collective video storytelling', which combines scientific and lay knowledge about domestic energy use to offer a practical tool for energy efficiency management. Collective video storytelling is discussed in the context of Energy+Illawarra, a three-year cross-disciplinary collaboration between social marketers, human geographers, and engineers to target energy behavioural change within older low-income households in regional NSW, Australia. Copyright © 2018 Elsevier Ltd. All rights reserved.
Rheumatology telephone advice line - experience of a Portuguese department.
Ferreira, R; Marques, A; Mendes, A; da Silva, J A
2015-01-01
Telephone helplines for patients are a tool for information and advice. They can contribute to patients' satisfaction with care and to the effectiveness and safety of treatments. To achieve this, they need to be adequately adapted to the target populations, so as to incorporate their abilities and expectations. The aims were to: a) evaluate the adherence of patients to a telephone helpline managed by nurses in a Portuguese Rheumatology Department, b) analyse the profile of users and their major needs, and c) analyse the management of calls by the nurses. The target population of this phone service comprises the patients treated at the Day Care Hospital and Early Arthritis Clinic of our department. Nurses answered phone calls immediately between 8am and 4pm on working days. In the remaining hours, messages were recorded on voice mail and answered as soon as possible. Details of the calls were registered on a dedicated sheet, and patients were asked for permission to use the data to improve the service, with respect for their rights of confidentiality, anonymity and freedom of decision. Over 18 months, 173 calls were made by 79 patients, with a mean age of 47.9 years (sd=9.13). Considering the proportions of men and women in the target population, men called more frequently (M=32.7% vs F=20.4%, p=.016). The reasons for these calls fall into three categories: instrumental help, such as requests for results of complementary tests or rescheduling of appointments (43.9% of calls); counselling on side effects or worsening of the disease/pain (31.2%); and counselling on therapy management (24.9%). Neither sex nor patient age was significantly related to these reasons for calling. Nurses resolved half (50.3%) of the calls autonomously, and in 79.8% of cases there was no need to refer the patient to other health services. About a quarter of patients adhered to the telephone helpline.
Patients called to obtain support in the management of disease and therapy, or to report side effects and/or symptom aggravation, in addition to practical instrumental reasons. This suggests that this service may provide important health gains, in addition to comfort for the patient.
The Virtual Mission Operations Center
NASA Technical Reports Server (NTRS)
Moore, Mike; Fox, Jeffrey
1994-01-01
Spacecraft management is becoming more human-intensive as spacecraft become more complex, and operations costs are growing accordingly. Several automation approaches have been proposed to lower these costs. However, most of these approaches are not flexible enough in the operations processes and levels of automation that they support. This paper presents a concept called the Virtual Mission Operations Center (VMOC) that provides highly flexible support for dynamic spacecraft management processes and automation. In a VMOC, operations personnel can be shared among missions, the operations team can change personnel and their locations, and automation can be added and removed as appropriate. The VMOC employs a form of on-demand supervisory control called management by exception to free operators from having to actively monitor their system. The VMOC extends management by exception, however, so that distributed, dynamic teams can work together. The VMOC uses work-group computing concepts and groupware tools to provide a team infrastructure, and it employs user agents to allow operators to define and control system automation.
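The management-by-exception idea, in which operators are alerted only when a monitored value leaves its nominal band rather than watching every channel, can be sketched as follows. The telemetry channel names and limits are invented for illustration and are not the VMOC's actual parameters:

```python
# Management by exception: consume telemetry silently while it is nominal,
# and notify an operator only when a value leaves its allowed band.
# Channel names and limits below are hypothetical examples.
LIMITS = {"battery_v": (24.0, 32.0), "temp_c": (-10.0, 45.0)}

def check(sample):
    """Return exception messages for one telemetry sample (empty if nominal)."""
    alerts = []
    for channel, value in sample.items():
        low, high = LIMITS[channel]
        if not (low <= value <= high):
            alerts.append(f"{channel}={value} outside [{low}, {high}]")
    return alerts

def monitor(samples, notify):
    """Process a telemetry stream; call notify() only for exceptions."""
    for sample in samples:
        for alert in check(sample):
            notify(alert)

log = []
monitor(
    [
        {"battery_v": 28.1, "temp_c": 21.0},  # nominal: no operator action
        {"battery_v": 23.2, "temp_c": 21.5},  # undervoltage: raises exception
    ],
    log.append,
)
print(log)  # one alert, for the undervoltage sample
```

In a distributed setting like the VMOC's, the `notify` callback would route alerts to whichever team member currently holds responsibility, rather than to a fixed console.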
Stress management standards: a warning indicator for employee health.
Kazi, A; Haslam, C O
2013-07-01
Psychological stress is a major cause of lost working days in the UK. The Health & Safety Executive (HSE) has developed management standards (MS) to help organizations assess work-related stress. The aim was to investigate the relationships between the MS indicator tool and employee health, job attitudes, work performance and environmental outcomes. The first phase involved a survey employing the MS indicator tool, the General Health Questionnaire-12 (GHQ-12), and job attitude, work performance and environmental measures in a call centre of a large utility company. The second phase comprised six focus groups to investigate what employees believed contributed to their perceived stress. Three hundred and four call centre employees responded, a response rate of 85%. Significant negative correlations were found between GHQ-12 scores and two MS dimensions: demands (Rho = -0.211, P < 0.001) and relationships (Rho = -0.134, P < 0.05). Other dimensions showed no significant relationship with GHQ-12. Higher levels of stress were associated with reduced job performance, reduced job motivation and increased intention to quit, but low stress levels were also associated with reduced job satisfaction. Lack of management support, recognition and development opportunities were identified as sources of stress. The findings support the utility of the MS as a measure of employee attitudes and performance.
Chronic condition self-management support for Aboriginal people: Adapting tools and training.
Battersby, Malcolm; Lawn, Sharon; Kowanko, Inge; Bertossa, Sue; Trowbridge, Coral; Liddicoat, Raylene
2018-04-22
Chronic conditions are major health problems for Australian Aboriginal people. Self-management programs can improve health outcomes. However, few health workers are skilled in self-management support, and existing programs are not always appropriate in Australian Aboriginal contexts. The goal was to increase the capacity of the Australian health workforce to support Australian Aboriginal people in self-managing their chronic conditions, by adapting the Flinders Program of chronic condition self-management support for Australian Aboriginal clients and by developing and delivering training for health professionals to implement the program. Feedback from health professionals highlighted that the Flinders Program assessment and care planning tools needed to be adapted to suit Australian Aboriginal contexts. Through consultation with Australian Aboriginal Elders and other experts, the tools were condensed into an illustrated booklet called 'My Health Story'. Associated training courses and resources focusing on cultural safety and effective engagement were developed. A total of 825 health professionals across Australia were trained, and 61 people qualified as accredited trainers in the program, ensuring sustainability. The capacity and skills of the Australian health workforce to engage with and support Australian Aboriginal people to self-manage their chronic health problems significantly increased as a result of this project. The adapted tools and training were popular and appreciated by the health care organisations, health professionals and clients involved. The adapted tools have widespread appeal for cultures that do not have Western models of health care and where there are health literacy challenges. My Health Story has already been used internationally. © 2018 National Rural Health Alliance Ltd.
Eckart, J Dana; Sobral, Bruno W S
2003-01-01
The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capability to understand biology and leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server-side, we employ web services. On the client-side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.
2014-10-01
designed an Internet-based and mobile application (software) to assist with the following domains pertinent to diabetes self-management: 1...management that provides education, reminders, and support. The new tool is an Internet-based and mobile application (software), now called Tracking...is mobile, provides decision support with actionable options, and is based on user input, will enhance diabetes self-care, improve glycemic control
Social Ontology Documentation for Knowledge Externalization
NASA Astrophysics Data System (ADS)
Aranda-Corral, Gonzalo A.; Borrego-Díaz, Joaquín; Jiménez-Mavillard, Antonio
Knowledge externalization and organization is a major challenge that companies must face. They must also ask whether it is possible to enhance its management. Mechanical processing of information represents a chance to carry out these tasks, as well as to turn intangible knowledge assets into real assets. Machine-readable knowledge provides a basis for enhancing knowledge management. A promising approach is the empowerment of knowledge externalization by the community (users, employees). In this paper, a social semantic tool (called OntoxicWiki) for enhancing the quality of knowledge is presented.
2004-03-03
Airborne Science personnel Walter Klein and David Bushman at the Mission Manager's console onboard NASA's DC-8 during the AirSAR 2004 campaign. AirSAR 2004 is a three-week expedition by an international team of scientists that will use an all-weather imaging tool, called the Airborne Synthetic Aperture Radar (AirSAR), in a mission ranging from the tropical rain forests of Central America to frigid Antarctica.
Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field
NASA Technical Reports Server (NTRS)
Hunter, Paul
2010-01-01
Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, the Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.
NASA Astrophysics Data System (ADS)
Klar, Jochen; Engelhardt, Claudia; Neuroth, Heike; Enke, Harry
2016-04-01
Following the call to make the results of publicly funded research openly accessible, more and more funding agencies demand the submission of a data management plan (DMP) as part of the application process. These documents specify how the data management of the project is organized and which datasets will be published when. Of particular importance for European researchers is the Open Research Data Pilot of Horizon 2020, which requires data management plans for a set of nine selected research fields ranging from the social sciences to nanotechnology. In order to assist researchers in creating these documents, several institutions have developed dedicated software tools. The best known are DMPonline by the Digital Curation Centre (DCC) and DMPTool by the California Digital Library (CDL) - both extensive and well-received web applications. The core functionality of these tools is the assisted editing of the DMP templates provided by the particular funding agency. While this is certainly helpful, especially in an environment with a plethora of different funding agencies such as the UK or the USA, these tools are somewhat limited to this particular task and do not utilise the full potential of DMPs. Beyond fulfilling funder requirements, DMPs can be useful for a number of additional tasks. In the initial conception phase of a project, they can be used as a planning tool to determine which data management activities and measures are necessary throughout the research process, to assess which resources are needed, and which institutions (computing centers, libraries, data centers) should be involved. During the project, they can act as a constant reference or guideline for the handling of research data. They also determine where the data will be stored after the project has ended and whether it can be accessed by the public, helping to take into account the resulting requirements of the data center, and the actions necessary to ensure re-usability by others, from early on.
Ideally, a DMP acts as an information source for all stakeholders involved throughout the complete life cycle of the project. The aim of our project is the development of a web application called DMPwerkzeug, which enables this structured planning, implementation and administration of research data management in a scientific project and, in addition, provides the scientist with a textual DMP. Building upon a generic set of content, DMPwerkzeug will be customizable to serve specific disciplinary and institutional requirements. The tool will not only be available at a central web site, but will also be installable and integrable into the existing infrastructure of a university or research institution. The tool will be multilingual, with a first version published in English and German.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
OTLA: A New Model for Online Teaching, Learning and Assessment in Higher Education
ERIC Educational Resources Information Center
Ghilay, Yaron; Ghilay, Ruth
2013-01-01
The study examined a new asynchronous model for online teaching, learning and assessment, called OTLA. It is designed for higher-education institutions and is based on LMS (Learning Management System) as well as other relevant IT tools. The new model includes six digital basic components: text, hypertext, text reading, lectures (voice/video),…
Forage resource evaluation system for habitat—deer: an interactive deer habitat model
Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris
2012-01-01
We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...
Database Software for Social Studies. A MicroSIFT Quarterly Report.
ERIC Educational Resources Information Center
Weaver, Dave
The report describes and evaluates the use of a set of learning tools called database managers and their creation of databases to help teach problem solving skills in social studies. Details include the design, building, and use of databases in a social studies setting, along with advantages and disadvantages of using them. The three types of…
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
Portable parallel portfolio optimization in the Aurora Financial Management System
NASA Astrophysics Data System (ADS)
Laure, Erwin; Moritsch, Hans
2001-07-01
Financial planning problems are formulated as large-scale, stochastic, multiperiod, tree-structured optimization problems. An efficient technique for solving this kind of problem is the nested Benders decomposition method. In this paper we present a parallel, portable, asynchronous implementation of this technique. To achieve our portability goals we chose the programming language Java for our implementation and used a high-level Java-based framework, called OpusJava, for expressing the parallelism potential as well as synchronization constraints. Our implementation is embedded within a modular decision support tool for portfolio and asset liability management, the Aurora Financial Management System.
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.
The Future of the Internet in Science
NASA Technical Reports Server (NTRS)
Guice, Jon; Duffy, Robert
2000-01-01
How are scientists going to make use of the Internet several years from now? This is a case study of a leading-edge experiment in building a 'virtual institute' -- using electronic communication tools to foster collaboration among geographically dispersed scientists. Our experience suggests the following: scientists will want to use web-based document management systems; there will be a demand for Internet-enabled meeting support tools; while Internet videoconferencing will have limited value for scientists, webcams will be in great demand as a tool for transmitting pictures of objects and settings rather than "talking heads"; and a significant share of scientists who do fieldwork will embrace mobile voice, data and video communication tools. The setting for these findings is a research consortium called the NASA Astrobiology Institute.
Karp, Peter D; Paley, Suzanne; Romero, Pedro
2002-01-01
Bioinformatics requires reusable software tools for creating model-organism databases (MODs). The Pathway Tools is a reusable, production-quality software environment for creating a type of MOD called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc (see http://ecocyc.org) integrates our evolving understanding of the genes, proteins, metabolic network, and genetic network of an organism. This paper provides an overview of the four main components of the Pathway Tools: The PathoLogic component supports creation of new PGDBs from the annotated genome of an organism. The Pathway/Genome Navigator provides query, visualization, and Web-publishing services for PGDBs. The Pathway/Genome Editors support interactive updating of PGDBs. The Pathway Tools ontology defines the schema of PGDBs. The Pathway Tools makes use of the Ocelot object database system for data management services for PGDBs. The Pathway Tools has been used to build PGDBs for 13 organisms within SRI and by external users.
Llandres, Ana L; Almohamad, Raki; Brévault, Thierry; Renou, Alain; Téréta, Idrissa; Jean, Janine; Goebel, François-Regis
2018-04-17
Enhancing cotton pest management using plant natural defenses has been described as a promising way to improve the management of crop pests. Here we review studies of cotton growing systems to illustrate how an ancient technique called plant training, which includes plant topping and pruning, may contribute to this goal. Based on examples from cotton crops, we show how trained plants can be promoted to a state of enhanced defense that causes faster and more robust activation of their defense responses. We revisit the agricultural benefits associated with this technique in cotton crops, with a focus on its potential as a supplementary tool for Integrated Pest Management (IPM). In particular, we examine its role in mediating plant interactions with conspecific neighboring plants, pests and associated natural enemies. We propose a new IPM tool, plant training for induced defense, which involves inducing plant defense by artificial injuries. Experimental evidence from various studies shows that cotton training is a promising technique, particularly for smallholders, which can be used as part of an IPM program to significantly reduce insecticide use and to improve productivity in cotton farming.
DeMAID/GA USER'S GUIDE Design Manager's Aid for Intelligent Decomposition with a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Rogers, James L.
1996-01-01
Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One tool that is available to aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial release of DeMAID in 1989, numerous enhancements have been added to aid the design manager in saving both cost and time in a design cycle. The key enhancement is a genetic algorithm (GA), and the enhanced version is called DeMAID/GA. The GA orders the sequence of design processes to minimize the cost and time to converge to a solution. These enhancements, as well as the existing features of the original version of DeMAID, are described. Two sample problems are used to show how these enhancements can be applied to improve the design cycle. This report serves as a user's guide for DeMAID/GA.
AquaCrop-OS: A tool for resilient management of land and water resources in agriculture
NASA Astrophysics Data System (ADS)
Foster, Timothy; Brozovic, Nicholas; Butler, Adrian P.; Neale, Christopher M. U.; Raes, Dirk; Steduto, Pasquale; Fereres, Elias; Hsiao, Theodore C.
2017-04-01
Water managers, researchers, and other decision makers worldwide face the challenge of increasing food production under population growth, drought, and rising water scarcity. Crop simulation models are valuable tools in this effort and, importantly, provide a means of rapidly quantifying crop yield response to water, climate, and field management practices. Here, we introduce a new open-source crop modelling tool called AquaCrop-OS (Foster et al., 2017), which extends the functionality of the globally used FAO AquaCrop model. Through case studies focused on groundwater-fed irrigation in the High Plains and the Central Valley of California in the United States, we demonstrate how AquaCrop-OS can be used to understand the local biophysical, behavioural, and institutional drivers of water risks in agricultural production. Furthermore, we also illustrate how AquaCrop-OS can be combined effectively with hydrologic and economic models to support drought risk mitigation and decision-making around water resource management at a range of spatial and temporal scales, and highlight future plans for model development and training. T. Foster, et al. (2017) AquaCrop-OS: An open source version of FAO's crop water productivity model. Agricultural Water Management 181: 18-22. http://dx.doi.org/10.1016/j.agwat.2016.11.015.
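AquaCrop grew out of the FAO's earlier work on crop yield response to water. A much-simplified sketch of that classic FAO-33 relation (not of AquaCrop-OS itself, which runs a daily soil-water and canopy simulation) gives the flavor of what such models quantify:

```python
# Much-simplified illustration of crop yield response to water, using the
# classic FAO-33 relation that AquaCrop generalizes:
#     1 - Ya/Yx = Ky * (1 - ETa/ETx)
# where Ya/Yx is relative yield, ETa/ETx relative evapotranspiration, and
# Ky the crop's yield response factor. Numbers below are illustrative only.

def relative_yield(eta, etx, ky):
    """Actual/maximum yield Ya/Yx given actual and maximum evapotranspiration."""
    return 1.0 - ky * (1.0 - eta / etx)

# A maize-like Ky of ~1.25 and a 25% seasonal ET deficit (illustrative):
print(relative_yield(eta=450.0, etx=600.0, ky=1.25))  # 0.6875
```

The linear relation shows why water-sensitive crops (Ky > 1) lose proportionally more yield than the ET deficit itself; AquaCrop replaces this single coefficient with a dynamic biomass/water-productivity simulation.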
Command Center Training Tool (C2T2)
NASA Technical Reports Server (NTRS)
Jones, Phillip; Drucker, Nich; Mathews, Reejo; Stanton, Laura; Merkle, Ed
2012-01-01
This abstract presents the training approach taken to create a management-centered, experiential learning solution for the Virginia Port Authority's Port Command Center. The resultant tool, called the Command Center Training Tool (C2T2), follows a holistic approach integrated across the training management cycle and within a single environment. The approach allows a single training manager to progress from training design through execution and after-action review (AAR). It starts with modeling the training organization, identifying the organizational elements and their individual and collective performance requirements, including organization-specific performance scoring ontologies. Next, the developer specifies conditions, the problems and constructs that compose exercises and drive experiential learning. These conditions are defined by incidents, which denote a single multimedia datum, and scenarios, which are stories told through incidents. Previously developed metadata, including associated performance requirements, is attached to these layered, modular components, which are then stored in a searchable library. An event developer can create a training event by searching the library on metadata and then selecting and loading the resulting modular pieces. This loading process brings into the training event all the previously associated task and teamwork material as well as AAR preparation materials. The approach includes tools within an integrated management environment that place these materials at the fingertips of the event facilitator so that, in real time, the facilitator can track training audience performance and modify the training event accordingly. The approach also supports the concentrated knowledge management requirements for rapid preparation of an extensive AAR. It supports the integrated training cycle and allows a management-based perspective and advanced tools through which a complex, thorough training event can be developed.
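The metadata-driven library search described above can be illustrated with a toy example: incidents carry metadata tags, and an event developer assembles a training event by querying them (all field names, IDs, and tags below are invented for illustration):

```python
# Hypothetical sketch of searching a library of training components by
# metadata tags, in the spirit of the C2T2 workflow. Not the actual tool.

incidents = [
    {"id": "INC-01", "tags": {"hazmat", "comms"}, "performance_reqs": ["notify-watch"]},
    {"id": "INC-02", "tags": {"weather"},         "performance_reqs": ["reroute-traffic"]},
    {"id": "INC-03", "tags": {"hazmat", "media"}, "performance_reqs": ["brief-pio"]},
]

def search_library(items, required_tags):
    """Return items whose metadata contains all of the required tags."""
    required = set(required_tags)
    return [item for item in items if required <= item["tags"]]

selected = search_library(incidents, {"hazmat"})
print([item["id"] for item in selected])  # ['INC-01', 'INC-03']
```

Because each selected component already carries its performance requirements as metadata, loading it into an event would also pull in the associated scoring and AAR material, which is the point of the layered, modular design.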
ERIC Educational Resources Information Center
Moreillon, Judi
2015-01-01
As more and more library and LIS education migrates to the online environment, college and university faculty are called upon to expand their pedagogy. Interactivity has been cited as one factor in attracting and retaining students in online courses and programs. LIS educators can reach outside the online learning management system (LMS) to…
Rain gardens are designed to capture and infiltrate rainwater in the landscape. These gardens are also called "rain water gardens". Rainwater is routed to the garden and filtered naturally by the plants and soils in the garden. This filtration process removes nutrients and poll...
NASA Technical Reports Server (NTRS)
Phojanamongkolkij, Nipa; Okuniek, Nikolai; Lohr, Gary W.; Schaper, Meilin; Christoffels, Lothar; Latorella, Kara A.
2014-01-01
The runway is a critical resource of any air transport system. It is used for arrivals, departures, and taxiing aircraft, and is universally acknowledged as a constraining factor on capacity for both surface and airspace operations. It follows that investigation of the effective use of runways, both in terms of selection and assignment as well as the timing and sequencing of traffic, is paramount to efficient traffic flow. Both the German Aerospace Center (DLR) and NASA have developed concepts and tools to improve atomic aspects of coordinated arrival/departure/surface management operations and runway configuration management. In December 2012, NASA entered into a Collaborative Agreement with DLR. Four collaborative work areas were identified, one of which is called "Runway Management." As part of collaborative research in the "Runway Management" area, which is conducted with the DLR Institute of Flight Guidance, located in Braunschweig, the goal is to develop an integrated system comprising the three DLR tools - arrival, departure, and surface management (collectively referred to as A/D/S-MAN) - and NASA's tactical runway configuration management (TRCM) tool. To achieve this goal, it is critical to prepare a concept of operations (ConOps) detailing how the NASA runway management and DLR arrival, departure, and surface management tools will function together to the benefit of each. To assist with the preparation of the ConOps, the integrated NASA and DLR tools are assessed through a functional analysis method described in this report. The report first provides the high-level operational environments for air traffic management (ATM) in Germany and in the U.S., and descriptions of DLR's A/D/S-MAN and NASA's TRCM tools at the level of detail necessary to complement the purpose of the study. Functional analyses of each tool and a completed functional analysis of an integrated system design are presented next in the report.
Future efforts to fully develop the ConOps will include: developing scenarios to fully test environmental, procedural, and data availability assumptions; executing the analysis by a walk-through of the integrated system using these scenarios; defining the appropriate role of operators in terms of their monitoring requirements and decision authority; executing the analysis by a walk-through of the integrated system with operator involvement; characterizing the environmental, system data requirements, and operator role assumptions for the ConOps.
Advancements in nano-enabled therapeutics for neuroHIV management.
Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan
This viewpoint is a global call to promote fundamental and applied research aimed at designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single or multiple therapeutic agents across the BBB to eradicate neuro-human immunodeficiency virus (neuroHIV), strategies for on-demand site-specific release of antiretroviral therapy, novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance for eradicating and monitoring neuro-acquired immunodeficiency syndrome (neuroAIDS). Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.
Developing mechanisms for estimating carbon footprint in farming systems
NASA Astrophysics Data System (ADS)
Anaya-Romero, María; Fernández Luque, José Enrique; Rodríguez Merino, Alejandro; José Moreno Delgado, Juan; Rodado, Concepción Mira; Romero Vicente, Rafael; Perez-Martin, Alfonso; Muñoz-Rojas, Miriam
2015-04-01
Sustainable land management is critical to avoid land degradation and to reclaim degraded land for its productive use and for reaping the benefits of crucial ecosystem services and protecting biodiversity. It also helps in mitigating and adapting to climate change. Land and its various uses are affected severely by climate change too (flooding, droughts, etc.). Existing tools and technologies for efficient land management need to be adapted and their application expanded. A large number of human livelihoods and ecosystems can benefit from these tools and techniques since these yield multiple benefits. Disseminating and scaling up the implementation of sustainable land management approaches will, however, need to be backed up by mobilizing strong political will and financial resources. The challenge is to provide an integral decision support tool that can establish relationships between soil carbon content, climate change and land use and management aspects that allow stakeholders to detect, cope with and intervene into land system change in a sustainable way. In order to achieve this goal an agro-ecological meta-model called CarboLAND will be calibrated in several plots located in Andalusia region, Southern Spain, under different scenarios of climate and agricultural use and management. The output will be the CLIMALAND e-platform, which will also include protocols in order to support stakeholders for an integrated ecosystem approach, taking into account biodiversity, hydrological and soil capability, socio-economic aspects, and regional and environmental policies. This tool will be made available at the European context for a regional level, providing user-friendly interfaces and a scientifically-technical platform for the assessment of sustainable land use and management.
Development of demand forecasting tool for natural resources recouping from municipal solid waste.
Zaman, Atiq Uz; Lehmann, Steffen
2013-10-01
Sustainable waste management requires an integrated planning and design strategy for reliable forecasting of waste generation, collection, recycling, treatment and disposal for the successful development of future residential precincts. The success of future development and management of waste relies to a high extent on the accuracy of prediction and on a comprehensive understanding of the overall waste management system. This study challenges the traditional concept of waste, in which waste is considered the last phase of production and services, by putting forward a new concept of waste as an intermediate phase of production and services. The study aims to develop a demand forecasting tool called the 'zero waste index' (ZWI) for measuring the natural resources recouped from municipal solid waste. The ZWI quantifies the amount of virgin material recovered from solid waste and thus the reduction in extraction of natural resources. In addition, the tool estimates the potential amount of energy, water and emissions avoided or saved by the improved waste management system. The ZWI is tested in a case study of waste management systems in two developed cities: Adelaide (Australia) and Stockholm (Sweden). The ZWI of the waste management systems in Adelaide and Stockholm is 0.33 and 0.17 respectively. The study also enumerates per capita energy savings of 2.9 GJ and 2.83 GJ, greenhouse gas emission reductions of 0.39 tonnes (CO2e) and 0.33 tonnes (CO2e), and water savings of 2.8 kL and 0.92 kL in Adelaide and Stockholm respectively.
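The ZWI idea (virgin material substituted per unit of waste generated) can be sketched as a simple weighted ratio. The substitution factors and tonnages below are invented for illustration and are not the study's data or method in detail:

```python
# Illustrative ZWI-style calculation: tonnes of virgin material substituted
# by each management stream, per tonne of waste generated. All factors and
# tonnages are hypothetical, not from Zaman & Lehmann.

SUBSTITUTION_FACTOR = {   # virgin material replaced per tonne sent to stream
    "recycling":  0.8,
    "composting": 0.6,
    "energy":     0.2,
    "landfill":   0.0,
}

def zero_waste_index(tonnes_by_stream):
    """Virgin material substituted divided by total waste generated."""
    total = sum(tonnes_by_stream.values())
    substituted = sum(SUBSTITUTION_FACTOR[s] * t for s, t in tonnes_by_stream.items())
    return substituted / total

city = {"recycling": 30.0, "composting": 10.0, "energy": 20.0, "landfill": 40.0}
print(round(zero_waste_index(city), 2))  # 0.34
```

A ZWI of 1.0 would mean every tonne of waste fully replaces a tonne of virgin material; a landfill-dominated system approaches 0, which is how figures like Adelaide's 0.33 versus Stockholm's 0.17 can be compared.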
EGenBio: A Data Management System for Evolutionary Genomics and Biodiversity
Nahum, Laila A; Reynolds, Matthew T; Wang, Zhengyuan O; Faith, Jeremiah J; Jonna, Rahul; Jiang, Zhi J; Meyer, Thomas J; Pollock, David D
2006-01-01
Background: Evolutionary genomics requires management and filtering of large numbers of diverse genomic sequences for accurate analysis and inference about evolutionary processes of genomic and functional change. We developed Evolutionary Genomics and Biodiversity (EGenBio) to begin to address this. Description: EGenBio is a system for manipulation and filtering of large numbers of sequences, integrating curated sequence alignments and phylogenetic trees, managing evolutionary analyses, and visualizing their output. EGenBio is organized into three conceptual divisions: Evolution, Genomics, and Biodiversity. The Genomics division includes tools for selecting pre-aligned sequences from different genes and species, and for modifying and filtering these alignments for further analysis. Species searches are handled through queries that can be modified based on a tree-based navigation system and saved. The Biodiversity division contains tools for analyzing individual sequences or sequence alignments, whereas the Evolution division contains tools involving phylogenetic trees. Alignments are annotated with analytical results and modification history using our PRAED format. A miscellaneous Tools section and a Help framework are also available. EGenBio was developed around our comparative genomic research and a prototype database of mtDNA genomes. It utilizes MySQL relational databases and dynamic page generation, and calls numerous custom programs. Conclusion: EGenBio was designed to serve as a platform for tools and resources to ease combined analysis in evolution, genomics, and biodiversity. PMID:17118150
NASA Astrophysics Data System (ADS)
Almeida, W. G.; Ferreira, A. L.; Mendes, M. V.; Ribeiro, A.; Yoksas, T.
2007-05-01
CPTEC, a division of Brazil's INPE, has been using several open-source software packages for a variety of tasks in its Data Division. Among these tools are ones traditionally used in research and educational communities, such as GrADS (the Grid Analysis and Display System from the Center for Ocean-Land-Atmosphere Studies (COLA)), the Local Data Manager (LDM) and GEMPAK (from Unidata), and operational tools such as the Automatic File Distributor (AFD) that are popular among National Meteorological Services. In addition, some tools developed locally at CPTEC are also being made available as open-source packages. One package is being used to manage the data from the Automatic Weather Stations that INPE operates. This system uses only open-source tools, such as the MySQL database, Perl scripts and Java programs for web access, and Unidata's Internet Data Distribution (IDD) system and AFD for data delivery. All of these packages are bundled into a low-cost, easy-to-install package called the Meteorological Data Operational System. Recently, in cooperation with the SICLIMAD project, this system has been modified for use by Portuguese-speaking countries in Africa to manage data from the many Automatic Weather Stations being installed in those countries under SICLIMAD sponsorship. In this presentation we describe the tools included in, and the architecture of, the Meteorological Data Operational System.
Coordinating complex problem-solving among distributed intelligent agents
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1992-01-01
A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.
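The script-execution-and-routing behavior attributed to the Manager Agents can be illustrated with a toy dispatcher. Everything below (the capability registry, the script format, the handler names) is invented for illustration; it only mirrors the pattern of routing script steps to whichever server advertises the needed capability:

```python
# Hedged sketch of Manager-Agent-style task routing; not the SOCIAL API.
class ManagerAgent:
    def __init__(self):
        self.servers = {}  # capability name -> handler function

    def register(self, capability, handler):
        """A server application advertises a capability it can perform."""
        self.servers[capability] = handler

    def run_script(self, script):
        """script: ordered list of (capability, payload) steps; each step is
        routed to the registered server for that capability."""
        return [self.servers[capability](payload) for capability, payload in script]

agent = ManagerAgent()
agent.register("diagnose", lambda p: f"diagnosed:{p}")
agent.register("plan", lambda p: f"planned:{p}")
print(agent.run_script([("diagnose", "pump-3"), ("plan", "repair")]))
# prints ['diagnosed:pump-3', 'planned:repair']
```

In the real system the handlers would be remote, heterogeneous applications; the point is that the script author names capabilities, not servers.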
Interest in a digital health tool in veterans with epilepsy: results of a phone survey.
Hixson, John D; Van Bebber, Stephanie L; Bertko, Kate M
2015-04-01
Online tools for managing chronic health conditions are becoming increasingly popular. Perceived benefits include ease of use, low costs, and availability but are contingent on patient engagement, Internet access, and digital literacy. This article describes data collected during the recruitment phase of a study evaluating an online self-management platform for epilepsy in a U.S. Veteran population. We used administrative data to identify and contact Veterans with a likely diagnosis of epilepsy in the Veterans Health Administration (VHA). Veterans who did not respond directly to a mailed invitation were recruited by phone to determine study interest and evaluate digital access. Of the 2,143 Veterans mailed study invitations, phone calls were made to 1,789 who did not specifically decline participation. Among those reached by phone (n = 1,053): 295 (28%) expressed interest in the study and an online tool, 333 (19%) reported a lack of computer and/or Internet access and 425 (40%) were not interested for other reasons. This study suggests an interest in online tools for managing health despite the fact that some Veterans lack computer and/or Internet access. As investment in digital health solutions grows, the VHA should prioritize the widespread provision of digital access to more Veterans. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Development of Medical Technology for Contingency Response to Marrow Toxic Agents
2018-06-06
Health Physics, e. Emergency Medicine, f. Burn Care, g. State Public Health, h. Federal Public Health, i. Emergency Management. 2. The group has... Preparedness 4 Project: Local Public Health Radiological Preparedness Gap Review and Tool Development Identification. 1. The National Association... of County and City Health Officials (NACCHO) has held multiple conference calls with leaders within their organization to identify the areas of
Bangbang Zhang; Gary Feng; Lajpat R. Ahuja; Xiangbin Kong; Ying Ouyang; Ardeshir Adeli; Johnie N. Jenkins
2018-01-01
Crop production as a function of water use or water applied, called the crop water production function (CWPF), is a useful tool for irrigation planning, design and management. However, these functions are not only crop- and variety-specific but also vary with soil types and climatic conditions (locations). Derivation of multi-year average CWPFs through field...
2016-07-01
from unlawful discrimination based on race, color, national origin, religion, sex (including pregnancy, gender identity, and sexual orientation when... range of online resources for diversity management and equal opportunity programming. DEOMI's Research Directorate administers a survey called the... Defense Equal Opportunity Climate Survey (DEOCS). This survey is intended to be a tool for commanders to improve their organizational culture. It
A New Tool for Managing Students' Self-Evaluations in Traditional and Distance Education Courses.
ERIC Educational Resources Information Center
Contreras-Castillo, Juan; Block, Arthur Edwards
This paper describes a Web-based question-answering system called E-teacher that can be used in both traditional and distance learning courses to review academic contents. E-teacher is a self-editing template-based system that consists of a set of PHP scripts that generate the HTML code dynamically, or "on the fly." The entire E-teacher…
SCI peer health coach influence on self-management with peers: a qualitative analysis.
Skeels, S E; Pernigotti, D; Houlihan, B V; Belliveau, T; Brody, M; Zazula, J; Hasiotis, S; Seetharama, S; Rosenblum, D; Jette, A
2017-11-01
Study design: A process evaluation of a clinical trial. Objective: To describe the roles fulfilled by peer health coaches (PHCs) with spinal cord injury (SCI) during a randomized controlled trial called 'My Care My Call', a novel telephone-based, peer-led self-management intervention for adults with chronic SCI 1+ years after injury. Setting: Connecticut and Greater Boston Area, MA, USA. Methods: Directed content analysis was used to qualitatively examine information from 504 tele-coaching calls, conducted with 42 participants with SCI, by two trained SCI PHCs. Self-management was the focus of each 6-month PHC-peer relationship. PHCs documented how and when they used the communication tools (CTs) and information delivery strategies (IDSs) they developed for the intervention. Interaction data were coded and analyzed to determine PHC roles in relation to CT and IDS utilization and application. Results: PHCs performed three principal roles: Role Model, Supporter, and Advisor. Role Model interactions included CTs and IDSs that allowed PHCs to share personal experiences of managing and living with an SCI, including sharing their opinions and advice when appropriate. As Supporters, PHCs used CTs and IDSs to build credible relationships based on dependability and reassuring encouragement. PHCs fulfilled the unique role of Advisor using CTs and IDSs to teach and strategize with peers about SCI self-management. Conclusion: The SCI PHC performs a powerful, flexible role in promoting SCI self-management among peers. Analysis of PHC roles can inform the design of peer-led interventions and highlights the importance of providing peer mentor training.
Kelly, Michelle M; Blunt, Elizabeth; Nestor, Kelly
2017-12-01
Few nurse practitioner (NP) programs include an after-hours/on-call component in their clinical preparation of NP students. This role is expected in many primary and specialty care practices, and is one that students feel unprepared to competently navigate. Using simulated callers as patients or parents, NP students participated in a simulated after-hours/on-call experience that included receiving the call, managing the patient, and submitting documentation of the encounter. Students completed pre- and postparticipation evaluations, and were evaluated by the simulated patient callers and faculty using standardized evaluation tools. NP students rated the experience as educationally valuable despite feeling anxious and nervous about it. Several essential skills were identified, including critical thinking, clear communication, self-confidence, and access to resources. After participation, NP students were more receptive to an NP position with an on-call component. Inclusion of a simulated on-call experience is a feasible component of NP education and should be added to the NP curriculum. ©2017 American Association of Nurse Practitioners.
THE HUMAN BEHAVIOR RATING SCALE-BRIEF: A TOOL TO MEASURE 21ST CENTURY SKILLS OF K-12 LEARNERS.
Woods-Groves, Suzanne
2015-06-01
Currently there is a call for brief, concise measurements to appraise relevant 21st century college readiness skills in K-12 learners. This study employed K-12 teachers' ratings for over 3,000 students on an existing 91-item rating scale, the Human Behavior Rating Scale, which measured the 21st century skills of persistence, curiosity, externalizing affect, internalizing affect, and cognition. Teachers' ratings for K-12 learners were used to develop a brief, concise, and manageable 30-item tool, the Human Behavior Rating Scale-Brief. Results yielded high internal consistency coefficients and inter-item correlations. The items were not biased with regard to student sex or race, and were supported through confirmatory factor analyses. In addition, when teachers' ratings were compared with students' academic and behavioral performance data, moderate to strong relationships were revealed. This study provides an essential first step in the development of a psychometrically sound, manageable, and brief tool to appraise 21st century skills in K-12 learners.
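The internal-consistency coefficients such studies report are conventionally Cronbach's alpha. A small self-contained sketch of that statistic follows; the ratings below are invented, not HBRS data:

```python
# Hedged sketch: Cronbach's alpha for a rating scale. Ratings are invented.
def cronbach_alpha(ratings):
    """ratings: list of per-respondent lists of item scores (equal length).
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)."""
    k = len(ratings[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([r[i] for r in ratings]) for i in range(k)]
    total_var = var([sum(r) for r in ratings])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Five respondents rating a 3-item scale; items deliberately correlated.
ratings = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 4, 3], [1, 2, 1]]
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # prints "alpha = 0.97"
```

High alpha here simply reflects how strongly the toy items co-vary; a real scale analysis would also examine inter-item correlations and factor structure, as the study describes.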
Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry
2015-11-15
With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. Copyright © 2015.
Published by Elsevier B.V.
Student Evaluation of CALL Tools during the Design Process
ERIC Educational Resources Information Center
Nesbitt, Dallas
2013-01-01
This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…
Precision Departure Release Capability (PDRC) Final Report
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin Brian; Kistler, Matthew Stephen; Gaither, Frank; Juro, Greg
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows that may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool, based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. 
NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents research results from the PDRC research activity. Companion papers present the Concept of Operations and a Technology Description.
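The claimed benefit of seconds-level release times over three-minute windows can be illustrated with a toy Monte Carlo: as ready-time prediction error shrinks, fewer departures fall outside their en route slot. The error spreads below are assumptions for illustration, not PDRC measurements:

```python
import random

# Hedged sketch: how ready-time prediction error turns into missed en route
# slots. The 3-minute (+/- 90 s) window and error spreads are illustrative.
def miss_rate(error_sd_s, slot_half_width_s=90, trials=100_000, seed=1):
    """Fraction of departures whose actual ready time falls outside the slot,
    assuming Gaussian prediction error with the given standard deviation."""
    rng = random.Random(seed)
    misses = sum(
        abs(rng.gauss(0, error_sd_s)) > slot_half_width_s for _ in range(trials)
    )
    return misses / trials

manual = miss_rate(error_sd_s=120)    # verbally coordinated, manual estimates
automated = miss_rate(error_sd_s=15)  # surface trajectory-based prediction
print(f"manual: {manual:.1%}  automated: {automated:.1%}")
```

With these assumed spreads the manual process misses the slot for a large fraction of flights while the automated one almost never does, which is the qualitative effect the field evaluation measured.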
Precision Departure Release Capability (PDRC): NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Thomas J.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. 
NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations.
Ong, Stephanie W; Jassal, Sarbjit V; Porter, Eveline; Logan, Alexander G; Miller, Judith A
2013-01-01
New healthcare delivery models are needed to enhance the patient experience and improve quality of care for individuals with chronic conditions such as kidney disease. One potential avenue is to implement self-management strategies. There is growing evidence that self-management interventions help optimize various aspects of chronic disease management. With the increasing use of information technology (IT) in health care, chronic disease management programs are incorporating IT solutions to support patient self-management practices. IT solutions have the ability to promote key principles of self-management, namely education, empowerment, and collaboration. Positive clinical outcomes have been demonstrated for a number of chronic conditions when IT solutions were incorporated into self-management programs. There is a paucity of evidence for self-management in chronic kidney disease (CKD) patients. Furthermore, IT strategies have not been tested in this patient population to the same extent as other chronic conditions (e.g., diabetes, hypertension). Therefore, it is currently unknown if IT strategies will promote self-management behaviors and lead to improvements in overall patient care. We designed and developed an IT solution called My KidneyCare Centre to support self-management strategies for patients with CKD. In this review, we discuss the rationale and vision of incorporating an electronic self-management tool to support the care of patients with CKD. © 2013 Wiley Periodicals, Inc.
Integrated modeling approach for optimal management of water, energy and food security nexus
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Vesselinov, Velimir V.
2017-03-01
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production and delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will affect WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the trade-offs among various sectors and of generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply and demand, food production, and mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.
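As a rough illustration of the kind of cost minimization WEFO performs, here is a toy single-period version: choose how many units of each supply technology to build so that water, energy and food demands are all met at least cost. All technologies, coefficients, demands and costs are invented:

```python
from itertools import product

# Hedged sketch: toy WEF-nexus cost minimization by exhaustive search.
# Each option: (name, water, energy, food, cost) per unit built; negative
# values mean the technology consumes that resource.
OPTIONS = [
    ("groundwater_pump", 10, -1, 0, 3),  # supplies water, uses energy
    ("gas_turbine",      -2,  8, 0, 5),  # supplies energy, uses water
    ("irrigated_farm",   -4, -1, 6, 4),  # supplies food, uses water + energy
    ("solar_pv",          0,  5, 0, 6),  # supplies energy only
]
DEMAND = {"water": 20, "energy": 15, "food": 12}

def best_plan(max_units=6):
    """Enumerate unit counts; return (cost, plan) of the cheapest feasible mix."""
    best = None
    for units in product(range(max_units + 1), repeat=len(OPTIONS)):
        water = sum(u * o[1] for u, o in zip(units, OPTIONS))
        energy = sum(u * o[2] for u, o in zip(units, OPTIONS))
        food = sum(u * o[3] for u, o in zip(units, OPTIONS))
        if water >= DEMAND["water"] and energy >= DEMAND["energy"] and food >= DEMAND["food"]:
            cost = sum(u * o[4] for u, o in zip(units, OPTIONS))
            if best is None or cost < best[0]:
                best = (cost, {o[0]: u for o, u in zip(OPTIONS, units)})
    return best

cost, plan = best_plan()
print(cost, plan)  # cheapest feasible technology mix for these toy numbers
```

A real WEFO-style model would be multi-period and solved with mathematical programming rather than enumeration, but the trade-off structure (energy for water, water and energy for food) is the same.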
Measuring engagement effectiveness in social media
NASA Astrophysics Data System (ADS)
Li, Lei; Sun, Tong; Peng, Wei; Li, Tao
2012-03-01
Social media is becoming increasingly prevalent with the advent of Web 2.0 technologies. Popular social media websites, such as Twitter and Facebook, are attracting a gigantic number of online users to post and share information. An interesting phenomenon under this trend is that more and more users share their experiences or issues with regard to a product, and the product's service agents then use commercial social media listening and engagement tools (e.g., Radian6, Sysomos) to respond to users' complaints or issues and help them tackle their problems. This is often called customer care in social media, or social customer relationship management (CRM). However, all these existing commercial social media tools only provide an aggregated level of trend, pattern and sentiment analysis based on keyword-centric, brand-relevant data, which offers little insight for answering one of the key questions in a social CRM system: how effective is our social customer care engagement? In this paper, we focus on addressing the problem of how to measure the effectiveness of engagement for service agents in customer care. Traditional CRM effectiveness measurements are defined for the call center scenario, where effectiveness is mostly based on the duration per call and/or the number of answered calls per day. Unlike customer care in a call center, in social media we can obtain the detailed conversations between agents and customers, and therefore effectiveness can be measured by analyzing the content of conversations and the sentiment of customers.
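One simple realization of the content-based measurement the paper argues for is to score the shift in customer sentiment across a conversation. The word lists and the sample chat below are toy stand-ins for a real sentiment model:

```python
# Hedged sketch: engagement effectiveness as customer sentiment shift across
# a conversation. The lexicon and conversation are invented toy examples.
POSITIVE = {"thanks", "great", "solved", "works", "perfect"}
NEGATIVE = {"broken", "angry", "terrible", "waiting", "useless"}

def sentiment(text):
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def engagement_effectiveness(conversation):
    """conversation: list of (speaker, text). Score = customer sentiment shift
    from the customer's first message to their last one."""
    customer = [text for speaker, text in conversation if speaker == "customer"]
    return sentiment(customer[-1]) - sentiment(customer[0])

chat = [
    ("customer", "my phone is broken and I am angry"),
    ("agent", "sorry to hear that, let us try a reset"),
    ("customer", "the reset works, thanks, great support"),
]
print(engagement_effectiveness(chat))  # prints 5: strongly negative -> positive
```

An aggregate agent score would average this shift over all of an agent's conversations, possibly alongside resolution rate and response latency.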
A comparative evaluation of five hazard screening tools.
Panko, J M; Hitchcock, K; Fung, M; Spencer, P J; Kingsbury, T; Mason, A M
2017-01-01
An increasing number of hazard assessment tools and approaches are being used in the marketplace as a means to differentiate products and ingredients with lower versus higher hazards, or to certify what some call greener chemical ingredients in consumer products. Some leading retailers have established policies requiring product manufacturers and their suppliers to disclose chemical ingredients and their related hazard characteristics, often specifying which tools to use. To date, no data exist showing a tool's reliability in providing consistent, credible screening-level hazard scores that can inform greener product selection. We conducted a small pilot study to understand and compare the hazard scoring of several hazard screening tools, to determine whether hazard and toxicity profiles for chemicals differ. Seven chemicals were selected that represent both natural and man-made chemistries as well as a range of toxicological activity. We conducted the assessments according to each tool provider's guidelines, which included factors such as endpoints, weighting preferences, sources of information, and treatment of data gaps. The results indicate the tools varied in the level of discrimination seen in the scores for these 7 chemicals, and that classifications of the same chemical varied widely between tools, ranging from little or no hazard or toxicity to very high hazard or toxicity. The results also highlight the need for transparency in describing the basis for each tool's hazard scores and suggest possible enhancements. Based on this pilot study, tools should not be generalized to fit all situations because their evaluations are context-specific. Before choosing a tool or approach, it is critical that the assessment rationale be clearly defined and match the selected tool or approach. Integr Environ Assess Manag 2017;13:139-154. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC.
© 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC.
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
In the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims for Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly using a scripting or domain-specific (DSL) language), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.
Human capital strategy: talent management.
Nagra, Michael
2011-01-01
Large organizations, including the US Army Medical Department and the Army Nurse Corps, are people-based organizations. Consequently, effective and efficient management of the human capital within these organizations is a strategic goal for the leadership. Over time, the Department of Defense has used many different systems and strategies to manage people throughout their service life-cycle. The current system in use is called Human Capital Management. In the near future, the Army's human capital will be managed based on skills, knowledge, and behaviors through various measurement tools. This article elaborates the human capital management strategy within the Army Nurse Corps, which identifies, develops, and implements key talent management strategies under the umbrella of the Corps' human capital goals. The talent management strategy solutions are aligned under the Nurse Corps business strategy captured by the 2008 Army Nurse Corps Campaign Plan, and are implemented within the context of the culture and core values of the organization.
Generic Airspace Concepts and Research
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2010-01-01
The purpose of this study was to evaluate methods for reducing the training and memorization required to manage air traffic in mid-term, Next Generation Air Transportation System (NextGen) airspace. We contrasted the performance of controllers using a sector information display and NextGen automation tools while working with familiar and unfamiliar sectors. The airspace included five sectors from Oakland and Salt Lake City Centers configured as a "generic center" called "West High Center." The Controller Information Tool was used to present essential information for managing these sectors. The Multi Aircraft Control System air traffic control simulator provided data link and conflict detection and resolution. There were five experienced air traffic controller participants. Each was familiar with one or two of the five sectors, but not the others. The participants rotated through all five sectors during the ten data collection runs. The results addressing workload, traffic management, and safety, as well as controller and observer comments, supported the generic sector concept. The unfamiliar sectors were comparable to the familiar sectors on all relevant measures.
NASA Astrophysics Data System (ADS)
Osenga, E. C.; Cundiff, J.; Arnott, J. C.; Katzenberger, J.; Taylor, J. R.; Jack-Scott, E.
2015-12-01
An interactive tool called the Forest Health Index (FHI) has been developed for the Roaring Fork watershed of Colorado, with the purpose of improving public understanding of local forest management and ecosystem dynamics. The watershed contains large areas of White River National Forest, which plays a significant role in the local economy, particularly for recreation and tourism. Local interest in healthy forests is therefore strong, but public understanding of forest ecosystems is often simplified. This can pose challenges for land managers and researchers seeking a scientifically informed approach to forest restoration, management, and planning. Now in its second iteration, the FHI is a tool designed to help bridge that gap. The FHI uses a suite of indicators to create a numeric rating of forest functionality and change, based on the desired forest state in relation to four categories: Ecological Integrity, Public Health and Safety, Ecosystem Services, and Sustainable Use and Management. The rating is based on data derived from several sources including local weather stations, stream gauge data, SNOTEL sites, and National Forest Service archives. In addition to offering local outreach and education, this project offers broader insight into effective communication methods, as well as into the challenges of using quantitative analysis to rate ecosystem health. Goals of the FHI include its use in schools as a means of using local data and place-based learning to teach basic math and science concepts, improved public understanding of ecological complexity and need for ongoing forest management, and, in the future, its use as a model for outreach tools in other forested communities in the Intermountain West.
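The FHI's numeric rating can be illustrated as a normalized composite indicator: each raw indicator is scaled between worst and best reference values, averaged within the four categories named in the abstract, then averaged across categories. The indicators, reference ranges and equal weighting below are assumptions for illustration, not the actual FHI method:

```python
# Hedged sketch of a composite forest-health score; category names follow the
# abstract, but all indicators, ranges and weights are invented.
def normalise(value, worst, best):
    """Map a raw indicator onto 0-100 between worst and best reference values;
    works whether 'best' is the high or the low end of the raw scale."""
    score = (value - worst) / (best - worst) * 100
    return max(0.0, min(100.0, score))

# category -> list of (raw value, worst reference, best reference)
INDICATORS = {
    "Ecological Integrity":           [(0.72, 0.0, 1.0), (34, 80, 10)],  # e.g. cover fraction, % beetle kill
    "Public Health and Safety":       [(12, 40, 0)],                     # e.g. % area at high fire severity
    "Ecosystem Services":             [(18.5, 5, 25)],                   # e.g. SNOTEL snowpack, inches
    "Sustainable Use and Management": [(0.6, 0.0, 1.0)],                 # e.g. trail/harvest plan compliance
}

def forest_health_index(indicators):
    """Average indicator scores within each category, then across categories."""
    cat_scores = {
        cat: sum(normalise(*ind) for ind in items) / len(items)
        for cat, items in indicators.items()
    }
    return sum(cat_scores.values()) / len(cat_scores), cat_scores

fhi, per_category = forest_health_index(INDICATORS)
print(f"FHI = {fhi:.0f}/100")  # prints "FHI = 67/100" for these toy inputs
```

The hard part in practice, as the abstract notes, is not this arithmetic but choosing defensible reference ranges and communicating what a single number does and does not capture.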
Distributed data analysis in ATLAS
NASA Astrophysics Data System (ADS)
Nilsson, Paul; Atlas Collaboration
2012-12-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.
The theory of interface slicing
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
Interface slicing is a new technique developed to facilitate reuse-based software engineering by addressing the following problems, needs, and issues: (1) the size of systems incorporating reused modules; (2) knowledge requirements for program modification; (3) program understanding for reverse engineering; (4) module granularity and domain management; and (5) the time and space complexity of conventional slicing. The paper defines this form of static program analysis, called interface slicing.
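The core computation behind an interface slice can be sketched as a reachability problem over a module's call graph: keep only the procedures transitively needed by the subset of the exported interface that a client actually uses. The procedure names below are hypothetical, and the sketch omits the paper's formal treatment:

```python
# Illustrative sketch of an interface slice (not the paper's formal
# definition): starting from the used part of a module's interface,
# collect every procedure reachable through the call graph.

def interface_slice(call_graph, used_interface):
    """Transitive closure of `used_interface` over `call_graph`."""
    needed, frontier = set(), list(used_interface)
    while frontier:
        proc = frontier.pop()
        if proc not in needed:
            needed.add(proc)
            frontier.extend(call_graph.get(proc, ()))
    return needed

graph = {
    "open": ["check_perms", "alloc_buf"],
    "read": ["alloc_buf", "fill_buf"],
    "close": ["free_buf"],
}
slice_ = interface_slice(graph, ["open"])
```

A client that calls only `open` would thus link three procedures instead of the whole module, which is the granularity benefit the abstract alludes to.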
Wendy A. Kuntz; Peter B. Stacey
1997-01-01
Individual identification, especially in rare species, can provide managers with critical information about demographic processes. Traditionally, banding has been the only effective method of marking individuals. However, banding's drawbacks have led some researchers to suggest vocal analysis as an alternative. We explore this prospect for Mexican Spotted Owls (...
Sloane, Elliot B; Rosow, Eric; Adam, Joe; Shine, Dave
2006-01-01
The Surgeons General of the U.S. Air Force, Army, and Navy have integrated oversight of global medical supplies and resources using the Joint Medical Asset Repository (JMAR). A business intelligence system called the JMAR Executive Dashboard Initiative (JEDI) was developed over a three-year period to add real-time interactive data-mining tools and executive dashboards. Medical resources can now be efficiently reallocated to military, veteran, family, or civilian purposes, and inventories can be maintained at lean levels, with peaks managed by interactive dashboards that reduce workload and errors.
Efficient Bulk Data Replication for the Earth System Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sim, Alex; Gunter, Dan; Natarajan, Vijaya
2010-03-10
The Earth System Grid (ESG) community faces the difficult challenge of managing the distribution of massive data sets to thousands of scientists around the world. To move data replicas efficiently, the ESG has developed a data transfer management tool called the Bulk Data Mover (BDM). We describe the performance results of the current system and plans towards extending the techniques developed so far for the upcoming project, in which the ESG will employ advanced networks to move multi-TB datasets with the ultimate goal of helping researchers understand climate change and its potential impacts on world ecology and society.
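The general technique a bulk data mover relies on, draining a queue of files through a bounded number of concurrent transfer streams, can be sketched as follows. The `transfer` stub and file names are hypothetical; the real BDM additionally handles checksums, retries, and network tuning:

```python
# Illustrative sketch of bulk transfer management (not the actual BDM
# implementation): a bounded pool of concurrent streams drains a list
# of files, a common technique for moving large datasets efficiently.
from concurrent.futures import ThreadPoolExecutor

def transfer(path):
    """Stand-in for a real network transfer; returns the path on success."""
    return path

def bulk_move(paths, max_streams=4):
    """Transfer all files using at most `max_streams` concurrent streams."""
    with ThreadPoolExecutor(max_workers=max_streams) as pool:
        return sorted(pool.map(transfer, paths))

moved = bulk_move([f"esg/cmip5/file_{i}.nc" for i in range(8)])
```

Keeping several streams in flight at once is what lets a mover fill a high-bandwidth, high-latency link that a single serial transfer would leave mostly idle.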
Sadak, Tatiana; Wright, Jacob; Borson, Soo
2018-05-01
The National Alzheimer's Plan calls for improving health care for people living with dementia and supporting their caregivers as capable health care partners. Clinically useful measurement tools are needed to monitor caregivers' knowledge and skills for managing patients' often complex health care needs as well as their own self-care. We created and validated a comprehensive, caregiver-centered measure, Managing Your Loved One's Health (MYLOH), based on a core set of health care management domains endorsed by both providers and caregivers. In this article, we describe its development and preliminary cultural tailoring. MYLOH is a questionnaire containing 29 items, grouped into six domains, which requires <20 min to complete. MYLOH can be used to guide conversations between clinicians and caregivers around health care management of people with dementia, as the basis for targeted health care coaching, and as an outcome measure in comprehensive dementia care management interventions.
Software engineering and data management for automated payload experiment tool
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Provancha, Anna; Chattam, David
1994-01-01
The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and to provide versions of the software in Macintosh- and Windows-compatible formats. The Appendix 1 Science Requirements Document (SRD) Users Manual is attached.
A decision support tool for synchronizing technology advances with strategic mission objectives
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Willoughby, John K.
1992-01-01
Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a data-driven decision-support tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for a manned Mars mission, but it can be adapted to support other high-technology, long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.
Design of an instrument to measure the quality of care in Physical Therapy.
Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; Prado, Cristiane do; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo
2015-01-01
To design an instrument composed of domains that demonstrate physical therapy activities and generate a consistent index representing the quality of care in physical therapy. The Lean Six Sigma methodology was used to design the tool, in discussions involving staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. After application of the Cause & Effect Matrix, five requirements composed the quality of care index in physical therapy: physical therapist performance, a care outcome indicator, adherence to physical therapy protocols, a measure of whether the prognosis and treatment outcome were achieved, and infrastructure. The proposed design allowed evaluating several items related to the physical therapy service, enabling customization, reproducibility, and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement as well as the strengths of the team and the process of physical therapy care.
Hepler, Jeff A; Neumann, Cathy
2003-04-01
To enhance environmental compliance, the U.S. Department of Defense (DOD) recently developed and implemented a standardized environmental audit tool called The Environmental Assessment and Management (TEAM) Guide. Utilization of a common audit tool (TEAM Guide) throughout DOD agencies could be an effective agent of positive change. If, however, the audit tool is inappropriate, environmental compliance at DOD facilities could worsen. Furthermore, existing audit systems such as the U.S. Environmental Protection Agency's (U.S. EPA's) Generic Protocol for Conducting Environmental Audits of Federal Facilities and the International Organization for Standardization's (ISO's) Standard 14001, "Environmental Management System Audits," may be abandoned even if they offer significant advantages over the TEAM Guide audit tool. Widespread use of TEAM Guide should not take place until a thorough and independent evaluation has been performed. The purpose of this paper is to compare DOD's TEAM Guide audit tool with U.S. EPA's Generic Protocol for Conducting Environmental Audits of Federal Facilities and ISO 14001, in order to assess which is most appropriate and effective for DOD facilities, in particular those operated by the U.S. Army Corps of Engineers (USACE). USACE was selected as a result of one author's recent experience as a district environmental compliance coordinator responsible for the audit mission at this agency. Specific recommendations for enhancing the quality of environmental audits at all DOD facilities are also given.
Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA).
Tricarico, Pierfrancesco; Tardivo, Stefano; Sotgiu, Giovanni; Moretti, Francesca; Poletti, Piera; Fiore, Alberto; Monturano, Massimo; Mura, Ida; Privitera, Gaetano; Brusaferro, Silvio
2016-08-08
Purpose - The European Union recommendations for patient safety call for shared clinical risk management (CRM) safety standards able to guide organizations in CRM implementation. The purpose of this paper is to develop a self-evaluation tool to measure healthcare organization performance on CRM and guide improvements over time. Design/methodology/approach - A multi-step approach was implemented including: a systematic literature review; consensus meetings with an expert panel from eight leading Italian organizations to reach agreement on the first version; field testing to assess instrument feasibility and flexibility; and a Delphi strategy with a second expert panel for content validation and development of a balanced scoring system. Findings - The self-assessment tool, Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA), includes seven areas (governance, communication, knowledge and skills, safe environment, care processes, adverse event management, learning from experience) and 52 standards. Each standard is evaluated according to four performance levels: minimum; monitoring; outcomes; and improvement actions. The result is a feasible, flexible and valid instrument that can be used throughout different organizations. Practical implications - This tool allows practitioners to assess their CRM activities against minimum levels, monitor performance, benchmark with other institutions and share results with different stakeholders. Originality/value - The multi-step approach allowed us to identify core minimum CRM levels in a field where no consensus had been reached. Most standards may be easily adopted in other countries.
Boisen, Egil; Bygholm, Ann; Cavan, David; Hejlesen, Ole K
2003-07-01
Within diabetes care, the majority of health decisions are in the hands of the patient. Therefore, the concepts of disease management and self-care represent inescapable challenges for both patients and healthcare professionals, entailing a considerable amount of learning. Thus, a computerised diabetes disease management (CDDM) system is to be seen not merely as a tool for medical treatment, but also as a pedagogical tool to enhance patient competence. The unfortunate lack of success of most knowledge-based systems might be related to the problem of finding an adequate way of evaluating the systems from their development through the implementation phase to daily clinical practice. The following presents the initial methodological considerations for evaluating the usefulness of a CDDM system called DiasNet, which is being implemented as a learning tool for patients. The evaluation of the usefulness of a CDDM system, we claim, entails clinical assessment taking into account the challenges and pitfalls in diabetes disease management. Drawing on activity theory, we suggest the concept of copability as a supplement to 'usability' and 'utility' when determining 'usefulness'. We maintain that it is necessary to ask how well the user copes with the new situation using the system. The concepts of coping and learning are discussed as ways to measure the copability of DiasNet, as are ways this methodology might inform systems development, implementation, and daily clinical practice.
Integrated Decision-Making Tool to Develop Spent Fuel Strategies for Research Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beatty, Randy L; Harrison, Thomas J
IAEA Member States operating or having previously operated a research reactor are responsible for the safe and sustainable management and disposal of associated radioactive waste, including research reactor spent nuclear fuel (RRSNF). This includes the safe disposal of RRSNF or the corresponding equivalent waste returned after spent fuel reprocessing. One key challenge to developing general recommendations lies in the diversity of spent fuel types, locations and national/regional circumstances, rather than in mass or volume alone. This is especially true given that RRSNF inventories are relatively small, and research reactors are rarely operated at the high power levels or durations typical of commercial power plants. Presently, many countries lack an effective long-term policy for managing RRSNF. This paper presents results of the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) #T33001 on Options and Technologies for Managing the Back End of the Research Reactor Nuclear Fuel Cycle, which includes an integrated decision-making tool called BRIDE (Back-end Research reactor Integrated Decision Evaluation). BRIDE is a multi-attribute decision-making tool that combines the total estimated cost of each life-cycle scenario with non-economic factors such as public acceptance and technical maturity, and ranks optional back-end scenarios specific to each Member State's situation in order to develop a strategic plan with a preferred or recommended option for managing spent fuel from research reactors.
Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
2016-12-28
We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
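The kind of least-cost trade-off such a model evaluates can be illustrated with a toy merit-order dispatch: meet a demand by drawing on supply options cheapest-first. All names and numbers below are hypothetical and far simpler than WEFO's multi-period, multi-sector optimization:

```python
# Toy illustration of least-cost planning (hypothetical numbers, not
# WEFO): satisfy a demand by dispatching supply options in merit order
# (cheapest first), subject to each option's capacity limit.

def least_cost_dispatch(demand, options):
    """options: list of (name, capacity, unit_cost). Returns (plan, cost)."""
    plan, total_cost = {}, 0.0
    for name, capacity, unit_cost in sorted(options, key=lambda o: o[2]):
        use = min(capacity, demand)
        if use > 0:
            plan[name] = use
            total_cost += use * unit_cost
            demand -= use
    if demand > 0:
        raise ValueError("demand exceeds total capacity")
    return plan, total_cost

plan, cost = least_cost_dispatch(
    120, [("hydro", 50, 1.0), ("gas", 100, 3.0), ("diesel", 40, 8.0)]
)
```

A full nexus model replaces this greedy rule with a linear or mixed-integer program so that water, energy, food, and emission constraints are satisfied jointly rather than sector by sector.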
NASA Astrophysics Data System (ADS)
Ines, A. V. M.; Han, E.; Baethgen, W.
2017-12-01
Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. Despite the popularity of crop simulation models in addressing climate risk problems, the models cannot readily ingest seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF makes it possible to simulate "what-if" scenarios with different crop choices or management practices and better inform decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision-makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. CAMDT can disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or non-parametric disaggregation method) and run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user interface allows easy implementation of several "what-if" scenarios by non-technical users and visualizes the results of the scenario runs. In addition, CAMDT translates crop model outputs into economic terms once the user provides the expected crop price and cost. CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and has great flexibility for being adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
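The non-parametric idea behind linking a tercile forecast to daily weather can be sketched as follows: sample a rainfall category from the forecast probabilities, then draw a historical analog season from years falling in that category. The years and probabilities below are hypothetical, and this sketch is only the general technique, not CAMDT's specific algorithm:

```python
# Sketch of non-parametric forecast disaggregation (illustrative, not
# CAMDT's implementation): turn tercile probabilities into one concrete
# season by (1) sampling a category, (2) drawing an analog year whose
# observed daily weather would then drive the crop model.
import random

def sample_category(p_below, p_near, p_above, rng):
    """Sample 'below' / 'near' / 'above' from tercile probabilities."""
    r = rng.random()
    if r < p_below:
        return "below"
    if r < p_below + p_near:
        return "near"
    return "above"

def draw_analog(forecast, analog_years, rng):
    """Pick a category per the forecast, then a historical analog year."""
    category = sample_category(*forecast, rng)
    return category, rng.choice(analog_years[category])

rng = random.Random(42)
analog_years = {"below": [1987, 1991], "near": [1999, 2004], "above": [2008, 2011]}
category, year = draw_analog((0.5, 0.3, 0.2), analog_years, rng)
```

Repeating the draw many times yields an ensemble of daily weather realizations whose category frequencies converge to the forecast probabilities.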
Precision Departure Release Capability (PDRC) Technology Description
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin; Robinson, Corissia; Null, Jody R.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. 
NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description. Companion papers include the Final Report and a Concept of Operations.
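The uncertainty argument above can be made concrete with a toy comparison (illustrative only, not NASA's PDRC algorithms; times are hypothetical seconds): a release anywhere inside a three-minute window counts as "on time" even when it misses the precise en route slot, whereas a seconds-level target exposes the miss directly.

```python
# Illustrative sketch (not NASA's PDRC algorithms): compare the
# present-day three-minute release window with a seconds-level target.

def meets_window(release_s, window_open_s, window_len_s=180):
    """Present-day style: any release inside the window is 'on time'."""
    return window_open_s <= release_s <= window_open_s + window_len_s

def slot_error(release_s, target_s):
    """PDRC style: error relative to a seconds-level coordinated target."""
    return abs(release_s - target_s)

# A release 80 s after the window opens passes the window test, yet is
# 80 s off a precise slot coordinated at the window's opening.
ok = meets_window(1080, 1000)
err = slot_error(1080, 1000)
```

With surface trajectory-based ready-time predictions, the controller can aim at the target time itself, shrinking `err` instead of merely staying inside the window.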
Precision Departure Release Capability (PDRC) Concept of Operations
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Capps, Richard A.; Day, Kevin Brian
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates.
NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision, enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Concept of Operations. Companion papers include the Final Report and a Technology Description.
[Focus on the customer: an essential tool in management competence in nursing].
Ruthes, Rosa Maria; Feldman, Liliane Bauer; Cunha, Isabel Cristina Kowal Olm
2010-01-01
This is a reflection based on a doctoral thesis. The theory described by specialist authors on the subject was abstracted and transposed into the practice of competency management in nursing. Essential competency comprises attributes of knowledge, ability and attitude (called CHA) that add value to the health organization with a focus on the customer. It is essential to awaken in the whole team a sense of responsibility and real concern for the patient, family and visitors. The professional becomes the inspiring leader of the group. The nurse's management competency can drive the development of human resources and promote delight in customer service, making a difference in health organizations.
Crew resource management in the ICU: the need for culture change.
Haerkens, Marck Htm; Jenkins, Donald H; van der Hoeven, Johannes G
2012-08-22
Intensive care frequently results in unintentional harm to patients, and the statistics do not seem to be improving. The ICU environment is especially unforgiving of mistakes due to the multidisciplinary, time-critical nature of care and the vulnerability of the patients. Human factors account for the majority of adverse events, and a sound safety climate is therefore essential. This article reviews the existing literature on aviation-derived training called Crew Resource Management (CRM) and discusses its application in critical care medicine. CRM focuses on teamwork, threat and error management, and blame-free discussion of human mistakes. Though evidence is still scarce, the authors consider CRM to be a promising tool for culture change in the ICU setting, if supported by leadership and well-designed follow-up.
Occupational voice demands and their impact on the call-centre industry.
Hazlett, D E; Duffy, O M; Moorhead, S A
2009-04-20
Within the last decade there has been a growth in the call-centre industry in the UK, with a growing awareness of the voice as an important tool for successful communication. Occupational voice problems such as occupational dysphonia, in a business which relies on healthy, effective voice as the primary professional communication tool, may threaten working ability and occupational health and safety of workers. While previous studies of telephone call-agents have reported a range of voice symptoms and functional vocal health problems, there have been no studies investigating the use and impact of vocal performance in the communication industry within the UK. This study aims to address a significant gap in the evidence-base of occupational health and safety research. The objectives of the study are: 1. to investigate the work context and vocal communication demands for call-agents; 2. to evaluate call-agents' vocal health, awareness and performance; and 3. to identify key risks and training needs for employees and employers within call-centres. This is an occupational epidemiological study, which plans to recruit call-centres throughout the UK and Ireland. Data collection will consist of three components: 1. interviews with managers from each participating call-centre to assess their communication and training needs; 2. an online biopsychosocial questionnaire will be administered to investigate the work environment and vocal demands of call-agents; and 3. voice acoustic measurements of a random sample of participants using the Multi-dimensional Voice Program (MDVP). Qualitative content analysis from the interviews will identify underlying themes and issues. A multivariate analysis approach will be adopted using Structural Equation Modelling (SEM), to develop voice measurement models in determining the construct validity of potential factors contributing to occupational dysphonia. Quantitative data will be analysed using SPSS version 15. 
Ethical approval is granted for this study from the School of Communication, University of Ulster. The results from this study will provide the missing element of voice-based evidence, by appraising the interactional dimensions of vocal health and communicative performance. This information will be used to inform training for call-agents and to contribute to health policies within the workplace, in order to enhance vocal health.
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users are data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which requests only the specific data needed, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services benefit the GIS user community by standardizing workflows and improving data storage, management, and analysis tactics.
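OPeNDAP achieves its download savings through constraint expressions appended to a dataset URL, which select only the variables and index ranges the client needs. The sketch below builds such a URL; the server address, product path, and variable name are hypothetical examples, not a real LP DAAC endpoint:

```python
# Sketch of an OPeNDAP-style subset request: a constraint expression on
# the URL asks the server for one variable over given index ranges, so
# only that slice is transferred (server and names are hypothetical).

def opendap_subset_url(base, variable, ranges):
    """ranges: list of (start, stop) index bounds, one per dimension."""
    constraint = variable + "".join(f"[{a}:{b}]" for a, b in ranges)
    return f"{base}.ascii?{constraint}"

url = opendap_subset_url(
    "https://example.gov/opendap/MOD13Q1.006/ndvi",
    "NDVI",
    [(0, 0), (100, 140), (200, 240)],
)
```

A 41x41-pixel slice of one time step transfers in kilobytes, where downloading the full granule to clip it locally could cost hundreds of megabytes.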
Kinect, a Novel Cutting Edge Tool in Pavement Data Collection
NASA Astrophysics Data System (ADS)
Mahmoudzadeh, A.; Firoozi Yeganeh, S.; Golroo, A.
2015-12-01
Pavement roughness and surface distress detection is of interest to decision makers due to vehicle safety, user satisfaction, and cost savings. Data collection, as a core component of pavement management systems, is required for these detections. There are two major types of data collection: traditional/manual data collection and automated/semi-automated data collection. This paper studies different non-destructive tools for detecting cracks and potholes. For this purpose, automated data collection tools that have been utilized recently are discussed and their applications critiqued. The main issue is the significant capital investment needed to buy the survey vehicle. The main scope of this paper is to study approaches and related tools that are not only cost-effective but also precise and accurate. A new sensor called the Kinect has all of these specifications. It can capture both RGB images and depth, which are of significant use in measuring cracks and potholes. This sensor is able to take images of surfaces with adequate resolution to detect cracks, along with measuring the distance between the sensor and obstacles in front of it, which yields the depth of defects. This technology has very recently been studied by a few researchers in fields such as project management and biomedical engineering. Pavement management has not paid enough attention to the use of the Kinect in monitoring and detecting distresses. This paper is aimed at providing a thorough literature review of the use of the Kinect in pavement management and finally proposing the best approach, one that is both cost-effective and precise.
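The depth measurement principle can be sketched in a few lines: in a Kinect-style depth frame (per-pixel distance in millimetres), pavement pixels cluster near a common distance, and a pothole appears as pixels measurably farther from the sensor than that surface. The frame values and threshold below are hypothetical, and a real pipeline would fit a road plane and segment defect regions rather than take a simple median:

```python
# Toy sketch of depth-based pothole measurement (illustrative only):
# estimate the pavement surface as the median distance in the frame,
# then report how far the deepest pixel dips below that surface.
from statistics import median

def pothole_depth_mm(depth_frame, threshold_mm=10):
    """Max depth below the median pavement surface; 0 if within threshold."""
    pixels = [d for row in depth_frame for d in row]
    surface = median(pixels)
    dip = max(pixels) - surface
    return dip if dip > threshold_mm else 0

frame = [
    [1500, 1501, 1499],
    [1500, 1540, 1502],   # centre pixel ~40 mm below the surface
    [1498, 1500, 1501],
]
depth = pothole_depth_mm(frame)
```

The threshold keeps normal texture and sensor noise from registering as defects, which is the practical trade-off any depth-based distress detector must tune.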
NASA Astrophysics Data System (ADS)
Farmer, J. Doyne; Gallegati, M.; Hommes, C.; Kirman, A.; Ormerod, P.; Cincotti, S.; Sanchez, A.; Helbing, D.
2012-11-01
We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.
Animal disease outbreak control: the use of crisis management tools.
Kroschewski, K; Kramer, M; Micklich, A; Staubach, C; Carmanns, R; Conraths, F J
2006-04-01
In this era of globalisation the effective control of animal disease outbreaks requires powerful crisis management tools. In the 1990s software packages for different sectors of the government and agricultural industry began to be developed. In 2004, as a special application for tracking the movement of animals and animal products, the European Union developed the Trade Control and Expert System (TRACES) on the basis of its predecessor, the ANImal MOvement (ANIMO) project. The nationwide use of the ANIMO system by the veterinary authorities in Germany marked the beginning of the development in 1993 of a computerised national animal disease reporting system--the TierSeuchenNachrichten (TSN)--using the ANIMO hardware and software components. In addition to TRACES and TSN the third pillar for the management of animal disease outbreaks and crises in Germany is the national cattle and swine database--called Herkunftssicherungs- und Informationssystem für Tiere. A high degree of standardisation is necessary when integrating the different solutions at all levels of government and with the private sector. In this paper, the authors describe the use of these tools on the basis of their experience and in relation to what we can do now and what we should opt for in the future.
77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... the submission of written comments using a Web-based collaboration tool called IdeaScale®; and... collaboration tool called IdeaScale®. The tool allows users to post comments on a topic, and view and...
NASA Astrophysics Data System (ADS)
Gallo, E. M.; Hogue, T. S.; Bell, C. D.; Spahr, K.; McCray, J. E.
2017-12-01
The water quality of receiving streams and waterbodies in urban watersheds is increasingly degraded by stormwater runoff. The implementation of Green Infrastructure (GI), which includes Low Impact Developments (LIDs) and Best Management Practices (BMPs), within a watershed aims to mitigate the effects of urbanization by reducing pollutant loads, runoff volume, and storm peak flow. Stormwater modeling is generally used to assess the impact of GI implemented within a watershed. These modeling tools are useful for determining the optimal suite of GIs to maximize pollutant load reduction and minimize cost. However, stormwater management for most resource managers and communities also includes the implementation of grey and hybrid stormwater infrastructure. An integrated decision support tool, called i-DST, that allows for the optimization and comprehensive life-cycle cost assessment of grey, green, and hybrid stormwater infrastructure is currently being developed. The i-DST tool will evaluate optimal stormwater runoff management by taking into account the diverse economic, environmental, and societal needs associated with watersheds across the United States. Three watersheds from southern California will act as a test site and assist in the development and initial application of the i-DST tool. The Ballona Creek, Dominguez Channel, and Los Angeles River Watersheds are located in highly urbanized Los Angeles County. The water quality of the river channels flowing through each is impaired by heavy metals, including copper, lead, and zinc. However, despite being adjacent to one another within the same county, modeling results using the EPA System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN) found that the optimal path to compliance in each watershed differs significantly. The differences include varied costs, suites of BMPs, and ancillary benefits.
This research analyzes how the economic, physical, and hydrological differences between the three watersheds shape the optimal plan for stormwater management.
The effectiveness of energy management system on energy efficiency in the building
NASA Astrophysics Data System (ADS)
Julaihi, F.; Ibrahim, S. H.; Baharun, A.; Affendi, R.; Nawi, M. N. M.
2017-10-01
Energy plays a key role in achieving the desired economic growth for a country. Worldwide, industries use 40 percent of energy for material production and consumption to fulfil human needs, contributing almost 37 percent of global greenhouse gas emissions. One approach to reducing greenhouse gas emissions to the environment is conserving energy. This can be done by implementing energy management, especially in commercial and office buildings, where daily electricity consumption is high. Energy management can also increase the efficiency of energy use in a building. A study was conducted to investigate the performance of an energy management system implemented in an office building. Energy management is one of the contemporary challenges; thus the study adopts an exploratory approach using a tool developed by UNIDO called EnMS, or Energy Management System. Findings show that implementing energy management can reduce electricity consumption by up to 30%. However, serious initiatives by the organization are needed to promote its effectiveness.
Experience factors in performing periodic physical evaluations
NASA Technical Reports Server (NTRS)
Hoffman, A. A.
1969-01-01
The lack of scientific basis in the so-called periodic health examinations on military personnel inclusive of the Executive Health Program is outlined. This latter program can well represent a management tool of the company involved in addition to being a status symbol. A multiphasic screening technique is proposed in conjunction with an automated medical history questionnaire for preventive occupational medicine methodology. The need to collate early sickness consultation or clinic visit histories with screening techniques is emphasized.
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
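The Failure Mode Isolation Report described in item (3) rests on a simple idea: failure modes whose test signatures are identical cannot be discriminated by the test suite and form an ambiguity group. A generic sketch of that grouping, assuming a plain mapping from failure mode to the set of tests that detect it (this is an illustration of the concept, not TEAMS Designer's actual data format):

```python
from collections import defaultdict


def isolation_groups(signatures):
    """Group failure modes that share an identical test signature;
    modes in the same group are mutually indistinguishable."""
    groups = defaultdict(list)
    for mode, detected_by in signatures.items():
        groups[frozenset(detected_by)].append(mode)
    return sorted(groups.values())


# Hypothetical failure modes and the tests that detect each one.
sigs = {
    "valve_stuck":   {"t1", "t3"},
    "sensor_bias":   {"t1", "t3"},  # same signature -> ambiguous pair
    "pump_degraded": {"t2"},
}
print(isolation_groups(sigs))
# → [['pump_degraded'], ['valve_stuck', 'sensor_bias']]
```

A sensor-sensitivity study like the one in item (5) would rerun this grouping after deleting the tests that depend on a lost sensor.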
Tools and Approaches for the Construction of Knowledge Models from the Neuroscientific Literature
Burns, Gully A. P. C.; Khan, Arshad M.; Ghandeharizadeh, Shahram; O’Neill, Mark A.; Chen, Yi-Shin
2015-01-01
Within this paper, we describe a neuroinformatics project (called “NeuroScholar,” http://www.neuroscholar.org/) that enables researchers to examine, manage, manipulate, and use the information contained within the published neuroscientific literature. The project is built within a multi-level, multi-component framework constructed with the use of software engineering methods that themselves provide code-building functionality for neuroinformaticians. We describe the different software layers of the system. First, we present a hypothetical usage scenario illustrating how NeuroScholar permits users to address large-scale questions in a way that would otherwise be impossible. We do this by applying NeuroScholar to a “real-world” neuroscience question: How is stress-related information processed in the brain? We then explain how the overall design of NeuroScholar enables the system to work and illustrate different components of the user interface. We then describe the knowledge management strategy we use to store interpretations. Finally, we describe the software engineering framework we have devised (called the “View-Primitive-Data Model framework,” [VPDMf]) to provide an open-source, accelerated software development environment for the project. We believe that NeuroScholar will be useful to experimental neuroscientists by helping them interact with the primary neuroscientific literature in a meaningful way, and to neuroinformaticians by providing them with useful, affordable software engineering tools. PMID:15055395
Staccini, Pascal; Dufour, Jean-Charles; Raps, Hervé; Fieschi, Marius
2005-01-01
Making educational material available on a network cannot be reduced to merely implementing hypermedia and interactive resources on a server. A pedagogical schema has to be defined to guide students in learning and to provide teachers with guidelines to prepare valuable and upgradeable resources. Components of a learning environment, as well as interactions between students and other roles such as author, tutor and manager, can be deduced from cognitive foundations of learning, such as the constructivist approach. Scripting the way a student will navigate among information nodes and interact with tools to build his/her own knowledge can be a good way of deducing the features of the graphic interface related to the management of the objects. We defined a typology of pedagogical resources, their data model and their logic of use. We implemented a generic, web-based authoring and publishing platform (called J@LON, for Join And Learn On the Net) within an object-oriented, open-source programming environment (called Zope) embedding a content management system (called Plone). Workflow features have been used to mark the progress of students and to trace the life cycle of resources shared by the teaching staff. The platform integrates advanced online authoring features to create interactive exercises and supports live course diffusion. The platform engine has been generalized to the whole curriculum of medical studies in our faculty; it also supports an international master's in risk management in health care and will be extended to all other continuous training diplomas.
Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models
NASA Astrophysics Data System (ADS)
Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.
2017-12-01
Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models, a water resource planning tool Water Evaluation and Planning (WEAP) model and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. 
Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.
Fonda, Stephanie J; Paulsen, Christine A; Perkins, Joan; Kedziora, Richard J; Rodbard, David; Bursell, Sven-Erik
2008-02-01
Research suggests Internet-based care management tools are associated with improvements in care and patient outcomes. However, although such tools change workflow, rarely is their usability addressed and reported. This article presents a usability study of an Internet-based informatics application called the Comprehensive Diabetes Management Program (CDMP), developed by content experts and technologists. Our aim is to demonstrate a process for conducting a usability study of such a tool and to report results. We conducted the usability test with six diabetes care providers under controlled conditions. Each provider worked with the CDMP in a single session using a "think aloud" process. Providers performed standardized tasks with fictitious patient data, and we observed how they approached these tasks, documenting verbalizations and subjective ratings. The providers then completed a usability questionnaire and interviews. Overall, the scores on the usability questionnaire were neutral to favorable. For specific subdomains of the questionnaire, the providers reported problems with the application's ease of use, performance, and support features, but were satisfied with its visual appeal and content. The results from the observational and interview data indicated areas for improvement, particularly in navigation and terminology. The usability study identified several issues for improvement, confirming the need for usability testing of Internet-based informatics applications, even those developed by experts. To our knowledge, there have been no other usability studies of an Internet-based informatics application with the functionality of the CDMP. Such studies can form the foundation for translation of Internet-based medical informatics tools into clinical practice.
[Disease management programs from a health insurer's point of view].
Szymkowiak, Christof; Walkenhorst, Karen; Straub, Christoph
2003-06-01
Disease Management Programmes represent a great challenge to the German statutory health insurance system. According to politicians, disease management programmes are an appropriate tool for increasing the level of care for chronically ill patients significantly, while at the same time they can slow down the cost explosion in health care. The statutory health insurers' point of view yields a more refined picture of the chances and risks involved. The chances are that a medical guideline-based, evidence-based, co-operative care of the chronically ill could be established. But also, there are the risks of misuse of disease management programmes and of misallocation of funds due to the ill-advised linkage with the so-called risk compensation scheme (RSA) balancing the sickness funds' structural deficits through redistribution. The nation-wide introduction of disease management programmes appears to be a gigantic experiment whose aim is to change the care of chronically ill patients and whose outcome is unpredictable.
Software Sharing Enables Smarter Content Management
NASA Technical Reports Server (NTRS)
2007-01-01
In 2004, NASA established a technology partnership with Xerox Corporation to develop high-tech knowledge management systems while providing new tools and applications that support the Vision for Space Exploration. In return, NASA provides research and development assistance to Xerox to progress its product line. The first result of the technology partnership was a new system called the NX Knowledge Network (based on Xerox DocuShare CPX). Created specifically for NASA's purposes, this system combines Netmark (practical database content management software created by the Intelligent Systems Division of NASA's Ames Research Center) with complementary software from Xerox's global research centers and DocuShare. NX Knowledge Network was tested at the NASA Astrobiology Institute, and is widely used for document management at Ames, Langley Research Center, within the Mission Operations Directorate at Johnson Space Center, and at the Jet Propulsion Laboratory, for mission-related tasks.
Performance Support Tools: Delivering Value when and where It Is Needed
ERIC Educational Resources Information Center
McManus, Paul; Rossett, Allison
2006-01-01
Some call them Electronic Performance Support Systems (EPSSs). Others prefer Performance Support Tools (PSTs) or decision support tools. One might call EPSSs or PSTs job aids on steroids, technological tools that provide critical information or advice needed to move forward at a particular moment in time. Characteristic advantages of an EPSS or a…
CWA 15793 2011 Planning and Implementation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Alan; Nail, George
This software, built on an open source platform called Electron (runs on Chromium and Node.js), is designed to assist organizations in the implementation of a biorisk management system consistent with the requirements of the international, publicly available guidance document CEN Workshop Agreement 15793:2011 (CWA 15793). The software includes tools for conducting organizational gap analysis against CWA 15793 requirements, planning tools to support the implementation of CWA 15793 requirements, and performance monitoring support. The gap analysis questions are based on the text of CWA 15793, and its associated guidance document, CEN Workshop Agreement 16393:2012. The authors have secured permission from the publisher of CWA 15793, the European Committee for Standardization (CEN), to use language from the document in the software, with the understanding that the software will be made available freely, without charge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran function calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
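The listener-registration idea can be sketched as a minimal event registry: the application emits named instrumentation events, a tool hears only the events it registered for, and unregistered events fall through as near no-ops. The class and method names below are hypothetical, chosen to illustrate the pattern rather than PIG's actual API.

```python
class ProfileRegistry:
    """Loose coupling between an application and performance tools:
    emitters never reference a concrete tool implementation."""

    def __init__(self):
        self._listeners = {}

    def register(self, event, callback):
        """A tool subscribes to one named instrumentation event."""
        self._listeners.setdefault(event, []).append(callback)

    def emit(self, event, *args):
        """The application announces an event; with no listener
        registered, this is effectively a no-op."""
        for cb in self._listeners.get(event, []):
            cb(*args)


pig = ProfileRegistry()
timings = []
pig.register("region_end", lambda name, sec: timings.append((name, sec)))
pig.emit("region_begin", "solve")      # ignored: nobody registered
pig.emit("region_end", "solve", 0.42)  # recorded by the timing tool
```

In production the instrumentation calls stay in the code; leaving every event unregistered recovers the uninstrumented behavior.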
Hodge, Stacie; Helliar, Sebastian; Macdonald, Hamish Ian; Mackey, Paul
2018-01-01
Until now, there have been no published surgical triage tools. We have developed the first such tool with a tiered escalation policy, aiming to improve identification and management of critically unwell patients. The existing sheet which is used to track new referrals and admissions to the surgical assessment unit was reviewed. The sheet was updated and a traffic light triage tool generated using National Early Warning Scores (NEWS), sepsis criteria and user discretion. A tiered escalation policy to guide urgency of assessment was introduced and education sessions for all staff undertaken, to ensure understanding and compliance. Through multiple 'plan-do-study-act' cycles, the new system and its efficiency have been analysed. Prior to intervention, documentation of NEWS did not occur and only 13% of admission observations were communicated to the surgical team. Following multiple cycles and interventions, 93% of patients were fully triaged, and 80% of 'red' and 'amber' patients' observations were communicated to the surgical team. The average time for a registrar to review a 'red' patient was 37 min and 79% of 'green' patients were reviewed within an hour of their presentation. Rapid identification of the unwell patient is crucial. Here we publish the first triage tool that enables early assessment of septic and otherwise potentially unwell surgical patients.
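The traffic-light logic described above combines a NEWS aggregate, sepsis criteria, and user discretion. A sketch of how such a mapping might look follows; the cut-offs (7 or more for red, 5 and 6 for amber) mirror the standard NEWS escalation bands, but the published tool's exact thresholds are not stated in the abstract, so treat them as illustrative.

```python
def triage_colour(news_score, sepsis_suspected, clinician_override=None):
    """Map a National Early Warning Score plus sepsis criteria to a
    traffic-light triage category; user discretion can escalate."""
    if clinician_override in ("red", "amber"):
        return clinician_override  # discretion only ever escalates
    if news_score >= 7 or sepsis_suspected:
        return "red"
    if news_score >= 5:
        return "amber"
    return "green"


print(triage_colour(3, False))  # → green
print(triage_colour(5, False))  # → amber
print(triage_colour(2, True))   # → red: sepsis criteria trump the score
```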
Doctor-patient communication on the telephone.
Curtis, P; Evens, S
1989-01-01
Since its invention, the telephone has been an important tool in medical practice, particularly for primary care physicians. Approximately half the calls made to a physician's office during regular consulting hours are for clinical problems and most are handled effectively over the phone without an immediate office visit. Telephone encounters are generally very brief, and managing such calls requires a pragmatic approach that is often quite different from the approach taken in the office visit. The telephone encounter should be recognized and recorded as a specific medical interaction in the medical chart for both clinical and legal reasons. Effective telephone encounters depend on good communication skills; decision making regarding disposition is a major goal. The physician's perception of a medical problem may be different from the patient's; patients are frequently seeking advice and reassurance rather than diagnosis and treatment, and may call because of anxiety and psychological stress. For physicians and their families who are not prepared for after-hours telephone encounters, calls that interrupt more "legitimate" activities may result in anger or frustration for the physician and dissatisfaction for the patient.
SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity
NASA Astrophysics Data System (ADS)
Crema, Stefano; Cavalli, Marco
2018-02-01
There is a growing call, within the scientific community, for solid theoretic frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for an improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013) with the addition of specific innovative features. The tool is intended to have a wide variety of users, both from the scientific community and from the authorities involved in the environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to the users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview on up-to-date applications of the approach and of the tool shows the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only to characterize sediment dynamics at the catchment scale but also to integrate prediction models and as a tool for helping geomorphological interpretation.
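For reference, the Index of Connectivity of Cavalli et al. (2013) combines an upslope and a downslope component, IC = log10(Dup/Ddn). A bare-bones sketch of that per-pixel computation follows, assuming the weighting factors and slopes have already been derived from the DTM; the tool's actual raster flow routing and roughness-based weighting are omitted, so this illustrates the formula rather than reimplementing SedInConnect.

```python
import math


def index_of_connectivity(w_mean, s_mean, area, downslope_cells):
    """IC = log10(Dup / Ddn), with
    Dup = W_mean * S_mean * sqrt(A)     (upslope component)
    Ddn = sum(d_i / (W_i * S_i))        (downslope flow-path component)
    where d_i is the flow-path length through cell i, and W and S are
    the weighting factor and slope."""
    d_up = w_mean * s_mean * math.sqrt(area)
    d_dn = sum(d / (w * s) for d, w, s in downslope_cells)
    return math.log10(d_up / d_dn)


# One pixel: 10,000 m2 of contributing area, two downslope cells
# given as hypothetical (d_i, W_i, S_i) triples.
ic = index_of_connectivity(
    w_mean=0.6, s_mean=0.2, area=10_000.0,
    downslope_cells=[(5.0, 0.5, 0.3), (7.07, 0.4, 0.1)],
)
```

Higher (less negative) IC means a shorter, steeper, smoother path to the target, i.e. better-connected sediment sources.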
Science-based Forest Management in an Era of Climate Change
NASA Astrophysics Data System (ADS)
Swanston, C.; Janowiak, M.; Brandt, L.; Butler, P.; Handler, S.; Shannon, D.
2014-12-01
Recognizing the need to provide climate adaptation information, training, and tools to forest managers, the Forest Service joined with partners in 2009 to launch a comprehensive effort called the Climate Change Response Framework (www.forestadaptation.org). The Framework provides a structured approach to help managers integrate climate considerations into forest management plans and then implement adaptation actions on the ground. A planning tool, the Adaptation Workbook, is used in conjunction with vulnerability assessments and a diverse "menu" of adaptation approaches to generate site-specific adaptation actions that meet explicit management objectives. Additionally, a training course, designed around the Adaptation Workbook, leads management organizations through this process of designing on-the-ground adaptation tactics for their management projects. The Framework is now being actively pursued in 20 states in the Northwoods, Central Hardwoods, Central Appalachians, Mid-Atlantic, and New England. The Framework community includes over 100 science and management groups, dozens of whom have worked together to complete six ecoregional vulnerability assessments covering nearly 135 million acres. More than 75 forest and urban forest adaptation strategies and approaches were synthesized from peer-reviewed and gray literature, expert solicitation, and on-the-ground adaptation projects. These are being linked through the Adaptation Workbook process to on-the-ground adaptation tactics being planned and employed in more than 50 adaptation "demonstrations". This presentation will touch on the scientific and professional basis of the vulnerability assessments, and showcase efforts where adaptation actions are currently being implemented in forests.
Crew resource management in the ICU: the need for culture change
2012-01-01
Intensive care frequently results in unintentional harm to patients and statistics don’t seem to improve. The ICU environment is especially unforgiving for mistakes due to the multidisciplinary, time-critical nature of care and vulnerability of the patients. Human factors account for the majority of adverse events and a sound safety climate is therefore essential. This article reviews the existing literature on aviation-derived training called Crew Resource Management (CRM) and discusses its application in critical care medicine. CRM focuses on teamwork, threat and error management and blame free discussion of human mistakes. Though evidence is still scarce, the authors consider CRM to be a promising tool for culture change in the ICU setting, if supported by leadership and well-designed follow-up. PMID:22913855
Practical choices for infobutton customization: experience from four sites.
Cimino, James J; Overby, Casey L; Devine, Emily B; Hulse, Nathan C; Jing, Xia; Maviglia, Saverio M; Del Fiol, Guilherme
2013-01-01
Context-aware links between electronic health records (EHRs) and online knowledge resources, commonly called "infobuttons," are being used increasingly as part of EHR "meaningful use" requirements. While an HL7 standard exists for specifying how the links should be constructed, there is no guidance on what links to construct. Collectively, the authors manage four infobutton systems that serve 16 institutions. The purpose of this paper is to publish our experience with linking various resources and to specify particular criteria that infobutton managers can use to select the resources that are most relevant for a given situation. This experience can be used directly by those wishing to customize their own EHRs, for example by using the OpenInfobutton infobutton manager and its configuration tool, the Librarian Infobutton Tailoring Environment.
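In the HL7 URL-based implementation, a context-aware link of the kind discussed here is essentially a query string of standardized parameter names carrying the clinical context. A sketch follows, with a hypothetical knowledge-resource endpoint; the parameter names are my reading of the HL7 infobutton guide and should be verified against the current standard before use.

```python
from urllib.parse import urlencode


def infobutton_link(resource_base, code, code_system, display, task):
    """Compose a context-aware knowledge request URL in the style of
    the HL7 infobutton standard.  The endpoint is hypothetical."""
    params = {
        "mainSearchCriteria.v.c": code,         # concept of interest
        "mainSearchCriteria.v.cs": code_system,  # coding system OID
        "mainSearchCriteria.v.dn": display,      # display name
        "taskContext.c.c": task,                 # e.g. problem-list review
    }
    return resource_base + "?" + urlencode(params)


url = infobutton_link(
    "https://kb.example.org/search",
    "250.00", "2.16.840.1.113883.6.103",  # ICD-9-CM OID
    "Diabetes mellitus", "PROBLISTREV",
)
```

An infobutton manager's job is then choosing which `resource_base` (and which extra parameters) best fits each combination of task, user, and concept.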
NASA Astrophysics Data System (ADS)
Wollschläger, Ute; Helming, Katharina; Heinrich, Uwe; Bartke, Stephan; Kögel-Knabner, Ingrid; Russell, David; Eberhardt, Einar; Vogel, Hans-Jörg
2016-04-01
Fertile soils are central resources for the production of biomass and provision of food and energy. A growing world population and latest climate targets lead to an increasing demand for both food and bio-energy, which requires preserving and improving the long-term productivity of soils as a bio-economic resource. At the same time, other soil functions and ecosystem services need to be maintained. To render soil management sustainable, we need to establish a scientific knowledge base about complex soil system processes that allows for the development of model tools to quantitatively predict the impact of a multitude of management measures on soil functions. This, finally, will allow for the provision of site-specific options for sustainable soil management. To face this challenge, the German Federal Ministry of Education and Research recently launched the funding program "Soil as a Natural Resource for the Bio-Economy - BonaRes". In a joint effort, ten collaborative projects and the coordinating BonaRes Centre are engaged to close existing knowledge gaps for a profound and systemic understanding of soil functions and their sensitivity to soil management. This presentation provides an overview of the concept of the BonaRes Centre, which is responsible for i) setting up a comprehensive database for soil-related information, ii) the development of model tools aiming to estimate the impact of different management measures on soil functions, and iii) establishing a web-based portal providing decision support tools for sustainable soil management. A specific focus of the presentation will be on the so-called "knowledge portal" providing the infrastructure for a community effort towards a comprehensive meta-analysis on soil functions as a basis for future model developments.
New Human-Computer Interface Concepts for Mission Operations
NASA Technical Reports Server (NTRS)
Fox, Jeffrey A.; Hoxie, Mary Sue; Gillen, Dave; Parkinson, Christopher; Breed, Julie; Nickens, Stephanie; Baitinger, Mick
2000-01-01
The current climate of budget cuts has forced the space mission operations community to reconsider how it does business. Gone are the days of building one-of-a-kind control centers with teams of controllers working in shifts 24 hours per day, 7 days per week. Increasingly, automation is used to significantly reduce staffing needs. In some cases, missions are moving towards lights-out operations where the ground system is run semi-autonomously. On-call operators are brought in only to resolve anomalies. Some operations concepts also call for smaller operations teams to manage an entire family of spacecraft. In the not too distant future, a skeleton crew of full-time general knowledge operators will oversee the operations of large constellations of small spacecraft, while geographically distributed specialists will be assigned to emergency response teams based on their expertise. As the operations paradigms change, so too must the tools that support the mission operations team's tasks. Tools need to be built not only to automate routine tasks, but also to communicate varying types of information to the part-time, generalist, or on-call operators and specialists more effectively. Thus, the proper design of a system's user-system interface (USI) becomes even more important than before. Also, because the users will be accessing these systems from various locations (e.g., control center, home, on the road) via different devices with varying display capabilities (e.g., workstations, home PCs, PDAs, pagers) over connections with various bandwidths (e.g., dial-up 56k, wireless 9.6k), the same software must have different USIs to support the different types of users, their equipment, and their environments. In other words, the software must now adapt to the needs of the users! This paper will focus on the needs and the challenges of designing USIs for mission operations. 
After providing a general discussion of these challenges, the paper focuses on the current efforts to create an effective USI for one specific suite of tools, SERS (the Spacecraft Emergency Response System), which has been built to enable lights-out operations. SERS is a Web-based collaborative environment that enables secure distributed fault management.
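The device- and bandwidth-dependent USI selection described above can be sketched as a small dispatch function. The device classes, profile names, and thresholds below are illustrative assumptions for this sketch, not the actual SERS design rules.

```python
# Illustrative sketch: pick a presentation profile from device class and link
# bandwidth. Profile names and the 56 kbps threshold are assumptions.

def select_usi(device: str, bandwidth_kbps: float) -> str:
    if device == "pager":
        return "text-alert"        # tiny display: terse notification only
    if device == "pda" or bandwidth_kbps < 56:
        return "compact-html"      # small screen or slow link: stripped-down pages
    return "full-console"          # workstation on a fast link: full fault displays
```

A real adaptive USI would also consult user role (generalist vs. specialist) and security context, but the core idea is the same capability-to-profile mapping.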
From Research to Flight: Thinking About Implementation While Performing Fundamental Research
NASA Technical Reports Server (NTRS)
Johnson, Les
2010-01-01
This slide presentation calls for a strategy to implement new technologies. Such a strategy would allow advanced space transportation technologies to mature for exploration beyond Earth orbit. It discusses the difference between technology push and technology pull, and reviews the three basic technology readiness levels (TRL). The presentation traces examples of technology development to flight application: the Space Shuttle Main Engine Advanced Health Management System and the Friction Stir Welding technology (the auto-adjustable pin tool). Two technologies not currently in flight but under review for potential use are cryogenic fluid management (CFM) and solar sail propulsion. The presentation also attempts to explain why new technologies are so difficult to field.
Design of an instrument to measure the quality of care in Physical Therapy
Cavalheiro, Leny Vieira; Eid, Raquel Afonso Caserta; Talerman, Claudia; do Prado, Cristiane; Gobbi, Fátima Cristina Martorano; Andreoli, Paola Bruno de Araujo
2015-01-01
ABSTRACT Objective: To design an instrument composed of domains that would demonstrate physical therapy activities and generate a consistent index to represent the quality of care in physical therapy. Methods: The Lean Six Sigma methodology was used to design the tool, in discussions involving staff from seven different management groups. By means of brainstorming and a Cause & Effect Matrix, we set up the process map. Results: After application of the tool called the Cause & Effect Matrix, five requirements composed the quality of care index in physical therapy. The following requirements were assessed: physical therapist performance, care outcome indicator, adherence to physical therapy protocols, whether the prognosis and treatment outcome were achieved, and infrastructure. Conclusion: The proposed design allowed evaluating several items related to the physical therapy service, enabling customization, reproducibility and benchmarking with other organizations. For management, this index provides the opportunity to identify areas for improvement and the strengths of the team and the process of physical therapy care. PMID:26154548
'Big Data' Collaboration: Exploring, Recording and Sharing Enterprise Knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R; Ferrell, Regina Kay
2013-01-01
As data sources and data sizes proliferate, knowledge discovery from "Big Data" is starting to pose several challenges. In this paper, we address a specific challenge in the practice of enterprise knowledge management: extracting actionable nuggets from diverse data sources of seemingly-related information. In particular, we address the challenge of archiving knowledge gained through collaboration, dissemination and visualization as part of the data analysis, inference and decision-making lifecycle. We motivate the implementation of an enterprise data-discovery and knowledge recorder tool, called SEEKER, based on a real-world case study. We demonstrate SEEKER capturing schema and data-element relationships, tracking the data elements of value based on the queries and the analytical artifacts that analysts create as they use the data. We show how the tool serves as a digital record of institutional domain knowledge and as documentation for the evolution of data elements, queries and schemas over time. As a knowledge management service, a tool like SEEKER saves enterprise resources and time by avoiding analytic silos, expediting the process of multi-source data integration and intelligently documenting discoveries from fellow analysts.
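One building block of such a recorder, capturing which data elements analysts actually touch, can be approximated by scanning a SQL query log for table references. The regex-based approach below is a simplified assumption for illustration; the abstract does not describe SEEKER's actual capture mechanism at this level of detail.

```python
# Hypothetical sketch: count table references across a SQL query log, an
# approximation of tracking "data elements of value" from analyst queries.
import re
from collections import Counter

TABLE_REF = re.compile(r"\b(?:from|join)\s+([A-Za-z_][\w.]*)", re.IGNORECASE)

def record_table_usage(query_log):
    """Return a Counter of how often each table is referenced."""
    counts = Counter()
    for query in query_log:
        counts.update(name.lower() for name in TABLE_REF.findall(query))
    return counts
```

A production tool would parse the SQL properly (aliases, subqueries, CTEs) and record column-level lineage, but the usage-counting idea is the same.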
Willacy, Erika; Bratton, Shelly
2015-09-30
Public health management is a pillar of public health practice. Only through effective management can research, theory, and scientific innovation be translated into successful public health action. With this in mind, the U.S. Centers for Disease Control and Prevention (CDC) has developed an innovative program called Improving Public Health Management for Action (IMPACT) which aims to address this critical need by building an effective cadre of public health managers to work alongside scientists to prepare for and respond to disease threats and to effectively implement public health programs. IMPACT is a 2-year, experiential learning program that provides fellows with the management tools and opportunities to apply their new knowledge in the field, all while continuing to serve the Ministry of Health (MoH). IMPACT will launch in 2016 in 2 countries with the intent of expanding to additional countries in future years resulting in a well-trained cadre of public health managers around the world. © 2016 by Kerman University of Medical Sciences.
Review of LCA studies of solid waste management systems – Part I: Lessons learned and perspectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laurent, Alexis, E-mail: alau@dtu.dk; Bakas, Ioannis; Clavreul, Julie
Highlights: • We perform a critical review of 222 LCA studies of solid waste management systems. • Studies mainly concentrated in Europe with little application in developing countries. • Assessments of relevant waste types apart from household waste have been overlooked. • Local specificities of systems prevent a meaningful generalisation of the LCA results. • LCA should support recommendations representative of the local conditions. - Abstract: The continuously increasing solid waste generation worldwide calls for management strategies that integrate concerns for environmental sustainability. By quantifying environmental impacts of systems, life cycle assessment (LCA) is a tool which can contribute to answering that call. But how, where and to what extent has it been applied to solid waste management systems (SWMSs) until now, and which lessons can be learnt from the findings of these LCA applications? To address these questions, we performed a critical review of 222 published LCA studies of SWMSs. We first analysed the geographic distribution and found that the published studies have primarily been concentrated in Europe with little application in developing countries. In terms of technological coverage, they have largely overlooked application of LCA to waste prevention activities and to relevant waste types apart from household waste, e.g. construction and demolition waste. Waste management practitioners are thus encouraged to bridge these gaps in future applications of LCA. In addition to this contextual analysis, we also evaluated the findings of selected studies of good quality and found that there is little agreement in the conclusions among them. The strong dependence of each SWMS on local conditions, such as waste composition or energy system, prevents a meaningful generalisation of the LCA results as we find it in the waste hierarchy.
We therefore recommend that stakeholders in solid waste management regard LCA as a tool which, by its ability to capture the local specific conditions in the modelling of environmental impacts and benefits of a SWMS, allows identifying critical problems and proposing improvement options adapted to the local specificities.
NASA Astrophysics Data System (ADS)
Polyakov, S. P.; Kryukov, A. P.; Demichev, A. P.
2018-01-01
We present a simple set of command line interface tools called Docker Container Manager (DCM) that allow users to create and manage Docker containers with preconfigured SSH access while keeping the users isolated from each other and restricting their access to the Docker features that could potentially disrupt the work of the server. Users can access the DCM server via SSH and are automatically redirected to the DCM interface tool. From there, they can create new containers; stop, restart, pause, unpause, and remove containers; and view the status of the existing containers. By default, the containers are also accessible via SSH using the same private key(s) but through different server ports. Additional publicly available ports can be mapped to the respective ports of a container, allowing some network services to be run within it. The containers are started from read-only filesystem images. Some initial images must be provided by the DCM server administrators, and after containers are configured to meet one's needs, the changes can be saved as new images. Users can see the available images and remove their own images. DCM server administrators are provided with commands to create and delete users. All commands were implemented as Python scripts. The tools allow users to deploy and debug medium-sized distributed systems for simulation in different fields on one or several local computers.
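The isolation-and-restriction idea can be sketched as a thin command builder that admits only a whitelisted subset of Docker actions and namespaces every container by user. The action list and naming scheme below are assumptions for illustration, not DCM's actual implementation.

```python
# Sketch of the restriction layer: only whitelisted lifecycle actions are
# exposed, and container names are prefixed with the invoking user so users
# cannot touch each other's containers. Whitelist and prefix are assumptions.

ALLOWED_ACTIONS = {"stop", "restart", "pause", "unpause", "rm"}

def build_docker_command(user: str, action: str, container: str) -> list:
    """Translate a restricted request into a docker CLI invocation."""
    if action not in ALLOWED_ACTIONS:
        raise PermissionError(f"action '{action}' is not permitted")
    scoped_name = f"{user}--{container}"   # per-user namespace
    return ["docker", action, scoped_name]
```

In a real deployment the resulting argument list would be passed to `subprocess.run` by a privileged wrapper, so users never get direct access to the Docker socket.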
Cause-and-effect mapping of critical events.
Graves, Krisanne; Simmons, Debora; Galley, Mark D
2010-06-01
Health care errors are routinely reported in the scientific and public press and have become a major concern for most Americans. In learning to identify and analyze errors health care can develop some of the skills of a learning organization, including the concept of systems thinking. Modern experts in improving quality have been working in other high-risk industries since the 1920s making structured organizational changes through various frameworks for quality methods including continuous quality improvement and total quality management. When using these tools, it is important to understand systems thinking and the concept of processes within organization. Within these frameworks of improvement, several tools can be used in the analysis of errors. This article introduces a robust tool with a broad analytical view consistent with systems thinking, called CauseMapping (ThinkReliability, Houston, TX, USA), which can be used to systematically analyze the process and the problem at the same time. Copyright 2010 Elsevier Inc. All rights reserved.
Knowledge sharing and collaboration in translational research, and the DC-THERA Directory
Gündel, Michaela; Austyn, Jonathan M.; Cavalieri, Duccio; Scognamiglio, Ciro; Brandizi, Marco
2011-01-01
Biomedical research relies increasingly on large collections of data sets and knowledge whose generation, representation and analysis often require large collaborative and interdisciplinary efforts. This dimension of ‘big data’ research calls for the development of computational tools to manage such a vast amount of data, as well as tools that can improve communication and access to information from collaborating researchers and from the wider community. Whenever research projects have a defined temporal scope, an additional issue of data management arises, namely how the knowledge generated within the project can be made available beyond its boundaries and life-time. DC-THERA is a European ‘Network of Excellence’ (NoE) that spawned a very large collaborative and interdisciplinary research community, focusing on the development of novel immunotherapies derived from fundamental research in dendritic cell immunobiology. In this article we introduce the DC-THERA Directory, which is an information system designed to support knowledge management for this research community and beyond. We present how the use of metadata and Semantic Web technologies can effectively help to organize the knowledge generated by modern collaborative research, how these technologies can enable effective data management solutions during and beyond the project lifecycle, and how resources such as the DC-THERA Directory fit into the larger context of e-science. PMID:21969471
NASA Astrophysics Data System (ADS)
Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores
2011-12-01
With the LHC and ALICE entering full operation and production modes, the amount of simulation, RAW data processing and end-user analysis computational tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data and methods to analyze the end result, required the development and deployment of new tools in addition to the already existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). LPM automatically submits jobs to the Grid based on triggers and conditions, for example after a physics run completes. It follows the evolution of each job and publishes the results on the web for worldwide access by the ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM provides a fully authenticated interface to the AliEn Grid catalogue, to browse and download files, and in the near future will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.
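The trigger-driven submission pattern can be sketched as a mapping from event types to job factories. The event fields and job names here are hypothetical; they do not reflect AliEn or LPM internals.

```python
# Minimal sketch of trigger-based job submission: each registered trigger is
# a factory that turns a matching event into a job. Event schema is assumed.

def dispatch(events, triggers):
    """Submit a job for every event whose type has a registered trigger."""
    submitted = []
    for event in events:
        factory = triggers.get(event["type"])
        if factory is not None:
            submitted.append(factory(event))
    return submitted
```

A production framework would additionally track each submitted job's state and publish it, but event-to-job dispatch is the core of "submitting jobs based on triggers and conditions".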
Implementing iRound: A Computer-Based Auditing Tool.
Brady, Darcie
Many hospitals use rounding or auditing as a tool to help identify gaps and needs in quality and process performance. Some hospitals are also using rounding to help improve patient experience. It is known that purposeful rounding helps improve Hospital Consumer Assessment of Healthcare Providers and Systems scores by helping manage patient expectations, providing service recovery, and recognizing quality caregivers. Rounding works when a standard method is used across the facility, where data are comparable and trustworthy. This facility had a pen-and-paper process in place that made data reporting difficult, created a silo culture between departments, and left most audits and rounds completed differently on each unit. It was recognized that this facility needed to standardize the rounding and auditing process. The tool created by the Advisory Board, called iRound, was chosen as the tool this facility would use for patient experience rounds as well as process and quality rounding. The success of the iRound tool in this facility depended on several factors, from the months before implementation through current everyday usage.
Evolution of a residue laboratory network and the management tools for monitoring its performance.
Lins, E S; Conceição, E S; Mauricio, A De Q
2012-01-01
Since 2005 the National Residue & Contaminants Control Plan (NRCCP) in Brazil has been considerably enhanced, increasing the number of samples, substances and species monitored, and also the analytical detection capability. The Brazilian laboratory network was forced to improve its quality standards in order to comply with the NRCP's own evolution. Many aspects such as the limits of quantification (LOQs), the quality management systems within the laboratories and appropriate method validation are in continuous improvement, generating new scenarios and demands. Thus, efficient management mechanisms for monitoring network performance and its adherence to the established goals and guidelines are required. Performance indicators associated to computerised information systems arise as a powerful tool to monitor the laboratories' activity, making use of different parameters to describe this activity on a day-to-day basis. One of these parameters is related to turnaround times, and this factor is highly affected by the way each laboratory organises its management system, as well as the regulatory requirements. In this paper a global view is presented of the turnaround times related to the type of analysis, laboratory, number of samples per year, type of matrix, country region and period of the year, all these data being collected from a computerised system called SISRES. This information gives a solid background to management measures aiming at the improvement of the service offered by the laboratory network.
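A turnaround-time indicator of the kind described can be computed directly from sample receipt and report dates. The record layout below is an assumption for illustration; the abstract does not specify how SISRES stores its data.

```python
# Sketch: median turnaround time per laboratory from (lab, received, reported)
# records. The tuple layout is an illustrative assumption, not SISRES's schema.
from datetime import date
from statistics import median

def median_turnaround_days(records):
    """Return {laboratory: median turnaround in days}."""
    per_lab = {}
    for lab, received, reported in records:
        per_lab.setdefault(lab, []).append((reported - received).days)
    return {lab: median(days) for lab, days in per_lab.items()}
```

The same grouping pattern extends to the other parameters mentioned (type of analysis, matrix, region, period of the year) by widening the grouping key.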
DeMAID: A Design Manager's Aide for Intelligent Decomposition user's guide
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
A design problem is viewed as a complex system divisible into modules. Before the design of a complex system can begin, the couplings among modules and the presence of iterative loops must be determined. This is important because the design manager must know how to group the modules into subsystems and how to assign subsystems to design teams so that changes in one subsystem will have predictable effects on other subsystems. Determining these subsystems is not an easy, straightforward process, and important couplings are often overlooked. Moreover, the planning task must be repeated as new information becomes available or as the design specifications change. The purpose of this research is to develop a knowledge-based tool called the Design Manager's Aide for Intelligent Decomposition (DeMAID) to act as an intelligent advisor for the design manager. DeMAID identifies the subsystems of a complex design problem, orders them into a well-structured format, and marks the couplings among the subsystems to facilitate the use of multilevel tools. DeMAID also provides the design manager with the capability of examining the trade-offs between sequential and parallel processing. This type of approach could lead to substantial savings by organizing and displaying a complex problem as a sequence of subsystems easily divisible among design teams. This report serves as a User's Guide for the program.
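The core decomposition step, grouping modules that participate in iterative loops and ordering the groups so information flows forward, amounts to strongly-connected-component analysis of the coupling graph. The sketch below uses Tarjan's algorithm and is a generic illustration of that idea, not DeMAID's knowledge-based implementation.

```python
# Sketch: treat modules and couplings as a directed graph. Each strongly
# connected component is a subsystem containing an iterative loop; Tarjan's
# algorithm emits components so dependencies appear before dependents.

def partition_modules(deps):
    """deps: module -> modules it receives data from. Returns subsystem
    groups; feedback occurs only inside a group, and each group appears
    after the groups it depends on."""
    index, low, onstack, stack, groups = {}, {}, set(), [], []
    counter = [0]

    def connect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        onstack.add(v)
        for w in deps.get(v, ()):
            if w not in index:
                connect(w)
                low[v] = min(low[v], low[w])
            elif w in onstack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:        # v is the root of a component
            group = set()
            while True:
                w = stack.pop()
                onstack.discard(w)
                group.add(w)
                if w == v:
                    break
            groups.append(group)

    for v in deps:
        if v not in index:
            connect(v)
    return groups
```

In the example below, modules A, B, and C form an iterative loop (one subsystem), and D, which only consumes C's output, lands in a later group that a separate design team could own.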
Clinical detection of deletion structural variants in whole-genome sequences
Noll, Aaron C; Miller, Neil A; Smith, Laurie D; Yoo, Byunggil; Fiedler, Stephanie; Cooley, Linda D; Willig, Laurel K; Petrikin, Josh E; Cakici, Julie; Lesko, John; Newton, Angela; Detherage, Kali; Thiffault, Isabelle; Saunders, Carol J; Farrow, Emily G; Kingsmore, Stephen F
2016-01-01
Optimal management of acutely ill infants with monogenetic diseases requires rapid identification of causative haplotypes. Whole-genome sequencing (WGS) has been shown to identify pathogenic nucleotide variants in such infants. Deletion structural variants (DSVs, >50 nt) are implicated in many genetic diseases, and tools have been designed to identify DSVs using short-read WGS. Optimisation and integration of these tools into a WGS pipeline could improve diagnostic sensitivity and specificity of WGS. In addition, it may improve turnaround time when compared with current CNV assays, enhancing utility in acute settings. Here we describe DSV detection methods for use in WGS for rapid diagnosis in acutely ill infants: SKALD (Screening Konsensus and Annotation of Large Deletions) combines calls from two tools (Breakdancer and GenomeStrip) with calibrated filters and clinical interpretation rules. In four WGS runs, the average analytic precision (positive predictive value) of SKALD was 78%, and recall (sensitivity) was 27%, when compared with validated reference DSV calls. When retrospectively applied to a cohort of 36 families with acutely ill infants SKALD identified causative DSVs in two. The first was heterozygous deletion of exons 1–3 of MMP21 in trans with a heterozygous frame-shift deletion in two siblings with transposition of the great arteries and heterotaxy. In a newborn female with dysmorphic features, ventricular septal defect and persistent pulmonary hypertension, SKALD identified the breakpoints of a heterozygous, de novo 1p36.32p36.13 deletion. In summary, consensus DSV calling, implemented in an 8-h computational pipeline with parameterised filtering, has the potential to increase the diagnostic yield of WGS in acutely ill neonates and discover novel disease genes. PMID:29263817
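The consensus step, keeping only deletions that both callers support, is commonly implemented with a reciprocal-overlap test. The sketch below shows that generic pattern; the 50% threshold is an illustrative assumption, not the calibrated SKALD filter.

```python
# Sketch of consensus deletion calling: keep a call from caller A only if
# some call from caller B overlaps it reciprocally. Threshold is an assumption.

def reciprocal_overlap(a, b):
    """Overlap length as a fraction of each (start, end) interval; the
    minimum of the two fractions is the reciprocal overlap."""
    overlap = max(0, min(a[1], b[1]) - max(a[0], b[0]))
    return min(overlap / (a[1] - a[0]), overlap / (b[1] - b[0]))

def consensus_deletions(calls_a, calls_b, min_ro=0.5):
    return [a for a in calls_a
            if any(reciprocal_overlap(a, b) >= min_ro for b in calls_b)]
```

Real pipelines add per-caller quality filters and clinical interpretation rules on top of the interval consensus, as the abstract describes for SKALD.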
A user-friendly tool to evaluate the effectiveness of no-take marine reserves.
Villaseñor-Derbez, Juan Carlos; Faro, Caio; Wright, Melaina; Martínez, Jael; Fitzgerald, Sean; Fulton, Stuart; Mancha-Cisneros, Maria Del Mar; McDonald, Gavin; Micheli, Fiorenza; Suárez, Alvin; Torre, Jorge; Costello, Christopher
2018-01-01
Marine reserves are implemented to achieve a variety of objectives, but are seldom rigorously evaluated to determine whether those objectives are met. In the rare cases when evaluations do take place, they typically focus on ecological indicators and ignore other relevant objectives such as socioeconomics and governance. And regardless of the objectives, the diversity of locations, monitoring protocols, and analysis approaches hinder the ability to compare results across case studies. Moreover, analysis and evaluation of reserves is generally conducted by outside researchers, not the reserve managers or users, plausibly thereby hindering effective local management and rapid response to change. We present a framework and tool, called "MAREA", to overcome these challenges. Its purpose is to evaluate the extent to which any given reserve has achieved its stated objectives. MAREA provides specific guidance on data collection and formatting, and then conducts rigorous causal inference analysis based on data input by the user, providing real-time outputs about the effectiveness of the reserve. MAREA's ease of use, standardization of state-of-the-art inference methods, and ability to analyze marine reserve effectiveness across ecological, socioeconomic, and governance objectives could dramatically further our understanding and support of effective marine reserve management.
[Nurse's concept in the managerial conception of a basic health unit].
Passos, Joanir Pereira; Ciosak, Suely Itsuko
2006-12-01
This study is part of a larger survey called "Use of indicators in nurses' managerial practice in Basic Health Care Units in the city of Rio de Janeiro", which was carried out in the Basic Health Care Units of Planning Area 5.3. Its objectives were to identify nurses' conceptions regarding the tools required for management in those units and to discuss the role of management in organizing health services. The study is descriptive, and data were collected in interviews with seven nurse managers. The results show that health services actions are organized and directed to the purpose of the working process through the relationship established between the object, the instruments and the final product, and that for those nurses the end result to be achieved is the client's satisfaction and the quality of medical and nursing care.
Integrated farm sustainability assessment for the environmental management of rural activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stachetii Rodrigues, Geraldo, E-mail: stacheti@cnpma.embrapa.b; Aparecida Rodrigues, Izilda, E-mail: isis@cnpma.embrapa.b; Almeida Buschinelli, Claudio Cesar de, E-mail: buschi@cnpma.embrapa.b
2010-07-15
Farmers have been increasingly called upon to respond to an ongoing redefinition in consumers' demands, having as a converging theme the search for sustainable production practices. In order to satisfy this objective, instruments for the environmental management of agricultural activities have been sought out. Environmental impact assessment methods are appropriate tools to address the choice of technologies and management practices to minimize negative effects of agricultural development, while maximizing productive efficiency, sound usage of natural resources, conservation of ecological assets and equitable access to wealth generation means. The 'system for weighted environmental impact assessment of rural activities' (APOIA-NovoRural) presented in this paper is organized to provide integrated farm sustainability assessment according to quantitative environmental standards and defined socio-economic benchmarks. The system integrates sixty-two objective indicators in five sustainability dimensions - (i) Landscape ecology, (ii) Environmental quality (atmosphere, water and soil), (iii) Sociocultural values, (iv) Economic values, and (v) Management and administration. Impact indices are expressed in three integration levels: (i) specific indicators, that offer a diagnostic and managerial tool for farmers and rural administrators, by pointing out particular attributes of the rural activities that may be failing to comply with defined environmental performance objectives; (ii) integrated sustainability dimensions, that show decision-makers the major contributions of the rural activities toward local sustainable development, facilitating the definition of control actions and promotion measures; and (iii) aggregated sustainability index, that can be considered a yardstick for eco-certification purposes.
Nine fully documented case studies carried out with the APOIA-NovoRural system, focusing on different scales, diverse rural activities/farming systems, and contrasting spatial/territorial contexts, attest to the malleability of the method and its applicability as an integrated farm environmental management tool.
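The three integration levels can be sketched as successive means over indicator scores scaled to [0, 1]. Equal weights and the example dimension names are assumptions for this sketch; the paper's actual weighting and scaling scheme is not reproduced here.

```python
# Sketch of the APOIA-style aggregation: indicator scores (level i) are
# averaged into dimension indices (level ii), then into one aggregated
# sustainability index (level iii). Equal weights are an assumption.
from statistics import mean

def dimension_indices(indicators):
    """indicators: dimension -> list of indicator scores in [0, 1]."""
    return {dim: mean(scores) for dim, scores in indicators.items()}

def sustainability_index(indicators):
    """Aggregated index: mean of the per-dimension indices."""
    return mean(dimension_indices(indicators).values())
```

The per-dimension indices serve the diagnostic role described in the abstract (flagging weak dimensions), while the single aggregated number is the eco-certification yardstick.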
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-01
... Extension of Call for Nominations for the Bureau of Land Management's California Desert District Advisory... Management's (BLM) California Desert District is extending the call for nominations from the public for six..., California Desert District Office, 22835 Calle San Juan De Los Lagos, Moreno Valley, CA 92553. FOR FURTHER...
Implementation research: a mentoring programme to improve laboratory quality in Cambodia
Voeurng, Vireak; Sek, Sophat; Song, Sophanna; Vong, Nora; Tous, Chansamrach; Flandin, Jean-Frederic; Confer, Deborah; Costa, Alexandre; Martin, Robert
2016-01-01
Abstract Objective To implement a mentored laboratory quality stepwise implementation (LQSI) programme to strengthen the quality and capacity of Cambodian hospital laboratories. Methods We recruited four laboratory technicians to be mentors and trained them in mentoring skills, laboratory quality management practices and International Organization for Standardization (ISO) 15189 requirements for medical laboratories. Separately, we trained staff from 12 referral hospital laboratories in laboratory quality management systems followed by tri-weekly in-person mentoring on quality management systems implementation using the LQSI tool, which is aligned with the ISO 15189 standard. The tool was adapted from a web-based resource into a software-based spreadsheet checklist, which includes a detailed action plan and can be used to qualitatively monitor each laboratory's progress. The tool – translated into Khmer – included a set of quality improvement activities grouped into four phases for implementation with increasing complexity. Project staff reviewed the laboratories' progress and challenges in weekly conference calls and bi-monthly meetings with focal points of the health ministry, participating laboratories and local partners. We present the achievements in implementation from September 2014 to March 2016. Findings As of March 2016, the 12 laboratories have completed 74–90% of the 104 activities in phase 1, 53–78% of the 178 activities in phase 2, and 18–26% of the 129 activities in phase 3. Conclusion Regular on-site mentoring of laboratories using a detailed action plan in the local language allows staff to learn concepts of quality management systems and learn on the job without disruption to laboratory service provision. PMID:27843164
Squires, Seth Ian; Boal, Allan John; Lamont, Selina; Naismith, Graham D
2017-10-01
Patient self-management and its service integration is not a new concept, but it may be a key component in the long-term sustainability of inflammatory bowel disease (IBD) service provision, considering growing disease prevalence and limited resources. The IBD team at the Royal Alexandra and Vale of Leven Hospitals in the Clyde Valley region developed a self-management tool called the 'flare card'. Patients were asked to complete a questionnaire reflecting their opinion on its viability as a self-management intervention. In addition, its effect on service use over a 10-month period in 2016 was compared with a similar cohort of patients over 10 months in 2015. Patients overall felt that the 'flare card' was a viable self-management tool. Positive feedback indicated that the intervention could help them gain control over their IBD, improve medication adherence, reduce symptoms, and reflected a feeling of patient-centred IBD care. The comparison between 2015 and 2016 service use revealed a significant reduction in IBD and non-IBD service usage, steroid prescribing and unscheduled IBD care in the flare-card-supported cohort. IBD services must continue to adapt to changes within the National Health Service, bearing in mind long-term sustainability and continued care provision. The 'flare card' goes further in an attempt to optimise Crohn's disease and ulcerative colitis management by harmonising clinician evaluation and patients' self-initiation of therapy and investigation.
Vandenhove, Hildegarde; Turcanu, Catrinel
2016-10-01
The options adopted for recovery of agricultural land after the Chernobyl and Fukushima accidents are compared by examining their technical and socio-economic aspects. The analysis highlights commonalities such as the implementation of tillage and other types of countermeasures and differences in approach, such as preferences for topsoil removal in Fukushima and the application of K fertilizers in Chernobyl. This analysis shows that the recovery approach needs to be context-specific to best suit the physical, social, and political environment. The complex nature of the decision problem calls for a formal process for engaging stakeholders and the development of adequate decision support tools. Integr Environ Assess Manag 2016;12:662-666. © 2016 SETAC.
A hierarchical distributed control model for coordinating intelligent systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1991-01-01
A hierarchical distributed control (HDC) model for coordinating cooperative problem-solving among intelligent systems is described. The model was implemented using SOCIAL, an innovative object-oriented tool for integrating heterogeneous, distributed software systems. SOCIAL embeds applications in 'wrapper' objects called Agents, which supply predefined capabilities for distributed communication, control, data specification, and translation. The HDC model is realized in SOCIAL as a 'Manager' Agent that coordinates interactions among application Agents. The HDC Manager indexes the capabilities of application Agents, routes request messages to suitable server Agents, and stores results in a commonly accessible 'Bulletin-Board'. This centralized control model is illustrated in a fault diagnosis application for launch operations support of the Space Shuttle fleet at NASA, Kennedy Space Center.
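The Manager pattern described above can be illustrated with a minimal sketch. SOCIAL itself is not publicly available, so all class and method names here are hypothetical; the sketch only shows the three responsibilities the abstract names: indexing capabilities, routing requests, and posting results to a shared bulletin board.

```python
# Hypothetical sketch of an HDC-style Manager coordinating application Agents.

class Agent:
    def __init__(self, name, capabilities, handler):
        self.name = name
        self.capabilities = set(capabilities)
        self.handler = handler  # callable that performs the service

class HDCManager:
    def __init__(self):
        self.agents = []          # index of registered application Agents
        self.bulletin_board = {}  # commonly accessible results store

    def register(self, agent):
        self.agents.append(agent)

    def route(self, request_id, capability, payload):
        # Route the request to a server Agent whose capabilities match.
        for agent in self.agents:
            if capability in agent.capabilities:
                result = agent.handler(payload)
                self.bulletin_board[request_id] = (agent.name, result)
                return result
        raise LookupError(f"no agent provides {capability!r}")

diagnoser = Agent("fault-diagnoser", ["diagnose"], lambda p: f"fault in {p}")
manager = HDCManager()
manager.register(diagnoser)
manager.route("req-1", "diagnose", "valve-3")
print(manager.bulletin_board["req-1"])  # ('fault-diagnoser', 'fault in valve-3')
```

The bulletin board stands in for the shared results store; a real implementation would add distributed messaging rather than direct calls.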
Developing a Web-Based Ppgis, as AN Environmental Reporting Service
NASA Astrophysics Data System (ADS)
Ranjbar Nooshery, N.; Taleai, M.; Kazemi, R.; Ebadi, K.
2017-09-01
Today, municipalities are searching for new tools to empower residents to change the future of their own areas by increasing their participation in different levels of urban planning. These tools should involve the community in the planning process through participatory approaches, instead of the traditional top-down planning models, and help municipalities obtain proper insight into the major problems of urban neighborhoods from the residents' point of view. To this end, public participation GIS (PPGIS), which enables citizens to record and follow up their feelings and spatial knowledge about problems of the city in the form of maps, has been introduced. In this research, a tool entitled CAER (Collecting & Analyzing of Environmental Reports) is developed. In the first step, a software framework based on a Web-GIS tool, called EPGIS (Environmental Participatory GIS), was designed to support public participation in reporting urban environmental problems and to facilitate data flow between citizens and the municipality. A web-based cartography tool was employed for geo-visualization and dissemination of map-based reports. In the second step of CAER, a subsystem was developed based on SOLAP (Spatial On-Line Analytical Processing), a data mining tool to elicit the local knowledge, facilitating bottom-up urban planning practices and helping urban managers find hidden relations among the recorded reports. The system was implemented in a case study area in Boston, Massachusetts, and its usability was evaluated. CAER should be considered a bottom-up planning tool that collects people's problems and views about their neighborhood and transmits them to city officials. It also helps urban planners find solutions for better management from the citizens' viewpoint and gives them the chance to develop good plans for neighborhoods that satisfy the citizens.
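A SOLAP-style roll-up of citizen reports, as described above, can be sketched in a few lines. The field names and report records below are illustrative, not CAER's actual schema; the point is the grouping of geo-tagged reports along dimensions (neighborhood, category) to surface hotspots.

```python
# Illustrative roll-up of geo-tagged environmental reports, SOLAP-style.
from collections import Counter

reports = [
    {"neighborhood": "Back Bay", "category": "waste", "lat": 42.35, "lon": -71.08},
    {"neighborhood": "Back Bay", "category": "noise", "lat": 42.35, "lon": -71.07},
    {"neighborhood": "Fenway",   "category": "waste", "lat": 42.34, "lon": -71.10},
    {"neighborhood": "Back Bay", "category": "waste", "lat": 42.35, "lon": -71.08},
]

# Aggregate report counts along the (neighborhood, category) dimensions.
rollup = Counter((r["neighborhood"], r["category"]) for r in reports)
hotspot = rollup.most_common(1)[0]
print(hotspot)  # (('Back Bay', 'waste'), 2)
```

A production system would run such aggregations inside the spatial database, but the analytical operation is the same.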
Boxwala, A A; Chaney, E L; Fritsch, D S; Friedman, C P; Rosenman, J G
1998-09-01
The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images for detection of treatment setup errors. The observer study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows user interface, a compact kit of carefully selected image analysis tools, and an object-oriented database infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. The acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Radiation oncologists, the subjects for this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical.
Towards a flexible middleware for context-aware pervasive and wearable systems.
Muro, Marco; Amoretti, Michele; Zanichelli, Francesco; Conte, Gianni
2012-11-01
Ambient intelligence and wearable computing call for innovative hardware and software technologies, including a highly capable, flexible and efficient middleware, allowing for the reuse of existing pervasive applications when developing new ones. In the considered application domain, middleware should also support self-management, interoperability among different platforms, efficient communications, and context awareness. In the ongoing "everything is networked" scenario, scalability appears as a very important issue, for which the peer-to-peer (P2P) paradigm emerges as an appealing solution for connecting software components in an overlay network, allowing for efficient and balanced data distribution mechanisms. In this paper, we illustrate how all these concepts can be placed into a theoretical tool, called networked autonomic machine (NAM), implemented into a NAM-based middleware, and evaluated against practical problems of pervasive computing.
Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)
NASA Astrophysics Data System (ADS)
Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.
2017-12-01
We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. The Data System manages datasets used by CMDA analysis tools, the Analysis System manages CMDA analysis tools, which are all web services, the Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history, and the Recommendation System extracts knowledge from CMDA usage history and recommends datasets and analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to the Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc.) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results.
By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its user.
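The reanalysis and reproduction steps above hinge on recording provenance for each analysis step. The following is a hedged sketch of that idea; the class and registry names are hypothetical, not CMDA's actual API, and a real system would record dataset versions and service endpoints as well.

```python
# Hypothetical sketch: record each analysis step as provenance, then
# replay the recorded steps to reproduce the analysis.
import json

class ProvenanceLog:
    def __init__(self):
        self.steps = []

    def record(self, tool, dataset, params):
        self.steps.append({"tool": tool, "dataset": dataset, "params": params})

    def export(self):
        # Publishing the provenance lets another user reproduce the analysis.
        return json.dumps(self.steps)

    @staticmethod
    def replay(exported, registry):
        # Re-run each recorded step against a registry of analysis tools.
        return [registry[s["tool"]](s["dataset"], **s["params"])
                for s in json.loads(exported)]

log = ProvenanceLog()
log.record("time_average", "obs_precip", {"start": "2000-01", "end": "2009-12"})

# A toy tool registry standing in for CMDA's web-service analysis tools.
registry = {"time_average": lambda ds, start, end: f"mean({ds},{start}..{end})"}
print(ProvenanceLog.replay(log.export(), registry))
```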
Crisis and emergency risk communication as an integrative model.
Reynolds, Barbara; W Seeger, Matthew
2005-01-01
This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.
On Constructing, Grouping and Using Topical Ontology for Semantic Matching
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert
An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). This paper presents a pattern-driven modeling methodology for constructing and grouping topics in an ontology (PAD-ON methodology), which is used for matching similarities between competences in the human resource management (HRM) domain. The methodology is supported by a tool called PAD-ON. This paper demonstrates our recent achievements in the EC Prolix project. The proposed approach is applied to the training processes at British Telecom as the test bed.
Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data
NASA Astrophysics Data System (ADS)
Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin
2017-02-01
Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, on a different platform and by expert-based review. In addition, we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve reproducibility of the results in the context of multithreading.
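The sensitivity/precision trade-off reported above is computed by comparing each tool's call set against the validated mutations. A minimal sketch, using toy (chromosome, position, alt) tuples rather than real VCF records:

```python
# Sketch of the evaluation metric: sensitivity (recall) and precision of a
# tool's variant calls against a validated mutation set.

def evaluate(called, validated):
    tp = len(called & validated)                       # true positives
    sensitivity = tp / len(validated) if validated else 0.0
    precision = tp / len(called) if called else 0.0
    return sensitivity, precision

validated = {("chr1", 100, "A"), ("chr2", 250, "T"), ("chr3", 17, "G")}
tool_calls = {("chr1", 100, "A"), ("chr2", 250, "T"), ("chr9", 5, "C")}

sens, prec = evaluate(tool_calls, validated)
print(f"sensitivity={sens:.2f} precision={prec:.2f}")
```

In practice variant comparison must also normalise representation (e.g. left-alignment of indels) before set intersection, which this sketch omits.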
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
Modular workcells: modern methods for laboratory automation.
Felder, R A
1998-12-01
Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure the success of this revolutionary new technology.
Team table: a framework and tool for continuous factory planning
NASA Astrophysics Data System (ADS)
Sihn, Wilfried; Bischoff, Juergen; von Briel, Ralf; Josten, Marcus
2000-10-01
Growing market turbulence and shorter product life cycles require a continuous adaptation of factory structures, resulting in a continuous factory planning process. Therefore a new framework is developed which focuses on the integration of configuration and data management processes. This enables online evaluation of system performance based on the continuous availability of current data. The framework is especially helpful, and yields high cost and time savings, when used in the early stage of planning, called the concept or rough planning phase. The new framework is supported by a planning round table as a tool for team-based configuration processes, integrating the knowledge of all persons involved in planning processes. A case study conducted at a German company shows the advantages which can be achieved by implementing the new framework and methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pardo-Bosch, Francesc, E-mail: francesc.pardo@upc.edu; Political Science Department, University of California - Berkeley; Aguado, Antonio, E-mail: antonio.aguado@upc.edu
Infrastructure construction, one of the biggest driving forces of the economy nowadays, requires thorough analysis and clear transparency to decide which projects should be executed with the few resources available. With the aim of providing public administrations a tool to ease their decisions, the Sustainability Index of Infrastructure Projects (SIIP) has been defined, with a multi-criteria decision system called MIVES, in order to classify non-uniform investments. This index evaluates, in two inseparable stages, the contribution of each infrastructure project to sustainable development, analyzing its social, environmental and economic impact. The result of the SIIP allows deciding the order in which projects will be prioritized. The case study developed proves the adaptability and utility of this tool for ordinary budget management.
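A MIVES-style index boils down to value functions that normalise each criterion's raw score and weights that combine them into one number per project. The sketch below uses invented weights, linear value functions and toy scores purely to illustrate the ranking mechanics; MIVES itself supports non-linear value functions and a requirement tree.

```python
# Illustrative weighted multi-criteria index in the spirit of SIIP/MIVES.

WEIGHTS = {"social": 0.4, "environmental": 0.3, "economic": 0.3}  # assumed

def siip(scores, bounds=(0.0, 10.0)):
    lo, hi = bounds
    # Linear value function normalising each raw score to [0, 1].
    value = {k: (v - lo) / (hi - lo) for k, v in scores.items()}
    return sum(WEIGHTS[k] * value[k] for k in WEIGHTS)

projects = {
    "bridge":  {"social": 8.0, "environmental": 4.0, "economic": 6.0},
    "tramway": {"social": 7.0, "environmental": 8.0, "economic": 5.0},
}

# Higher index first: the prioritisation order for the available budget.
ranking = sorted(projects, key=lambda p: siip(projects[p]), reverse=True)
print(ranking)  # ['tramway', 'bridge']
```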
Brooks, Jennifer C; Pinto, Meredith; Gill, Adrienne; Hills, Katherine E; Murthy, Shivani; Podgornik, Michelle N; Hernandez, Luis F; Rose, Dale A; Angulo, Frederick J; Rzeszotarski, Peter
2016-07-08
Establishing a functional incident management system (IMS) is important in the management of public health emergencies. In response to the 2014-2016 Ebola virus disease (Ebola) epidemic in West Africa, CDC established the Emergency Management Development Team (EMDT) to coordinate technical assistance for developing emergency management capacity in Guinea, Liberia, and Sierra Leone. EMDT staff, deployed staff, and partners supported each country to develop response goals and objectives, identify gaps in response capabilities, and determine strategies for coordinating response activities. To monitor key programmatic milestones and assess changes in emergency management and response capacities over time, EMDT implemented three data collection methods in country: coordination calls, weekly written situation reports, and an emergency management dashboard tool. On the basis of the information collected, EMDT observed improvements in emergency management capacity over time in all three countries. The collaborations in each country yielded IMS structures that streamlined response and laid the foundation for long-term emergency management programs. The activities summarized in this report would not have been possible without collaboration with many U.S. and international partners (http://www.cdc.gov/vhf/ebola/outbreaks/2014-west-africa/partners.html).
The reliability-quality relationship for quality systems and quality risk management.
Claycamp, H Gregg; Rahaman, Faiad; Urban, Jason M
2012-01-01
Engineering reliability typically refers to the probability that a system, or any of its components, will perform a required function for a stated period of time and under specified operating conditions. As such, reliability is inextricably linked with time-dependent quality concepts, such as maintaining a state of control and predicting the chances of losses from failures for quality risk management. Two popular current good manufacturing practice (cGMP) and quality risk management tools, failure mode and effects analysis (FMEA) and root cause analysis (RCA) are examples of engineering reliability evaluations that link reliability with quality and risk. Current concepts in pharmaceutical quality and quality management systems call for more predictive systems for maintaining quality; yet, the current pharmaceutical manufacturing literature and guidelines are curiously silent on engineering quality. This commentary discusses the meaning of engineering reliability while linking the concept to quality systems and quality risk management. The essay also discusses the difference between engineering reliability and statistical (assay) reliability. The assurance of quality in a pharmaceutical product is no longer measured only "after the fact" of manufacturing. Rather, concepts of quality systems and quality risk management call for designing quality assurance into all stages of the pharmaceutical product life cycle. Interestingly, most assays for quality are essentially static and inform product quality over the life cycle only by being repeated over time. Engineering process reliability is the fundamental concept that is meant to anticipate quality failures over the life cycle of the product. Reliability is a well-developed theory and practice for other types of manufactured products and manufacturing processes. Thus, it is well known to be an appropriate index of manufactured product quality. 
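The time-dependent notion of reliability discussed above is commonly formalised, for a constant failure rate λ, as R(t) = exp(−λt): the probability that a process performs its required function through time t. The failure rate and time units below are illustrative, not drawn from the essay.

```python
# Constant-failure-rate reliability function R(t) = exp(-lambda * t).
import math

def reliability(failure_rate, t):
    """Probability of failure-free operation through time t."""
    return math.exp(-failure_rate * t)

lam = 0.002  # assumed failures per production batch
print(round(reliability(lam, 100), 3))  # survival through 100 batches
print(round(reliability(lam, 500), 3))  # survival through 500 batches
```

This is the simplest reliability model; engineering practice also uses Weibull and other time-varying hazard models, which the constant-rate form approximates only over a component's useful-life period.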
Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.
2017-10-01
The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana. A production operations group helps experiments manage their large-scale production workflows; this group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and to support troubleshooting and triage in case of problems.
Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.
Gomersall, Judith Streak; Canuto, Karla; Aromataris, Edoardo; Braunack-Mayer, Annette; Brown, Alex
2016-02-01
To describe the main characteristics of systematic reviews addressing questions of chronic disease and related risk factors for Indigenous Australians. We searched databases for systematic reviews meeting inclusion criteria. Two reviewers assessed quality and extracted characteristics using pre-defined tools. We identified 14 systematic reviews. Seven synthesised evidence about health intervention effectiveness; four addressed chronic disease or risk factor prevalence; and six conducted critical appraisal as per current best practice. Only three reported steps to align the review with standards for ethical research with Indigenous Australians and/or capture Indigenous-specific knowledge. Most called for more high-quality research. Systematic review is an under-utilised method for gathering evidence to inform chronic disease prevention and management for Indigenous Australians. Relevance of future systematic reviews could be improved by: 1) aligning questions with community priorities as well as decision maker needs; 2) involvement of, and leadership by, Indigenous researchers with relevant cultural and contextual knowledge; 3) use of critical appraisal tools that include traditional risk of bias assessment criteria and criteria that reflect Indigenous standards of appropriate research. Systematic review method guidance, tools and reporting standards are required to ensure alignment with ethical obligations and promote rigor and relevance. © 2015 Public Health Association of Australia.
de Aranzabal, Itziar; Schmitz, María F; Pineda, Francisco D
2009-11-01
Tourism and landscape are interdependent concepts. Nature- and culture-based tourism are now quite well developed activities and can constitute an excellent way of exploiting the natural resources of certain areas, and should therefore be considered as key objectives in landscape planning and management in a growing number of countries. All of this calls for careful evaluation of the effects of tourism on the territory. This article focuses on an integrated spatial method for landscape analysis aimed at quantifying the relationship between preferences of visitors and landscape features. The spatial expression of the model relating types of leisure and recreational preferences to the potential capacity of the landscape to meet them involves a set of maps showing degrees of potential visitor satisfaction. The method constitutes a useful tool for the design of tourism planning and management strategies, with landscape conservation as a reference.
Exotic encounters with dental implants: managing complications with unidentified systems.
Mattheos, N; Janda, M Schittek
2012-06-01
As the application of dental implants increases worldwide, so is the number of technical and biological complications that general dental practitioners will be called to manage, while maintaining implant patients. In addition, the greater patient mobility encountered today combined with a growing trend of 'dental implant tourism' will very often result in situations where the dentist is requested to deal with complications in implants placed elsewhere and which sometimes might be of an 'exotic' system one cannot directly recognize. Such a situation can pose significant challenges to even experienced clinicians. The challenges are not only in the scientific field, but often include professional and ethical implications. This case report will discuss strategies for the management of implant complications in cases of unidentified implant systems. Critical factors in such situations would be the clinician's experience and special training, the correct radiographic technique, as well as access to the appropriate tools and devices. © 2012 Australian Dental Association.
Model-based diagnostics for Space Station Freedom
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.
1991-01-01
An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post-analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
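Constraint suspension, the technique named above, can be sketched compactly: suspend each component's behavioural constraint in turn, and a component whose suspension makes the model consistent with the observations is a failure candidate. The two-adder model and values below are a textbook-style illustration, not the SSF power system model.

```python
# Sketch of constraint suspension for model-based diagnosis.
# model: name -> (behaviour function, input variable names, output variable)

def consistent(model, inputs, observed, suspended):
    values = dict(inputs)
    for name, (fn, in_names, out_name) in model.items():
        if name == suspended:
            # Suspend this component's constraint: take its output on faith.
            values[out_name] = observed[out_name]
        else:
            values[out_name] = fn(*[values[n] for n in in_names])
    # The suspension is viable if every observation matches the model.
    return all(values[k] == v for k, v in observed.items())

model = {
    "add1": (lambda a, b: a + b, ("a", "b"), "x"),
    "add2": (lambda x, c: x + c, ("x", "c"), "y"),
}
inputs = {"a": 1, "b": 2, "c": 3}
observed = {"x": 3, "y": 10}   # y should be 6, so something has failed

candidates = [name for name in model
              if consistent(model, inputs, observed, suspended=name)]
print(candidates)  # ['add2']
```

Note that no failure symptoms are enumerated anywhere, matching the abstract's point: only correct behaviour is modelled, and failures are isolated as whatever suspension restores consistency.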
Karson, T. H.; Perkins, C.; Dixon, C.; Ehresman, J. P.; Mammone, G. L.; Sato, L.; Schaffer, J. L.; Greenes, R. A.
1997-01-01
A component-based health information resource, delivered on an intranet and the Internet utilizing World Wide Web (WWW) technology, has been built to meet the needs of a large integrated delivery network (IDN). Called PartnerWeb, this resource is intended to provide a variety of health care and reference information to both practitioners and consumers/patients. The initial target audience has been providers. Content management for the numerous departments, divisions, and other organizational entities within the IDN is accomplished by a distributed authoring and editing environment. Structured entry into databases using a set of form tools facilitates consistency of information presentation, while empowering designated authors and editors in the various entities to be responsible for their own materials without requiring them to be technically skilled. Each form tool manages an encapsulated component. The output of each component can be a dynamically generated display on WWW platforms, or an appropriate interface to other presentation environments. The PartnerWeb project lays the foundation for both an internal and external communication infrastructure for the enterprise that can facilitate information dissemination. PMID:9357648
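The form-tool/component pattern described can be sketched as follows. All names here are hypothetical (PartnerWeb's actual implementation is not described in the abstract); the sketch only shows the core idea: a form tool validates structured entry into a store, and the display is generated from the stored fields, so presentation stays consistent regardless of who authored the content.

```python
# Hypothetical sketch of a form tool managing one encapsulated component.

class FormTool:
    def __init__(self, component, fields):
        self.component = component
        self.fields = fields   # required field names for structured entry
        self.store = []        # stands in for the component's database

    def submit(self, entry):
        missing = [f for f in self.fields if f not in entry]
        if missing:
            raise ValueError(f"missing fields: {missing}")
        self.store.append(entry)

    def render(self):
        # Dynamically generated, uniformly formatted display of the component.
        return [" | ".join(str(e[f]) for f in self.fields) for e in self.store]

staff = FormTool("staff-directory", ["name", "division", "phone"])
staff.submit({"name": "A. Editor", "division": "Radiology", "phone": "x1234"})
print(staff.render())  # ['A. Editor | Radiology | x1234']
```

Because authors only fill in fields, presentation consistency is enforced by the renderer rather than by the authors' technical skill, which is the division of labour the abstract emphasises.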
Mission Options Scoping Tool for Mars Orbiters: Mass Cost Calculator (MC2)
NASA Technical Reports Server (NTRS)
Sturm, Eric J., II; Deutsch, Marie-Jose; Harmon, Corey; Nakagawa, Roy; Kinsey, Robert; Lopez, Nino; Kudrle, Paul; Evans, Alex
2007-01-01
Prior to developing the details of an advanced mission study, the mission architecture trade space is typically explored to assess the scope of feasible options. This paper describes the main features of an Excel-based tool, called the Mass-Cost-Calculator (MC2), which is used to perform rapid, high-level mass and cost options analyses of Mars orbiter missions. MC2 consists of a combination of databases, analytical solutions, and parametric relationships to enable quick evaluation of new mission concepts and comparison of multiple architecture options. The tool's outputs provide program management and planning teams with answers to "what if" queries, as well as an understanding of the driving mission elements, during the pre-project planning phase. These outputs have been validated against the outputs generated by the Advanced Projects Design Team (Team X) at NASA's Jet Propulsion Laboratory (JPL). The architecture of the tool allows for future expansion to other orbiters beyond Mars, and to non-orbiter missions, such as those involving fly-by spacecraft, probes, landers, rovers, or other mission elements.
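The kind of parametric roll-up such a tool performs can be sketched as follows. Every number here is illustrative (assumed mass fractions and an assumed power-law cost-estimating relationship), not values from MC2's actual databases:

```python
# Hedged sketch of a mass-cost roll-up: payload mass fixes dry mass via an
# assumed subsystem mass-fraction budget; cost follows a power-law CER.

SUBSYSTEM_MASS_FRACTIONS = {        # fraction of spacecraft dry mass (assumed)
    "structure": 0.25, "power": 0.20, "propulsion": 0.15,
    "avionics": 0.10, "thermal": 0.05, "payload": 0.25,
}

def dry_mass_from_payload(payload_kg):
    """Scale dry mass so the payload fraction matches the assumed 25%."""
    return payload_kg / SUBSYSTEM_MASS_FRACTIONS["payload"]

def first_order_cost(dry_mass_kg, coeff=1.2, exponent=0.55):
    """Illustrative cost-estimating relationship: $M = coeff * mass^exponent."""
    return coeff * dry_mass_kg ** exponent

dry = dry_mass_from_payload(100.0)   # 100 kg payload -> 400 kg dry mass
print(round(dry, 1), "kg dry,", round(first_order_cost(dry), 1), "$M (rough)")
```

Real tools of this type chain many such relationships (launch margin, propellant for the capture burn, cost wraps), but each link is the same shape: a database lookup or a fitted parametric curve.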
Elger, William R
2006-04-01
Responding to changing trends in how the University of Michigan Medical School (UMMS) has traditionally been financed, and anticipating that these trends will continue, in 2002 the executive leadership at the UMMS embarked upon a course designed to change not only the school's financial structure but its management culture as well. Changing traditional ways of thinking about budgets and developing a set of key performance indicators that demonstrate how certain activities shape the use of resources has brought a greater understanding of how to optimize those resources. Through internally developed Web-based software applications called M-STAT, M-DASH and M-ALERT (strategic reporting tools that the author describes), the UMMS can now manage resources in a completely different way. These tools are used to spot general financial trends or examine a more specific financial element (such as trends in grant funding or clinical activity), track the utilization of research space, calculate the break-even cost of research space, and, most important, model various "what-if" scenarios to help plan effectively for the future needs of the UMMS. The strategic reporting system is still being integrated throughout the UMMS, so there has not yet been time to measure the system's efficacy or its shortcomings. Nevertheless, important lessons have already been learned, which the author presents.
Measuring the strategic readiness of intangible assets.
Kaplan, Robert S; Norton, David P
2004-02-01
Measuring the value of intangible assets such as company culture, knowledge management systems, and employees' skills is the holy grail of accounting. Executives know that these intangibles, being hard to imitate, are powerful sources of sustainable competitive advantage. If managers could measure them, they could manage the company's competitive position more easily and accurately. In one sense, the challenge is impossible. Intangible assets are unlike financial and physical resources in that their value depends on how well they serve the organizations that own them. But while this prevents an independent valuation of intangible assets, it also points to an altogether different approach for assessing their worth. In this article, the creators of the Balanced Scorecard draw on its tools and framework--in particular, a tool called the strategy map--to present a step-by-step way to determine "strategic readiness," which refers to the alignment of an organization's human, information, and organization capital with its strategy. In the method the authors describe, the firm identifies the processes most critical to creating and delivering its value proposition and determines the human, information, and organization capital the processes require. Some managers shy away from measuring intangible assets because they seem so subjective. But by using the systematic approaches set out in this article, companies can now measure what they want, rather than wanting only what they can currently measure.
NASA Technical Reports Server (NTRS)
Muniz, R.; Hochstadt, J.; Boelke J.; Dalton, A.
2011-01-01
The Content Documents are created and managed under the System Software group within the Launch Control System (LCS) project. The System Software product group is led by the NASA Engineering Control and Data Systems branch (NEC3) at Kennedy Space Center. The team is working on creating Operating System Images (OSI) for different platforms (i.e., AIX, Linux, Solaris and Windows). Before an OSI can be created, the team must create a Content Document, which provides the information for a workstation or server, with the list of all the software to be installed on it and the set to which the hardware belongs; this can be, for example, the LDS, the ADS or the FR-l. The objective of this project is to create a user-interface Web application that can manage the information in the Content Documents, with all the correct validations and filters for administrator purposes. For this project we used one of the best-known tools for agile Web development, Ruby on Rails, which helps pragmatic programmers build Web applications with the Rails framework and the Ruby programming language. It is remarkable how much a student can learn along the way: OOP features of the Ruby language, user-interface work with HTML and CSS, associations and queries with gems, database management and running a server with MySQL, shell commands at the command prompt, and Web frameworks with Rails. All of this in a real-world project, and in just fifteen weeks!
Nohria, Nitin; Joyce, William; Roberson, Bruce
2003-07-01
When it comes to improving business performance, managers have no shortage of tools and techniques to choose from. But what really works? What's critical, and what's optional? Two business professors and a former McKinsey consultant set out to answer those questions. In a ground-breaking, five-year study that involved more than 50 academics and consultants, the authors analyzed 200 management techniques as they were employed by 160 companies over ten years. Their findings at a high level? Business basics really matter. In this article, the authors outline the management practices that are imperative for sustained superior financial performance--their "4+2 formula" for business success. They provide examples of companies that achieved varying degrees of success depending on whether they applied the formula, and they suggest ways that other companies can achieve excellence. The 160 companies in their study--called the Evergreen Project--were divided into 40 quads, each comprising four companies in a narrowly defined industry. Based on its performance between 1986 and 1996, each company in each quad was classified as either a winner (for instance, Dollar General), a loser (Kmart), a climber (Target), or a tumbler (the Limited). Without exception, the companies that outperformed their industry peers excelled in what the authors call the four primary management practices: strategy, execution, culture, and structure. And they supplemented their great skill in those areas with a mastery of any two of four secondary management practices: talent, leadership, innovation, and mergers and partnerships. A company that consistently follows this 4+2 formula has a better than 90% chance of sustaining superior performance, according to the authors.
Weak signal detection: A discrete window of opportunity for achieving 'Vision 90:90:90'?
Burman, Christopher J; Aphane, Marota; Delobelle, Peter
2016-01-01
UNAIDS' Vision 90:90:90 is a call to 'end AIDS'. Developing predictive foresight of the unpredictable changes that this journey will entail could contribute to the ambition of 'ending AIDS'. There are few opportunities for managing unpredictable changes. We introduce 'weak signal detection' as a potential opportunity to fill this void. Combining futures and complexity theory, we reflect on two pilot case studies that involved the Archetype Extraction technique and the SenseMaker(®) Collector(™) tool. Both piloted techniques have the potential to surface weak signals--but there is room for improvement. A management response to a complex weak signal requires pattern management, rather than an exclusive focus on behaviour management. Weak signal detection is a window of opportunity to improve resilience to unpredictable changes in the HIV/AIDS landscape: it can both reduce the risk that emerges from those changes and increase the visibility of opportunities to exploit them in ways that could contribute to 'ending AIDS'.
ERIC Educational Resources Information Center
Heys, Chris
2008-01-01
Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We will look at two such tools, the Excel modules called Solver and Goal Seek, after first deriving an equation, called the "cash accumulation equation," to be used in conjunction with them.
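One common form of a cash accumulation equation is the future value of a stream of equal deposits, FV = PMT * ((1+i)^n - 1)/i; the article's exact derivation may differ. The sketch below pairs that formula with a bisection search standing in for Excel's Goal Seek (which likewise drives one cell to a target by varying another):

```python
# Cash accumulation (future value of n deposits of pmt at periodic rate i),
# plus a Goal Seek-style solve for the rate that hits a target balance.

def accumulated(pmt, i, n):
    return pmt * ((1 + i) ** n - 1) / i

def goal_seek_rate(pmt, n, target, lo=1e-9, hi=1.0, tol=1e-10):
    """Bisection: accumulated() is increasing in i, so halve the bracket."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if accumulated(pmt, mid, n) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# 120 monthly deposits of 100 growing to 18,000 -> solve for the monthly rate.
rate = goal_seek_rate(100, 120, 18_000)
print(round(rate * 12 * 100, 2), "% nominal annual rate")
```

Solver generalizes the same idea to many changing cells with constraints; Goal Seek, like the bisection above, varies a single input to reach a single target.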
Chemical annotation of small and peptide-like molecules at the Protein Data Bank
Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.
2013-01-01
Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661
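The core move in dictionary-driven annotation is a lookup that splits incoming components into already-known entries and a worklist needing curation. The sketch below is illustrative only: toy entries and a toy schema, not the actual Chemical Component Dictionary format.

```python
# Hedged sketch of dictionary-driven component annotation: known ids get
# their reference annotation attached; unknown ids go to a review worklist.

CHEMICAL_COMPONENT_DICT = {
    "HEM": {"name": "Protoporphyrin IX containing Fe", "formula": "C34 H32 Fe N4 O4"},
    "GOL": {"name": "Glycerol", "formula": "C3 H8 O3"},
}

def annotate(component_ids):
    """Split ids into known annotations and a worklist for manual review."""
    known, review = {}, []
    for cid in component_ids:
        entry = CHEMICAL_COMPONENT_DICT.get(cid)
        if entry is not None:
            known[cid] = entry
        else:
            review.append(cid)
    return known, review

known, review = annotate(["GOL", "XYZ"])
print(sorted(known), review)  # ['GOL'] ['XYZ']
```

The efficiency gain the abstract reports comes from exactly this separation: annotators spend time only on the review worklist, while dictionary matches are processed automatically.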
eFarm: A Tool for Better Observing Agricultural Land Systems
Yu, Qiangyi; Shi, Yun; Tang, Huajun; Yang, Peng; Xie, Ankun; Liu, Bin; Wu, Wenbin
2017-01-01
Currently, observations of an agricultural land system (ALS) largely depend on remotely-sensed images, focusing on its biophysical features. While social surveys capture the socioeconomic features, this information is rarely integrated with the biophysical features of an ALS, and applications are limited by the cost and effort of carrying out detailed, comparable social surveys over large spatial coverage. In this paper, we introduce a smartphone-based app, called eFarm: a crowdsourcing and human-sensing tool to collect geotagged ALS information at the land parcel level, based on high-resolution remotely-sensed images. We illustrate its main functionalities, including map visualization, data management, and data sensing. Results of the trial test suggest the system works well. We believe the tool is able to acquire human-land integrated information which is broadly covered and timely updated, thus presenting great potential for improving the sensing, mapping, and modeling of ALS studies. PMID:28245554
NASA Astrophysics Data System (ADS)
Pedelì, C.
2013-07-01
In order to make the most of outsourced digital documents based on new technologies (e.g., 3D laser scanners, photogrammetry, etc.), a new approach was followed and a new ad hoc information system was implemented. The resulting product allows the final user to reuse and manage the digital documents, providing graphic tools and an integrated, purpose-built database to manage the entire documentation and conservation process, from the condition assessment to the conservation/restoration work. The system is organised in two main modules: Archaeology and Conservation. This paper focuses on the features and advantages of the second one. In particular, it emphasises the module's logical organisation, the possibility of easily mapping on a very precise 3D metric platform, and the benefit of the integrated relational database, which makes it possible to organise, compare, keep and manage different kinds of information at different levels. The Conservation module can manage, over time, the conservation process of a site, monument, object or excavation, as well as conservation work in progress. An alternative approach, called OVO by the author of this paper, forces the surveyor to observe and describe the entity by decomposing it into functional components, materials and construction techniques. Some integrated tools, such as the "ICOMOS-ISCS Illustrated glossary ...", help the user describe pathologies with a unified approach and terminology. The conservation project phase is also strongly supported, to envision future interventions and costs. A final section is devoted to recording the conservation/restoration work already done or in progress. All information areas of the Conservation module are interconnected to allow the system a complete interchange of graphic and alphanumeric data. The Conservation module itself is connected to the archaeological one, creating an interdisciplinary daily tool.
Calvo-Lerma, Joaquim; Martinez-Jimenez, Celia P; Lázaro-Ramos, Juan-Pablo; Andrés, Ana; Crespo-Escobar, Paula; Stav, Erlend; Schauber, Cornelia; Pannese, Lucia; Hulst, Jessie M; Suárez, Lucrecia; Colombo, Carla; Barreto, Celeste; de Boeck, Kris; Ribes-Koninckx, Carmen
2017-01-01
Introduction: For the optimal management of children with cystic fibrosis, there are currently no efficient tools for the precise adjustment of pancreatic enzyme replacement therapy, either for advice on appropriate dietary intake or for achieving an optimal nutrition status. Therefore, we aim to develop a mobile application that ensures a successful nutritional therapy in children with cystic fibrosis. Methods and analysis: A multidisciplinary team of 12 partners coordinate their efforts in 9 work packages that cover the entire so-called ‘from laboratory to market’ approach by means of an original and innovative co-design process. A cohort of 200 patients with cystic fibrosis aged 1–17 years are enrolled. We will develop an innovative, clinically tested mobile health application for patients and health professionals involved in cystic fibrosis management. The mobile application integrates the research knowledge and innovative tools for maximising self-management, with the aim of leading to a better nutritional status, quality of life and disease prognosis. Bringing together different and complementary areas of knowledge is fundamental for tackling complex challenges in disease treatment, such as optimal nutrition and pancreatic enzyme replacement therapy in cystic fibrosis. Patients are expected to benefit the most from the outcomes of this innovative project. Ethics and dissemination: The project is approved by the Ethics Committee of the coordinating organisation, Hospital Universitari La Fe (Ref: 2014/0484). Scientific findings will be disseminated via journals and conferences addressed to clinicians, food scientists, information and communications technology experts and patients. The specific dissemination working group within the project will address the wide audience communication through the website (http://www.mycyfapp.eu), the social networks and the newsletter. PMID:28302638
Heisler, Michele; Mase, Rebecca; Brown, Brianne; Wilson, Shayla; Reeves, Pamela J.
2017-01-01
Background: Racial and ethnic minority adults with diabetes living in under-resourced communities face multiple barriers to sustaining the self-management behaviors necessary to improve diabetes outcomes. Peer support and decision support tools have each been associated with improved diabetes outcomes. Methods: 289 primarily African American adults with poor glycemic control will be recruited from the Detroit Veteran’s Administration Hospital and randomized to Technology-Enhanced Coaching (TEC) or Peer Coaching alone. Participants in both arms will be assigned a peer coach trained in autonomy-supportive approaches. Coaches are diabetes patients with prior poor glycemic control who now have good control. All participants meet face-to-face initially with their coach to review diabetes education materials and develop an action plan. Educational materials in the TEC arm are delivered via a web-based educational tool tailored with each participant’s personalized health data (iDecide). Over the next six months, coaches call their assigned participants once a week to provide support for weekly action steps. Data are also collected on an observational control group with no contact with study staff. Changes in A1c, blood pressure, other patient-centered outcomes, and mediators and moderators of intervention effects will be assessed. Discussion: Tailored e-Health tools with educational content may enhance the effectiveness of peer coaching programs to better prepare patients to set self-management goals, identify action plans, and discuss treatment options with their health care providers. The study will provide insights for scalable self-management support programs for diabetes and other chronic illnesses that require high levels of sustained patient self-management. PMID:28132876
CampusGIS of the University of Cologne: a tool for orientation, navigation, and management
NASA Astrophysics Data System (ADS)
Baaser, U.; Gnyp, M. L.; Hennig, S.; Hoffmeister, D.; Köhn, N.; Laudien, R.; Bareth, G.
2006-10-01
The working group for GIS and Remote Sensing at the Department of Geography at the University of Cologne has established a WebGIS called CampusGIS of the University of Cologne. The overall task of the CampusGIS is the connection of several existing databases at the University of Cologne with spatial data. These existing databases comprise data about staff, buildings, rooms, lectures, and general infrastructure such as bus stops. This information was not yet linked to its spatial location. Therefore, a GIS-based method was developed to link all the different databases to spatial entities. Following the philosophy of the CampusGIS, an online GUI was programmed which enables users to search for staff, buildings, or institutions. The query results are linked to the GIS database, which allows the visualization of the spatial location of the searched entity. This system was established in 2005 and has been operational since early 2006. In this contribution, the focus is on further developments. First results are presented of (i) including routing services, (ii) programming GUIs for mobile devices, and (iii) including infrastructure management tools in the CampusGIS. Consequently, the CampusGIS is not only available for spatial information retrieval and orientation; it also serves for on-campus navigation and administrative management.
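The central operation, linking non-spatial records to spatial entities, can be sketched as a simple join between an attribute table and a geometry table. Names and coordinates below are illustrative placeholders, not CampusGIS data:

```python
# Minimal sketch of the CampusGIS linking idea: a non-spatial staff record
# is joined to building coordinates so a query result can be mapped.

buildings = {                        # toy geometry table (lat, lon assumed)
    "Geography": (50.928, 6.929),
    "Library":   (50.927, 6.927),
}
staff = [                            # toy attribute table
    {"name": "A. Example", "building": "Geography"},
    {"name": "B. Example", "building": "Library"},
]

def locate(query_name):
    """Return (record, coordinates) for the first matching staff member."""
    for person in staff:
        if person["name"] == query_name:
            return person, buildings[person["building"]]
    return None, None

record, coords = locate("A. Example")
print(record["building"], coords)  # Geography (50.928, 6.929)
```

In the real system the attribute side is several independent institutional databases and the geometry side is a GIS layer, but the join key (here, the building name) plays the same role.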
Southern African Office of Astronomy for Development: A New Hub for Astronomy for Development
NASA Astrophysics Data System (ADS)
Mutondo, Moola S.; Simpemba, Prospery
2016-10-01
A new Astronomy for Development hub needs innovative tools and programs. SAROAD is developing exciting tools integrating Raspberry Pi technology to bring cost-effective astronomy content to learning centres. SAROAD would also like to report achievements in realizing the IAU's strategic plan. In order to manage, evaluate and coordinate regional IAU (International Astronomical Union) capacity building programmes, including the recruitment and mobilization of volunteers, SAROAD has built an intranet that is accessible to regional members upon request. Using this resource, regional members can see and participate in regional activities. SAROAD has commenced with projects in the three Task Force areas of Universities and Research, Children and Schools and Public Outreach. Under the three Task Force areas, a total of seven projects have commenced in Zambia (some supported by funds from IAU Annual Call for proposals).
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
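The IDL idea, describing an interface once and generating per-language bindings from it, can be illustrated generically. This sketch is not Babel's actual SIDL syntax or code generator; it just shows an interface-as-data description turned into a stub class that a language backend would later bind:

```python
# Generic IDL-style stub generation: an interface described as data becomes
# a class whose methods raise until a concrete backend implements them.

INTERFACE = {
    "name": "Vector",
    "methods": {"dot": ["other"], "scale": ["factor"]},
}

def make_stub(spec):
    """Build a stub class from an interface description."""
    def missing(method):
        def stub(self, *args):
            raise NotImplementedError(f"{spec['name']}.{method} not bound")
        return stub
    return type(spec["name"], (), {m: missing(m) for m in spec["methods"]})

Vector = make_stub(INTERFACE)
v = Vector()
try:
    v.dot(None)
except NotImplementedError as e:
    print(e)  # Vector.dot not bound
```

A real IDL toolchain emits such stubs (plus marshalling glue) for each supported language, which is what makes the library callable from C, Fortran, Python, and so on without hand-written wrappers.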
Design, implementation and migration of security systems as an extreme project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scharmer, Carol; Trujillo, David
2010-08-01
Decision trees, algorithms, software code, risk management, reports, plans, drawings, change control, presentations, and analysis - all useful tools and efforts, but time consuming, resource intensive, and potentially costly for projects that have absolute schedule and budget constraints. What are necessary and prudent efforts when a customer calls with a major security problem that needs to be fixed with a proven, off-the-approval-list, multi-layered integrated system with high visibility and limited funding that expires at the end of the fiscal year? Whether driven by budget cycles, safety, or management decree, many such projects begin with generic scopes and funding allocated based on a rapid management 'guestimate.' Then a Project Manager (PM) is assigned a project with a predefined and potentially limited scope, compressed schedule, and potentially insufficient funding. The PM is tasked to rapidly and cost-effectively coordinate a requirements-based design, implementation, test, and turnover of a fully operational system to the customer, all while the customer is operating and maintaining an existing security system. Many project management manuals call this an impossible project that should not be attempted. However, security is serious business, and the reality is that rapid deployment of proven systems via an 'Extreme Project' is sometimes necessary. Extreme Projects can be wildly successful but require a dedicated team of security professionals led by an experienced project manager using a highly tailored and agile project management process, with management support at all levels, all combined with significant interface with the customer. This paper does not advocate such projects or condone eliminating valuable analysis and project management techniques. Indeed, having worked on a well-planned project provides the basis for experienced team members to complete Extreme Projects. This paper does, however, provide insight into what it takes for projects to be successfully implemented and accepted when completed under extreme conditions.
Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin
This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as the resources for creating the topical ontologies in this paper. The methodology is implemented in a tool suite called PAD-ON. The paper's approach is illustrated with a use case from elderly care homes in the UK.
Finding the sweet spot: how to get the right staffing for variable workloads.
Bryce, David J; Christensen, Taylor J
2011-03-01
All too often, hospital department managers set their staff schedules too much in anticipation of high levels of demand for services, leading to higher-than-necessary staffing costs when demand is lower than expected. The opposite approach of scheduling too few staff to meet demand, then relying on on-call or callback staff to address the shortage, also results in higher-than-necessary costs due to the premium wages that such staff must be paid. A staffing and workload simulation tool allows hospital departments to find the right balance between these extremes.
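The trade-off described above, scheduled staff cost money whether or not demand materializes, while shortfalls are covered at premium on-call wages, can be explored with a small Monte Carlo simulation. All parameters here are illustrative assumptions, not figures from the article:

```python
# Monte Carlo sketch of the staffing sweet spot: cost = base wages for all
# scheduled staff plus premium on-call wages for any demand shortfall.

import random

BASE_WAGE, ONCALL_PREMIUM = 40.0, 70.0   # assumed cost per unit of capacity

def expected_cost(scheduled, trials=20_000, seed=7):
    rng = random.Random(seed)            # fixed seed for reproducibility
    total = 0.0
    for _ in range(trials):
        demand = rng.gauss(mu=10, sigma=3)            # stochastic workload
        shortfall = max(0.0, demand - scheduled)
        total += scheduled * BASE_WAGE + shortfall * ONCALL_PREMIUM
    return total / trials

best = min(range(5, 20), key=expected_cost)
print(best)  # the cost-minimizing staffing level under these assumptions
```

This is the classic newsvendor structure: the optimum sits below peak demand but above the mean shortage-free point, exactly the "sweet spot" between overstaffing and over-reliance on callbacks.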
2007-11-01
Engineering Research Laboratory is currently developing a set of facility 'architectural' programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility "architectural" programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-13
..., National Institutes of Health; Notice of a Conference Call of the NIH Scientific Management Review Board... hereby given of a conference call meeting of the Scientific Management Review Board. The NIH Reform Act... such units, or establishing or terminating such units. The purpose of the Scientific Management Review...
Spatially dynamic forest management to sustain biodiversity and economic returns.
Mönkkönen, Mikko; Juutinen, Artti; Mazziotta, Adriano; Miettinen, Kaisa; Podkopaev, Dmitry; Reunanen, Pasi; Salminen, Hannu; Tikkanen, Olli-Pekka
2014-02-15
Production of marketed commodities and protection of biodiversity in natural systems often conflict, and thus the continuously expanding human need for more goods and benefits from global ecosystems urgently calls for strategies to resolve this conflict. In this paper, we addressed the potential of a forest landscape to simultaneously produce habitats for species and economic returns, and how the conflict between habitat availability and timber production varies among taxa. Secondly, we aimed to reveal an optimal combination of management regimes that maximizes habitat availability for given levels of economic returns. We used multi-objective optimization tools to analyze data from a boreal forest landscape consisting of about 30,000 forest stands simulated 50 years into the future. We included seven alternative management regimes, spanning from the recommended intensive forest management regime to complete set-aside of stands (protection), and ten taxa representing a wide variety of habitat associations and social values. Our results demonstrate that it is possible to achieve large improvements in habitat availability with little loss in economic returns. In general, providing dead-wood-associated species with more habitat tended to be more expensive than providing the requirements of other species. No management regime alone maximized habitat availability for the species, and systematic use of any single management regime resulted in considerable reductions in economic returns. Compared with an optimal combination of management regimes, consistent application of the recommended management regime would result in a 5% reduction in economic returns and up to a 270% reduction in habitat availability. Thus, for all taxa a combination of management regimes was required to achieve the optimum.
Refraining from silvicultural thinnings on a proportion of stands should be considered a cost-effective management option in commercial forests to reconcile the conflict between economic returns and the habitat required by species associated with dead wood. In general, a viable strategy to maintain biodiversity in production landscapes would be to diversify management regimes. Our results emphasize the importance of careful landscape-level forest management planning, because optimal combinations of management regimes were taxon-specific. For cost-efficiency, the results call for balanced and correctly targeted strategies among habitat types.
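The multi-objective comparison underlying this kind of analysis can be sketched as a Pareto filter over candidate regime mixes. The candidates and their (economic return, habitat availability) scores below are invented for illustration, not the study's data:

```python
# Toy Pareto filter: keep a regime mix only if no other mix is at least as
# good on both objectives and strictly better on one.

candidates = {                       # (economic return, habitat) -- assumed
    "intensive only":   (100.0, 30.0),
    "recommended only": (95.0, 40.0),
    "mixed regimes":    (95.0, 85.0),
    "set-aside only":   (10.0, 100.0),
}

def pareto_front(options):
    front = {}
    for name, (ret, hab) in options.items():
        dominated = any(
            r >= ret and h >= hab and (r > ret or h > hab)
            for n, (r, h) in options.items() if n != name
        )
        if not dominated:
            front[name] = (ret, hab)
    return front

print(sorted(pareto_front(candidates)))
# ['intensive only', 'mixed regimes', 'set-aside only']
```

In the toy data, "recommended only" drops out because the mixed strategy matches its return while providing far more habitat, which mirrors the paper's finding that diversifying regimes dominates any single regime.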
A Practical Guide to the Technology and Adoption of Software Process Automation
1994-03-01
IDE’s integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.
2004-01-01
The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. 
Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for endangered species, and optimizing operations within the constraints of multiple objectives such as power generation, irrigation, and water conservation. This decision support system approach is being developed, tested, and implemented in the Gunnison, Yakima, San Juan, Rio Grande, and Truckee River basins of the western United States. Copyright ASCE 2004.
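The database-centred coupling of MMS and RiverWare can be sketched, in a highly simplified form, as two model components exchanging state through a shared relational database. The table and column names below are invented for illustration; the actual shared schema used by the Watershed and River System Management Program differs.

```python
import sqlite3

# One model writes simulated inflows to a shared table; the other reads
# them back to make an operations decision.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE inflows (day INTEGER PRIMARY KEY, cfs REAL)")

def hydrology_step(day, cfs):
    """Hydrology-model side: publish one day of simulated inflow."""
    db.execute("INSERT INTO inflows VALUES (?, ?)", (day, cfs))
    db.commit()

def reservoir_release(day, demand_cfs):
    """Reservoir-model side: release the lesser of inflow and demand."""
    (cfs,) = db.execute("SELECT cfs FROM inflows WHERE day = ?", (day,)).fetchone()
    return min(cfs, demand_cfs)

hydrology_step(1, 1200.0)
release = reservoir_release(1, 900.0)
```

Routing all exchange through the database, rather than direct model-to-model calls, is what lets additional tools (visualization, statistics, decision support) plug into the same state.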
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garnett, Kenisha, E-mail: k.garnett@cranfield.ac.uk; Cooper, Tim, E-mail: t.h.cooper@ntu.ac.uk
2014-12-15
Highlights: • A review of public engagement in waste management decision-making is undertaken. • Enhanced public engagement is explored as a means to legitimise waste decisions. • Analytical–deliberative processes are explored as a tool for effective dialogue. • Considerations for integrating public values with technical analysis are outlined. • Insights into the design of appropriate public engagement processes are provided. - Abstract: The complexity of municipal waste management decision-making has increased in recent years, accompanied by growing scrutiny from stakeholders, including local communities. This complexity reflects a socio-technical framing of the risks and social impacts associated with selecting technologies and sites for waste treatment and disposal facilities. Consequently there is growing pressure on local authorities for stakeholders (including communities) to be given an early opportunity to shape local waste policy in order to encourage swift planning, development and acceptance of the technologies needed to meet statutory targets to divert waste from landfill. This paper presents findings from a research project that explored the use of analytical–deliberative processes as a legitimising tool for waste management decision-making. Adopting a mixed methods approach, the study revealed that communicating the practical benefits of more inclusive forms of engagement is proving difficult even though planning and policy delays are hindering development and implementation of waste management infrastructure. Adopting analytical–deliberative processes at a more strategic level will require local authorities and practitioners to demonstrate how expert-citizen deliberations may foster progress in resolving controversial issues, through change in individuals, communities and institutions. The findings suggest that a significant shift in culture will be necessary for local authorities to realise the potential of more inclusive decision processes.
This calls for political actors and civic society to collaborate in institutionalising public involvement in both strategic and local planning structures.
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits.
It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and explore variability.
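A client-side sketch of the WPS-style request described above might look as follows. The endpoint URL, kernel identifier, and input encoding here are assumptions for illustration, not the actual CDAS API; real WPS 1.0.0 Execute requests use this key-value-pair pattern, but CDAS's concrete parameter conventions may differ.

```python
import json
from urllib.parse import urlencode

def build_execute_url(base_url, identifier, inputs):
    """Encode a hypothetical WPS-style Execute request as a GET URL."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": json.dumps(inputs),  # inputs serialized as JSON
    }
    return base_url + "?" + urlencode(params)

url = build_execute_url(
    "https://example.gov/cdas",  # hypothetical endpoint
    "averager",                  # hypothetical kernel name
    {"variable": "tas", "domain": {"lat": [-90, 90], "lon": [0, 360]}},
)
```

The same request could equally be issued from the python or javascript client packages the abstract mentions; only the URL construction is shown here.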
Brodaty, Henry; Low, Lee-Fay; Liu, Zhixin; Fletcher, Jennifer; Roast, Joel; Goodenough, Belinda; Chenoweth, Lynn
2014-12-01
To test the hypothesis that individual and institutional-level factors influence the effects of a humor therapy intervention on aged care residents. Data were from the humor therapy group of the Sydney Multisite Intervention of LaughterBosses and ElderClowns, or SMILE, study, a single-blind cluster randomized controlled trial of humor therapy conducted over 12 weeks; assessments were performed at baseline, week 13, and week 26. One hundred eighty-nine individuals from 17 Sydney residential aged care facilities were randomly allocated to the humor therapy intervention. Professional performers called "ElderClowns" provided 9-12 weekly 2-hour humor therapy sessions, augmented by trained staff, called "LaughterBosses." Outcome measures were as follows: Cornell Scale for Depression in Dementia, Cohen-Mansfield Agitation Inventory, Neuropsychiatric Inventory, the withdrawal subscale of the Multidimensional Observation Scale for Elderly Subjects, and the proxy-rated quality of life in dementia population scale. Facility-level measures were as follows: support of the management for the intervention, commitment levels of LaughterBosses, Environmental Audit Tool scores, and facility level of care provided (high/low). Resident-level measures were engagement, functional ability, disease severity, and time-in-care. Multilevel path analyses simultaneously modeled resident engagement at the individual level (repeated measures) and the effects of management support and staff commitment to humor therapy at the cluster level. Models indicated flow-on effects, whereby management support had positive effects on LaughterBoss commitment, and LaughterBoss commitment increased resident engagement. Higher resident engagement was associated with reduced depression, agitation, and neuropsychiatric scores. Effectiveness of psychosocial programs in residential aged care can be enhanced by management support, staff commitment, and active resident engagement.
Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
Energy Data Management Manual for the Wastewater Treatment Sector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lemar, Paul; De Fontaine, Andre
Energy efficiency has become a higher priority within the wastewater treatment sector, with facility operators and state and local governments ramping up efforts to reduce energy costs and improve environmental performance. Across the country, municipal wastewater treatment plants are estimated to consume more than 30 terawatt hours per year of electricity, which equates to about $2 billion in annual electric costs. Electricity alone can constitute 25% to 40% of a wastewater treatment plant’s annual operating budget and make up a significant portion of a given municipality’s total energy bill. These energy needs are expected to grow over time, driven by population growth and increasingly stringent water quality requirements. The purpose of this document is to describe the benefits of energy data management, explain how it can help drive savings when linked to a strong energy management program, and provide clear, step-by-step guidance to wastewater treatment plants on how to appropriately track energy performance. It covers the basics of energy data management and related concepts and describes different options for key steps, recognizing that a single approach may not work for all agencies. Wherever possible, the document calls out simpler, less time-intensive approaches to help smaller plants with more limited resources measure and track energy performance. Reviews of key, publicly available energy-tracking tools are provided to help organizations select a tool that makes the most sense for them. Finally, this document describes additional steps wastewater treatment plant operators can take to build on their energy data management systems and further accelerate energy savings.
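As a sketch of the normalized tracking such a manual recommends, the snippet below computes a common wastewater benchmarking metric, energy intensity in kWh per million gallons (MG) treated, so that months with different flows can be compared. The monthly figures are invented for illustration.

```python
# Invented monthly records for a hypothetical plant.
monthly = [
    {"month": "Jan", "kwh": 210_000, "mg_treated": 95.0},
    {"month": "Feb", "kwh": 188_000, "mg_treated": 82.0},
    {"month": "Mar", "kwh": 205_000, "mg_treated": 101.0},
]

def energy_intensity(rec):
    """kWh per million gallons treated for one month's record."""
    return rec["kwh"] / rec["mg_treated"]

# A simple baseline against which future months can be benchmarked.
baseline = sum(energy_intensity(r) for r in monthly) / len(monthly)
```

Tracking intensity rather than raw consumption keeps a wet month with high flows from masquerading as an efficiency loss.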
Mobile Health Devices as Tools for Worldwide Cardiovascular Risk Reduction and Disease Management
Piette, John D.; List, Justin; Rana, Gurpreet K.; Townsend, Whitney; Striplin, Dana; Heisler, Michele
2015-01-01
We examined evidence on whether mobile health (mHealth) tools, including Interactive Voice Response (IVR) calls, short message service (SMS) or text messaging, and smartphones, can improve lifestyle behaviors and management related to cardiovascular diseases throughout the world. We conducted a state-of-the-art review and literature synthesis of peer-reviewed and grey literature published since 2004. The review prioritized randomized trials and studies focused on cardiovascular diseases and risk factors, but included other reports when they represented the best available evidence. The search emphasized reports on the potential benefits of mHealth interventions implemented in low- and middle-income countries (LMICs). IVR and SMS interventions can improve cardiovascular preventive care in developed countries by addressing risk factors including weight, smoking, and physical activity. IVR and SMS-based interventions for cardiovascular disease management also have shown benefits with respect to hypertension management, hospital readmissions, and diabetic glycemic control. Multi-modal interventions including web-based communication with clinicians and mHealth-enabled clinical monitoring with feedback also have shown benefits. The evidence regarding the potential benefits of interventions using smartphones and social media is still developing. Studies of mHealth interventions have been conducted in more than 30 LMICs, and evidence to date suggests that programs are feasible and may improve medication adherence and disease outcomes. Emerging evidence suggests that mHealth interventions may improve cardiovascular-related lifestyle behaviors and disease management. Next generation mHealth programs developed worldwide should be based on evidence-based behavioral theories and incorporate advances in artificial intelligence for adapting systems automatically to patients’ unique and changing needs. PMID:25690685
Chronic disease management systems registries in rural health care.
Skinner, Anne; Fraser-Maginn, Roslyn; Mueller, Keith J
2006-05-01
Health care quality is being addressed from a variety of policy perspectives. The 2001 Institute of Medicine report, Crossing the Quality Chasm, calls for sweeping action involving a five-part strategy for change in the U.S. health care system. This agenda for change includes use of evidence-based approaches to address common conditions, the majority of which are chronic. A Chronic Disease Management System (CDMS), or registry, is a tool that helps providers efficiently collect and analyze patient information to promote quality care for the rural population. CDMSs can provide a technological entry point for the impending use of Electronic Medical Records. A CDMS is a patient-centered electronic database tool that helps providers diagnose, treat, and manage chronic diseases. The purpose of this brief is to discuss the different types of CDMSs used by a sample of 14 state organizations and 19 local rural clinics in Maine, Nebraska, New Mexico, South Carolina, Washington, and Wisconsin. As part of a larger study examining the challenges and innovations in implementing disease management programs in rural areas, we conducted interviews with national, state, and local contacts. During interviews, respondents helped us understand the usefulness and functionalities of commonly used CDMSs in rural facilities. Our focus was on the use of CDMSs in the management of diabetes, a disease prevalent in rural populations. (1) CDMSs are readily available to rural clinics and are being implemented and maintained by clinic staff with minimal expenditures for technology. (2) Use of a standardized system in a collaborative helps provide data comparisons and share costs involved with technical assistance services across the group.
NASA Astrophysics Data System (ADS)
Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.
2015-12-01
Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a light-weight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances the usability of core components and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across geoscience domains of hydrology, space science, and modeling toolkits.
Reed, Richard L; Battersby, Malcolm; Osborne, Richard H; Bond, Malcolm J; Howard, Sara L; Roeger, Leigh
2011-11-01
The prevalence of older Australians with multiple chronic diseases is increasing and now accounts for a large proportion of total health care utilisation. Chronic disease self-management support (CDSMS) has become a core service component of many community based health programs because it is considered a useful tool in improving population health outcomes and reducing the financial burden of chronic disease care. However, the evidence base to justify these support programs is limited, particularly for older people with multiple chronic diseases. We describe an ongoing trial examining the effectiveness of a particular CDSMS approach called the Flinders Program. The Flinders Program is a clinician-led generic self-management intervention that provides a set of tools and a structured process that enables health workers and patients to collaboratively assess self-management behaviours, identify problems, set goals, and develop individual care plans covering key self-care, medical, psychosocial and carer issues. A sample of 252 older Australians who have two or more chronic conditions will be randomly assigned to receive either CDSMS or an attention control intervention (health information only) for 6 months. Outcomes will be assessed using self-reported health measures taken at baseline and post-intervention. This project will be the first comprehensive evaluation of CDSMS in this population. Findings are expected to guide consumers, clinicians and policymakers in the use of CDSMS, as well as facilitate prioritisation of public monies towards evidence-based services. Copyright © 2011 Elsevier Inc. All rights reserved.
Integrating a mobile health setup in a chronic disease management network.
Ding, Hang; Ireland, Derek; Jayasena, Rajiv; Curmi, Jamie; Karunanithi, Mohan
2013-01-01
Supporting self-management of chronic disease in collaboration with primary healthcare has been a national priority in order to mitigate the emerging disease burden on the already strained healthcare system. However, in practice, the uptake of self-management programs and compliance with clinical guidelines remain poor. Time constraints due to work commitments and lack of efficient monitoring tools have been the major barriers to uptake and compliance. In this paper, we present a newly integrated mobile health system with a clinical chronic disease management network called cdmNet, which has already been validated to facilitate General Practitioners (GPs) in providing collaborative disease management services. The newly integrated solution takes advantage of the latest mobile web and wireless Bluetooth communication techniques to enable patients to record health data entries through ubiquitous mobile phones, and allows the data to be simultaneously shared by multidisciplinary care teams. This integration would enable patients to self-manage their chronic disease conditions in collaboration with GPs and hence improve uptake and compliance. Additionally, the proposed integration will provide a useful framework encouraging the translation of innovative mobile health technologies into highly regulated healthcare systems.
Integrated care management: aligning medical call centers and nurse triage services.
Kastens, J M
1998-01-01
Successful integrated delivery systems must aggressively design new approaches to managing patient care. Implementing a comprehensive care management model to coordinate patient care across the continuum is essential to improving patient care and reducing costs. The practice of telephone nursing and the need for experienced registered nurses to staff medical call centers, nurse triage centers, and outbound telemanagement is expanding as the penetration of full-risk capitated managed care contracts are signed. As health systems design their new care delivery approaches and care management models, medical call centers will be an integral approach to managing demand for services, chronic illnesses, and prevention strategies.
Points of attention in designing tools for regional brownfield prioritization.
Limasset, Elsa; Pizzol, Lisa; Merly, Corinne; Gatchett, Annette M; Le Guern, Cécile; Martinát, Stanislav; Klusáček, Petr; Bartke, Stephan
2018-05-01
The regeneration of brownfields has been increasingly recognized as a key instrument in sustainable land management, since freely developable land (so-called "greenfields") has become a scarce and more expensive resource, especially in densely populated areas. However, the complexity of these sites requires considerable effort to successfully complete revitalization projects, thus requiring the development and application of appropriate tools to support decision makers in selecting promising sites to which the limited financial resources can be efficiently allocated. The design of effective prioritization tools is a complex process, which requires the analysis and consideration of critical points of attention (PoAs), which have been identified considering the state of the art in the literature and lessons learned from previous developments of regional brownfield (BF) prioritization processes, frameworks and tools. Accordingly, we identified 5 PoAs, namely 1) Assessing end user needs and orientation discussions, 2) Availability and quality of the data needed for the BF prioritization tool, 3) Communication and stakeholder engagement, 4) Drivers of regeneration success, and 5) Financing and application costs. To deepen and collate the most recent knowledge on these topics from scientists and practitioners, we organized a focus group discussion within a special session at the AquaConSoil (ACS) conference 2017, where participants were asked to add their experience and thoughts to the discussion in order to identify the most significant and urgent points of attention in BF prioritization tool design. The result of this assessment is a comprehensive table (Table 2), which can support problem owners, investors, service providers, regulators, public and private land managers, decision makers etc.
in the identification of the main aspects (sub-topics) to be considered and their relative influences and in the comprehension of the general patterns and challenges to be faced when dealing with the development of BF prioritization tools. Copyright © 2017 Elsevier B.V. All rights reserved.
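A hypothetical weighted-scoring sketch shows the kind of regional BF prioritization such tools implement: each site is scored on a few criteria loosely echoing the PoAs, and sites are ranked by weighted sum. The criteria, weights and scores below are invented; real tools use many more criteria and site-specific data.

```python
# Invented criteria weights (sum to 1.0) and per-site scores in [0, 1].
WEIGHTS = {"data_quality": 0.2, "stakeholder_support": 0.3,
           "regeneration_drivers": 0.3, "cost_attractiveness": 0.2}

SITES = {
    "site_A": {"data_quality": 0.9, "stakeholder_support": 0.4,
               "regeneration_drivers": 0.6, "cost_attractiveness": 0.5},
    "site_B": {"data_quality": 0.6, "stakeholder_support": 0.8,
               "regeneration_drivers": 0.7, "cost_attractiveness": 0.6},
}

def score(site):
    """Weighted sum of criterion scores for one site."""
    return sum(WEIGHTS[c] * SITES[site][c] for c in WEIGHTS)

ranking = sorted(SITES, key=score, reverse=True)  # best candidate first
```

The PoAs in the abstract concern how to choose such criteria and weights defensibly (data availability, stakeholder input, cost), not the arithmetic itself, which is the easy part.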
NASA Astrophysics Data System (ADS)
Kahil, Mohamed Taher; Dinar, Ariel; Albiac, Jose
2015-03-01
Growing water extractions combined with emerging demands for environment protection increase competition for scarce water resources worldwide, especially in arid and semiarid regions. In those regions, climate change is projected to exacerbate water scarcity and increase the recurrence and intensity of droughts. These circumstances call for methodologies that can support the design of sustainable water management. This paper presents a hydro-economic model that links a reduced form hydrological component, with economic and environmental components. The model is applied to an arid and semiarid basin in Southeastern Spain to analyze the effects of droughts and to assess alternative adaptation policies. Results indicate that drought events have large impacts on social welfare, with the main adjustments sustained by irrigation and the environment. The water market policy seems to be a suitable option to overcome the negative economic effects of droughts, although the environmental effects may weaken its advantages for society. The environmental water market policy, where water is acquired for the environment, is an appealing policy to reap the private benefits of markets while protecting ecosystems. The current water management approach in Spain, based on stakeholders' cooperation, achieves almost the same economic outcomes and better environmental outcomes compared to a pure water market. These findings call for a reconsideration of the current management in arid and semiarid basins around the world. The paper illustrates the potential of hydro-economic modeling for integrating the multiple dimensions of water resources, becoming a valuable tool in the advancement of sustainable water management policies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, M.; Anderson, D.P.
1988-01-01
Marionette is a system for distributed parallel programming in an environment of networked heterogeneous computer systems. It is based on a master/slave model. The master process can invoke worker operations (asynchronous remote procedure calls to single slaves) and context operations (updates to the state of all slaves). The master and slaves also interact through shared data structures that can be modified only by the master. The master and slave processes are programmed in a sequential language. The Marionette runtime system manages slave process creation, propagates shared data structures to slaves as needed, queues and dispatches worker and context operations, and manages recovery from slave processor failures. The Marionette system also includes tools for automated compilation of program binaries for multiple architectures, and for distributing binaries to remote file systems. A UNIX-based implementation of Marionette is described.
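A minimal sketch (not Marionette itself, and in-process rather than networked) of the master/slave pattern the abstract describes: the master invokes asynchronous worker operations on individual slaves and broadcasts context operations that update every slave's copy of shared state.

```python
from concurrent.futures import ThreadPoolExecutor

class Slave:
    """Holds a master-propagated context and executes worker operations."""
    def __init__(self):
        self.context = {}

    def worker_op(self, x):
        # Worker operations read the shared context set by the master.
        return x * self.context.get("scale", 1)

class Master:
    def __init__(self, n_slaves):
        self.slaves = [Slave() for _ in range(n_slaves)]
        self.pool = ThreadPoolExecutor(max_workers=n_slaves)

    def context_op(self, **updates):
        # Context operation: broadcast a state update to all slaves.
        for s in self.slaves:
            s.context.update(updates)

    def invoke(self, i, x):
        # Worker operation: asynchronous call to a single slave.
        return self.pool.submit(self.slaves[i].worker_op, x)

master = Master(2)
master.context_op(scale=10)
results = [master.invoke(i, i + 1).result() for i in range(2)]
```

Real Marionette adds the hard parts this sketch omits: cross-machine RPC, on-demand propagation of shared structures, and recovery from slave failures.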
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
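The "Python as glue" pattern the Brainlab abstract describes can be sketched as a parameter sweep that generates simulator configurations, runs each, and gathers results for analysis. The parameter names and the stand-in model function are invented; Brainlab's actual API for driving NCS differs.

```python
from itertools import product

def run_model(cfg):
    """Stand-in for launching an NCS simulation with this configuration."""
    return {"cfg": cfg, "mean_rate": cfg["input_hz"] * cfg["gain"]}

def sweep(param_grid):
    """Yield one configuration dict per point in the parameter grid."""
    keys = sorted(param_grid)
    for values in product(*(param_grid[k] for k in keys)):
        yield dict(zip(keys, values))

results = [run_model(cfg)
           for cfg in sweep({"input_hz": [5, 10], "gain": [0.5, 1.0]})]
best = max(results, key=lambda r: r["mean_rate"])
```

Keeping configuration generation, execution, and analysis in one script is precisely the coordination burden the toolkit aims to manage.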
Lackey, Amanda E; Moshiri, Mariam; Pandey, Tarun; Lall, Chandana; Lalwani, Neeraj; Bhargava, Puneet
2014-05-01
In an era of declining reimbursements and a tightening job market, today's radiologists are forced to "make do with less." With the rollout of the Patient Protection and Affordable Care Act, commonly called "Obamacare," radiologists will be expected not only to interpret studies but also to take on many additional roles, adding a new layer of complexity to already demanding daily duties. These changes make it more important than ever to develop a personal workflow management system incorporating some of the most potent productivity tools. In this article, the authors discuss current productivity techniques and related software with the most potential to help radiologists keep up with the ever-increasing demands on their time in the workplace and lead more balanced lives. Published by Elsevier Inc.
In search of practice performance data? Call the hospital.
Bellile, S K
1997-01-01
Comparative performance data are increasingly being used by hospitals and managed care plans to evaluate physician practices. Outcomes data can also be a valuable tool for continuous improvement within a practice. Administrators need to understand the different categories and sources of physician practice data. Hospitals are a particularly good, yet often underutilized, data resource. Descriptive, financial and clinical information available from hospital systems can be used to compare one physician's performance to norms for specific case types (e.g., DRGs), focus internal review efforts and support managed care marketing and negotiation. Administrators need to identify key hospital contacts, make specific data requests and knowledgeably (and cautiously) interpret the data received. Finally, the administrator plays a crucial role in turning data into information: identifying and presenting key findings and ensuring that the information is used to the group's competitive advantage.
CALL in the Zone of Proximal Development: Novelty Effects and Teacher Guidance
ERIC Educational Resources Information Center
Karlström, Petter; Lundin, Eva
2013-01-01
Digital tools are not always used in the manner their designers had in mind. Therefore, it is not enough to assume that learning through CALL tools occurs in intended ways, if at all. We have studied the use of an enhanced word processor for writing essays in Swedish as a second language. The word processor contained natural language processing…
SPOT: Optimization Tool for Network Adaptable Security
NASA Astrophysics Data System (ADS)
Ksiezopolski, Bogdan; Szalachowski, Pawel; Kotulski, Zbigniew
Recently we have observed the growth of intelligent applications, especially mobile ones, so-called e-anything. Implementing these applications requires guaranteeing the security requirements of the cryptographic protocols they use. Traditionally, protocols have been configured with the strongest possible security mechanisms. Unfortunately, when an application runs on mobile devices, the strongest protection can lead to denial of service for them. The solution to this problem is to introduce quality of protection models that scale the protection level depending on the actual threat level. In this article we introduce an application which manages the protection level of processes in the mobile environment. The Security Protocol Optimizing Tool (SPOT) optimizes a cryptographic protocol and defines the protocol version appropriate to the actual threat level. The architecture of SPOT is presented, with a detailed description of the included modules.
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS and the first phase toward achieving its goal. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. NCP was written following the object-oriented paradigm (C++, CORBA), and the software development process was likewise object-oriented. Software reviews, configuration management, test plans, requirements and design were all a part of the process used in developing NCP. Because of the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
Empowering the Community to Manage Diabetes Better: An Integrated Partnership-Based Model.
Bamne, Arun; Shah, Daksha; Palkar, Sanjivani; Uppal, Shweta; Majumdar, Anurita; Naik, Rohan
2016-01-01
The rising number of diabetes cases in India calls for collaboration between the public and private sectors. The Municipal Corporation of Greater Mumbai (MCGM) partnered with Eli Lilly and Company (India) [Eli Lilly] to strengthen the capacity of their diabetes clinics. Medical Officers at dispensaries and Assistant Medical Officers (AMOs) located at attached health posts were trained on an educational tool, the Diabetes Conversation Map™ (DCM), by a Master Trainer. This tool was then used to educate patients and caregivers visiting the MCGM diabetes clinics. Twenty-eight centers conducted 168 sessions, and 1616 beneficiaries received the education over six months. General feedback from health providers was that the DCM clears up misconceptions among patients and caregivers in an interactive way and also improves patient compliance. This communication highlights a unique public-private partnership in which the sincere efforts of a public sector organization (MCGM) were complemented by the educational expertise lent by a private firm.
TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.
Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han
2017-03-01
High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous Bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline and are difficult to integrate with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.
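TRAPR itself is an R package; as a language-neutral illustration, the Python sketch below mimics two of the pipeline stages it integrates: filtering of low-quality (low-count) genes and a simple counts-per-million normalization. The data and thresholds are made up.

```python
# Toy expression matrix: gene -> raw read counts in three samples.
counts = {
    "geneA": [100, 120, 90],
    "geneB": [0, 1, 0],        # barely expressed: filtered out below
    "geneC": [300, 280, 310],
}

def filter_low(counts, min_total=10):
    # drop genes whose total count across samples is below a threshold
    return {g: c for g, c in counts.items() if sum(c) >= min_total}

def cpm(counts):
    # counts-per-million normalization against each sample's library size
    n_samples = len(next(iter(counts.values())))
    lib_sizes = [sum(c[i] for c in counts.values()) for i in range(n_samples)]
    return {g: [1e6 * c[i] / lib_sizes[i] for i in range(n_samples)]
            for g, c in counts.items()}

kept = filter_low(counts)
normalized = cpm(kept)
```

Chaining such steps, with visual checks after each, is the customized-pipeline workflow the abstract describes.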
Use of acoustics to deter bark beetles from entering tree material.
Aflitto, Nicholas C; Hofstetter, Richard W
2014-12-01
Acoustic technology is a potential tool to protect wood materials and eventually live trees from colonization by bark beetles. Bark beetles such as the southern pine beetle Dendroctonus frontalis, western pine beetle D. brevicomis and pine engraver Ips pini (Coleoptera: Curculionidae) use chemical and acoustic cues to communicate and to locate potential mates and host trees. In this study, the efficacy of sound treatments on D. frontalis, D. brevicomis and I. pini entry into tree materials was tested. Acoustic treatments significantly influenced whether beetles entered pine logs in the laboratory. Playback of artificial sounds reduced D. brevicomis entry into logs, and playback of stress call sounds reduced D. frontalis entry into logs. Sound treatments had no effect on I. pini entry into logs. The reduction in bark beetle entry into logs using particular acoustic treatments indicates that sound could be used as a viable management tool. © 2013 Society of Chemical Industry.
Spacecraft command verification: The AI solution
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.
1990-01-01
Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.
Biopathways representation and simulation on hybrid functional petri net.
Matsuno, Hiroshi; Tanaka, Yukiko; Aoshima, Hitoshi; Doi, Atsushi; Matsui, Mika; Miyano, Satoru
2011-01-01
The following two matters should be resolved in order for biosimulation tools to be accepted by users in biology/medicine: (1) remove issues that are irrelevant to biological importance, and (2) allow users to represent biopathways intuitively and to understand and manage easily the details of the representation and simulation mechanism. From these criteria, we first define a novel notion of Petri net called the Hybrid Functional Petri Net (HFPN). Then, we introduce a software tool, Genomic Object Net, for representing and simulating biopathways, which we have developed by employing the architecture of HFPN. To show the usefulness of Genomic Object Net for representing and simulating biopathways, we present two HFPN representations of gene regulation mechanisms: the Drosophila melanogaster (fruit fly) circadian rhythm and apoptosis induced by Fas ligand. The simulation results of these biopathways are also correlated with biological observations. The software is available to academic users from http://www.GenomicObject.Net/.
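The HFPN formalism extends ordinary Petri nets with continuous places and functional arc weights; the sketch below implements only the ordinary discrete core (places, transitions, token firing) on a toy gene-to-protein pathway, as a minimal starting point for the biopathway representations described above.

```python
# A transition is (inputs, outputs): token requirements and productions.
def fire(marking, transition):
    inputs, outputs = transition
    if all(marking[p] >= n for p, n in inputs.items()):
        new = dict(marking)
        for p, n in inputs.items():   # consume input tokens
            new[p] -= n
        for p, n in outputs.items():  # produce output tokens
            new[p] = new.get(p, 0) + n
        return new
    return marking  # transition not enabled: marking unchanged

# toy pathway: gene -> mRNA -> protein, as two transitions;
# transcription keeps the gene token (the gene is not consumed)
transcribe = ({"gene": 1}, {"gene": 1, "mRNA": 1})
translate = ({"mRNA": 1}, {"protein": 1})

m = {"gene": 1, "mRNA": 0, "protein": 0}
m = fire(m, transcribe)
m = fire(m, translate)
```

An HFPN would additionally let places hold real-valued concentrations and let arc weights be functions of the marking, which is what makes continuous biopathway dynamics expressible.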
Use of a Knowledge Management System in Waste Management Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruendler, D.; Boetsch, W.U.; Holzhauer, U.
2006-07-01
In Germany, the knowledge management system 'WasteInfo', covering waste management and disposal issues, has been developed and implemented. Beneficiaries of 'WasteInfo' are official decision makers, who have access to a large information pool. The information pool is fed by experts, so-called authors. This means compiling information, evaluating it and assigning appropriate properties (metadata) to it. The knowledge management system 'WasteInfo' was introduced at WM04, and its operation at WM05. The present contribution describes the additional advantage of the KMS when used as a tool for dealing with waste management projects. This specific aspect will be demonstrated using a project concerning a comparative analysis of the implementation of repositories in six countries using nuclear power as examples. The information in 'WasteInfo' is assigned to categories and structured according to its origin and type of publication. To use 'WasteInfo' as a tool for processing the projects, a suitable set of categories had to be developed for each project. Apart from technical and scientific aspects, the selected project deals with repository strategies and policies in various countries, with the roles of applicants and authorities in licensing procedures, with safety philosophy and with socio-economic concerns. This new point of view had to be modelled in the categories. Similarly, new sources of information such as local and regional dailies or particular web sites had to be taken into consideration. In this way 'WasteInfo' represents an open document which reflects the current status of the respective repository policy in several countries. Information with particular meaning for German repository planning is marked and may thus influence the German strategy. (authors)
Assessment of tools for protection of quality of water: Uncontrollable discharges of pollutants.
Dehghani Darmian, Mohsen; Hashemi Monfared, Seyed Arman; Azizyan, Gholamreza; Snyder, Shane A; Giesy, John P
2018-06-06
Selecting an appropriate crisis management plan during uncontrollable loading of pollution into water systems is crucial. In this research, the quality of water resources subject to uncontrollable pollution is protected by the use of suitable tools. The case study chosen for this investigation was a river-reservoir system. Analytical and numerical solutions of the pollutant transport equation were used as the simulation strategy to evaluate efficient tools for protecting water quality. These practical instruments are dilution flow and a new tool called detention time, which is proposed and simulated for the first time in this study. For an uncontrollable pollution discharge of approximately 130% of the river's assimilation capacity, when the duration of contact (Tc) was considered as a constraint, releasing 30% of the base flow of the river from the upstream dilution reservoir could treat the unallowable pollution. Moreover, when the affected distance (Xc) was selected as a constraint, the required detention time for which the rubber dam should detain the water to be treated was equal to 187% of the initial duration of contact. Copyright © 2018 Elsevier Inc. All rights reserved.
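The paper's exact formulation is not given in the abstract; the sketch below uses the standard one-dimensional advection-dispersion solution for an instantaneous spill, the kind of analytical solution of the pollutant transport equation the authors describe. All parameter values are illustrative.

```python
import math

def concentration(x, t, mass=50.0, area=10.0, v=0.5, D=2.0):
    # C(x, t) for an instantaneous release of `mass` at x = 0, t = 0
    # in a channel of cross-section `area`, flow velocity v and
    # longitudinal dispersion coefficient D:
    #   C = M / (A * sqrt(4*pi*D*t)) * exp(-(x - v*t)^2 / (4*D*t))
    return (mass / (area * math.sqrt(4 * math.pi * D * t))) \
        * math.exp(-(x - v * t) ** 2 / (4 * D * t))

# the concentration peak travels with the flow:
# at t = 100 s it is centred at x = v*t = 50 m
peak = concentration(50.0, 100.0)
off_peak = concentration(80.0, 100.0)  # 30 m downstream of the peak
```

Evaluating such a solution along the river is how one quantifies the affected distance and contact duration that the dilution-flow and detention-time tools are sized against.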
Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems
NASA Technical Reports Server (NTRS)
Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.
2000-01-01
The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many subdomains called blocks, and to solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computation. In environments with computers of different architectures, operating systems, CPU speeds, memory sizes, loads, and network speeds, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computer environment during execution time. More recently, these tools were extended to a second operating system: NT. In this paper, the problems associated with this application are discussed. Also, the developed algorithms were combined with the load sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results are presented on running a NASA-based code, ADPAC, to demonstrate the developed tools for dynamic load balancing.
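As a rough illustration of the block-distribution problem described above, the sketch below greedily assigns blocks of unequal cost to processors of unequal speed so that estimated finish times stay balanced. This is not the authors' algorithm; their tools also account for communication costs and re-balance dynamically at run time.

```python
def balance(block_costs, speeds):
    # Greedy heuristic: take blocks in decreasing cost order and place
    # each on the processor where it would finish earliest, where finish
    # time is accumulated load divided by processor speed.
    loads = [0.0] * len(speeds)
    assignment = [[] for _ in speeds]
    for b, cost in sorted(enumerate(block_costs), key=lambda x: -x[1]):
        p = min(range(len(speeds)), key=lambda i: (loads[i] + cost) / speeds[i])
        loads[p] += cost
        assignment[p].append(b)
    return assignment, [l / s for l, s in zip(loads, speeds)]

blocks = [4, 4, 2, 2, 1, 1]   # relative computational cost per block
speeds = [2.0, 1.0]           # one fast and one slow processor
assignment, finish_times = balance(blocks, speeds)
```

In a heterogeneous cluster the speeds themselves drift as background load changes, which is why the real tools must repeat such an assignment periodically during the multi-hour run.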
An efficient framework for Java data processing systems in HPC environments
NASA Astrophysics Data System (ADS)
Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül
2011-11-01
Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).
Knowledge-acquisition tools for medical knowledge-based systems.
Lanzola, G; Quaglini, S; Stefanelli, M
1995-03-01
Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated with examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.
VIPER: a web application for rapid expert review of variant calls.
Wöste, Marius; Dugas, Martin
2018-06-01
With the rapid development in next-generation sequencing, cost and time requirements for genomic sequencing are decreasing, enabling applications in many areas such as cancer research. Many tools have been developed to analyze genomic variation, ranging from single nucleotide variants to whole chromosomal aberrations. As sequencing throughput increases, the number of variants called by such tools also grows. The manual inspection often employed for such calls is thus becoming a time-consuming procedure. We developed the Variant InsPector and Expert Rating tool (VIPER) to speed up this process by integrating the Integrative Genomics Viewer into a web application. Analysts can then quickly iterate through variants, apply filters and make decisions based on the generated images and variant metadata. VIPER was successfully employed in analyses with manual inspection of more than 10 000 calls. VIPER is implemented in Java and JavaScript and is freely available at https://github.com/MarWoes/viper. marius.woeste@uni-muenster.de. Supplementary data are available at Bioinformatics online.
Halvade-RNA: Parallel variant calling from transcriptomic data using MapReduce.
Decap, Dries; Reumers, Joke; Herzeel, Charlotte; Costanza, Pascal; Fostier, Jan
2017-01-01
Given the current cost-effectiveness of next-generation sequencing, the amount of DNA-seq and RNA-seq data generated is ever increasing. One of the primary objectives of NGS experiments is calling genetic variants. While highly accurate, most variant calling pipelines are not optimized to run efficiently on large data sets. However, as variant calling in genomic data has become common practice, several methods have been proposed to reduce runtime for DNA-seq analysis through the use of parallel computing. Determining the effectively expressed variants from transcriptomics (RNA-seq) data has only recently become possible, and as such does not yet benefit from efficiently parallelized workflows. We introduce Halvade-RNA, a parallel, multi-node RNA-seq variant calling pipeline based on the GATK Best Practices recommendations. Halvade-RNA makes use of the MapReduce programming model to create and manage parallel data streams on which multiple instances of existing tools such as STAR and GATK operate concurrently. Whereas the single-threaded processing of a typical RNA-seq sample requires ∼28h, Halvade-RNA reduces this runtime to ∼2h using a small cluster with two 20-core machines. Even on a single, multi-core workstation, Halvade-RNA can significantly reduce runtime compared to using multi-threading, thus providing for a more cost-effective processing of RNA-seq data. Halvade-RNA is written in Java and uses the Hadoop MapReduce 2.0 API. It supports a wide range of distributions of Hadoop, including Cloudera and Amazon EMR.
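Halvade-RNA runs existing tools such as STAR and GATK inside Hadoop MapReduce; the toy Python sketch below only illustrates the MapReduce programming model itself: map tasks emit candidate variants per chunk of reads, and a reduce step merges counts per position. The data representation is invented for the example.

```python
from collections import defaultdict

def map_phase(chunk):
    # emit (position, observed_base) pairs wherever a read differs from a
    # hypothetical reference that is all 'A's; in Halvade-RNA this role is
    # played by the actual alignment/variant-calling tools
    return [(pos, base) for pos, base in chunk if base != "A"]

def reduce_phase(emitted):
    # merge emissions from all map tasks, keyed by (position, base)
    counts = defaultdict(int)
    for pos, base in emitted:
        counts[(pos, base)] += 1
    return dict(counts)

chunks = [
    [(1, "A"), (2, "G")],
    [(2, "G"), (3, "T")],
]
# the two map tasks could run concurrently on different cluster nodes
emitted = [pair for chunk in chunks for pair in map_phase(chunk)]
variants = reduce_phase(emitted)
```

Because each chunk is processed independently, adding nodes shortens the map phase almost linearly, which is the source of the 28 h to 2 h speedup reported above.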
Serials Management by Microcomputer: The Potential of DBMS.
ERIC Educational Resources Information Center
Vogel, J. Thomas; Burns, Lynn W.
1984-01-01
Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programing with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…
Love, Cynthia B.; Arnesen, Stacey J.; Phillips, Steven J.; Windom, Robert E.
2016-01-01
From 2010 to 2013, the National Library of Medicine (NLM) Disaster Information Management Research Center (DIMRC) continued to build its programs and services on the foundation laid in its starting years, 2008–2010. Prior to 2008, NLM had a long history of providing health information, training, and tools in response to disasters. Aware of this legacy, the NLM long range plan (Charting a Course for the 21st Century: NLM’s Long Range Plan 2006–2016) called for creation of a center to show “a strong commitment to disaster remediation and to provide a platform for demonstrating how libraries and librarians can be part of the solution to this national problem”. NLM is continuing efforts to ensure that medical libraries have plans for the continuity of their operations, librarians are trained to understand their roles in preparedness and response, online disaster health information resources are available for many audiences and in multiple formats, and research is conducted on tools to enhance the exchange of critical information during and following disasters. This paper describes the 2010–2013 goals and activities of DIMRC and its future plans. PMID:27570333
Decision-support information system to manage mass casualty incidents at a level 1 trauma center.
Bar-El, Yaron; Tzafrir, Sara; Tzipori, Idan; Utitz, Liora; Halberthal, Michael; Beyar, Rafael; Reisner, Shimon
2013-12-01
Mass casualty incidents are probably the greatest challenge to a hospital. When such an event occurs, hospitals are required to instantly switch from their routine activity to conditions of great uncertainty and confront needs that exceed resources. We describe an information system that was uniquely designed for managing mass casualty events. The web-based system is activated when a mass casualty event is declared; it displays relevant operating procedures, checklists, and a log book. The system automatically or semiautomatically initiates phone calls and public address announcements. It collects real-time data from computerized clinical and administrative systems in the hospital, and presents them to the managing team in a clear graphic display. It also generates periodic reports and summaries of available or scarce resources that are sent to predefined recipients. When the system was tested in a nationwide exercise, it proved to be an invaluable tool for informed decision making in demanding and overwhelming situations such as mass casualty events.
Using GIS to generate spatially balanced random survey designs for natural resource applications.
Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B
2007-07-01
Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. The development of long-term monitoring designs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
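The Reversed Randomized Quadrant-Recursive Raster design is more involved than can be shown here; this sketch captures only its goal of spatial balance in the simplest possible way, partitioning the study area into quadrants and drawing one random location from each, so that no region goes unsampled.

```python
import random

def spatially_balanced_sample(xmin, xmax, ymin, ymax, seed=0):
    # split the study area into four quadrants and draw one uniform
    # random point from each; a plain random draw of four points could
    # easily leave whole quadrants unsampled
    rng = random.Random(seed)
    xmid, ymid = (xmin + xmax) / 2, (ymin + ymax) / 2
    quadrants = [
        (xmin, xmid, ymin, ymid), (xmid, xmax, ymin, ymid),
        (xmin, xmid, ymid, ymax), (xmid, xmax, ymid, ymax),
    ]
    return [(rng.uniform(x0, x1), rng.uniform(y0, y1))
            for x0, x1, y0, y1 in quadrants]

points = spatially_balanced_sample(0, 100, 0, 100)
# exactly one point falls in the lower-left quadrant
lower_left = [p for p in points if p[0] < 50 and p[1] < 50]
```

The actual design recurses this quadrant split and randomizes the traversal order, which preserves spatial balance for any sample size, not just multiples of four.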
Managing biological networks by using text mining and computer-aided curation
NASA Astrophysics Data System (ADS)
Yu, Seok Jong; Cho, Yongseong; Lee, Min-Ho; Lim, Jongtae; Yoo, Jaesoo
2015-11-01
In order to understand a biological mechanism in a cell, a researcher must collect a huge number of protein interactions, with experimental data, from experiments and the literature. Text mining systems that extract biological interactions from papers have been used to construct biological networks for a few decades. Even though text mining of the literature is necessary to construct a biological network, few systems with a text mining tool are available for biologists who want to construct their own biological networks. We have developed a biological network construction system called BioKnowledge Viewer that can generate a biological interaction network by using a text mining tool and biological taggers. It also includes Boolean simulation software, providing a biological modeling system to simulate the models that are made with the text mining tool. A user can download PubMed articles and construct a biological network by using the Multi-level Knowledge Emergence Model (KMEM), MetaMap, and A Biomedical Named Entity Recognizer (ABNER) as text mining tools. To evaluate the system, we constructed an aging-related biological network consisting of 9,415 nodes (genes) by using manual curation. With network analysis, we found that several genes, including JNK, AP-1, and BCL-2, were highly connected in the aging biological network. We provide a semi-automatic curation environment so that users can obtain a graph database for managing text mining results generated in the server system and can navigate the network with BioKnowledge Viewer, which is freely available at http://bioknowledgeviewer.kisti.re.kr.
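BioKnowledge Viewer's mining pipeline (KMEM, MetaMap, ABNER) is far richer than this; the sketch below shows the simplest text-mining step behind such systems: building an interaction network from gene co-occurrence within sentences. The sentences and gene list are invented for the example.

```python
from itertools import combinations

genes = {"JNK", "AP-1", "BCL-2", "TP53"}

sentences = [
    "JNK signalling activates AP-1 in aging cells",
    "BCL-2 interacts with JNK under stress",
    "TP53 expression was unchanged",
]

# every pair of genes mentioned in the same sentence becomes an edge
edges = set()
for s in sentences:
    found = sorted(g for g in genes if g in s)
    edges.update(combinations(found, 2))
```

Real systems replace the substring match with named-entity taggers and add relation extraction, and the semi-automatic curation step described above then lets a human accept or reject each proposed edge.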
NASA Astrophysics Data System (ADS)
Basista, A.
2013-12-01
There are many tools to manage spatial data, called Geographic Information Systems (GIS), which, apart from visualizing data in space, let users perform various spatial analyses. Thanks to them, it is possible to obtain more essential information for real estate market analysis. Much scientific research presents GIS exploitation for future mass valuation, because advanced tools are necessary to manage the huge sets of real estate data gathered for mass valuation needs. In practice, appraisers rarely use these tools for single valuations, because few GIS tools are available to support real estate valuation. The paper presents the functionality of a geoinformatic subsystem that is used to support real estate market analysis and real estate valuation. A detailed description is given of the process of entering attributes into the database and calculating attribute values based on the proposed definition of attribute scales. This work also presents the algorithm for selecting similar properties that was implemented within the described subsystem. The main stage of this algorithm is the calculation of a price-creative indicator for each real estate, using its attribute values. The set of properties chosen in this way is visualized on the map. The geoinformatic subsystem is used for un-built real estates and living premises. Geographic Information System software was used to develop this project: the basic functionality of gvSIG (open-source software) was extended, and some extra functions were added to support real estate market analysis.
NASA Astrophysics Data System (ADS)
Weber, J.; Domenico, B.
2004-12-01
This paper is an example of what we call data interactive publications. With a properly configured workstation, readers can click on "hotspots" in the document that launch an interactive analysis tool called the Unidata Integrated Data Viewer (IDV). The IDV enables readers to access, analyze and display datasets on remote servers, as well as documents describing them. Beyond the parameters and datasets initially configured into the paper, the analysis tool has access to all the other dataset parameters, as well as to a host of other datasets on remote servers. These data interactive publications are built on top of several data delivery, access, discovery, and visualization tools developed by Unidata and its partner organizations. To illustrate this integrative technology, we use data from Hurricane Charley over Florida from August 13-15, 2004. This event illustrates how the components of the process fit together. The Local Data Manager (LDM), Open-source Project for a Network Data Access Protocol (OPeNDAP) and Abstract Data Distribution Environment (ADDE) services, Thematic Realtime Environmental Distributed Data Service (THREDDS) cataloging services, and the IDV are highlighted in this example of a publication with embedded pointers for accessing and interacting with remote datasets. An important objective of this paper is to illustrate how these integrated technologies foster the creation of documents that allow the reader to learn scientific concepts by direct interaction with illustrative datasets, and to help build a framework for integrated Earth System science.
[The balanced scorecard. "Tool or toy" in hospitals].
Brinkmann, A; Gebhard, F; Isenmann, R; Bothner, U; Mohl, U; Schwilk, B
2003-10-01
The change in hospital funding with diagnosis related groups (DRG), medical advances as well as demographic changes will call for new quantitative and qualitative standards imposed on German hospitals. Increasing costs and competition in the health care sector requires new and innovative strategies for resource management. Today's policy is mainly defined by rationing and intensified workload. The introduction of DRGs will presumably further constrict management perspectives on pure financial aspects. However, to ensure future development, compassionate services and continued existence of hospitals, a balance of seemingly conflicting perspectives, such as finance, customer, process, learning and growth are of utmost importance. Herein doctors and nurses in leading positions should play a key role in changing management practice. For several years the balanced scorecard has been successfully used as a strategic management concept in non-profit organizations, even in the health care sector. This concept complies with the multidimensional purposes of hospitals and focuses on policy deployment. Finally it gives the opportunity to involve all employees in the original development, communication and execution of a balanced scorecard approach.
Weak signal detection: A discrete window of opportunity for achieving ‘Vision 90:90:90’?
Burman, Christopher J.; Aphane, Marota; Delobelle, Peter
2016-01-01
Abstract Introduction: UNAIDS' Vision 90:90:90 is a call to 'end AIDS'. Developing predictive foresight of the unpredictable changes that this journey will entail could contribute to the ambition of 'ending AIDS'. There are few opportunities for managing unpredictable changes. We introduce 'weak signal detection' as a potential opportunity to fill this void. Method: Combining futures and complexity theory, we reflect on two pilot case studies that involved the Archetype Extraction technique and the SenseMaker® Collector™ tool. Results: Both piloted techniques have the potential to surface weak signals, but there is room for improvement. Discussion: A management response to a complex weak signal requires pattern management, rather than an exclusive focus on behaviour management. Conclusion: Weak signal detection is a window of opportunity to improve resilience to unpredictable changes in the HIV/AIDS landscape, both reducing the risks that emerge from those changes and increasing the visibility of opportunities to exploit them in ways that could contribute to 'ending AIDS'. PMID:26821952
NASA Technical Reports Server (NTRS)
Lee, Paul U.; Smith, Nancy M.; Prevot, Thomas; Homola, Jeffrey R.
2010-01-01
When demand for an airspace sector exceeds capacity, the balance can be re-established by reducing the demand, increasing the capacity, or both. The Multi-Sector Planner (MSP) concept has been proposed to better manage traffic demand by modifying trajectories across multiple sectors. A complementary approach to MSP, called Flexible Airspace Management (FAM), reconfigures the airspace such that capacity can be reallocated dynamically to balance the traffic demand across multiple sectors, resulting in fewer traffic management initiatives. The two concepts have been evaluated with a series of human-in-the-loop simulations at the Airspace Operations Laboratory to examine and refine the roles of the human operators in these concepts, as well as their tools and procedural requirements. So far MSP and FAM functions have been evaluated individually, but the integration of the two functions is desirable since there are significant overlaps in their goals, geographic/temporal scope of the problem space, and implementation timeframe. Ongoing research is planned to refine the humans' roles in the integrated concept.
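The demand-capacity balancing idea behind FAM can be made concrete with a small sketch. The Python fragment below is purely illustrative and is not the Airspace Operations Laboratory software: sectors are reduced to per-sector demand and capacity counts, "reconfiguration" is abstracted as a proportional reallocation of a fixed capacity total, and the names `rebalance_capacity` and `overloaded` are invented for this sketch.

```python
def rebalance_capacity(demand, capacity):
    """Reallocate total capacity across sectors in proportion to demand.

    Illustrative only: the real FAM concept reconfigures airspace
    geometry; here 'capacity' is just a per-sector aircraft count
    whose total is held fixed.
    """
    total = sum(capacity)
    total_demand = sum(demand)
    # Give each sector a share of total capacity proportional to its
    # demand, rounding down, then hand the remainder to the most
    # heavily loaded sectors first.
    share = [total * d // total_demand for d in demand]
    remainder = total - sum(share)
    order = sorted(range(len(demand)), key=lambda i: demand[i], reverse=True)
    for i in order[:remainder]:
        share[i] += 1
    return share

def overloaded(demand, capacity):
    """Indices of sectors where demand exceeds capacity."""
    return [i for i, (d, c) in enumerate(zip(demand, capacity)) if d > c]
```

For example, with demands of 18, 5 and 7 aircraft against uniform capacities of 10, sector 0 is overloaded; after reallocation the same total capacity of 30 covers all three sectors, which is the FAM intuition of balancing demand without trajectory changes.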
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
Opening the door to coordination of care through teachable moments.
Berg, Gregory D; Korn, Allan M; Thomas, Eileen; Klemka-Walden, Linda; Bigony, Marysanta D; Newman, John F
2007-10-01
The challenge for care coordination is to identify members at a moment in time when they are receptive to intervention and provide the appropriate care management services. This manuscript describes a pilot program using inbound nurse advice calls from members to engage them in a care management program including disease management (DM). Annual medical claims diagnoses were used to identify members and their associated disease conditions. For each condition group for each year, nurse advice call data were used to calculate inbound nurse advice service call rates for each group. A pilot program was set up to engage inbound nurse advice callers in a broader discussion of their health concerns and refer them to a care management program. Among the program results, both the call rate by condition group and the correlation between average costs and call rates show that higher cost groups of members call the nurse advice service disproportionately more than lower cost members. Members who entered the DM programs through the nurse advice service were more likely to stay in the program than those who participated in the standard opt-in program. The results of this pilot program suggest that members who voluntarily call in to the nurse advice service for triage are at a "teachable moment" and highly motivated to participate in appropriate care management programs. The implication is that the nurse advice service may well be an innovative and effective way to enhance participation in a variety of care management programs including DM.
DOT National Transportation Integrated Search
2003-04-01
Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...
Ramirez, Magaly; Wu, Shinyi; Jin, Haomiao; Ell, Kathleen; Gross-Schulman, Sandra; Myerchin Sklaroff, Laura; Guterman, Jeffrey
2016-01-25
Remote patient monitoring is increasingly integrated into health care delivery to expand access and increase effectiveness. Automation can add efficiency to remote monitoring, but patient acceptance of automated tools is critical for success. From 2010 to 2013, the Diabetes-Depression Care-management Adoption Trial (DCAT), a quasi-experimental comparative effectiveness research trial aimed at accelerating the adoption of collaborative depression care in a safety-net health care system, tested a fully automated telephonic assessment (ATA) depression monitoring system serving low-income patients with diabetes. The aim of this study was to determine patient acceptance of ATA calls over time, and to identify factors predicting long-term patient acceptance of ATA calls. We conducted two analyses using data from the DCAT technology-facilitated care arm, in which for 12 months the ATA system periodically assessed depression symptoms, monitored treatment adherence, prompted self-care behaviors, and inquired about patients' needs for provider contact. Patients received assessments at 6, 12, and 18 months using Likert-scale measures of willingness to use ATA calls, preferred mode of reach, perceived ease of use, usefulness, nonintrusiveness, privacy/security, and long-term usefulness. For the first analysis (patient acceptance over time), we computed descriptive statistics of these measures. In the second analysis (predictive factors), we collapsed patients into two groups: those reporting "high" versus "low" willingness to use ATA calls. To compare them, we used independent t tests for continuous variables and Pearson chi-square tests for categorical variables. Next, we jointly entered independent factors found to be significantly associated with 18-month willingness to use ATA calls at the univariate level into a logistic regression model with backward selection to identify predictive factors. 
We performed a final logistic regression model with the identified significant predictive factors and reported the odds ratio estimates and 95% confidence intervals. At 6 and 12 months, respectively, 89.6% (69/77) and 63.7% (49/77) of patients "agreed" or "strongly agreed" that they would be willing to use ATA calls in the future. At 18 months, 51.0% (64/125) of patients perceived ATA calls as useful and 59.7% (46/77) were willing to use the technology. Moreover, in the first 6 months, most patients reported that ATA calls felt private/secure (75.9%, 82/108) and were easy to use (86.2%, 94/109), useful (65.1%, 71/109), and nonintrusive (87.2%, 95/109). Perceived usefulness, however, decreased to 54.1% (59/109) in the second 6 months of the trial. Factors predicting willingness to use ATA calls at the 18-month follow-up were perceived privacy/security and long-term perceived usefulness of ATA calls. No patient characteristics were significant predictors of long-term acceptance. In the short term, patients are generally accepting of ATA calls for depression monitoring, with ATA call design and the care management intervention being primary factors influencing patient acceptance. Acceptance over the long term requires that the system be perceived as private/secure, and that it be constantly useful for patients' needs of awareness of feelings, self-care reminders, and connectivity with health care providers. ClinicalTrials.gov NCT01781013; https://clinicaltrials.gov/ct2/show/NCT01781013 (Archived by WebCite at http://www.webcitation.org/6e7NGku56).
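The group-comparison step described above (independent t tests on high- versus low-willingness patients) can be illustrated with a small stand-alone sketch. This is not the study's code: `welch_t` is a hypothetical helper computing Welch's two-sample t statistic, the unequal-variance form of the independent t test, from raw scores.

```python
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch's independent two-sample t statistic.

    A stand-in for comparing a continuous measure (e.g., a Likert
    acceptance score) between two patient groups; assumes each
    sample has at least two values and nonzero variance.
    """
    def mean(x):
        return sum(x) / len(x)

    def var(x):
        # Sample variance with the n-1 (Bessel) correction.
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)

    na, nb = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / sqrt(
        var(sample_a) / na + var(sample_b) / nb)
```

A positive statistic means the first group's mean is higher; in practice one would obtain the p-value from the t distribution with Welch-Satterthwaite degrees of freedom, e.g. via `scipy.stats.ttest_ind(a, b, equal_var=False)`.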
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-09
... nuclear weapons development and government- sponsored nuclear energy research. EMAB provides advice to the... DEPARTMENT OF ENERGY Notice of Call for Nominations for Appointment to the Environmental... open call to the public to submit nominations for membership on the Environmental Management Advisory...
Abidi, Samina; Vallis, Michael; Piccinini-Vallis, Helena; Imran, Syed Ali; Abidi, Syed Sibte Raza
2018-04-18
Behavioral science is now being integrated into diabetes self-management interventions. However, the challenge that presents itself is how to translate these knowledge resources during care so that primary care practitioners can use them to offer evidence-informed behavior change support and diabetes management recommendations to patients with diabetes. The aim of this study was to develop and evaluate a computerized decision support platform called "Diabetes Web-Centric Information and Support Environment" (DWISE) that assists primary care practitioners in applying standardized behavior change strategies and clinical practice guidelines-based recommendations to an individual patient and empower the patient with the skills and knowledge required to self-manage their diabetes through planned, personalized, and pervasive behavior change strategies. A health care knowledge management approach is used to implement DWISE so that it features the following functionalities: (1) assessment of primary care practitioners' readiness to administer validated behavior change interventions to patients with diabetes; (2) educational support for primary care practitioners to help them offer behavior change interventions to patients; (3) access to evidence-based material, such as the Canadian Diabetes Association's (CDA) clinical practice guidelines, to primary care practitioners; (4) development of personalized patient self-management programs to help patients with diabetes achieve healthy behaviors to meet CDA targets for managing type 2 diabetes; (5) educational support for patients to help them achieve behavior change; and (6) monitoring of the patients' progress to assess their adherence to the behavior change program and motivating them to ensure compliance with their program. 
DWISE offers these functionalities through an interactive Web-based interface for primary care practitioners, whereas the patient's self-management program and associated behavior interventions are delivered through a mobile patient diary via mobile phones and tablets. DWISE has been tested for its usability, functionality, usefulness, and acceptance through a series of qualitative studies. For the primary care practitioner tool, most usability problems were associated with navigation and with the presentation, formatting, understandability, and suitability of the content. For the patient tool, most issues related to the tool's screen layout, design features, understandability of the content, clarity of the labels used, and navigation across the tool. Facilitators and barriers to DWISE use in a shared decision-making environment have also been identified. This work has provided a unique electronic health solution for translating complex health care knowledge into easy-to-use, evidence-informed, point-of-care decision aids for primary care practitioners. Patients' feedback is now being used to make necessary modifications to DWISE. ©Samina Abidi, Michael Vallis, Helena Piccinini-Vallis, Syed Ali Imran, Syed Sibte Raza Abidi. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 18.04.2018.
NASA Technical Reports Server (NTRS)
1979-01-01
The machinery pictured is a set of Turbodyne steam turbines which power a sugar mill at Bell Glade, Florida. A NASA-developed computer program called NASTRAN aided development of these and other turbines manufactured by Turbodyne Corporation's Steam Turbine Division, Wellsville, New York. An acronym for NASA Structural Analysis Program, NASTRAN is a predictive tool which advises development teams how a structural design will perform under service use conditions. Turbodyne uses NASTRAN to analyze the dynamic behavior of steam turbine components, achieving substantial savings in development costs. One of the most widely used spinoffs, NASTRAN is made available to private industry through NASA's Computer Software Management Information Center (COSMIC) at the University of Georgia.
Fundamental changes to EPA's research enterprise: the path forward.
Anastas, Paul T
2012-01-17
Environmental protection in the United States has reached a critical juncture. It has become clear that to address the complex and interrelated environmental challenges we face, we must augment our traditional approaches. The scientific community must build upon its deep understanding of risk assessment, risk management, and reductionism with tools, technologies, insights and approaches to pursue sustainability. The U.S. Environmental Protection Agency (EPA) has recognized this need for systemic change by implementing a new research paradigm called "The Path Forward." This paper outlines the principles of the Path Forward and the actions taken since 2010 to align EPA's research efforts with the goal of sustainability.
2012-01-09
NASA Goddard Space Flight Center Financial Manager and White House 2011 SAVE award winner Matthew Ritsko is seen during a television interview at NASA Headquarters shortly after meeting with President Obama at the White House on Monday, Jan. 9, 2012, in Washington. The Presidential Securing Americans' Value and Efficiency (SAVE) program gives front-line federal workers the chance to submit their ideas on how their agencies can save money and work more efficiently. Matthew's proposal calls for NASA to create a "lending library" where specialized space tools and hardware purchased by one NASA organization will be made available to other NASA programs and projects. Photo Credit: (NASA/Bill Ingalls)
HCI∧2 framework: a software framework for multimodal human-computer interaction systems.
Shen, Jie; Pantic, Maja
2013-12-01
This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
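The publish/subscribe pattern the framework builds on can be illustrated with a toy in-process bus. This sketch is a miniature under stated assumptions, not the HCI∧2 API: the real framework delivers messages over a shared-memory transport and manages modules over TCP, while the hypothetical `MessageBus` below simply dispatches callbacks by topic.

```python
class MessageBus:
    """Toy publish/subscribe bus.

    Modules register interest in named topics; publishers need not
    know who (if anyone) is listening, which is the decoupling that
    P/S architectures like HCI^2's provide between MHCI modules.
    """

    def __init__(self):
        self._subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """Register a callback to receive messages on a topic."""
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for callback in self._subscribers.get(topic, []):
            callback(message)
```

A gesture-recognition module might publish on a "gesture" topic that a fusion module subscribes to; neither holds a reference to the other, so modules can be added, removed or bridged (as with Psyclone or ActiveMQ) without changing their peers.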
Enabling eHealth as a Pathway for Patient Engagement: a Toolkit for Medical Practice.
Graffigna, Guendalina; Barello, Serena; Triberti, Stefano; Wiederhold, Brenda K; Bosio, A Claudio; Riva, Giuseppe
2014-01-01
Academic and managerial interest in patient engagement is rapidly earning attention and becoming a necessary tool for researchers, clinicians and policymakers worldwide to manage the increasing burden of chronic conditions. The concept of patient engagement calls for a reframing of healthcare organizations' models and approaches to care. This also requires innovations that facilitate exchanges between patients and the healthcare system. eHealth, namely the use of new communication technologies to provide healthcare, has proved able to innovate healthcare organizations and to improve exchanges between patients and health providers. However, little attention has yet been devoted to how best to design eHealth tools in order to engage patients in their care. eHealth tools have to be designed appropriately, according to the specific unmet needs and priorities of patients in the different phases of the engagement process. Building on the Patient Engagement model and on the Positive Technology paradigm, we suggest a toolkit of phase-specific technological resources, highlighting their potential for fostering the patient engagement process.
Ada Structure Design Language (ASDL)
NASA Technical Reports Server (NTRS)
Chedrawi, Lutfi
1986-01-01
An artist acquires all the necessary tools before painting a scene. By the same analogy, a software engineer needs the necessary tools to give a design the proper means for implementation. Ada provides these tools. Yet, just as an artist's painting needs a brochure to accompany it for further explanation of the scene, an Ada design also needs an accompanying document to show the design in its detailed structure and hierarchical order. Ada can be self-explanatory in small programs not exceeding fifty lines of code. But in a large environment, ranging from thousands of lines upward, Ada programs need to be well documented to be preserved and maintained. The language used to specify an Ada document is called the Ada Structure Design Language (ASDL). This language sets rules that help derive a well-formatted Ada detailed design document. The rules are defined to meet the needs of a project manager, a maintenance team, a programmer and a systems designer. The design document templates, the document extractor, and the rules set forth by the ASDL are explained in detail.
Pohjola, Mikko V; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T
2013-06-26
The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives on assessment and model performance in the scientific literature can be called: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict, and methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in scope. The focus in most approaches is on the outputs and making of assessments and models. Practical application of the outputs and the consequential outcomes are often left unaddressed. It appears that more comprehensive approaches that combine the essential characteristics of different perspectives are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge.
Johnson, Bill
2014-01-01
Medical practices receive hundreds if not thousands of calls every week from patients, payers, pharmacies, and others. Outsourcing call centers can be a smart move to improve efficiency, lower costs, improve customer care, ensure proper payer management, and ensure regulatory compliance. This article discusses how to know when it's time to move to an outsourced call center, the benefits of making the move, how to choose the right call center, and how to make the transition. It also provides tips on how to manage the call center to ensure the objectives are being met.
Anderson, S; Shannon, K; Li, J; Lee, Y; Chettiar, J; Goldenberg, S; Krüsi, A
2016-11-17
Despite a large body of evidence globally demonstrating that the criminalization of sex workers increases HIV/STI risks, we know far less about the impact of criminalization and policing of managers and in-call establishments on HIV/STI prevention among sex workers, and even less so among migrant sex workers. Analysis draws on ethnographic fieldwork and 46 qualitative interviews with migrant sex workers, managers and business owners of in-call sex work venues in Metro Vancouver, Canada. The criminalization of in-call venues and third parties explicitly limits sex workers' access to HIV/STI prevention, including manager restrictions on condoms and limited onsite access to sexual health information and HIV/STI testing. With limited labour protections and socio-cultural barriers, criminalization and policing undermine the health and human rights of migrant sex workers working in in-call venues. This research supports growing evidence-based calls for decriminalization of sex work, including the removal of criminal sanctions targeting third parties and in-call venues, alongside programs and policies that better protect the working conditions of migrant sex workers as critical to HIV/STI prevention and human rights.
Automatic building information model query generation
Jiang, Yufei; Yu, Nan; Ming, Jiang; ...
2015-12-01
Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges of data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, and can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.
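The idea of compiling a view definition into query code can be sketched in miniature. Everything below is hypothetical: a "model" is reduced to a list of dicts and a view definition to a small mapping, whereas real MVDs (e.g., mvdXML over IFC schemas) are far richer, and `generate_query` is not the QueryGenerator prototype.

```python
def generate_query(view_def):
    """Compile a toy Model View Definition into a filter function.

    The view definition names an entity type plus required attribute
    values; the returned function selects matching elements from a
    model, mimicking (very loosely) a partial-model query.
    """
    entity = view_def["entity"]
    required = view_def.get("attributes", {})

    def query(model):
        return [
            element for element in model
            if element.get("type") == entity
            and all(element.get(k) == v for k, v in required.items())
        ]
    return query
```

The point of the compilation step is that the query is derived once from the declarative view definition, so domain experts express *what* subset of the building model they need (say, the spaces of one thermal zone for air-flow analysis) rather than hand-writing extraction code.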
Integrated management of thesis using clustering method
NASA Astrophysics Data System (ADS)
Astuti, Indah Fitri; Cahyadi, Dedy
2017-02-01
A thesis is one of the major requirements for students pursuing their bachelor's degree. Finishing a thesis involves a long process including consultation, writing the manuscript, conducting the chosen method, seminar scheduling, searching for references, and appraisal by the board of mentors and examiners. Unfortunately, most students find it hard to match all the lecturers' free time so that they can sit together in a seminar room to examine the thesis. Therefore, the seminar scheduling process should be the top priority to solve. A manual mechanism for this task no longer fulfills the need. People on campus, including students, staff, and lecturers, demand a system in which all the stakeholders can interact with each other and manage the thesis process without timetable conflicts. A branch of computer science named Management Information Systems (MIS) could be a breakthrough in dealing with thesis management. This research applies a clustering method to distinguish categories using mathematical formulas. A system is then developed along with the method to create a well-managed tool providing facilities such as seminar scheduling, consultation and review, thesis approval, assessment, and a reliable database of theses. The database plays an important role for present and future purposes.
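The clustering step could, for instance, be a simple k-means over a one-dimensional feature such as a thesis progress score. The sketch below is a generic 1-D k-means (assuming k ≥ 2 and at least k values), not the authors' algorithm, offered only to make the clustering idea concrete.

```python
def kmeans_1d(values, k, iterations=20):
    """Minimal 1-D k-means clustering.

    Groups scalar values (e.g., thesis progress scores) into k
    clusters. Initial centroids are spread evenly across the sorted
    data; assumes k >= 2 and len(values) >= k.
    """
    values = sorted(values)
    # Spread initial centroids evenly over the sorted data.
    centroids = [values[i * (len(values) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iterations):
        # Assign each value to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Recompute centroids as cluster means (keep old one if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters
```

Once theses are grouped, e.g. by readiness for examination, the scheduler can treat each cluster as a batch when searching for seminar slots common to the relevant board members.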
Risk management, derivatives and shariah compliance
NASA Astrophysics Data System (ADS)
Bacha, Obiyathulla Ismath
2013-04-01
Despite the impressive growth of Islamic Banking and Finance (IBF), a number of weaknesses remain. The most important of these is perhaps the lack of shariah-compliant risk management tools. While the risk sharing philosophy of Islamic Finance requires the acceptance of risk to justify returns, the shariah also requires adherents to avoid unnecessary risk (maysir). The requirement to avoid maysir is in essence a call for the prudent management of risk. Contemporary risk management revolves around financial engineering, the building blocks of which are financial derivatives. Despite the proven efficacy of derivatives in the management of risk in the conventional space, shariah scholars appear to be suspicious and uneasy with their use in IBF. Some have imposed outright prohibition of their use. This paper re-examines the issue of contemporary derivative instruments and shariah compliance. The shariah compatibility of derivatives is shown in a number of ways. First, by qualitative evaluation of whether derivatives can be made to comply with the key prohibitions of the shariah. Second, by comparing the payoff profiles of derivatives with risk sharing finance and Bai Salam contracts. Finally, the equivalence between shariah-compliant derivatives, such as the IPRS and Islamic FX Currency Forwards, and conventional ones is presented.
Teaching the principles of health management to first year veterinary students.
Duffield, Todd; Lissemore, Kerry; Sandals, David
2003-01-01
A course called Health Management 1 was created as part of a new DVM curriculum at the Ontario Veterinary College. This full-year course was designed to introduce students to basic concepts of health management, integrating the disciplines of epidemiology, ethology, and public health in the context of selected animal industries. The course comprised 60 lecture hours and four two-hour laboratories. A common definition of health management, incorporating five principles, was used throughout the course in order to reinforce the concepts and maintain continuity between lecture blocks. Unlike in the years prior to the introduction of the new curriculum, epidemiology was presented as a tool of health management rather than as a separate discipline. To supplement the lecture and laboratory material, a Web-based resource was created, and students were required to review the appropriate section prior to each lecture block. Small quizzes, consisting of 10 questions each within WebCT, were used to stimulate self-directed learning. Overall, the course was well received by the students. The Web resources combined with the WebCT quizzes proved to be an effective method of stimulating students to prepare for lectures.
Terminal Area Conflict Detection and Resolution Tool
NASA Technical Reports Server (NTRS)
Verma, Savita Arora
2011-01-01
This poster describes the analysis of a conflict detection and resolution tool for the terminal area called T-TSAFE. With altitude clearance information, the tool can reduce false alerts to as few as two per hour.
Knowledge Management: A Skeptic's Guide
NASA Technical Reports Server (NTRS)
Linde, Charlotte
2006-01-01
A viewgraph presentation discussing knowledge management is shown. The topics include: 1) What is Knowledge Management? 2) Why Manage Knowledge? The Presenting Problems; 3) What Gets Called Knowledge Management? 4) Attempts to Rethink Assumptions about Knowledge; 5) What is Knowledge? 6) Knowledge Management and Institutional Memory; 7) Knowledge Management and Culture; 8) To solve a social problem, it's easier to call for cultural rather than organizational change; 9) Will the Knowledge Management Effort Succeed? and 10) Backup: Metrics for Valuing Intellectual Capital, i.e., Knowledge.
How to make experience your company's best teacher.
Kleiner, A; Roth, G
1997-01-01
In our personal life, experience is often the best teacher. Not so in corporate life. After a major event--a product failure, a downsizing crisis, or a merger--many companies stumble along, oblivious to the lessons of the past. Mistakes get repeated, but smart decisions do not. Most important, the old ways of thinking are never discussed, so they are still in place to spawn new mishaps. Individuals will often tell you that they understand what went wrong (or right). Yet their insights are rarely shared openly. And they are analyzed and internalized by the company even less frequently. Why? Because managers have few tools with which to capture institutional experience, disseminate its lessons, and translate them into effective action. In an effort to solve this problem, a group of social scientists, business managers, and journalists at MIT have developed and tested a tool called the learning history. It is a written narrative of a company's recent critical event, nearly all of it presented in two columns. In one column, relevant episodes are described by the people who took part in them, were affected by them, or observed them. In the other, learning historians--trained outsiders and knowledgeable insiders--identify recurrent themes in the narrative, pose questions, and raise "undiscussable" issues. The learning history forms the basis for group discussions, both for those involved in the event and for others who also might learn from it. The authors believe that this tool--based on the ancient practice of community storytelling--can build trust, raise important issues, transfer knowledge from one part of a company to another, and help build a body of generalizable knowledge about management.
... is usually done using a tool called a stethoscope. Health care providers routinely listen to a person's ... unborn infants. This can be done with a stethoscope or with sound waves (called Doppler ultrasound). Auscultation ...
ERIC Educational Resources Information Center
Parmaxi, Antigoni; Zaphiris, Panayiotis
2017-01-01
This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls and for supporting the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a musical score. Treating the interaction as a "duet" guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) provided a basic proof of concept: demographic data as well as information regarding the caller's situation could be identified. The study encourages following up on the vision of an integrated SACA tool that supports calls online and performs statistics on its knowledge database offline. A further research perspective is to assess SACA's potential in comparison with established interaction analysis systems such as RIAS.
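The keyword-spotting step on the client's side of the dialogue can be illustrated with a minimal sketch: match the tokens of an utterance against a category-organized lexicon. The categories and terms below are hypothetical stand-ins, not the actual SACA lexicon.

```python
import re

# Hypothetical keyword lexicon for the caller's part of the dialogue;
# the categories and terms are illustrative only.
KEYWORDS = {
    "diagnosis": {"cancer", "tumor", "metastasis"},
    "treatment": {"chemotherapy", "radiation", "surgery"},
}

def spot_keywords(utterance):
    """Return the lexicon categories whose keywords occur in the utterance."""
    tokens = set(re.findall(r"[a-z]+", utterance.lower()))
    return {category for category, terms in KEYWORDS.items() if tokens & terms}

hits = spot_keywords("My mother was told the tumor needs chemotherapy.")
# hits == {"diagnosis", "treatment"}
```

A production system would work on a recognizer's word stream rather than clean text, but the category-tagging logic is the same.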
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next-generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). It is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. The tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are presented as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole-genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software .
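The benchmarking arithmetic behind such pipeline comparisons is straightforward: score each pipeline's call set against a truth set and rank by precision, recall and F-measure. A minimal sketch, assuming variants are represented as (chrom, pos, ref, alt) tuples (ToTem itself orchestrates full variant-calling tools; this only shows the scoring step):

```python
def benchmark_calls(called, truth):
    """Score a variant call set against a truth set.

    `called` and `truth` are sets of (chrom, pos, ref, alt) tuples.
    Returns (precision, recall, f_measure).
    """
    tp = len(called & truth)                       # true positives
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    f_measure = (2 * precision * recall / (precision + recall)) if tp else 0.0
    return precision, recall, f_measure

truth = {("1", 100, "A", "G"), ("1", 200, "C", "T"), ("2", 50, "G", "A")}
called = {("1", 100, "A", "G"), ("1", 200, "C", "T"), ("3", 10, "T", "C")}
precision, recall, f_measure = benchmark_calls(called, truth)
# precision == recall == f_measure == 2/3
```

Cross-validation then repeats this scoring on held-out subsets so that a parameter setting is not rewarded for fitting one data set.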
Communicating marine reserve science to diverse audiences
Grorud-Colvert, Kirsten; Lester, Sarah E.; Airamé, Satie; Neeley, Elizabeth; Gaines, Steven D.
2010-01-01
As human impacts cause ecosystem-wide changes in the oceans, the need to protect and restore marine resources has led to increasing calls for and establishment of marine reserves. Scientific information about marine reserves has multiplied over the last decade, providing useful knowledge about this tool for resource users, managers, policy makers, and the general public. This information must be conveyed to nonscientists in a nontechnical, credible, and neutral format, but most scientists are not trained to communicate in this style or to develop effective strategies for sharing their scientific knowledge. Here, we present a case study from California, in which communicating scientific information during the process to establish marine reserves in the Channel Islands and along the California mainland coast expanded into an international communication effort. We discuss how to develop a strategy for communicating marine reserve science to diverse audiences and highlight the influence that effective science communication can have in discussions about marine management. PMID:20427745
Regional management of farmland feeding geese using an ecological prioritization tool.
Madsen, Jesper; Bjerrum, Morten; Tombre, Ingunn M
2014-10-01
Wild geese foraging on farmland cause increasing conflicts with agricultural interests, calling for a strategic approach to mitigation. In central Norway, conflicts between farmers and spring-staging pink-footed geese feeding on pastures have escalated. To alleviate the conflict, a scheme was introduced by which farmers are subsidized to allow geese to forage undisturbed. To guide the allocation of subsidies, an ecologically based ranking of fields at a regional level was recommended and applied. Here we evaluate the scheme. On average, 40 % of subsidized fields were in the top 5 % of the ranking, and 80 % were within the top 20 %. Goose grazing pressure on subsidized pastures was 13 times higher than on a stratified random selection of non-subsidized pastures, and subsidized fields captured 67 % of the pasture-feeding geese even though they comprised only 13 % of the grassland area. Close dialogue between scientists and managers is regarded as key to the success of the scheme.
A scope classification of data quality requirements for food composition data.
Presser, Karl; Hinterberger, Hans; Weber, David; Norrie, Moira
2016-02-15
Data quality is an important issue when managing food composition data, since the usage of the data can have a significant influence on policy making and further research. Although several frameworks for data quality have been proposed, general tools and measures are still lacking. As a first step in this direction, we investigated data quality requirements for an information system to manage food composition data, called FoodCASE. The objective of our investigation was to find out whether different requirements have different impacts on intrinsic data quality that must be considered during data quality assessment, and how these impacts can be described. We refer to the resulting classification, with its categories, as the scope classification of data quality requirements. As proof of feasibility, the scope classification has been implemented in the FoodCASE system.
TERMTrial--terminology-based documentation systems for cooperative clinical trials.
Merzweiler, A; Weber, R; Garde, S; Haux, R; Knaup-Gregori, P
2005-04-01
Within cooperative groups of multi-center clinical trials, standardized documentation is a prerequisite for communication and sharing of data. Standardizing documentation systems means standardizing the underlying terminology. The management and consistent application of terminology systems is a difficult and fault-prone task, which should be supported by appropriate software tools. Today, documentation systems for clinical trials are often implemented as so-called Remote Data Entry systems (RDE systems). Although there are many commercial systems which support the development of RDE systems, none offers comprehensive terminological support. Therefore, we developed the software system TERMTrial, which consists of a component for the definition and management of terminology systems for cooperative groups of clinical trials and two components for the terminology-based automatic generation of trial databases and the terminology-based interactive design of electronic case report forms (eCRFs). TERMTrial combines the advantages of remote data entry with comprehensive terminological control.
Environmental accounting--a decision support tool in WWTP operation and management.
Clauson-Kaas, J; Poulsen, T S; Jacobsen, B N; Guildal, T; Wenzel, H
2001-01-01
The various emissions to water, air and soil from the municipal wastewater treatment plant of the Avedore Wastewater Service Company are accounted for and quantified in terms of the environmental impacts to which they contribute: global warming, acidification, eutrophication, space demand for controlled deposition of residues, as well as persistent toxicity, human toxicity and eco-toxicity. The impacts are expressed on the same scale, namely as a fraction of the total per-capita load in a national 1990 scenario, also called the person equivalent or PE1990. This provides a compact and informative overview of the environmental impacts and allows for a holistic prioritisation in the operation and management of the plant. The accounting shows that the resulting emissions per person in the catchment area of the plant correspond to 0.5-5.0% of the average Danish PE1990 for the impacts in question.
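The person-equivalent bookkeeping amounts to dividing the plant's per-person emission of each impact by the corresponding 1990 national per-capita load. A minimal sketch with illustrative reference values (the figures below are stand-ins, not the actual Danish PE1990 numbers):

```python
# Illustrative 1990 per-capita reference loads (impact-equivalents per
# person per year); stand-in values, not the real Danish PE1990 figures.
PE1990_PER_CAPITA = {
    "global_warming": 8700.0,   # kg CO2-eq
    "eutrophication": 120.0,    # kg NO3-eq
}

def person_equivalent_fractions(plant_emissions, persons_in_catchment):
    """Express each impact as a fraction of the 1990 per-capita load (PE1990)."""
    return {
        impact: (total / persons_in_catchment) / PE1990_PER_CAPITA[impact]
        for impact, total in plant_emissions.items()
    }

fractions = person_equivalent_fractions(
    {"global_warming": 17_400_000.0, "eutrophication": 600_000.0},
    persons_in_catchment=100_000,
)
# global warming: 174 / 8700 = 0.02, i.e. 2% of a PE1990 per person
```

Putting every impact on this one dimensionless scale is what makes the holistic prioritisation described above possible.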
Precautionary policies in local government: green chemistry and safer alternatives.
Raphael, Debbie O; Geiger, Chris A
2011-01-01
Local governments like the City and County of San Francisco have shouldered the burden of toxic chemicals released into the environment through the substantial costs of health care, environmental cleanup, and infrastructure to purify drinking water, manage wastewater, and manage solid waste. Cities can no longer afford to wait for federal regulation to prevent toxic chemicals from appearing in products used locally. San Francisco's Precautionary Principle Policy calls on the City to act on early warning signs of harm and to use the best available science to identify safer alternatives. Under its umbrella, a wide array of policy tools has been utilized, including financial incentives through procurement contracts, certification and promotion of safer business practices, requirements for information disclosure, and bans and restrictions on the sale of products when safer alternatives are readily available. These policies can often become models for regional, state, and national change.
Serendipia: Castilla-La Mancha telepathology network
Peces, Carlos; García-Rojo, Marcial; Sacristán, José; Gallardo, Antonio José; Rodríguez, Ambrosio
2008-01-01
Nowadays, there is no standard solution for the acquisition, archiving and communication of digital pathology images. In addition, no commercial Pathology Information System (LIS) can manage the relationship between the reports generated by the pathologist and their corresponding images. For these reasons, the Healthcare Service of Castilla-La Mancha decided to create a completely digital Pathology Department; the project is called SERENDIPIA. The SERENDIPIA project provides all the image acquisition devices needed to cover every kind of image that can be generated in a Pathology Department. In addition, an Information System was developed that, on the one hand, covers the daily workflow of a Pathology Department (including the storage and management of reports and their images) and, on the other hand, provides a web telepathology portal with collaborative tools such as second opinion. PMID:18673519
NASA Technical Reports Server (NTRS)
2002-01-01
Under a Phase II SBIR contract, Kennedy and Lumina Decision Systems, Inc., jointly developed the Schedule and Cost Risk Analysis Modeling (SCRAM) system, based on a version of Lumina's flagship software product, Analytica(R). Acclaimed as "the best single decision-analysis program yet produced" by MacWorld magazine, Analytica is a "visual" tool used in decision-making environments worldwide to build, revise, and present business models, minus the time-consuming difficulty commonly associated with spreadsheets. With Analytica as their platform, Kennedy and Lumina created the SCRAM system in response to NASA's need to identify the importance of major delays in Shuttle ground processing, a critical function in project management and process improvement. As part of the SCRAM development project, Lumina designed a version of Analytica called the Analytica Design Engine (ADE) that can be easily incorporated into larger software systems. ADE was commercialized and utilized in many other developments, including web-based decision support.
Modeling a Civil Event Case Study for Consequence Management Using the IMPRINT Forces Module
NASA Technical Reports Server (NTRS)
Gacy, Marc; Gosakan, Mala; Eckdahl, Angela; Miller, Jeffrey R.
2012-01-01
A critical challenge in the Consequence Management (CM) domain is the appropriate allocation of necessary and skilled military and civilian personnel and materiel resources in unexpected emergencies. To aid this process we used the Forces module in the Improved Performance Research Integration Tool (IMPRINT). This module enables analysts to enter personnel and equipment capabilities, prioritized schedules and numbers available, along with unexpected emergency requirements, in order to assess force response requirements. Using a suspected terrorist threat on a college campus, we developed a test-case model that exercised the capabilities of the module, including the scope and scale of operations. The model incorporates data from multiple sources, including daily schedules and the frequency of events such as fire calls. Our preliminary results indicate that the model can predict potential decreases in civilian emergency response coverage when an unplanned incident requires significant portions of the police, fire and civil response teams.
NASA Astrophysics Data System (ADS)
Albano, R.; Sole, A.; Adamowski, J.
2015-02-01
As evidenced by the EU Floods Directive (2007/60/EC), flood management strategies in Europe have undergone a shift in focus in recent years. The goal of flood prevention using structural measures has been replaced by an emphasis on the management of flood risks using non-structural measures. One implication of this is that it is no longer public authorities alone who take responsibility for flood management. A broader range of stakeholders, who may experience the negative effects of flooding, also take on responsibility to protect themselves. Therefore, it is vital that information concerning flood risks is conveyed to those who may be affected in order to facilitate the self-protection of citizens. Experience shows that even where efforts have been made to communicate flood risks, problems persist. There is a need for the development of new tools, which are able to rapidly disseminate flood risk information to the general public. To be useful, these tools must be able to present information relevant to the location of the user. Moreover, the content and design of the tool need to be adjusted to laypeople's needs. Dissemination and communication influence both people's access to and understanding of natural risk information. Such a tool could be a useful aid to effective management of flood risks. To address this gap, a Web-based Geographical Information System (WebGIS) has been developed through the collaborative efforts of a group of scientists, hazard and risk analysts and managers, GIS analysts, system developers and communication designers. This tool, called "READY: Risk, Extreme Events, Adaptation, Defend Yourself", aims to enhance the general public knowledge of flood risk, making them more capable of responding appropriately during a flood event. The READY WebGIS has allowed for the visualization and easy querying of a complex hazard and risk database thanks to a high degree of interactivity and its easily readable maps.
In this way, READY has enabled fast exploration of alternative flood scenarios or past calamitous events. Combined also with a system of graphic symbols designed ad hoc for communication of self-protection behaviors, it is believed READY could lead to an increase in citizen participation, informed discussion and consensus building. The platform has been developed for a site-specific application: the Basilicata Region, Italy, has been selected as the pilot application area. The goal of the prototype is to raise citizen awareness of flood risks, and to build social capacity and enhanced resilience to flood events.
NASA Astrophysics Data System (ADS)
Albano, R.; Sole, A.; Adamowski, J.
2015-07-01
As evidenced by the EU Floods Directive (2007/60/EC), flood management strategies in Europe have undergone a shift in focus in recent years. The goal of flood prevention using structural measures has been replaced by an emphasis on the management of flood risks using non-structural measures. One implication of this is that it is no longer public authorities alone who take responsibility for flood management. A broader range of stakeholders, who may personally experience the negative effects of flooding, also take on responsibility for protecting themselves. Therefore, it is vital that information concerning flood risks is conveyed to those who may be affected in order to facilitate the self-protection of citizens. Experience shows that problems persist even where efforts have been made to communicate flood risks. There is a need for the development of new tools that are able to rapidly disseminate flood-risk information to the general public. To be useful these tools must be able to present information relevant to the location of the user. Moreover, the content and design of the tool need to be adjusted to laypeople's needs. Dissemination and communication influence both people's access to and understanding of natural risk information. Such a tool could be a useful aid to effective management of flood risks. To address this gap, a web-based geographical information system (WebGIS) has been developed through the collaborative efforts of a group of scientists, hazard and risk analysts and managers, GIS analysts, system developers and communication designers. This tool, called "READY: Risk, Extreme Events, Adaptation, Defend Yourself", aims to enhance the general public knowledge of flood risk, making citizens more capable of responding appropriately during a flood event. The READY WebGIS has allowed for the visualization and easy querying of a complex hazard and risk database thanks to a high degree of interactivity and easily read maps.
In this way, READY has enabled fast exploration of alternative flood scenarios or past calamitous events. Combined also with a system of graphic symbols designed ad hoc for communication of self-protection behaviours, it is believed READY could lead to an increase in citizen participation, informed discussion and consensus building. The platform has been developed for a site-specific application: the Basilicata region, Italy, has been selected as pilot application area. The goal of the prototype is to raise citizen awareness of flood risks and to build social capacity and enhanced resilience to flood events.
Mobile Health Devices as Tools for Worldwide Cardiovascular Risk Reduction and Disease Management.
Piette, John D; List, Justin; Rana, Gurpreet K; Townsend, Whitney; Striplin, Dana; Heisler, Michele
2015-11-24
We examined evidence on whether mobile health (mHealth) tools, including interactive voice response calls, short message service, or text messaging, and smartphones, can improve lifestyle behaviors and management related to cardiovascular diseases throughout the world. We conducted a state-of-the-art review and literature synthesis of peer-reviewed and gray literature published since 2004. The review prioritized randomized trials and studies focused on cardiovascular diseases and risk factors, but included other reports when they represented the best available evidence. The search emphasized reports on the potential benefits of mHealth interventions implemented in low- and middle-income countries. Interactive voice response and short message service interventions can improve cardiovascular preventive care in developed countries by addressing risk factors including weight, smoking, and physical activity. Interactive voice response and short message service-based interventions for cardiovascular disease management also have shown benefits with respect to hypertension management, hospital readmissions, and diabetic glycemic control. Multimodal interventions including Web-based communication with clinicians and mHealth-enabled clinical monitoring with feedback also have shown benefits. The evidence regarding the potential benefits of interventions using smartphones and social media is still developing. Studies of mHealth interventions have been conducted in >30 low- and middle-income countries, and evidence to date suggests that programs are feasible and may improve medication adherence and disease outcomes. Emerging evidence suggests that mHealth interventions may improve cardiovascular-related lifestyle behaviors and disease management. Next-generation mHealth programs developed worldwide should be based on evidence-based behavioral theories and incorporate advances in artificial intelligence for adapting systems automatically to patients' unique and changing needs. 
Bagonza, James; Rutebemberwa, Elizeus; Eckmanns, Tim; Ekirapa-Kiracho, Elizabeth
2015-11-25
The integrated Community Case Management (iCCM) of childhood illnesses strategy has been adopted the world over to reduce child ill health and mortality. Community Health Workers (CHWs) who implement this strategy need a regular supply of drugs to effectively treat children under 5 years with malaria, pneumonia and diarrhea. In this paper, we report the prevalence and factors influencing availability of medicines for managing malaria, pneumonia and diarrhea in communities in central Uganda. A cross-sectional study was conducted among 303 CHWs in Wakiso district in central Uganda. Eligible CHWs from two randomly selected Health Sub-Districts (HSDs) were interviewed. Questionnaires, checklists and record reviews were used to collect information on CHW background characteristics, CHW prescription behaviors, health system support factors and availability of iCCM drugs. Multivariable logistic regression analysis was done to assess factors associated with availability of iCCM drugs. Out of 300 CHWs, 239 (79.9%) were female, and the mean age was 42.1 years (standard deviation = 11.1 years). The prevalence of iCCM drug availability was 8.3%, and 33 respondents (11%) had no drugs at all. Factors associated with iCCM drug availability were: being supervised within the last month (adjusted OR = 3.70, 95% CI 1.22-11.24), appropriate drug prescriptions (adjusted OR = 3.71, 95% CI 1.38-9.96), regular submission of drug reports (adjusted OR = 4.02, 95% CI 1.62-10.10) and having a respiratory timer as a diagnostic tool (adjusted OR = 3.11, 95% CI 1.08-9.00). The low medicine stocks for the community management of childhood illnesses call for strengthening CHW supervision, medicine prescription and reporting, and increasing the availability of functional diagnostic tools.
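The adjusted odds ratios above come from multivariable logistic regression; the crude (unadjusted) counterpart of such an estimate is just the cross-product ratio of a 2x2 table. A sketch with made-up counts, not the study's data:

```python
def crude_odds_ratio(a, b, c, d):
    """Cross-product odds ratio for a 2x2 table:

        a = exposed, outcome present      b = exposed, outcome absent
        c = unexposed, outcome present    d = unexposed, outcome absent
    """
    return (a * d) / (b * c)

# Hypothetical counts: CHWs supervised in the last month vs. iCCM drugs in stock.
or_supervision = crude_odds_ratio(30, 20, 15, 37)
# (30 * 37) / (20 * 15) == 3.7
```

A multivariable model adjusts each such estimate for the other factors simultaneously, which is why the reported ORs differ from the crude ones.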
Examining A Health Care Price Transparency Tool: Who Uses It, And How They Shop For Care.
Sinaiko, Anna D; Rosenthal, Meredith B
2016-04-01
Calls for transparency in health care prices are increasing, in an effort to encourage and enable patients to make value-based decisions. Yet there is very little evidence of whether and how patients use health care price transparency tools. We evaluated the experiences, in the period 2011-12, of an insured population of nonelderly adults with Aetna's Member Payment Estimator, a web-based tool that provides real-time, personalized, episode-level price estimates. Overall, use of the tool increased during the study period but remained low. Nonetheless, for some procedures the number of people searching for prices of services (called searchers) was high relative to the number of people who received the service (called patients). Among Aetna patients who had an imaging service, childbirth, or one of several outpatient procedures, searchers for price information were significantly more likely to be younger and healthier and to have incurred higher annual deductible spending than patients who did not search for price information. A campaign to deliver price information to consumers may be important to increase patients' engagement with price transparency tools.
Calvo-Lerma, Joaquim; Martinez-Jimenez, Celia P; Lázaro-Ramos, Juan-Pablo; Andrés, Ana; Crespo-Escobar, Paula; Stav, Erlend; Schauber, Cornelia; Pannese, Lucia; Hulst, Jessie M; Suárez, Lucrecia; Colombo, Carla; Barreto, Celeste; de Boeck, Kris; Ribes-Koninckx, Carmen
2017-03-16
For the optimal management of children with cystic fibrosis, there are currently no efficient tools for the precise adjustment of pancreatic enzyme replacement therapy, for advice on appropriate dietary intake, or for achieving an optimal nutritional status. Therefore, we aim to develop a mobile application that ensures a successful nutritional therapy in children with cystic fibrosis. A multidisciplinary team of 12 partners coordinates its efforts in 9 work packages that cover the entire so-called 'from laboratory to market' approach by means of an original and innovative co-design process. A cohort of 200 patients with cystic fibrosis aged 1-17 years are enrolled. We will develop an innovative, clinically tested mobile health application for patients and health professionals involved in cystic fibrosis management. The mobile application integrates the research knowledge and innovative tools for maximising self-management, with the aim of leading to a better nutritional status, quality of life and disease prognosis. Bringing together different and complementary areas of knowledge is fundamental for tackling complex challenges in disease treatment, such as optimal nutrition and pancreatic enzyme replacement therapy in cystic fibrosis. Patients are expected to benefit the most from the outcomes of this innovative project. The project is approved by the Ethics Committee of the coordinating organisation, Hospital Universitari La Fe (Ref: 2014/0484). Scientific findings will be disseminated via journals and conferences addressed to clinicians, food scientists, information and communications technology experts and patients. A dedicated dissemination working group within the project will address communication to the wider audience through the website (http://www.mycyfapp.eu), social networks and a newsletter.
Towbin, Alexander J; Hall, Seth; Moskovitz, Jay; Johnson, Neil D; Donnelly, Lane F
2011-01-01
Communication of acute or critical results between the radiology department and referring clinicians has been a deficiency of many radiology departments. The failure to perform or document these communications can lead to poor patient care, patient safety issues, medical-legal issues, and complaints from referring clinicians. To mitigate these factors, a communication and documentation tool was created and incorporated into our departmental customer service program. This article will describe the implementation of a comprehensive customer service program in a hospital-based radiology department. A comprehensive customer service program was created in the radiology department. Customer service representatives were hired to answer the telephone calls to the radiology reading rooms and to help convey radiology results. The radiologists, referring clinicians, and customer service representatives were then linked via a novel workflow management system. This workflow management system provided tools to help facilitate the communication needs of each group. The number of studies with results conveyed was recorded from the implementation of the workflow management system. Between the implementation of the workflow management system on August 1, 2005, and June 1, 2009, 116,844 radiology results were conveyed to the referring clinicians and documented in the system. This accounts for more than 14% of the 828,516 radiology cases performed in this time frame. We have been successful in creating a comprehensive customer service program to convey and document communication of radiology results. This program has been widely used by the ordering clinicians as well as radiologists since its inception.
HealthATM: personal health cyberinfrastructure for underserved populations.
Botts, Nathan E; Horan, Thomas A; Thoms, Brian P
2011-05-01
There is an opportunity for personal health record (PHR) systems to play a vital role in fostering health self-management within underserved populations. If properly designed and promoted, it is possible that patients will use PHRs to become more empowered in taking an active role toward managing their health needs. This research examines the potential of a cyberinfrastructure-based PHR to encourage patient activation in health care, while also having population health implications. A multi-phased, iterative research approach was used to design and evaluate a PHR system called HealthATM, which utilizes services from a cloud computing environment. These services were integrated into an ATM-style interface aimed at providing a broad range of health consumers with the ability to manage health conditions and encourage accomplishment of health goals. Evaluation of the PHR included 115 patients who were clients of several free clinics in Los Angeles County. The majority of patients perceived ease of use (74%) and confidence (73%) in using the HealthATM system, and thought they would like to use it frequently (73%). Patients also indicated a belief in being responsible for their own health. However, fewer felt as though they were able to maintain necessary life changes to improve their health. Findings from the field tests suggest that PHRs can be a beneficial health management tool for underserved populations. In order for these types of tools to be effective within safety-net communities, they must be technically accessible and provide meaningful opportunities to increase patient engagement in their health care. Copyright © 2011. Published by Elsevier Inc.
How to map your industry's profit pool.
Gadiesh, O; Gilbert, J L
1998-01-01
Many managers chart strategy without a full understanding of the sources and distribution of profits in their industry. Sometimes they focus their sights on revenues instead of profits, mistakenly assuming that revenue growth will eventually translate into profit growth. In other cases, they simply lack the data or the analytical tools required to isolate and measure variations in profitability. In this Manager's Tool Kit, the authors present a way to think clearly about where the money's being made in any industry. They describe a framework for analyzing how profits are distributed among the activities that form an industry's value chain. Such an analysis can provide a company's managers with a rich understanding of their industry's profit structure--what the authors call its profit pool--enabling them to identify which activities are generating disproportionately large or small shares of profits. Even more important, a profit-pool map opens a window onto the underlying structure of the industry, helping managers see the various forces that are determining the distribution of profits. As such, a profit-pool map provides a solid basis for strategic thinking. Mapping a profit pool involves four steps: defining the boundaries of the pool, estimating the pool's overall size, estimating the size of each value-chain activity in the pool, and checking and reconciling the calculations. The authors briefly describe each step and then apply the process by providing a detailed example of a hypothetical retail bank. They conclude by looking at ways of organizing the data in chart form as a first step toward plotting a profit-pool strategy.
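The four mapping steps can be made concrete with a toy calculation. The sketch below uses invented figures for a hypothetical retail bank (none of the numbers come from the article) to show how comparing an activity's share of the profit pool with its share of revenue exposes disproportionate profit generation.

```python
# Hypothetical sketch of profit-pool mapping; all figures are illustrative.

# Step 1: define the pool's boundaries (the value-chain activities).
activities = {
    # activity: (revenue, operating margin)
    "deposits": (500.0, 0.10),
    "credit_cards": (200.0, 0.25),
    "mortgages": (300.0, 0.05),
}

# Steps 2 & 3: estimate each activity's profit and the pool's overall size.
profits = {name: rev * margin for name, (rev, margin) in activities.items()}
pool_size = sum(profits.values())

# Comparing profit share with revenue share flags disproportionate activities.
total_revenue = sum(rev for rev, _ in activities.values())
for name, (rev, _) in activities.items():
    profit_share = profits[name] / pool_size
    revenue_share = rev / total_revenue
    print(f"{name}: {profit_share:.0%} of profits on {revenue_share:.0%} of revenue")

# Step 4: check and reconcile against an independent top-down estimate.
independent_estimate = 118.0
assert abs(pool_size - independent_estimate) / independent_estimate < 0.05
```

In this toy example credit cards earn roughly 43% of the pool on only 20% of revenue, the kind of asymmetry a profit-pool map is meant to surface.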
NASA Astrophysics Data System (ADS)
Koutiva, Ifigeneia; Makropoulos, Christos
2015-04-01
The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges, in this regard, is the design and development of tools that are able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions, which lead to the formation of people's attitudes; these attitudes eventually shape behaviours. This work presents the design of an agent-based modelling (ABM) tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are influenced as well by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution.
The main advantage of the UWAB-UWOT model integration is that it allows the investigation of the effects of different water demand management strategies on an urban population's water demand behaviour, and ultimately the effects of these policies on the volume of domestic water demand and on the water resources system. The proposed modelling platform is calibrated by simulating the effects of water policies during the Athens drought period of 1988-1994. The calibrated modelling platform is then applied to evaluate scenarios of water supply, water demand and water demand management strategies.
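The household-agent dynamic described above can be sketched in a few lines. The original UWAB model is implemented in NetLogo; the following is a simplified Python illustration in which the network structure, update rule, and pressure parameter are all illustrative assumptions, not the published model.

```python
import random

# Simplified sketch of an UWAB-style agent-based model: household agents on a
# social network adopt a conservation level (no/low/high) under neighbour
# influence and policy/drought pressure. All parameters are illustrative.
LEVELS = ("no", "low", "high")

def simulate(n_agents=100, n_neighbours=4, pressure=0.1, steps=50, seed=1):
    rng = random.Random(seed)
    state = [0] * n_agents  # every household starts at "no" conservation
    neigh = [rng.sample(range(n_agents), n_neighbours) for _ in range(n_agents)]
    for _ in range(steps):
        new = []
        for i in range(n_agents):
            avg = sum(state[j] for j in neigh[i]) / n_neighbours
            # social influence pulls toward the neighbourhood average;
            # policy/drought pressure occasionally nudges agents upward
            target = avg + (2 if rng.random() < pressure else 0)
            new.append(min(2, round((state[i] + target) / 2)))
        state = new
    # model outcome: distribution of conservation levels in the population
    return {lvl: state.count(k) / n_agents for k, lvl in enumerate(LEVELS)}

print(simulate())
```

The returned distribution plays the role of the model's "final outcome" described in the abstract: the share of the population at each conservation level after the simulated period.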
Moore, Eider B; Poliakov, Andrew V; Lincoln, Peter; Brinkley, James F
2007-01-01
Background: Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality to general-purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results: We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion: MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine. PMID:17937818
Data Management Guidance in the Context of Climate Risk-Management
NASA Astrophysics Data System (ADS)
Sylak-Glassman, E.
2016-12-01
Climate risk-management, while a national issue, often occurs at a local level. To prepare for the effects of climate change, community decision-makers require a diverse set of data from historical records, social science, observations, and models, much of which is collected and curated by Federal agencies. The President's Climate Action Plan calls for building stronger and safer communities and infrastructure to prepare the United States for the impacts of climate change, and the Obama Administration has prioritized making Federal data more discoverable, accessible, and usable to inform climate risk-management and other data-informed decisions. To understand the state of guidance for data provision for climate risk-management, we analyzed Federal, agency, and interagency documents related to open data, climate data, and data management in general, such as the Common Framework for Earth-Observation Data. We examined guidance related to the principles of data discovery, access, and ease of use, as well as the data management categories of application programming interfaces, controlled vocabularies and ontologies, metadata, persistent dataset identifiers, preservation, and usage metrics. This analysis showed both the extent of guidance provided and the gaps that remain. Following the literature review, we held structured conversations with Federal climate data managers and tool developers to identify areas where further efforts could enhance provision of agency data for climate risk-management. Our analysis can be used by data managers to understand how various data management practices can help improve climate risk-management and where to find further guidance.
Ketso: A New Tool for Extension Professionals
ERIC Educational Resources Information Center
Bates, James S.
2016-01-01
Extension professionals employ many techniques and tools to obtain feedback, input, information, and data from stakeholders, research participants, and program learners. An information-gathering tool called Ketso is described in this article. This tool and its associated techniques can be used in all phases of program development, implementation,…
A call for benchmarking transposable element annotation methods.
Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu
2015-01-01
DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks-that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.
Changes in HIV needs identified by the National AIDS Hotline of Trinidad and Tobago.
Reid, Sandra D; Nielsen, Anders L; Reddock, Rhoda
2010-02-01
To examine utilization of the National AIDS Hotline of Trinidad and Tobago (AIDSLINE), evaluate its validity as a reliable data source for monitoring national HIV-related needs, and identify changes in caller requests between two different time periods. A total of 7,046 anonymous hotline calls in 1998-2002 (T1) and 2,338 calls in 2007 (T2) were analyzed for associations between caller characteristics and call content. A subsample of the data was also analyzed qualitatively. T1 findings were compared with HIV-related data collected by national policy-makers during that period, to evaluate the hotline's validity as a data source, and findings from T2, to reveal changes in call content over time. In T1, the hotline was well utilized for information and counseling by both the general population and those living with HIV/AIDS. Call content from T2 indicated an increase versus T1 in 1) general awareness of HIV and other sexually transmitted diseases; 2) HIV testing; and 3) knowledge of HIV symptoms and transmission. HIV-related mental health needs, and the relationship between HIV and both child sexual abuse (CSA) and intimate partner violence (IPV), were identified as emerging issues. AIDSLINE is a well-utilized tool for providing information and counseling on national HIV-related issues, and a valid, cost-effective, easily accessed information source for planners and policy-makers involved in HIV management. Over the two study periods, there was an increase in HIV awareness and testing and in requests related to mental health, CSA, and IPV, but no change in sexual behaviors.
Using telephony data to facilitate discovery of clinical workflows.
Rucker, Donald W
2017-04-19
Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and among these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or redesign targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
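The aggregation step described above can be sketched as follows. The field names, sample records, and output format are illustrative assumptions, not the study's actual log schema; the idea is simply to roll call detail records up into caller/callee pair statistics and an edge list suitable for an open source graph tool.

```python
import csv
import io
from collections import defaultdict

# Illustrative sample of call detail records (CDRs); real logs would be
# parsed from the unified communications server's export format.
sample_cdr = """caller,called,duration_s
4001,4002,25
4001,4002,31
4003,4100,310
4001,4002,20
"""

def build_edges(cdr_text):
    """Aggregate CDRs into (source, target, call volume, mean duration) edges."""
    pairs = defaultdict(lambda: [0, 0])  # (caller, called) -> [calls, seconds]
    for row in csv.DictReader(io.StringIO(cdr_text)):
        stats = pairs[(row["caller"], row["called"])]
        stats[0] += 1
        stats[1] += int(row["duration_s"])
    return [(src, dst, n, secs / n) for (src, dst), (n, secs) in pairs.items()]

edges = build_edges(sample_cdr)
for src, dst, n, avg in edges:
    print(f"{src} -> {dst}: {n} calls, avg {avg:.0f}s")
```

Each edge can then be written to node/edge files for a graph visualization tool, with high-volume, short-duration pairs standing out as candidate automation targets.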
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, through quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points (HACCP) tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, following the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps of a method with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
A visual analytical approach to rock art panel condition assessment
NASA Astrophysics Data System (ADS)
Vogt, Brandon J.
Rock art is a term for pecked, scratched, or painted symbols found on rock surfaces, most typically joint faces called rock art panels. Because rock art exists at the rock-atmosphere interface, it is highly susceptible to the destructive processes of weathering. Thus, rock weathering scientists, including those who study both natural and cultural surfaces, play a key role in understanding rock art longevity. The mapping of weathering forms on rock art panels serves as a basis from which to assess overall panel instability. This work examines fissures, case-hardened surfaces, crumbly disintegration, and lichen. Knowledge of instability, as measured through these and other weathering forms, provides the information that land managers and archaeological conservators require to prioritize panels for remedial action. The work is divided into five chapters, three of which will be submitted as peer-reviewed journal manuscripts. The second chapter, written as a manuscript for International Newsletter on Rock Art, describes a specific set of criteria that led to the development of a mapping tool for weathering forms, called 'mapping weathering forms in three dimensions' (MapWeF). The third chapter, written as a manuscript for Remote Sensing of Environment, presents the methodology used to develop MapWeF. The chapter incorporates terrestrial laser scanning, a geographic information system (GIS), geovisualization, image analysis, and exploratory spatial data analysis (ESDA) to identify, map, and quantify weathering features known to cause instability on rock art panels. The methodology implemented in the third chapter satisfies the criteria described in Chapter Two. In the fourth chapter, prepared as a manuscript for Geomorphology, MapWeF is applied to a site management case study, focusing on a region, southeastern Colorado, with notoriously weak and endangered sandstone rock art panels.
The final conclusions chapter describes contributions of the work to GIScience and rock weathering, and discusses how MapWeF, as a diagnostic tool, fits into a larger national vision by linking existing rock art stability characterizations to cultural resource management-related conservation action.
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models and add geometric and geologic complexity as understanding is gained. The platform toolset (Akuna) generates these conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capabilities from the community (e.g., CrunchFlow, PFLOTRAN), a biogeochemistry interface library called Alquimia was developed. To ensure that Amanzi is truly an open-source community code, we require a completely open-source tool chain for its development. We will comment on elements of this tool chain, including testing and documentation tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
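The variable-dependency graph mentioned above is the kind of structure that lets a simulator assemble models flexibly at run time: each variable declares what it depends on, and evaluators run in topological order. The following sketch is an illustration of that general idea only; the variable names and function are hypothetical and do not reflect Amanzi's actual API.

```python
# Hypothetical dependency graph: each model variable lists the variables it
# depends on. A depth-first traversal yields a valid evaluation order.
deps = {
    "pressure": [],
    "saturation": ["pressure"],
    "rel_permeability": ["saturation"],
    "darcy_flux": ["pressure", "rel_permeability"],
}

def evaluation_order(deps):
    """Return the variables in an order where dependencies come first."""
    order, seen = [], set()

    def visit(var):
        if var in seen:
            return
        seen.add(var)
        for d in deps[var]:
            visit(d)  # evaluate prerequisites before the variable itself
        order.append(var)

    for var in deps:
        visit(var)
    return order

print(evaluation_order(deps))
```

Because the order is derived from the declared dependencies rather than hard-coded, swapping a model component in or out only changes the graph, not the driver logic.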
Soldering Tool for Integrated Circuits
NASA Technical Reports Server (NTRS)
Takahashi, Ted H.
1987-01-01
Many connections soldered simultaneously in confined spaces. Improved soldering tool bonds integrated circuits onto printed-circuit boards. Intended especially for use with so-called "leadless-carrier" integrated circuits.
NASA Astrophysics Data System (ADS)
Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.
2017-12-01
Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
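One plausible way to score candidate regionalizations, sketched below under stated assumptions, is the share of variance in BMP removal efficiency explained by grouping sites into regions (between-group sum of squares over total). The site data and the two candidate groupings are invented for illustration; the actual analysis draws on the International Stormwater BMP Database.

```python
from statistics import mean

# Illustrative observations: (site, pollutant removal efficiency in %).
observations = [
    ("A", 40.0), ("B", 45.0), ("C", 70.0), ("D", 75.0),
]
# Two hypothetical candidate regionalizations assigning sites to regions.
candidate_regions = {
    "east_west": {"A": "east", "B": "east", "C": "west", "D": "west"},
    "odd_even": {"A": "r1", "B": "r2", "C": "r1", "D": "r2"},
}

def variance_explained(assignment):
    """Between-group sum of squares divided by total sum of squares."""
    grand = mean(v for _, v in observations)
    groups = {}
    for site, v in observations:
        groups.setdefault(assignment[site], []).append(v)
    ss_total = sum((v - grand) ** 2 for _, v in observations)
    ss_between = sum(len(vals) * (mean(vals) - grand) ** 2
                     for vals in groups.values())
    return ss_between / ss_total

scores = {name: variance_explained(a) for name, a in candidate_regions.items()}
best = max(scores, key=scores.get)
print(best, scores[best])
```

The regionalization with the highest variance explained is the one that best captures regional differences in BMP performance and would parameterize the tool's default database.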
Challenges in Achieving Trajectory-Based Operations
NASA Technical Reports Server (NTRS)
Cate, Karen Tung
2012-01-01
In the past few years much of the global ATM research community has proposed advanced systems based on Trajectory-Based Operations (TBO). The TBO concept uses four-dimensional aircraft trajectories as the base information for managing safety and capacity. Both the US and European advanced ATM programs call for the sharing of trajectory data across different decision support tools for successful operations. However, the actual integration of TBO systems presents many challenges. Trajectory predictors are built to meet the specific needs of a particular system and are not always compatible with one another. Two case studies are presented that examine the challenges of introducing a new concept into two legacy systems with regard to their trajectory prediction software. The first case describes the issues with integrating a new decision support tool with a legacy operational system that overlaps it in domain space. These tools perform similar functions but are driven by different requirements. The differences in the resulting trajectories can lead to conflicting advisories. The second case looks at integrating this same new tool with a legacy system that was originally developed as part of an integrated system but diverged many years ago. Both cases illustrate how the lack of common architecture concepts for the trajectory predictors added cost and complexity to the integration efforts.
Adaptive awareness for personal and small group decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perano, Kenneth J.; Tucker, Steve; Pancerella, Carmen M.
2003-12-01
Many situations call for the use of sensors monitoring physiological and environmental data. In order to use the large amounts of sensor data to affect decision making, we are coupling heterogeneous sensors with small, light-weight processors, other powerful computers, wireless communications, and embedded intelligent software. The result is an adaptive awareness and warning tool, which provides both situation awareness and personal awareness to individuals and teams. Central to this tool is a sensor-independent architecture, which combines both software agents and a reusable core software framework that manages the available hardware resources and provides services to the agents. Agents can recognize cues from the data, warn humans about situations, and act as decision-making aids. Within the agents, self-organizing maps (SOMs) are used to process physiological data in order to provide personal awareness. We have employed a novel clustering algorithm to train the SOM to discern individual body states and activities. This awareness tool has broad applicability to emergency teams, military squads, military medics, individual exercise and fitness monitoring, health monitoring for sick and elderly persons, and environmental monitoring in public places. This report discusses our hardware decisions, software framework, and a pilot awareness tool, which has been developed at Sandia National Laboratories.
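The SOM technique mentioned above can be illustrated with a minimal version. The report's training algorithm is novel and not reproduced here; the sketch below is a generic 1-D SOM with a simple "bubble" neighborhood, and the map size, rates, and synthetic two-feature samples (e.g. normalized heart rate and skin temperature) are all assumptions.

```python
import random

# Minimal 1-D self-organizing map sketch for 2-feature samples in [0, 1].
def train_som(samples, n_units=4, epochs=30, lr=0.5, radius=1, seed=0):
    rng = random.Random(seed)
    w = [[rng.random(), rng.random()] for _ in range(n_units)]  # unit weights
    for _ in range(epochs):
        for x in samples:
            # best matching unit: the unit whose weights are closest to x
            bmu = min(range(n_units),
                      key=lambda i: sum((w[i][d] - x[d]) ** 2 for d in range(2)))
            # pull the BMU and its map neighbours toward the sample
            for i in range(max(0, bmu - radius), min(n_units, bmu + radius + 1)):
                for d in range(2):
                    w[i][d] += lr * (x[d] - w[i][d])
    return w

# two synthetic "body states": rest (low/low) and exertion (high/high)
data = [(0.1, 0.2), (0.15, 0.1), (0.9, 0.8), (0.85, 0.95)]
weights = train_som(data)
print(weights)
```

After training, distinct map units gravitate toward distinct clusters of samples, which is how a SOM can be used to discern body states from streaming physiological data.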
Ocean Drilling Program: TAMRF Administrative Services: Meeting, Travel, and Port-Call Information
All ODP meeting and port-call activities are complete.
Try These Time Management Tips.
ERIC Educational Resources Information Center
Bimrose, Jack J.
1987-01-01
Offers 12 time management tips for harried school administrators, including using a personal calendar, calling five-minute meetings with secretaries, mail-sorting, delegating or declining certain tasks, controlling visitors, screening phone calls, streamlining meetings, and other ideas. (MLH)
Thinking about Pregnancy After Premature Birth
... a kind of fertility treatment called assisted reproductive technology (also called ART). Fertility treatment is medical treatment ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
..., please e-mail [email protected] or call (866) 208-3676 (toll free). For TTY, call (202) 502... Resources Management, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... proceeding of EquiPower Resources Management, LLC's application for market-based rate authority, with an...
Towards Agent-Oriented Approach to a Call Management System
NASA Astrophysics Data System (ADS)
Ashamalla, Amir Nabil; Beydoun, Ghassan; Low, Graham
There is a greater chance of a completed sale if end customers and relationship managers are suitably matched. This in turn can reduce the number of calls made by a call centre, reducing operational costs such as working time and phone bills. This chapter is part of ongoing research aimed at helping a call management centre (CMC) make better use of its personnel and equipment while maximizing the value of the service it offers to its client companies and end customers. This is accomplished by ensuring the optimal use of resources with appropriate real-time scheduling and load balancing, and by matching end customers to appropriate relationship managers. In a globalized market, this may mean taking into account the cultural environment of the customer, as well as the appropriate profile and/or skill of the relationship manager, to communicate effectively with the end customer. The chapter evaluates the suitability of a multi-agent system (MAS) for a call management system and illustrates the requirements analysis phase using i* models.
Structural design/margin assessment
NASA Technical Reports Server (NTRS)
Ryan, R. S.
1993-01-01
Determining structural design inputs and the structural margins following design completion is one of the major activities in space exploration. The end result is a statement of these margins as stability, safety factors on ultimate and yield stresses, fracture limits (fracture control), fatigue lifetime, reuse criteria, operational criteria and procedures, stability factors, deflections, clearance, handling criteria, etc. The process is normally called a load cycle and is time consuming, very complex, and involves much more than structures. The key to successful structural design is the proper implementation of the process. It depends on many factors: leadership and management of the process, adequate analysis and testing tools, data basing, communications, people skills, and training. This process and the various factors involved are discussed.
Comprehensive Environmental Assessment: A Meta-Assessment Approach
2012-01-01
With growing calls for changes in the field of risk assessment, improved systematic approaches for addressing environmental issues with greater transparency and stakeholder engagement are needed to ensure sustainable trade-offs. Here we describe the comprehensive environmental assessment (CEA) approach as a holistic way to manage complex information and to structure input from diverse stakeholder perspectives to support environmental decision-making for the near- and long-term. We further note how CEA builds upon and incorporates other available tools and approaches, describe its current application at the U.S. Environmental Protection Agency, and point out how it could be extended in evaluating a major issue such as the sustainability of biofuels. PMID:22889372
Minerva: User-Centered Science Operations Software Capability for Future Human Exploration
NASA Technical Reports Server (NTRS)
Deans, Matthew; Marquez, Jessica J.; Cohen, Tamar; Miller, Matthew J.; Deliz, Ivonne; Hillenius, Steven; Hoffman, Jeffrey; Lee, Yeon Jin; Lees, David; Norheim, Johannes;
2017-01-01
In June of 2016, the Biologic Analog Science Associated with Lava Terrains (BASALT) research project conducted its first field deployment, which we call BASALT-1. BASALT-1 consisted of a science-driven field campaign in a volcanic field in Idaho as a simulated human mission to Mars. Scientists and mission operators were provided a suite of ground software tools that we refer to collectively as Minerva to carry out their work. Minerva provides capabilities for traverse planning and route optimization, timeline generation and display, procedure management, execution monitoring, data archiving, visualization, and search. This paper describes the Minerva architecture, constituent components, use cases, and some preliminary findings from the BASALT-1 campaign.
2004-03-03
Dr. Tom Mace, NASA DFRC Director of Airborne Sciences, and Walter Klein(far right), NASA DFRC Airborne Science Mission Manager, brief John Danilovich, US Ambassador to Costa Rica, and NASA Administrator Sean O'Keefe onboard NASA's DC-8 during a stop-off on the AirSAR 2004 Mesoamerica campaign. AirSAR 2004 Mesoamerica is a three-week expedition by an international team of scientists that will use an all-weather imaging tool, called the Airborne Synthetic Aperture Radar (AirSAR), in a mission ranging from the tropical rain forests of Central America to frigid Antarctica.
VIP tour of NASA DFRC's DC-8 during the AirSAR 2004 Mesoamerica campaign
2004-03-03
VIP tour of NASA DFRC's DC-8 airborne laboratory during the AirSAR 2004 Mesoamerica campaign given by Craig Dobson, NASA Program Manager for AirSAR, L-R: Dr. Sonia Marta Mora, President of the Costa Rican National Rector’s Council; NASA Administrator Sean O'Keefe; Fernando Gutierrez, Costa Rican Minister of Science and Technology(MICIT); Mr. John Danilovich, US Ambassador to Costa Rica; and Dobson. AirSAR 2004 Mesoamerica is a three-week expedition by an international team of scientists that will use an all-weather imaging tool, called the Airborne Synthetic Aperture Radar (AirSAR), in a mission ranging from the tropical rain forests of Central America to frigid Antarctica.
NASA Astrophysics Data System (ADS)
Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Hoekstra, J.; Lawler, J. J.; Kareiva, P.
2008-12-01
Now that there is overwhelming evidence of global climate change, scientists, managers and planners (i.e. practitioners) need to assess the potential impacts of climate change on particular ecological systems, within specific geographic areas, and at the spatial scales they care about, in order to make better land management, planning, and policy decisions. Unfortunately, this application of climate science to real-world decisions and planning has proceeded too slowly because we lack tools for translating cutting-edge climate science and climate-model outputs into something managers and planners can work with at local or regional scales (CCSP 2008). To help increase the accessibility of climate information, we have developed a freely available, easy-to-use, web-based climate-change analysis toolbox, called ClimateWizard, for assessing how climate has changed and is projected to change at specific geographic locations throughout the world. ClimateWizard uses geographic information systems (GIS), web services (SOAP/XML), statistical analysis platforms (e.g. R-project), and web-based mapping services (e.g. Google Earth/Maps, KML/GML) to provide a variety of analyses (e.g. trends and departures) and outputs (e.g. maps, graphs, tables, GIS layers). Because ClimateWizard analyzes large climate datasets stored remotely on powerful computers, users of the tool do not need fast computers or expensive software, but simply access to the internet. The analysis results are then provided to users in a Google Maps webpage tailored to the specific climate-change question being asked. ClimateWizard is not a static product, but rather a framework to be built upon and modified to suit the purposes of specific scientific, management, and policy questions. For example, it can be expanded to include bioclimatic variables (e.g. evapotranspiration) and marine data (e.g. sea surface temperature), as well as improved future climate projections and climate-change impact analyses involving hydrology, vegetation, wildfire, disease, and food security. By harnessing the power of computer and web-based technologies, ClimateWizard puts local, regional, and global climate-change analyses in the hands of a wider array of managers, planners, and scientists.
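The "trends and departures" analyses mentioned in the abstract reduce to simple statistics. A minimal, stand-alone sketch of both (plain least squares; this is not ClimateWizard code):

```python
def linear_trend(years, values):
    """Least-squares slope of a climate time series, in units per year."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

def departures(values, baseline_mean):
    """Departure of each observation from a baseline climatology."""
    return [v - baseline_mean for v in values]
```

The service-oriented part of the tool is then a matter of running such computations server-side on the remote datasets and returning maps and graphs to the browser.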
OCSEGen: Open Components and Systems Environment Generator
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana
2014-01-01
To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called the environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss possible future extensions.
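The idea of a generated environment, a driver that exercises the unit's interface, can be shown in miniature. OCSEGen itself targets Java components via Soot; the toy below uses Python introspection purely to illustrate the concept and is not the tool's actual output:

```python
import inspect

def make_driver(unit):
    """Build a tiny 'environment': a driver that invokes every public,
    zero-argument method of the unit, so unit behaviour can be explored."""
    def driver():
        results = {}
        for name, member in inspect.getmembers(unit, callable):
            if name.startswith("_"):
                continue                      # skip private/dunder members
            try:
                sig = inspect.signature(member)
            except (TypeError, ValueError):
                continue                      # no introspectable signature
            if len(sig.parameters) == 0:
                results[name] = member()      # environment calls into the unit
        return results
    return driver
```

A real environment generator must also be precise about argument values and call orderings, which is exactly the generality/precision tension the abstract names.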
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
Semantic Importance Sampling for Statistical Model Checking
2015-01-16
...SMT calls while maintaining correctness. Finally, we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems. Section 2 surveys related work. Section 3 presents background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis.
An integrated GIS application system for soil moisture data assimilation
NASA Astrophysics Data System (ADS)
Wang, Di; Shen, Runping; Huang, Xiaolong; Shi, Chunxiang
2014-11-01
The gaps in knowledge and existing challenges in precisely describing the land surface process make it critical to represent the massive soil moisture data visually and mine the data for further research. This article introduces a comprehensive soil moisture assimilation data analysis system, built with C#, IDL, ArcSDE, Visual Studio 2008 and SQL Server 2005. The system provides integrated services for the management, efficient graphical visualization, and analysis of land surface data assimilation. It not only improves the efficiency of data assimilation management, but also integrates the data processing and analysis tools into a GIS development environment, so that analyzing soil moisture assimilation data and performing GIS spatial analysis can be done in the same system. The system provides basic GIS map functions, massive data processing, soil moisture product analysis, and more. It also takes full advantage of a spatial data engine called ArcSDE to efficiently manage, retrieve and store all kinds of data. In the system, characteristics of the temporal and spatial patterns of soil moisture are plotted. By analyzing the soil moisture impact factors, it is possible to acquire the correlation coefficients between the soil moisture value and each individual impact factor. Daily and monthly comparative analyses of soil moisture products among observations, simulation results and assimilations can be made in the system to display the different trends of these products. Furthermore, a soil moisture map production function is provided for business applications.
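The correlation step described in the abstract is a Pearson coefficient between the soil moisture series and one impact-factor series. A minimal stand-alone sketch (not the system's C#/IDL code):

```python
import math

def pearson(xs, ys):
    """Pearson correlation between soil moisture and one impact factor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Running this once per impact factor yields the per-factor coefficient table the system displays.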
Xu, Lingyang; Hou, Yali; Bickhart, Derek M; Song, Jiuzhou; Liu, George E
2013-06-25
Copy number variations (CNVs) are gains and losses of genomic sequence between two individuals of a species when compared to a reference genome. The data from single nucleotide polymorphism (SNP) microarrays are now routinely used for genotyping, but they also can be utilized for copy number detection. Substantial progress has been made in array design and CNV calling algorithms and at least 10 comparison studies in humans have been published to assess them. In this review, we first survey the literature on existing microarray platforms and CNV calling algorithms. We then examine a number of CNV calling tools to evaluate their impacts using bovine high-density SNP data. Large incongruities in the results from different CNV calling tools highlight the need for standardizing array data collection, quality assessment and experimental validation. Only after careful experimental design and rigorous data filtering can the impacts of CNVs on both normal phenotypic variability and disease susceptibility be fully revealed.
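At its simplest, a CNV "call" is a run of consecutive probes whose intensity ratio stays beyond a gain or loss threshold. The sketch below is a deliberately naive illustration of that idea, not any of the reviewed algorithms (production tools use HMMs or segmentation methods); the thresholds are invented:

```python
def call_cnvs(log2_ratios, gain=0.3, loss=-0.3, min_probes=3):
    """Naive CNV caller: report contiguous probe runs beyond a threshold
    as (start_index, end_index, state) tuples."""
    calls, run_start, run_state = [], None, None

    def state(r):
        return "gain" if r >= gain else "loss" if r <= loss else None

    for i, r in enumerate(log2_ratios + [0.0]):   # sentinel flushes last run
        s = state(r) if i < len(log2_ratios) else None
        if s != run_state:
            if run_state and i - run_start >= min_probes:
                calls.append((run_start, i - 1, run_state))
            run_start, run_state = i, s
    return calls
```

The sensitivity of such calls to thresholds and minimum run length is one reason different tools produce the "large incongruities" the review documents.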
Educating medical staff about responding to a radiological or nuclear emergency.
McCurley, M Carol; Miller, Charles W; Tucker, Florie E; Guinn, Amy; Donnelly, Elizabeth; Ansari, Armin; Holcombe, Maire; Nemhauser, Jeffrey B; Whitcomb, Robert C
2009-05-01
A growing body of audience research reveals medical personnel in hospitals are unprepared for a large-scale radiological emergency such as a terrorist event involving radioactive or nuclear materials. Also, medical personnel in hospitals lack a basic understanding of radiation principles, as well as diagnostic and treatment guidelines for radiation exposure. Clinicians have indicated that they lack sufficient training on radiological emergency preparedness; they are potentially unwilling to treat patients if those patients are perceived to be radiologically contaminated; and they have major concerns about public panic and overloading of clinical systems. In response to these findings, the Centers for Disease Control and Prevention (CDC) has developed a tool kit for use by hospital medical personnel who may be called on to respond to unintentional or intentional mass-casualty radiological and nuclear events. This tool kit includes clinician fact sheets, a clinician pocket guide, a digital video disc (DVD) of just-in-time basic skills training, a CD-ROM training on mass-casualty management, and a satellite broadcast dealing with medical management of radiological events. CDC training information emphasizes the key role that medical health physicists can play in the education and support of emergency department activities following a radiological or nuclear mass-casualty event.
CRAB3: Establishing a new generation of services for distributed analysis at CMS
NASA Astrophysics Data System (ADS)
Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.
2012-12-01
In CMS Computing the highest priorities for analysis tools are the improvement of the end users' ability to produce and publish reliable samples and analysis results, as well as a transition to a sustainable development and operations model. To achieve these goals, CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long-term maintainability, as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves workflow automation and simplifies maintainability. In particular, we highlight the impact of the new design on daily operations.
Astronomical large projects managed with MANATEE: management tool for effective engineering
NASA Astrophysics Data System (ADS)
García-Vargas, M. L.; Mujica-Alvarez, E.; Pérez-Calpena, A.
2012-09-01
This paper describes MANATEE, the project management web tool developed by FRACTAL, specifically designed for managing large astronomical projects. MANATEE facilitates management by providing an overall view of the project and the capability to control the three main project parameters: scope, schedule and budget. MANATEE is one of the three tools of the FRACTAL System & Project Suite, which also comprises GECO (System Engineering Tool) and DOCMA (Documentation Management Tool). These tools are especially suited for consortia and teams collaborating on a multi-discipline, complex project in a geographically distributed environment. Our management view has been applied successfully in several projects and is currently being used for managing MEGARA, the next instrument for the GTC 10 m telescope.
Unified Access Architecture for Large-Scale Scientific Datasets
NASA Astrophysics Data System (ADS)
Karna, Risav
2014-05-01
Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists now deal with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of the other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's workflow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a database query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language and interface back and forth between the query body and the side-loaded functions would be needed for this purpose. For the purposes of this research we attempt the coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1).
Performance issues arising from the coupling of tools with different paradigms, niche functionalities, separate processes and output data formats have been anticipated and considered during the design of the unified architecture. The research focuses on the feasibility of the designed coupling mechanism and the evaluation of the efficiency and benefits of our proposed unified access architecture. Zhang 2011: Zhang, Ying and Kersten, Martin and Ivanova, Milena and Nes, Niels, SciQL: Bridging the Gap Between Science and Relational DBMS, Proceedings of the 15th Symposium on International Database Engineering Applications, 2011. Baumann 98: Baumann, P., Dehmel, A., Furtado, P., Ritsch, R., Widmann, N., "The Multidimensional Database System RasDaMan", SIGMOD 1998, Proceedings ACM SIGMOD International Conference on Management of Data, June 2-4, 1998, Seattle, Washington, 1998. hadoop1: hadoop.apache.org, "Hadoop", http://hadoop.apache.org/, [Online; accessed 12-Jan-2014]. scalapack1: netlib.org/scalapack, "ScaLAPACK", http://www.netlib.org/scalapack, [Online; accessed 12-Jan-2014]. r1: r-project.org, "R", http://www.r-project.org/, [Online; accessed 12-Jan-2014]. matlab1: mathworks.com, "Matlab Documentation", http://www.mathworks.de/de/help/matlab/, [Online; accessed 12-Jan-2014]. scidbusr1: scidb.org, "SciDB User's Guide", http://scidb.org/HTMLmanual/13.6/scidb_ug, [Online; accessed 01-Dec-2013].
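The UDF mechanism this abstract builds on can be demonstrated in miniature with SQLite's user-defined functions, used here purely as a small, runnable stand-in for rasdaman's UDF feature (none of the actual rasdaman coupling code is shown):

```python
import math
import sqlite3

# A user-defined function (UDF) lets the query body call out to a foreign
# routine in the host language: the same coupling idea, in miniature.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE samples (id INTEGER, value REAL)")
con.executemany("INSERT INTO samples VALUES (?, ?)", [(1, 8.0), (2, 2.0)])

# Side-load a host-language routine into the query engine under a SQL name.
con.create_function("log2_udf", 1, math.log2)

rows = con.execute("SELECT id, log2_udf(value) FROM samples ORDER BY id").fetchall()
# rows now mixes ordinary column data with foreign-routine output.
```

In the proposed architecture the side-loaded routine would be a call into Hadoop, Matlab, R or ScaLAPACK rather than a one-line math function, but the query-embedded invocation pattern is the same.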
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1993-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs that were designed and added to the collection of tools comprising the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues for these two tools are discussed; the discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.
The role of probiotic bacteria in managing periodontal disease: a systematic review.
Matsubara, Victor Haruo; Bandara, H M H N; Ishikawa, Karin Hitomi; Mayer, Marcia Pinto Alves; Samaranayake, Lakshman Perera
2016-07-01
The frequent recolonization of treated sites by periodontopathogens and the emergence of antibiotic resistance have led to a call for new therapeutic approaches for managing periodontal diseases. As probiotics are considered a new tool for combating infectious diseases, we systematically reviewed the evidence for their effectiveness in the management of periodontitis. An electronic search was performed in the MEDLINE, SCOPUS and Cochrane Library databases up to March 2016 using the terms 'periodontitis', 'chronic periodontitis', 'probiotic(s)', 'prebiotic(s)', 'symbiotic(s)', 'Bifidobacterium' and 'Lactobacillus'. Only randomized controlled trials (RCTs) were included in the present study. Analysis of 12 RCTs revealed that, in general, oral administration of probiotics improved the recognized clinical signs of chronic and aggressive periodontitis such as probing pocket depth, bleeding on probing, and attachment loss, with a concomitant reduction in the levels of major periodontal pathogens. Continuous administration of probiotics, mainly Lactobacillus species, was necessary to maintain these benefits. Expert commentary: Oral administration of probiotics is a safe and effective adjunct to conventional mechanical treatment (scaling) in the management of periodontitis, especially the chronic disease entity. Their adjunctive use is likely to improve disease indices and reduce the need for antibiotics.
Empowering the Community to Manage Diabetes Better: An Integrated Partnership-Based Model
Bamne, Arun; Shah, Daksha; Palkar, Sanjivani; Uppal, Shweta; Majumdar, Anurita; Naik, Rohan
2016-01-01
Context: The rising number of diabetes cases in India calls for collaboration between the public and private sectors. Aims: The Municipal Corporation of Greater Mumbai (MCGM) partnered with Eli Lilly and Company (India) [Eli Lilly] to strengthen the capacity of its diabetes clinics. Materials and Methods: Medical Officers of dispensaries and Assistant Medical Officers (AMOs) located at attached health posts were trained on an educational tool, Diabetes Conversation Map™ (DCM), by a Master Trainer. This tool was then used to educate patients and caregivers visiting the MCGM diabetes clinics. Results: Twenty-eight centers conducted 168 sessions, and 1616 beneficiaries received the education over six months. General feedback from health providers was that DCM helps clear up misconceptions among patients and caregivers in an interactive way and also improves patient compliance. Conclusions: This communication highlights a unique public-private partnership in which the sincere efforts of a public sector organization (MCGM) were complemented by the educational expertise lent by a private firm. PMID:27051094
Esposito, Pasquale; Dal Canton, Antonio
2014-11-06
Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. A clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings.
Judging adaptive management practices of U.S. agencies.
Fischman, Robert L; Ruhl, J B
2016-04-01
All U.S. federal agencies administering environmental laws purport to practice adaptive management (AM), but little is known about how they actually implement this conservation tool. A gap between the theory and practice of AM is revealed in judicial decisions reviewing agency adaptive management plans. We analyzed all U.S. federal court opinions published through 1 January 2015 to identify the agency AM practices courts found most deficient. The shortcomings included lack of clear objectives and processes, monitoring thresholds, and defined actions triggered by thresholds. This trio of agency shortcuts around critical, iterative steps characterizes what we call AM-lite. Passive AM differs from active AM in its relative lack of management interventions through experimental strategies. In contrast, AM-lite is a distinctive form of passive AM that fails to provide for the iterative steps necessary to learn from management. Courts have developed a sophisticated understanding of AM and often offer instructive rather than merely critical opinions. The role of the judiciary is limited by agency discretion under U.S. administrative law. But courts have overturned some agency AM-lite practices and insisted on more rigorous analyses to ensure that the promised benefits of structured learning and fine-tuned management have a reasonable likelihood of occurring. Nonetheless, there remains a mismatch in U.S. administrative law between the flexibility demanded by adaptive management and the legal objectives of transparency, public participation, and finality. © 2015 Society for Conservation Biology.
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.
2013-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We highlight these features and discuss important benefits of this approach for Amanzi. In addition, we show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
Paradigms and problems: The practice of social science in natural resource management
Michael E. Patterson; Daniel R. Williams
1998-01-01
Increasingly, natural resource management is seeing calls for new paradigms. These calls pose challenges that have implications not only for planning and management, but also for the practice of science. As a consequence, the profession needs to deepen its understanding of the nature of science by exploring recent advances in the philosophy of science....
77 FR 37705 - Notice of Call for Nominations for the Wild Horse and Burro Advisory Board
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLWO260000 L10600000 XQ0000] Notice of Call for Nominations for the Wild Horse and Burro Advisory Board AGENCY: Bureau of Land Management... management, protection, and control of wild free-roaming horses and burros on the public lands administered...
Unobtrusive integration of data management with fMRI analysis.
Poliakov, Andrew V; Hertzenberg, Xenia; Moore, Eider B; Corina, David P; Ojemann, George A; Brinkley, James F
2007-01-01
This note describes a software utility, called X-batch, which addresses two pressing issues typically faced by functional magnetic resonance imaging (fMRI) neuroimaging laboratories: (1) analysis automation and (2) data management. The first issue is addressed by providing a simple batch-mode processing tool for the popular SPM software package (http://www.fil.ion.ucl.ac.uk/spm/; Wellcome Department of Imaging Neuroscience, London, UK). The second is addressed by transparently recording metadata describing all aspects of the batch job (e.g., subject demographics, analysis parameters, locations and names of created files, date and time of analysis, and so on). These metadata are recorded as instances of an extended version of the Protégé-based Experiment Lab Book ontology created by the Dartmouth fMRI Data Center. The resulting instantiated ontology provides a detailed record of all fMRI analyses performed, and as such can be part of larger systems for neuroimaging data management, sharing, and visualization. The X-batch system is in use in our own fMRI research and is available for download at http://X-batch.sourceforge.net/.
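The transparent metadata recording described above can be sketched as follows, using a JSON file rather than the Protégé ontology X-batch actually uses; the schema and field names are illustrative only:

```python
import datetime
import json

def record_job(subject, params, outputs, path):
    """Write a metadata record for one batch analysis job (illustrative schema)."""
    meta = {
        "subject": subject,        # e.g. demographics
        "parameters": params,      # analysis parameters passed to the batch
        "outputs": outputs,        # names/locations of created files
        "timestamp": datetime.datetime.now().isoformat(),  # date/time of analysis
    }
    with open(path, "w") as f:
        json.dump(meta, f, indent=2)
    return meta
```

The point, as in X-batch, is that the record is written as a side effect of running the batch, so provenance capture costs the analyst nothing.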
Ayurvedic intervention in the management of uterine fibroids: A Case series
Dhiman, Kamini
2014-01-01
Uterine enlargement is common in the reproductive life of a female. Other than pregnancy, it is seen most frequently as the result of leiomyomas. Leiomyomas are benign smooth muscle neoplasms that typically originate from the myometrium and, owing to their fibrous consistency, are also called fibroids. They may be identified in asymptomatic women during routine pelvic examination or may cause symptoms. Typical complaints include pain, pressure sensations, dysmenorrhea or abnormal uterine bleeding. Management of uterine fibroids through surgery is available to meet the urgent needs of the patient, but the challenge remains to establish a satisfactory conservative medical treatment. Hence, the condition was critically reviewed in the context of Granthi Roga (disease), and a treatment protocol befitting the Samprapti Vighatana of Granthi (encapsulated growth) was applied in patients with uterine fibroids. Seven cases of uterine fibroid were managed by Ayurvedic intervention. Ultrasonography (USG) of the lower abdomen was the main investigative/diagnostic tool in this study. After 7 weeks, patients presented with USG reports showing absence of uterine fibroids. The Ayurvedic formulations Kanchanara Guggulu, Shigru Guggulu, and Haridra Khand were found to be an effective treatment modality for uterine fibroids. PMID:26664240
ROSA: Resource-Oriented Service Management Schemes for Web of Things in a Smart Home.
Liao, Chun-Feng; Chen, Peng-Yu
2017-09-21
A pervasive-computing-enriched smart home environment, which contains many embedded and tiny intelligent devices and sensors coordinated by service management mechanisms, is capable of anticipating occupants' intentions and providing appropriate services accordingly. Despite a wealth of research achievements in recent years, the degree of market acceptance is still low. The main reason is that most of the devices and services in such environments depend on a particular platform or technology, making it hard to develop an application by composing the devices or services. Meanwhile, the concept of the Web of Things (WoT) has recently gained popularity. With WoT, developers can build applications using popular web tools and technologies. Consequently, the objective of this paper is to propose a set of novel WoT-driven plug-and-play service management schemes for the smart home, called Resource-Oriented Service Administration (ROSA). We have implemented an application prototype and performed experiments to show the effectiveness of the proposed approach. The results of this research can serve as a foundation for realizing the vision of "end user programmable smart environments".
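The resource-oriented, plug-and-play idea behind ROSA can be illustrated with a minimal sketch: services announce themselves as web-addressable resources that clients look up by path. All names here (ResourceRegistry, register, lookup paths) are illustrative assumptions, not ROSA's actual API.

```python
# Minimal sketch of a resource-oriented service registry for a smart home.
# Hypothetical simplification; not ROSA's actual interface.

class ResourceRegistry:
    """Maps WoT-style resource paths to service handlers."""

    def __init__(self):
        self._resources = {}

    def register(self, path, handler):
        # Plug-and-play: a device announces its service under a path.
        self._resources[path] = handler

    def unregister(self, path):
        self._resources.pop(path, None)

    def get(self, path):
        # A client "reads" a resource, mimicking an HTTP GET.
        handler = self._resources.get(path)
        if handler is None:
            raise KeyError(f"no resource at {path}")
        return handler()


registry = ResourceRegistry()
registry.register("/home/livingroom/lamp", lambda: {"state": "off"})
print(registry.get("/home/livingroom/lamp"))  # {'state': 'off'}
```

Because composition happens through paths and plain handlers rather than a device-specific SDK, a new device only has to register itself to become usable by existing applications.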
NASA Astrophysics Data System (ADS)
Mortier, S.; Van Daele, K.; Meganck, L.
2017-08-01
Heritage organizations in Flanders started using thesauri fairly recently compared to other countries. This paper starts with examining the historical use of thesauri and controlled vocabularies in computer systems by the Flemish Government dealing with immovable cultural heritage. Their evolution from simple, flat, controlled lists to actual thesauri with scope notes, hierarchical and equivalence relations, and links to other thesauri will be discussed. An explanation will be provided for the evolution in our approach to controlled vocabularies, and how they radically changed querying and the way data is indexed in our systems. Technical challenges inherent to complex thesauri and how to overcome them will be outlined. With these issues solved, thesauri have become an essential feature of the Flanders Heritage inventory management system. The number of vocabularies rose over the years, and they became an essential tool for integrating heritage from different disciplines. As a final improvement, thesauri went from being a core part of one application (the inventory management system) to forming an essential part of a new, general, resource-oriented system architecture for Flanders Heritage influenced by Linked Data. For this purpose, a generic SKOS-based editor was created. Because the SKOS model is generic enough to be used outside of Flanders Heritage, the decision was made early on to develop this editor as an open source project called Atramhasis and share it with the larger heritage world.
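The step from flat controlled lists to real thesauri, i.e. concepts carrying scope notes plus broader/narrower hierarchy as in SKOS, can be sketched minimally as follows. The class and method names are illustrative assumptions; Atramhasis itself is a full SKOS editor, not this toy.

```python
# Minimal sketch of SKOS-style thesaurus concepts: preferred labels,
# scope notes, and a broader/narrower hierarchy. Illustrative only.

class Concept:
    def __init__(self, pref_label, scope_note=""):
        self.pref_label = pref_label
        self.scope_note = scope_note
        self.broader = None
        self.narrower = []

    def add_narrower(self, concept):
        # Hierarchical relation: self is broader than the given concept.
        concept.broader = self
        self.narrower.append(concept)

    def ancestors(self):
        # Walk the broader chain, e.g. for hierarchical query expansion.
        node, chain = self.broader, []
        while node is not None:
            chain.append(node.pref_label)
            node = node.broader
        return chain


building = Concept("building")
church = Concept("church", "A building used for Christian worship.")
chapel = Concept("chapel")
building.add_narrower(church)
church.add_narrower(chapel)
print(chapel.ancestors())  # ['church', 'building']
```

The hierarchy is what "radically changed querying": a search for "building" can be expanded to every narrower concept, which flat lists cannot support.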
Mining collections of compounds with Screening Assistant 2
2012-01-01
Background High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. Results We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Conclusions Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/. PMID:23327565
Mining collections of compounds with Screening Assistant 2.
Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc
2012-08-31
High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source JAVA software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
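Flagging compounds against undesired-substructure filters, as SA2 does with HTS SMARTS filters, requires a real chemistry toolkit (e.g. RDKit in Python or the CDK in Java) to perform graph-based SMARTS matching. The flow can be sketched schematically with naive substring tests on SMILES strings standing in for true substructure search; this is chemically incomplete and purely illustrative, with made-up pattern names.

```python
# Schematic sketch of library filtering: flag compounds whose SMILES
# string contains an undesired fragment. Plain substring matching is a
# chemically naive stand-in for real SMARTS matching (use e.g. RDKit).

UNDESIRED = {
    "nitro": "[N+](=O)[O-]",   # toy pattern written as a SMILES fragment
    "acyl_halide": "C(=O)Cl",
}

def flag_compound(smiles):
    """Return the names of undesired patterns found in a SMILES string."""
    return [name for name, pattern in UNDESIRED.items() if pattern in smiles]

library = [
    "c1ccccc1[N+](=O)[O-]",   # nitrobenzene-like: flagged
    "CCO",                    # ethanol: clean
]
flags = {smiles: flag_compound(smiles) for smiles in library}
print(flags)
```

In a production pipeline each filter hit would typically be recorded alongside the molecule in the database, so diverse-subset creation can exclude flagged compounds up front.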
Pohjola, Mikko V.; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T.
2013-01-01
The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives to assessment and model performance in the scientific literature can be called: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict and methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in their scope. The focus in most approaches is on the outputs and making of assessments and models. Practical application of the outputs and the consequential outcomes are often left unaddressed. It appears that more comprehensive approaches that combine the essential characteristics of different perspectives are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge. PMID:23803642
VCFR: A package to manipulate and visualize variant call format data in R
USDA-ARS?s Scientific Manuscript database
Software to call single nucleotide polymorphisms or related genetic variants has converged on the variant call format (vcf) as their output format of choice. This has created a need for tools to work with vcf files. While an increasing number of software exists to read vcf data, many of them only ex...
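The parsing task such packages address is simple in outline: a VCF file consists of "##" meta-information lines, one "#CHROM" header line naming the columns, and then one tab-delimited record per variant. A minimal reader sketch (hypothetical function name, not part of any package mentioned above):

```python
# Minimal sketch of a VCF reader: skip '##' meta lines, take column
# names from the '#CHROM' header line, and yield one dict per variant.

def read_vcf(lines):
    header = None
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("##") or not line:
            continue  # meta-information lines and blanks
        if line.startswith("#CHROM"):
            header = line.lstrip("#").split("\t")
            continue
        yield dict(zip(header, line.split("\t")))


vcf_text = [
    "##fileformat=VCFv4.2",
    "#CHROM\tPOS\tID\tREF\tALT\tQUAL\tFILTER\tINFO",
    "20\t14370\trs6054257\tG\tA\t29\tPASS\tDP=14",
]
records = list(read_vcf(vcf_text))
print(records[0]["POS"], records[0]["REF"], records[0]["ALT"])  # 14370 G A
```

Real VCF tooling must additionally handle per-sample genotype columns, multi-allelic ALT fields, and INFO key typing, which is exactly why dedicated packages exist.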
ERIC Educational Resources Information Center
Hamel, Marie-Josee; Caws, Catherine
2010-01-01
This article discusses CALL development from both educational and ergonomic perspectives. It focuses on the learner-task-tool interaction, in particular on the aspects contributing to its overall quality, herein called "usability." Two pilot studies are described that were carried out with intermediate to advanced learners of French in two…
NASA Astrophysics Data System (ADS)
Foufoula-Georgiou, E.; Czuba, J. A.; Belmont, P.; Wilcock, P. R.; Gran, K. B.; Kumar, P.
2015-12-01
Climatic trends and agricultural intensification in Midwestern U.S. landscapes have contributed to hydrologic regime shifts and a cascade of changes to water quality and river ecosystems. Informing management and policy to mitigate undesired consequences requires a careful scientific analysis that includes data-based inference and conceptual/physical modeling. It also calls for a systems approach that sees beyond a single stream to the whole watershed, favoring the adoption of minimal complexity rather than highly parameterized models for scenario evaluation and comparison. Minimal complexity models can focus on key dynamic processes of the system of interest, reducing problems of model structure bias and equifinality. Here we present a comprehensive analysis of climatic, hydrologic, and ecologic trends in the Minnesota River basin, a 45,000 km2 basin undergoing continuous agricultural intensification and suffering from declining water quality and aquatic biodiversity. We show that: (a) it is easy to arrive at an erroneous view of the system using traditional analyses and modeling tools; (b) even with a well-founded understanding of the key drivers and processes contributing to the problem, there are multiple pathways for minimizing/reversing environmental degradation; and (c) addressing the underlying driver of change (i.e., increased streamflows and reduced water storage due to agricultural drainage practices) by restoring a small amount of water storage in the landscape results in multiple non-linear improvements in downstream water quality. We argue that "optimization" between ecosystem services and economic considerations requires simple modeling frameworks, which include the most essential elements of the whole system and allow for evaluation of alternative management scenarios.
Science-based approaches informing management and policy are urgently needed in this region, calling for a new era of watershed management responsive to new and accelerating stressors at the intersection of the food-water-energy-environment nexus.
Measuring infrastructure: A key step in program evaluation and planning
Schmitt, Carol L.; Glasgow, LaShawn; Lavinghouze, S. Rene; Rieker, Patricia P.; Fulmer, Erika; McAleer, Kelly; Rogers, Todd
2016-01-01
State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General’s call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model’s utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. PMID:27037655
Measuring infrastructure: A key step in program evaluation and planning.
Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd
2016-06-01
State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Baldrige Process for ethics?
Goodpaster, Kenneth E; Maines, T Dean; Weimerskirch, Arnold M
2004-04-01
In this paper we describe and explore a management tool called the Caux Round Table Self-Assessment and Improvement Process (SAIP). Based upon the Caux Round Table Principles for Business--a stakeholder-based, transcultural statement of business values--the SAIP assists executives with the task of shaping their firm's conscience through an organizational self-appraisal process. This process is modeled after the self-assessment methodology pioneered by the Malcolm Baldrige National Quality Award Program. After briefly describing the SAIP, we address three topics. First, we examine similarities and differences between the Baldrige approach to corporate self-assessment and the self-assessment process utilized within the SAIP. Second, we report initial findings from two beta tests of the tool. These illustrate both the SAIP's ability to help organizations strengthen their commitment to ethically responsible conduct, and some of the tool's limitations. Third, we briefly analyze various dimensions of the business scandals of 2001-2002 (Enron, WorldCom, Tyco, etc.) in light of the ethical requirements articulated within the SAIP. This analysis suggests that the SAIP can help link the current concerns of stakeholders--for example, investors and the general public--to organizational practice, by providing companies with a practical way to incorporate critical lessons from these unfortunate events.
Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades
2015-01-01
DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) at one locus of sugarcane. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists.
This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
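The first step of the workflow, lane segmentation, can be sketched on a toy intensity matrix: sum each column's intensity into a 1-D profile and group contiguous above-threshold columns into lanes. This is a deliberate simplification with made-up names; GELect itself must also cope with lane distortion, background noise, and doublets.

```python
# Sketch of lane segmentation for a gel image: columns whose summed
# intensity exceeds a threshold are grouped into contiguous lanes.
# Simplified; a real tool like GELect also handles distorted lanes.

def segment_lanes(image, threshold):
    """image: 2-D list of pixel intensities (rows of equal length).
    Returns (start, end) column index pairs, end exclusive, per lane."""
    n_cols = len(image[0])
    # 1-D column intensity profile of the whole gel.
    profile = [sum(row[c] for row in image) for c in range(n_cols)]
    lanes, start = [], None
    for c, value in enumerate(profile):
        if value > threshold and start is None:
            start = c                 # entering a bright region: lane begins
        elif value <= threshold and start is not None:
            lanes.append((start, c))  # leaving it: lane ends
            start = None
    if start is not None:
        lanes.append((start, n_cols))
    return lanes


# Two bright column groups (lanes) separated by dark background.
toy = [[0, 9, 9, 0, 0, 8, 8, 0]] * 5
print(segment_lanes(toy, threshold=10))  # [(1, 3), (5, 7)]
```

Band extraction then repeats the same idea per lane, but on row profiles, before the genotype-calling step compares bands to the reference set.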
NASA Astrophysics Data System (ADS)
Abdel-Aal, H. A.; Mansori, M. El
2012-12-01
Cutting tools are subject to extreme thermal and mechanical loads during operation. The state of loading is intensified in a dry cutting environment, especially when cutting so-called hard-to-cut materials. Although the effects of mechanical loads on tool failure have been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role assumed by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature. Rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although within such a consideration the description of thermal conduction is non-linear, it is beneficial to employ this form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.
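A hedged sketch of the generalized dependence described above, with notation assumed here rather than taken from the paper, writes the conductivity as a function of the full set of thermodynamic forces:

$$ k = k\left(T,\ \sigma,\ \varepsilon,\ \dot{\varepsilon},\ \frac{\partial T}{\partial t},\ \nabla T\right) $$

i.e., the conductivity $k$ depends on the temperature $T$, the stress $\sigma$, the strain $\varepsilon$, the strain rate $\dot{\varepsilon}$, the rate of temperature rise $\partial T / \partial t$, and the temperature gradient $\nabla T$, rather than on temperature alone as in the classical Fourier picture.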
VRLane: a desktop virtual safety management program for underground coal mine
NASA Astrophysics Data System (ADS)
Li, Mei; Chen, Jingzhu; Xiong, Wei; Zhang, Pengpeng; Wu, Daozheng
2008-10-01
VR technologies, which generate immersive, interactive, three-dimensional (3D) environments, are seldom applied to coal mine safety management. In this paper, a new method that combines VR technologies with an underground mine safety management system is explored. A desktop virtual safety management program for underground coal mines, called VRLane, was developed. The paper mainly concerns current research advances in VR, system design, key techniques, and system application. Two important techniques are introduced in the paper. Firstly, an algorithm was designed and implemented with which 3D laneway models and equipment models can be built automatically on the basis of the latest 2D mine drawings, whereas common VR programs establish the 3D environment using 3DS Max or other 3D modeling software packages, with which laneway models are built manually and laboriously. Secondly, VRLane realizes system integration with underground industrial automation. VRLane not only depicts a realistic 3D laneway environment, but also describes the status of coal mining, with functions for displaying the running states and related parameters of equipment, pre-alarming abnormal mining events, and animating mine cars, mine workers, and long-wall shearers. The system, which is cheap, dynamic, and easy to maintain, provides a useful tool for safety production management in coal mines.
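The first key technique, building 3D laneway models automatically from 2D drawings, amounts in essence to extruding a 2D centerline into 3D tunnel cross-sections. A minimal sketch follows; the function name, the rectangular cross-section, and the x-only offset are all simplifying assumptions, not VRLane's actual algorithm (a real implementation would offset corners perpendicular to the local tunnel direction).

```python
# Sketch of auto-building a 3D laneway from a 2D drawing: each point of
# a 2D centerline is extruded into the 4 corners of a rectangular tunnel
# cross-section. Hypothetical simplification of VRLane's modeling step.

def extrude_laneway(centerline, width, height):
    """centerline: [(x, y), ...] from a 2D drawing. Returns one list of
    (x, y, z) corner vertices per cross-section (corners offset along x;
    a real tool would offset perpendicular to the tunnel direction)."""
    half = width / 2.0
    sections = []
    for x, y in centerline:
        sections.append([
            (x - half, y, 0.0),      # floor, left
            (x + half, y, 0.0),      # floor, right
            (x + half, y, height),   # roof, right
            (x - half, y, height),   # roof, left
        ])
    return sections


sections = extrude_laneway([(0, 0), (0, 10), (5, 20)], width=4, height=3)
print(len(sections), sections[0][0])  # 3 (-2.0, 0, 0.0)
```

Regenerating the mesh from the latest drawings is what keeps such a model cheap to maintain compared with hand-built 3DS Max geometry.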
The Watershed Management Optimization Support Tool (WMOST) is a public-domain software application designed to aid decision makers with integrated water resources management. The tool allows water resource managers and planners to screen a wide-range of management practices for c...
76 FR 2086 - Call for Applications for Commerce Spectrum Management Advisory Committee; National...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-12
... DEPARTMENT OF COMMERCE National Telecommunications and Information Administration Call for Applications for Commerce Spectrum Management Advisory Committee; National Telecommunications and Information... Telecommunications and Information Administration (NTIA) seeks applications from persons interested in serving on the...
Vallis, Michael; Piccinini-Vallis, Helena; Imran, Syed Ali; Abidi, Syed Sibte Raza
2018-01-01
Background Behavioral science is now being integrated into diabetes self-management interventions. However, the challenge that presents itself is how to translate these knowledge resources during care so that primary care practitioners can use them to offer evidence-informed behavior change support and diabetes management recommendations to patients with diabetes. Objective The aim of this study was to develop and evaluate a computerized decision support platform called “Diabetes Web-Centric Information and Support Environment” (DWISE) that assists primary care practitioners in applying standardized behavior change strategies and clinical practice guidelines–based recommendations to an individual patient and empower the patient with the skills and knowledge required to self-manage their diabetes through planned, personalized, and pervasive behavior change strategies. Methods A health care knowledge management approach is used to implement DWISE so that it features the following functionalities: (1) assessment of primary care practitioners’ readiness to administer validated behavior change interventions to patients with diabetes; (2) educational support for primary care practitioners to help them offer behavior change interventions to patients; (3) access to evidence-based material, such as the Canadian Diabetes Association’s (CDA) clinical practice guidelines, to primary care practitioners; (4) development of personalized patient self-management programs to help patients with diabetes achieve healthy behaviors to meet CDA targets for managing type 2 diabetes; (5) educational support for patients to help them achieve behavior change; and (6) monitoring of the patients’ progress to assess their adherence to the behavior change program and motivating them to ensure compliance with their program. 
DWISE offers these functionalities through an interactive Web-based interface to primary care practitioners, whereas the patient’s self-management program and associated behavior interventions are delivered through a mobile patient diary via mobile phones and tablets. DWISE has been tested for its usability, functionality, usefulness, and acceptance through a series of qualitative studies. Results For the primary care practitioner tool, most usability problems were associated with the navigation of the tool and the presentation, formatting, understandability, and suitability of the content. For the patient tool, most issues were related to the tool’s screen layout, design features, understandability of the content, clarity of the labels used, and navigation across the tool. Facilitators and barriers to DWISE use in a shared decision-making environment have also been identified. Conclusions This work has provided a unique electronic health solution to translate complex health care knowledge in terms of easy-to-use, evidence-informed, point-of-care decision aids for primary care practitioners. Patients’ feedback is now being used to make necessary modification to DWISE. PMID:29669705
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
Damos, Petros; Colomar, Lucía-Adriana Escudero; Ioriatti, Claudio
2015-06-26
This review focuses on the process of adapting the original concept of Integrated Pest Management (IPM) to the wider conception of Integrated Fruit Production (IFP) implemented in Europe. Even though most pest management strategies still rely on the use of synthetic pesticides, a wide array of innovative and environmentally friendly tools is now available as possible alternatives to pesticides within the modern apple production system. We also highlight how recent pest management strategies and tools have created an opening for research towards IPM improvement, including the use of biorational pesticides, semiochemicals, and biological control. Forecasting models, new tree training systems, and innovative spray equipment have also been developed to improve treatment coverage, mitigate pesticide drift, and reduce chemical residues on fruits. The possible threats that jeopardize the effective implementation of IPM, particularly the risks related to the development of pesticide resistance and the introduction of new invasive pests, are also reviewed. With Directive 128/09, European legislation recognizes IPM as a strategic approach for the sustainable use of pesticides. Within this context, IPM and its related guidelines are called on to address different areas of concern in relation to worker and bystander safety. Besides the traditional economic criteria of market-oriented agriculture, sustainable agriculture includes the assessment of the environmental impact of agronomic practices within the societal context where they take place. As a consequence of rising consumer concerns about the environmental impacts of fruit production, IFP certification over product standards, including process aspects, is frequently required by consumers and supermarket chains.
Damos, Petros; Escudero Colomar, Lucía-Adriana; Ioriatti, Claudio
2015-01-01
This review focuses on the process of adapting the original concept of Integrated Pest Management (IPM) to the wider conception of Integrated Fruit Production (IFP) implemented in Europe. Even though most pest management strategies still rely on the use of synthetic pesticides, a wide array of innovative and environmentally friendly tools is now available as possible alternatives to pesticides within the modern apple production system. We also highlight how recent pest management strategies and tools have created an opening for research towards IPM improvement, including the use of biorational pesticides, semiochemicals, and biological control. Forecasting models, new tree training systems, and innovative spray equipment have also been developed to improve treatment coverage, mitigate pesticide drift, and reduce chemical residues on fruits. The possible threats that jeopardize the effective implementation of IPM, particularly the risks related to the development of pesticide resistance and the introduction of new invasive pests, are also reviewed. With Directive 128/09, European legislation recognizes IPM as a strategic approach for the sustainable use of pesticides. Within this context, IPM and its related guidelines are called on to address different areas of concern in relation to worker and bystander safety. Besides the traditional economic criteria of market-oriented agriculture, sustainable agriculture includes the assessment of the environmental impact of agronomic practices within the societal context where they take place. As a consequence of rising consumer concerns about the environmental impacts of fruit production, IFP certification over product standards, including process aspects, is frequently required by consumers and supermarket chains. PMID:26463407
Architecture for biomedical multimedia information delivery on the World Wide Web
NASA Astrophysics Data System (ADS)
Long, L. Rodney; Goh, Gin-Hua; Neve, Leif; Thoma, George R.
1997-10-01
Research engineers at the National Library of Medicine are building a prototype system for the delivery of multimedia biomedical information on the World Wide Web. This paper discusses the architecture and design considerations for the system, which will be used initially to make images and text from the third National Health and Nutrition Examination Survey (NHANES) publicly available. We categorized our analysis as follows: (1) fundamental software tools: we analyzed trade-offs among use of conventional HTML/CGI, X Window Broadway, and Java; (2) image delivery: we examined the use of unconventional TCP transmission methods; (3) database manager and database design: we discuss the capabilities and planned use of the Informix object-relational database manager and the planned schema for the NHANES database; (4) storage requirements for our Sun server; (5) user interface considerations; (6) the compatibility of the system with other standard research and analysis tools; (7) image display: we discuss considerations for consistent image display for end users. Finally, we discuss the scalability of the system in terms of incorporating larger or more databases of similar data, and the extendibility of the system for supporting content-based retrieval of biomedical images. The system prototype is called the Web-based Medical Information Retrieval System. An early version was built as a Java applet and tested on Unix, PC, and Macintosh platforms. This prototype used the MiniSQL database manager to do text queries on a small database of records of participants in the second NHANES survey. The full records and associated x-ray images were retrievable and displayable on a standard Web browser. A second version has now been built, also a Java applet, using the MySQL database manager.
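The text-query path described above (a database manager serving survey records, each linked to an x-ray image, to a Web front end) can be sketched with the standard-library sqlite3 module standing in for MiniSQL/MySQL. The table and field names are illustrative assumptions, not the actual NHANES schema.

```python
import sqlite3

# Sketch of the text-query path: survey records queried by a
# parameterized filter, as a Web front end would do on a user's search.
# Schema is illustrative, not the NHANES database design.

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE participants ("
    "id INTEGER PRIMARY KEY, age INTEGER, sex TEXT, xray_file TEXT)"
)
conn.executemany(
    "INSERT INTO participants (age, sex, xray_file) VALUES (?, ?, ?)",
    [(54, "F", "xr_0001.png"), (67, "M", "xr_0002.png"), (49, "M", "xr_0003.png")],
)

# Parameterized query: fetch records matching the search criteria; the
# returned file names point at the associated x-ray images to deliver.
rows = conn.execute(
    "SELECT id, xray_file FROM participants WHERE age >= ? AND sex = ?",
    (50, "M"),
).fetchall()
print(rows)  # [(2, 'xr_0002.png')]
```

Parameterized placeholders matter in this architecture: the query terms come from an untrusted browser form, so they must never be spliced into the SQL string directly.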
32 CFR 806b.54 - Information collections, records, and forms or information management tools (IMT).
Code of Federal Regulations, 2010 CFR
2010-07-01
... information management tools (IMT). 806b.54 Section 806b.54 National Defense Department of Defense (Continued..., records, and forms or information management tools (IMT). (a) Information Collections. No information.../pubfiles/af/37/afman37-139/afman37-139.pdf. (c) Forms or Information Management Tools (Adopted and...
78 FR 60860 - Commerce Spectrum Management Advisory Committee, Call for Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-02
... DEPARTMENT OF COMMERCE National Telecommunications and Information Administration Commerce... Information Administration, U.S. Department of Commerce. ACTION: Notice and call for applications to serve on... seeking applications from persons interested in serving on the Department of Commerce Spectrum Management...
ESIF Call for High-Impact Integrated Projects | Energy Systems Integration
As a U.S. Department of Energy user facility, the Energy Systems Integration Facility (ESIF) develops the concepts, tools, and technologies needed to measure, analyze, predict, protect, and control the grid.
Using WebQuests as Idea Banks for Fostering Autonomy in Online Language Courses
ERIC Educational Resources Information Center
Sadaghian, Shirin; Marandi, S. Susan
2016-01-01
The concept of language learner autonomy has influenced Computer-Assisted Language Learning (CALL) to the extent that Schwienhorst (2012) informs us of a paradigm change in CALL design in the light of learner autonomy. CALL is not considered a tool anymore, but a learner environment available to language learners anywhere in the world. Based on a…
A Tool for Conditions Tag Management in ATLAS
NASA Astrophysics Data System (ADS)
Sharmazanashvili, A.; Batiashvili, G.; Gvaberidze, G.; Shekriladze, L.; Formica, A.; Atlas Collaboration
2014-06-01
ATLAS Conditions data include about 2 TB in a relational database and 400 GB of files referenced from the database. Conditions data are entered and retrieved using COOL, the API for accessing data in the LCG Conditions Database infrastructure, and are managed using an ATLAS-customized Python-based tool set. Conditions data are required for every reconstruction and simulation job, so access to them is crucial for all aspects of ATLAS data taking and analysis, as well as for the preceding tasks that derive optimal corrections for reconstruction. Optimized sets of conditions for processing are accomplished using strict version control on those conditions: a process which assigns COOL Tags to sets of conditions, and then unifies those conditions over data-taking intervals into a COOL Global Tag. This Global Tag identifies the set of conditions used to process data so that the underlying conditions can be uniquely identified with 100% reproducibility should the processing be executed again. Understanding shifts in the underlying conditions from one tag to another and ensuring interval completeness for all detectors for a set of runs to be processed is a complex task, requiring tools beyond the above-mentioned Python utilities. Therefore, a JavaScript/PHP based utility called the Conditions Tag Browser (CTB) has been developed. CTB gives detector and conditions experts the possibility to navigate through the different databases and COOL folders; explore the content of given tags and the differences between them, as well as their extent in time; and visualize the content of channels associated with leaf tags. This report describes the structure and the PHP/JavaScript function classes of the CTB.
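In COOL terms, a Global Tag resolves to one leaf tag per conditions folder, so the comparison a browser like the CTB displays boils down to a per-folder diff of two such mappings. A minimal Python sketch of that idea (the folder and tag names are invented, and this is not the actual COOL or CTB API):

```python
# Minimal sketch, not the real CTB/COOL API: a Global Tag represented as a
# mapping from COOL folder to leaf tag, plus a diff of two Global Tags --
# the kind of comparison the Conditions Tag Browser visualizes.

def diff_global_tags(tag_a, tag_b):
    """Return {folder: (leaf_in_a, leaf_in_b)} for folders whose leaf tags differ."""
    folders = set(tag_a) | set(tag_b)
    return {f: (tag_a.get(f), tag_b.get(f))
            for f in folders
            if tag_a.get(f) != tag_b.get(f)}

# Hypothetical example data: two Global Tags over three conditions folders.
CONDBR2_A = {"/PIXEL/Align": "PixelAlign-v1", "/TRT/Calib": "TRTCalib-v3", "/LAR/HV": "LArHV-v2"}
CONDBR2_B = {"/PIXEL/Align": "PixelAlign-v2", "/TRT/Calib": "TRTCalib-v3", "/LAR/HV": "LArHV-v2"}

print(diff_global_tags(CONDBR2_A, CONDBR2_B))
# Only /PIXEL/Align differs between the two Global Tags.
```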
BS-virus-finder: virus integration calling using bisulfite sequencing data.
Gao, Shengjie; Hu, Xuesong; Xu, Fengping; Gao, Changduo; Xiong, Kai; Zhao, Xiao; Chen, Haixiao; Zhao, Shancen; Wang, Mengyao; Fu, Dongke; Zhao, Xiaohui; Bai, Jie; Mao, Likai; Li, Bo; Wu, Song; Wang, Jian; Li, Shengbin; Yang, Huangming; Bolund, Lars; Pedersen, Christian N S
2018-01-01
DNA methylation plays a key role in the regulation of gene expression and carcinogenesis. Bisulfite sequencing studies mainly focus on calling single nucleotide polymorphisms, identifying differentially methylated regions, and finding allele-specific DNA methylation. Until now, only a few software tools have focused on virus integration using bisulfite sequencing data. We have developed a new and easy-to-use software tool, named BS-virus-finder (BSVF, RRID:SCR_015727), to detect viral integration breakpoints in whole human genomes. The tool is hosted at https://github.com/BGI-SZ/BSVF. BS-virus-finder demonstrates high sensitivity and specificity. It is useful in epigenetic studies and to reveal the relationship between viral integration and DNA methylation. BS-virus-finder is the first software tool to detect virus integration loci by using bisulfite sequencing data. © The Authors 2017. Published by Oxford University Press.
VoiceThread: A Useful Program Evaluation Tool
ERIC Educational Resources Information Center
Mott, Rebecca
2018-01-01
With today's technology, Extension professionals have a variety of tools available for program evaluation. This article describes an innovative platform called VoiceThread that has been used in many classrooms but also is useful for conducting virtual focus group research. I explain how this tool can be used to collect qualitative participant…
Rapid Response to Decision Making for Complex Issues - How Technologies of Cooperation Can Help
2005-11-01
creating bottom-up taxonomies—called folksonomies—using metadata tools like del.icio.us (in which users create their own tags for bookmarking Web...tools such as RSS, tagging (and the consequent development of folksonomies), wikis, and group visualization tools all help multiply the individual
Coordinated Fault-Tolerance for High-Performance Computing Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panda, Dhabaleswar Kumar; Beckman, Pete
2011-07-28
With the Coordinated Infrastructure for Fault Tolerance Systems (CIFTS, as the original project came to be called) project, our aim has been to understand and tackle the following broad research questions, the answers to which will help the HEC community analyze and shape the direction of research in the field of fault tolerance and resiliency on future high-end leadership systems. Will availability of global fault information, obtained by fault information exchange between the different HEC software on a system, allow individual system software to better detect, diagnose, and adaptively respond to faults? If fault-awareness is raised throughout the system through fault information exchange, is it possible to get all system software working together to provide a more comprehensive end-to-end fault management on the system? What are the missing fault-tolerance features that widely used HEC system software lacks today that would inhibit such software from taking advantage of systemwide global fault information? What are the practical limitations of a systemwide approach for end-to-end fault management based on fault awareness and coordination? What mechanisms, tools, and technologies are needed to bring about fault awareness and coordination of responses on a leadership-class system? What standards, outreach, and community interaction are needed for adoption of the concept of fault awareness and coordination for fault management on future systems? Keeping our overall objectives in mind, the CIFTS team has taken a parallel fourfold approach. Our central goal was to design and implement a light-weight, scalable infrastructure with a simple, standardized interface to allow communication of fault-related information through the system and facilitate coordinated responses.
This work led to the development of the Fault Tolerance Backplane (FTB) publish-subscribe API specification, together with a reference implementation and several experimental implementations on top of existing publish-subscribe tools. We enhanced the intrinsic fault tolerance capabilities of representative implementations of a variety of key HPC software subsystems and integrated them with the FTB. Targeted software subsystems included: MPI communication libraries, checkpoint/restart libraries, resource managers and job schedulers, and system monitoring tools. Leveraging the aforementioned infrastructure, as well as developing and utilizing additional tools, we have examined issues associated with expanded, end-to-end fault response from both system and application viewpoints. From the standpoint of system operations, we have investigated log and root cause analysis, anomaly detection and fault prediction, and generalized notification mechanisms. Our applications work has included libraries for fault-tolerant linear algebra, application frameworks for coupled multiphysics applications, and external frameworks to support the monitoring and response for general applications. Our final goal was to engage the high-end computing community to increase awareness of tools and issues around coordinated end-to-end fault management.
Oceans 2.0: a Data Management Infrastructure as a Platform
NASA Astrophysics Data System (ADS)
Pirenne, B.; Guillemot, E.
2012-04-01
Oceans 2.0: a Data Management Infrastructure as a Platform Benoît Pirenne, Associate Director, IT, NEPTUNE Canada Eric Guillemot, Manager, Software Development, NEPTUNE Canada The Data Management and Archiving System (DMAS) serving the needs of a number of undersea observing networks such as VENUS and NEPTUNE Canada was conceived from the beginning as a Service-Oriented Infrastructure. Its core functional elements (data acquisition, transport, archiving, retrieval and processing) can interact with the outside world using Web Services. Those Web Services can be exploited by a variety of higher level applications. Over the years, DMAS has developed Oceans 2.0: an environment where these techniques are implemented. The environment thereby becomes a platform in that it allows for easy addition of new and advanced features that build upon the tools at the core of the system. The applications that have been developed include: data search and retrieval, with options such as data product generation, data decimation or averaging; dynamic infrastructure description (search of all observatory metadata) and visualization; and data visualization, including dynamic scalar data plots and integrated fast video segment search and viewing. Building upon these basic applications are new concepts, coming from the Web 2.0 world, that DMAS has added: they allow people equipped only with a web browser to collaborate and contribute their findings or work results to the wider community.
Examples include: the addition of metadata tags to any part of the infrastructure or to any data item (annotations); the ability to edit, execute, share and distribute Matlab code online from a simple web browser, with specific calls within the code to access data; the ability to interactively and graphically build pipeline processing jobs that can be executed on the cloud; web-based, interactive instrument control tools that allow users to truly share the use of the instruments and communicate with each other; and, last but not least, a public tool in the form of a game that crowd-sources the inventory of the underwater video archive content, thereby adding tremendous amounts of metadata. Beyond those tools, which represent the functionality presently available to users, a number of the Web Services dedicated to data access are being exposed for anyone to use. This allows not only for ad hoc data access by individuals who need non-interactive access, but will foster the development of new applications in a variety of areas.
ERIC Educational Resources Information Center
Upitis, Rena; Brook, Julia
2017-01-01
Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…
1985-03-18
increased profitability, productivity and more effective management." Members of the British side also called for an increase in Japan's defense...copyright transfer, or rental permission is called technology trade. According to the Statistics Bureau of the Management and Coordination Agency...1- 8.8 -Switzerland 6.3 5.6 Others 13.1 Source: Statistics Bureau of the Management and Coordination Agency China. 66.2% of Japan's imported
Watershed Management Optimization Support Tool (WMOST) Workshop.
EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green i...
NASA Astrophysics Data System (ADS)
Hooijer, A.; van Os, A. G.
Recent flood events and socio-economic developments have increased the awareness of the need for improved flood risk management along the Rhine and Meuse Rivers. In response to this, the IRMA-SPONGE program incorporated 13 research projects in which over 30 organisations from all 6 River Basin Countries co-operated. The program is financed partly by the European INTERREG Rhine-Meuse Activities (IRMA). The main aim of IRMA-SPONGE is defined as: "The development of methodologies and tools to assess the impact of flood risk reduction measures and of land-use and climate change scenarios. This is to support the spatial planning process in establishing alternative strategies for an optimal realisation of the hydraulic, economical and ecological functions of the Rhine and Meuse River Basins." Further important objectives are to promote transboundary co-operation in flood risk management by both scientific and management organisations, and to promote public participation in flood management issues. The projects in the program are grouped in three clusters, looking at measures from different scientific angles. The results of the projects in each cluster have been evaluated to define recommendations for flood risk management; some of these outcomes call for a change to current practices, e.g.: 1. (Flood Risk and Hydrology cluster): hydrological changes due to climate change exceed those due to further land use change, and are significant enough to necessitate a change in flood risk management strategies if the currently claimed protection levels are to be sustained. 2. (Flood Protection and Ecology cluster): to not only provide flood protection but also enhance the ecological quality of rivers and floodplains, new flood risk management concepts ought to integrate ecological knowledge from start to finish, with a clear perspective on the type of nature desired and the spatial and time scales considered. 3.
(Flood Risk Management and Spatial Planning cluster): extreme floods cannot be prevented by taking mainly upstream measures; significant and space-consuming local measures will therefore be needed in the lower Rhine and Meuse deltas. However, there is also a need for improved flood risk management upstream, which calls for better spatial planning procedures. More detailed information on the IRMA-SPONGE program can be found on our website: www.irma-sponge.org.
PC tools for project management: Programs and the state-of-the-practice
NASA Technical Reports Server (NTRS)
Bishop, Peter C.; Freedman, Glenn B.; Dede, Christopher J.; Lidwell, William; Learned, David
1990-01-01
The use of microcomputer tools for NASA project management, which features are the most useful, the impact of these tools on job performance and individual style, and the prospects for new features in project management tools and related tools are addressed. High-, mid-, and low-end PM tools are examined. The pros and cons of the tools are assessed relative to various tasks. The strengths and weaknesses of the tools are presented through cases and demonstrations.
NASA Astrophysics Data System (ADS)
Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.
2016-12-01
The United States National Science Foundation-funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly speed up the manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager.
DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
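The widget idea behind DIT can be pictured as a list of single-purpose operations applied in a user-chosen order. A toy Python version (the widget names, the chaining API, and the missing-value sentinel are illustrative assumptions, not DIT's actual code):

```python
# Toy workflow manager in the spirit of DIT's "widgets": each widget is one
# self-contained operation, and the user chooses which widgets run, in what
# order. Names and API here are illustrative, not DIT's real interface.

def multiply(factor):
    return lambda data: [x * factor for x in data]

def drop_missing(sentinel=-999.0):
    return lambda data: [x for x in data if x != sentinel]

def sort_values():
    return lambda data: sorted(data)

def run_pipeline(data, widgets):
    for widget in widgets:  # apply each widget in the order the user chose
        data = widget(data)
    return data

raw = [2.0, -999.0, 1.5, 3.0]  # -999.0 used here as a missing-value sentinel
clean = run_pipeline(raw, [drop_missing(), multiply(10), sort_values()])
print(clean)  # [15.0, 20.0, 30.0]
```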
Roux, D J
2001-06-01
This article explores the strategies that were, and are being, used to facilitate the transition from scientific development to operational application of the South African River Health Programme (RHP). Theoretical models from the field of the management of technology are used to provide insight into the dynamics that influence the relationship between the creation and application of environmental programmes, and the RHP in particular. Four key components of the RHP design are analysed, namely the (a) guiding team, (b) concepts, tools and methods, (c) infra-structural innovations and (d) communication. These key components evolved over three broad life stages of the programme, which are called the design, growth and anchoring stages.
Auction-based bandwidth allocation in the Internet
NASA Astrophysics Data System (ADS)
Wei, Jiaolong; Zhang, Chi
2002-07-01
It has been widely accepted that auctioning, the pricing approach with minimal information requirements, is a proper tool to manage scarce network resources. Previous works focus on the Vickrey auction, which is incentive compatible in classic auction theory. In the beginning of this paper, the faults of the most representative auction-based mechanisms are discussed. Then a new method called uniform-price auction (UPA), which has the simplest auction rule, is proposed, and its incentive compatibility in the network environment is also proved. Finally, the basic model is extended to support applications which require minimum bandwidth guarantees for a given time period by introducing a derivative market, and a market mechanism for network resource allocation which is predictable, riskless, and simple for end-users is completed.
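The general uniform-price rule can be sketched as: sort bids by price, allocate down the list until capacity is exhausted, and charge every winner the same clearing price, here the price of the marginal accepted bid. A short Python sketch of that rule (an illustration of uniform pricing in general, not necessarily the exact UPA mechanism of the paper):

```python
# Uniform-price auction sketch for a divisible resource such as bandwidth.
# All winners pay the same clearing price: the price of the marginal bid at
# which capacity runs out (0.0 if capacity is never exhausted).

def uniform_price_auction(bids, capacity):
    """bids: list of (price, quantity). Returns ({bid_index: granted}, clearing_price)."""
    allocations = {}
    remaining = capacity
    clearing_price = 0.0
    # Consider bids from highest to lowest price, remembering original indices.
    for i, (price, qty) in sorted(enumerate(bids), key=lambda b: -b[1][0]):
        granted = min(qty, remaining)
        if granted > 0:
            allocations[i] = granted
        remaining -= granted
        if remaining <= 0:  # the bid that exhausts capacity sets the price
            clearing_price = price
            break
    return allocations, clearing_price

# Three bidders, capacity 8: the top bid is fully served, the second partially,
# and both pay the marginal price of 8.
allocs, price = uniform_price_auction([(10, 5), (8, 5), (6, 5)], 8)
print(allocs, price)  # {0: 5, 1: 3} 8
```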
Taxonomy-Based Approaches to Quality Assurance of Ontologies
Perl, Yehoshua; Ochs, Christopher
2017-01-01
Ontologies are important components of health information management systems. As such, the quality of their content is of paramount importance. It has been proven to be practical to develop quality assurance (QA) methodologies based on automated identification of sets of concepts expected to have higher likelihood of errors. Four kinds of such sets (called QA-sets) organized around the themes of complex and uncommonly modeled concepts are introduced. A survey of different methodologies based on these QA-sets and the results of applying them to various ontologies are presented. Overall, following these approaches leads to higher QA yields and better utilization of QA personnel. The formulation of additional QA-set methodologies will further enhance the suite of available ontology QA tools. PMID:29158885
Airtightness the simple(CS) way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, S.
Builders who might buck against such time-consuming air sealing methods as polyethylene wrap and the airtight drywall approach (ADA) may respond better to current strategies. One such method, called SimpleCS, has proven especially effective. SimpleCS, pronounced simplex, stands for simple caulk and seal. A modification of the ADA, SimpleCS is an air-sealing management tool, a simplified systems approach to building tight homes. The system addresses the crucial question of when and by whom various air sealing steps should be done. It avoids the problems that often occur when later contractors cut open polyethylene wrap to drill holes in the drywall. The author describes how SimpleCS works, and the cost and training involved.
2004-03-03
Personnel viewing AirSAR hardware while touring the outside of NASA's DC-8 during a stop-off on the AirSAR 2004 Mesoamerica campaign, L-R: Fernando Gutierrez, Costa Rican Minister of Science and Technology (MICIT); NASA Administrator Sean O'Keefe; Dr. Ghassem Asrar, NASA Associate Administrator for Earth Science Enterprises; JPL scientist Bruce Chapman; and Craig Dobson, NASA Program Manager for AirSAR. AirSAR 2004 Mesoamerica is a three-week expedition by an international team of scientists that will use an all-weather imaging tool, called the Airborne Synthetic Aperture Radar (AirSAR), in a mission ranging from the tropical rain forests of Central America to frigid Antarctica.
Kijsanayotin, Boonchai
2013-01-01
Thailand achieved universal healthcare coverage with the implementation of the Universal Coverage Scheme (UCS) in 2001. This study employed a qualitative method to explore the impact of the UCS on the country's health information systems (HIS) and health information technology (HIT) development. The results show that the health insurance beneficiary registration system helps improve providers' service workflow and the country's vital statistics. Implementation of the casemix financing tool, Thai Diagnosis-Related Groups, has stimulated health providers' HIS and HIT capacity building, data and medical record quality, and the adoption of national administrative data standards. The system called "Disease Management Information Systems", aimed at reimbursement for select diseases, increased the fragmentation of HIS and increased the data management burden on providers. The financial incentives of the outpatient data quality improvement project enhanced providers' HIS and HIT investment but also induced a tendency toward data fraudulence. Implementation of the UCS has largely had a favorable impact on the country's HIS and HIT development. However, the unfavorable effects are also evident.
El Hanandeh, Ali; El-Zein, Abbas
2010-01-01
A modified version of the multi-criteria decision aid, ELECTRE III has been developed to account for uncertainty in criteria weightings and threshold values. The new procedure, called ELECTRE-SS, modifies the exploitation phase in ELECTRE III, through a new definition of the pre-order and the introduction of a ranking index (RI). The new approach accommodates cases where incomplete or uncertain preference data are present. The method is applied to a case of selecting a management strategy for the bio-degradable fraction in the municipal solid waste of Sydney. Ten alternatives are compared against 11 criteria. The results show that anaerobic digestion (AD) and composting of paper are less environmentally sound options than recycling. AD is likely to out-perform incineration where a market for heating does not exist. Moreover, landfilling can be a sound alternative, when considering overall performance and conditions of uncertainty.
Sadiq, Rehan; Rodriguez, Manuel J
2005-04-01
Interpreting water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and the interpretation of water quality, methodologies for aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first application deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the obtained results, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.
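At the heart of the theory of evidence is Dempster's rule of combination, which fuses two mass functions over sets of hypotheses and renormalizes away their conflict. A generic Python sketch of the rule (the water-quality states and mass values below are invented for illustration, not the authors' data or implementation):

```python
# Dempster's rule of combination: masses on intersecting hypothesis sets are
# multiplied and summed; mass assigned to the empty intersection is "conflict"
# and is normalized out. Mass functions map frozensets of hypotheses to mass.

def combine(m1, m2):
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    norm = 1.0 - conflict
    return {s: m / norm for s, m in combined.items()}

GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
EITHER = GOOD | POOR  # ignorance: mass on the whole frame of discernment
# Two hypothetical monitoring sources: one fairly sure quality is good,
# one more uncertain.
m_sensor = {GOOD: 0.7, EITHER: 0.3}
m_lab = {GOOD: 0.5, POOR: 0.2, EITHER: 0.3}
fused = combine(m_sensor, m_lab)
```

Combining the two sources concentrates mass on "good" while keeping a small residual on "poor" and on ignorance, which is exactly the behavior that makes the rule attractive for fusing heterogeneous water quality indicators.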
The 5S lean method as a tool of industrial management performances
NASA Astrophysics Data System (ADS)
Filip, F. C.; Marascu-Klein, V.
2015-11-01
Implementing the 5S (seiri, seiton, seiso, seiketsu, and shitsuke) method is carried out through a study whose purpose is to analyse and improve management performance: emphasizing problems and working mistakes, reducing waste (stationary and waiting times), making flows transparent, properly marking and labelling storage areas, establishing standard work (everyone knows exactly where the necessary things are), and ensuring safe and ergonomic working places (the health of all employees). The study describes the impact of the 5S lean method implemented for storing, cleaning, developing and sustaining a production working place in an industrial company. In order to check and sustain the 5S process, an internal audit, called a "5S audit", is needed. Implementing the 5S methodology requires organization and safety of the working process, proper marking and labelling of the working place, and audits to establish the work in progress and to maintain the improved activities.
NASA Astrophysics Data System (ADS)
Gleick, P. H.
2010-12-01
The failure of traditional water management systems in the 20th century -- what I call the "hard path for water" -- is evident in several ways, including the persistent inability to meet basic human needs for safe water and adequate sanitation for vast populations, ongoing and accelerating aquatic ecosystem collapses, and growing political disputes over water allocation, management, and use, even in regions where substantial investment in water has been made. Progress in resolving these problems, especially in the face of unavoidable climate changes, growing populations, and constrained financial systems, will require bridging hydrologic and social sciences in new ways. Integrating social and cultural knowledge with new economic and technological tools and classical hydrologic and climatological sciences can produce a new "soft path for water" that offers the opportunity to move toward sustainable water systems. This talk will define the soft path for water and offer examples of innovative steps already being taken along that path in the western United States, South Africa, India, and elsewhere.
The creation of prehistoric man. Aimé Rutot and the Eolith controversy, 1900-1920.
De Bont, Raf
2003-12-01
Although he died in obscurity, the Belgian museum conservator Aimé Rutot (1847-1933) was one of the most famous European archaeologists between 1900 and 1920. The focus of his scientific interest was stone flints, which he claimed to be the oldest known human tools, so-called eoliths. Skeptics maintained that the flints showed no marks of human workmanship, but Rutot nevertheless managed to spread his "Eolithic theory" in an important part of the scientific community. This essay demonstrates how material objects--series of stone flints and sets of statues that purported to reconstruct prehistoric "races"--were given scientific meaning by Rutot. Rutot diffused his ideas by disseminating his stones and statues, thus enlarging his networks of influence. For a time he managed to be at the material center of a trade network as well as at the intellectual center of archaeological debate. The essay shows how Rutot achieved this status and how he eventually fell from favor among serious scientists.
Gosselin, Emilie; Bourgault, Patricia; Lavoie, Stephan; Coleman, Robin-Marie; Méziat-Burdin, Anne
2014-12-01
Pain management in the intensive care unit is often inadequate. There is no tool available to assess nursing pain management practices. The aim of this study was to develop and validate a measuring tool to assess nursing pain management in the intensive care unit during standardized clinical simulation. A literature review was performed to identify relevant components demonstrating optimal pain management in adult intensive care units and to integrate them in an observation tool. This tool was submitted to an expert panel and pretested. It was then used to assess pain management practice during 26 discrete standardized clinical simulation sessions with intensive care nurses. The Nursing Observation Tool for Pain Management (NOTPaM) contains 28 statements grouped into 8 categories, which are grouped into 4 dimensions: subjective assessment, objective assessment, interventions, and reassessment. The tool's internal consistency was calculated at a Cronbach's alpha of 0.436 for the whole tool; the alpha varies from 0.328 to 0.518 for each dimension. To evaluate the inter-rater reliability, intra-class correlation coefficient was used, which was calculated at 0.751 (p < .001) for the whole tool, with variations from 0.619 to 0.920 (p < .01) between dimensions. The expert panel was satisfied with the content and face validity of the tool. The psychometric qualities of the NOTPaM developed in this study are satisfactory. However, the tool could be improved with slight modifications. Nevertheless, it was useful in assessing intensive care nurses' pain management in a standardized clinical simulation. The NOTPaM is the first tool created for this purpose. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
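The internal consistency figure reported for the NOTPaM is Cronbach's alpha, which follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A plain-Python sketch of that computation (the item scores below are made up for illustration, not the study's data):

```python
# Cronbach's alpha from first principles: k items scored over the same set of
# observations; alpha compares the sum of per-item variances with the variance
# of the observation totals. Example scores are invented, not the NOTPaM data.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(items):
    """items: list of per-item score lists, all over the same observations."""
    k = len(items)
    totals = [sum(obs) for obs in zip(*items)]  # total score per observation
    return (k / (k - 1)) * (1 - sum(variance(i) for i in items) / variance(totals))

scores = [[1, 2, 3, 4], [2, 2, 4, 5], [1, 3, 3, 4]]  # 3 items x 4 observations
print(round(cronbach_alpha(scores), 3))  # ~0.95 for these made-up scores
```

An alpha of 0.436, as found for the whole tool, would indicate much weaker inter-item agreement than this toy example.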
Watershed Management Optimization Support Tool v3
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
Decades of CALL Development: A Retrospective of the Work of James Pusack and Sue Otto
ERIC Educational Resources Information Center
Otto, Sue E. K.
2010-01-01
In this article, the author describes a series of projects that James Pusack and the author engaged in together, a number of them to develop CALL authoring tools. With their shared love of technology and dedication to language teaching and learning, they embarked on a long and immensely enjoyable career in CALL during which each project evolved…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... Change To Expand the Availability of Risk Management Tools November 30, 2012. Pursuant to Section 19(b)(1... Exchange's Risk Management Tool (the "Tool") to all Exchange Members. The Tool is currently available... Access Risk Management Tool. This optional service acts as a risk filter by causing the orders of...
Ecological perspective: Linking ecology, GIS, and remote sensing to ecosystem management
Allen, Craig D.; Sample, V. Alaric
1994-01-01
Awareness of significant human impacts on the ecology of Earth's landscapes is not new (Thomas 1956). Over the past decade (Forman and Godron 1986, Urban et al. 1987) applications of geographic information systems (GIS) and remote sensing technologies have supported a rapid rise in landscape-scale research. The heightened recognition within the research community of the ecological linkages between local sites and larger spatial scales has spawned increasing calls for more holistic management of landscapes (Noss 1983, Harris 1984, Risser 1985, Norse et al. 1986, Agee and Johnson 1988, Franklin 1989, Brooks and Grant 1992, Endangered Species Update-Special Issue 1993, Crow 1994, Grumbine 1994). As a result, agencies such as the U.S. Forest Service, U.S. Fish and Wildlife Service, and National Park Service are now converging on "ecosystem management" as a new paradigm to sustainably manage wildlands and maintain biodiversity. However, as this transition occurs, several impediments to implementation of this new paradigm persist, including: (1) significant uncertainty among many land managers about the definition and goals of ecosystem management, (2) inadequate ecological information on the past and present processes and structural conditions of target ecosystems, (3) insufficient experience on the part of land managers with the rapidly diversifying array of GIS and remote sensing tools to effectively use them to support ecology-based land management, and (4) a paucity of intimate, long-term relationships between people (including land managers) and the particular landscape communities to which they belong. This chapter provides an ecological perspective on these issues as applied to ecosystem management in a southwestern U.S. landscape.
Raymond, M; Harrison, M C
2014-12-01
Effective communication, co-operation and teamwork have been identified as key determinants of patient safety. SBAR (Situation, Background, Assessment and Recommendation) is a communication tool recommended by the World Health Organization and the UK National Health Service. SBAR is a structured method for communicating critical information that requires immediate attention and action, contributing to effective escalation of management and increased patient safety. To our knowledge, this is the first study showing use of SBAR in South Africa (SA). The aim was to determine the effectiveness of adopting the SBAR communication tool in an acute clinical setting in SA. In the first phase of this study, neonatal nurses and doctors at Groote Schuur Hospital, Cape Town, were gathered in a focus group and given a questionnaire asking about communication in the neonatal department. Neonatal nurses and doctors were then trained to use SBAR. A telephone audit demonstrated an increase in SBAR use by registrars from 29% to 70% when calling consultants for help. After training, the majority of staff agreed that SBAR had helped with communication, confidence, and quality of patient care. There was qualitative evidence that SBAR led to greater promptness in care of acutely ill patients. Adopting SBAR was associated with perceived improvement in communication between professionals and in the quality and safety of patient care. It is suggested that this simple tool be introduced to many other hospitals in SA.
Auzéric, M; Bellemère, J; Conort, O; Roubille, R; Allenet, B; Bedouch, P; Rose, F-X; Juste, M; Charpiat, B
2009-11-01
Pharmacists play an important role in prescription analysis. They are involved in therapeutic drug monitoring, particularly for drugs with a narrow therapeutic index, prevention and management of drug interactions, and may be called in to identify side effects and adverse events related to drug therapy. For the polymedicated patient, the medical file, the list of prescribed drugs and the history of their administration may be insufficient to adequately assign the responsibility of a given adverse effect to one or more drugs. Graphical representations can sometimes be useful to describe and clarify a sequence of events. In addition, as part of their academic course, students have many occasions to hear about "side effects" and "drug interactions". However, in the academic setting, there are few opportunities to observe the evolution and the consequences of these events. In the course of their hospital training, these students are required to perform patient follow-up for pharmacotherapeutic or educational purposes and to comment on case reports to physicians. The aim of this paper is to present a tool facilitating the graphic display of drug interaction consequences and side effects. This tool can be a useful aid for causality assessment. It structures the students' training course and helps them better understand the commentaries pharmacists provide for physicians. Further development of this tool should contribute to the prevention of adverse drug events.
An empowerment-based approach to developing innovative e-health tools for self-management.
Alpay, Laurence; van der Boog, Paul; Dumaij, Adrie
2011-12-01
E-health is seen as an important technological tool in achieving self-management; however, there is little evidence of how effective e-health is for self-management. Example tools remain experimental and there is limited knowledge yet about the design, use, and effects of this class of tools. By way of introducing a new view on the development of e-health tools dedicated to self-management we aim to contribute to the discussion for further research in this area. Our assumption is that patient empowerment is an important mechanism of e-health self-management and we suggest incorporating it within the development of self-management tools. Important components of empowerment selected from literature are: communication, education and health literacy, information, self-care, decision aids and contact with fellow patients. All components require skills of both patients and the physicians. In this discussion paper we propose how the required skills can be used to specify effective self-management tools.
Coordinating complex decision support activities across distributed applications
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1994-01-01
Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.
The Three C's for Urban Science Education
ERIC Educational Resources Information Center
Emdin, Chris
2008-01-01
In this article, the author outlines briefly what he calls the three C's--a set of tools that can be used to improve urban science education. The author then describes ways that these tools can support students who have traditionally been marginalized. These three aligned and closely connected tools provide practical ways to engage students in…
Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles
ERIC Educational Resources Information Center
Carlson, Scott
2006-01-01
Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…
Evaluation of Knowla: An Online Assessment and Learning Tool
ERIC Educational Resources Information Center
Thompson, Meredith Myra; Braude, Eric John
2016-01-01
The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…
ERIC Educational Resources Information Center
Weldeana, Hailu Nigus; Sbhatu, Desta Berhe
2017-01-01
Background: This article reports contributions of an assessment tool called Portfolio of Evidence (PE) in learning college geometry. Material and methods: Two classes of second-year students from one Ethiopian teacher education college, assigned into Treatment and Comparison classes, participated. The assessment tools used in the Treatment…
iMAR: An Interactive Web-Based Application for Mapping Herbicide Resistant Weeds.
Panozzo, Silvia; Colauzzi, Michele; Scarabel, Laura; Collavo, Alberto; Rosan, Valentina; Sattin, Maurizio
2015-01-01
Herbicides are the major weed control tool in most cropping systems worldwide. However, the high reliance on herbicides has led to environmental issues as well as to the evolution of herbicide-resistant biotypes. Resistance is a major concern in modern agriculture and early detection of resistant biotypes is therefore crucial for its management and prevention. In this context, a timely update of resistance biotypes distribution is fundamental to devise and implement efficient resistance management strategies. Here we present an innovative web-based application called iMAR (interactive MApping of Resistance) for the mapping of herbicide resistant biotypes. It is based on open source software tools and translates into maps the data reported in the GIRE (Italian herbicide resistance working group) database of herbicide resistance at national level. iMAR allows an automatic, easy and cost-effective updating of the maps and provides two different systems, "static" and "dynamic". In the first one, the user's choices are guided by a hierarchical tree menu, whereas the latter is more flexible and includes multiple choice criteria (type of resistance, weed species, region, cropping systems) that permit customized maps to be created. The generated information can be useful to various stakeholders who are involved in weed resistance management: farmers, advisors, national and local decision makers as well as the agrochemical industry. iMAR is freely available, and the system has the potential to handle large datasets and to be used for other purposes with geographical implications, such as the mapping of invasive plants or pests.
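The "dynamic" system's multiple-criteria selection described above amounts to filtering the resistance-case database on any combination of fields before mapping. A minimal sketch (the records, field names, and values below are invented for illustration and are not the actual GIRE schema):

```python
# Hypothetical resistance-case records, loosely mirroring the kinds of
# fields the abstract mentions (type of resistance, species, region, crop)
cases = [
    {"species": "Lolium rigidum", "region": "Veneto",
     "resistance": "ALS", "crop": "wheat"},
    {"species": "Amaranthus retroflexus", "region": "Lombardia",
     "resistance": "ALS", "crop": "maize"},
    {"species": "Lolium rigidum", "region": "Veneto",
     "resistance": "ACCase", "crop": "wheat"},
]

def select_cases(cases, **criteria):
    """'Dynamic' multiple-criteria selection: keep only the records that
    match every supplied field, ready to be plotted on a map layer."""
    return [c for c in cases
            if all(c.get(field) == value for field, value in criteria.items())]
```

For example, `select_cases(cases, region="Veneto", resistance="ALS")` narrows the map to ALS-resistant biotypes in one region, which is exactly the kind of customized map the dynamic system produces.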
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, G.
1984-09-01
Two classifications of fishing jobs are discussed: open hole and cased hole. When there is no casing in the area of the fish, it is called open hole fishing. When the fish is inside the casing, it is called cased hole fishing. The article lists various things that can become a fish, including: broken drill pipe, drill collars, bits, bit cones, hand tools dropped in the well, sanded-up or mud-stuck tubing, stuck packers, and much more. It is suggested that on a fishing job, all parties involved should cooperate with each other, and that fishing-tool people obtain all the information concerning the well. That way they can select the right tools and methods to clean out the well as quickly as possible.
Management aspects of Gemini's base facility operations project
NASA Astrophysics Data System (ADS)
Arriagada, Gustavo; Nitta, Atsuko; Adamson, A. J.; Nunez, Arturo; Serio, Andrew; Cordova, Martin
2016-08-01
Gemini's Base Facilities Operations (BFO) Project provided the capabilities to perform routine nighttime operations without anyone on the summit. The expected benefits were to achieve money savings and to become an enabler of the future development of remote operations. The project was executed using a tailored version of Prince2 project management methodology. It was schedule driven and managing it demanded flexibility and creativity to produce what was needed, taking into consideration all the constraints present at the time: Time available to implement BFO at Gemini North (GN), two years. The project had to be done in a matrix resources environment. There were only three resources assigned exclusively to BFO. The implementation of new capabilities had to be done without disrupting operations. And we needed to succeed, introducing the new operational model that implied Telescope and instrumentation Operators (Science Operations Specialists - SOS) relying on technology to assess summit conditions. To meet schedule we created a large number of concurrent smaller projects called Work Packages (WP). To be reassured that we would successfully implement BFO, we initially spent a good portion of time and effort, collecting and learning about user's needs. This was done through close interaction with SOSs, Observers, Engineers and Technicians. Once we had a clear understanding of the requirements, we took the approach of implementing the "bare minimum" necessary technology that would meet them and that would be maintainable in the long term. Another key element was the introduction of the "gradual descent" concept. In this, we increasingly provided tools to the SOSs and Observers to prevent them from going outside the control room during nighttime operations, giving them the opportunity of familiarizing themselves with the new tools over a time span of several months. 
Also, by using these tools at an early stage, Engineers and Technicians had more time for debugging, problem fixing and systems usage and servicing training as well.
A software platform for statistical evaluation of patient respiratory patterns in radiation therapy.
Dunn, Leon; Kenny, John
2017-10-01
The aim of this work was to design and evaluate a software tool for analysis of a patient's respiration, with the goal of optimizing the effectiveness of motion management techniques during radiotherapy imaging and treatment. A software tool was developed to analyse patient respiratory data files (.vxp files) created by the Varian Real-Time Position Management System (RPM). The software, called RespAnalysis, was created in MATLAB and provides four modules, one each for determining respiration characteristics, providing breathing coaching (biofeedback training), comparing pre- and post-training characteristics, and performing a fraction-by-fraction assessment. The modules analyse respiratory traces to determine signal characteristics and specifically use a Sample Entropy algorithm as the key means to quantify breathing irregularity. Simulated respiratory signals, as well as 91 patient RPM traces, were analysed with RespAnalysis to test the viability of using Sample Entropy for predicting breathing regularity. Retrospective assessment of patient data demonstrated that the Sample Entropy metric was a predictor of periodic irregularity in respiration data; however, it was found to be insensitive to amplitude variation. Additional waveform statistics assessing the distribution of signal amplitudes over time, coupled with the Sample Entropy method, were found to be useful in assessing breathing regularity. The RespAnalysis software tool presented in this work uses the Sample Entropy method to analyse patient respiratory data recorded for motion management purposes in radiation therapy. This is applicable during treatment simulation and during subsequent treatment fractions, providing a way to quantify breathing irregularity, as well as assess the need for breathing coaching.
It was demonstrated that the Sample Entropy metric was correlated to the irregularity of the patient's respiratory motion in terms of periodicity, whilst other metrics, such as percentage deviation of inhale/exhale peak positions provided insight into respiratory amplitude regularity. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
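The Sample Entropy metric follows the standard SampEn(m, r) definition: count pairs of length-m templates whose Chebyshev distance stays within a tolerance r, repeat for length m+1, and report -ln(A/B). The sketch below illustrates the idea in Python (a generic implementation of the published algorithm, not the authors' MATLAB code; the parameter defaults are common choices, not necessarily theirs):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample Entropy SampEn(m, r) of a 1-D signal. Counts pairs of
    template vectors of length m (count B) and m+1 (count A) whose
    Chebyshev distance is below the tolerance r, and returns -ln(A/B).
    Higher values indicate a more irregular signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # a commonly used default tolerance

    def count_matches(length):
        # All overlapping templates of the given length
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (no self-matches)
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d < r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # no matches: entropy is unbounded
    return -np.log(a / b)

# A regular (periodic) trace scores lower than an irregular one
t = np.linspace(0, 20 * np.pi, 2000)
regular = np.sin(t)                        # idealized periodic breathing
irregular = np.random.default_rng(0).standard_normal(2000)  # noise
```

Consistent with the finding above, this metric responds to irregular periodicity rather than amplitude: rescaling a trace rescales r with it, leaving the score largely unchanged, which is why the authors supplement it with amplitude-distribution statistics.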
Watershed Management Optimization Support Tool (WMOST) v3: User Guide
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context that is, accou...
Watershed Management Optimization Support Tool (WMOST) v3: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that facilitates integrated water management at the local or small watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed context, accounting fo...
Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Advanced Simulation and Computing: A Summary Report to the Director's Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, M G; Peck, T
2003-06-01
It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, has been reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management. Such is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the "Assessment File".
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
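The Bayesian Belief Network approach behind the second tool can be illustrated with a toy network (the states, observables, and all probabilities below are invented for illustration and bear no relation to the EXPLORIS figures): a hidden volcanic state with a prior distribution, two monitoring observables assumed conditionally independent given the state, and a posterior obtained by enumeration:

```python
# Illustrative prior over the hidden volcanic state
priors = {"quiet": 0.7, "unrest": 0.25, "eruptive": 0.05}
# Illustrative likelihoods P(observable present | state)
p_seismic = {"quiet": 0.05, "unrest": 0.6, "eruptive": 0.9}
p_gas = {"quiet": 0.1, "unrest": 0.5, "eruptive": 0.85}

def posterior(seismic: bool, gas: bool):
    """P(state | evidence) by direct enumeration, assuming the two
    observables are conditionally independent given the state."""
    unnorm = {}
    for state, prior in priors.items():
        ps = p_seismic[state] if seismic else 1 - p_seismic[state]
        pg = p_gas[state] if gas else 1 - p_gas[state]
        unnorm[state] = prior * ps * pg  # prior x likelihood of evidence
    z = sum(unnorm.values())             # normalizing constant
    return {s: v / z for s, v in unnorm.items()}
```

Real observatory BBNs combine many more strands of monitoring evidence, but the update rule is the same: multiply each state's prior by the likelihood of the observed evidence and renormalize.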
Aerospace Power Systems Design and Analysis (APSDA) Tool
NASA Technical Reports Server (NTRS)
Truong, Long V.
1998-01-01
The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.
Managing a work-life balance: the experiences of midwives working in a group practice setting.
Fereday, Jennifer; Oster, Candice
2010-06-01
To explore how a group of midwives achieved a work-life balance working within a caseload model of care with flexible work hours and on-call work. In-depth interviews were conducted and the data were analysed using a data-driven thematic analysis technique. Children, Youth and Women's Health Service (CYWHS) (previously Women's and Children's Hospital), Adelaide, where a midwifery service known as Midwifery Group Practice (MGP) offers a caseload model of care to women within a midwife-managed unit. 17 midwives who were currently working, or had previously worked, in MGP. Analysis of the midwives' individual experiences provided insight into how midwives managed the flexible hours and on-call work to achieve a sustainable work-life balance within a caseload model of care. It is important for midwives working in MGP to actively manage the flexibility of their role with time on call. Organisational, team and individual structure influenced how flexibility of hours was managed; however, a period of adjustment was required to achieve this balance. The study findings offer a description of effective, sustainable strategies to manage flexible hours and on-call work that may assist other midwives working in a similar role or considering this type of work setting. Copyright 2008 Elsevier Ltd. All rights reserved.
Verbist, Bie M P; Thys, Kim; Reumers, Joke; Wetzels, Yves; Van der Borght, Koen; Talloen, Willem; Aerssens, Jeroen; Clement, Lieven; Thas, Olivier
2015-01-01
In virology, massively parallel sequencing (MPS) opens many opportunities for studying viral quasi-species, e.g. in HIV-1- and HCV-infected patients. This is essential for understanding pathways to resistance, which can substantially improve treatment. Although MPS platforms allow in-depth characterization of sequence variation, their measurements still involve substantial technical noise. For Illumina sequencing, single base substitutions are the main error source and impede powerful assessment of low-frequency mutations. Fortunately, base calls are complemented with quality scores (Qs) that are useful for differentiating errors from the real low-frequency mutations. A variant calling tool, Q-cpileup, is proposed that exploits the Qs of nucleotides in a filtering strategy to increase specificity. The tool is embedded in an open-source pipeline, VirVarSeq, which allows variant calling starting from fastq files. Using both plasmid mixtures and clinical samples, we show that Q-cpileup is able to reduce the number of false-positive findings. The filtering strategy is adaptive and provides an optimized threshold for individual samples in each sequencing run. Additionally, linkage information is kept between single-nucleotide polymorphisms as variants are called at the codon level. This enables virologists to have an immediate biological interpretation of the reported variants with respect to their antiviral drug responses. A comparison with existing SNP caller tools reveals that calling variants at the codon level with Q-cpileup results in an outstanding sensitivity while maintaining a good specificity for variants with frequencies down to 0.5%. VirVarSeq is available, together with a user's guide and test data, at sourceforge: http://sourceforge.net/projects/virtools/?source=directory. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
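The quality-score filtering idea is simple to sketch: Phred scores encode the error probability of each base call (Q = -10·log10 p), so low-Q calls can be discarded before allele frequencies are computed. The toy below illustrates the principle only; it is not Q-cpileup itself (whose threshold is adaptive per sequencing run and which calls variants at the codon, not single-base, level), and the pileup data and cutoffs are invented:

```python
from collections import Counter

# Illustrative per-position pileup: (called_base, Phred quality score)
pileup = [("A", 38), ("A", 36), ("C", 12), ("A", 39), ("C", 35),
          ("A", 37), ("C", 8), ("A", 40), ("A", 33), ("C", 34)]

def call_variants(pileup, q_threshold=30, min_freq=0.005):
    """Q-score-aware variant calling sketch: discard base calls whose
    Phred quality falls below q_threshold (likely sequencing errors),
    then report alleles whose frequency among the retained calls is at
    least min_freq (0.5%, matching the sensitivity floor quoted above)."""
    kept = [base for base, q in pileup if q >= q_threshold]
    counts = Counter(kept)
    depth = len(kept)
    return {b: n / depth for b, n in counts.items() if n / depth >= min_freq}
```

In this toy example the two lowest-quality C calls are dropped as probable errors, and the surviving C calls are still reported as a genuine low-frequency variant rather than being lost in the noise.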
NASA Technical Reports Server (NTRS)
Lee, Katharine
2004-01-01
The Surface Management System (SMS) is a decision support tool that will help controllers, traffic managers, and NAS users manage the movements of aircraft on the surface of busy airports, improving capacity, efficiency, and flexibility. The Advanced Air Transportation Technologies (AATT) Project at NASA is developing SMS in cooperation with the FAA's Free Flight Phase 2 (FFP2) program. SMS consists of three parts: a traffic management tool, a controller tool, and a National Airspace System (NAS) information tool.
76 FR 52640 - Pacific Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-23
... Pacific Fishery Management Council's (Pacific Council) ad hoc groundfish Essential Fish Habitat Review Committee (EFHRC) will hold a conference call to continue the periodic review of groundfish Essential Fish Habitat (EFH). DATES: The conference call will be held Friday, September 9, 2011 from 9 a.m. to 11 a.m...
Lehmann, Ashton E; Kozin, Elliott D; Sethi, Rosh K V; Wong, Kevin; Lin, Brian M; Gray, Stacey T; Cunningham, Michael J
2018-05-01
Otolaryngology residents are often responsible for triaging after-hours patient calls. However, residents receive little training on this topic. Data are limited on the clinical content, reporting, and management of otolaryngology patient calls. This study aimed to characterize the patient concerns residents handle by phone and their subsequent management and reporting. Retrospective review. Five hundred consecutive after-hours patient calls in a tertiary pediatric hospital were reviewed. Data collected included patient and caller demographics, clinical concerns, surgical history, recommendations, and subsequent emergency department (ED) visits. On average, 3.7 calls occurred per shift, 2.8 on weekday and 5.9 on weekend shifts. Mean patient age was 6.6 years. Mothers (71%) called most frequently. The majority of calls were postoperative (64.2%). Of postoperative calls, most occurred within 3 days of surgery (52.3%). Most calls were for surgical site bleeding (19.9%). Residents recommended ED evaluation for 17.2% of calls, of which 20.9% returned to the primary institution ED. ED evaluation was recommended more frequently for postoperative patients (P = .040), particularly following adenotonsillectomy (51.2%) or surgical site bleeding (18.6%). With respect to documentation, 32.8% of medical record numbers were absent, 11.8% had name errors, and 2.2% of patients could not be identified. This is the first study to analyze the management and reporting of patient calls by otolaryngology residents. A wide array of clinical concerns are triaged by phone conversations. The study has implications for both resident and patient education. Level of Evidence: 4. Laryngoscope, 128:E163-E170, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
Waldrop, Deborah P; Clemency, Brian; Lindstrom, Heather A; Clemency Cordes, Colleen
2015-09-01
Emergency 911 calls are often made when the end stage of an advanced illness is accompanied by alarming symptoms and substantial anxiety for family caregivers, particularly when an approaching death is not anticipated. How prehospital providers (paramedics and emergency medical technicians) manage emergency calls near death influences how and where people will die, if their end-of-life choices are upheld and how appropriately health care resources are used. The purpose of this study was to explore and describe how prehospital providers assess and manage end-of-life emergency calls. In-depth and in-person interviews were conducted with 43 prehospital providers. Interviews were audiotaped, transcribed, and entered into ATLAS.ti for data management and coding. Qualitative data analysis involved systematic and axial coding to identify and describe emergent themes. Four themes illustrate the nature and dynamics of emergency end-of-life calls: 1) multifocal assessment (e.g., of the patient, family, and environment), 2) family responses (e.g., emotional, behavioral), 3) conflicts (e.g., missing do-not-resuscitate order, patient-family conflicts), and 4) management of the dying process (e.g., family witnessed resuscitation or asking family to leave, decisions about hospital transport). After a rapid comprehensive multifocal assessment, family responses and the existence of conflicts mediate decision making about possible interventions. The importance of managing symptom crises and stress responses that accompany the dying process is particularly germane to quality care at life's end. The results suggest the importance of increasing prehospital providers' abilities to uphold advance directives and patients' end-of-life wishes while managing family emotions near death. Copyright © 2015 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
Managing Epilepsy Well: Emerging e-Tools for epilepsy self-management.
Shegog, Ross; Bamps, Yvan A; Patel, Archna; Kakacek, Jody; Escoffery, Cam; Johnson, Erica K; Ilozumba, Ukwuoma O
2013-10-01
The Managing Epilepsy Well (MEW) Network was established in 2007 by the Centers for Disease Control and Prevention Epilepsy Program to expand epilepsy self-management research. The network has employed collaborative research strategies to develop, test, and disseminate evidence-based, community-based, and e-Health interventions (e-Tools) for epilepsy self-management for people with epilepsy, caregivers, and health-care providers. Since its inception, MEW Network collaborators have conducted formative studies (n=7) investigating the potential of e-Health to support epilepsy self-management and intervention studies evaluating e-Tools (n=5). The MEW e-Tools (the MEW website, WebEase, UPLIFT, MINDSET, and PEARLS online training) and affiliated e-Tools (Texting 4 Control) are designed to complement self-management practices in each phase of the epilepsy care continuum. These tools exemplify a concerted research agenda, shared methodological principles and models for epilepsy self-management, and a communal knowledge base for implementing e-Health to improve quality of life for people with epilepsy. © 2013.
The FASTER Approach: A New Tool for Calculating Real-Time Tsunami Flood Hazards
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Cross, A.; Johnson, L.; Miller, K.; Nicolini, T.; Whitmore, P.
2014-12-01
In the aftermath of the 2010 Chile and 2011 Japan tsunamis that struck the California coastline, emergency managers requested that the state tsunami program provide more detailed information about the flood potential of distant-source tsunamis well ahead of their arrival time. The main issue is that existing tsunami evacuation plans call for evacuation of the predetermined "worst-case" tsunami evacuation zone (typically at a 30- to 50-foot elevation) during any "Warning" level event; the alternative is to not call an evacuation at all. A solution to provide more detailed information for secondary evacuation zones has been the development of tsunami evacuation "playbooks" to plan for tsunami scenarios of various sizes and source locations. To determine a recommended level of evacuation during a distant-source tsunami, an analytical tool has been developed called the "FASTER" approach, an acronym for the factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and the Run-up potential. Within the first couple of hours after a tsunami is generated, the National Tsunami Warning Center provides tsunami forecast amplitudes and arrival times for approximately 60 coastal locations in California. At the same time, the regional NOAA Weather Forecast Offices in the state calculate the forecasted coastal storm and tidal conditions that will influence tsunami flooding. To provide added conservatism in calculating tsunami flood potential, we include an error factor of 30% of the forecast amplitude, which is based on observed forecast errors during recent events, and a site-specific run-up factor which is calculated from the existing state tsunami modeling database.
The factors are added together into a cumulative FASTER flood potential value for the first five hours of tsunami activity and used to select the appropriate tsunami phase evacuation "playbook" which is provided to each coastal community shortly after the forecast is provided.
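The additive structure of the FASTER value can be sketched in a few lines. This is an illustrative reading of the abstract, not the operational California calculation: the hourly numbers, the run-up scaling, and the use of the worst hourly value over the five-hour window are all assumptions; only the five named factors and the 30% error term come from the description above.

```python
# Hypothetical sketch of the FASTER flood-potential idea:
# value = Forecast Amplitude + Storm + Tides + Error + Run-up,
# where Error is taken as 30% of the forecast amplitude.
# All numeric inputs below are invented, not playbook values.

def faster_flood_potential(forecast_amp_ft, storm_surge_ft, tide_ft,
                           runup_factor, error_fraction=0.30):
    """Combine the FASTER factors into one flood-potential value (feet)."""
    error_ft = error_fraction * forecast_amp_ft
    runup_ft = runup_factor * forecast_amp_ft  # assumed site-specific scaling
    return forecast_amp_ft + storm_surge_ft + tide_ft + error_ft + runup_ft

# Evaluate hourly forecasts for the first five hours; keep the worst case.
hourly = [
    # (forecast amplitude, storm surge, tide height), all in feet
    (3.0, 0.5, 2.1),
    (4.2, 0.5, 2.8),
    (3.8, 0.4, 3.1),
    (2.9, 0.4, 2.6),
    (2.5, 0.3, 1.9),
]
potentials = [faster_flood_potential(a, s, t, runup_factor=0.2)
              for a, s, t in hourly]
worst_case = max(potentials)
print(worst_case)  # flood potential used to select an evacuation playbook
```

The worst hourly value here comes from hour two (4.2 ft amplitude), which dominates even though hour three has the highest tide.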
33 CFR 402.7 - Service Incentive Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... number of calls scheduled for the Navigation Season. Additional calls to the system may be added during the season. (f) The carrier will advise the Manager of port rotation, outlining core ports of calls... carrier must meet 75% schedule adherence with a minimum of four (4) Great Lakes calls during the...
Best Practice for After-Hours Hospice Symptom Management: A Literature Review.
Slack, Cheryl
2015-10-01
Medicare-certified hospice home care agencies must provide a 24/7 on-call system to respond to patient and caregiver concerns. How these calls are handled impacts patient and family outcomes and satisfaction. Ideally, hospice nurses provide adequate caregiver education during routine visits to minimize the need for after-hours calls. A literature review provided evidence that hospice nurse education and appropriate telephone support improves symptom management, enhances family support, provides a sense of security, reduces anxiety, and promotes comfort.
Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies
The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...
Lynch, Abigail J.; Taylor, William W.; McCright, Aaron M.
2016-01-01
Decision support tools can aid decision making by systematically incorporating information, accounting for uncertainties, and facilitating evaluation between alternatives. Without user buy-in, however, decision support tools can fail to influence decision-making processes. We surveyed fishery researchers, managers, and fishers affiliated with the Lake Whitefish Coregonus clupeaformis fishery in the 1836 Treaty Waters of Lakes Huron, Michigan, and Superior to assess opinions of current and future management needs to identify barriers to, and opportunities for, developing a decision support tool based on Lake Whitefish recruitment projections with climate change. Approximately 64% of 39 respondents were satisfied with current management, and nearly 85% agreed that science was well integrated into management programs. Though decision support tools can facilitate science integration into management, respondents suggest that they face significant implementation barriers, including lack of political will to change management and perceived uncertainty in decision support outputs. Recommendations from this survey can inform development of decision support tools for fishery management in the Great Lakes and other regions.
Carrier, Emily; Reschovsky, James
2009-12-01
Use of care management tools--such as group visits or patient registries--varies widely among primary care physicians whose practices care for patients with four common chronic conditions--asthma, diabetes, congestive heart failure and depression--according to a new national study by the Center for Studying Health System Change (HSC). For example, less than a third of these primary care physicians in 2008 reported their practices use nurse managers to coordinate care, and only four in 10 were in practices using registries to keep track of patients with chronic conditions. Physicians also used care management tools for patients with some chronic conditions but not others. Practice size and setting were strongly related to the likelihood that physicians used care management tools, with solo and smaller group practices least likely to use care management tools. The findings suggest that, along with experimenting with financial incentives for primary care physicians to adopt care management tools, policy makers might consider developing community-level care management resources, such as nurse managers, that could be shared among smaller physician practices.
ERIC Educational Resources Information Center
Ambrose, Regina Maria; Palpanathan, Shanthini
2017-01-01
Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…
ERIC Educational Resources Information Center
Sheehan, Mark D.; Thorpe, Todd; Dunn, Robert
2015-01-01
Much has been gained over the years in various educational fields that have taken advantage of CALL. In many cases, CALL has facilitated learning and provided teachers and students access to materials and tools that would have remained out of reach were it not for technology. Nonetheless, there are still cases where a lack of funding or access to…
Sartori, Riccardo; Costantini, Arianna; Ceschi, Andrea; Tommasi, Francesco
2018-01-01
The article aims to be a reflective paper on the interconnected concepts of training, development and innovation and the potential they have in dealing with change in organizations. We call change both the process through which something becomes different and the result of that process. Change management is the expression used to define the complex of activities, functions, and tools (such as training courses) through which an organization deals with the introduction of something new that is relevant for both its survival and growth. Training and development are labels used to define those educational activities implemented in organizations to empower the competences of workers, employees and managers in the lifelong learning perspective of improving their performance. Consequently, we define competences as those personal characteristics that allow people to be effective in the changing contexts of both workplace and everyday life. They are also necessary in organizational innovation, which is the process of transforming ideas or inventions into goods or services that generate value and for which customers will pay. Training, development, and innovation are three different but interconnected functions by which organizations manage change. What is the state of the art of the literature dealing with these topics? Here is a critical review of the matter. PMID:29662463
EPA Registers Innovative Tool to Control Corn Rootworm
Ribonucleic acid interference (RNAi)-based Plant-Incorporated Protectant (PIP) technology is a new and innovative scientific tool utilized by U.S. growers. Learn more about RNAi technology and the 4 new products containing the RNAi-based PIP called SMARTST
Concepts, tools, and strategies for effluent testing: An international survey
Whole effluent testing (also called Direct Toxicity Assessment) remains a critical long-term assessment tool for aquatic environmental protection. Use of animal alternative approaches for wastewater testing is expected to increase as more regulatory authorities routinely require ...
Asthma education for school staff in Riyadh city: effectiveness of pamphlets as an educational tool.
Abdel Gawwad, Ensaf S; El-Herishi, Sultana
2007-01-01
Teachers and support staff are often called upon to manage asthma at school but may have little knowledge and understanding of the condition. The objectives of this study were to develop an educational package (pamphlets) about asthma and to assess its effectiveness as an educational tool for school staff by evaluating its impact on the staff's asthma-related knowledge, attitudes, and management practices with their pupils. A pre-post experimental research design was used in Riyadh city, with distribution of self-administered questionnaires and the asthma package to 4 randomly selected girls' school compounds. Participants were school staff (n = 297) of primary, intermediate, and secondary schools. Results showed that only 5.7% of the staff had received previous training in asthma education. Lack of knowledge and misconceptions about asthma medication were evident among a considerable proportion of the staff, specifically regarding the use of antibiotics, steroids, the side effects of Ventolin, and the addictive potential of inhalers. At pretest, only 35% and 40.1% of the staff had a good level of knowledge and of management practices, respectively. At posttest, the corresponding percentages increased significantly to 83.9% and 68.6%, respectively. The mean total score of staff's asthma-related attitudes became more favorable toward asthma education after the intervention, increasing significantly from 53.5 to 55.0. Total posttest knowledge score was the only predictor of both staff attitudes and management practices, accounting for 9.1% and 10.2% of their variance, respectively. The great majority cited lack of training (92%), unavailability of a school policy (86.8%), and shortage of educational resources (88.3%) as barriers to asthma education and management in their schools. Most school staff had a poor to fair level of asthma knowledge and management practices. This simple educational intervention, using pamphlets and demonstration of inhaler and peak flow meter use, was significantly successful in enhancing staff's asthma-related knowledge, attitudes, and management practices with their pupils. It is very important that training be directed to all staff through pre-service and in-service programs.
ERIC Educational Resources Information Center
Bustamante, Rebecca M.
2006-01-01
This module is designed to introduce educational leaders to an organizational assessment tool called a "culture audit." Literature on organizational cultural competence suggests that culture audits are a valuable tool for determining how well school policies, programs, and practices respond to the needs of diverse groups and prepare…
Active Reading Documents (ARDs): A Tool to Facilitate Meaningful Learning through Reading
ERIC Educational Resources Information Center
Dubas, Justin M.; Toledo, Santiago A.
2015-01-01
Presented here is a practical tool called the Active Reading Document (ARD) that can give students the necessary incentive to engage with the text/readings. By designing the tool to incrementally develop student understanding of the material through reading using Marzano's Taxonomy as a framework, the ARD offers support through scaffolding as they…
HYPATIA--An Online Tool for ATLAS Event Visualization
ERIC Educational Resources Information Center
Kourkoumelis, C.; Vourakis, S.
2014-01-01
This paper describes an interactive tool for analysis of data from the ATLAS experiment taking place at the world's highest energy particle collider at CERN. The tool, called HYPATIA/applet, enables students of various levels to become acquainted with particle physics and look for discoveries in a similar way to that of real research.
Developing Multimedia Courseware for the Internet's Java versus Shockwave.
ERIC Educational Resources Information Center
Majchrzak, Tina L.
1996-01-01
Describes and compares two methods for developing multimedia courseware for use on the Internet: an authoring tool called Shockwave, and an object-oriented language called Java. Topics include vector graphics, browsers, interaction with network protocols, data security, multithreading, and computer languages versus development environments. (LRW)
76 FR 3653 - Alaska Region's Subsistence Resource Commission (SRC) Program; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-20
... subsistence management issues. The NPS SRC program is authorized under Title VIII, Section 808 of the Alaska...: 1. Call to order. 2. SRC Roll Call and Confirmation of Quorum. 3. Welcome and Introductions. 4.... c. Resource Management Program Update. 14. Public and other Agency Comments. 15. SRC Work Session...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
... INFORMATION CONTACT: If you have questions on this rule, call or email Mr. Jim Rousseau, Bridge Management Specialist, Fifth Coast Guard District; telephone: (757) 398-6557; Email: [email protected] . If you have any questions on viewing the docket, call Renee V. Wright, Program Manager, Docket Operations...
Piette, John D.; Marinec, Nicolle; Gallegos-Cabriales, Esther C.; Gutierrez-Valverde, Juana Mercedes; Rodriguez-Saldaña, Joel; Mendoz-Alevares, Milton; Silveira, Maria J.
2013-01-01
We used data from Interactive Voice Response (IVR) self-management support studies in Honduras, Mexico, and the United States (US) to determine whether IVR calls to Spanish-speaking patients with chronic illnesses are a feasible strategy for improving monitoring and education between face-to-face visits. A total of 268 patients with diabetes or hypertension participated in 6–12 weeks of weekly IVR follow-up. IVR calls emanated from US servers with connections via Voice over IP. More than half (54%) of patients enrolled with an informal caregiver who received automated feedback based on the patient’s assessments, and clinical staff received urgent alerts. Participants had on average 6.1 years of education, and 73% were women. After 2,443 person-weeks of follow-up, patients completed 1,494 IVR assessments. Call completion rates were higher in the US (75%) than in Honduras (59%) or Mexico (61%; p<0.001). Patients participating with an informal caregiver were more likely to complete calls (adjusted odds ratio [AOR]: 1.53; 95% confidence interval [CI]: 1.04, 2.25), while patients reporting fair or poor health at enrollment were less likely (AOR: 0.59; 95% CI: 0.38, 0.92). Satisfaction rates were high, with 98% of patients reporting that the system was easy to use, and 86% reporting that the calls helped them a great deal in managing their health problems. In summary, IVR self-management support is feasible among Spanish-speaking patients with chronic disease, including those living in less-developed countries. Voice over IP can be used to deliver IVR disease management services internationally; involving informal caregivers may increase patient engagement. PMID:23532005
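The odds-ratio statistics reported above come from the study's adjusted regression models, which this sketch does not reproduce. As an illustration of the underlying arithmetic only, here is an unadjusted odds ratio with a Woolf 95% confidence interval computed from a 2x2 table; the counts are invented, not the study's data.

```python
import math

# Hypothetical 2x2 counts (NOT the study's data): completed vs. missed
# IVR assessments for patients enrolled with and without a caregiver.
completed_with_cg, missed_with_cg = 420, 180
completed_alone, missed_alone = 350, 250

# Unadjusted odds ratio and Woolf (log-scale) 95% confidence interval.
odds_ratio = (completed_with_cg / missed_with_cg) / (completed_alone / missed_alone)
se_log_or = math.sqrt(1 / completed_with_cg + 1 / missed_with_cg
                      + 1 / completed_alone + 1 / missed_alone)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

An adjusted odds ratio such as the AOR of 1.53 above additionally controls for covariates (e.g., country, baseline health) in a logistic regression, so it will generally differ from the raw 2x2 calculation.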
Management Matters: The Library Media Specialist's Management Toolbox
ERIC Educational Resources Information Center
Pappas, Marjorie L.
2004-01-01
Library media specialists need tools to help them manage the school library media program. The Internet includes a vast array of tools that a library media specialist might find useful. The websites and electronic resources included in this article are only a representative sample and future columns may explore additional tools. All the tools are…
Vehicle Thermal Management Models and Tools | Transportation Research | NREL
The National Renewable Energy Laboratory's (NREL's) vehicle thermal management modeling tools allow researchers to assess the trade-offs and calculate the potential benefits of thermal design options.
Adoption of online health management tools among healthy older adults: An exploratory study.
Zettel-Watson, Laura; Tsukerman, Dmitry
2016-06-01
As the population ages and chronic diseases abound, overburdened healthcare systems will increasingly require individuals to manage their own health. Online health management tools, quickly increasing in popularity, have the potential to diminish or even replace in-person contact with health professionals, but overall efficacy and usage trends are unknown. The current study explored perceptions and usage patterns among users of online health management tools, and identified barriers and barrier-breakers among non-users. An online survey was completed by 169 computer users (aged 50+). Analyses revealed that a sizable minority (37%) of participants use online health management tools and most users (89%) are satisfied with these tools, but a limited range of tools are being used and usage occurs in relatively limited domains. Improved awareness and education for online health management tools could enhance people's abilities to remain at home as they age, reducing the financial burden on formal assistance programs. © The Author(s) 2014.
The role of risk-based prioritization in total quality management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C.T.
1994-10-01
The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper will argue for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.
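The abstract does not disclose the LIPS scoring model, but the general shape of a risk-based prioritization can be sketched generically. The likelihood-times-consequence score, the mission weight, and all project names and numbers below are illustrative assumptions, not the actual LIPS methodology.

```python
# Generic risk-based prioritization sketch in the spirit of tools like LIPS.
# The scoring model (likelihood x consequence x mission weight) is an
# invented expected-loss heuristic, not the real LIPS algorithm.

def risk_score(likelihood, consequence_musd, mission_weight):
    """Expected-loss style score: higher means allocate resources first."""
    return likelihood * consequence_musd * mission_weight

projects = [
    # (name, annual likelihood of failure, consequence [$M], mission weight)
    ("seismic retrofit", 0.30, 12.0, 1.0),
    ("fire suppression", 0.10, 40.0, 0.8),
    ("HVAC replacement", 0.60,  2.0, 0.5),
]

# Rank projects so limited operations budget goes to the highest risk first.
ranked = sorted(projects,
                key=lambda p: risk_score(p[1], p[2], p[3]),
                reverse=True)
for name, *_ in ranked:
    print(name)
```

Ranking by a single scalar score is the simplest possible reduction; a real decision-analytic tool would also carry uncertainty ranges on each input rather than point estimates.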
Vernier, Françoise; Leccia-Phelpin, Odile; Lescot, Jean-Marie; Minette, Sébastien; Miralles, André; Barberis, Delphine; Scordia, Charlotte; Kuentz-Simonet, Vanessa; Tonneau, Jean-Philippe
2017-03-01
Non-point source pollution is a cause of major concern within the European Union. This is reflected in increasing public and political focus on a more sustainable use of pesticides, as well as a reduction in diffuse pollution. Climate change will likely lead to an even more intensive use of pesticides in the future, affecting agriculture in many ways. At the same time, the Water Framework Directive (WFD) and associated EU policies called for a "good" ecological and chemical status to be achieved for water bodies by the end of 2015, currently delayed to 2021-2027 due to a lack of efficiency in policies and the timescale of resilience for hydrosystems, especially groundwater systems. Water managers need appropriate and user-friendly tools to design agro-environmental policies. These tools should help them to evaluate the potential impacts of mitigation measures on water resources, more clearly define protected areas, and more efficiently distribute financial incentives to farmers who agree to implement alternative practices. At present, a number of reports point out that water managers do not use appropriate information from monitoring or models to make decisions and set environmental action plans. In this paper, we propose an integrated and collaborative approach to analyzing changes in land use, farming systems, and practices and to assessing their effects on agricultural pressure and pesticide transfers to waters. The integrated modeling of agricultural scenarios (IMAS) framework draws on a range of data and expert knowledge available within areas where a pesticide action plan can be defined to restore water quality: French "Grenelle law" catchment areas, French Water Development and Management Plan areas, etc. A so-called "reference scenario" represents the actual soil occupation and pesticide-spraying practices used in both conventional and organic farming.
A number of alternative scenarios are then defined in cooperation with stakeholders, including socio-economic conditions for developing alternative agricultural systems or targeting mitigation measures. Our integrated assessment of these scenarios combines the calculation of spatialized environmental indicators with integrated bio-economic modeling. The latter is achieved by a combined use of Soil and Water Assessment Tool (SWAT) modeling with our own purpose-built land use generator module (Generator of Land Use version 2 (GenLU2)) and an economic model developed using General Algebraic Modeling System (GAMS) for cost-effectiveness assessment. This integrated approach is applied to two embedded catchment areas (total area of 360,000 ha) within the Charente river basin (SW France). Our results show that it is possible to differentiate scenarios based on their effectiveness, represented by either evolution of pressure (agro-environmental indicators) or transport into waters (pesticide concentrations). By analyzing the implementation costs borne by farmers, it is possible to identify the most cost-effective scenarios at sub-basin and other aggregated levels (WFD hydrological entities, sensitive areas). Relevant results and indicators are fed into a specifically designed database. Data warehousing is used to provide analyses and outputs at all thematic, temporal, or spatial aggregated levels, defined by the stakeholders (type of crops, herbicides, WFD areas, years), using Spatial On-Line Analytical Processing (SOLAP) tools. The aim of this approach is to allow public policy makers to make more informed and reasoned decisions when managing sensitive areas and/or implementing mitigation measures.
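The cost-effectiveness comparison at the heart of this assessment can be illustrated with a toy ranking. The scenario names, costs, and pressure reductions below are invented for illustration; the actual study derives these quantities from SWAT, GenLU2, and GAMS modeling rather than from a lookup table.

```python
# Illustrative cost-effectiveness ranking of mitigation scenarios, in the
# spirit of the IMAS assessment. All names and numbers are assumptions.
scenarios = {
    # name: (implementation cost [EUR/ha/yr], pesticide-pressure reduction [%])
    "reference":        (0.0,   0.0),
    "cover crops":      (45.0, 12.0),
    "buffer strips":    (30.0,  9.0),
    "organic rotation": (120.0, 35.0),
}

def cost_effectiveness(cost, reduction_pct):
    """EUR per percentage point of pressure reduction (lower is better)."""
    return float("inf") if reduction_pct == 0 else cost / reduction_pct

# Rank alternatives (excluding the do-nothing reference) by ratio.
ranked = sorted((name for name in scenarios if name != "reference"),
                key=lambda n: cost_effectiveness(*scenarios[n]))
print(ranked)  # best cost-effectiveness ratio first
```

Note that the cheapest measure is not automatically the most cost-effective: the ratio rewards reduction achieved per euro, so an expensive but high-impact scenario can outrank a cheap, low-impact one.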
Shah Che Hamzah, Mohd Shaharudin; Ahmad, Rashidi; Nik Abdul Rahman, Nik Hisamuddin; Pardi, Kasmah Wati; Jaafar, Naimah; Wan Adnan, Wan Aasim; Jaalam, Kamaruddin; Sahil Jamalullail, Syed Mohsin
2005-01-01
This retrospective study attempted to identify the pattern of ambulance calls over the past two years at the Hospital Universiti Sains Malaysia (HUSM) and Hospital Kota Bharu (HKB). The study provides a simple method of acquiring information related to ambulance response time (ART) and tests whether it met international standards and the needs of the client. Additionally, this paper takes into account the management of emergency calls, including ambulance response time as part of the Emergency Medical Services (EMS) episode: ART began when details such as the phone number of the caller, the exact location of the incident, and the nature of the main complaint had been noted, and ended when the emergency team arrived at the scene of the incident. Information regarding ambulance calls from the record offices of HUSM and HKB was recorded for the years 2001 and 2002, tabulated, and analyzed. There was a significant difference in the total number of calls managed by HUSM and HKB in 2001: 645 calls were managed by HUSM, while 1069 calls were recorded at HKB. In 2002, HUSM recorded an additional 613 calls, while HKB recorded 1193 calls. The observed pattern of ambulance calls is thought to be influenced by social activities such as local festivities, school holidays, and the seasons. No previous studies had compared the ART at HUSM and HKB against international standards; indeed, the literature review undertaken showed no similar studies for the whole of Malaysia. PMID:22605956
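Benchmarking response times against a standard reduces to counting the share of calls that meet a target interval. The 8-minute benchmark and the sample times below are illustrative assumptions; the study does not state which international standard it used or its measured times.

```python
# Sketch: checking ambulance response times (ART) against a target interval.
# Both the 8-minute benchmark and the sample times are invented for
# illustration; they are not figures from the HUSM/HKB study.
response_minutes = [4.5, 7.0, 9.2, 6.1, 12.4, 5.8, 8.0, 10.5, 7.7, 6.9]
target_minutes = 8.0

within_target = sum(1 for t in response_minutes if t <= target_minutes)
compliance = within_target / len(response_minutes)
print(f"{compliance:.0%} of calls met the {target_minutes:.0f}-minute target")
```

Standards of this kind are usually stated as a required compliance fraction (e.g., "90% of calls within the target"), so the computed proportion, not the mean response time, is the figure compared against the benchmark.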
DataHub: Science data management in support of interactive exploratory analysis
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Rubin, Mark R.
1993-01-01
The DataHub addresses four areas of significant need: scientific visualization and analysis; science data management; interactions in a distributed, heterogeneous environment; and knowledge-based assistance for these functions. The fundamental innovation embedded within the DataHub is the integration of three technologies, viz. knowledge-based expert systems, science visualization, and science data management. This integration is based on a concept called the DataHub. With the DataHub concept, science investigators are able to apply a more complete solution to all nodes of a distributed system. Both computational nodes and interactive nodes are able to effectively and efficiently use the data services (access, retrieval, update, etc.) in a distributed, interdisciplinary information system in a uniform and standard way. This allows the science investigators to concentrate on their scientific endeavors, rather than to involve themselves in the intricate technical details of the systems and tools required to accomplish their work. Thus, science investigators need not be programmers. The emphasis is on the definition and prototyping of system elements with sufficient detail to enable data analysis and interpretation leading to information. The DataHub includes all the required end-to-end components and interfaces to demonstrate the complete concept.
ROSA: Resource-Oriented Service Management Schemes for Web of Things in a Smart Home
Chen, Peng-Yu
2017-01-01
A pervasive-computing-enriched smart home environment, which contains many embedded and tiny intelligent devices and sensors coordinated by service management mechanisms, is capable of anticipating the intentions of occupants and providing appropriate services accordingly. Although there is a wealth of research achievements in recent years, the degree of market acceptance is still low. The main reason is that most of the devices and services in such environments depend on a particular platform or technology, making it hard to develop an application by composing the devices or services. Meanwhile, the concept of the Web of Things (WoT) has recently become popular. Based on WoT, developers can build applications using popular web tools and technologies. Consequently, the objective of this paper is to propose a set of novel WoT-driven plug-and-play service management schemes for a smart home, called Resource-Oriented Service Administration (ROSA). We have implemented an application prototype, and experiments are performed to show the effectiveness of the proposed approach. The results of this research can be a foundation for realizing the vision of “end user programmable smart environments”. PMID:28934159
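The plug-and-play, resource-oriented idea behind ROSA, exposing each home device as a web-style resource that can be discovered and composed, can be sketched with a minimal in-memory registry. The paths, device names, and API below are hypothetical illustrations, not ROSA's actual interface.

```python
# Minimal sketch of a resource-oriented device registry in the spirit of
# WoT service management. All paths and device names are hypothetical.
class ResourceRegistry:
    def __init__(self):
        self._resources = {}  # path -> handler callable

    def register(self, path, handler):
        """Plug in a device by exposing it under a web-style path."""
        self._resources[path] = handler

    def unregister(self, path):
        """Unplug a device; its resource disappears from the registry."""
        self._resources.pop(path, None)

    def get(self, path):
        """Invoke a resource, mimicking an HTTP GET on the device."""
        handler = self._resources.get(path)
        return handler() if handler else None

    def discover(self, prefix="/"):
        """List resources under a prefix, enabling service composition."""
        return sorted(p for p in self._resources if p.startswith(prefix))

registry = ResourceRegistry()
registry.register("/home/livingroom/lamp", lambda: {"state": "off"})
registry.register("/home/livingroom/thermometer", lambda: {"celsius": 21.5})
print(registry.discover("/home/livingroom"))
```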
Strategic planning for MyRA performance: A causal loop diagram approach
NASA Astrophysics Data System (ADS)
Abidin, Norhaslinda Zainal; Zaibidi, Nerda Zura; Karim, Khairah Nazurah
2017-10-01
The nexus of research and innovation in higher education continually receives worldwide priority attention. Hence, Malaysia has moved to enhance its public universities as centers of excellence by introducing the status of Research University (RU). To inspire all universities to become research universities, the Ministry of Higher Education (MoHE) revised an assessment called the Malaysian Research Assessment Instrument (MyRA) to evaluate the performance of existing RUs and other potential higher education institutions. The available spreadsheet tool for assessing MyRA performance is inadequate to support strategic planning. Since higher education management is a complex system, in which components and their interactions change over time, an efficient approach is needed to investigate system behavior and devise research management policies for the benefit of the institution itself and the higher education system. In this paper, we propose a system dynamics simulation model to evaluate the impact of policies for obtaining the highest performance in the MyRA assessment. A causal loop diagram is developed to investigate the various elements of research management, the inter-relationships that link them together, and the evolution of their behavior over time.
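A causal loop diagram is typically quantified as a stock-and-flow simulation stepped forward in time. The toy model below illustrates the general system dynamics technique with a reinforcing loop (publications attract funding, funding enables publications) damped by a capacity limit; the variable names and coefficients are invented for illustration and are not taken from MyRA.

```python
# Toy system dynamics model: a reinforcing research-performance loop
# damped by staff capacity, integrated with simple Euler steps.
# Variables and coefficients are illustrative, not from MyRA.
def simulate(years=10, dt=1.0):
    publications = 100.0   # stock: research output level
    funding = 50.0         # stock: research funding level
    history = []
    for _ in range(int(years / dt)):
        new_funding = 0.2 * publications     # flow: output attracts grants
        new_pubs = 0.5 * funding             # flow: grants enable papers
        capacity_loss = 0.1 * publications   # flow: staff capacity limit
        funding += dt * (new_funding - 0.1 * funding)
        publications += dt * (new_pubs - capacity_loss)
        history.append((round(publications, 1), round(funding, 1)))
    return history

trajectory = simulate()
print(trajectory[-1])
```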
The role of colonoscopy in managing diverticular disease of the colon.
Tursi, Antonio
2015-03-01
Diverticulosis of the colon is frequently found on routine colonoscopy, and the incidence of diverticular disease and its complications appears to be increasing. The role of colonoscopy in managing this disease is still controversial. Colonoscopy plays a key role in managing diverticular bleeding. Several techniques have been used effectively in this field, but band ligation seems to be the best at preventing rebleeding. Colonoscopy is also effective in establishing a correct differential diagnosis with other forms of chronic colitis involving colon harbouring diverticula (in particular with Crohn's disease or Segmental Colitis Associated with Diverticulosis). The role of colonoscopy in confirming a diagnosis of uncomplicated diverticulitis is still under debate, since the risk of advanced colonic neoplasia in patients admitted for acute uncomplicated diverticulitis is not increased compared to the average-risk population. On the contrary, colonoscopy is mandatory if patients complain of persistent symptoms or after resolution of an episode of complicated diverticulitis. Finally, a recent endoscopic classification, called the Diverticular Inflammation and Complications Assessment (DICA), has been developed and validated. This classification seems to be a promising tool for predicting the outcome of the colon harboring diverticula, but further prospective studies have to confirm its predictive role on the outcome of the disease.
Software platform for managing the classification of error- related potentials of observers
NASA Astrophysics Data System (ADS)
Asvestas, P.; Ventouras, E.-C.; Kostopoulos, S.; Sidiropoulos, K.; Korfiatis, V.; Korda, A.; Uzunolglu, A.; Karanasiou, I.; Kalatzis, I.; Matsopoulos, G.
2015-09-01
Human learning is partly based on observation. Electroencephalographic recordings of subjects who perform acts (actors) or observe actors (observers) contain a negative waveform in the Evoked Potentials (EPs) of actors who commit errors and of observers who observe the error-committing actors. This waveform is called the Error-Related Negativity (ERN). Its detection has applications in the context of Brain-Computer Interfaces. The present work describes a software system developed for managing EPs of observers, with the aim of classifying them into observations of either correct or incorrect actions. It consists of an integrated platform for the storage, management, processing and classification of EPs recorded during error-observation experiments. The system was developed using C# and the following development tools and frameworks: MySQL, .NET Framework, Entity Framework and Emgu CV, for interfacing with the machine learning library of OpenCV. Up to six features can be computed per EP recording per electrode. The user can select among various feature selection algorithms and then proceed to train one of three types of classifiers: Artificial Neural Networks, Support Vector Machines, or k-nearest neighbour. Next, the classifier can be used to classify any EP curve that has been entered into the database.
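The platform itself is implemented in C# against OpenCV's machine learning library, but the classification step can be sketched generically. Below is a minimal, stdlib-only Python illustration of one of the three classifier options, k-nearest neighbour, applied to hypothetical two-dimensional EP feature vectors (the feature values and labels are invented for the example).

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote of its k nearest
    neighbours (Euclidean distance). `train` is a list of
    (feature_vector, label) pairs, e.g. features computed from an EP."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical EP feature vectors (e.g. amplitude in microvolts, latency
# in seconds), labelled as observations of incorrect or correct actions.
training_set = [
    ((-2.1, 0.10), "error"), ((-1.8, 0.12), "error"), ((-2.5, 0.09), "error"),
    ((0.3, 0.20), "correct"), ((0.5, 0.22), "correct"), ((0.1, 0.18), "correct"),
]
print(knn_classify(training_set, (-2.0, 0.11)))
```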
Allergy and Asthma Care in the Mobile Phone Era.
Huang, Xinyuan; Matricardi, Paolo Maria
2016-05-21
Strategies to improve patients' adherence to treatment are essential to reduce the great health and economic burden of allergic rhinitis and asthma. Mobile phone applications (apps) for a better management of allergic diseases are growing in number, but their usefulness for doctors and patients is still debated. Controlled trials have investigated the feasibility, cost-effectiveness, security, and perspectives of the use of tele-medicine in the self-management of asthma. These studies focused on different tools or devices, such as SMS, telephone calls, automatic voice response system, mobile applications, speech recognition system, or cloud-computing systems. While some trials concluded that m-Health can improve asthma control and the patient's quality of life, others did not show any advantage in relation to usual care. The only controlled study on allergic rhinitis showed an improvement of adherence to treatment among tele-monitored patients compared to those managed with usual care. Most studies have also highlighted a few shortcomings and limitations of tele-medicine, mainly concerning security and cost-efficiency. The use of smartphones and apps for a personalized asthma and allergy care needs to be further evaluated and optimized before conclusions on its usefulness can be drawn.
Development and Validation of the Children's Voice Handicap Index-10 for Parents.
Ricci-Maccarini, Andrea; De Maio, Vincenzo; Murry, Thomas; Schindler, Antonio
2016-01-01
The Children's Voice Handicap Index-10 (CVHI-10) was introduced as a tool for self-assessment of children's dysphonia. However, in the management of children with voice disorders, both parents' and children's perspectives play an important role. Because a self-assessment tool with both a children's and a parents' version did not yet exist, the aim of the study was to develop and validate an assessment tool that parallels the CVHI-10 for parents to assess the level of voice handicap in their child's voice. Observational, prospective, cross-sectional study. To develop a CVHI-10 for parents, called the "CVHI-10-P", the CVHI-10 items were adapted to reflect parents' responses about their child. Fifty-five children aged 7-12 years completed the CVHI-10, while their parents completed the CVHI-10-P. Each child's voice was also perceptually assessed by an otolaryngologist using the Grade Breathiness Roughness (GRB) scale. Fifty-one of the 55 children underwent voice therapy (VT) and were assessed afterward using the GRB, CVHI-10, and CVHI-10-P. CVHI-10-P internal consistency was satisfactory (Cronbach alpha = .78). Correlation between CVHI-10-P and CVHI-10 was moderate (r = 0.37). CVHI-10-P total scores were lower than CVHI-10 scores in most cases. Single-item mean scores were always lower in CVHI-10-P compared with CVHI-10, with the exception of the only item of the CVHI-10-P that directly involves the parent's experience (item 10). Data gained from one tool are not directly related to the other, suggesting that these two tools appraise the child's voice handicap from different perspectives. The overall perceptual assessment scores of the 51 children after VT significantly improved. There was a statistically significant reduction of the total scores, and of each item score, in CVHI-10 and CVHI-10-P after VT. These data support the adoption of the CVHI-10-P as an assessment tool and an outcome measure for the management of children's voice disorders.
CVHI-10-P is a valid tool to appraise parents' perspective of their child's voice disorder. The use of the CVHI-10 and the CVHI-10-P is recommended for objectively determining the level of voice handicap in children by both parent and child. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
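The internal-consistency figure reported above (Cronbach alpha = .78) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A stdlib-only sketch with invented item scores (not the CVHI-10-P dataset):

```python
from statistics import variance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(scores[0])                 # number of items
    items = list(zip(*scores))         # one column per item
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented 4-respondent, 3-item data, for illustration only.
data = [
    [3, 4, 3],
    [2, 2, 1],
    [4, 4, 4],
    [1, 2, 1],
]
print(round(cronbach_alpha(data), 2))
```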
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate-change-related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells, then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning effective climate change adaptation strategies.
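The per-cell scoring idea can be illustrated with a small sketch that normalizes each metric across grid cells and combines them into a weighted vulnerability score. The metric names and weights below are invented for illustration and are not Urban-CAT's actual model.

```python
# Each grid cell carries metrics computed from GIS data; a vulnerability
# score combines them. Metric names and weights are illustrative only.
def min_max_normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_scores(cells, weights):
    """cells: list of dicts mapping metric -> value for one grid cell;
    weights: dict mapping metric -> weight in the combined score."""
    metrics = list(weights)
    normalized = {m: min_max_normalize([c[m] for c in cells]) for m in metrics}
    return [
        sum(weights[m] * normalized[m][i] for m in metrics)
        for i in range(len(cells))
    ]

cells = [
    {"impervious_surface": 0.9, "elevation_m": 2.0, "population": 1200},
    {"impervious_surface": 0.4, "elevation_m": 15.0, "population": 300},
    {"impervious_surface": 0.7, "elevation_m": 5.0, "population": 800},
]
# Negative weight on elevation: higher ground is assumed less flood-prone.
weights = {"impervious_surface": 0.5, "elevation_m": -0.2, "population": 0.3}
scores = vulnerability_scores(cells, weights)
print([round(s, 2) for s in scores])
```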
NASA Astrophysics Data System (ADS)
Buongiorno, Maria Fabrizia; Musacchio, Massimo; Silvestri, Malvina; Spinetti, Claudia; Corradini, Stefano; Lombardo, Valerio; Merucci, Luca; Sansosti, Eugenio; Pugnagli, Sergio; Teggi, Sergio; Pace, Gaetano; Fermi, Marco; Zoffoli, Simona
2007-10-01
The project called Sistema Rischio Vulcanico (SRV) is funded by the Italian Space Agency (ASI) in the frame of the National Space Plan 2003-2005, under the Earth Observations section for natural risks management. The SRV Project is coordinated by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), which is responsible at the national level for volcanic monitoring. The objective of the project is to develop a pre-operative system, based on the integration of EO data and ground measurements, to support the volcanic risk monitoring of the Italian Civil Protection Department, whose requirements and needs are well integrated in the GMES Emergency Core Services program. The project philosophy is to implement, in incremental versions, specific modules that allow EO-derived parameters to be processed, stored and visualized through Web GIS tools, considering three activity phases: 1) knowledge and prevention; 2) crisis; 3) post-crisis. In order to combine the EO data and the ground network measurements effectively, the system will implement a multi-parametric analysis tool, a unique tool for analyzing a large set of data contemporaneously in near real time. The SRV project will test its operational capabilities on three Italian volcanoes: Etna, Vesuvio and Campi Flegrei.