Sample records for benefit analysis tool

  1. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    EPA Science Inventory

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  2. Smart roadside initiative macro benefit analysis : user’s guide for the benefit-cost analysis tool.

    DOT National Transportation Integrated Search

    2015-03-01

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State Depa...

  3. Cost benefit analysis: applications and future opportunities.

    DOT National Transportation Integrated Search

    2016-06-01

    Cost Benefit Analysis (CBA, also called Benefit Cost Analysis, BCA) is an evaluation tool that state transportation agencies can use to compare infrastructure project options across transportation modes and gauge if the discounted value of benefits exce...
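
    The discounting comparison this record describes can be sketched with a few lines of code. The cash flows and the 7% discount rate below are purely illustrative assumptions, not figures from the record:

```python
# Benefit-cost ratio with discounting -- illustrative numbers only.

def present_value(flows, rate):
    """Discount a list of annual cash flows (year 0 first) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical project: costs up front, benefits accruing over five years.
costs = [100.0, 10.0, 10.0, 10.0, 10.0, 10.0]
benefits = [0.0, 40.0, 40.0, 40.0, 40.0, 40.0]
rate = 0.07  # assumed real discount rate, chosen for illustration

pv_b = present_value(benefits, rate)
pv_c = present_value(costs, rate)
bcr = pv_b / pv_c  # project clears the bar when this exceeds 1

print(f"PV benefits = {pv_b:.1f}, PV costs = {pv_c:.1f}, BCR = {bcr:.2f}")
```

    With these assumed numbers the discounted benefits exceed the discounted costs, so the benefit-cost ratio comes out slightly above 1.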

  4. SHRP2 EconWorks : wider economic benefits analysis tools : final report.

    DOT National Transportation Integrated Search

    2016-01-01

    CDM Smith has completed an evaluation of the EconWorks Wider Economic Benefits (W.E.B.) Analysis Tools for Connecticut Department of Transportation (CTDOT). The intent of this evaluation was to compare the results of the outputs of this toolkit t...

  5. Cost Benefit Analysis: Cost Benefit Analysis for Human Effectiveness Research: Bioacoustic Protection

    DTIC Science & Technology

    2001-07-21

    APPENDIX A. ACRONYMS: ACCES Attenuating Custom Communication Earpiece System; ACEIT Automated Cost Estimating Integrated Tools; AFSC Air Force... documented in the ACEIT cost estimating tool developed by Tecolote, Inc. The factor used was 14 percent of PMP. 1.3 System Engineering/Program... The data source is the ASC Aeronautical Engineering Products Cost Factor Handbook, which is documented in the ACEIT cost estimating tool developed...

  6. Rapid Benefit Indicators (RBI) Spatial Analysis Toolset - Manual

    EPA Science Inventory

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  7. An Introduction to Benefit-Cost Analysis for Evaluating Public Expenditure Alternatives. Learning Packages in the Policy Sciences, PS-22.

    ERIC Educational Resources Information Center

    LaPlante, Josephine M.; Durham, Taylor R.

    A revised edition of PS-14, "An Introduction to Benefit-Cost Analysis for Evaluating Public Programs," presents concepts and techniques of benefit-cost analysis as tools that can be used to assist in deciding between alternatives. The goals of the new edition include teaching students to think about the possible benefits and costs of each…

  8. Benefit-Cost Analysis as a Teaching Tool.

    ERIC Educational Resources Information Center

    Dowd, Richard F.

    1980-01-01

    Demonstrates how benefit-cost and present-value analyses can be used to assess the potential social benefits of government projects and to illustrate how interest rates affect decision-making in government and business. (AYC)
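
    The point this record makes about interest rates can be shown concretely: the same project can look worthwhile at a low discount rate and not at a high one. All figures below are hypothetical, invented for illustration:

```python
# Present-value analysis: how the discount rate can flip a decision.
# All cash flows here are hypothetical.

def npv(flows, rate):
    """Net present value of annual net cash flows, year 0 first."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# A project costing 100 now and returning 30 per year for four years.
project = [-100.0] + [30.0] * 4

for rate in (0.03, 0.10):
    value = npv(project, rate)
    verdict = "worthwhile" if value > 0 else "not worthwhile"
    print(f"rate={rate:.0%}: NPV={value:+.1f} ({verdict})")
```

    At 3% the project's NPV is positive; at 10% it turns negative, which is exactly the interest-rate sensitivity the abstract describes.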

  9. Smart roadside initiative macro benefit analysis project report.

    DOT National Transportation Integrated Search

    2015-03-31

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State Depa...

  10. Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.

    PubMed

    Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P

    2012-01-01

    The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework will be built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment. However, if the system is working seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be mainly borne by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest, ensuring society-wide benefits.

  11. A framework for a cost benefit analysis of the Fairfax County, Virginia Alcohol Safety Action Project.

    DOT National Transportation Integrated Search

    1973-01-01

    Cost-benefit analysis is sometimes a useful tool for evaluating the advantages and disadvantages of alternative courses of action. The first half of this study was an attempt to further the use of such analysis in the evaluation of a highway safety p...

  12. Measuring Security Effectiveness and Efficiency at U.S. Commercial Airports

    DTIC Science & Technology

    2013-03-01

    formative program evaluation and policy analysis to investigate current airport security programs. It identifies innovative public administration and... policy-analysis tools that could provide potential benefits to airport security. These tools will complement the System Based Risk Management framework if

  13. Developing an Ecosystem Services online Decision Support Tool to Assess the Impacts of Climate Change and Urban Growth in the Santa Cruz Watershed; Where We Live, Work, and Play

    EPA Science Inventory

    Processes through which ecosystems provide goods or benefit people can be referred to as "ecosystem services", which may be quantified to clarify decision-making, with techniques including cost-benefit analysis. We are developing an online decision support tool, the Santa Cruz W...

  14. A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…

  15. Benefits Assessment for Tactical Runway Configuration Management Tool

    NASA Technical Reports Server (NTRS)

    Oseguera-Lohr, Rosa; Phojanamongkolkij, Nipa; Lohr, Gary; Fenbert, James W.

    2013-01-01

    The Tactical Runway Configuration Management (TRCM) software tool was developed to provide air traffic flow managers and supervisors with recommendations for airport configuration changes and runway usage. The objective for this study is to conduct a benefits assessment at Memphis (MEM), Dallas Fort-Worth (DFW) and New York's John F. Kennedy (JFK) airports using the TRCM tool. Results from simulations using the TRCM-generated runway configuration schedule are compared with results using historical schedules. For the 12 days of data used in this analysis, the transit time (arrival fix to spot on airport movement area for arrivals, or spot to departure fix for departures) for MEM departures is greater (7%) than for arrivals (3%); for JFK, there is a benefit for arrivals (9%) but not for departures (-2%); for DFW, arrivals show a slight benefit (1%), but this is offset by departures (-2%). Departure queue length benefits show fewer aircraft in queue for JFK (29%) and MEM (11%), but not for DFW (-13%). Fuel savings for surface operations at MEM are seen for both arrivals and departures. At JFK there are fuel savings for arrivals, but these are offset by increased fuel use for departures. In this study, no surface fuel benefits resulted for DFW. Results suggest that the TRCM algorithm requires modifications for complex surface traffic operations that can cause taxi delays. For all three airports, the average number of changes in flow direction (runway configuration) recommended by TRCM was many times greater than the historical data; TRCM would need to be adapted to a particular airport's needs, to limit the number of changes to acceptable levels. The results from this analysis indicate the TRCM tool can provide benefits at some high-capacity airports. The magnitude of these benefits depends on many airport-specific factors and would require adaptation of the TRCM tool; a detailed assessment is needed prior to determining suitability for a particular airport.

  16. Marketing--A Controllable Tool for Education Administrators.

    ERIC Educational Resources Information Center

    Smith, Wendell C.

    1980-01-01

    Educational marketing is now becoming legitimized. Marketing techniques such as cost benefit analysis and the selection of a mix of promotional methods are tools that educational administrators should understand and use. (SK)

  17. Practical thoughts on cost-benefit analysis and health services.

    PubMed

    Burchell, A; Weeden, R

    1982-08-01

    Cost-benefit analysis is fast becoming--if it is not already--an essential tool in decision making. It is, however, a complex subject, and one in which few doctors have been trained. This paper offers practical thoughts on the art of cost-benefit analysis, and is written for clinicians and other medical specialists who, though inexpert in the techniques of accountancy, nevertheless wish to carry out their own simple analyses in a manner that will enable them, and others, to take effective decisions.

  18. Analysis of public benefits for Pennsylvania rail freight funding

    DOT National Transportation Integrated Search

    2011-01-04

    Building on best practices from other states and Pennsylvania's existing evaluation processes, this project developed an assessment tool to help the Pennsylvania Department of Transportation (PennDOT) analyze the public benefits resulting from the ...

  19. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    ...the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI's Systems Tool Kit (STK) software. An Analytic Hierarchy Process (AHP)-based...

  20. Using a formal requirements management tool for system engineering: first results at ESO

    NASA Astrophysics Data System (ADS)

    Zamparelli, Michele

    2006-06-01

    The attention to proper requirements analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering effort and the use of all available technology to keep project development under control. One such technology is a tool that helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility of performing impact analysis. An industrially proven tool for requirements management is presented, together with first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it a promising solution for even small-scale system development.

  1. New Tool for Benefit-Cost Analysis in Evaluating Transportation Alternatives

    DOT National Transportation Integrated Search

    1997-01-01

    The Intermodal Surface Transportation Efficiency Act (ISTEA) emphasizes assessment of multi-modal alternatives and demand management strategies. In 1995, the Federal Highway Administration (FHWA) developed a corridor sketch planning tool called the S...

  2. The role of benefit transfer in ecosystem service valuation

    USGS Publications Warehouse

    Richardson, Leslie A.; Loomis, John; Kroeger, Timm; Casey, Frank

    2015-01-01

    The demand for timely monetary estimates of the economic value of nonmarket ecosystem goods and services has steadily increased over the last few decades. This article describes the use of benefit transfer to generate monetary value estimates of ecosystem services specifically. The article provides guidance for conducting such benefit transfers and summarizes advancements in benefit transfer methods, databases and analysis tools designed to facilitate its application.

  3. Use of advanced analysis tools to support freeway corridor freight planning.

    DOT National Transportation Integrated Search

    2010-07-22

    Advanced corridor freight management and pricing strategies are increasingly being chosen to : address freight mobility challenges. As a result, evaluation tools are needed to assess the benefits : of these strategies as compared to other alternative...

  4. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  5. Benefit Incidence Analysis of Government Spending on Public-Private Partnership Schooling under Universal Secondary Education Policy in Uganda

    ERIC Educational Resources Information Center

    Wokadala, J.; Barungi, M.

    2015-01-01

    The study establishes whether government spending on private universal secondary education (USE) schools is equitable across quintiles disaggregated by gender and by region in Uganda. The study employs benefit incidence analysis tool on the Uganda National Panel Survey (UNPS 2009/10) data to establish the welfare impact of public subsidy on…

  6. Quantifying the benefits to the national economy from secondary applications of NASA technology, executive summary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The feasibility of systematically quantifying the economic benefits of secondary applications of NASA related R and D was investigated. Based upon the tools of economic theory and econometric analysis, a set of empirical methods was developed and selected applications were made to demonstrate their workability. Analyses of the technological developments related to integrated circuits, cryogenic insulation, gas turbines, and computer programs for structural analysis indicated substantial secondary benefits accruing from NASA's R and D in these areas.

  7. Quantifying the benefits to the national economy from secondary applications of NASA technology

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The feasibility of systematically quantifying the economic benefits of secondary applications of NASA related R and D is investigated. Based upon the tools of economic theory and econometric analysis, it develops a set of empirical methods and makes selected applications to demonstrate their workability. Analyses of the technological developments related to integrated circuits, cryogenic insulation, gas turbines, and computer programs for structural analysis indicated substantial secondary benefits accruing from NASA's R and D in these areas.

  8. A Comparative Analysis of Commercial Off-The-Shelf Naval Simulations and Classic Operations Research Models

    DTIC Science & Technology

    2009-09-01

    ...Benefits of Off... simulation software results and similar results produced from the thesis work conducted by Ozdemir (2009). This study directly benefits decision makers... interested in identifying and benefiting from a cost-effective, readily available aggregated learning tool, with the potential to provide tactical

  9. "Black Magic" and "Gold Dust": The Epistemic and Political Uses of Evidence Tools in Public Health Policy Making

    ERIC Educational Resources Information Center

    Stewart, Ellen; Smith, Katherine E.

    2015-01-01

    Concerns about the limited influence of research on decision making have prompted the development of tools intended to mediate evidence for policy audiences. This article focuses on three examples, prominent in public health: impact assessments; systematic reviews; and economic decision-making tools (cost-benefit analysis and scenario modelling).…

  10. Applying analysis tools in planning for operations : case study #3 -- using archived data as a tool for operations planning

    DOT National Transportation Integrated Search

    2009-09-01

    More and more, transportation system operators are seeing the benefits of strengthening links between planning and operations. A critical element in improving transportation decision-making and the effectiveness of transportation systems related to o...

  11. Analysis of design tool attributes with regards to sustainability benefits

    NASA Astrophysics Data System (ADS)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regard to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  12. Benefit-cost analysis of fishery rehabilitation projects: A Great Lakes case study. Spec. issue: Responses to marine resource change/social sciences perspective

    USGS Publications Warehouse

    Bishop, R.C.; Milliman, S.R.; Boyle, K.J.; Johnson, B.L.

    1990-01-01

    Tools of benefit-cost analysis are used to evaluate a project to rehabilitate the yellow perch (Perca flavescens) fishery of Green Bay, Wisconsin. Both sport and commercial fishers harvest from this stock, which has been suffering from much-reduced productivity since the early 1960s. The project is composed of commercial quotas and other regulations. Measures of benefits and costs were used that explicitly incorporate uncertainty about the potential level of success of the project. The analysis shows that commercial fish producers will more or less break even compared to where they would have been without the project, but that substantial recreational benefits can be expected.
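
    Incorporating uncertainty about the level of success, as this record does, typically means weighting outcomes by their probabilities. A minimal sketch, with scenario probabilities and payoffs that are entirely hypothetical:

```python
# Expected net benefits under uncertainty about project success.
# Scenario probabilities and payoffs are hypothetical, for illustration.

scenarios = [
    # (probability, net benefit if this success level occurs)
    (0.3, -50.0),   # rehabilitation fails
    (0.5, 40.0),    # partial recovery
    (0.2, 120.0),   # full recovery
]

# Probabilities over mutually exclusive outcomes must sum to 1.
assert abs(sum(p for p, _ in scenarios) - 1.0) < 1e-9

expected = sum(p * b for p, b in scenarios)
print(f"Expected net benefit = {expected:.1f}")
```

    A project can thus show positive expected net benefits even when the single most likely scenario is only a partial success.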

  13. Geographical Network Analysis and Spatial Econometrics as Tools to Enhance Our Understanding of Student Migration Patterns and Benefits in the U.S. Higher Education Network

    ERIC Educational Resources Information Center

    González Canché, Manuel S.

    2018-01-01

    This study measures the extent to which student outmigration outside the 4-year sector takes place and posits that the benefits from attracting non-resident students exist regardless of sector of enrollment. The study also provides empirical evidence about the relevance of employing geographical network analysis (GNA) and spatial econometrics in…

  14. Buffer$--An Economic Analysis Tool

    Treesearch

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool that resource professionals can use to analyze the costs and benefits of conservation buffers. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel-based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  15. Digital fabrication of textiles: an analysis of electrical networks in 3D knitted functional fabrics

    NASA Astrophysics Data System (ADS)

    Vallett, Richard; Knittel, Chelsea; Christe, Daniel; Castaneda, Nestor; Kara, Christina D.; Mazur, Krzysztof; Liu, Dani; Kontsos, Antonios; Kim, Youngmoo; Dion, Genevieve

    2017-05-01

    Digital fabrication methods are reshaping design and manufacturing processes through the adoption of pre-production visualization and analysis tools, which help minimize waste of materials and time. Despite the increasingly widespread use of digital fabrication techniques, comparatively few of these advances have benefited the design and fabrication of textiles. The development of functional fabrics such as knitted touch sensors, antennas, capacitors, and other electronic textiles could benefit from the same advances in electrical network modeling that revolutionized the design of integrated circuits. In this paper, the efficacy of using current state-of-the-art digital fabrication tools over the more common trial-and-error methods currently used in textile design is demonstrated. Gaps are then identified in the current state-of-the-art tools that must be resolved to further develop and streamline the rapidly growing field of smart textiles and devices, bringing textile production into the realm of 21st century manufacturing.

  16. Data Rights and Responsibilities

    PubMed Central

    Wyndham, Jessica M.

    2015-01-01

    A human-rights-based analysis can be a useful tool for the scientific community and policy makers as they develop codes of conduct, harmonized standards, and national policies for data sharing. The human rights framework provides a shared set of values and norms across borders, defines rights and responsibilities of various actors involved in data sharing, addresses the potential harms as well as the benefits of data sharing, and offers a framework for balancing competing values. The right to enjoy the benefits of scientific progress and its applications offers a particularly helpful lens through which to view data as both a tool of scientific inquiry to which access is vital and as a product of science from which everyone should benefit. PMID:26297755

  17. NEURON and Python.

    PubMed

    Hines, Michael L; Davison, Andrew P; Muller, Eilif

    2009-01-01

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications.

  18. NEURON and Python

    PubMed Central

    Hines, Michael L.; Davison, Andrew P.; Muller, Eilif

    2008-01-01

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications. PMID:19198661

  19. Integrating Transportation Modeling and Desktop GIS: A Practical and Affordable Analysis Tool for Small and Medium Sized Communities

    DOT National Transportation Integrated Search

    1998-09-16

    This paper and presentation discuss some of the benefits of integrating travel demand models and desktop GIS (Arc/Info and ArcView for PCs) as a cost-effective and staff-saving tool, as well as specific improvements to transportation planning m...

  20. Time Analysis: Still an Important Accountability Tool.

    ERIC Educational Resources Information Center

    Fairchild, Thomas N.; Seeley, Tracey J.

    1994-01-01

    Reviews benefits to school counselors of conducting a time analysis. Describes time analysis system that authors have used, including case illustration of how authors used data to effect counseling program changes. System described followed process outlined by Fairchild: identifying services, devising coding system, keeping records, synthesizing…

  1. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  2. Depth of manual dismantling analysis: A cost–benefit approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achillas, Ch., E-mail: c.achillas@ihu.edu.gr; Aidonis, D.; Vlachokostas, Ch.

    Highlights: (1) A mathematical modeling tool for OEMs. (2) The tool can be used by OEMs, recyclers of electr(on)ic equipment, or WEEE management systems' regulators. (3) The tool makes use of cost-benefit analysis in order to determine the optimal depth of product disassembly. (4) The reusable materials and the quantity of metals and plastics recycled can be quantified in an easy-to-comprehend manner. Abstract: This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost-benefit analysis concept is analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study demonstrating the model's applicability is presented for a typical electronic product in terms of structure and material composition. Taking into account the market values of the pilot product's components, the manual disassembly is proven profitable, with the marginal revenues from recovered reusable materials estimated at 2.93–23.06 €, depending on the level of disassembly.
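
    The idea of an optimal disassembly depth can be sketched as a simple cumulative calculation: keep removing components while the running net revenue still improves. The components, values, and costs below are hypothetical, not taken from the record:

```python
# Choosing the depth of manual disassembly that maximizes cumulative
# net revenue (recovered value minus labor/energy cost per step).
# Step names and figures are hypothetical illustrations.

steps = [
    # (component removed, recovery value, disassembly cost)
    ("outer casing", 1.50, 0.40),
    ("circuit board", 6.00, 1.20),
    ("display", 3.00, 2.50),
    ("connectors", 0.80, 1.60),  # deeper steps cost more than they recover
]

best_depth, best_profit, running = 0, 0.0, 0.0
for depth, (name, value, cost) in enumerate(steps, start=1):
    running += value - cost
    if running > best_profit:
        best_depth, best_profit = depth, running

print(f"Disassemble {best_depth} steps for net revenue {best_profit:.2f}")
```

    In this toy example it pays to stop before full disassembly: the last step destroys value, so the optimum is three steps deep.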

  3. In Favor of Clear Thinking: Incorporating Moral Rules Into a Wise Cost-Benefit Analysis-Commentary on Bennis, Medin, & Bartels (2010).

    PubMed

    Bazerman, Max H; D Greene, Joshua

    2010-03-01

    Bennis, Medin, and Bartels (2010, this issue) have contributed an interesting article on the comparative benefit of moral rules versus cost-benefit analysis (CBA). Many of their specific comments are accurate, useful, and insightful. At the same time, we believe they have misrepresented CBA and have reached a set of conclusions that are misguided and, if adopted wholesale, potentially dangerous. Overall, they offer wise suggestions for making CBA more effective, rather than eliminating CBA as a decision-making tool. © The Author(s) 2010.

  4. Airport GSE Model Directions

    EPA Pesticide Factsheets

    User manual for the GSEModel, which is a spreadsheet analysis tool for quantifying emission benefits and calculating the cost-effectiveness of converting to cleaner-burning fuels and engine technologies.

  5. EPA’s EnviroAtlas: Identifying Nature’s Benefits, Deficits, and Opportunities for Equitable Distribution in Populated Places#

    EPA Science Inventory

    The web-based EnviroAtlas is an easy-to-use mapping and analysis tool built by the U.S. Environmental Protection Agency and its partners to provide information, data, and research on the relationships between ecosystems, built infrastructure, and societal well-being. The tool is ...

  6. I-394 Minneapolis, Minnesota, analysis plan.

    DOT National Transportation Integrated Search

    2010-02-01

    This AMS Analysis Plan for the Interstate 394 (I-394) Pioneer Corridor outlines the various tasks associated with the application of the ICM AMS tools to the corridor in support of a benefit/cost assessment of the proposed strategies. The report prov...

  7. Integrated corridor management analysis, modeling, and simulation results for the test corridor.

    DOT National Transportation Integrated Search

    2008-06-01

    This report documents the Integrated Corridor Management (ICM) Analysis Modeling and Simulation (AMS) tools and strategies used on a Test Corridor, presents results and lessons-learned, and documents the relative capability of AMS to support benefit-...

  8. Granting Teachers the "Benefit of the Doubt" in Performance Evaluations

    ERIC Educational Resources Information Center

    Rogge, Nicky

    2011-01-01

    Purpose: This paper proposes a benefit of the doubt (BoD) approach to construct and analyse teacher effectiveness scores (i.e. SET scores). Design/methodology/approach: The BoD approach is related to data envelopment analysis (DEA), a linear programming tool for evaluating the relative efficiency performance of a set of similar units (e.g. firms,…
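
    The benefit-of-the-doubt (BoD) approach the abstract relates to DEA can be sketched as a small linear program: each unit's composite score is the maximum weighted sum of its indicators, subject to no unit in the set scoring above one under those same weights. The sketch below is restricted to two indicators so the LP can be solved by enumerating candidate vertices (it assumes every unit has a positive indicator value, so the feasible region is bounded); the indicator values are invented.

```python
# Minimal pure-Python BoD/DEA sketch for two indicators, solved by
# vertex enumeration of the weight-space LP. Indicator data invented.
from itertools import combinations

def bod_score(target, units, tol=1e-9):
    """Max of w . target s.t. w . u <= 1 for every unit u, w >= 0."""
    # Candidate vertices: origin, axis intercepts of each constraint,
    # and pairwise intersections of the constraint lines.
    candidates = [(0.0, 0.0)]
    for (a, b) in units:
        if a > 0:
            candidates.append((1.0 / a, 0.0))
        if b > 0:
            candidates.append((0.0, 1.0 / b))
    for (a1, b1), (a2, b2) in combinations(units, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) > tol:
            candidates.append(((b2 - b1) / det, (a1 - a2) / det))
    best = 0.0
    for (w1, w2) in candidates:
        if w1 < -tol or w2 < -tol:
            continue  # weights must be non-negative
        if all(w1 * a + w2 * b <= 1.0 + tol for (a, b) in units):
            best = max(best, w1 * target[0] + w2 * target[1])
    return best

units = {"A": (0.9, 0.4), "B": (0.5, 0.8), "C": (0.6, 0.6)}
data = list(units.values())
scores = {name: round(bod_score(ind, data), 3) for name, ind in units.items()}
print(scores)
```

    Units A and B each reach a score of 1 under their own most favourable weights; C cannot, so it is inefficient relative to the set even when given the benefit of the doubt.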

  9. U.S. 75 Dallas, Texas, analysis plan.

    DOT National Transportation Integrated Search

    2010-02-01

    This AMS Analysis Plan for the U.S. 75 Pioneer Corridor outlines the various tasks associated with the application of the ICM AMS tools and strategies to the corridor, in support of a benefit-cost assessment of the proposed strategies. The report pro...

  10. Reviewing the economic efficiency of disaster risk management

    NASA Astrophysics Data System (ADS)

    Mechler, Reinhard

    2013-04-01

    There is a lot of rhetoric suggesting that disaster risk management (DRM) pays, yet surprisingly little in the way of hard facts. Cost-benefit analysis (CBA) is one major tool that can provide quantitative information for prioritizing disaster risk management (and climate adaptation) based on economic principles. Yet, on a global scale, there has been surprisingly little robust evidence on the economic efficiency and benefits of risk management measures. This review shows that, for the limited evidence reported, the economic case for DRM across a range of hazards is strong: the benefits of investing in DRM outweigh the costs by a factor of about four, on average, in terms of avoided and reduced losses. Most studies using a CBA approach focus on structural DRM, and most information has been made available on physical flood prevention. There have been some limited studies on preparedness and risk financing. The global evidence base is limited and estimates do not appear very solid; overall, in line with the conclusion of the recent IPCC SREX report, there is limited evidence and medium agreement across the literature. Some of the factors behind the limited robustness are inherent to CBA more widely: these challenges comprise the inability to price intangibles, evaluating strategies rather than single projects, difficulties in assessing softer rather than infrastructure-related options, choices regarding a proper discount rate, lack of accounting for the distribution of benefits and costs, and difficulties with assessing nonmarket values such as those related to health, the environment, or public goods. Although techniques exist to address some of these challenges, they are unlikely to go away easily. Other challenges associated specifically with DRM, such as the need for, and difficulty of, undertaking risk-based analysis, can be overcome, and manuals and reports have provided a way forward.
In an age of austerity, cost-benefit analysis continues to be an important tool for prioritising efficient DRM measures, yet with a shifting emphasis from infrastructure-based options (hard resilience) to preparedness and systemic interventions (soft resilience), other tools such as cost-effectiveness analysis, multi-criteria analysis and robust decision-making approaches deserve more attention.
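
    The discounted benefit-cost test underlying such CBAs can be sketched in a few lines: a mitigation measure passes when the present value of avoided losses exceeds the present value of its costs. The cash flows and the 4% discount rate below are hypothetical, chosen only to illustrate the mechanics.

```python
# Illustrative discounted benefit-cost ratio; all figures invented.

def present_value(cashflows, rate):
    """Discount a list of yearly amounts (year 0 first) to present value."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(cashflows))

def benefit_cost_ratio(benefits, costs, rate):
    return present_value(benefits, rate) / present_value(costs, rate)

# Hypothetical measure: up-front investment of 100, then 10/year
# maintenance for 9 years; avoided losses of 40/year from year 1 on.
costs = [100.0] + [10.0] * 9
benefits = [0.0] + [40.0] * 9
bcr = benefit_cost_ratio(benefits, costs, rate=0.04)
print(round(bcr, 2))
```

    A ratio above 1 indicates an economically efficient measure; the choice of discount rate, as the review notes, can move this result substantially.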

  11. Ares Project Technology Assessment: Approach and Tools

    NASA Technical Reports Server (NTRS)

    Hueter, Uwe; Tyson, Richard

    2010-01-01

    Technology assessments provide a status of the development maturity of specific technologies. Along with benefit analysis, the risks the project assumes can be quantified. Normally, due to budget constraints, the competing technologies are prioritized and decisions are made about which ones to fund. A detailed technology development plan is produced for the selected technologies to provide a roadmap to reach the desired maturity by the project's critical design review. Technology assessments can be conducted both for technology-only tasks and for product development programs. This paper is primarily oriented toward product development programs. The paper discusses the Ares Project's approach to technology assessment. System benefit analysis, risk assessment, technology prioritization, and technology readiness assessment are addressed. A description of the technology readiness level tool being used is provided.

  12. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. 
The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
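
    The collocation analysis mentioned in the second case study can be illustrated with a minimal sketch: count which words co-occur within a fixed window of a keyword across a set of texts. The toy abstracts below are invented; a real review of the kind described would load thousands of records.

```python
# Toy collocation counter: words within `window` tokens of a keyword.
from collections import Counter

def collocates(texts, keyword, window=2):
    """Count words appearing within `window` tokens of `keyword`."""
    counts = Counter()
    for text in texts:
        tokens = text.lower().split()
        for i, tok in enumerate(tokens):
            if tok == keyword:
                lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[tokens[j]] += 1
    return counts

abstracts = [
    "wetland restoration benefits downstream communities",
    "restoration of wetland habitat supports flood mitigation",
    "wetland restoration and flood risk reduction",
]
print(collocates(abstracts, "restoration", window=1).most_common(2))
```

    Even this crude count surfaces the terms most tightly bound to the keyword; the reviews described above layer network analysis and manual coding on top of such counts.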

  13. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.

  14. Integrated corridor management I-15 San Diego, California : analysis plan.

    DOT National Transportation Integrated Search

    2010-02-01

    This AMS Analysis Plan for the I-15 Corridor outlines the various tasks associated with the application of the ICM AMS tools and strategies to this corridor in order to support benefit-cost assessment for the successful implementation of ICM. The rep...

  15. Cost benefit analysis for smart grid projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; He, Gang; Mauzey, J

    The U.S. is unusual in that a definition of the term “smart grid” was written into legislation, appearing in the Energy Independence and Security Act (2007). When the recession called for stimulus spending and the American Recovery and Reinvestment Act (ARRA, 2009) was passed, a framework already existed for identifying smart grid projects. About $4.5B of the U.S. Department of Energy’s (U.S. DOE’s) $37B allocation from ARRA was directed to smart grid projects of two types: investment grants and demonstrations. Matching funds from other sources more than doubled the total value of ARRA-funded smart grid projects. The Smart Grid Investment Grant Program (SGIG) consumed all but $620M of the ARRA funds, which were available for the 32 projects in the Smart Grid Demonstration Program (SGDP, or demonstrations). Given the economic potential of these projects and the substantial investments required, there was keen interest in estimating the benefits of the projects (i.e., quantifying and monetizing the performance of smart grid technologies). Common method development and application, data collection, and analysis to calculate and publicize the benefits were central objectives of the program. For this purpose, standard methods and a software tool, the Smart Grid Computational Tool (SGCT), were developed by U.S. DOE, and a spreadsheet model was made freely available to grantees and other analysts. The methodology was intended to define smart grid technologies or assets, the mechanisms by which they generate functions, their impacts and, ultimately, their benefits. The SGCT and its application to the Demonstration Projects are described, and actual projects in Southern California and in China are selected to test and illustrate the tool. The usefulness of the methodology and tool for international analyses is then assessed.

  16. Data Rights and Responsibilities: A Human Rights Perspective on Data Sharing.

    PubMed

    Harris, Theresa L; Wyndham, Jessica M

    2015-07-01

    A human-rights-based analysis can be a useful tool for the scientific community and policy makers as they develop codes of conduct, harmonized standards, and national policies for data sharing. The human rights framework provides a shared set of values and norms across borders, defines rights and responsibilities of various actors involved in data sharing, addresses the potential harms as well as the benefits of data sharing, and offers a framework for balancing competing values. The right to enjoy the benefits of scientific progress and its applications offers a particularly helpful lens through which to view data as both a tool of scientific inquiry to which access is vital and as a product of science from which everyone should benefit. © The Author(s) 2015.

  17. The current role of high-resolution mass spectrometry in food analysis.

    PubMed

    Kaufmann, Anton

    2012-05-01

    High-resolution mass spectrometry (HRMS), which is used for residue analysis in food, has gained wider acceptance in the last few years. This development is due to the availability of more rugged, sensitive, and selective instrumentation. The benefits provided by HRMS over classical unit-mass-resolution tandem mass spectrometry are considerable. These benefits include the collection of full-scan spectra, which provides greater insight into the composition of a sample. Consequently, the analyst has the freedom to measure compounds without previous compound-specific tuning, the possibility of retrospective data analysis, and the capability of performing structural elucidations of unknown or suspected compounds. HRMS strongly competes with classical tandem mass spectrometry in the field of quantitative multiresidue methods (e.g., pesticides and veterinary drugs). It is one of the most promising tools when moving towards nontargeted approaches. Certain hardware and software issues still have to be addressed by the instrument manufacturers for it to dislodge tandem mass spectrometry from its position as the standard trace analysis tool.

  18. Beyond Utility: The Liberal Arts and the Ends of Education

    ERIC Educational Resources Information Center

    Schmidt, Christopher D.

    2017-01-01

    The humanities must be foundational to all our discussions of curriculum. They are not tools for success in the global economy; they offer no return on investment; and they will not fit into a cost-benefit analysis. Rather, they are our only means of knowing what success is, or what we mean by "cost" or "benefit." We must treat…

  19. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support the evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.
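
    The MCDA step discussed above can be sketched as a weighted-sum scoring of favourable and unfavourable criteria. The criteria, weights, and normalized scores below are invented for illustration and do not come from any actual benefit-risk assessment.

```python
# Toy MCDA: each option gets a weighted sum of normalized criterion
# scores in [0, 1]. All names and numbers are hypothetical.

def mcda_score(scores, weights):
    """Weighted sum of criterion scores (scores normalized to [0, 1])."""
    return sum(scores[criterion] * w for criterion, w in weights.items())

weights = {"efficacy": 0.5, "safety": 0.3, "convenience": 0.2}
drug_a = {"efficacy": 0.8, "safety": 0.6, "convenience": 0.9}
drug_b = {"efficacy": 0.9, "safety": 0.4, "convenience": 0.5}
print(round(mcda_score(drug_a, weights), 2),
      round(mcda_score(drug_b, weights), 2))
```

    As the review observes, the hard part is not this arithmetic but where the weights come from; a model-informed approach would derive the criterion scores from PKPD simulations rather than expert opinion alone.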

  20. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    PubMed

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

    An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken account of in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. 
If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk analysis towards conceiving the analysis as a process of creating shared knowledge among all stakeholders. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Ambitious Pedagogy by Novice Teachers: Who Benefits from Tool-Supported Collaborative Inquiry into Practice and Why?

    ERIC Educational Resources Information Center

    Windschitl, Mark; Thompson, Jessica; Braaten, Melissa

    2011-01-01

    Background/Context: The collegial analysis of student work artifacts has been effective in advancing the practice of experienced teachers; however, the use of such strategies as a centerpiece for induction has not been explored, nor has the development of tool systems to support such activity with novices. Purpose/Objective: We tested the…

  2. Applying analysis tools in planning for operations

    DOT National Transportation Integrated Search

    2009-09-01

    More and more, transportation system operators are seeing the benefits of strengthening links between planning and operations. A critical element in improving transportation decision-making and the effectiveness of transportation systems related to o...

  3. The Importance of Place and Time in Translating Knowledge About Canada's Compassionate Care Benefit to Informal Caregivers

    PubMed Central

    Dykeman, Sarah; Williams, Allison

    2013-01-01

    Canada's Compassionate Care Benefit (CCB), an employment insurance program designed to allow Canadian workers time off to care for a dying relative or friend, has had low uptake since its inception. Due to their role in working with family caregivers, social workers are one group of primary health care professionals who have been identified as benefiting from a knowledge translation campaign. Knowledge tools about the CCB have been developed through social worker input in a prior study. This article presents the findings of a qualitative exploratory intervention. Social workers (n = 8) utilized the tools for 6 months and discussed their experiences with them. Data analysis revealed references to time and space constraints in using the tools, and demonstrated the impact of time geography on knowledge translation about the CCB. The results suggest that knowledge translation about the CCB could be targeted toward caregivers earlier in the disease progression, before the terminal diagnosis, and that knowledge tools must be disseminated to more locations. These results may be valuable to policymakers and palliative care providers, as well as theorists interested in ongoing applications of time geography in knowledge translation and the consumption/production of care. PMID:24295098

  4. Integrating ecosystem services analysis into scenario planning practice: accounting for street tree benefits with i-Tree valuation in Central Texas.

    PubMed

    Hilde, Thomas; Paterson, Robert

    2014-12-15

    Scenario planning continues to gain momentum in the United States as an effective process for building consensus on long-range community plans and creating regional visions for the future. However, efforts to integrate more sophisticated information into the analytical framework to help identify important ecosystem services have lagged in practice. This is problematic because understanding the tradeoffs of land consumption patterns on ecological integrity is central to mitigating the environmental degradation caused by land use change and new development. In this paper we describe how an ecosystem services valuation model, i-Tree, was integrated into a mainstream scenario planning software tool, Envision Tomorrow, to assess the benefits of public street trees for alternative future development scenarios. The tool is then applied to development scenarios from the City of Hutto, TX, a Central Texas Sustainable Places Project demonstration community. The integrated tool represents a methodological improvement for scenario planning practice, offers a way to incorporate ecosystem services analysis into mainstream planning processes, and serves as an example of how open source software tools can expand the range of issues available for community and regional planning consideration, even in cases where community resources are limited. The tool also offers room for future improvements; feasible options include canopy analysis of various future land use typologies, as well as a generalized street tree model for broader U.S. application. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. EconoMe-Develop - a calculation tool for multi-risk assessment and benefit-cost-analysis

    NASA Astrophysics Data System (ADS)

    Bründl, M.

    2012-04-01

    Public money is used to finance the protection of human life, material assets and the environment against natural hazards. This limited resource should be used in a way that achieves the maximum possible effect by minimizing as many risks as possible. Hence, decision-makers face the question of which mitigation measures should be prioritised. Benefit-Cost-Analysis (BCA) is a recognized method for determining the economic efficiency of investments in mitigation measures. In Switzerland, the Federal Office for the Environment (FOEN) judges the benefit-cost ratio of mitigation projects on the basis of the results of the calculation tool "EconoMe" [1]. Checking the economic efficiency of mitigation projects with an investment of more than 1 million CHF (800,000 EUR) using "EconoMe" has been mandatory in Switzerland since 2008. Within "EconoMe", most calculation parameters cannot be changed by the user, allowing for comparable results. Based on the risk guideline "RIKO" [2], an extended version of the operational tool, called "EconoMe-Develop", was developed. "EconoMe-Develop" can deal with various natural hazard processes and thus allows multi-risk assessments, since the restrictions of the operational version of "EconoMe" (e.g. on the number of scenarios and expositions, vulnerability, spatial probability of processes, and probability of presence of objects) do not apply. Additionally, the influence of the uncertainty of calculation factors, such as vulnerability, on the final results can be determined. "EconoMe-Develop" offers import and export of data, e.g. results of GIS analysis. The possibility of adapting the tool to user-specific requirements makes "EconoMe-Develop" an easy-to-use tool for risk assessment and for assessing the economic efficiency of mitigation projects. In the paper we present the most important features of the tool and illustrate its application with a practical example.
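
    The risk-based cost-benefit logic a tool of this kind implements can be sketched roughly as follows: collective risk is the sum over hazard scenarios of frequency times exposed value times vulnerability (times spatial and presence probabilities), and a measure's benefit is the annual risk reduction it buys, compared against its annualized cost. This is a hypothetical illustration, not EconoMe's actual calculation, and all scenario numbers are invented.

```python
# Rough sketch of scenario-based expected annual damage and the
# benefit-cost ratio of a mitigation measure. All numbers invented.

def annual_risk(scenarios):
    """Expected annual damage: sum of frequency * value * vulnerability
    * spatial probability * probability of presence."""
    return sum(f * v * vul * p_sp * p_pr
               for (f, v, vul, p_sp, p_pr) in scenarios)

# (frequency 1/yr, value, vulnerability, spatial prob., presence prob.)
before = [(1 / 30, 2_000_000, 0.5, 0.8, 0.9),
          (1 / 100, 5_000_000, 0.8, 0.9, 0.9)]
# The measure mainly lowers vulnerability in both scenarios.
after = [(1 / 30, 2_000_000, 0.1, 0.8, 0.9),
         (1 / 100, 5_000_000, 0.3, 0.9, 0.9)]

annual_cost = 25_000  # annualized cost of the mitigation measure
benefit = annual_risk(before) - annual_risk(after)
print(round(benefit), round(benefit / annual_cost, 2))
```

    A benefit-cost ratio above 1 would support funding the measure; a multi-risk tool repeats this accounting across hazard processes and sums the risk reductions.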

  6. Interactive Learning: The Casewriting Method as an Entire Semester Course for Higher Education.

    ERIC Educational Resources Information Center

    Bowen, Brent D.

    This guide explains the reasons for employing the case method as a tool in the academic discipline of aviation. It promotes the use of case writing as a unique opportunity to derive even further benefits from case analysis. The benefits to students of using case writing as a learning strategy include a focus on the strategy of a real situation;…

  7. A business case evaluation of workplace engineering noise control: a net-cost model.

    PubMed

    Lahiri, Supriya; Low, Colleen; Barry, Michael

    2011-03-01

    This article provides a convenient tool for companies to determine the costs and benefits of alternative interventions to prevent noise-induced hearing loss (NIHL). Contextualized for Singapore and in collaboration with Singapore's Ministry of Manpower, the Net-Cost model evaluates costs of intervention for equipment and labor, avoided costs of productivity losses and medical care, and productivity gains from the employer's economic perspective. To pilot this approach, four case studies are presented, with varying degrees of economic benefits to the employer, including one in which multifactor productivity is the main driver. Although compliance agencies may not require economic analysis of NIHL, given scarce resources in a market-driven economy, this tool enables stakeholders to understand and compare the costs and benefits of NIHL interventions comprehensively and helps in determining risk management strategies.
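
    The Net-Cost model's basic arithmetic can be sketched in a few lines: an intervention pays for the employer when its equipment and labor costs fall short of the avoided costs plus productivity gains. The component values below are illustrative, not from the Singapore case studies.

```python
# Hypothetical net-cost calculation for a noise-control intervention.
# A negative result means the intervention pays for itself.

def net_cost(equipment, labor, avoided_productivity_loss,
             avoided_medical, productivity_gain):
    return (equipment + labor) - (avoided_productivity_loss
                                  + avoided_medical + productivity_gain)

nc = net_cost(equipment=50_000, labor=8_000,
              avoided_productivity_loss=30_000,
              avoided_medical=12_000, productivity_gain=20_000)
print(nc)  # negative => the noise control pays for itself
```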

  8. Measuring the Air Quality and Transportation Impacts of Infill Development

    EPA Pesticide Factsheets

    This report summarizes three case studies. The analysis shows how standard forecasting tools can be modified to capture at least some of the transportation and air quality benefits of brownfield and infill development.

  9. CO-Benefits Risk Assessment (COBRA) Health Impacts Screening and Mapping Tool

    EPA Pesticide Factsheets

    The COBRA (Co-Benefits Risk Assessment) screening tool can be used by state and local governments to estimate the health and economic benefits of clean energy policies. Find information about how to use the tool here.

  10. Toward a Responsibility-Catering Prioritarian Ethical Theory of Risk.

    PubMed

    Wikman-Svahn, Per; Lindblom, Lars

    2018-03-05

    Standard tools used in societal risk management such as probabilistic risk analysis or cost-benefit analysis typically define risks in terms of only probabilities and consequences and assume a utilitarian approach to ethics that aims to maximize expected utility. The philosopher Carl F. Cranor has argued against this view by devising a list of plausible aspects of the acceptability of risks that points towards a non-consequentialist ethical theory of societal risk management. This paper revisits Cranor's list to argue that the alternative ethical theory responsibility-catering prioritarianism can accommodate the aspects identified by Cranor and that the elements in the list can be used to inform the details of how to view risks within this theory. An approach towards operationalizing the theory is proposed based on a prioritarian social welfare function that operates on responsibility-adjusted utilities. A responsibility-catering prioritarian ethical approach towards managing risks is a promising alternative to standard tools such as cost-benefit analysis.
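
    A toy rendering of the proposal's final step: a prioritarian social welfare function applies a concave transform (square root here, as one possible choice) to each person's responsibility-adjusted utility, so that gains to the worse-off weigh more. The utilities and adjustment factors below are invented and are not from the paper.

```python
# Toy prioritarian social welfare function over responsibility-adjusted
# utilities. The concave sqrt transform encodes priority to the worse-off.
import math

def prioritarian_welfare(utilities, adjustments):
    """Sum of sqrt(u * a) over persons; a is a responsibility adjustment."""
    return sum(math.sqrt(u * a) for u, a in zip(utilities, adjustments))

# Two options with the same total utility (18): spreading it evenly
# scores higher than concentrating harm on one person.
even = prioritarian_welfare([9.0, 9.0], [1.0, 1.0])
skew = prioritarian_welfare([16.0, 2.0], [1.0, 1.0])
# Responsibility adjustment: discounting the utility of a person who
# knowingly imposed the risk on themselves lowers the option's value.
adjusted = prioritarian_welfare([9.0, 9.0], [1.0, 0.5])
print(round(even, 2), round(skew, 2), round(adjusted, 2))
```

    Unlike expected-utility maximization, which scores the even and skewed options identically, the concave transform distinguishes them, which is the feature the theory exploits.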

  11. Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Kiefer, D. A.; Turner, W.

    2013-12-01

    This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.
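
    The valuation logic described can be approximated in a back-of-the-envelope sketch: the benefit of improved stock assessments is modeled as the avoided expected loss from a lower annual probability of fishery collapse, discounted to a net present value. All figures below are invented and are not the study's estimates.

```python
# Back-of-the-envelope risk-mitigation valuation; all numbers invented.

def expected_avoided_loss(p_before, p_after, collapse_cost):
    """Annual expected loss avoided by reducing collapse probability."""
    return (p_before - p_after) * collapse_cost

def npv(annual_amount, rate, years):
    """Net present value of a constant annual amount over `years` years."""
    return sum(annual_amount / (1.0 + rate) ** t
               for t in range(1, years + 1))

avoided = expected_avoided_loss(p_before=0.05, p_after=0.03,
                                collapse_cost=500_000_000)  # USD/yr
value = npv(avoided, rate=0.05, years=10)
print(round(avoided), round(value))
```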

  12. Rapid Benefit Indicator (RBI) Checklist Tool - Quick Start ...

    EPA Pesticide Factsheets

The Rapid Benefits Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration – A Rapid Benefits Indicators Approach for Decision Makers. This checklist tool is intended to be used to record information as you answer the questions in that guide. When performing an RBI assessment on wetland restoration site(s), results can be recorded and reviewed using this VBA-enabled MS Excel Checklist Tool.

  13. Down-to-Earth Benefits of Space Exploration: Past, Present, Future

    NASA Technical Reports Server (NTRS)

    Neumann, Benjamin

    2005-01-01

A ventricular device that helps a weakened heart keep pumping while awaiting a transplant. A rescue tool for extracting victims from dangerous situations such as car wrecks. A video analysis tool used to investigate the bombing at the 1996 Olympics in Atlanta. A sound-differentiation tool for safer air traffic control. A refrigerator that runs without electricity or batteries. These are just a few of the spin-offs of NASA technology that have benefited society in recent years. Now, as NASA sets its vision on space exploration, particularly of the moon and Mars, even more benefits to society are possible. This expansion of societal benefits is tied to a new emphasis on technology infusion, or spin-in. NASA is seeking partners with industry, universities, and other government laboratories to help the Agency address its specific space exploration needs in five areas: (1) advanced studies, concepts, and tools; (2) advanced materials; (3) communications, computing, electronics, and imaging; (4) software, intelligent systems, and modeling; and (5) power, propulsion, and chemical systems. These spin-in partnerships will offer benefits to U.S. economic development as well as new products for the global market. As a complement to these spin-in benefits, NASA also is examining the possible future spin-outs of the innovations related to its new space exploration mission. A matrix that charts NASA's needs against various business sectors is being developed to fully understand the implications for society and industry of spin-in and spin-out. This matrix already has been used to help guide NASA's efforts to secure spin-in partnerships. This paper presents examples of NASA spin-offs, discusses NASA's present spin-in/spin-out projects for pursuing partnerships, and considers some of the future societal benefits to be reaped from these partnerships. This paper will complement the proposed paper by Frank Schowengerdt on the Innovative Partnerships Program structure and how to work with the PP.

  14. FMCSA safety program effectiveness measurement : carrier intervention effectiveness model, version 1.0 : [analysis brief].

    DOT National Transportation Integrated Search

    2015-01-01

The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability...

  15. Benefit-Cost Analysis of Undergraduate Education Programs: An Example Analysis of the Freshman Research Initiative

    ERIC Educational Resources Information Center

    Walcott, Rebecca L.; Corso, Phaedra S.; Rodenbusch, Stacia E.; Dolan, Erin L.

    2018-01-01

    Institutions and administrators regularly have to make difficult choices about how best to invest resources to serve students. Yet economic evaluation, or the systematic analysis of the relationship between costs and outcomes of a program or policy, is relatively uncommon in higher education. This type of evaluation can be an important tool for…

  16. ResStock Analysis Tool | Buildings | NREL

    Science.gov Websites

Energy and cost savings for U.S. homes: ResStock supports large-scale residential energy analysis by combining large public and private data sources, and has uncovered $49 billion in potential annual utility bill savings through cost-effective energy efficiency. Contact Eric Wilson to learn how ResStock can benefit your approach.

  17. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.

  18. Sensory dominance and multisensory integration as screening tools in aging.

    PubMed

    Murray, Micah M; Eardley, Alison F; Edginton, Trudi; Oyekan, Rebecca; Smyth, Emily; Matusz, Pawel J

    2018-06-11

    Multisensory information typically confers neural and behavioural advantages over unisensory information. We used a simple audio-visual detection task to compare healthy young (HY), healthy older (HO) and mild-cognitive impairment (MCI) individuals. Neuropsychological tests assessed individuals' learning and memory impairments. First, we provide much-needed clarification regarding the presence of enhanced multisensory benefits in both healthily and abnormally aging individuals. The pattern of sensory dominance shifted with healthy and abnormal aging to favour a propensity of auditory-dominant behaviour (i.e., detecting sounds faster than flashes). Notably, multisensory benefits were larger only in healthy older than younger individuals who were also visually-dominant. Second, we demonstrate that the multisensory detection task offers benefits as a time- and resource-economic MCI screening tool. Receiver operating characteristic (ROC) analysis demonstrated that MCI diagnosis could be reliably achieved based on the combination of indices of multisensory integration together with indices of sensory dominance. Our findings showcase the importance of sensory profiles in determining multisensory benefits in healthy and abnormal aging. Crucially, our findings open an exciting possibility for multisensory detection tasks to be used as a cost-effective screening tool. These findings clarify relationships between multisensory and memory functions in aging, while offering new avenues for improved dementia diagnostics.
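The screening idea (combining integration and dominance indices into a composite score and judging it by ROC analysis) can be sketched with a hand-computed AUC; all scores and the weighting are hypothetical:

```python
# Minimal sketch of the ROC idea (all data hypothetical): a screening score
# built from a multisensory-benefit index and a sensory-dominance index,
# evaluated by the area under the ROC curve (AUC).

def auc(scores_pos, scores_neg):
    """AUC = P(random positive scores higher than random negative); ties count 0.5."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical composite scores (e.g., 0.6*integration + 0.4*dominance index)
mci_scores     = [0.82, 0.74, 0.91, 0.66, 0.78]   # positives (MCI)
healthy_scores = [0.41, 0.55, 0.38, 0.62, 0.49]   # negatives (healthy older)
screen_auc = auc(mci_scores, healthy_scores)      # 1.0 = perfect separation
```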

  19. Ten best resources for conducting financing and benefit incidence analysis in resource-poor settings.

    PubMed

    Wiseman, Virginia; Asante, Augustine; Price, Jennifer; Hayen, Andrew; Irava, Wayne; Martins, Joao; Guinness, Lorna; Jan, Stephen

    2015-10-01

    Many low- and middle-income countries are seeking to reform their health financing systems to move towards universal coverage. This typically means that financing is based on people's ability to pay while, for service use, benefits are based on the need for health care. Financing incidence analysis (FIA) and benefit incidence analysis (BIA) are two popular tools used to assess equity in health systems financing and service use. FIA studies examine who pays for the health sector and how these contributions are distributed according to socioeconomic status (SES). BIA determines who benefits from health care spending, with recipients ranked by their relative SES. In this article, we identify 10 resources to assist researchers and policy makers seeking to undertake or interpret findings from financing and benefit incidence analyses in the health sector. The article pays particular attention to the data requirements, computations, methodological challenges and country level experiences with these types of analyses. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.
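A common BIA summary statistic is the concentration index computed over recipients ranked by SES; a minimal sketch with hypothetical spending data:

```python
# Sketch of a benefit incidence summary (hypothetical data): the concentration
# index C = (2 / (n * mean)) * sum(h_i * r_i) - 1, where individuals are ranked
# from poorest to richest and r_i = (i + 0.5) / n is the fractional rank.
# C < 0: benefits concentrated among the poor; C > 0: among the better-off.

def concentration_index(benefits_poorest_first):
    n = len(benefits_poorest_first)
    mean = sum(benefits_poorest_first) / n
    ranks = [(i + 0.5) / n for i in range(n)]
    return 2 * sum(h * r for h, r in zip(benefits_poorest_first, ranks)) / (n * mean) - 1

# Hypothetical per-quintile public health spending, poorest quintile first:
ci = concentration_index([30, 25, 20, 15, 10])   # negative -> pro-poor
```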

  20. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  1. NASA to Test In-Flight Folding Spanwise Adaptive Wing to Enhance Aircraft Efficiency

    NASA Image and Video Library

    2014-10-21

    The objectives of testing on PTERA include the development of tools and vetting of system integration, evaluation of vehicle control law, and analysis of SAW airworthiness to examine benefits to in-flight efficiency.

  2. FMCSA safety program effectiveness measurement : carrier intervention effectiveness Model, version 1.1, analysis brief.

    DOT National Transportation Integrated Search

    2016-11-01

    The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability (CSA) e...

  3. A cost-benefit analysis for materials management information systems.

    PubMed

    Slapak-Iacobelli, L; Wilde, A H

    1993-02-01

The cost-benefit analysis provided the system planners with valuable information that served many purposes. It answered the following questions: Why was the CCF undertaking this project? What were the alternatives? How much was it going to cost? And what was the expected outcome? The process of developing the cost-benefit document kept the project team focused. It also motivated them to involve additional individuals from materials management and accounts payable in its development. A byproduct of this involvement was buy-in and commitment to the project by everyone in these areas. Consequently, the project became a team effort championed by many and not just one. We were also able to introduce two new information system processes: 1) a management review process with goals and anticipated results, and 2) a quality assurance process that ensured the CCF had a better product in the end. The cost-benefit analysis provided a planning tool that assisted in successful implementation of an integrated materials management information system.

  4. SLIPTA e-Tool improves laboratory audit process in Vietnam and Cambodia.

    PubMed

    Nguyen, Thuong T; McKinney, Barbara; Pierson, Antoine; Luong, Khue N; Hoang, Quynh T; Meharwal, Sandeep; Carvalho, Humberto M; Nguyen, Cuong Q; Nguyen, Kim T; Bond, Kyle B

    2014-01-01

    The Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist is used worldwide to drive quality improvement in laboratories in developing countries and to assess the effectiveness of interventions such as the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme. However, the paper-based format of the checklist makes administration cumbersome and limits timely analysis and communication of results. In early 2012, the SLMTA team in Vietnam developed an electronic SLIPTA checklist tool. The e-Tool was pilot tested in Vietnam in mid-2012 and revised. It was used during SLMTA implementation in Vietnam and Cambodia in 2012 and 2013 and further revised based on auditors' feedback about usability. The SLIPTA e-Tool enabled rapid turn-around of audit results, reduced workload and language barriers and facilitated analysis of national results. Benefits of the e-Tool will be magnified with in-country scale-up of laboratory quality improvement efforts and potential expansion to other countries.

  5. User's manual for The TIM benefit-cost (TIM-BC) Tool (Version: 1.0.0)

    DOT National Transportation Integrated Search

    2015-07-04

This document serves as a user's manual for the Traffic Incident Management Benefit-Cost Tool (TIM-BC) Version 1.0.0 - Safety Service Patrol Benefit-Cost (SSP-BC) Tool, which is used to assist State and local engineers and decisionmakers with evalu...

  6. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  7. Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY-08

    NASA Technical Reports Server (NTRS)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Dreher, Joseph

    2008-01-01

This report summarizes the Applied Meteorology Unit (AMU) activities for the third quarter of Fiscal Year 2008 (April - June 2008). Tasks reported on are: Peak Wind Tool for User Launch Commit Criteria (LCC), Anvil Forecast Tool in AWIPS Phase II, Completion of the Edwards Air Force Base (EAFB) Statistical Guidance Wind Tool, Volume Averaged Height Integrated Radar Reflectivity (VAHIRR), Impact of Local Sensors, Radar Scan Strategies for the PAFB WSR-74C Replacement, VAHIRR Cost Benefit Analysis, and the WRF Wind Sensitivity Study at Edwards Air Force Base.

  8. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  9. Automated Sensitivity Analysis of Interplanetary Trajectories

    NASA Technical Reports Server (NTRS)

    Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno

    2017-01-01

    This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.

  10. Using Molecular Visualization to Explore Protein Structure and Function and Enhance Student Facility with Computational Tools

    ERIC Educational Resources Information Center

    Terrell, Cassidy R.; Listenberger, Laura L.

    2017-01-01

    Recognizing that undergraduate students can benefit from analysis of 3D protein structure and function, we have developed a multiweek, inquiry-based molecular visualization project for Biochemistry I students. This project uses a virtual model of cyclooxygenase-1 (COX-1) to guide students through multiple levels of protein structure analysis. The…

  11. HydroClimATe: hydrologic and climatic analysis toolkit

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.

    2014-01-01

    The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
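One of the spectral tools listed, the discrete Fourier transform, can be sketched as a periodogram that recovers the dominant cycle of a series; the code below is a stdlib-only illustration on synthetic data, not HydroClimATe itself:

```python
# Illustrative stdlib-only sketch of the discrete Fourier transform step used
# in spectral analysis of a hydrologic time series: compute a periodogram and
# pick the dominant period (synthetic data, not HydroClimATe itself).
import cmath
import math

def periodogram(x):
    """Squared DFT magnitudes for frequencies k = 1 .. n//2 (cycles per record)."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]            # remove the mean so k=0 doesn't dominate
    power = {}
    for k in range(1, n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power[k] = abs(s) ** 2
    return power

# Synthetic monthly series with a 12-step annual cycle (noise-free for clarity)
n = 120
series = [10 + 3 * math.sin(2 * math.pi * t / 12) for t in range(n)]
power = periodogram(series)
dominant_k = max(power, key=power.get)    # cycles over the whole record
dominant_period = n / dominant_k          # timesteps per cycle
```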

  12. Bite the apple, get driven out of the garden: A risky story telling at the ASME town meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majumdar, K.C.

    1994-11-01

Risk, the all-encompassing four-letter word, became a widely used household cliche and an institutional mantra in the nineties. Risk analysis models from the Garden of Eden to the Capitol Hill lawn have made a number of sharp paradigm shifts, evolving as a decision-making tool from individual risk perception to societal risk-based regulatory media. Risk always coexists with benefit and is arbitrated by costs. Risk-benefit analysis has been in use in business and industry in economic ventures for a long time. Only recently has risk management, in its current state of development, evolved as a regulatory tool for controlling large technological systems that have potential impacts on the health and safety of the public and on the sustainability of the ecology and the environment. This paper summarizes the evolution of risk management concepts and models in industry and the regulatory agencies in the US over the last three decades. It also discusses the benefits and limitations of this evolving discipline as it is applied to high-risk technologies, from nuclear power plants and the petrochemical industry to nuclear weapons technology.

  13. SLIPTA e-Tool improves laboratory audit process in Vietnam and Cambodia

    PubMed Central

    Nguyen, Thuong T.; McKinney, Barbara; Pierson, Antoine; Luong, Khue N.; Hoang, Quynh T.; Meharwal, Sandeep; Carvalho, Humberto M.; Nguyen, Cuong Q.; Nguyen, Kim T.

    2014-01-01

    Background The Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist is used worldwide to drive quality improvement in laboratories in developing countries and to assess the effectiveness of interventions such as the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme. However, the paper-based format of the checklist makes administration cumbersome and limits timely analysis and communication of results. Development of e-Tool In early 2012, the SLMTA team in Vietnam developed an electronic SLIPTA checklist tool. The e-Tool was pilot tested in Vietnam in mid-2012 and revised. It was used during SLMTA implementation in Vietnam and Cambodia in 2012 and 2013 and further revised based on auditors’ feedback about usability. Outcomes The SLIPTA e-Tool enabled rapid turn-around of audit results, reduced workload and language barriers and facilitated analysis of national results. Benefits of the e-Tool will be magnified with in-country scale-up of laboratory quality improvement efforts and potential expansion to other countries. PMID:29043190

  14. Mentoring for NHS doctors: perceived benefits across the personal–professional interface

    PubMed Central

    Steven, A; Oxley, J; Fleming, WG

    2008-01-01

    Summary Objective To investigate NHS doctors' perceived benefits of being involved in mentoring schemes and to explore the overlaps and relationships between areas of benefit. Design Extended qualitative analysis of a multi-site interview study following an interpretivist approach. Setting Six NHS mentoring schemes across England. Main outcome measures Perceived benefits. Results While primary analysis resulted in lists of perceived benefits, the extended analysis revealed three overarching areas: professional practice, personal well-being and development. Benefits appear to go beyond a doctor's professional role to cross the personal–professional interface. Problem solving and change management seem to be key processes underpinning the raft of personal and professional benefits reported. A conceptual map was developed to depict these areas and relationships. In addition secondary analysis suggests that in benefitting one area mentoring may lead to consequential benefits in others. Conclusions Prior research into mentoring has mainly taken place in a single health care sector. This multi-site study suggests that the perceived benefits of involvement in mentoring may cross the personal/professional interface and may override organizational differences. Furthermore the map developed highlights the complex relationships which exist between the three areas of professional practice, personal wellbeing and personal and professional development. Given the consistency of findings across several studies it seems probable that organizations would be strengthened by doctors who feel more satisfied and confident in their professional roles as a result of participation in mentoring. Mentoring may have the potential to take us beyond individual limits to greater benefits and the conceptual map may offer a starting point for the development of outcome criteria and evaluation tools for mentoring schemes. PMID:19029356

  15. Multicriteria decision analysis: Overview and implications for environmental decision making

    USGS Publications Warehouse

    Hermans, Caroline M.; Erickson, Jon D.; Erickson, Jon D.; Messner, Frank; Ring, Irene

    2007-01-01

    Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
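The PROMETHEE outranking computation can be sketched as weighted pairwise preferences aggregated into net flows; the alternatives, criteria, and weights below are hypothetical:

```python
# Minimal sketch of PROMETHEE outranking (hypothetical watershed example):
# the "usual" preference function (1 if an alternative strictly beats another
# on a criterion, else 0), weighted across criteria, then net outranking flows.

def promethee_net_flows(scores, weights):
    """scores[a] = criterion scores (higher is better); returns net flow per alternative."""
    names = list(scores)
    n = len(names)

    def pref(a, b):  # weighted preference intensity of a over b
        return sum(w for sa, sb, w in zip(scores[a], scores[b], weights) if sa > sb)

    net = {}
    for a in names:
        plus  = sum(pref(a, b) for b in names if b != a) / (n - 1)  # positive flow
        minus = sum(pref(b, a) for b in names if b != a) / (n - 1)  # negative flow
        net[a] = plus - minus
    return net

# Hypothetical alternatives scored on [water quality, cost (inverted), habitat]:
flows = promethee_net_flows(
    {"riparian buffers": [8, 6, 7], "detention basins": [6, 7, 5], "no action": [2, 9, 2]},
    weights=[0.5, 0.2, 0.3],
)
best = max(flows, key=flows.get)
```

Net flows always sum to zero across alternatives, which makes the ranking a purely relative comparison.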

  16. Cost-Loss Analysis of Ensemble Solar Wind Forecasting: Space Weather Use of Terrestrial Weather Tools

    NASA Astrophysics Data System (ADS)

    Henley, E. M.; Pope, E. C. D.

    2017-12-01

This commentary concerns recent work on solar wind forecasting by Owens and Riley (2017). The approach taken makes effective use of tools commonly used in terrestrial weather forecasting: generation of an "ensemble" forecast via a simple model, and application of a "cost-loss" analysis to the resulting probabilistic information to explore the benefit of the forecast to users with different risk appetites. This commentary aims to highlight these useful techniques to the wider space weather audience and to briefly discuss the general context of applying terrestrial weather approaches to space weather.
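The cost-loss model has a standard decision rule: a user who can pay cost C to avoid loss L should act when the forecast event probability exceeds C/L, with an ensemble supplying that probability as the fraction of members predicting the event. A minimal sketch with hypothetical numbers:

```python
# Sketch of the standard cost-loss decision model applied to an ensemble
# forecast (hypothetical numbers): act when the forecast event probability
# exceeds the user's cost/loss ratio C/L.

def should_act(ensemble_hits, ensemble_size, cost, loss):
    p_event = ensemble_hits / ensemble_size   # fraction of members predicting the event
    return p_event > cost / loss

# 18 of 24 ensemble members predict an enhanced solar wind stream (p = 0.75).
# A cautious operator (C/L = 0.1) acts; a risk-tolerant one (C/L = 0.9) does not.
cautious = should_act(18, 24, cost=1, loss=10)    # threshold 0.1
tolerant = should_act(18, 24, cost=9, loss=10)    # threshold 0.9
```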

  17. Building and evaluating an informatics tool to facilitate analysis of a biomedical literature search service in an academic medical center library.

    PubMed

    Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha

    2013-01-01

    This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.

  18. [SYSTEMATIZATION, ORDER AND SECURITY AS A RESULT OF THE NURSE ASSESSMENT IN PAEDIATRIC EMERGENCY: A METAMORPHOSIS PROCESS].

    PubMed

    García-Hernández, M-Noelia; Fraga-Hernández, Ma Elena; Mahtani-Chugani, Vinita

    2014-12-01

To determine, from the health care professionals' perspective, the impact on clinical practice of incorporating an assessment tool for primary care paediatric emergencies. Qualitative study based on the collection of written documents. Twenty-four wide-ranging and detailed documents were collected. Thematic analysis was used. Participants were 9 nurses and 7 paediatricians, all with experience in the Paediatric Emergency Department. The results are grouped into three areas: perception of the previous situation; benefits perceived; and difficulties of the change process related to the triage instrument. The benefits perceived include the achievement of the objectives related to triage as well as collateral benefits for the organization and distribution of structural resources, adequacy of human resources, self-assessment and professional recognition, improvement of team communication, and users' perception of the service. The difficulties identified are related to the feasibility of using this instrument when patient flow is high and to the need for specialized training. All participants perceived more benefits than disadvantages, and both nurses and paediatricians experienced the process as positive. The introduction of the assessment tool had a broader impact than expected.

  19. The EnviroAtlas: Connecting ecosystems, people, and well-being

    EPA Science Inventory

    The EnviroAtlas is a web-based application containing a collection of geospatial data, analysis tools, and interpretive information focused on ecosystem goods and services. Ecosystem goods and services are essentially defined as the benefits that humans receive from nature and en...

  20. A benefit-cost analysis tool for assessing guardrail needs for two-lane rural roads in Virginia.

    DOT National Transportation Integrated Search

    2015-10-01

Guardrail is installed along the roadside to shield hazards such as steep slopes and bridge piers from vehicles. Although the Virginia Department of Transportation's Road Design Manual provides guidance for determining where to install guardrail ...

  1. GREEN CHEMISTRY AND POLLUTION PREVENTION TOOLS

    EPA Science Inventory

Green Chemistry and Design for the Environment bring benefits because they can avoid pollution before it occurs at relatively low societal cost. EPA supports these fields from the early stages of research and knowledge development through to assessment, economic analysis, edu...

  2. Applying analysis tools in planning for operations : case study #1 -- operations strategy impact reference and deployment guidance

    DOT National Transportation Integrated Search

    2009-09-01

    More and more, transportation system operators are seeing the benefits of strengthening links between planning and operations. A critical element in improving transportation decision-making and the effectiveness of transportation systems related to o...

  3. Rapid Benefit Indicator (RBI) Checklist Tool - Quick Start Manual

    EPA Science Inventory

    The Rapid Benefits Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration – A Rapid Benefits Indicators Approach for Decision Makers. This checklist tool is intended to be used to record information as you answer the ques...

  4. The application of systems thinking concepts, methods, and tools to global health practices: An analysis of case studies.

    PubMed

    Wilkinson, Jessica; Goff, Morgan; Rusoja, Evan; Hanson, Carl; Swanson, Robert Chad

    2018-06-01

    This review of systems thinking (ST) case studies seeks to compile and analyse cases from ST literature and provide practitioners with a reference for ST in health practice. Particular attention was given to (1) reviewing the frequency and use of key ST terms, methods, and tools in the context of health, and (2) extracting and analysing longitudinal themes across cases. A systematic search of databases was conducted, and a total of 36 case studies were identified. A combination of integrative and inductive qualitative approaches to analysis was used. Most cases identified took place in high-income countries and applied ST retrospectively. The most commonly used ST terms were agent/stakeholder/actor (n = 29), interdependent/interconnected (n = 28), emergence (n = 26), and adaptability/adaptation (n = 26). Common ST methods and tools were largely underutilized. Social network analysis was the most commonly used method (n = 4), and innovation or change management history was the most frequently used tool (n = 11). Four overarching themes were identified; the importance of the interdependent and interconnected nature of a health system, characteristics of leaders in a complex adaptive system, the benefits of using ST, and barriers to implementing ST. This review revealed that while much has been written about the potential benefits of applying ST to health, it has yet to completely transition from theory to practice. There is however evidence of the practical use of an ST lens as well as specific methods and tools. With clear examples of ST applications, the global health community will be better equipped to understand and address key health challenges. © 2017 John Wiley & Sons, Ltd.

  5. Rapid Analysis and Manufacturing Propulsion Technology (RAMPT)

    NASA Technical Reports Server (NTRS)

    Fikes, John C.

    2018-01-01

    NASA's strategic plan calls for the development of enabling technologies, improved production methods, and advanced design and analysis tools related to the agency's objectives to expand human presence in the solar system. NASA seeks to advance exploration, science, innovation, benefits to humanity, and international collaboration, as well as facilitate and utilize U.S. commercial capabilities to deliver cargo and crew to space.

  6. Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design

    NASA Technical Reports Server (NTRS)

    Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno

    2017-01-01

    This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software and to simplify the process of performing sensitivity analysis; it was ultimately found to outperform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.

  7. The application of seismic risk-benefit analysis to land use planning in Taipei City.

    PubMed

    Hung, Hung-Chih; Chen, Liang-Chun

    2007-09-01

    In the developing countries of Asia, local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology to enable planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan: the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the levels of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations and is intended to assist city planners in evaluating the appropriateness of their planning decisions.

  8. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
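
    The kind of Markov reliability model the authors want constructed automatically can be illustrated with a minimal sketch. The two-component parallel architecture, failure rate, and mission time below are hypothetical, not taken from the paper; the numeric solution is checked against the closed-form parallel-reliability formula.

```python
import math

def markov_reliability(lam, t, steps=20000):
    """Forward-Euler integration of a three-state CTMC for a two-component
    parallel system with no repair: S2 --2*lam--> S1 --lam--> S0 (failed)."""
    p2, p1, p0 = 1.0, 0.0, 0.0          # start with both components up
    dt = t / steps
    for _ in range(steps):
        d2 = -2.0 * lam * p2            # leave "both up" at rate 2*lam
        d1 = 2.0 * lam * p2 - lam * p1  # enter/leave "one up"
        d0 = lam * p1                   # absorb into "failed"
        p2, p1, p0 = p2 + d2 * dt, p1 + d1 * dt, p0 + d0 * dt
    return p2 + p1                      # system is up while any component is up

lam, t = 1e-3, 1000.0                   # hypothetical failure rate, mission time
r_markov = markov_reliability(lam, t)
r_closed = 2.0 * math.exp(-lam * t) - math.exp(-2.0 * lam * t)
```

    A production tool generated from a top-down design description would assemble the state space and transition matrix automatically, and would use a stiff ODE solver or matrix exponential rather than Euler steps.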

  9. USEPA’s Land-Based Materials Management Exposure and Risk Assessment Tool System

    EPA Science Inventory

    It is recognized that some kinds of 'waste' materials can in fact be reused as input materials for making safe products that benefit society. RIMM (Risk-Informed Materials Management) provides an integrated data gathering and analysis capability to enable scientifically rigorous ...

  10. Assessing and interpreting the benefits derived from implementing and using asset management systems.

    DOT National Transportation Integrated Search

    2011-06-23

    Interest in asset management has grown over the last two decades but agencies continue to be concerned about the cost to develop and implement asset management processes. While originally introduced as a tool for policy analysis, HERS-ST is free ...

  11. Applying analysis tools in planning for operations : case study #4 -- application of microsimulation in combination with travel demand models

    DOT National Transportation Integrated Search

    2009-09-01

    More and more, transportation system operators are seeing the benefits of strengthening links between planning and operations. A critical element in improving transportation decision-making and the effectiveness of transportation systems related to o...

  12. FMCSA Safety Program Effectiveness Measurement: Carrier Intervention Effectiveness Model, Version 1.1-Report for FY 2014 Interventions - Analysis Brief

    DOT National Transportation Integrated Search

    2018-04-01

    The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability (CSA) e...

  13. FMCSA safety program effectiveness measurement : carrier intervention effectiveness model, version 1.1 - report for FY 2013 interventions : analysis brief

    DOT National Transportation Integrated Search

    2017-04-01

    The Carrier Intervention Effectiveness Model (CIEM) provides the Federal Motor Carrier Safety Administration (FMCSA) with a tool for measuring the safety benefits of carrier interventions conducted under the Compliance, Safety, Accountability (CSA) e...

  14. The Role of the Atmosphere in the Provision of Ecosystem Services

    EPA Science Inventory

    Solving the environmental problems that we are facing today requires holistic approaches to analysis and decision making that include social and economic aspects. The concept of ecosystem services, defined as the benefits people obtain from ecosystems, is one potential tool to p...

  15. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    PubMed

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
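
    The stochastic CBA frameworks the paper introduces can be sketched, in heavily simplified form, as a Monte Carlo simulation of the discounted benefit-cost ratio. All figures below (cost, benefit distribution, horizon, discount rate) are invented for illustration:

```python
import random

def stochastic_bcr(cost, mean_benefit, sd_benefit, years, rate,
                   runs=5000, seed=42):
    """Monte Carlo benefit-cost ratio: annual benefits drawn from a normal
    distribution (floored at zero), discounted to present value."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(runs):
        pv = sum(max(rng.gauss(mean_benefit, sd_benefit), 0.0) / (1 + rate) ** y
                 for y in range(1, years + 1))
        ratios.append(pv / cost)
    ratios.sort()
    return sum(ratios) / runs, ratios[runs // 20]   # mean and ~5th percentile

# Hypothetical flood-mitigation project: $1M cost, uncertain annual benefits.
mean_bcr, p5_bcr = stochastic_bcr(cost=1_000_000, mean_benefit=180_000,
                                  sd_benefit=60_000, years=10, rate=0.05)
```

    An expected ratio above 1, together with a 5th percentile that stays near 1 or above, indicates an intervention that remains economically efficient under uncertainty, which is the kind of robustness argument the paper makes for integrated DRM portfolios.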

  16. Surgical approaches to complex vascular lesions: the use of virtual reality and stereoscopic analysis as a tool for resident and student education.

    PubMed

    Agarwal, Nitin; Schmitt, Paul J; Sukul, Vishad; Prestigiacomo, Charles J

    2012-08-01

    Virtual reality training for complex tasks has been shown to be of benefit in fields involving highly technical and demanding skill sets. The use of a stereoscopic three-dimensional (3D) virtual reality environment to teach a patient-specific analysis of the microsurgical treatment modalities of a complex basilar aneurysm is presented. Three different surgical approaches were evaluated in a virtual environment and then compared to elucidate the best surgical approach. These approaches were assessed with regard to the line-of-sight, skull base anatomy and visualisation of the relevant anatomy at the level of the basilar artery and surrounding structures. Overall, the stereoscopic 3D virtual reality environment with fusion of multimodality imaging affords an excellent teaching tool for residents and medical students to learn surgical approaches to vascular lesions. Future studies will assess the educational benefits of this modality and develop a series of metrics for student assessments.

  17. Mining Marketing Data

    NASA Technical Reports Server (NTRS)

    2002-01-01

    MarketMiner(R) Products, a line of automated marketing analysis tools manufactured by MarketMiner, Inc., can benefit organizations that perform significant amounts of direct marketing. MarketMiner received a Small Business Innovation Research (SBIR) contract from NASA's Johnson Space Center to develop the software as a data modeling tool for space mission applications. The technology was then built into the company's current products to provide decision support for business and marketing applications. With the tool, users gain valuable information about customers and prospects from existing data in order to increase sales and profitability. MarketMiner(R) is a registered trademark of MarketMiner, Inc.

  18. Application of Bayesian and cost benefit risk analysis in water resources management

    NASA Astrophysics Data System (ADS)

    Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.

    2016-03-01

    Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision making on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually impose short-term fines. The methodological steps are presented analytically, together with originally developed code. Such an application, and in such detail, is novel. The results indicate that the probability uncertainty is the driving issue that determines the optimal decision with each methodology, and depending on the unknown probability handling, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers to examine and compare different scenarios using two different approaches before making a decision considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
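
    The contrast the note draws between a plain expected-cost choice and a Bayesian decision that conditions on new information can be sketched as follows; the states, costs, and likelihoods are invented for illustration and are not the watershed figures from the paper:

```python
def expected_cost(costs, probs):
    """Expected cost of an action across uncertain states of nature."""
    return sum(c * p for c, p in zip(costs, probs))

def posterior(prior, likelihoods):
    """Bayes' rule over discrete states for a single observation."""
    joint = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# States of nature: [wet year, dry year]. All figures hypothetical.
prior = [0.8, 0.2]
actions = {
    "build reservoir": [120, 120],   # fixed cost regardless of state
    "defer":           [20, 400],    # small cost if wet, heavy fines if dry
}

def best_action(probs):
    return min(actions, key=lambda a: expected_cost(actions[a], probs))

choice_prior = best_action(prior)                   # cheapest under the prior
post = posterior(prior, likelihoods=[0.2, 0.8])     # a reading likelier in dry years
choice_post = best_action(post)                     # cheapest after updating
```

    As the abstract observes, the handling of the unknown probability drives the result: here the prior alone favours deferring, while conditioning on an observation that is more likely in dry years flips the optimal decision to building the reservoir.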

  19. An Investigation of Human Performance Model Validation

    DTIC Science & Technology

    2005-03-01

    of design decisions, what the costs and benefits are for each of the stages of analysis options. On the ’benefits’ side, the manager needs to know...confidence. But we also want to know that we are not expending any more effort (and other costs) than necessary to ensure that the right decision is...supported at each stage. Ultimately, we want to enable SBA managers to have confidence that they are selecting the right HPM tools and using them correctly in

  20. A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.

    PubMed

    Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew

    2016-01-01

    While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision-analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
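
    The link between wNCB and the number needed to treat (NNT) mentioned above can be sketched with binary outcomes; the weights and event rates below are invented, not the natalizumab case-study values:

```python
def nnt(p_control, p_treatment):
    """Number needed to treat: reciprocal of the absolute risk reduction."""
    return 1.0 / (p_control - p_treatment)

def wncb(outcomes):
    """Weighted net clinical benefit over binary outcomes: each term is
    weight * (control event rate - treatment event rate); a positive sum
    favours the treatment."""
    return sum(w * (pc - pt) for w, pc, pt in outcomes)

# (weight, control rate, treatment rate) -- illustrative numbers only.
outcomes = [
    (1.0, 0.30, 0.15),   # relapse: benefit outcome, full weight
    (0.5, 0.01, 0.04),   # serious adverse event: harm, half weight
]
net = wncb(outcomes)     # 1.0 * 0.15 + 0.5 * (-0.03) = 0.135
n = nnt(0.30, 0.15)      # 1 / 0.15, roughly 6.7 patients per relapse averted
```

    With a single benefit outcome and no weighting, wNCB reduces to the absolute risk reduction, i.e. the reciprocal of NNT, which is the equivalence the paper formalises.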

  1. Translating Cholesterol Guidelines Into Primary Care Practice: A Multimodal Cluster Randomized Trial

    PubMed Central

    Eaton, Charles B.; Parker, Donna R.; Borkan, Jeffrey; McMurray, Jerome; Roberts, Mary B.; Lu, Bing; Goldman, Roberta; Ahern, David K.

    2011-01-01

    PURPOSE We wanted to determine whether an intervention based on patient activation and a physician decision support tool was more effective than usual care for improving adherence to National Cholesterol Education Program guidelines. METHODS A 1-year cluster randomized controlled trial was performed using 30 primary care practices (4,105 patients) in southeastern New England. The main outcome was the percentage of patients screened for hyperlipidemia and treated to their low-density lipoprotein (LDL) and non–high-density lipoprotein (HDL) cholesterol goals. RESULTS After 1 year of intervention, both randomized practice groups improved screening (89% screened), and 74% of patients in both groups were at their LDL and non-HDL cholesterol goals (P <.001). Using intent-to-treat analysis, we found no statistically significant differences between practice groups in screening or percentage of patients who achieved LDL and non-HDL cholesterol goals. Post hoc analysis showed practices that made high use of the patient activation kiosk were more likely to have patients screened (odds ratio [OR] = 2.54; 95% confidence interval [CI], 1.97–3.27) compared with those that made infrequent or no use. Additionally, physicians who made high use of decision support tools were more likely to have their patients at their LDL cholesterol goals (OR = 1.27; 95% CI, 1.07–1.50) and non-HDL goals (OR = 1.23; 95% CI, 1.04–1.46) than low-use or no-use physicians. CONCLUSION This study showed null results with the intent-to-treat analysis regarding the benefits of a patient activation tool and a decision support tool in improving cholesterol management in primary care practices. Post hoc analysis showed a potential benefit in practices that used the e-health tools more frequently in screening and management of dyslipidemia. 
Further research on how to incorporate and increase adoption of user-friendly, patient-centered e-health tools to improve screening and management of chronic diseases and their risk factors is warranted. PMID:22084264

  2. CERCLA-linked environmental impact and benefit analysis: Evaluating remedial alternatives for the Portland Harbor Superfund Site, Portland, Oregon, USA.

    PubMed

    McNally, Amanda D; Fitzpatrick, Anne G; Mirchandani, Sera; Salmon, Matthew; Edwards, Deborah A

    2018-01-01

    This analysis focused on evaluating the environmental consequences of remediation, providing indicators for the environmental quality pillar of the 3 "pillars" of the Portland Harbor Sustainability Project (PHSP) framework (the other 2 pillars are economic viability and social equity). The project comprised an environmental impact and benefit analysis (EIBA) and an EIBA-based cost-benefit analysis. Metrics developed in the EIBA were used to quantify and compare remedial alternatives' environmental benefits and impacts in the human and ecological domains, as a result of remedial actions (relative to no action). The cost-benefit results were used to evaluate whether remediation costs were proportionate or disproportionate to the environmental benefits. Alternatives B and D had the highest overall benefit scores, and Alternative F was disproportionately costly relative to its achieved benefits when compared to the other remedial alternatives. Indeed, the costlier alternatives with larger remedial footprints had lower overall EIBA benefit scores, because of substantially more air emissions, noise, and light impacts, and more disturbance to business, recreational access, and habitat during construction, compared to the less costly and smaller alternatives. Put another way, the adverse effects during construction tended to outweigh the long-term benefits, and the net environmental impacts of the larger remedial alternatives far outweighed their small incremental improvements in risk reduction. Results of this Comprehensive Environmental Response Compensation and Liability Act (CERCLA)-linked environmental analysis were integrated with indicators of economic and social impacts of remediation in a stakeholder values-based sustainability framework. 
These tools (EIBA, EIBA-based cost-benefit analysis, economic impact assessment, and the stakeholder values-based integration) provide transparent and quantitative evaluations of the benefits and impacts associated with remedial alternatives, and should be applied to complex remediation projects to aid environmental decision making. Integr Environ Assess Manag 2018;14:22-31. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).

  3. Employing broadband spectra and cluster analysis to assess thermal defoliation of cotton

    USDA-ARS?s Scientific Manuscript database

    Growers and field scouts need assistance in surveying cotton (Gossypium hirsutum L.) fields subjected to thermal defoliation to reap the benefits provided by this nonchemical defoliation method. A study was conducted to evaluate broadband spectral data and unsupervised classification as tools for s...

  4. The Exponential Expansion of Simulation: How Simulation has Grown as a Research Tool

    DTIC Science & Technology

    2012-09-01

    exponential growth of computing power. Although other analytic approaches also benefit from this trend, keyword searches of several scholarly search ... engines reveal that the reliance on simulation is increasing more rapidly. A descriptive analysis paints a compelling picture: simulation is frequently

  5. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  6. A decision framework for identifying models to estimate forest ecosystem services gains from restoration

    USGS Publications Warehouse

    Christin, Zachary; Bagstad, Kenneth J.; Verdone, Michael

    2016-01-01

    Restoring degraded forests and agricultural lands has become a global conservation priority. A growing number of tools can quantify ecosystem service tradeoffs associated with forest restoration. This evolving “tools landscape” presents a dilemma: more tools are available, but selecting appropriate tools has become more challenging. We present a Restoration Ecosystem Service Tool Selector (RESTS) framework that describes key characteristics of 13 ecosystem service assessment tools. Analysts enter information about their decision context, services to be analyzed, and desired outputs. Tools are filtered and presented based on five evaluative criteria: scalability, cost, time requirements, handling of uncertainty, and applicability to benefit-cost analysis. RESTS uses a spreadsheet interface but a web-based interface is planned. Given the rapid evolution of ecosystem services science, RESTS provides an adaptable framework to guide forest restoration decision makers toward tools that can help quantify ecosystem services in support of restoration.
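
    The filtering step RESTS performs can be sketched as a simple attribute match over a tool catalogue. The entries and attribute values below are invented stand-ins for the 13 tools and five evaluative criteria described above:

```python
# A toy catalogue in the spirit of the RESTS criteria; names and values
# are hypothetical, not taken from the actual RESTS spreadsheet.
TOOLS = [
    {"name": "ToolA", "cost": "free", "time": "days",
     "scalable": True, "uncertainty": True, "bca_ready": True},
    {"name": "ToolB", "cost": "paid", "time": "weeks",
     "scalable": True, "uncertainty": False, "bca_ready": True},
    {"name": "ToolC", "cost": "free", "time": "hours",
     "scalable": False, "uncertainty": False, "bca_ready": False},
]

def select_tools(catalogue, **required):
    """Return names of tools matching every required attribute."""
    return [t["name"] for t in catalogue
            if all(t.get(k) == v for k, v in required.items())]

# An analyst needing a free tool that supports benefit-cost analysis:
matches = select_tools(TOOLS, cost="free", bca_ready=True)
```

    The real framework layers the decision context and desired outputs on top of such filtering, but the core idea is the same: narrow a catalogue by the criteria the analyst cares about.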

  7. Mapping the Delivery of Societal Benefit through the International Arctic Observations Assessment Framework

    NASA Astrophysics Data System (ADS)

    Lev, S. M.; Gallo, J.

    2017-12-01

    The international Arctic scientific community has identified the need for a sustained and integrated portfolio of pan-Arctic Earth-observing systems. In 2017, an international effort was undertaken to develop the first ever Value Tree framework for identifying common research and operational objectives that rely on Earth observation data derived from Earth-observing systems, sensors, surveys, networks, models, and databases to deliver societal benefits in the Arctic. A Value Tree Analysis is a common tool used to support decision making processes and is useful for defining concepts, identifying objectives, and creating a hierarchical framework of objectives. A multi-level societal benefit area value tree establishes the connection from societal benefits to the set of observation inputs that contribute to delivering those benefits. A Value Tree that relies on expert domain knowledge from Arctic and non-Arctic nations, international researchers, Indigenous knowledge holders, and other experts to develop a framework to serve as a logical and interdependent decision support tool will be presented. Value tree examples that map the contribution of Earth observations in the Arctic to achieving societal benefits will be presented in the context of the 2017 International Arctic Observations Assessment Framework. These case studies will highlight specific observing products and capability groups where investment is needed to contribute to the development of a sustained portfolio of Arctic observing systems.

  8. Human performance cognitive-behavioral modeling: a benefit for occupational safety.

    PubMed

    Gore, Brian F

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  9. Human performance cognitive-behavioral modeling: a benefit for occupational safety

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  10. Single Event Analysis and Fault Injection Techniques Targeting Complex Designs Implemented in Xilinx-Virtex Family Field Programmable Gate Array (FPGA) Devices

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; LaBel, Kenneth; Kim, Hak

    2014-01-01

    An informative session regarding SRAM FPGA basics. Presenting a framework for fault injection techniques applied to Xilinx Field Programmable Gate Arrays (FPGAs). Introduce an overlooked time component that illustrates fault injection is impractical for most real designs as a stand-alone characterization tool. Demonstrate procedures that benefit from fault injection error analysis.

  11. Discriminant Analysis as a Tool for Admission Selection to Special Academic Programs. AIR 1986 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Kissel, Mary Ann

    The use of stepwise discriminant analysis as a means to select entering students who would benefit from a special program for the disadvantaged was studied. In fall 1984, 278 full-time black students were admitted as first-time students to a large urban university. Of the total, 200 entered a special program for the disadvantaged and 78 entered…

  12. Strategic thinking for radiology.

    PubMed

    Schilling, R B

    1997-08-01

    We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the effectiveness of the team, and the experience developed with the tools over time will determine the true benefits of the process. It has also been shown that with active use of the types of tools provided above, teams have learned to modify the tools for increased effectiveness and have created additional tools for specific purposes. Once individuals in the organization become committed to improving communication and to using tools/frameworks for solving problems as a team, effectiveness becomes boundless.

  13. Evaluating Opportunities to Improve Material and Energy Impacts in Commodity Supply Chains.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanes, Rebecca J.; Carpenter, Alberta

    When evaluated at the process level, next-generation technologies may be more energy and emissions intensive than current technology. However, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Material Flows through Industry (MFI) scenario modeling tool. The MFI tool is a cradle-to-gate linear network model of the U.S. industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology, increases in industrial energy efficiency, and substitution between functionally equivalent materials. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing a steel supply chain to the supply chains of several functionally equivalent materials. Several of the alternatives to the baseline steel supply chain include next-generation production technologies and materials. Results of the case study show that aluminum production scenarios can outperform the steel supply chain by using either an advanced smelting technology or an increased aluminum recycling rate. The next-generation material supply chains do not perform as well as either aluminum or steel, but may offer additional use phase reductions in energy and emissions that are outside the scope of the MFI tool. Future work will combine results from the MFI tool with a use phase analysis.

  14. Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking

    NASA Technical Reports Server (NTRS)

    Turgeon, Gregory; Price, Petra

    2010-01-01

    A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations to using SCADE, a commercially available tool for model checking, in comparison to using a proprietary tool that was studied previously [1] and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.

  15. Inertial focusing of microparticles and its limitations

    NASA Astrophysics Data System (ADS)

    Cruz, FJ; Hooshmand Zadeh, S.; Wu, ZG; Hjort, K.

    2016-10-01

    Microfluidic devices are useful tools for healthcare, biological and chemical analysis and materials synthesis amongst fields that can benefit from the unique physics of these systems. In this paper we studied inertial focusing as a tool for hydrodynamic sorting of particles by size. Theory and experimental results are provided as a background for a discussion on how to extend the technology to submicron particles. Different geometries and dimensions of microchannels were designed and simulation data was compared to the experimental results.

  16. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  17. A machine learning tool for re-planning and adaptive RT: A multicenter cohort investigation.

    PubMed

    Guidi, G; Maffei, N; Meduri, B; D'Angelo, E; Mistretta, G M; Ceroni, P; Ciarmatori, A; Bernabei, A; Maggi, S; Cardinali, M; Morabito, V E; Rosica, F; Malara, S; Savini, A; Orlandi, G; D'Ugo, C; Bunkheila, F; Bono, M; Lappi, S; Blasi, C; Lohr, F; Costi, T

    2016-12-01

    To predict which patients would benefit from adaptive radiotherapy (ART) and re-planning intervention, based on machine learning from anatomical and dosimetric variations in a retrospective dataset. 90 patients (pts) treated for head and neck (H&N) cancer formed a multicenter dataset. 41 pts (45.6%) were used for learning; 49 pts (54.4%) were used to test the tool. An in-house machine-learning classifier was developed to analyze volume and dose variations of the parotid glands (PG). Using deformable image registration (DIR) and GPU acceleration, patients' conditions were analyzed automatically. Support Vector Machines (SVM) were used for time-series evaluation. The "inadequate" class identified patients who might benefit from re-planning. Double-blind evaluation by two radiation oncologists (ROs) was carried out to validate the day/week selected for re-planning by the classifier. The cohort showed a mean PG volume reduction of 23.7±8.8%. During the first 3 weeks, 86.7% of cases showed PG deformation within the predefined tolerance, thus not requiring re-planning. From the 4th week, an increasing number of pts would potentially benefit from re-planning: a mean of 58% of cases, with an inter-center variability of 8.3%, showed "inadequate" conditions. 11% of cases showed "bias" output due to DIR and script failure; 6% showed "warning" output due to potential positioning issues. Comparing the re-planning times suggested by the tool with those recommended by the ROs, the 4th week appears the most favorable time in 70% of cases. An SVM-based decision-making tool was applied to address the challenges of ART. This retrospective analysis identified the patients who would benefit from ART and the ideal time for re-planning intervention. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
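The study's classifier is a trained SVM over DIR-derived time series; as a purely illustrative stand-in, the four output classes named in the abstract can be wired into a simple threshold rule (the -20% volume-change tolerance is an assumed value, not the learned SVM decision boundary):

```python
def classify_week(pg_volume_change_pct, dir_succeeded=True, positioning_ok=True,
                  tolerance_pct=-20.0):
    """Classify one week of parotid-gland (PG) monitoring.

    Class names follow the abstract ('adequate', 'inadequate', 'bias',
    'warning'); the tolerance is illustrative, not the study's trained
    SVM boundary.
    """
    if not dir_succeeded:
        return "bias"        # DIR or script failure
    if not positioning_ok:
        return "warning"     # potential patient-positioning issue
    if pg_volume_change_pct < tolerance_pct:
        return "inadequate"  # candidate for re-planning
    return "adequate"        # within predefined tolerance
```

In the study, a week classified "inadequate" flags the patient for RO review and possible re-planning, most often around the 4th week.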

  18. Integrating Social Media Technologies in Higher Education: Costs-Benefits Analysis

    ERIC Educational Resources Information Center

    Okoro, Ephraim

    2012-01-01

    Social networking and electronic channels of communication are effective tools in the process of teaching and learning and have increasingly improved the quality of students' learning outcomes in higher education in recent years. The process encourages students' active engagement, collaboration, and participation in class activities and group…

  19. Nomogram for predicting the benefit of neoadjuvant chemoradiotherapy for patients with esophageal cancer: a SEER-Medicare analysis.

    PubMed

    Eil, Robert; Diggs, Brian S; Wang, Samuel J; Dolan, James P; Hunter, John G; Thomas, Charles R

    2014-02-15

    The survival impact of neoadjuvant chemoradiotherapy (CRT) on esophageal cancer remains difficult to establish for specific patients. The aim of the current study was to create a Web-based prediction tool providing individualized survival projections based on tumor and treatment data. Patients diagnosed with esophageal cancer between 1997 and 2005 were selected from the Surveillance, Epidemiology, and End Results (SEER)-Medicare database. The covariates analyzed were sex, T and N classification, histology, total number of lymph nodes examined, and treatment with esophagectomy or CRT followed by esophagectomy. After propensity score weighting, a log-logistic regression model for overall survival was selected based on the Akaike information criterion. A total of 824 patients with esophageal cancer who were treated with esophagectomy or trimodal therapy met the selection criteria. On multivariate analysis, age, sex, T and N classification, number of lymph nodes examined, treatment, and histology were found to be significantly associated with overall survival and were included in the regression analysis. Preoperative staging data and final surgical margin status were not available within the SEER-Medicare data set and therefore were not included. The model predicted that patients with T4 or lymph node disease benefitted from CRT. The internally validated concordance index was 0.72. The SEER-Medicare database of patients with esophageal cancer can be used to produce a survival prediction tool that: 1) serves as a counseling and decision aid to patients and 2) assists in risk modeling. Patients with T4 or lymph node disease appeared to benefit from CRT. This nomogram may underestimate the benefit of CRT due to its variable downstaging effect on pathologic stage. It is available at skynet.ohsu.edu/nomograms. © 2013 American Cancer Society.
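The log-logistic model class named above has a convenient closed-form survival function; a minimal sketch (the parameter values are illustrative only, not the fitted SEER-Medicare coefficients behind the nomogram):

```python
def log_logistic_survival(t, alpha, beta):
    """Survival function of the log-logistic model: S(t) = 1/(1 + (t/alpha)**beta).

    alpha is the median survival time (S(alpha) = 0.5 for any beta);
    beta controls the shape. Values used below are illustrative, not
    the study's fitted coefficients.
    """
    return 1.0 / (1.0 + (t / alpha) ** beta)
```

In a regression setting like the study's, alpha is modeled as a function of the covariates (age, sex, T/N classification, histology, nodes examined, treatment), so each patient gets an individualized survival curve.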

  20. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  1. Handling value added tax (VAT) in economic evaluations: should prices include VAT?

    PubMed

    Bech, Mickael; Christiansen, Terkel; Gyrd-Hansen, Dorte

    2006-01-01

    In health economic evaluations, value added tax is commonly treated as a transfer payment. Following this argument, resources are valued equal to their net-of-tax prices in economic evaluations applying a societal perspective. In this article we argue that if there is the possibility that a new healthcare intervention may expand the healthcare budget, the social cost of input factors should be the gross-of-tax prices and not the net-of-tax prices. The rising interest in cost-benefit analysis and the use of absolute thresholds, net benefit estimates and acceptability curves in cost-effectiveness analysis makes this argument highly relevant for an appropriate use of these tools in prioritisation.
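The practical consequence of the argument is easy to illustrate numerically; a sketch assuming a 25% VAT rate and hypothetical cost and threshold figures (none taken from the article):

```python
def cost_per_unit_benefit(net_cost, benefit_units, vat_rate=0.0):
    """Cost-effectiveness ratio using net-of-tax prices (vat_rate=0) or
    gross-of-tax prices. The 25% rate used below is illustrative."""
    return net_cost * (1.0 + vat_rate) / benefit_units

net_ratio = cost_per_unit_benefit(40_000, 1.0)          # net of tax
gross_ratio = cost_per_unit_benefit(40_000, 1.0, 0.25)  # gross of tax

# Against a fixed acceptability threshold (say 45,000 per unit of
# benefit), the VAT treatment alone can flip the accept/reject decision.
```

This is exactly the situation the authors highlight: with absolute thresholds or net-benefit estimates, whether input prices include VAT can change the conclusion.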

  2. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Cost Benefit Analysis Modeling Tool for Electric vs. ICE Airport Ground Support Equipment – Development and Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James Francfort; Kevin Morrow; Dimitri Hochard

    2007-02-01

    This report documents efforts to develop a computer tool for modeling the economic payback for comparative airport ground support equipment (GSE) that are propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user’s manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
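At its core, the economic payback comparison the report describes reduces to recovering the electric GSE's purchase premium through operating savings; a minimal sketch with hypothetical figures (the actual tool also models tailpipe emissions and charging-infrastructure costs):

```python
def simple_payback_years(extra_capital_cost, annual_operating_savings):
    """Years for electric GSE operating savings (fuel, maintenance) to
    recover its purchase premium over ICE GSE. Dollar figures used below
    are hypothetical, not values from the report."""
    if annual_operating_savings <= 0:
        return float("inf")  # never pays back
    return extra_capital_cost / annual_operating_savings

# e.g. a $30,000 purchase premium for an electric baggage tractor,
# recovered by $7,500/yr in fuel and maintenance savings
payback = simple_payback_years(30_000, 7_500)
```

A fleet operator would compare this payback period against the equipment's service life to decide whether electrification pays off.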

  4. Strategic Planning and Energy Options Analysis for the Fort Peck Assiniboine and Sioux Tribes. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Jim S; Greenwood Village, CO 80112

    2007-03-31

    Strategic Planning and Energy Options Analysis provides the Fort Peck Tribes with a tool to build analytical capabilities and local capacity to develop their natural and energy resource potential for the benefit of the tribal community. Each resource is identified irrespective of its development potential and is viewed in absolute terms, resulting in a comprehensive resource assessment for Tribal energy planning.

  5. The multifactorial role of the 3Rs in shifting the harm-benefit analysis in animal models of disease.

    PubMed

    Graham, Melanie L; Prescott, Mark J

    2015-07-15

    Ethics on animal use in science in Western society is based on utilitarianism, weighing the harms and benefits to the animals involved against those of the intended human beneficiaries. The 3Rs concept (Replacement, Reduction, Refinement) is both a robust framework for minimizing animal use and suffering (addressing the harms to animals) and a means of supporting high quality science and translation (addressing the benefits). The ambiguity of basic research performed early in the research continuum can sometimes make harm-benefit analysis more difficult since anticipated benefit is often an incremental contribution to a field of knowledge. On the other hand, benefit is much more evident in translational research aimed at developing treatments for direct application in humans or animals suffering from disease. Though benefit may be easier to define, it should certainly not be considered automatic. Issues related to model validity seriously compromise experiments and have been implicated as a major impediment in translation, especially in complex disease models where harms to animals can be intensified. Increased investment and activity in the 3Rs is delivering new research models, tools and approaches with reduced reliance on animal use, improved animal welfare, and improved scientific and predictive value. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. The multifactorial role of the 3Rs in shifting the harm-benefit analysis in animal models of disease

    PubMed Central

    Graham, Melanie L.; Prescott, Mark J.

    2015-01-01

    Ethics on animal use in science in Western society is based on utilitarianism, weighing the harms and benefits to the animals involved against those of the intended human beneficiaries. The 3Rs concept (Replacement, Reduction, Refinement) is both a robust framework for minimizing animal use and suffering (addressing the harms to animals) and a means of supporting high quality science and translation (addressing the benefits). The ambiguity of basic research performed early in the research continuum can sometimes make harm-benefit analysis more difficult since anticipated benefit is often an incremental contribution to a field of knowledge. On the other hand, benefit is much more evident in translational research aimed at developing treatments for direct application in humans or animals suffering from disease. Though benefit may be easier to define, it should certainly not be considered automatic. Issues related to model validity seriously compromise experiments and have been implicated as a major impediment in translation, especially in complex disease models where harms to animals can be intensified. Increased investment and activity in the 3Rs is delivering new research models, tools and approaches with reduced reliance on animal use, improved animal welfare, and improved scientific and predictive value. PMID:25823812

  7. Decision-support tool for assessing biomanufacturing strategies under uncertainty: stainless steel versus disposable equipment for clinical trial material preparation.

    PubMed

    Farid, Suzanne S; Washbrook, John; Titchener-Hooker, Nigel J

    2005-01-01

    This paper presents the application of a decision-support tool, SIMBIOPHARMA, for assessing different manufacturing strategies under uncertainty for the production of biopharmaceuticals. SIMBIOPHARMA captures both the technical and business aspects of biopharmaceutical manufacture within a single tool that permits manufacturing alternatives to be evaluated in terms of cost, time, yield, project throughput, resource utilization, and risk. Its use for risk analysis is demonstrated through a hypothetical case study that uses the Monte Carlo simulation technique to imitate the randomness inherent in manufacturing subject to technical and market uncertainties. The case study addresses whether start-up companies should invest in a stainless steel pilot plant or use disposable equipment for the production of early phase clinical trial material. The effects of fluctuating product demands and titers on the performance of a biopharmaceutical company manufacturing clinical trial material are analyzed. The analysis highlights the impact of different manufacturing options on the range in possible outcomes for the project throughput and cost of goods and the likelihood that these metrics exceed a critical threshold. The simulation studies highlight the benefits of incorporating uncertainties when evaluating manufacturing strategies. Methods of presenting and analyzing information generated by the simulations are suggested. These are used to help determine the ranking of alternatives under different scenarios. The example illustrates the benefits to companies of using such a tool to improve management of their R&D portfolios so as to control the cost of goods.

  8. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  9. Using scenario analysis to determine managed care strategy.

    PubMed

    Krentz, S E; Gish, R S

    2000-09-01

    In today's volatile healthcare environment, traditional planning tools are inadequate to guide financial managers of provider organizations in developing managed care strategies. These tools often disregard the uncertainty surrounding market forces such as employee benefit structure, the future of Medicare managed care, and the impact of consumer behavior. Scenario analysis overcomes this limitation by acknowledging the uncertain healthcare environment and articulating a set of plausible alternative futures, thus supplying financial executives with the perspective to craft strategies that can improve the market position of their organizations. By being alert for trigger points that might signal the rise of a specific scenario, financial managers can increase their preparedness for changes in market forces.

  10. An Economic Analysis of Residential Photovoltaic Systems with and without Energy Storage

    NASA Astrophysics Data System (ADS)

    Kizito, Rodney

    Residential photovoltaic (PV) systems serve as a source of electricity generation separate from traditional utilities. Investment in residential PV systems offers several financial benefits, such as federal tax credit incentives for installation, net-metering credit for excess generated electricity fed back to the grid, and savings in price per kilowatt-hour (kWh) from PV generation versus the increasing conventional utility price per kWh. As much benefit as stand-alone PV systems present, incorporating energy storage yields even greater benefits. Energy storage (ES) can store unused PV energy from daytime periods of high solar supply but low consumption. This allows the investor to use the stored energy when the cost of conventional utility power is high, while also allowing excess stored energy to be sold back to the grid. This paper investigates the overall returns for investors in PV-only and ES-based PV systems using a return-on-investment (ROI) economic analysis. The analysis is carried out over three scenarios: (1) a residence without a PV system or ES, (2) a residence with just a PV system, and (3) a residence with both a PV system and ES. Because solar exposure varies across regions of the United States, this paper performs a separate analysis for eight of the top solar-market states, accounting for each state's specific solar generation capability. A Microsoft Excel tool is provided for computing the ROI in scenarios 2 and 3. A benefit-cost ratio (BCR) is used to depict the annual economic performance of the PV system (scenario 2) and the PV + ES system (scenario 3). The tool allows the user to adjust the variables and parameters to match their specific investment situation.
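The ROI and BCR metrics described above can be sketched as follows (all dollar figures are hypothetical, not values from the thesis or its Excel tool):

```python
def annual_bcr(annual_benefits, annual_costs):
    """Benefit-cost ratio for one year of operation; > 1 means benefits
    exceed costs for that year."""
    return annual_benefits / annual_costs

def simple_roi(total_savings, total_investment):
    """Return on investment over the analysis horizon."""
    return (total_savings - total_investment) / total_investment

# Hypothetical year for scenario 2 (PV only): avoided utility purchases
# plus net-metering export credit, against amortized installed cost
# after the federal tax credit. Figures are illustrative only.
benefits = 1_200 + 300   # $/yr: bill savings + export credit
costs = 1_000            # $/yr: amortized system cost
bcr = annual_bcr(benefits, costs)
```

Scenario 3 would add the battery's capital cost to the denominator and the value of time-shifted consumption to the numerator, with inputs varying by state to reflect regional solar exposure.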

  11. Combining real-time monitoring and knowledge-based analysis in MARVEL

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.

    1993-01-01

    Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.

  12. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  13. Benefits and Pitfalls: Simple Guidelines for the Use of Social Networking Tools in K-12 Education

    ERIC Educational Resources Information Center

    Huffman, Stephanie

    2013-01-01

    The article will outline a framework for the use of social networking tools in K-12 education framed around four thought-provoking questions: 1) what are the benefits and pitfalls of using social networking tools in P-12 education, 2) how do we plan effectively for the use of social networking tools, 3) what role does professional development play…

  14. Analoguing Creativity & Culture: A Method for Metaphors.

    ERIC Educational Resources Information Center

    Thompson, Timothy N.

    Adding to the benefits of using metaphors as tools, "analoguing" (a method of analysis that focuses on metaphors for meanings in use and meanings of metaphors in use) helps avoid excessive categorization and separation by looking for unities and patterns in phenomena rather than for divisions. Six months of observation of patterns of…

  15. Visibility into the Work: TQM Work Process Analysis with HPT and ISD.

    ERIC Educational Resources Information Center

    Beagles, Charles A.; Griffin, Steven L.

    2003-01-01

    Discusses the use of total quality management (TQM), work process flow diagrams, and ISD (instructional systems development) tools with HPT (human performance technology) to address performance gaps in the Veterans Benefits Administration (VBA). Describes performance goals, which were to improve accuracy and reduce backlog of claim files. (LRW)

  16. Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL

    NASA Technical Reports Server (NTRS)

    Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo

    2011-01-01

    Often, repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively process their contents manually to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experience applying such methods in three areas (historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies at JPL) has shown that obtaining useful results requires substantial unanticipated effort, from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and the important lessons learned from preparing and applying text mining to large unstructured system artifacts at JPL, with the aim of benefiting future text mining applications in similar problem domains and, we hope, in broader areas of application.

  17. Cost-benefit and effectiveness analysis of rapid testing for MRSA carriage in a hospital setting.

    PubMed

    Henson, Gay; Ghonim, Elham; Swiatlo, Andrea; King, Shelia; Moore, Kimberly S; King, S Travis; Sullivan, Donna

    2014-01-01

    A cost-effectiveness analysis was conducted comparing the polymerase chain reaction (PCR) assay and traditional microbiological culture as screening tools for the identification of methicillin-resistant Staphylococcus aureus (MRSA) in patients admitted to the pediatric and surgical intensive care units (PICU and SICU) at a 722-bed academic medical center. In addition, the cost benefits of identifying colonized MRSA patients were determined. The cost-effectiveness analysis employed actual hospital and laboratory costs, not patient costs. The actual cost of the PCR assay was higher than that of microbiological culture identification of MRSA ($602.95 versus $364.30 per positive carrier identified); however, this does not reflect the shorter turnaround time of PCR assays compared with traditional culture techniques. Patient costs were determined indirectly in the cost-benefit analysis of clinical outcome. There was a reduction in MRSA hospital-acquired infection (3.5 MRSA HAI/month without screening versus 0.6/month with screening by PCR). A cost-benefit analysis based on differences in length of stay (LOS) suggests an associated savings in hospitalization costs: MRSA HAI with a 29.5-day median LOS at $63,810 versus MRSA identified on admission with a 6-day median LOS at $14,561, a difference of $49,249 per hospitalization. Although this pilot study was small, and it is not possible to directly relate the cost-effectiveness and cost-benefit analyses because of confounding factors such as patients' underlying morbidity and mortality, a reduction of 2.9 MRSA HAI/month associated with PCR screening suggests potential savings in hospitalization costs of $142,822 per month.
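The headline monthly savings figure follows directly from the abstract's own numbers and can be checked in a few lines:

```python
# Figures taken from the abstract above.
los_cost_hai = 63_810       # $ per MRSA HAI hospitalization (29.5-day median LOS)
los_cost_screened = 14_561  # $ when MRSA is identified on admission (6-day median LOS)
hai_reduction_per_month = 3.5 - 0.6  # MRSA HAI/month: no screening vs. PCR screening

saving_per_case = los_cost_hai - los_cost_screened      # $49,249 per hospitalization
monthly_saving = hai_reduction_per_month * saving_per_case
```

Multiplying the 2.9 avoided infections per month by the $49,249 per-case difference reproduces the reported ~$142,822/month figure.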

  18. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  19. Implementation and impact of an online tool used in primary care to improve access to financial benefits for patients: a study protocol.

    PubMed

    Aery, Anjana; Rucchetto, Anne; Singer, Alexander; Halas, Gayle; Bloch, Gary; Goel, Ritika; Raza, Danyaal; Upshur, Ross E G; Bellaire, Jackie; Katz, Alan; Pinto, Andrew David

    2017-10-22

    Addressing the social determinants of health has been identified as crucial to reducing health inequities. However, few evidence-based interventions exist. This study emerges from an ongoing collaboration between physicians, researchers and a financial literacy organisation. Our study will answer the following: Is an online tool that improves access to financial benefits feasible and acceptable? Can such a tool be integrated into clinical workflow? What are patient perspectives on the tool and what is the short-term impact on access to benefits? An advisory group made up of patients living on low incomes and representatives from community agencies supports this study. We will recruit three primary care sites in Toronto, Ontario and three in Winnipeg, Manitoba that serve low-income communities. We will introduce clinicians to screening for poverty and how benefits can increase income. Health providers will be encouraged to use the tool with any patient seen. The health provider and patient will complete the online tool together, generating a tailored list of benefits and resources to assist with obtaining these benefits. A brief survey on this experience will be administered to patients after they complete the tool, as well as a request to contact them in 1 month. Those who agree to be contacted will be interviewed on whether the intervention improved access to financial benefits. We will also administer an online survey to providers and conduct focus groups at each site. Key ethical concerns include that patients may feel discomfort when being asked about their financial situation, may feel obliged to complete the tool and may have their expectations falsely raised about receiving benefits. Providers will be trained to address each of these concerns. We will share our findings with providers and policy-makers interested in addressing the social determinants of health within healthcare settings. Clinicaltrials.gov: NCT02959866. Registered 7 November 2016. Retrospectively registered. Pre-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. Assessment tools for unrecognized myocardial infarction: a cross-sectional analysis of the REasons for geographic and racial differences in stroke population

    PubMed Central

    2013-01-01

    Background Routine electrocardiograms (ECGs) are not recommended for asymptomatic patients because the potential harms are thought to outweigh any benefits. Assessment tools to identify high risk individuals may improve the harm versus benefit profile of screening ECGs. In particular, people with unrecognized myocardial infarction (UMI) have elevated risk for cardiovascular events and death. Methods Using logistic regression, we developed a basic assessment tool among 16,653 participants in the REasons for Geographic and Racial Differences in Stroke (REGARDS) study using demographics, self-reported medical history, blood pressure, and body mass index and an expanded assessment tool using information on 51 potential variables. UMI was defined as electrocardiogram evidence of myocardial infarction without a self-reported history (n = 740). Results The basic assessment tool had a c-statistic of 0.638 (95% confidence interval 0.617 - 0.659) and included age, race, smoking status, body mass index, systolic blood pressure, and self-reported history of transient ischemic attack, deep vein thrombosis, falls, diabetes, and hypertension. A predicted probability of UMI > 3% provided a sensitivity of 80% and a specificity of 30%. The expanded assessment tool had a c-statistic of 0.654 (95% confidence interval 0.634-0.674). Because of the poor performance of these assessment tools, external validation was not pursued. Conclusions Despite examining a large number of potential correlates of UMI, the assessment tools did not provide a high level of discrimination. These data suggest defining groups with high prevalence of UMI for targeted screening will be difficult. PMID:23530553
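The reported 80% sensitivity and 30% specificity at the >3% predicted-probability cutoff come from a standard threshold calculation, sketched here on toy data (the probabilities and labels below are illustrative, not REGARDS data):

```python
def sens_spec_at_threshold(probs, labels, threshold):
    """Sensitivity and specificity of a probability cutoff, as used for
    the >3% UMI cutoff in the abstract. labels: 1 = UMI, 0 = no UMI."""
    tp = sum(1 for p, y in zip(probs, labels) if p > threshold and y == 1)
    fn = sum(1 for p, y in zip(probs, labels) if p <= threshold and y == 1)
    tn = sum(1 for p, y in zip(probs, labels) if p <= threshold and y == 0)
    fp = sum(1 for p, y in zip(probs, labels) if p > threshold and y == 0)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Toy predicted probabilities and true UMI status.
probs = [0.01, 0.02, 0.04, 0.05, 0.02, 0.06, 0.01, 0.035]
labels = [0,    0,    1,    1,    1,    1,    0,    0]
sens, spec = sens_spec_at_threshold(probs, labels, 0.03)
```

Sweeping the threshold over all observed probabilities traces out the ROC curve whose area is the c-statistic reported in the study (0.638 for the basic tool).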

  1. Survey of Ambient Air Pollution Health Risk Assessment Tools.

    PubMed

    Anenberg, Susan C; Belova, Anna; Brandt, Jørgen; Fann, Neal; Greco, Sue; Guttikunda, Sarath; Heroux, Marie-Eve; Hurley, Fintan; Krzyzanowski, Michal; Medina, Sylvia; Miller, Brian; Pandey, Kiran; Roos, Joachim; Van Dingenen, Rita

    2016-09-01

    Designing air quality policies that improve public health can benefit from information about air pollution health risks and impacts, which include respiratory and cardiovascular diseases and premature death. Several computer-based tools help automate air pollution health impact assessments and are being used in a variety of contexts. Expanding information gathered for a May 2014 World Health Organization expert meeting, we survey 12 multinational air pollution health impact assessment tools, categorize them according to key technical and operational characteristics, and identify limitations and challenges. Key characteristics include spatial resolution, pollutants and health effect outcomes evaluated, and method for characterizing population exposure, as well as tool format, accessibility, complexity, and degree of peer review and application in policy contexts. While many of the tools use common data sources for concentration-response associations, population, and baseline mortality rates, they vary in the exposure information source, format, and degree of technical complexity. We find that there is an important tradeoff between technical refinement and accessibility for a broad range of applications. Analysts should apply tools that provide the appropriate geographic scope, resolution, and maximum degree of technical rigor for the intended assessment, within resource constraints. A systematic intercomparison of the tools' inputs, assumptions, calculations, and results would be helpful to determine the appropriateness of each for different types of assessment. Future work would benefit from accounting for multiple uncertainty sources and integrating ambient air pollution health impact assessment tools with those addressing other related health risks (e.g., smoking, indoor pollution, climate change, vehicle accidents, physical activity). © 2016 Society for Risk Analysis.

  2. Vibrations Detection in Industrial Pumps Based on Spectral Analysis to Increase Their Efficiency

    NASA Astrophysics Data System (ADS)

    Rachid, Belhadef; Hafaifa, Ahmed; Boumehraz, Mohamed

    2016-03-01

    Spectral analysis is the key tool for the study of vibration signals in rotating machinery. In this work, vibration analysis applied to the conditional preventive maintenance of such machines is proposed, as part of resolving problems related to vibration detection in the organs of these machines. The vibration signal of a centrifugal pump was processed to demonstrate the benefits of the proposed approach. The results present the estimation of the pump vibration signal using the Fourier transform technique, compared with spectral analysis methods based on the Prony approach.
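    A minimal illustration of the Fourier side of this workflow (the Prony-based estimator is not reproduced here, and the signal and defect frequency are invented): synthesize a noisy vibration signal and locate its dominant spectral peak.

```python
# Sketch: detect the dominant vibration frequency of a simulated pump
# signal via the FFT. The sampling rate and 120 Hz "defect" tone are
# assumptions for illustration.

import numpy as np

fs = 1000.0                       # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)     # one second of signal
defect_hz = 120.0                 # assumed fault-related frequency
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * defect_hz * t) + 0.3 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))          # one-sided magnitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)       # bin center frequencies
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
```

    Even with noise, the coherent tone dominates the spectrum, which is why spectral monitoring of rotating machinery works well for conditional maintenance.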

  3. Probing the benefits of real-time tracking during cancer care

    PubMed Central

    Patel, Rupa A.; Klasnja, Predrag; Hartzler, Andrea; Unruh, Kenton T.; Pratt, Wanda

    2012-01-01

    People with cancer experience many unanticipated symptoms and struggle to communicate them to clinicians. Although researchers have developed patient-reported outcome (PRO) tools to address this problem, such tools capture retrospective data intended for clinicians to review. In contrast, real-time tracking tools with visible results for patients could improve health outcomes and communication with clinicians, while also enhancing patients’ symptom management. To understand potential benefits of such tools, we studied the tracking behaviors of 25 women with breast cancer. We provided 10 of these participants with a real-time tracking tool that served as a “technology probe” to uncover behaviors and benefits from voluntary use. Our findings showed that while patients’ tracking behaviors without a tool were fragmented and sporadic, these behaviors with a tool were more consistent. Participants also used tracked data to see patterns among symptoms, feel psychosocial comfort, and improve symptom communication with clinicians. We conclude with design implications for future real-time tracking tools. PMID:23304413

  4. Digital Support Platform: a qualitative research study investigating the feasibility of an internet-based, postdiagnostic support platform for families living with dementia

    PubMed Central

    Killin, Lewis O J; Russ, Tom C; Surdhar, Sushee Kaur; Yoon, Youngseo; McKinstry, Brian; Gibson, Grant; MacIntyre, Donald J

    2018-01-01

    Objectives: To establish the feasibility of the Digital Support Platform (DSP), an internet-based, postdiagnostic tool designed for families living with a diagnosis of dementia. Design: Qualitative methods using normalisation process theory as an analysis framework for semistructured interview transcriptions. Setting: A community care setting in South-East Scotland. Participants: We interviewed 10 dyads of people with Alzheimer's, vascular or mixed dementia (PWD) and their family carers, who had been given and had used the DSP for at least 2 months. Results: Our analysis revealed that the DSP was predominantly understood and used by the carers rather than the PWD, and was used alongside tools and methods they already used to care for their relative. The DSP was interpreted as a tool that may be of benefit to those experiencing later stages of dementia or with physical care needs. Carers stated that the DSP may be of benefit in the future, reflecting a disinclination to prepare for or anticipate future needs rather than focus on the needs present at the time of distribution. PWD spoke positively about an interest in learning to use technology more effectively and enjoyed having their own tablet devices. Conclusions: The DSP was not wholly appropriate for families living with dementia in its early stages. The views of carers confirmed that postdiagnostic support was valued, but emphasised the importance of tailoring this support to the exact needs and current arrangements of families. There may be a benefit to introducing, encouraging, providing and teaching internet-enabled technology to those PWD who do not currently have access. Training should be provided when introducing new technology to PWD. PMID:29654028

  5. Valuing vaccines using value of statistical life measures.

    PubMed

    Laxminarayan, Ramanan; Jamison, Dean T; Krupnick, Alan J; Norheim, Ole F

    2014-09-03

    Vaccines are effective tools to improve human health, but resources to pursue all vaccine-related investments are lacking. Benefit-cost and cost-effectiveness analysis are the two major methodological approaches used to assess the impact, efficiency, and distributional consequences of disease interventions, including those related to vaccinations. Childhood vaccinations can have important non-health consequences for productivity and economic well-being through multiple channels, including school attendance, physical growth, and cognitive ability. Benefit-cost analysis would capture such non-health benefits; cost-effectiveness analysis does not. Standard cost-effectiveness analysis may grossly underestimate the benefits of vaccines. A specific willingness-to-pay measure is based on the notion of the value of a statistical life (VSL), derived from trade-offs people are willing to make between fatality risk and wealth. Such methods have been used widely in the environmental and health literature to capture the broader economic benefits of improving health, but reservations remain about their acceptability. These reservations remain mainly because the methods may reflect ability to pay, and hence be discriminatory against the poor. However, willingness-to-pay methods can be made sensitive to income distribution by using appropriate income-sensitive distributional weights. Here, we describe the pros and cons of these methods and how they compare against standard cost-effectiveness analysis using pure health metrics, such as quality-adjusted life years (QALYs) and disability-adjusted life years (DALYs), in the context of vaccine priorities. We conclude that if appropriately used, willingness-to-pay methods will not discriminate against the poor, and they can capture important non-health benefits such as financial risk protection, productivity gains, and economic wellbeing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. An Excel Spreadsheet Model for States and Districts to Assess the Cost-Benefit of School Nursing Services.

    PubMed

    Wang, Li Yan; O'Brien, Mary Jane; Maughan, Erin D

    2016-11-01

    This paper describes a user-friendly, Excel spreadsheet model and two data collection instruments constructed by the authors to help states and districts perform cost-benefit analyses of school nursing services delivered by full-time school nurses. Prior to applying the model, states or districts need to collect data using two forms: "Daily Nurse Data Collection Form" and the "Teacher Survey." The former is used to record daily nursing activities, including number of student health encounters, number of medications administered, number of student early dismissals, and number of medical procedures performed. The latter is used to obtain estimates for the time teachers spend addressing student health issues. Once inputs are entered in the model, outputs are automatically calculated, including program costs, total benefits, net benefits, and benefit-cost ratio. The spreadsheet model, data collection tools, and instructions are available at the NASN website ( http://www.nasn.org/The/CostBenefitAnalysis ).
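    The model's output arithmetic is straightforward to sketch. The figures, variable names, and benefit categories below are illustrative assumptions, not the NASN spreadsheet's actual formulas or data:

```python
# Illustrative cost-benefit arithmetic only. Given program costs and a
# list of monetized benefit streams, compute total benefits, net
# benefits, and the benefit-cost ratio.

def cost_benefit_summary(program_cost, benefits):
    total_benefits = sum(benefits)
    net_benefits = total_benefits - program_cost
    benefit_cost_ratio = total_benefits / program_cost
    return total_benefits, net_benefits, benefit_cost_ratio

# Hypothetical inputs: nurse program cost vs. monetized teacher time
# saved, early dismissals avoided, and medical procedures handled.
total, net, bcr = cost_benefit_summary(70_000, [90_000, 30_000, 20_000])
```

    A benefit-cost ratio above 1 (here 2.0) indicates the monetized benefits exceed program costs, which is the headline figure such spreadsheet models report to districts.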

  7. Icarus: visualizer for de novo assembly evaluation.

    PubMed

    Mikheenko, Alla; Valin, Gleb; Prjibelski, Andrey; Saveliev, Vladislav; Gurevich, Alexey

    2016-11-01

    Data visualization plays an increasingly important role in NGS data analysis. With advances in both sequencing and computational technologies, it has become a new bottleneck in genomics studies. Indeed, evaluation of de novo genome assemblies is one of the areas that can benefit from visualization. However, even though multiple quality assessment methods are now available, existing visualization tools are hardly suitable for this purpose. Here, we present Icarus, a novel genome visualizer for accurate assessment and analysis of genomic draft assemblies, which is based on the tool QUAST. Icarus can be used in studies where a related reference genome is available, as well as for non-model organisms. The tool is available online and as a standalone application at http://cab.spbu.ru/software/icarus. Contact: aleksey.gurevich@spbu.ru. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Development of a module for Cost-Benefit analysis of risk reduction measures for natural hazards for the CHANGES-SDSS platform

    NASA Astrophysics Data System (ADS)

    Berlin, Julian; Bogaard, Thom; Van Westen, Cees; Bakker, Wim; Mostert, Eric; Dopheide, Emile

    2014-05-01

    Cost-benefit analysis (CBA) is a well-known method used widely for the assessment of investments in both the private and public sectors. In the context of risk mitigation and the evaluation of risk reduction alternatives for natural hazards, it is an important means of evaluating the effectiveness of such efforts in terms of avoided monetary losses. However, the current method has some disadvantages related to the spatial distribution of the costs and benefits, the geographical distribution of the avoided damage and losses, and the variation in the areas that benefit in terms of invested money and avoided monetary risk. Decision-makers are often interested in how the costs and benefits are distributed among the different administrative units of a large area or region, so that they can compare and analyse the costs and benefits per administrative unit resulting from the implementation of risk reduction projects. In this work we first examine the cost-benefit procedure for natural hazards and how costs are assessed for several structural and non-structural risk reduction alternatives. We also examine the current problems of the method, such as the inclusion of cultural and social considerations that are complex to monetize, the problem of discounting future values using a defined interest rate, and the spatial distribution of costs and benefits. We further examine the additional benefits and the indirect costs associated with the implementation of risk reduction alternatives, such as the cost of an unattractive landscape (also called negative benefits). In the last part we examine the current tools and software used in natural hazard assessment with support for conducting CBA, and we propose design considerations for the implementation of the CBA module for the CHANGES-SDSS Platform, an initiative of the ongoing 7th Framework Programme project "CHANGES" of the European Commission.
Keywords: Risk management, Economics of risk mitigation, EU Flood Directive, resilience, prevention, cost benefit analysis, spatial distribution of costs and benefits
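    The discounting step the authors discuss can be sketched as a net-present-value calculation over the measure's lifetime; the interest rate, horizon, and cash flows below are invented for illustration:

```python
# Sketch of discounted CBA for a risk reduction measure: annual avoided
# losses are discounted at rate r, and the measure pays off when the
# net present value is positive (benefit-cost ratio > 1). All figures
# are hypothetical.

def npv(cashflows, rate):
    """Present value of (year, amount) cash flows at a given discount rate."""
    return sum(amount / (1 + rate) ** year for year, amount in cashflows)

upfront_cost = 1_000_000       # year-0 cost of the measure
annual_avoided_loss = 150_000  # expected avoided damage per year
horizon, r = 20, 0.04          # appraisal horizon (years), discount rate

benefits = npv([(t, annual_avoided_loss) for t in range(1, horizon + 1)], r)
net_present_value = benefits - upfront_cost
bc_ratio = benefits / upfront_cost
```

    The choice of discount rate is exactly the contested input the abstract flags: a higher r shrinks the present value of avoided losses that accrue far in the future.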

  9. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    NASA Astrophysics Data System (ADS)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and, more recently, new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). Insofar as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key to understanding how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: socio-economic LCAs are increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis" that uses a cost/benefit analysis framework but remains open to disciplines other than economics, and to the participation of stakeholders in constructing scenarios for the deployment of technologies and identifying alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose of assessing "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal.
    We think that, being pragmatic, SEA is also adaptive, which is a key quality for handling the novelty of the economic and social effects expected from nanotechnology.

  10. Evaluation and refinement of a handheld health information technology tool to support the timely update of bedside visual cues to prevent falls in hospitals.

    PubMed

    Teh, Ruth C-A; Visvanathan, Renuka; Ranasinghe, Damith; Wilson, Anne

    2018-06-01

    To evaluate clinicians' perspectives, before and after clinical implementation (i.e. trial) of a handheld health information technology (HIT) tool, incorporating an iPad device and automatically generated visual cues for bedside display, for falls risk assessment and prevention in hospital. This pilot study utilized mixed-methods research with focus group discussions and Likert-scale surveys to elicit clinicians' attitudes. The study was conducted across three phases within two medical wards of the Queen Elizabeth Hospital. Phase 1 (pretrial) involved focus group discussion (five staff) and surveys (48 staff) to elicit preliminary perspectives on tool use, benefits and barriers to use and recommendations for improvement. Phase 2 (tool trial) involved HIT tool implementation on two hospital wards over consecutive 12-week periods. Phase 3 (post-trial) involved focus group discussion (five staff) and surveys (29 staff) following tool implementation, with similar themes as in Phase 1. Qualitative data were evaluated using content analysis, and quantitative data using descriptive statistics and logistic regression analysis, with subgroup analyses on user status (P ≤ 0.05). Four findings emerged on clinicians' experience, positive perceptions, negative perceptions and recommendations for improvement of the tool. Pretrial, clinicians were familiar with using visual cues in hospital falls prevention. They identified potential benefits of the HIT tool in obtaining timely, useful falls risk assessment to improve patient care. During the trial, the wards differed in methods of tool implementation, resulting in lower uptake by clinicians on the subacute ward. Post-trial, clinicians remained supportive for incorporating the tool into clinical practice; however, there were issues with usability and lack of time for tool use. 
Staff who had not used the tool had less appreciation for it improving their understanding of patients' falls risk factors (odds ratio 0.12), or effectively preventing hospital falls (odds ratio 0.12). Clinicians' recommendations resulted in subsequent technological refinement of the tool, and provision of an additional iPad device for more efficient use. This study adds to the limited pool of knowledge about clinicians' attitudes toward health technology use in falls avoidance. Clinicians were willing to use the HIT tool, and their concerns about its usability were addressed in ongoing tool improvement. Including end-users in the development and refinement processes, as well as having high staff uptake of new technologies, is important in improving their acceptance and usage, and in maximizing beneficial feedback to further inform tool development.

  11. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg and with mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility, supporting phase III development of solifenacin-mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
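    A hedged sketch of the MCDA scoring idea: each regimen is scored on weighted, normalized attributes and ranked by the resulting clinical utility. The attributes, weights, and scores below are all invented for illustration; the published model's data are not reproduced.

```python
# Weighted-sum MCDA sketch (all numbers hypothetical): rank dose
# regimens by a clinical utility combining efficacy, safety, and
# tolerability.

def clinical_utility(scores, weights):
    """Weighted-sum MCDA score; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}
regimens = {
    "solifenacin 5 mg": {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.85},
    "mirabegron 50 mg": {"efficacy": 0.60, "safety": 0.85, "tolerability": 0.80},
    "combination 5+50": {"efficacy": 0.75, "safety": 0.75, "tolerability": 0.75},
}
ranked = sorted(regimens, key=lambda r: clinical_utility(regimens[r], weights),
                reverse=True)
```

    Making the weights explicit is what gives MCDA the transparency the authors emphasize: a different stakeholder weighting can be rerun and compared directly.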

  12. Freva - Freie Univ Evaluation System Framework for Scientific Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe

    2016-04-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high-performance computers to host customizable evaluation systems for research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web shell (shellinabox) adds a degree of freedom in the choice of working environment and can be used as a gateway to the research project's HPC system. Plugins can integrate their results, e.g. post-processed data, into the user's database, which allows post-processing plugins to feed statistical analysis plugins and fosters an active exchange between the plugin developers of a research project. Additionally, the history and configuration subsystem stores every analysis performed with the evaluation system in a database.
    Configurations and results of the tools can be shared among scientists via the shell or web system, so plugged-in tools benefit from transparency and reproducibility. Furthermore, if configurations match when an evaluation plugin is started, the system suggests using results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
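    The result-reuse behaviour described above (a matching configuration returns a previously computed result instead of recomputing it) can be sketched as configuration-hash memoization. This is an illustration of the idea only, not the actual Freva API; all names are invented.

```python
# Sketch: key each plugin run by a hash of its configuration; identical
# configurations reuse the stored result instead of recomputing.

import hashlib
import json

_history = {}  # configuration hash -> stored result

def run_plugin(analysis, config):
    """Run an analysis plugin, reusing any result with an identical config."""
    key = hashlib.sha256(json.dumps(config, sort_keys=True).encode()).hexdigest()
    if key not in _history:
        _history[key] = analysis(**config)
    return _history[key]

calls = []  # records how often the underlying analysis actually runs

def trend(variable, years):
    calls.append(variable)
    return f"trend({variable}, {years})"

a = run_plugin(trend, {"variable": "tas", "years": 30})
b = run_plugin(trend, {"years": 30, "variable": "tas"})  # same config, reordered
```

    Serializing the configuration with sorted keys makes the hash independent of parameter order, so two users who configure the same analysis differently still share one result.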

  13. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  14. Filtering Essays by Means of a Software Tool: Identifying Poor Essays

    ERIC Educational Resources Information Center

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2017-01-01

    Writing essays and receiving feedback can be useful for fostering students' learning and motivation. When faced with large class sizes, it is desirable to identify students who might particularly benefit from feedback. In this article, we tested the potential of Latent Semantic Analysis (LSA) for identifying poor essays. A total of 14 teaching…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sadie

    This fact sheet provides an overview of the benefits of using the RE Data Explorer tool to analyze and develop renewable energy zones. Renewable energy zones are developed through a transmission planning and approval process customized for renewable energy. RE Data Explorer analysis can feed into broader stakeholder discussions and allow stakeholders to easily visualize potential zones. Stakeholders can access pertinent data to inform transmission planning and enable investment.

  16. Reporting Data with "Over-the-Counter" Data Analysis Supports Improves Educators' Data Analyses

    ERIC Educational Resources Information Center

    Rankin, Jenny Grant

    2014-01-01

    The benefits of making data-informed decisions to improve learning rely on educators correctly interpreting given data. Many educators routinely misinterpret data, even at districts with proactive support for data use. The tool most educators use for data analyses, which is an information technology data system or its reports, typically reports…

  17. The Three Gorges Project: How sustainable?

    NASA Astrophysics Data System (ADS)

    Kepa Brian Morgan, Te Kipa; Sardelic, Daniel N.; Waretini, Amaria F.

    2012-08-01

    Summary: In 1984 the Government of China approved the decision to construct the Three Gorges Dam Project, the largest project since the Great Wall. The project had many barriers to overcome, and the decision was made at a time when sustainability was a relatively unknown concept. The decision to construct the Three Gorges Project remains contentious today, especially since Deputy Director of the Three Gorges Project Construction Committee, Wang Xiaofeng, stated that "We absolutely cannot relax our guard against ecological and environmental security problems sparked by the Three Gorges Project" (Bristow, 2007; McCabe, 2007). The question therefore was posed: how sustainable is the Three Gorges Project? Conventional approaches to sustainability assessment tend to use monetary-based assessment aligned to triple bottom line thinking. That is, projects are evaluated as trade-offs between economic, environmental and social costs and benefits. The question of sustainability is considered using such a traditional Cost-Benefit Analysis approach, as undertaken in 1988 by a CIPM-Yangtze Joint Venture, and the Mauri Model Decision Making Framework (MMDMF). The Mauri Model differs from other approaches in that sustainability performance indicators are considered independently from any particular stakeholder bias. Bias is then introduced subsequently as a sensitivity analysis on the raw results obtained. The MMDMF is unique in that it is based on the Māori concept of Mauri, the binding force between the physical and the spiritual attributes of something, or the capacity to support life in the air, soil, and water. This concept of Mauri is analogous to the Chinese concept of Qi, and there are many analogous concepts in other cultures. It is the universal relevance of Mauri that allows its use to assess sustainability. This research identified that the MMDMF was a strong complement to Cost-Benefit Analysis, which is not designed as a sustainability assessment tool in itself.
    The MMDMF is relevant for identifying areas of conflict, and as a decision support tool it can complement Cost-Benefit Analysis in assessing sustainability. The research concluded that, based on both models, the Three Gorges Project as understood in 1988, and incorporating more recent sustainability analysis, is contributing to enhanced sustainability.

  18. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  19. Extended benefit-cost analysis as an instrument of economic valuation in Petungkriyono forest ecosystem services

    NASA Astrophysics Data System (ADS)

    Damayanti, Irma; Nur Bambang, Azis; Retnaningsih Soeprobowati, Tri

    2018-05-01

    Petungkriyono is the last tropical forest in Java and harbours biodiversity, including rare flora and fauna, that must be maintained, managed and utilized in order to give meaning to humanity and sustainability. The services of the Petungkriyono forest ecosystem include goods supply, soil-water conservation, climate regulation, environmental purification and flora and fauna habitats. The approach of this study is a literature review of various earlier studies on the influence of economic valuation in determining conservation strategies for the Petungkriyono natural forest ecosystem in Pekalongan Regency. The aims of this study are to analyze the extended benefits and costs of natural forest ecosystems and to internalize them in decision making. The method for quantification and valuation of the forest ecosystem is cost-benefit analysis (CBA), a standard government economic appraisal tool in development economics. CBA offers the possibility of capturing the impacts of a project. Using substitution productivity values and extended benefit-cost analysis for commodities such as backwoods, pine wood, puspa wood and pine gum, together with water value, landslide-prevention structures and carbon sequestration, the total economic value is IDR 163,065,858,080, and the extended benefit-cost ratio in Petungkriyono is 281.35%. From this result, the local government of Pekalongan is expected to be highly motivated to preserve the existence of the Petungkriyono forest.

  20. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful for atmospheric PM10 concentrations, because complex nonlinear processes affect the production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); and (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, which is often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
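    The solution set the authors describe, the non-dominated (Pareto-efficient) trade-offs between emission-reduction cost and an air-quality indicator, can be sketched with a simple dominance filter. The scenarios and figures below are invented for illustration.

```python
# Pareto-filter sketch: each hypothetical emission-reduction scenario
# has a cost and an air-quality indicator, both to be minimized; keep
# only scenarios not dominated by any other.

def pareto_front(scenarios):
    """Return names of scenarios not dominated on (cost, aq), both minimized."""
    front = []
    for name, (cost, aq) in scenarios.items():
        dominated = any(c <= cost and a <= aq and (c, a) != (cost, aq)
                        for n, (c, a) in scenarios.items() if n != name)
        if not dominated:
            front.append(name)
    return sorted(front)

scenarios = {                      # (cost, PM10 indicator) - invented
    "no action":     (0.0, 48.0),
    "traffic only":  (2.0, 41.0),
    "industry only": (3.5, 42.0),  # costs more AND cleans less than "traffic only"
    "combined":      (5.0, 35.0),
}
efficient = pareto_front(scenarios)
```

    The filtered set is exactly what the abstract calls the "not-dominated scenarios": a decision maker then chooses along the front according to how much air-quality benefit is worth each extra unit of cost.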

  1. Industrial Technology Modernization Program. Project 20. Consolidation and Automation of Material and Tool Storage. Phase 2

    DTIC Science & Technology

    1987-06-15

    General Dynamics Fort Worth Division, Industrial Technology Modernization Program, Phase 2 Final Project Report, Project 20 (June 15, 1987). [Scanned front matter and table of contents; recoverable section headings include: System/Equipment/Machining Specifications; Vendor/Industry Analysis Findings; MIS Requirements/Improvements; Cost Benefit Analysis; Implementation.]

  2. Development of an Integrated Human Factors Toolkit

    NASA Technical Reports Server (NTRS)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts in selecting the human factors tools most appropriate to the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing, and human reliability domains. The analysis of each tool considers the design stage at which it is most appropriate, the degree of human factors expertise required, the experience needed with the tool and with the target job tasks, and other factors critical to successful use of the tool. The benefits of the Toolkit include improved safety, reliability, and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development of the Integrated Human Factors Toolkit.

  3. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.

  4. Learners' Perceptions of the Benefits of Voice Tool-Based Tasks on Their Spoken Performance

    ERIC Educational Resources Information Center

    Wilches, Astrid

    2014-01-01

    The purpose of this study is to investigate learners' perceptions of the benefits of tasks using voice tools to reinforce their oral skills. Additionally, this study seeks to determine what aspects of task design affected the students' perceptions. Beginner learners aged 18 to 36 with little or no experience in the use of technological tools for…

  5. Benefits of Efficient Windows | Efficient Windows Collaborative

    Science.gov Websites


  6. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA tool combines existing multi-criteria evaluation methods with modern decision analysis techniques: non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) are combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). The tool can accommodate a wide range of decision makers' preferences. Its user-friendly interface guides the decision maker through the sequential steps of site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, in which data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal, which also demonstrated the efficiency of the SMCDA tool in the decision-making process for selecting suitable MAR sites. Specific aspects of the tool, such as built-in default criteria, explicit decision steps, and flexibility in choosing different options, were key features that benefited the study. The new SMCDA tool can be augmented with groundwater flow and transport modeling to achieve a more comprehensive approach to selecting the best locations for MAR infiltration basins, recovery wells, and areas of groundwater protection. The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
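Of the methods combined in the tool, Weighted Linear Combination is the simplest to illustrate: each candidate site gets the weighted sum of its standardized criterion scores. The sites, criteria names, and weights below are hypothetical:

```python
def wlc_score(sites, weights):
    """Weighted Linear Combination: suitability = sum of weight * criterion.
    Criterion values are standardized to [0, 1]; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return {
        site: sum(weights[c] * v for c, v in criteria.items())
        for site, criteria in sites.items()
    }

# Hypothetical standardized criteria for three candidate MAR sites.
sites = {
    "A": {"infiltration": 0.9, "storage": 0.6, "source_distance": 0.4},
    "B": {"infiltration": 0.5, "storage": 0.8, "source_distance": 0.9},
    "C": {"infiltration": 0.3, "storage": 0.4, "source_distance": 0.7},
}
weights = {"infiltration": 0.5, "storage": 0.3, "source_distance": 0.2}
scores = wlc_score(sites, weights)
print(max(scores, key=scores.get))  # most suitable site under these weights
```

In the tool itself the weights would come from the AHP step rather than being set by hand.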

  7. IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean

    2014-01-01

    The RTCA standard for developing avionic software and obtaining certification credits (DO-178C) includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps in developing static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.

  8. A Framework for Creating a Function-based Design Tool for Failure Mode Identification

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Knowledge of potential failure modes during design is critical for failure prevention. Industries currently use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis, and Failure Modes, Effects and Criticality Analysis (FMECA), together with knowledge and experience, to determine potential failure modes. When new products are being developed, there is often insufficient knowledge of potential failure modes and/or insufficient experience to identify them all, leaving engineers unable to extract maximum benefit from the above procedures. This work describes a function-based failure identification methodology that acts as a storehouse of information and experience, providing useful information about the potential failure modes of the design under consideration and enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.

  9. Population Viability Analysis of Riverine Fishes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, P.; Chandler, J.; Jager, H.I.

    Many utilities face conflicts between two goals: cost-efficient hydropower generation and protecting riverine fishes. Research to develop ecological simulation tools that can evaluate alternative mitigation strategies in terms of their benefits to fish populations is vital to informed decision-making. In this paper, we describe our approach to population viability analysis of riverine fishes in general and Snake River white sturgeon in particular. We are finding that the individual-based modeling approach used in previous in-stream flow applications is well suited to addressing questions about the viability of species of concern for several reasons. Chief among these are: (1) the ability to represent the effects of individual variation in life history characteristics on predicted population viability; (2) the flexibility needed to quantify the ecological benefits of alternative flow management options by representing spatial and temporal variation in flow and temperature; and (3) the flexibility needed to quantify the ecological benefits of non-flow-related manipulations (i.e., passage, screening, and hatchery supplementation).

  10. Economic benefit evaluation for renewable energy transmitted by HVDC based on production simulation (PS) and analytic hierarchy process (AHP)

    NASA Astrophysics Data System (ADS)

    Zhang, Jinfang; Zheng, Kuan; Liu, Jun; Huang, Xinting

    2018-02-01

    To support renewable energy (RE) development in North and West China and keep RE accommodation at a reasonably high level, the traditional operation curves of HVDC links need to be adjusted to follow the output characteristics of RE, which helps to reduce both the curtailed electricity and the curtailment ratio of RE. In this paper, an economic benefit analysis method based on production simulation (PS) and the Analytic Hierarchy Process (AHP) is proposed. PS is the basic tool for analyzing the operation of the chosen power system, while AHP provides a suitable comparison among candidate schemes. For four different transmission curve combinations, the associated economic benefits were evaluated by PS and AHP. The results and related indices demonstrate the efficiency of the proposed method and validate that operating HVDC curves in an RE-following mode can reduce RE curtailment and improve economic operation.
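The AHP comparison among candidate schemes starts from a reciprocal pairwise matrix and reduces it to a priority vector; the row geometric-mean method is a common approximation to the principal-eigenvector weights. The matrix below is a made-up comparison of three transmission-curve schemes, not the paper's data:

```python
import math

def ahp_priorities(matrix):
    """Priority vector from a pairwise comparison matrix
    via the row geometric-mean approximation."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical Saaty-scale comparisons of three HVDC operation-curve schemes
# (matrix is reciprocal: a[j][i] == 1 / a[i][j]).
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     2],
    [1 / 5, 1 / 2, 1],
]
w = ahp_priorities(pairwise)
print([round(x, 3) for x in w])  # scheme 1 dominates under these judgments
```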

  11. Benefit-Cost Analysis of Undergraduate Education Programs: An Example Analysis of the Freshman Research Initiative.

    PubMed

    Walcott, Rebecca L; Corso, Phaedra S; Rodenbusch, Stacia E; Dolan, Erin L

    2018-01-01

    Institutions and administrators regularly have to make difficult choices about how best to invest resources to serve students. Yet economic evaluation, or the systematic analysis of the relationship between costs and outcomes of a program or policy, is relatively uncommon in higher education. This type of evaluation can be an important tool for decision makers considering questions of resource allocation. Our purpose with this essay is to describe methods for conducting one type of economic evaluation, a benefit-cost analysis (BCA), using an example of an existing undergraduate education program, the Freshman Research Initiative (FRI) at the University of Texas at Austin. Our aim is twofold: to demonstrate how to apply BCA methodologies to evaluate an education program and to conduct an economic evaluation of FRI in particular. We explain the steps of BCA, including assessment of costs and benefits, estimation of the benefit-cost ratio, and analysis of uncertainty. We conclude that the university's investment in FRI generates a positive return for students in the form of increased future earning potential. © 2018 R. L. Walcott et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
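The "analysis of uncertainty" step described above can be sketched as a Monte Carlo loop over the benefit-cost ratio. The cost and earnings-premium figures here are invented for illustration, not FRI's numbers:

```python
import random

random.seed(42)

def bcr(pv_benefits, pv_costs):
    """Benefit-cost ratio on present-value terms."""
    return pv_benefits / pv_costs

# Hypothetical program: fixed PV cost per student, uncertain PV benefit
# (future earnings premium) modeled as a normal distribution.
pv_cost = 10_000.0
draws = [bcr(random.gauss(14_000, 3_000), pv_cost) for _ in range(10_000)]
mean_bcr = sum(draws) / len(draws)
share_positive = sum(r > 1 for r in draws) / len(draws)
print(f"mean BCR {mean_bcr:.2f}, P(BCR > 1) = {share_positive:.0%}")
```

Reporting the probability that the ratio exceeds 1, rather than a single point estimate, is one standard way to present the uncertainty analysis.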

  12. Water use benefit index as a tool for community-based monitoring of water related trends in the Great Barrier Reef region

    NASA Astrophysics Data System (ADS)

    Smajgl, A.; Larson, S.; Hug, B.; De Freitas, D. M.

    2010-12-01

    This paper presents a tool for documenting and monitoring water use benefits in the Great Barrier Reef catchments that allows temporal and spatial comparison across the region. Water, water use benefits, and water allocations are currently receiving much attention from Australian policy makers and conservation practitioners. Because of the inherent complexity and variability in water quality, it is essential that scientific information be presented in a meaningful way to policy makers, managers and, ultimately, the general public, who have to live with the consequences of those decisions. We developed an inexpensive-to-populate and easily understandable water use benefit index as a tool for community-based monitoring of water-related trends in the Great Barrier Reef region. The index is based on a comparative list of selected water-related indices integrating attributes across the physico-chemical, economic, social, and ecological domains currently used in assessing water quality, water quantity, and water use benefits in Australia. Our findings indicate that the proposed index allows the identification of water performance indicators through temporal and spatial comparisons. Benefits for decision makers and conservation practitioners include a flexible way of prioritizing the domain of highest concern. The broader community benefits from a comprehensive and user-friendly tool that communicates changes in water quality trends more effectively.

  13. Discovering the Lost Ark of Possibilities: Bringing Visibility to the Invisible Art Form of Film Music in Your Music Classroom

    ERIC Educational Resources Information Center

    Keown, Daniel J.

    2015-01-01

    Conventional music learning in schools could benefit from the study of the music from films, television, and video games. This article offers practical applications for including film music as an outlet for analysis, an interdisciplinary compositional art form, a viable teaching tool, and an authentic performance/production experience. Music…

  14. The Use of a Serious Game and Academic Performance of Undergraduate Accounting Students: An Empirical Analysis

    ERIC Educational Resources Information Center

    Malaquias, Rodrigo Fernandes; Malaquias, Fernanda Francielle de Oliveira; Borges, Dermeval M., Jr.; Zambra, Pablo

    2018-01-01

    The literature on serious games (SGs) indicates that they are very useful tools to improve the teaching/learning process. In this paper, we analyze some potential benefits of a SG on academic performance of undergraduate accounting students. The database is comprised of scores obtained by students during an undergraduate discipline related with…

  15. Are Review Skills and Academic Writing Skills Related? An Exploratory Analysis via Multi Source Feedback Tools

    ERIC Educational Resources Information Center

    Razi, Salim

    2016-01-01

    Because students learn from each other as well as lecturers, it is important to create opportunities for collaboration in writing classes. Teachers now benefit from access to plagiarism detectors that can also provide feedback. This exploratory study considers the role of four review types, open and anonymous, involving the students themselves,…

  16. CoMET: Cost and Mass Evaluation Tool for Spacecraft and Mission Design

    NASA Technical Reports Server (NTRS)

    Bieber, Ben S.

    2005-01-01

    New technology in space exploration is often developed without complete knowledge of its impact. While the immediate benefits of a new technology are obvious, its indirect consequences, which ripple through the entire system, are harder to understand. CoMET is a technology evaluation tool designed to illuminate how specific technology choices affect a mission at each system level. CoMET uses simplified models for mass, power, and cost to analyze performance parameters of technologies of interest. The sensitivity analysis that CoMET provides shows whether developing a particular technology will substantially benefit the project. CoMET is an ongoing project approaching a web-based implementation phase. This year, development focused on the models for planetary daughter craft, such as atmospheric probes, blimps and balloons, and landers. These models are developed through research into historical data, well-established rules of thumb, and the engineering judgment of experts at JPL. The model is validated by corroboration with JPL advanced mission studies. Other enhancements to CoMET include adding launch vehicle analysis and integrating an updated cost model. When completed, CoMET will allow technological development to be focused on the areas that will most improve spacecraft performance.

  17. Depth of manual dismantling analysis: a cost-benefit approach.

    PubMed

    Achillas, Ch; Aidonis, D; Vlachokostas, Ch; Karagiannidis, A; Moussiopoulos, N; Loulos, V

    2013-04-01

    This paper presents a decision support tool for manufacturers and recyclers addressing end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost-benefit analysis concept is described analytically in order to determine which parts and/or components of an obsolete product should be non-destructively recovered for reuse and which should be recycled. The framework optimally determines the depth of disassembly for a given product on economic grounds. To this end, it embeds all relevant cost elements in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control, and warehousing. The tool can be part of the strategic decision-making process to maximize profitability or minimize end-of-life management costs. A case study demonstrating the model's applicability is presented for an electronic product that is typical in structure and material composition. Taking into account the market values of the pilot product's components, manual disassembly proves profitable, with the marginal revenue from recovered reusable materials estimated at 2.93-23.06 €, depending on the level of disassembly. Copyright © 2013 Elsevier Ltd. All rights reserved.
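The optimal-depth idea can be reduced to a one-pass scan: disassemble step by step, accumulate net revenue, and keep the depth where it peaks. The per-step values and costs below are invented, not the paper's case-study figures:

```python
def best_depth(steps):
    """Return (depth, net) maximizing cumulative net revenue.
    steps: list of (recovered_value_eur, step_cost_eur), one per level."""
    best_level, best_net, cumulative = 0, 0.0, 0.0
    for level, (value, cost) in enumerate(steps, start=1):
        cumulative += value - cost
        if cumulative > best_net:
            best_level, best_net = level, cumulative
    return best_level, best_net

# Hypothetical recovered value vs. labor/energy cost per disassembly level.
steps = [(8.0, 3.0), (6.0, 4.5), (12.0, 5.0), (2.0, 6.0)]
depth, net = best_depth(steps)
print(f"disassemble to level {depth}: net revenue {net:.2f} EUR")
```

The full model adds warehousing, depreciation, and quality-control terms, but the stopping logic is the same.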

  18. NASA Subsonic Rotary Wing Project-Multidisciplinary Analysis and Technology Development: Overview

    NASA Technical Reports Server (NTRS)

    Yamauchi, Gloria K.

    2009-01-01

    This slide presentation reviews the objectives of the Multidisciplinary Analysis and Technology Development (MDATD) in the Subsonic Rotary Wing project. The objectives are to integrate technologies and analyses to enable advanced rotorcraft and provide a roadmap to guide Level 1 and 2 research. The MDATD objectives will be met by conducting assessments of advanced technology benefits, developing new or enhanced design tools, and integrating Level 2 discipline technologies to develop and enable system-level analyses and demonstrations.

  19. Using Performance Tools to Support Experiments in HPC Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2014-01-01

    The high performance computing (HPC) community is working to address fault tolerance and resilience concerns for current and future large-scale computing platforms. This is driving enhancements in programming environments, specifically research on enhancing message passing libraries to support fault-tolerant computing capabilities. The community has also recognized that tools for resilience experimentation are greatly lacking. However, we argue that there are several parallels between performance tools and resilience tools. As such, we believe the rich set of HPC performance-focused tools can be extended (repurposed) to benefit the resilience community. In this paper, we describe the initial motivation to leverage standard HPC performance analysis techniques to aid in developing diagnostic tools that assist fault tolerance experiments for HPC applications. These diagnosis procedures help to provide context for the system when errors (failures) occur. We describe our initial work in leveraging an MPI performance trace tool to provide global context during fault injection experiments. Such tools will assist the HPC resilience community as they extend existing and new application codes to support fault tolerance.

  20. A new software tool for 3D motion analyses of the musculo-skeletal system.

    PubMed

    Leardini, A; Belvedere, C; Astolfi, L; Fantozzi, S; Viceconti, M; Taddei, F; Ensini, A; Benedetti, M G; Catani, F

    2006-10-01

    Many clinical and biomechanical research studies, particularly in orthopaedics, nowadays involve some form of movement analysis. Gait analysis, video-fluoroscopy of joint replacements, pre-operative planning, surgical navigation, and standard radiostereometry all require tools for easy access to three-dimensional graphical representations of rigid segment motion. Relevant data from this variety of sources need to be organised in structured forms, and registration, integration, and synchronisation of segment position data are additional necessities. With this aim, the present work exploits the features of a software tool recently developed within an EU-funded project ('Multimod') in a series of different research studies. Standard and advanced gait analysis of a normal subject, in vivo fluoroscopy-based three-dimensional motion of a replaced knee joint, patellar and ligament tracking on a knee specimen by a surgical navigation system, and the stem-to-femur migration pattern in a patient who underwent total hip replacement were analysed with standard techniques and all represented by this innovative software tool. Segment pose data obtained from these different techniques were successfully imported and organised in a hierarchical tree within the tool. Skeletal bony segments, prosthesis component models, and ligament links were registered successfully to the corresponding marker position data for effective three-dimensional animations. These were shown in various combinations, in different views, and from different perspectives, according to specific research interests. Bioengineering and medical professionals would thus be greatly facilitated in interpreting the motion analysis measurements necessary in their research fields, and would benefit from this software tool.

  1. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as by the available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of including or excluding onboard medical capabilities may help inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified the resources, tools, and skillsets required, along with associated criticality scores, to meet terrestrial, U.S.-specific ideal medical solutions for conditions of concern in exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created from this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method for comparing medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates of the likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future versions, more robust analysis tools may help inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  2. Poverty Risk Index as A New Methodology for Social Inequality Distribution Assessment

    NASA Astrophysics Data System (ADS)

    Swiader, Małgorzata; Szewrański, Szymon; Kazak, Jan

    2017-10-01

    This paper presents a new concept for measuring a poverty risk index that accounts for the dynamics of urban development over time. Rapid urbanization can seriously outstrip the capacity of most cities, leading to inadequate services for their inhabitants. The consequence can be polarized, socially differentiated cities with high rates of urban poverty. Measuring and analyzing urban poverty requires dedicated tools and techniques, and data-based assessment can help planners and public policy makers develop more socially integrated cities. This paper analyzes urban poverty in the city of Wrocław (Poland) over the period 2010-2012. The analysis was conducted for ten Social Assistance Terrain Units (SATU) delineated within the city area. Our primary objective is to propose and calculate a poverty risk index based on diagnostic features representing the most common grounds for granting social benefits: the number of single households granted permanent benefits, the number of people in families granted permanent benefits, the number of people in families granted temporary benefits due to unemployment, the number of people in families granted temporary benefits due to disability, and the number of people in families granted meals for children. The calculation uses the theory of a development pattern, Hellwig's economic development measure. The analysis shows that the central and south-eastern parts of the city are generally characterized by the highest poverty risk index. The spatial distribution of inequalities obtained here matches European and American patterns of poverty concentration in urban structures.
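Hellwig's development-pattern measure, mentioned above, standardizes each diagnostic feature, defines a "pattern" unit from the best standardized values, and scores each unit by its distance from that pattern. The sketch below treats higher benefit counts as higher poverty risk, with made-up district data, not the Wrocław SATU figures:

```python
import math
import statistics

def hellwig_measure(data):
    """Hellwig's taxonomic measure: 1 - d_i / d0, where d_i is a unit's
    Euclidean distance to the pattern and d0 = mean(d) + 2 * stdev(d).
    data: {unit: [feature values]}; all features treated as stimulants."""
    units = list(data)
    cols = list(zip(*data.values()))
    z = {u: [] for u in units}
    for col in cols:  # standardize each feature column
        m, s = statistics.fmean(col), statistics.stdev(col)
        for u, x in zip(units, col):
            z[u].append((x - m) / s)
    # Pattern = best (here: maximum) standardized value for every feature.
    pattern = [max(zs[j] for zs in z.values()) for j in range(len(cols))]
    d = {u: math.dist(z[u], pattern) for u in units}
    d0 = statistics.fmean(d.values()) + 2 * statistics.stdev(d.values())
    return {u: 1 - di / d0 for u, di in d.items()}

# Illustrative benefit counts (permanent, temporary, child-meal) per district.
sample = {"SATU-1": [40, 120, 15], "SATU-2": [10, 30, 5], "SATU-3": [25, 80, 9]}
scores = hellwig_measure(sample)
print(max(scores, key=scores.get))  # district closest to the risk pattern
```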

  3. Techno-economic analysis and decision making for PHEV benefits to society, consumers, policymakers and automakers

    NASA Astrophysics Data System (ADS)

    Al-Alawi, Baha Mohammed

    Plug-in hybrid electric vehicles (PHEVs) are an emerging automotive technology with the capability to reduce the environmental impacts of transportation, but at an increased production cost. PHEVs can draw and store energy from an electric grid and consequently offer reductions in petroleum consumption, air emissions, ownership costs, regulation compliance costs, and various other externalities. Decision makers in the policy, consumer, and industry spheres would like to understand the impact of HEV and PHEV technologies on the U.S. vehicle fleet, but to date only the disciplinary characteristics of PHEVs have been considered. The multidisciplinary tradeoffs between vehicle energy sources, policy requirements, market conditions, consumer preferences, and technology improvements are not well understood. For example, recent studies have posited the importance of PHEVs to the future U.S. vehicle fleet, yet no studies have considered the value of PHEVs to automakers and policy makers as a tool for achieving U.S. corporate average fuel economy (CAFE) standards, which are planned to double by 2030. Previous studies have demonstrated the costs and benefits of PHEVs, but no study comprehensively accounts for the costs and benefits of PHEVs to consumers. The diffusion rate of hybrid electric vehicle (HEV) and PHEV technology into the marketplace has been estimated by existing studies using various tools and scenarios, but results vary widely between studies. There is no comprehensive modeling study that combines policy, consumers, society, and automakers in analyzing the costs and benefits of U.S. new vehicle sales. The aim of this research is to build a framework that can simulate and optimize the benefits of PHEVs for a multiplicity of stakeholders. This dissertation describes the results of modeling that integrates the effects of PHEV market penetration on the policy, consumer, and economic spheres.
A model of fleet fuel economy and CAFE compliance for a large U.S. automaker will be developed. A comprehensive total-cost-of-ownership model will be constructed to calculate and compare the costs and benefits of PHEVs, conventional vehicles (CVs), and HEVs. A comprehensive literature review of PHEV penetration-rate studies will then analyze the primary purposes, methods, and results of studies of PHEV market penetration. Finally, a multi-criteria modeling system will incorporate the results of these supporting models. In this project, the models, analysis, and results will provide a broader understanding of the benefits and costs of PHEV technology and the parties to whom those benefits accrue. The findings will provide important information for consumers, automakers, and policy makers to understand and define the costs, benefits, and expected penetration rate of HEVs and PHEVs, and the preferred vehicle design and technology scenario to meet the requirements of policy, society, industry, and consumers.

  4. Contamination and Surface Preparation Effects on Composite Bonding

    NASA Technical Reports Server (NTRS)

    Kutscha, Eileen O.; Vahey, Paul G.; Belcher, Marcus A.; VanVoast, Peter J.; Grace, William B.; Blohowiak, Kay Y.; Palmieri, Frank L.; Connell, John W.

    2017-01-01

    Results presented here demonstrate the effect of several prebond surface contaminants (hydrocarbon, machining fluid, latex, silicone, peel ply residue, release film) on bond quality, as measured by fracture toughness and failure modes of carbon fiber reinforced epoxy substrates bonded in secondary and co-bond configurations with paste and film adhesives. Additionally, the capability of various prebond surface property measurement tools to detect contaminants and potentially predict subsequent bond performance of three different adhesives is also shown. Surface measurement methods included water contact angle, Dyne solution wettability, optically stimulated electron emission spectroscopy, surface free energy, inverse gas chromatography, and Fourier transform infrared spectroscopy with chemometrics analysis. Information will also be provided on the effectiveness of mechanical and energetic surface treatments to recover a bondable surface after contamination. The benefits and drawbacks of the various surface analysis tools to detect contaminants and evaluate prebond surfaces after surface treatment were assessed as well as their ability to correlate to bond performance. Surface analysis tools were also evaluated for their potential use as in-line quality control of adhesive bonding parameters in the manufacturing environment.

  5. MS Data Miner: a web-based software tool to analyze, compare, and share mass spectrometry protein identifications.

    PubMed

    Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J

    2012-09-01

Data processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, data comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open-source system includes a spectra validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM during a proteomics study comprising more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

Application performance is hindered by a variety of factors but most notably by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key to optimizing performance, and understanding application performance properties is facilitated by various profiling tools. Profiling tools vary in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex, large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information: the generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with external tools, such as the cache simulator Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis because the tool's output consists of trace files.
Moreover, the tool depends on the run-time system to provide the necessary infrastructure to expose low-level system detail; we therefore also discuss the theoretical benefits that could be achieved if such modules were present.

  7. Benchmarking of Decision-Support Tools Used for Tiered Sustainable Remediation Appraisal.

    PubMed

    Smith, Jonathan W N; Kerrison, Gavin

    2013-01-01

    Sustainable remediation comprises soil and groundwater risk-management actions that are selected, designed, and operated to maximize net environmental, social, and economic benefit (while assuring protection of human health and safety). This paper describes a benchmarking exercise to comparatively assess potential differences in environmental management decision making resulting from application of different sustainability appraisal tools ranging from simple (qualitative) to more quantitative (multi-criteria and fully monetized cost-benefit analysis), as outlined in the SuRF-UK framework. The appraisal tools were used to rank remedial options for risk management of a subsurface petroleum release that occurred at a petrol filling station in central England. The remediation options were benchmarked using a consistent set of soil and groundwater data for each tier of sustainability appraisal. The ranking of remedial options was very similar in all three tiers, and an environmental management decision to select the most sustainable options at tier 1 would have been the same decision at tiers 2 and 3. The exercise showed that, for relatively simple remediation projects, a simple sustainability appraisal led to the same remediation option selection as more complex appraisal, and can be used to reliably inform environmental management decisions on other relatively simple land contamination projects.
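    A tiered appraisal like the one benchmarked above is often approximated, at the multi-criteria tier, by a weighted-sum score across sustainability domains. The criteria weights, option names, and scores below are illustrative assumptions, not SuRF-UK criteria or data from the benchmarking exercise:

```python
# Weighted-sum multi-criteria appraisal sketch: each remediation option gets
# a 0-10 score per sustainability domain; options are ranked by weighted sum.
# Weights, options, and scores are hypothetical placeholders.

weights = {"environmental": 0.4, "social": 0.3, "economic": 0.3}

options = {
    "excavate-and-dispose":          {"environmental": 4, "social": 5, "economic": 3},
    "in-situ bioremediation":        {"environmental": 8, "social": 6, "economic": 7},
    "monitored natural attenuation": {"environmental": 6, "social": 4, "economic": 9},
}

def score(opt):
    """Weighted sum of an option's domain scores."""
    return sum(weights[c] * opt[c] for c in weights)

ranked = sorted(options, key=lambda name: score(options[name]), reverse=True)
print(ranked)  # best-to-worst under the assumed weights
```

    A sensitivity check, re-ranking under perturbed weights, is the usual next step before trusting such a ranking for a management decision.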

  8. Assessing task-technology fit in a PACS upgrade: do users' and developers' appraisals converge?

    PubMed

    Lepanto, Luigi; Sicotte, Claude; Lehoux, Pascale

    2011-12-01

    The purpose of this study was to measure users' perceived benefits of a picture archiving and communication system (PACS) upgrade, and compare their responses to those predicted by developers. The Task-Technology Fit (TTF) model served as the theoretical framework to study the relation between TTF, utilization, and perceived benefits. A self-administered survey was distributed to radiologists working in a university hospital undergoing a PACS upgrade. Four variables were measured: impact, utilization, TTF, and perceived net benefits. The radiologists were divided into subgroups according to their utilization profiles. Analysis of variance was performed and the hypotheses were tested with regression analysis. Interviews were conducted with developers involved in the PACS upgrade who were asked to predict impact and TTF. Users identified only a moderate fit between the PACS enhancements and their tasks, while developers predicted a high level of TTF. The combination of a moderate fit and an underestimation of the potential impact of changes in the PACS led to a low score for perceived net benefits. Results varied significantly among user subgroups. Globally, the data support the hypotheses that TTF predicts utilization and perceived net benefits, but not that utilization predicts perceived net benefits. TTF is a valid tool to assess perceived benefits, but it is important to take into account the characteristics of users. In the context of a technology that is rapidly evolving, there needs to be an alignment of what users perceive as a good fit and the functionality developers incorporate into their products.

  9. Early Experiences Porting the NAMD and VMD Molecular Simulation and Analysis Software to GPU-Accelerated OpenPOWER Platforms

    PubMed Central

    Stone, John E.; Hynninen, Antti-Pekka; Phillips, James C.; Schulten, Klaus

    2017-01-01

    All-atom molecular dynamics simulations of biomolecules provide a powerful tool for exploring the structure and dynamics of large protein complexes within realistic cellular environments. Unfortunately, such simulations are extremely demanding in terms of their computational requirements, and they present many challenges in terms of preparation, simulation methodology, and analysis and visualization of results. We describe our early experiences porting the popular molecular dynamics simulation program NAMD and the simulation preparation, analysis, and visualization tool VMD to GPU-accelerated OpenPOWER hardware platforms. We report our experiences with compiler-provided autovectorization and compare with hand-coded vector intrinsics for the POWER8 CPU. We explore the performance benefits obtained from unique POWER8 architectural features such as 8-way SMT and its value for particular molecular modeling tasks. Finally, we evaluate the performance of several GPU-accelerated molecular modeling kernels and relate them to other hardware platforms. PMID:29202130

  10. Optimization of an Advanced Hybrid Wing Body Concept Using HCDstruct Version 1.2

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Hybrid Wing Body (HWB) aircraft concepts continue to be promising candidates for achieving the simultaneous fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project. In order to evaluate the projected benefits, improvements in structural analysis at the conceptual design level were necessary; thus, NASA researchers developed the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) tool to perform aeroservoelastic structural optimizations of advanced HWB concepts. In this paper, the authors present substantial updates to the HCDstruct tool and related analysis, including: the addition of four inboard and eight outboard control surfaces and two all-movable tail/rudder assemblies, providing a full aeroservoelastic analysis capability; the implementation of asymmetric load cases for structural sizing applications; and a methodology for minimizing control surface actuation power using NASTRAN SOL 200 and HCDstruct's aeroservoelastic finite-element model (FEM).

  11. Inspection planning development: An evolutionary approach using reliability engineering as a tool

    NASA Technical Reports Server (NTRS)

    Graf, David A.; Huang, Zhaofeng

    1994-01-01

This paper proposes an evolutionary approach to inspection planning that introduces various reliability engineering tools into the process and assesses system trade-offs among reliability, engineering requirements, manufacturing capability, and inspection cost to establish an optimal inspection plan. The examples presented in the paper illustrate some advantages and benefits of the new approach. Through the analysis, the reliability and engineering impacts of manufacturing process capability and inspection uncertainty are clearly understood; the most cost-effective and efficient inspection plan can be established with its associated risks well controlled; some inspection reductions and relaxations are well justified; and design feedback and changes may be initiated from the analysis conclusions to further enhance reliability and reduce cost. The approach is particularly promising as global competition and customer expectations for quality improvement rapidly increase.

  12. A Method for Making Cross-Comparable Estimates of the Benefits of Decision Support Technologies for Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Lee, David; Long, Dou; Etheridge, Mel; Plugge, Joana; Johnson, Jesse; Kostiuk, Peter

    1998-01-01

We present a general method for making cross-comparable estimates of the benefits of NASA-developed decision support technologies for air traffic management, and we apply a specific implementation of the method to estimate the benefits of three decision support tools (DSTs) under development in NASA's Advanced Air Transportation Technologies Program: the Active Final Approach Spacing Tool (A-FAST), Expedite Departure Path (EDP), and the Conflict Probe and Trial Planning Tool (CPTP). The report also reviews data about the present operation of the national airspace system (NAS) to identify opportunities for DSTs to reduce delays and inefficiencies.

  13. Perspectives on procedure-based assessments: a thematic analysis of semistructured interviews with 10 UK surgical trainees.

    PubMed

    Shalhoub, Joseph; Marshall, Dominic C; Ippolito, Kate

    2017-03-24

The introduction of competency-based training has necessitated the development and implementation of accompanying assessment mechanisms. Procedure-based assessments (PBAs) are an example of workplace-based assessments used to examine focal competencies in the workplace. The primary objective was to understand surgical trainees' perspectives on the value of PBAs. Ten surgical trainees were interviewed individually in semistructured interviews to explore their views. Interviews were audio-recorded, transcribed, and then open and axial coded; thematic analysis was then performed. The interviews yielded several topical and recurring themes. In trainees' experience, the use of PBAs as a summative tool limits their educational value. Trainees reported a lack of support from seniors and variation in the usefulness of the tool depending on stage of training. Concerns about the validity of PBAs for evaluating trainees' performance included reports of 'gaming' the system and of trainees completing their own assessments. Trainees did, however, identify significant value in PBAs when used correctly. Benefits included the identification of additional learning opportunities, standardisation of assessment, and their role in providing a measure of progress. The UK surgical trainees interviewed identified both limitations and benefits of PBAs; however, we would argue, based on their responses and our experience, that their use as a summative tool limits their formative use as an educational opportunity. PBAs should be used either exclusively to support learning or solely as a summative tool; if the latter, further work is needed to audit, validate and standardise them for this purpose. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  14. Cost-benefit analysis as a part of sustainability assessment of remediation alternatives for contaminated land.

    PubMed

    Söderqvist, Tore; Brinkhoff, Petra; Norberg, Tommy; Rosén, Lars; Back, Pär-Erik; Norrman, Jenny

    2015-07-01

    There is an increasing demand amongst decision-makers and stakeholders for identifying sustainable remediation alternatives at contaminated sites, taking into account that remediation typically results in both positive and negative consequences. Multi-criteria analysis (MCA) is increasingly used for sustainability appraisal, and the Excel-based MCA tool Sustainable Choice Of REmediation (SCORE) has been developed to provide a relevant and transparent assessment of the sustainability of remediation alternatives relative to a reference alternative, considering key criteria in the economic, environmental and social sustainability domains, and taking uncertainty into explicit account through simulation. The focus of this paper is the use of cost-benefit analysis (CBA) as a part of SCORE for assessing the economic sustainability of remediation alternatives. An economic model is used for deriving a cost-benefit rule, which in turn motivates cost and benefit items in a CBA of remediation alternatives. The empirical part of the paper is a CBA application on remediation alternatives for the Hexion site, a former chemical industry area close to the city of Göteborg in SW Sweden. The impact of uncertainties in and correlations across benefit and cost items on CBA results is illustrated. For the Hexion site, the traditional excavation-and-disposal remediation alternative had the lowest expected net present value, which illustrates the importance of also considering other alternatives before deciding upon how a remediation should be carried out. Copyright © 2015 Elsevier Ltd. All rights reserved.
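    The kind of uncertainty-aware cost-benefit comparison described above can be sketched as a Monte Carlo estimate of expected net present value (NPV). The distributions, figures, and time horizon below are illustrative assumptions, not the Hexion-site inputs or the SCORE tool's implementation:

```python
# Monte Carlo NPV sketch for a remediation alternative: sample uncertain
# upfront cost and annual benefit, discount the benefits, and report the
# expected NPV and the probability that the alternative pays off.
# All distributions and figures are hypothetical placeholders.
import random

def sample_npv(rate=0.035, years=20):
    """One Monte Carlo draw of NPV for a single remediation alternative."""
    upfront_cost = random.triangular(4e6, 7e6, 5e6)     # remediation outlay
    annual_benefit = random.triangular(2e5, 6e5, 4e5)   # e.g. land value, health
    pv_benefits = sum(annual_benefit / (1 + rate) ** t
                      for t in range(1, years + 1))
    return pv_benefits - upfront_cost

random.seed(1)  # reproducible draws
draws = [sample_npv() for _ in range(10_000)]
expected_npv = sum(draws) / len(draws)
p_positive = sum(d > 0 for d in draws) / len(draws)
print(f"E[NPV] = {expected_npv:,.0f}, P(NPV > 0) = {p_positive:.2f}")
```

    Reporting P(NPV > 0) alongside the expected value is one simple way to take the uncertainty in benefit and cost items into explicit account; correlated items would require sampling them jointly rather than independently.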

  15. The Aviation System Analysis Capability Air Carrier Cost-Benefit Model

    NASA Technical Reports Server (NTRS)

Gaier, Eric M.; Edlich, Alexander; Santmire, Tara S.; Wingrove, Earl R., III

    1999-01-01

To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. Therefore, NASA is developing the ability to evaluate the potential impact of various advanced technologies. By thoroughly understanding the economic impact of advanced aviation technologies and by evaluating how the new technologies will be used in the integrated aviation system, NASA aims to balance its aeronautical research program and help speed the introduction of high-leverage technologies. To meet these objectives, NASA is building the Aviation System Analysis Capability (ASAC). NASA envisions ASAC primarily as a process for understanding and evaluating the impact of advanced aviation technologies on the U.S. economy. ASAC consists of a diverse collection of models and databases used by analysts and other individuals from the public and private sectors brought together to work on issues of common interest to organizations in the aviation community. ASAC will also be a resource available to the aviation community to analyze, inform, and assist scientists, engineers, analysts, and program managers in their daily work. ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. Commercial air carriers, in particular, are an important stakeholder in this community. Therefore, to fully evaluate the implications of advanced aviation technologies, ASAC requires a flexible financial analysis tool that credibly links the technology of flight with the financial performance of commercial air carriers. By linking technical and financial information, NASA ensures that its technology programs will continue to benefit the user community.
In addition, the analysis tool must be capable of being incorporated into the wide-ranging suite of economic and technical models that comprise ASAC. This report describes an Air Carrier Cost-Benefit Model (CBM) that meets these requirements. The ASAC CBM is distinguished from many of the aviation cost-benefit models by its exclusive focus on commercial air carriers. The model considers such benefit categories as time and fuel savings, utilization opportunities, reliability and capacity enhancements, and safety and security improvements. The model distinguishes between benefits that are predictable and those that occur randomly. By making such a distinction, the model captures the ability of air carriers to reoptimize scheduling and crew assignments for predictable benefits. In addition, the model incorporates a life-cycle cost module for new technology, which applies the costs of nonrecurring acquisitions, recurring maintenance and operation, and training to each aircraft equipment type independently.
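    The life-cycle cost module described above combines nonrecurring acquisition with recurring maintenance, operation, and training costs per equipped aircraft. It can be sketched as a simple discounted sum; all names and figures below are illustrative assumptions, not the ASAC CBM's actual structure or inputs:

```python
# Life-cycle cost sketch for equipping a fleet with a new technology:
# one-time acquisition per aircraft plus the present value of annual
# operation/maintenance and training costs. Numbers are hypothetical.

def life_cycle_cost(acquisition, annual_om, annual_training,
                    years=15, rate=0.07):
    """Per-aircraft life-cycle cost: nonrecurring + discounted recurring."""
    recurring = annual_om + annual_training
    pv_recurring = sum(recurring / (1 + rate) ** t
                       for t in range(1, years + 1))
    return acquisition + pv_recurring

# Assumed example: 200 aircraft of one equipment type.
fleet_cost = 200 * life_cycle_cost(acquisition=250_000,
                                   annual_om=20_000,
                                   annual_training=5_000)
print(f"Fleet life-cycle cost: ${fleet_cost:,.0f}")
```

    Applying the calculation per equipment type, as the CBM does, lets recurring costs differ across fleets; the corresponding benefit side would then be compared against this cost stream.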

  16. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis with multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis, implemented in the MATLAB programming language. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. Robustness of the software in providing reliable fits with multiple kinetic models is demonstrated using simulated data, which also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.

  17. The role of communities in sustainable land and forest management: The case of Nyanga, Zvimba and Guruve districts of Zimbabwe

    PubMed Central

    Sagonda, Ruvimbo; Kaundikiza, Munyaradzi

    2016-01-01

Forest benefit analysis is vital in ensuring sustainable community-based natural resources management. Forest depletion and degradation are key issues in rural Zimbabwe, and strategies to enhance sustainable forest management are continually sought. This study was carried out to assess the impact of forests on communities in the Nyanga, Guruve and Zvimba districts of Zimbabwe. It is based on a Big Lottery Fund project implemented by Progressio-UK and Environment Africa, and focuses on identifying replicable community forest and land management strategies and the level of benefits accruing to the community. Analysis of change was based on Income and Food Security and Forest Benefits, which also constituted the tools used during the research. The study confirms the high rate of deforestation and communities' increasing realisation of the need to initiate practical measures aimed at protecting and sustaining the forest and land resources from which they derive economic and social benefits. The results highlight the value of community structures (Farmer Field Schools and Environmental Action Groups) as conduits for natural resource management. The interconnectivity among forests, agricultural systems and the integral role of people is recognised as key to climate change adaptation.

  18. Raising parents' awareness of the benefits of immunization by using a visual aid tool.

    PubMed

    Mulumba, Jose Gaby Tshikuka; Daoud, Saada; Kabang, Bandé

    2007-07-01

A visual aid tool was used in two communities of Chad to raise parents' awareness of the benefits of immunization. In one community, the tool was administered by social workers two weeks before national immunization days (NIDs); in the other, it was administered by vaccinators during NIDs. Parents' awareness rose significantly in both communities, but the rise was greater in the community where the tool was administered by social workers. In both communities, a significant association was found between parents' unawareness and children missing immunization.

  19. Podcasting as a Mobile Learning Technology: A Study of iTunes U Learners

    ERIC Educational Resources Information Center

    Rosell-Aguilar, Fernando

    2015-01-01

    Despite the fact that portability was perceived as one of the major benefits of podcasting as a teaching and learning tool, little evidence has been found of users taking advantage of this feature for academic use. This paper reports on a major study (1886 responses) of iTunes U users. The analysis compares the responses of those participants who…

  20. A New Vision for Integrated Breast Care.

    DTIC Science & Technology

    1998-09-01

Analysis tools to Mapping; and established counseling methods to Debriefing. We are now investigating how Neurolinguistic Programming may help... programs and services for the benefit of the patient. Our Continuous Quality Improvement, Informatics and Education Cores are working together to help...streamline implementation of programs. This enables us to identify the quality improvements we hope to gain by changing a service and the quality

  1. Finding Citations to Social Work Literature: The Relative Benefits of Using "Web of Science," "Scopus," or "Google Scholar"

    ERIC Educational Resources Information Center

    Bergman, Elaine M. Lasda

    2012-01-01

    Past studies of citation coverage of "Web of Science," "Scopus," and "Google Scholar" do not demonstrate a consistent pattern that can be applied to the interdisciplinary mix of resources used in social work research. To determine the utility of these tools to social work researchers, an analysis of citing references to well-known social work…

  2. Nano-Launcher Technologies, Approaches, and Life Cycle Assessment. Phase II

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2014-01-01

Assist in understanding NASA technology and investment approaches, and other driving factors, necessary for enabling dedicated nano-launchers by industry at a cost and flight rate that (1) could support and be supported by an emerging nano-satellite market and (2) would benefit NASA's needs. Develop the life-cycle cost, performance and other NASA analysis tools or models required to understand issues, drivers and challenges.

  3. An Investigation of Students' Perceptions of Learning Benefits of Weblogs in an East Asian Context: A Rasch Analysis

    ERIC Educational Resources Information Center

    Goh, Jonathan W. P.; Quek, Chin Joo; Lee, Ong Kim

    2010-01-01

    In the 1980s we witnessed the dawning of the "Information Age". Today, the use of information technology has become an integral part of our lives. Education is no exception. With the introduction of Web 2.0 tools such as weblogs, students are presented a new platform for interaction and exchanging ideas. A review of the literature…

  4. Interconnection Assessment Methodology and Cost Benefit Analysis for High-Penetration PV Deployment in the Arizona Public Service System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baggu, Murali; Giraldez, Julieta; Harris, Tom

In an effort to better understand the impacts of high penetrations of photovoltaic (PV) generators on distribution systems, Arizona Public Service and its partners completed a multi-year project to develop the tools and knowledge base needed to safely and reliably integrate high penetrations of utility- and residential-scale PV. Building upon the APS Community Power Project-Flagstaff Pilot, this project investigates the impact of PV on a representative feeder in northeast Flagstaff. To quantify and catalog the effects of the estimated 1.3 MW of PV that will be installed on the feeder (both smaller units at homes and large, centrally located systems), high-speed weather and electrical data acquisition systems and digital 'smart' meters were designed and installed to facilitate monitoring and to build and validate comprehensive, high-resolution models of the distribution system. These models are being developed to analyze the impacts of PV on distribution circuit protection systems (including coordination and anti-islanding), predict voltage regulation and phase balance issues, and develop volt/VAr control schemes. This paper continues from a paper presented at the 2014 IEEE PVSC conference that described feeder model evaluation and high-penetration advanced scenario analysis, specifically feeder reconfiguration. This paper presents results from Phase 5 of the project; specifically, it discusses tool automation, interconnection assessment methodology, and cost-benefit analysis.

  5. Surgical process improvement tools: defining quality gaps and priority areas in gastrointestinal cancer surgery.

    PubMed

    Wei, A C; Devitt, K S; Wiebe, M; Bathe, O F; McLeod, R S; Urbach, D R

    2014-04-01

Surgery is a cornerstone of cancer treatment, but significant differences in the quality of surgery have been reported. Surgical process improvement tools (spits) modify the processes of care as a means to quality improvement (qi). We were interested in developing spits in the area of gastrointestinal (gi) cancer surgery. We report the recommendations of an expert panel held to define quality gaps and establish priority areas that would benefit from spits. The present study used the knowledge-to-action cycle as a framework. Canadian experts in qi and in gi cancer surgery were assembled in a nominal group workshop. Participants evaluated the merits of spits, described gaps in current knowledge, and identified and ranked processes of care that would benefit from qi. A qualitative analysis of the workshop deliberations using modified grounded theory methods identified major themes. The expert panel consisted of 22 participants. Experts confirmed that spits were an important strategy for qi. The top-rated spits included clinical pathways, electronic information technology, and patient safety tools. The preferred settings for use of spits included preoperative and intraoperative settings and multidisciplinary contexts. Outcomes of interest were cancer-related outcomes, process, and the technical quality of surgery measures. Surgical process improvement tools were confirmed as an important strategy. Expert panel recommendations will be used to guide future research efforts for spits in gi cancer surgery.

  6. Collaborative socioeconomic tool development to address management and planning needs

    USGS Publications Warehouse

    Richardson, Leslie A.; Huber, Christopher; Cullinane Thomas, Catherine; Donovan, Elizabeth; Koontz, Lynne M.

    2014-01-01

    Public lands and resources managed by the National Park Service (NPS) and other land management agencies provide a wide range of social and economic benefits to both nearby local communities and society as a whole, ranging from job creation, to access to unique recreational opportunities, to subsistence and tribal uses of the land. Over the years, there has been an increased need to identify and analyze the socioeconomic effects of the public’s use of NPS lands and resources, and the wide range of NPS land management decisions. This need stems from laws such as the National Environmental Policy Act (NEPA), increased litigation and appeals on NPS management decisions, as well as an overall need to demonstrate how parks benefit communities and the American public. To address these needs, the U.S. Geological Survey (USGS) and NPS have an ongoing partnership to collaboratively develop socioeconomic tools to support planning needs and resource management. This article discusses two such tools. The first, Assessing Socioeconomic Planning Needs (ASPN), was developed to help NPS planners and managers identify key social and economic issues that can arise as a result of land management actions. The second tool, the Visitor Spending Effects (VSE) model, provides a specific example of a type of analysis that may be recommended by ASPN. The remainder of this article discusses the development, main features, and plans for future versions and applications of both ASPN and the VSE.

  7. Surgical process improvement tools: defining quality gaps and priority areas in gastrointestinal cancer surgery

    PubMed Central

    Wei, A.C.; Devitt, K.S.; Wiebe, M.; Bathe, O.F.; McLeod, R.S.; Urbach, D.R.

    2014-01-01

Background Surgery is a cornerstone of cancer treatment, but significant differences in the quality of surgery have been reported. Surgical process improvement tools (spits) modify the processes of care as a means to quality improvement (qi). We were interested in developing spits in the area of gastrointestinal (gi) cancer surgery. We report the recommendations of an expert panel held to define quality gaps and establish priority areas that would benefit from spits. Methods The present study used the knowledge-to-action cycle as a framework. Canadian experts in qi and in gi cancer surgery were assembled in a nominal group workshop. Participants evaluated the merits of spits, described gaps in current knowledge, and identified and ranked processes of care that would benefit from qi. A qualitative analysis of the workshop deliberations using modified grounded theory methods identified major themes. Results The expert panel consisted of 22 participants. Experts confirmed that spits were an important strategy for qi. The top-rated spits included clinical pathways, electronic information technology, and patient safety tools. The preferred settings for use of spits included preoperative and intraoperative settings and multidisciplinary contexts. Outcomes of interest were cancer-related outcomes, process, and the technical quality of surgery measures. Conclusions Surgical process improvement tools were confirmed as an important strategy. Expert panel recommendations will be used to guide future research efforts for spits in gi cancer surgery. PMID:24764704

  8. Cost-effectiveness analysis and HIV screening: the emergency medicine perspective.

    PubMed

    Hsu, Heather; Walensky, Rochelle P

    2011-07-01

Cost-effectiveness analysis is a useful tool for decisionmakers charged with prioritizing the myriad medical interventions in the emergency department (ED). This analytic approach may be especially helpful for ranking programs that are competing for scarce resources while attempting to maximize net health benefits. In this article, we review the health economics literature on HIV screening in EDs and introduce the methods of cost-effectiveness analysis for medical interventions. We specifically describe the incremental cost-effectiveness ratio: its calculation, the derivation of its components, and the interpretation of these ratios. Copyright © 2011. Published by Mosby, Inc.
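    The incremental cost-effectiveness ratio described above is the extra cost of one program over another divided by the extra health effect gained. A minimal sketch with invented ED-screening numbers (illustrative only, not figures from the article):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra dollars spent
    per extra unit of health effect (e.g., per quality-adjusted
    life-year) when the new program replaces the old one."""
    delta_cost = cost_new - cost_old
    delta_effect = effect_new - effect_old
    if delta_effect == 0:
        raise ValueError("ICER is undefined when effects are equal")
    return delta_cost / delta_effect

# Hypothetical programs (invented numbers): universal ED screening
# costs $200,000 and yields 14 QALYs; targeted screening costs
# $120,000 and yields 10 QALYs.
ratio = icer(200_000, 120_000, 14, 10)
print(ratio)  # 20000.0 dollars per additional QALY
```

    A program is then judged by comparing this ratio against a willingness-to-pay threshold.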

  9. Are Visual Informatics Actually Useful in Practice: A Study in a Film Studies Context

    NASA Astrophysics Data System (ADS)

    Mohamad Ali, Nazlena; Smeaton, Alan F.

This paper describes our work in examining the question of whether providing a visual informatics application in an educational scenario, in particular one providing video content analysis, actually yields real benefit in practice. We provide a new software tool in the domain of movie content analysis for use by film studies students at Dublin City University, and we address the research question of measuring the 'benefit' students gain from using these technologies. We examine their real practices in studying for the module using our advanced application as compared to using conventional DVD browsing of movie content. In carrying out this experiment, we found that with the new technologies students achieved better essay outcomes and higher satisfaction levels, and their mean time spent analyzing movies was longer.

  10. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require considerable forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace. These tools provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), while executing everything in between in a scalable, task-parallel way (MPI). We highlight the use and benefit of these tools through several climate science analysis use cases to which they have been applied.

  11. [Cost-effectiveness analysis and diet quality index applied to the WHO Global Strategy].

    PubMed

    Machado, Flávia Mori Sarti; Simões, Arlete Naresse

    2008-02-01

To test the use of cost-effectiveness analysis as a decision-making tool in the production of meals, incorporating the recommendations published in the World Health Organization's Global Strategy. Five alternative options for a breakfast menu were assessed prior to their adoption in a food service at a university in the state of Sao Paulo, Southeastern Brazil, in 2006. Costs of the different options were based on market prices of food items (direct cost). Health benefits were estimated based on an adaptation of the Diet Quality Index (DQI). Cost-effectiveness ratios were estimated by dividing benefits by costs, and incremental cost-effectiveness ratios were estimated as the cost differential per unit of additional benefit. The meal choice was based on health benefit units associated with direct production cost as well as incremental effectiveness per unit of differential cost. The analysis showed the simplest option with the addition of a fruit (DQI = 64 / cost = R$ 1.58) to be the best alternative. Higher effectiveness was seen in the options with a fruit portion (DQI1=64 / DQI3=58 / DQI5=72) compared to the others (DQI2=48 / DQI4=58). Estimating cost-effectiveness ratios made it possible to identify the best breakfast option based on cost-effectiveness analysis and the Diet Quality Index. These instruments offer ease of application and objective evaluation, which are key to the process of bringing public or private institutions under the Global Strategy directives.
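    The ranking logic above reduces to dividing DQI benefit units by direct cost. A small sketch using the DQI scores from the abstract (only option 1's cost, R$ 1.58, is reported there; the other costs below are invented for illustration):

```python
# DQI scores are from the abstract; costs other than option 1's
# R$ 1.58 are invented placeholders for illustration.
options = {
    "option1": {"dqi": 64, "cost": 1.58},
    "option2": {"dqi": 48, "cost": 1.40},
    "option3": {"dqi": 58, "cost": 1.75},
    "option4": {"dqi": 58, "cost": 1.60},
    "option5": {"dqi": 72, "cost": 2.30},
}

def cost_effectiveness(opt):
    """Benefit units (DQI points) obtained per unit of direct cost."""
    return opt["dqi"] / opt["cost"]

ranked = sorted(options, key=lambda k: cost_effectiveness(options[k]),
                reverse=True)
print(ranked[0])  # option1 ranks first under these assumed costs
```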

  12. BASTet: analysis and storage library for OpenMSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.

BASTet is an advanced software library written in Python. BASTet serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses that are developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces to enable developers to directly integrate their analysis with OpenMSI's web-based viewing infrastructure without having to know OpenMSI internals. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and describe methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.

  13. Digital image analysis in pathology: benefits and obligation.

    PubMed

    Laurinavicius, Arvydas; Laurinaviciene, Aida; Dasevicius, Darius; Elie, Nicolas; Plancoulaine, Benoît; Bor, Catherine; Herlin, Paulette

    2012-01-01

    Pathology has recently entered the era of personalized medicine. This brings new expectations for the accuracy and precision of tissue-based diagnosis, in particular, when quantification of histologic features and biomarker expression is required. While for many years traditional pathologic diagnosis has been regarded as ground truth, this concept is no longer sufficient in contemporary tissue-based biomarker research and clinical use. Another major change in pathology is brought by the advancement of virtual microscopy technology enabling digitization of microscopy slides and presenting new opportunities for digital image analysis. Computerized vision provides an immediate benefit of increased capacity (automation) and precision (reproducibility), but not necessarily the accuracy of the analysis. To achieve the benefit of accuracy, pathologists will have to assume an obligation of validation and quality assurance of the image analysis algorithms. Reference values are needed to measure and control the accuracy. Although pathologists' consensus values are commonly used to validate these tools, we argue that the ground truth can be best achieved by stereology methods, estimating the same variable as an algorithm is intended to do. Proper adoption of the new technology will require a new quantitative mentality in pathology. In order to see a complete and sharp picture of a disease, pathologists will need to learn to use both their analogue and digital eyes.

  14. Cost-benefit analysis of drug treatment services: review of the literature*

    PubMed

    Cartwright, William S.

    2000-03-01

BACKGROUND: How valuable is public investment in treatment for drug abuse and dependency in the real world of everyday practice? Does drug abuse treatment provide benefits and how are they valued? What are the costs of obtaining outcomes and benefits? Cost-benefit analysis attempts to answer these questions in a standard analytic framework. AIMS: This paper reviews cost-benefit analyses with scientific merit so that analysts will have a current picture of the state of the research. It will also give public decision-makers information with regard to the available evidence for policy purposes. METHOD: Bibliographic searches were performed. Studies were obtained through the assistance of the Parklawn Health Library system, a component of the US Public Health Service. Selected studies were from the scientific literature, with the exception of eight studies published as governmental reports. RESULTS: Cost-benefit studies have fallen into the following categories: (i) planning models for delivery systems in states and cities; (ii) short-term follow-up studies of individuals; (iii) single individual programs; and (iv) state systems' monitoring of outcomes. In 18 cost-benefit studies, a persistent finding is that benefits exceed costs, even when not all benefits are accounted for in the analysis. Much variation is found in the implementation of cost-benefit methods, and this is detailed across discussions of effectiveness, benefits and costs. Studies have emphasized the cost savings to society from the reduction in external costs created by the behavioral consequences of addiction and drug use. DISCUSSION: Economic analysis of drug treatment requires sophisticated conceptualization and measurement. Cost-benefit analysis of drug treatment has been a significant analytical exercise since the early 1970s, when the public drug treatment system was founded in the United States.
CONCLUSION: Drug abuse treatment services may be considered as contributing positive economic returns to society. However, considerable work needs to be done to standardize the methods used in these studies. A striking omission is the absence of studies for adolescents; only one study covered women in treatment. IMPLICATIONS FOR HEALTH CARE PROVISION AND USE: Finding a positive net social benefit should assist policy-makers with decisions related to drug abuse treatment expenditures. Additional work on allocation of budget dollars across various drug treatment services will be needed. IMPLICATIONS FOR HEALTH POLICY FORMULATION: Government agencies and other stakeholders in national health care systems must realize that cost-benefit studies are an important tool for decision-making. Rational strategies can only be addressed by examining alternatives for the efficient allocation and equitable distribution of scarce resources. IMPLICATIONS FOR FURTHER RESEARCH: Future research should focus on standardizing the methods used in cost-benefit analysis. Extensions should examine methods related to the willingness-to-pay approach. Studies are needed for drug abuse treatment targeted to adolescents and women. More studies should be published in the scientific literature.

  15. Digital Support Platform: a qualitative research study investigating the feasibility of an internet-based, postdiagnostic support platform for families living with dementia.

    PubMed

    Killin, Lewis O J; Russ, Tom C; Surdhar, Sushee Kaur; Yoon, Youngseo; McKinstry, Brian; Gibson, Grant; MacIntyre, Donald J

    2018-04-12

To establish the feasibility of the Digital Support Platform (DSP), an internet-based, postdiagnostic tool designed for families living with a diagnosis of dementia. Qualitative methods using normalisation process theory as an analysis framework for semistructured interview transcriptions. A community care setting in South-East Scotland. We interviewed 10 dyads of people with Alzheimer's, vascular or mixed dementia (PWD) and their family carers, who had been given and had used the DSP for at least 2 months. Our analysis revealed that the DSP was predominantly understood and used by the carers rather than the PWD, and was used alongside tools and methods they already used to care for their relative. The DSP was interpreted as a tool that may be of benefit to those experiencing later stages of dementia or with physical care needs. Carers stated that the DSP may be of benefit in the future, reflecting a disinclination to prepare for or anticipate future needs rather than to focus on needs present at the time of distribution. PWD spoke positively about an interest in learning to use technology more effectively and enjoyed having their own tablet devices. The DSP was not wholly appropriate for families living with dementia in its early stages. The views of carers confirmed that postdiagnostic support was valued, but emphasised the importance of tailoring this support to the exact needs and current arrangements of families. There may be a benefit to introducing, encouraging, providing and teaching internet-enabled technology to those PWD who do not currently have access. Training should be provided when introducing new technology to PWD. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. The Blue DRAGON--a system for monitoring the kinematics and the dynamics of endoscopic tools in minimally invasive surgery for objective laparoscopic skill assessment.

    PubMed

    Rosen, Jacob; Brown, Jeffrey D; Barreca, Marco; Chang, Lily; Hannaford, Blake; Sinanan, Mika

    2002-01-01

Minimally invasive surgery (MIS) involves a multi-dimensional series of tasks requiring a synthesis between visual information and the kinematics and dynamics of the surgical tools. Analysis of these sources of information is a key step in mastering MIS but may also be used to define objective criteria for characterizing surgical performance. The BlueDRAGON is a new system for acquiring the kinematics and the dynamics of two endoscopic tools along with the visual view of the surgical scene. It includes two four-bar mechanisms equipped with position and force/torque sensors for measuring the positions and orientations (P/O) of two endoscopic tools along with the forces and torques (F/T) applied by the surgeon's hands. The methodology of decomposing the surgical task is based on a fully connected, finite-state (28 states) Markov model, where each state corresponds to a fundamental tool/tissue interaction based on the tool kinematics and associated with unique F/T signatures. The experimental protocol included seven MIS tasks performed on an animal model (pig) by 30 surgeons at different levels of their residency training. Preliminary analysis of these data showed that the major differences between residents at different skill levels were: (i) the types of tool/tissue interactions being used, (ii) the transitions between tool/tissue interactions being applied by each hand, (iii) the time spent performing each tool/tissue interaction, (iv) the overall completion time, and (v) the variable F/T magnitudes being applied by the subjects through the endoscopic tools. Systems such as surgical robots or virtual reality simulators that inherently measure the kinematics and the dynamics of the surgical tool may benefit from inclusion of the proposed methodology for analysis of efficacy and objective evaluation of surgical skills during training.
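    The Markov decomposition described above can be estimated from observed state sequences by normalizing transition counts. A toy sketch with three invented states (the paper's 28 tool/tissue states are not reproduced here):

```python
from collections import Counter, defaultdict

def transition_matrix(seqs):
    """Estimate Markov transition probabilities by counting observed
    state-to-state transitions and normalizing each row."""
    counts = defaultdict(Counter)
    for seq in seqs:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {state: {nxt: n / sum(row.values()) for nxt, n in row.items()}
            for state, row in counts.items()}

# Invented tool/tissue interaction sequences (not the study's data):
sequences = [["idle", "grasp", "pull", "idle"],
             ["idle", "grasp", "grasp", "pull"]]
probs = transition_matrix(sequences)
print(probs["grasp"])  # e.g. "pull" observed 2 of 3 times after "grasp"
```

    Comparing such matrices between trainees and experts is one way the transition differences noted in (ii) could be quantified.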

  17. Citizens unite for computational immunology!

    PubMed

    Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M

    2015-07-01

    Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in! Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  19. Properties of O dwarf stars in 30 Doradus

    NASA Astrophysics Data System (ADS)

    Sabín-Sanjulián, Carolina; VFTS Collaboration

    2017-11-01

    We perform a quantitative spectroscopic analysis of 105 presumably single O dwarf stars in 30 Doradus, located within the Large Magellanic Cloud. We use mid-to-high resolution multi-epoch optical spectroscopic data obtained within the VLT-FLAMES Tarantula Survey. Stellar and wind parameters are derived by means of the automatic tool iacob-gbat, which is based on a large grid of fastwind models. We also benefit from the Bayesian tool bonnsai to estimate evolutionary masses. We provide a spectral calibration for the effective temperature of O dwarf stars in the LMC, deal with the mass discrepancy problem and investigate the wind properties of the sample.

  20. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.

  1. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools.
This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  2. Water Conservation Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Metzger, Ian; Dean, Jesse

    2010-12-31

This software requires inputs of simple water fixture inventory information and calculates the water/energy and cost benefits of various retrofit opportunities. The tool includes water conservation measures for low-flow toilets, low-flow urinals, low-flow faucets, and low-flow showerheads. It calculates water savings, energy savings, demand reduction, cost savings, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
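    The life-cycle cost metrics listed above follow standard engineering-economics definitions. A minimal sketch of how they relate (generic formulas with invented retrofit figures, not the tool's actual implementation):

```python
def simple_payback(cost, annual_savings):
    """Years for undiscounted savings to recover the retrofit cost."""
    return cost / annual_savings

def npv(cost, annual_savings, years, rate):
    """Net present value: discounted savings minus initial cost."""
    pv = sum(annual_savings / (1 + rate) ** t for t in range(1, years + 1))
    return pv - cost

def sir(cost, annual_savings, years, rate):
    """Savings-to-investment ratio: PV of savings over initial cost."""
    return (npv(cost, annual_savings, years, rate) + cost) / cost

def discounted_payback(cost, annual_savings, rate, max_years=100):
    """First year in which cumulative discounted savings cover the cost."""
    cumulative = 0.0
    for t in range(1, max_years + 1):
        cumulative += annual_savings / (1 + rate) ** t
        if cumulative >= cost:
            return t
    return None  # does not pay back within max_years

# Invented low-flow toilet retrofit: $300 installed, $75/yr savings,
# 15-year measure life, 3% discount rate.
print(simple_payback(300, 75))            # 4.0 years
print(discounted_payback(300, 75, 0.03))  # 5 years
```

    A measure is typically considered cost-effective when its SIR exceeds 1 (equivalently, NPV is positive).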

  3. Benefit-Cost Analysis of Foot-and-Mouth Disease Vaccination at the Farm-Level in South Vietnam.

    PubMed

    Truong, Dinh Bao; Goutard, Flavie Luce; Bertagnoli, Stéphane; Delabouglise, Alexis; Grosbois, Vladimir; Peyre, Marisa

    2018-01-01

This study aimed to analyze the financial impact of foot-and-mouth disease (FMD) outbreaks in cattle at the farm level and the benefit-cost ratio (BCR) of a biannual vaccination strategy to prevent and eradicate FMD for cattle in South Vietnam. Production data were collected from 49 small-scale dairy farms, 15 large-scale dairy farms, and 249 beef farms of Long An and Tay Ninh provinces using a questionnaire. Financial data on FMD impacts were collected using participatory tools in 37 villages of Long An province. The net present value, i.e., the difference between the benefits (additional revenue and saved costs) and costs (additional costs and revenue foregone), of FMD vaccination in large-scale dairy farms was 2.8 times higher than in small-scale dairy farms and 20 times higher than in beef farms. The BCRs of FMD vaccination over 1 year in large-scale dairy farms, small-scale dairy farms, and beef farms were 11.6 [95% confidence interval (95% CI) 6.42-16.45], 9.93 (95% CI 3.45-16.47), and 3.02 (95% CI 0.76-7.19), respectively. The sensitivity analysis showed that varying the vaccination cost had more effect on the BCR of cattle vaccination than varying the market price. This benefit-cost analysis of the biannual vaccination strategy showed that investment in FMD prevention can be financially profitable, and therefore sustainable, for dairy farmers. For beef cattle, it is less certain that vaccination is profitable. An additional benefit-cost analysis of vaccination strategies at the national level would be required to evaluate and adapt the national strategy to achieve eradication of this disease in Vietnam.
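    The benefit-cost ratio reported above is the present value of benefits (additional revenue plus saved costs) divided by the present value of costs (additional costs plus revenue foregone). A minimal sketch with invented per-farm cash flows (not the study's data):

```python
def present_value(flows, rate):
    """Discount a list of yearly cash flows (year 0 first)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def benefit_cost_ratio(benefits, costs, rate=0.0):
    """BCR > 1 means discounted benefits exceed discounted costs."""
    return present_value(benefits, rate) / present_value(costs, rate)

# Invented yearly figures for a hypothetical dairy farm:
# benefits = avoided outbreak losses + saved treatment costs,
# costs = vaccine purchase + labour, over two years at 5% discount.
benefits = [1000.0, 1000.0]
costs = [90.0, 90.0]
print(round(benefit_cost_ratio(benefits, costs, rate=0.05), 2))  # 11.11
```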

  4. Tobacco Regulation and Cost-Benefit Analysis: How Should We Value Foregone Consumer Surplus?

    PubMed

    Levy, Helen G; Norton, Edward C; Smith, Jeffrey A

    2018-01-01

    Recent tobacco regulations proposed by the Food and Drug Administration have raised a thorny question: how should the cost-benefit analysis accompanying such policies value foregone consumer surplus associated with regulation-induced reductions in smoking? In a model with rational and fully informed consumers, this question is straightforward. There is disagreement, however, about whether consumers are rational and fully informed, and the literature offers little practical guidance about what approach the FDA should use if they are not. In this paper, we outline the history of the FDA's recent attempts to regulate cigarettes and other tobacco products and how they have valued foregone consumer surplus in cost-benefit analyses. We advocate replacing the approach used in most of this literature, which first calculates health gains associated with regulation and then "offsets" them by some factor reflecting consumer surplus losses, with a more general behavioral public finance framework for welfare analysis. This framework applies standard tools of welfare analysis to consumer demand that may be "biased" (that is, not necessarily rational and fully informed) without requiring specific assumptions about the reason for the bias. This framework would require estimates of both biased and unbiased consumer demand; we sketch an agenda to help develop these in the context of smoking. The use of this framework would substantially reduce the confusion currently surrounding welfare analysis of tobacco regulation.

  5. Tobacco Regulation and Cost-Benefit Analysis: How Should We Value Foregone Consumer Surplus?

    PubMed Central

    Levy, Helen G.; Norton, Edward C.; Smith, Jeffrey A.

    2016-01-01

    Recent tobacco regulations proposed by the Food and Drug Administration have raised a thorny question: how should the cost-benefit analysis accompanying such policies value foregone consumer surplus associated with regulation-induced reductions in smoking? In a model with rational and fully informed consumers, this question is straightforward. There is disagreement, however, about whether consumers are rational and fully informed, and the literature offers little practical guidance about what approach the FDA should use if they are not. In this paper, we outline the history of the FDA’s recent attempts to regulate cigarettes and other tobacco products and how they have valued foregone consumer surplus in cost-benefit analyses. We advocate replacing the approach used in most of this literature, which first calculates health gains associated with regulation and then “offsets” them by some factor reflecting consumer surplus losses, with a more general behavioral public finance framework for welfare analysis. This framework applies standard tools of welfare analysis to consumer demand that may be “biased” (that is, not necessarily rational and fully informed) without requiring specific assumptions about the reason for the bias. This framework would require estimates of both biased and unbiased consumer demand; we sketch an agenda to help develop these in the context of smoking. The use of this framework would substantially reduce the confusion currently surrounding welfare analysis of tobacco regulation. PMID:29404381

  6. Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Madden, Michael

    2014-01-01

    The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.

  7. Ultra Lightweight Ballutes for Return to Earth from the Moon

    NASA Technical Reports Server (NTRS)

    Masciarelli, James P.; Lin, John K. H.; Ware, Joanne S.; Rohrschneider, Reuben R.; Braun, Robert D.; Bartels, Robert E.; Moses, Robert W.; Hall, Jeffery L.

    2006-01-01

Ultra lightweight ballutes offer revolutionary mass and cost benefits along with flexibility in flight system design compared to traditional entry system technologies. Under funding provided by NASA's Exploration Systems Research & Technology program, our team was able to make progress in developing this technology through systems analysis and design, evaluation of materials and construction methods, and development of critical analysis tools. Results show that once this technology is mature, significant launch mass savings, operational simplicity, and mission robustness will be available to help carry out NASA's Vision for Space Exploration.

  8. Nuclear Tools For Oilfield Logging-While-Drilling Applications

    NASA Astrophysics Data System (ADS)

    Reijonen, Jani

    2011-06-01

    Schlumberger is an international oilfield service company with nearly 80,000 employees of 140 nationalities, operating globally in 80 countries. As a market leader in oilfield services, Schlumberger has developed a suite of technologies to assess the downhole environment, including, among others, electromagnetic, seismic, chemical, and nuclear measurements. In the past 10 years there has been a radical shift in the oilfield service industry from traditional wireline measurements to logging-while-drilling (LWD) analysis. For LWD measurements, the analysis is performed and the instruments are operated while the borehole is being drilled. The high temperature, high shock, and extreme vibration environment of LWD imposes stringent requirements for the devices used in these applications. This has a significant impact on the design of the components and subcomponents of a downhole tool. Another significant change in the past few years for nuclear-based oilwell logging tools is the desire to replace the sealed radioisotope sources with active, electronic ones. These active radiation sources provide great benefits compared to the isotopic sources, ranging from handling and safety to nonproliferation and well contamination issues. The challenge is to develop electronic generators that have a high degree of reliability for the entire lifetime of a downhole tool. LWD tool testing and operations are highlighted with particular emphasis on electronic radiation sources and nuclear detectors for the downhole environment.

  9. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  10. To What Extent is FAIMS Beneficial in the Analysis of Proteins?

    NASA Astrophysics Data System (ADS)

    Cooper, Helen J.

    2016-04-01

    High field asymmetric waveform ion mobility spectrometry (FAIMS), also known as differential ion mobility spectrometry, is emerging as a tool for biomolecular analysis. In this article, the benefits and limitations of FAIMS for protein analysis are discussed. The principles and mechanisms of FAIMS separation of ions are described, and the differences between FAIMS and conventional ion mobility spectrometry are detailed. Protein analysis is considered from both the top-down (intact proteins) and the bottom-up (proteolytic peptides) perspective. The roles of FAIMS in the analysis of complex mixtures of multiple intact proteins and in the analysis of multiple conformers of a single protein are assessed. Similarly, the applications of FAIMS in proteomics and in targeted peptide analysis are considered.

  11. Optimizing Biorefinery Design and Operations via Linear Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talmadge, Michael; Batan, Liaw; Lamers, Patrick

    The ability to assess and optimize economics of biomass resource utilization for the production of fuels, chemicals and power is essential for the ultimate success of a bioenergy industry. The team of authors, consisting of members from the National Renewable Energy Laboratory (NREL) and the Idaho National Laboratory (INL), has developed simple biorefinery linear programming (LP) models to enable the optimization of theoretical or existing biorefineries. The goal of this analysis is to demonstrate how such models can benefit the developing biorefining industry. It focuses on a theoretical multi-pathway, thermochemical biorefinery configuration and demonstrates how the biorefinery can use LP models for operations planning and optimization in comparable ways to the petroleum refining industry. Using LP modeling tools developed under U.S. Department of Energy's Bioenergy Technologies Office (DOE-BETO) funded efforts, the authors investigate optimization challenges for the theoretical biorefineries such as (1) optimal feedstock slate based on available biomass and prices, (2) breakeven price analysis for available feedstocks, (3) impact analysis for changes in feedstock costs and product prices, (4) optimal biorefinery operations during unit shutdowns / turnarounds, and (5) incentives for increased processing capacity. These biorefinery examples are comparable to crude oil purchasing and operational optimization studies that petroleum refiners perform routinely using LPs and other optimization models. It is important to note that the analyses presented in this article are strictly theoretical and they are not based on current energy market prices. The pricing structure assigned for this demonstrative analysis is consistent with $4 per gallon gasoline, which clearly assumes an economic environment that would favor the construction and operation of biorefineries. The analysis approach and examples provide valuable insights into the usefulness of analysis tools for maximizing the potential benefits of biomass utilization for production of fuels, chemicals and power.
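    The feedstock-slate question in item (1) can be illustrated with a minimal sketch. All names, yields, costs, and capacities below are hypothetical, not the study's data; with a single throughput constraint, the LP optimum reduces to a greedy fill in order of margin per ton.

```python
# Hedged sketch of an optimal-feedstock-slate calculation (hypothetical
# numbers throughout). Product price of $4/gal mirrors the article's
# assumed pricing environment.

def optimal_slate(feedstocks, capacity, price=4.0):
    """feedstocks: list of (name, yield_gal_per_ton, cost_per_ton, avail_tons).
    Returns ({name: tons purchased}, total margin in $) for one capacity limit."""
    # Rank feedstocks by margin per ton; fill remaining capacity greedily.
    ranked = sorted(feedstocks, key=lambda f: f[1] * price - f[2], reverse=True)
    slate, margin, room = {}, 0.0, capacity
    for name, yld, cost, avail in ranked:
        per_ton = yld * price - cost
        take = min(avail, room)
        if take <= 0 or per_ton <= 0:
            continue  # skip unprofitable or unneeded feedstock
        slate[name] = take
        margin += take * per_ton
        room -= take
    return slate, margin

slate, margin = optimal_slate(
    [("corn stover", 80.0, 85.0, 500.0),      # $235/ton margin
     ("forest residue", 65.0, 60.0, 800.0)],  # $200/ton margin
    capacity=1000.0,
)
print(slate, margin)
```

    A real biorefinery LP would add multi-period, storage, and unit-capacity constraints and use a dedicated solver; the greedy form holds only for this single-constraint case.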

  12. An Integrated Tool for Calculating and Reducing Institution Carbon and Nitrogen Footprints

    PubMed Central

    Galloway, James N.; Castner, Elizabeth A.; Andrews, Jennifer; Leary, Neil; Aber, John D.

    2017-01-01

    Abstract The development of nitrogen footprint tools has allowed a range of entities to calculate and reduce their contribution to nitrogen pollution, but these tools represent just one aspect of environmental pollution. For example, institutions have been calculating their carbon footprints to track and manage their greenhouse gas emissions for over a decade. This article introduces an integrated tool that institutions can use to calculate, track, and manage their nitrogen and carbon footprints together. It presents the methodology for the combined tool, describes several metrics for comparing institution nitrogen and carbon footprint results, and discusses management strategies that reduce both the nitrogen and carbon footprints. The data requirements for the two tools overlap substantially, although integrating the two tools does necessitate the calculation of the carbon footprint of food. Comparison results for five institutions suggest that the institution nitrogen and carbon footprints correlate strongly, especially in the utilities and food sectors. Scenario analyses indicate benefits to both footprints from a range of utilities and food footprint reduction strategies. Integrating these two footprints into a single tool will account for a broader range of environmental impacts, reduce data entry and analysis, and promote integrated management of institutional sustainability. PMID:29350217
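    The data-sharing point above can be sketched in a few lines: one activity dataset drives both footprints through sector emission factors, which is why the two tools' data requirements overlap. The factors and quantities below are invented for illustration, not the tool's values.

```python
# Hypothetical emission factors per activity unit (illustrative only).
C_FACTORS = {"electricity_kwh": 0.4, "food_kg": 2.5}      # kg CO2e per unit
N_FACTORS = {"electricity_kwh": 0.0003, "food_kg": 0.02}  # kg N per unit

def footprints(activity):
    """One set of institutional activity data yields both footprints."""
    carbon = sum(C_FACTORS[k] * v for k, v in activity.items())
    nitrogen = sum(N_FACTORS[k] * v for k, v in activity.items())
    return carbon, nitrogen

c, n = footprints({"electricity_kwh": 10000, "food_kg": 2000})
print(c, n)  # same data entry, two environmental metrics
```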

  13. Aquatic habitat measurement and valuation: imputing social benefits to instream flow levels

    USGS Publications Warehouse

    Douglas, Aaron J.; Johnson, Richard L.

    1991-01-01

    Instream flow conflicts have been analysed from the perspectives offered by policy oriented applied (physical) science, theories of conflict resolution and negotiation strategy, and psychological analyses of the behavior patterns of the bargaining parties. Economics also offers some useful insights in analysing conflict resolution within the context of these water allocation problems. We attempt to analyse the economics of the bargaining process in conjunction with a discussion of the water allocation process. In particular, we examine in detail the relation between certain habitat estimation techniques, and the socially optimal allocation of non-market resources. The results developed here describe the welfare implications implicit in the contemporary general equilibrium analysis of a competitive market economy. We also review certain currently available techniques for assigning dollar values to the social benefits of instream flow. The limitations of non-market valuation techniques with respect to estimating the benefits provided by instream flows and the aquatic habitat contingent on these flows should not deter resource managers from using economic analysis as a basic tool for settling instream flow conflicts.

  14. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is important for their continued growth and study, however. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw assisted participants to analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  15. Using discrete choice experiments within a cost-benefit analysis framework: some considerations.

    PubMed

    McIntosh, Emma

    2006-01-01

    A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions for consideration relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle to the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of issues such as consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.

  16. Staff members' perceived training needs regarding sexuality in residential aged care facilities.

    PubMed

    Villar, Feliciano; Celdrán, Montserrat; Fabà, Josep; Serrat, Rodrigo

    2017-01-01

    The purpose of the article is to ascertain if staff members of residential aged care facilities (RACF) perceive the need for training regarding residents' sexuality, and what, if any, benefits from the training were perceived, and to compare perceived benefits of training between care assistants and professional/managerial staff. Interviews were conducted with 53 staff members of five different RACF in Spain. Their responses to two semistructured questions were transcribed verbatim and submitted to content analysis. Results show that most interviewees said they lacked training about sexuality and aging. Two potential highlighted benefits of the training are knowledge/attitudinal (countering negative attitudes regarding sexuality) and procedural (developing common protocols and tools to manage situations related to sexuality). Care assistants and professional staff agreed on the need for training, though the former emphasized the procedural impact and the latter the knowledge/attitudinal benefits. The results suggest that RACF staff should have an opportunity to receive training on residents' sexuality, as sexual interest and behavior is a key dimension of residents' lives.

  17. Air Emissions and Health Benefits from Using Sugarcane Waste as a Cellulosic Ethanol Feedstock

    NASA Astrophysics Data System (ADS)

    Tsao, C.; Campbell, E.; Chen, Y.; Carmichael, G.; Mena-Carrasco, M.; Spak, S.

    2010-12-01

    Brazil, as the largest ethanol exporter in the world, faces rapid expansion of ethanol production due to the increase of global biofuels demand. Current production of Brazilian sugarcane ethanol causes significant air emissions, mainly from the open burning of agricultural wastes (i.e., sugarcane straw and leaves), resulting in potential health impacts. One possible measure to avoid undesired burning practices is to increase the utilization of unburned sugarcane residues as a feedstock for cellulosic ethanol. To explore the benefits of this substitution, here we first apply a bottom-up approach combining agronomic data and life-cycle models to investigate spatially and temporally explicit emissions from sugarcane waste burning. We further quantify the health benefits from preventing burning practices using the CMAQ regional air quality model and the BenMAP health benefit analysis tool adapted for Brazilian applications. Furthermore, the health impacts will be converted into monetary values, which provide policymakers with useful information for the development of cellulosic ethanol.

  18. Adaptive Optics Analysis of Visual Benefit with Higher-order Aberrations Correction of Human Eye - Poster Paper

    NASA Astrophysics Data System (ADS)

    Xue, Lixia; Dai, Yun; Rao, Xuejun; Wang, Cheng; Hu, Yiyun; Liu, Qian; Jiang, Wenhan

    2008-01-01

    Higher-order aberrations correction can improve visual performance of the human eye to some extent. To evaluate how much visual benefit can be obtained with higher-order aberrations correction, we developed an adaptive optics vision simulator (AOVS). Dynamic real-time optimized modal compensation was used to implement various customized higher-order ocular aberrations correction strategies. The experimental results indicate that higher-order aberrations correction can improve visual performance of the human eye compared with lower-order aberration correction alone, but the degree of improvement and the appropriate higher-order correction strategy differ between individuals. Some subjects acquired great visual benefit when higher-order aberrations were corrected, whereas others acquired little visual benefit even though all higher-order aberrations were corrected. Therefore, relative to a general lower-order aberrations correction strategy, a customized higher-order aberrations correction strategy is needed to obtain optimal visual improvement for each individual. AOVS provides an effective tool for higher-order ocular aberrations optometry for customized ocular aberrations correction.

  19. Institutional framework for integrated Pharmaceutical Benefits Management: results from a systematic review

    PubMed Central

    Hermanowski, Tomasz Roman; Drozdowska, Aleksandra Krystyna; Kowalczyk, Marta

    2015-01-01

    Objectives In this paper, we emphasised that effective management of health plan beneficiaries' access to reimbursed medicines requires proper institutional set-up. The main objective was to identify and recommend an institutional framework of integrated pharmaceutical care providing effective, safe and equitable access to medicines. Method The institutional framework of drug policy was derived on the basis of publications obtained by systematic reviews. A comparative analysis concerning adaptation of coordinated pharmaceutical care services in the USA, the UK, Poland, Italy, Denmark and Germany was performed. Results While most European Union Member States promote the implementation of selected e-Health tools, like e-Prescribing, these efforts do not necessarily implement an integrated package. There is no single agent who would manage insured patients' access to medicines and health care in a coordinated manner, thereby increasing the efficiency and safety of drug policy. More attention should be paid by European Union Member States to how to integrate various e-Health tools to enhance benefits to both individuals and societies. One solution could be to implement an integrated "pharmacy benefit management" model, which is well established in the USA and Canada and provides an integrated package of cost-containment methods, implemented within a transparent institutional framework and powered by strong motivation of the agent. PMID:26528099

  20. Use and practice of achiral and chiral supercritical fluid chromatography in pharmaceutical analysis and purification.

    PubMed

    Lemasson, Elise; Bertin, Sophie; West, Caroline

    2016-01-01

    The interest of pharmaceutical companies in complementary high-performance chromatographic tools to assess a product's purity or enhance this purity is on the rise. The high-throughput capability and economic benefits of supercritical fluid chromatography, but also the "green" aspect of CO2 as the principal solvent, render supercritical fluid chromatography very attractive for a wide range of pharmaceutical applications. The recent reintroduction of new robust instruments dedicated to supercritical fluid chromatography and the progress in stationary phase technology have also greatly benefited supercritical fluid chromatography. Additionally, it was shown several times that supercritical fluid chromatography could be orthogonal to reversed-phase high-performance liquid chromatography and could efficiently compete with it. Supercritical fluid chromatography is an adequate tool for small molecules of pharmaceutical interest: synthetic intermediates, active pharmaceutical ingredients, impurities, or degradation products. In this review, we first discuss general chromatographic conditions for supercritical fluid chromatography analysis to better suit compounds of pharmaceutical interest. We also discuss the use of achiral and chiral supercritical fluid chromatography for analytical purposes and the recent applications in these areas. The use of preparative supercritical fluid chromatography by pharmaceutical companies is also covered. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. The NASA In-Space Propulsion Technology Project, Products, and Mission Applicability

    NASA Technical Reports Server (NTRS)

    Anderson, David J.; Pencil, Eric; Liou, Larry; Dankanich, John; Munk, Michelle M.; Kremic, Tibor

    2009-01-01

    The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved: guidance, navigation, and control models of blunt-body rigid aeroshells; atmospheric models for Earth, Titan, Mars, and Venus; and models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6 to 7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.

  2. NASA's In-Space Propulsion Technology Project Overview, Near-term Products and Mission Applicability

    NASA Technical Reports Server (NTRS)

    Dankanich, John; Anderson, David J.

    2008-01-01

    The In-Space Propulsion Technology (ISPT) Project, funded by NASA's Science Mission Directorate (SMD), is continuing to invest in propulsion technologies that will enable or enhance NASA robotic science missions. This overview provides development status, near-term mission benefits, applicability, and availability of in-space propulsion technologies in the areas of aerocapture, electric propulsion, advanced chemical thrusters, and systems analysis tools. Aerocapture investments improved (1) guidance, navigation, and control models of blunt-body rigid aeroshells, (2) atmospheric models for Earth, Titan, Mars and Venus, and (3) models for aerothermal effects. Investments in electric propulsion technologies focused on completing NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system, a 0.6-7 kW throttle-able gridded ion system. The project is also concluding its High Voltage Hall Accelerator (HiVHAC) mid-term product specifically designed for a low-cost electric propulsion option. The primary chemical propulsion investment is on the high-temperature Advanced Material Bipropellant Rocket (AMBR) engine providing higher performance for lower cost. The project is also delivering products to assist technology infusion and quantify mission applicability and benefits through mission analysis and tools. In-space propulsion technologies are applicable, and potentially enabling for flagship destinations currently under evaluation, as well as having broad applicability to future Discovery and New Frontiers mission solicitations.

  3. LICARA nanoSCAN - A tool for the self-assessment of benefits and risks of nanoproducts.

    PubMed

    van Harmelen, Toon; Zondervan-van den Beuken, Esther K; Brouwer, Derk H; Kuijpers, Eelco; Fransman, Wouter; Buist, Harrie B; Ligthart, Tom N; Hincapié, Ingrid; Hischier, Roland; Linkov, Igor; Nowack, Bernd; Studer, Jennifer; Hilty, Lorenz; Som, Claudia

    2016-05-01

    The fast penetration of nanoproducts on the market under conditions of significant uncertainty of their environmental properties and risks to humans creates a need for companies to assess the sustainability of their products. Evaluation of the potential benefits and risks to build a coherent story for communication with clients, authorities, consumers, and other stakeholders is increasingly important, but SMEs often lack the knowledge and expertise to assess risks and communicate them appropriately. This paper introduces LICARA nanoSCAN, a modular web-based tool that supports SMEs in assessing benefits and risks associated with new or existing nanoproducts. This tool is unique because it scans both the benefits and risks over the nanoproduct's life cycle in comparison to a reference product with a similar functionality in order to enable the development of sustainable and competitive nanoproducts. SMEs can use data and expert judgment to answer mainly qualitative and semi-quantitative questions as a part of tool application. Risks to the public, workers and consumers are assessed, while the benefits are evaluated for economic, environmental and societal opportunities associated with the product use. The tool provides an easy way to visualize results as well as to identify gaps, missing data and associated uncertainties. The LICARA nanoSCAN has been positively evaluated by several companies and was tested in a number of case studies. The tool helps to develop a consistent and comprehensive argument on the weaknesses and strengths of a nanoproduct that may be valuable for the communication with authorities, clients and among stakeholders in the value chain. LICARA nanoSCAN identifies areas for more detailed assessments, product design improvement or application of risk mitigation measures. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  5. A cost-benefit analysis of The National Map

    USGS Publications Warehouse

    Halsing, David L.; Theissen, Kevin; Bernknopf, Richard

    2003-01-01

    The Geography Discipline of the U.S. Geological Survey (USGS) has conducted this cost-benefit analysis (CBA) of The National Map. This analysis is an evaluation of the proposed Geography Discipline initiative to provide the Nation with a mechanism to access current and consistent digital geospatial data. This CBA is a supporting document to accompany the Exhibit 300 Capital Asset Plan and Business Case of The National Map Reengineering Program. The framework for estimating the benefits is based on expected improvements in processing information to perform any of the possible applications of spatial data. This analysis does not attempt to determine the benefits and costs of performing geospatial-data applications. Rather, it estimates the change in the differences between those benefits and costs with The National Map and the current situation without it. The estimates of total costs and benefits of The National Map were based on the projected implementation time, development and maintenance costs, rates of data inclusion and integration, expected usage levels over time, and a benefits estimation model. The National Map provides data that are current, integrated, consistent, complete, and more accessible in order to decrease the cost of implementing spatial-data applications and (or) improve the outcome of those applications. The efficiency gains in per-application improvements are greater than the cost to develop and maintain The National Map, meaning that the program would bring a positive net benefit to the Nation. The average improvement in the net benefit of performing a spatial data application was multiplied by a simulated number of application implementations across the country. The numbers of users, existing applications, and rates of application implementation increase over time as The National Map is developed and accessed by spatial data users around the country. 
Results from the 'most likely' estimates of model parameters and data inputs indicate that, over its 30-year projected lifespan, The National Map will bring a net present value (NPV) of benefits of $2.05 billion in 2001 dollars. The average time until the initial investments (the break-even period) are recovered is 14 years. Table ES-1 shows a running total of NPV in each year of the simulation model. In year 14, The National Map first shows a positive NPV, and so the table is highlighted in gray after that point. Figure ES-1 is a graph of the total benefit and total cost curves of a single model run over time. The curves cross in year 14, when the project breaks even. A sensitivity analysis of the input variables illustrated that these results of the NPV of The National Map are quite robust. Figure ES-2 plots the mean NPV results from 60 different scenarios, each consisting of fifty 30-year runs. The error bars represent a two-standard-deviation range around each mean. The analysis that follows contains the details of the cost-benefit analysis, the framework for evaluating economic benefits, a computational simulation tool, and a sensitivity analysis of model variables and values.
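    The NPV and break-even mechanics described above can be sketched as a running total of discounted net benefits. The cash flows and discount rate below are hypothetical placeholders, not the study's inputs, so the numbers do not reproduce the $2.05 billion or 14-year results.

```python
# Hedged sketch: cumulative discounted net benefit year by year, with the
# break-even year defined as the first year the cumulative NPV turns positive.

def breakeven_year(annual_net_benefits, rate):
    """Return (total NPV, first year the running discounted total is positive)."""
    npv, breakeven = 0.0, None
    for year, cash_flow in enumerate(annual_net_benefits, start=1):
        npv += cash_flow / (1 + rate) ** year  # discount to present value
        if breakeven is None and npv > 0:
            breakeven = year
    return npv, breakeven

# Illustrative profile: two years of up-front investment, then steady benefits.
flows = [-40.0, -30.0] + [6.0] * 28  # 30-year horizon, $ millions (made up)
npv, year = breakeven_year(flows, rate=0.03)
print(round(npv, 1), year)
```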

  6. The financial impact of a clinical academic practice partnership.

    PubMed

    Greene, Mary Ann; Turner, James

    2014-01-01

    New strategies to provide clinical experiences for nursing students have caused nursing schools and hospitals to evaluate program costs. A Microsoft Excel model, which captures costs and associated benefits, was developed and is described here. The financial analysis shows that the Clinical Academic Practice Program framework for nursing clinical education, often preferred by students, can offer financial advantages to participating hospitals and schools of nursing. The model is potentially a tool for schools of nursing to enlist hospitals and to help manage expenses of clinical education. Hospitals may also use the Hospital Nursing Unit Staffing and Expense Worksheet in planning staffing when students are assigned to units and the cost/benefit findings to enlist management support.

  7. Non-equilibrium thermodynamics theory of econometric source discovery for large data analysis

    NASA Astrophysics Data System (ADS)

    van Bergem, Rutger; Jenkins, Jeffrey; Benachenhou, Dalila; Szu, Harold

    2014-05-01

    Almost all consumer and firm transactions are achieved using computers and as a result give rise to increasingly large amounts of data available for analysts. The gold standard in Economic data manipulation techniques matured during a period of limited data access, and the new Large Data Analysis (LDA) paradigm we all face may quickly obfuscate most tools used by Economists. When coupled with an increased availability of numerous unstructured, multi-modal data sets, the impending 'data tsunami' could have serious detrimental effects for Economic forecasting, analysis, and research in general. Given this reality we propose a decision-aid framework for Augmented-LDA (A-LDA) - a synergistic approach to LDA which combines traditional supervised, rule-based Machine Learning (ML) strategies to iteratively uncover hidden sources in large data, the artificial neural network (ANN) Unsupervised Learning (USL) at the minimum Helmholtz free energy for isothermal dynamic equilibrium strategies, and the Economic intuitions required to handle problems encountered when interpreting large amounts of Financial or Economic data. To make the ANN USL framework applicable to economics we define the temperature, entropy, and energy concepts in Economics from non-equilibrium molecular thermodynamics of the Boltzmann viewpoint, as well as defining an information geometry, on which the ANN can operate using USL to reduce information saturation. An exemplar of such a system representation is given for firm industry equilibrium. We demonstrate the traditional ML methodology in the economics context and leverage firm financial data to explore a frontier concept known as behavioral heterogeneity. Behavioral heterogeneity on the firm level can be imagined as a firm's interactions with different types of Economic entities over time. These interactions could impose varying degrees of institutional constraints on a firm's business behavior.
We specifically look at behavioral heterogeneity for firms that are operating with the label of `Going-Concern' and firms labeled according to institutional influence they may be experiencing, such as constraints on firm hiring/spending while in a Bankruptcy or a Merger procedure. Uncovering invariant features, or behavioral data metrics from observable firm data in an economy can greatly benefit the FED, World Bank, etc. We find that the ML/LDA communities can benefit from Economic intuitions just as much as Economists can benefit from generic data exploration tools. The future of successful Economic data understanding, modeling, simulation, and visualization can be amplified by new A-LDA models and approaches for new and analogous models of Economic system dynamics. The potential benefits of improved economic data analysis and real time decision aid tools are numerous for researchers, analysts, and federal agencies who all deal with increasingly large amounts of complex data to support their decision making.
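The minimum Helmholtz free energy idea behind the ANN USL strategy can be made concrete with a short numerical sketch: for states with energies E_i at temperature T, the distribution minimizing F = ⟨E⟩ - T·S is the Boltzmann (softmax) distribution. The energies and temperature below are toy values, not quantities from the paper.

```python
import numpy as np

def free_energy(p, E, T):
    """Helmholtz free energy F = <E> - T*S for a distribution p over states with energies E."""
    p = np.asarray(p, float)
    S = -np.sum(p * np.log(p + 1e-12))   # Shannon/Gibbs entropy (epsilon avoids log 0)
    return float(np.dot(p, E) - T * S)

def boltzmann(E, T):
    """The Boltzmann (softmax) distribution, which minimizes F at temperature T."""
    w = np.exp(-np.asarray(E, float) / T)
    return w / w.sum()

E = np.array([1.0, 2.0, 4.0])   # toy 'energies' for three economic states
T = 1.5
p_star = boltzmann(E, T)

# Any other distribution, e.g. the uniform one, has higher free energy.
uniform = np.ones(3) / 3
assert free_energy(p_star, E, T) <= free_energy(uniform, E, T)
```

At high T the entropy term dominates and p_star flattens toward uniform; at low T it concentrates on the lowest-energy state, which is the sense in which temperature controls information saturation here.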

  8. Multiscale recurrence analysis of spatio-temporal data

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Marwan, N.; Kurths, J.

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales that can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as more complicated examples, pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale consideration in taking spatial patterns of different scales and with different rhythms into account. This mapogram approach therefore promises new insights into problems of climatology, ecology, and medicine.
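A minimal sketch of the recurrence-plot construction described above, using a plain value histogram as a stand-in for the mapogram (an assumption, since the actual descriptor is defined in the paper) and a toy period-4 spatio-temporal pattern:

```python
import numpy as np

def mapogram(field, bins=8):
    """Histogram of a 2-D field's values -- a simple stand-in for the
    paper's mapogram descriptor (assumption for illustration)."""
    h, _ = np.histogram(field, bins=bins, range=(0.0, 1.0), density=True)
    return h

def recurrence_plot(fields, eps):
    """R[i, j] = 1 when snapshots i and j have similar descriptors (distance <= eps)."""
    H = np.array([mapogram(f) for f in fields])
    n = len(H)
    R = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            R[i, j] = int(np.linalg.norm(H[i] - H[j]) <= eps)
    return R

# Toy spatio-temporal data: a spatial field whose level recurs every 4 steps.
levels = [0.2, 0.4, 0.6, 0.8]
fields = [levels[t % 4] * np.ones((16, 16)) for t in range(12)]
R = recurrence_plot(fields, eps=0.5)   # diagonal stripes mark the period-4 recurrence
```

Once R is built, the usual recurrence quantification measures (recurrence rate, determinism, line-length statistics) apply to it unchanged, which is the benefit the abstract refers to.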

  9. Multiscale recurrence analysis of spatio-temporal data.

    PubMed

    Riedl, M; Marwan, N; Kurths, J

    2015-12-01

    The description and analysis of spatio-temporal dynamics is a crucial task in many scientific disciplines. In this work, we propose a method which uses the mapogram as a similarity measure between spatially distributed data instances at different time points. The resulting similarity values of the pairwise comparison are used to construct a recurrence plot in order to benefit from established tools of recurrence quantification analysis and recurrence network analysis. In contrast to other recurrence tools for this purpose, the mapogram approach allows a specific focus on different spatial scales that can be used in a multi-scale analysis of spatio-temporal dynamics. We illustrate this approach by application to mixed dynamics, such as traveling parallel wave fronts with additive noise, as well as more complicated examples, pseudo-random numbers and coupled map lattices with a semi-logistic mapping rule. The complicated examples in particular show the usefulness of the multi-scale consideration in taking spatial patterns of different scales and with different rhythms into account. This mapogram approach therefore promises new insights into problems of climatology, ecology, and medicine.

  10. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
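The simulated-annealing optimization step that RPST uses for performance tuning can be illustrated on a toy scheduling objective. The job data, objective, and cooling schedule below are invented for illustration and are not taken from RPST:

```python
import math
import random

def total_weighted_completion(order, durations, weights):
    """Objective: sum of weight * completion time over jobs, given a sequence."""
    t, cost = 0.0, 0.0
    for j in order:
        t += durations[j]
        cost += weights[j] * t
    return cost

def anneal_schedule(durations, weights, steps=5000, t0=10.0, seed=1):
    """Simulated annealing over job orders: swap two jobs, accept worse moves
    with Boltzmann probability that shrinks as the temperature cools."""
    rng = random.Random(seed)
    order = list(range(len(durations)))
    best = list(order)
    cost = best_cost = total_weighted_completion(order, durations, weights)
    for step in range(steps):
        T = t0 * (1 - step / steps) + 1e-6          # linear cooling
        i, j = rng.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]     # propose a swap
        new_cost = total_weighted_completion(order, durations, weights)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / T):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = list(order), cost
        else:
            order[i], order[j] = order[j], order[i]  # undo rejected swap
    return best, best_cost

durations = [3, 1, 2, 5, 4]
weights   = [1, 4, 2, 1, 3]
best_order, best_cost = anneal_schedule(durations, weights)
```

The same propose/accept/cool loop carries over to realistic range schedules; only the state encoding and the objective function change.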

  11. Methods for Analyzing the Benefits and Costs of Distributed Photovoltaic Generation to the U.S. Electric Utility System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, P.; Margolis, R.; Palmintier, B.

    This report outlines the methods, data, and tools that could be used at different levels of sophistication and effort to estimate the benefits and costs of DGPV. In so doing, we identify the gaps in current benefit-cost-analysis methods, which we hope will inform the ongoing research agenda in this area. The focus of this report is primarily on benefits and costs from the utility or electricity generation system perspective. It is intended to provide useful background information to utility and regulatory decision makers and their staff, who are often being asked to use or evaluate estimates of the benefits and cost of DGPV in regulatory proceedings. Understanding the technical rigor of the range of methods and how they might need to evolve as DGPV becomes a more significant contributor of energy to the electricity system will help them be better consumers of this type of information. This report is also intended to provide information to utilities, policy makers, PV technology developers, and other stakeholders, which might help them maximize the benefits and minimize the costs of integrating DGPV into a changing electricity system.

  12. Advances in Photoplethysmography Signal Analysis for Biomedical Applications.

    PubMed

    Moraes, Jermana L; Rocha, Matheus X; Vasconcelos, Glauber G; Vasconcelos Filho, José E; de Albuquerque, Victor Hugo C; Alexandria, Auzuir R

    2018-06-09

    Heart Rate Variability (HRV) is an important tool for the analysis of a patient's physiological conditions, as well as a method aiding the diagnosis of cardiopathies. Photoplethysmography (PPG) is an optical technique applied in the monitoring of HRV, and its adoption has been growing significantly compared to the most commonly used method in medicine, Electrocardiography (ECG). In this survey, definitions of these techniques are presented, the different types of sensors used are explained, and the methods for the study and analysis of the PPG signal (linear and nonlinear methods) are described. Moreover, the progress, and the clinical and practical applicability, of the PPG technique in the diagnosis of cardiovascular diseases are evaluated. In addition, the latest technologies utilized in the development of new tools for medical diagnosis are presented, such as the Internet of Things, Internet of Health Things, genetic algorithms, artificial intelligence and biosensors, which result in personalized advances in e-health and health care. After the study of these technologies, it can be noted that PPG associated with them is an important tool for the diagnosis of some diseases, due to its simplicity, its cost-benefit ratio, the ease of signal acquisition, and especially because it is a non-invasive technique.
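The HRV analysis that PPG enables can be sketched with two standard time-domain metrics, SDNN and RMSSD, computed from the inter-beat intervals given by successive pulse-peak times. The beat times below are synthetic:

```python
import numpy as np

def hrv_time_domain(peak_times_s):
    """Time-domain HRV from beat times (e.g. PPG pulse peaks), in seconds.
    Returns mean heart rate (bpm), SDNN (ms) and RMSSD (ms)."""
    ibi = np.diff(np.asarray(peak_times_s, float))         # inter-beat intervals, s
    ibi_ms = ibi * 1000.0
    sdnn = float(np.std(ibi_ms, ddof=1))                   # overall variability
    rmssd = float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2)))  # beat-to-beat variability
    mean_hr = 60.0 / float(np.mean(ibi))
    return mean_hr, sdnn, rmssd

# Synthetic beats around 75 bpm (0.8 s intervals) with mild variability.
rng = np.random.default_rng(42)
beats = np.cumsum(0.8 + rng.normal(0.0, 0.02, size=120))
hr, sdnn, rmssd = hrv_time_domain(beats)
```

With real PPG the only extra work before this step is peak detection on the optical waveform; the metrics themselves are the same ones used with ECG R-peaks.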

  13. How to support forest management in a world of change: results of some regional studies.

    PubMed

    Fürst, C; Lorz, C; Vacik, H; Potocic, N; Makeschin, F

    2010-12-01

    This article presents results of several studies in Middle, Eastern and Southeastern Europe on the needs and application areas, desirable attributes and marketing potentials of forest management support tools. A comparison of present and future application areas reveals a trend from sectoral planning towards landscape planning and the integration of multiple stakeholder needs. In terms of conflicts where management support tools might provide benefit, no clear tendencies were found at either the local or the regional level. In contrast, at the national and European levels, support for the implementation of laws, directives, and regulations was found to be of highest importance. Following the user-requirements analysis, electronic tools supporting communication are preferred over paper-based instruments. The users identified the most important attributes of optimized management support tools: (i) broad accessibility for all users at any time should be guaranteed, (ii) it should be possible to iteratively integrate experiences from case studies and from regional experts into the knowledge base (learning system), and (iii) a self-explanatory user interface is demanded, which is also suitable for users rather inexperienced with electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic ranges of the maximal amounts of money that would be invested if the products were suitable and payment inevitable. To bridge the discrepancy between unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.

  14. How to Support Forest Management in a World of Change: Results of Some Regional Studies

    NASA Astrophysics Data System (ADS)

    Fürst, C.; Lorz, C.; Vacik, H.; Potocic, N.; Makeschin, F.

    2010-12-01

    This article presents results of several studies in Middle, Eastern and Southeastern Europe on the needs and application areas, desirable attributes and marketing potentials of forest management support tools. A comparison of present and future application areas reveals a trend from sectoral planning towards landscape planning and the integration of multiple stakeholder needs. In terms of conflicts where management support tools might provide benefit, no clear tendencies were found at either the local or the regional level. In contrast, at the national and European levels, support for the implementation of laws, directives, and regulations was found to be of highest importance. Following the user-requirements analysis, electronic tools supporting communication are preferred over paper-based instruments. The users identified the most important attributes of optimized management support tools: (i) broad accessibility for all users at any time should be guaranteed, (ii) it should be possible to iteratively integrate experiences from case studies and from regional experts into the knowledge base (learning system), and (iii) a self-explanatory user interface is demanded, which is also suitable for users rather inexperienced with electronic tools. However, a market potential analysis revealed that the willingness to pay for management tools is very limited, although the participants specified realistic ranges of the maximal amounts of money that would be invested if the products were suitable and payment inevitable. To bridge the discrepancy between unwillingness to pay and the need to use management support tools, optimized financing or cooperation models between practice and science must be found.

  15. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    PubMed Central

    Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.

    2017-01-01

    Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135

  16. Application of risk analysis in water resources management

    NASA Astrophysics Data System (ADS)

    Varouchakis, Emmanouil; Palogos, Ioannis

    2017-04-01

    A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually consider short-term fines. Such an application, in such detail, represents a new contribution. The results indicate that the probability uncertainty is the driving issue that determines the optimal decision with each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers (stakeholders) examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits. The tool is implemented as a web service for easier stakeholder access.
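A toy version of the expected-cost comparison described above, with a Beta-Binomial update of the violation probability from audit outcomes. Every figure below is hypothetical, and the parabolic fine is simplified to a flat per-violation charge:

```python
# Hypothetical numbers: expected-cost comparison of building a reservoir versus
# paying over-pumping fines, with a Beta prior on the violation probability
# updated by audit outcomes (a Bayesian decision analysis sketch).
construction_cost = 500_000.0        # build option, one-off cost
fine_per_violation = 20_000.0        # simplified flat fine per violating audit
audits_per_year, years = 12, 10

# Beta(2, 8) prior on the violation probability, updated with
# 3 violations observed in 10 audits (Beta-Binomial conjugacy).
alpha, beta = 2 + 3, 8 + 7
p_violation = alpha / (alpha + beta)  # posterior mean = 0.25

expected_fines = p_violation * fine_per_violation * audits_per_year * years
decision = "build reservoir" if construction_cost < expected_fines else "do not build"
```

The point of the Bayesian step is visible in the numbers: with the prior alone (mean 0.2) the expected fines would be 480,000 and the build would not pay; after the audit data shift the posterior mean to 0.25, the expected fines (600,000) exceed the construction cost.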

  17. A standardised, generic, validated approach to stratify the magnitude of clinical benefit that can be anticipated from anti-cancer therapies: the European Society for Medical Oncology Magnitude of Clinical Benefit Scale (ESMO-MCBS).

    PubMed

    Cherny, N I; Sullivan, R; Dafni, U; Kerst, J M; Sobrero, A; Zielinski, C; de Vries, E G E; Piccart, M J

    2015-08-01

    The value of any new therapeutic strategy or treatment is determined by the magnitude of its clinical benefit balanced against its cost. Evidence for clinical benefit from new treatment options is derived from clinical research, in particular phase III randomised trials, which generate unbiased data regarding the efficacy, benefit and safety of new therapeutic approaches. To date, there is no standard tool for grading the magnitude of clinical benefit of cancer therapies, which may range from trivial (median progression-free survival advantage of only a few weeks) to substantial (improved long-term survival). Indeed, in the absence of a standardised approach for grading the magnitude of clinical benefit, conclusions and recommendations derived from studies are often hotly disputed and very modest incremental advances have often been presented, discussed and promoted as major advances or 'breakthroughs'. Recognising the importance of presenting clear and unbiased statements regarding the magnitude of the clinical benefit from new therapeutic approaches derived from high-quality clinical trials, the European Society for Medical Oncology (ESMO) has developed a validated and reproducible tool to assess the magnitude of clinical benefit for cancer medicines, the ESMO Magnitude of Clinical Benefit Scale (ESMO-MCBS). This tool uses a rational, structured and consistent approach to derive a relative ranking of the magnitude of clinically meaningful benefit that can be expected from a new anti-cancer treatment. The ESMO-MCBS is an important first step to the critical public policy issue of value in cancer care, helping to frame the appropriate use of limited public and personal resources to deliver cost-effective and affordable cancer care. The ESMO-MCBS will be a dynamic tool and its criteria will be revised on a regular basis. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. 
For permissions, please email: journals.permissions@oup.com.

  18. Neo: an object model for handling electrophysiology data in multiple formats

    PubMed Central

    Garcia, Samuel; Guarino, Domenico; Jaillet, Florent; Jennings, Todd; Pröpper, Robert; Rautenberg, Philipp L.; Rodgers, Chris C.; Sobolev, Andrey; Wachtler, Thomas; Yger, Pierre; Davison, Andrew P.

    2014-01-01

    Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named “Neo,” suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology. 
PMID:24600386
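The design principle behind Neo, data representation kept separate from analysis, can be illustrated with a minimal container hierarchy. The class names below mirror Neo's (Block, Segment, AnalogSignal), but this is a self-contained toy sketch, not the real Neo API:

```python
# Toy illustration of a Neo-like object model: containers only *represent*
# recorded signals; analysis lives in separate functions written against the
# model rather than against any particular file format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnalogSignal:                  # a regularly sampled trace
    samples: List[float]
    sampling_rate_hz: float
    units: str = "mV"

@dataclass
class Segment:                       # one recording epoch
    analogsignals: List[AnalogSignal] = field(default_factory=list)

@dataclass
class Block:                         # top-level container, e.g. one session
    segments: List[Segment] = field(default_factory=list)

# Analysis code depends only on the model; IO modules (not shown) would
# populate the same structures from vendor-specific files.
def mean_amplitude(block: Block) -> float:
    values = [s for seg in block.segments
                for sig in seg.analogsignals
                for s in sig.samples]
    return sum(values) / len(values)

blk = Block(segments=[Segment(analogsignals=[AnalogSignal([1.0, 3.0], 1000.0)])])
```

In the real package, reading a vendor file and reading a MATLAB file yield the same Block/Segment structure, which is exactly how downstream tools gain format interoperability for free.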

  19. Neo: an object model for handling electrophysiology data in multiple formats.

    PubMed

    Garcia, Samuel; Guarino, Domenico; Jaillet, Florent; Jennings, Todd; Pröpper, Robert; Rautenberg, Philipp L; Rodgers, Chris C; Sobolev, Andrey; Wachtler, Thomas; Yger, Pierre; Davison, Andrew P

    2014-01-01

    Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named "Neo," suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology.

  20. [Cost-benefit analysis of mental health activities in the workplace].

    PubMed

    Tarumi, Kimio; Hagihara, Akihito

    2013-01-01

    In order to examine the cost-benefit of mental health care activities in the workplace, the total costs of fiscal year 2005, during which the old-type mental health care was conducted, and those of fiscal years 2006 to 2008, during which the new-type mental health care was conducted according to the governmental guidelines of each year, were compared for a workplace of about 3,000 workers. The total cost comprised the sum of the medical fees, the payment compensation for sick absences, and the expenditures for health care activities for workers with mental ill health. The total costs from 2006 to 2008 were not markedly different from those in 2005, and a benefit due to the new-type activity was not shown. However, the following was found: payment compensation for sick absences accounted for 60% of the total cost, and personnel expenses, which were a large part of the expenditure for health care activities, changed considerably over the years because of the age structure of the staff in charge. The results show that a cost-benefit analysis may be a useful tool for examining health care activities in the workplace for its various members, although health care issues usually tend to be solved by specialists.
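The total-cost definition in this abstract is simple enough to state as code. All figures below are invented for illustration, chosen so that absence compensation is about 60% of the total, as the study reports:

```python
# Hedged sketch with invented figures: the abstract defines total cost as
# medical fees + sick-absence compensation + health-care activity expenditure.
def total_cost(medical_fees, absence_compensation, activity_expenditure):
    return medical_fees + absence_compensation + activity_expenditure

# Hypothetical yearly figures (arbitrary currency units); absence pay
# dominates at ~60% of the total, mirroring the study's finding.
cost_2005 = total_cost(20_000, 45_000, 10_000)   # old-type care
cost_2008 = total_cost(19_000, 46_000, 11_000)   # new-type care
net_benefit = cost_2005 - cost_2008              # positive would indicate savings
```

With these illustrative numbers net_benefit is slightly negative, consistent with the abstract's conclusion that the new-type activity showed no cost benefit over the comparison period.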

  1. Shouting in the Jungle - The SETI Transmission Debate

    NASA Astrophysics Data System (ADS)

    Schuch, H. P.; Almar, I.

    The prudence of transmitting deliberate messages from Earth into interstellar space remains controversial. Reasoned risk-benefit analysis is needed to inform policy recommendations by such bodies as the International Academy of Astronautics SETI Permanent Study Group. As a first step, at the 2005 International Astronautical Congress in Fukuoka, we discussed the San Marino Scale, a new analytical tool for assessing transmission risk. That Scale was updated, and a revised version presented at the 2006 IAC in Valencia. We are now in a position to recommend specific improvements to the scale we proposed for quantifying terrestrial transmissions. Our intent is to make this tool better reflect the detectability and potential impact of recent and proposed messages beamed from Earth. We believe the changes proposed herein strengthen the San Marino Scale as an analytical tool, and bring us closer to its eventual adoption.

  2. Clinical utility of gene expression profiling data for clinical decision-making regarding adjuvant therapy in early stage, node-negative breast cancer: a case report.

    PubMed

    Schuster, Steven R; Pockaj, Barbara A; Bothe, Mary R; David, Paru S; Northfelt, Donald W

    2012-09-10

    Breast cancer is the most common malignancy among women in the United States, with the second highest incidence of cancer-related death following lung cancer. The decision-making process regarding adjuvant therapy is a time-intensive dialogue between the patient and her oncologist. There are multiple tools that help individualize the treatment options for a patient. Population-based analysis with Adjuvant! Online and genomic profiling with Oncotype DX are two commonly used tools in patients with early stage, node-negative breast cancer. This case report illustrates a situation in which the population-based prognostic and predictive information differed dramatically from that obtained from genomic profiling and affected the patient's decision. In light of this case, we discuss the benefits and limitations of these tools.

  3. A Robust Kalman Framework with Resampling and Optimal Smoothing

    PubMed Central

    Kautz, Thomas; Eskofier, Bjoern M.

    2015-01-01

    The Kalman filter (KF) is an extremely powerful and versatile tool for signal processing that has been applied extensively in various fields. We introduce a novel Kalman-based analysis procedure that encompasses robustness towards outliers, Kalman smoothing and real-time conversion from non-uniformly sampled inputs to a constant output rate. These features have mostly been treated independently, so that not all of their benefits could be exploited at the same time. Here, we present a coherent analysis procedure that combines the aforementioned features and their benefits. To facilitate utilization of the proposed methodology and to ensure optimal performance, we also introduce a procedure to calculate all necessary parameters. Thereby, we substantially expand the versatility of one of the most widely-used filtering approaches, taking full advantage of its most prevalent extensions. The applicability and superior performance of the proposed methods are demonstrated using simulated and real data. The possible areas of application for the presented analysis procedure range from movement analysis and medical imaging to brain-computer interfaces, robot navigation, and meteorological studies. PMID:25734647
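A scalar sketch of a Kalman filter with one simple robustness mechanism, an innovation gate that skips implausible measurements. This illustrates only the outlier-handling idea, not the full resampling and optimal-smoothing framework of the paper:

```python
import numpy as np

def robust_kalman_1d(z, q=1e-3, r=0.1, gate=9.0):
    """Scalar Kalman filter tracking a slowly varying level (random-walk model).
    Measurements whose squared normalized innovation exceeds `gate` (~3 sigma)
    are skipped as outliers -- one simple route to robustness."""
    x, p = z[0], 1.0                   # initial state and variance
    out = []
    for zk in z:
        p = p + q                      # predict: variance grows by process noise
        s = p + r                      # innovation variance
        nu = zk - x                    # innovation (measurement residual)
        if nu * nu / s <= gate:        # accept only plausible measurements
            k = p / s                  # Kalman gain
            x = x + k * nu             # update state
            p = (1 - k) * p            # update variance
        out.append(x)
    return np.array(out)

# Constant level with noise and one gross outlier at index 50.
rng = np.random.default_rng(3)
z = 1.0 + rng.normal(0.0, 0.1, 100)
z[50] = 10.0                           # outlier the gate should reject
est = robust_kalman_1d(z)
```

A plain Kalman filter would be pulled far off course by the spike at index 50; the gated version ignores it and keeps tracking the true level near 1.0.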

  4. Overview of Automotive Core Tools: Applications and Benefits

    NASA Astrophysics Data System (ADS)

    Doshi, Jigar A.; Desai, Darshak

    2017-08-01

    Continuous improvement of product and process quality is always a challenging and creative task in today's era of globalization. Various quality tools are available and used for this purpose; some of them are successful and a few of them are not. Considering the complexity of the continuous quality improvement (CQI) process, various new techniques have been introduced by industry, as well as proposed by researchers and academia: Lean Manufacturing, Six Sigma, and Lean Six Sigma are some of these techniques. In recent years, new tools have been adopted by industry, especially automotive, called Automotive Core Tools (ACT). The intention of this paper is to review the applications and benefits, along with existing research, on Automotive Core Tools with special emphasis on continuous quality improvement. The methodology uses an extensive review of literature through reputed publications: journals, conference proceedings, research theses, etc. This paper provides an overview of ACT, its enablers and exertions, how it evolved into sophisticated methodologies, and the benefits realised in organisations. It should be of value to practitioners of Automotive Core Tools and to academics who are interested in how CQI can be achieved using ACT. It needs to be stressed here that this paper is not intended to scorn Automotive Core Tools; rather, its purpose is limited to providing a balance on the prevailing positive views toward ACT.

  5. Usefulness of a Regional Health Care Information System in primary care: a case study.

    PubMed

    Maass, Marianne C; Asikainen, Paula; Mäenpää, Tiina; Wanne, Olli; Suominen, Tarja

    2008-08-01

    The goal of this paper is to describe some benefits and possible cost consequences of computer based access to specialised health care information. A before-after activity analysis regarding 20 diabetic patients' clinical appointments was performed in a Health Centre in Satakunta region in Finland. Cost data, an interview, time-and-motion studies, and flow charts based on modelling were applied. Access to up-to-date diagnostic information reduced redundant clinical re-appointments, repeated tests, and mail orders for missing data. Timely access to diagnostic information brought about several benefits regarding workflow, patient care, and disease management. These benefits resulted in theoretical net cost savings. The study results indicated that Regional Information Systems may be useful tools to support performance and improve efficiency. However, further studies are required in order to verify how the monetary savings would impact the performance of Health Care Units.

  6. Benefit-risk Evaluation for Diagnostics: A Framework (BED-FRAME).

    PubMed

    Evans, Scott R; Pennello, Gene; Pantoja-Galicia, Norberto; Jiang, Hongyu; Hujer, Andrea M; Hujer, Kristine M; Manca, Claudia; Hill, Carol; Jacobs, Michael R; Chen, Liang; Patel, Robin; Kreiswirth, Barry N; Bonomo, Robert A

    2016-09-15

    The medical community needs systematic and pragmatic approaches for evaluating the benefit-risk trade-offs of diagnostics that assist in medical decision making. Benefit-Risk Evaluation of Diagnostics: A Framework (BED-FRAME) is a strategy for pragmatic evaluation of diagnostics designed to supplement traditional approaches. BED-FRAME evaluates diagnostic yield and addresses 2 key issues: (1) that diagnostic yield depends on prevalence, and (2) that different diagnostic errors carry different clinical consequences. As such, evaluating and comparing diagnostics depends on prevalence and the relative importance of potential errors. BED-FRAME provides a tool for communicating the expected clinical impact of diagnostic application and the expected trade-offs of diagnostic alternatives. BED-FRAME is a useful fundamental supplement to the standard analysis of diagnostic studies that will aid in clinical decision making. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  7. Evaluation of the Effectiveness of Stormwater Decision Support Tools for Infrastructure Selection and the Barriers to Implementation

    NASA Astrophysics Data System (ADS)

    Spahr, K.; Hogue, T. S.

    2016-12-01

    Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision makers across all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will qualify the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experiences in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of each decision support tool will be performed. Gaps in each tool's analysis, such as the lack of critical functionalities, will be identified, and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.

  8. Developing a mapping tool for tablets

    NASA Astrophysics Data System (ADS)

    Vaughan, Alan; Collins, Nathan; Krus, Mike

    2014-05-01

    Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, e.g. the BGS•SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, a user-led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field-based projects throughout this year and will be integrating feedback into further developments of this technology.

  9. THRSTER: A THRee-STream Ejector Ramjet Analysis and Design Tool

    NASA Technical Reports Server (NTRS)

    Chue, R. S.; Sabean, J.; Tyll, J.; Bakos, R. J.

    2000-01-01

    An engineering tool for analyzing ejectors in rocket-based combined cycle (RBCC) engines has been developed. A key technology for multi-cycle RBCC propulsion systems is the ejector, which functions as the compression stage of the ejector ramjet cycle. The THRee STream Ejector Ramjet analysis tool was developed to analyze the complex aerothermodynamic and combustion processes that occur in this device. The formulated model consists of three quasi-one-dimensional streams, one each for the ejector primary flow, the secondary flow, and the mixed region. The model space-marches through the mixer, combustor, and nozzle to evaluate the solution along the engine. In its present form, the model is intended for an analysis mode in which the diffusion rates of the primary and secondary flows into the mixed stream are stipulated. The model offers the ability to analyze the highly two-dimensional ejector flowfield while still benefiting from the simplicity and speed of an engineering tool. To validate the developed code, wall static pressure measurements from the Penn-State and NASA-ART RBCC experiments were compared with the results generated by the code. The calculated solutions were generally found to be in satisfactory agreement with the pressure measurements along the engines, although further modeling effort may be required when a strong shock train is formed at the rocket exhaust. The range of parameters in which the code generates valid results is presented and discussed.

  10. THRSTER: A Three-Stream Ejector Ramjet Analysis and Design Tool

    NASA Technical Reports Server (NTRS)

    Chue, R. S.; Sabean, J.; Tyll, J.; Bakos, R. J.; Komar, D. R. (Technical Monitor)

    2000-01-01

    An engineering tool for analyzing ejectors in rocket-based combined cycle (RBCC) engines has been developed. A key technology for multi-cycle RBCC propulsion systems is the ejector, which functions as the compression stage of the ejector ramjet cycle. The THRee STream Ejector Ramjet analysis tool was developed to analyze the complex aerothermodynamic and combustion processes that occur in this device. The formulated model consists of three quasi-one-dimensional streams, one each for the ejector primary flow, the secondary flow, and the mixed region. The model space-marches through the mixer, combustor, and nozzle to evaluate the solution along the engine. In its present form, the model is intended for an analysis mode in which the diffusion rates of the primary and secondary flows into the mixed stream are stipulated. The model offers the ability to analyze the highly two-dimensional ejector flowfield while still benefiting from the simplicity and speed of an engineering tool. To validate the developed code, wall static pressure measurements from the Penn-State and NASA-ART RBCC experiments were compared with the results generated by the code. The calculated solutions were generally found to be in satisfactory agreement with the pressure measurements along the engines, although further modeling effort may be required when a strong shock train is formed at the rocket exhaust. The range of parameters in which the code generates valid results is presented and discussed.

  11. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently, a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
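    The convergence-testing idea for the second uncertainty source can be sketched as follows: tabulate a dose-vs-depth curve on successively finer thickness grids, linearly interpolate between grid points, and refine until the maximum interpolation error falls below a tolerance. The exponential attenuation curve and all parameter values here are stand-ins for illustration, not the transport model used in the paper:

```python
import math

def dose(depth, mu=0.5):
    """Toy dose-vs-depth curve: simple exponential attenuation (assumed)."""
    return math.exp(-mu * depth)

def max_interp_error(n_points, d_max=10.0, n_check=1000):
    """Max error of piecewise-linear interpolation on an n_points grid."""
    h = d_max / (n_points - 1)
    xs = [i * h for i in range(n_points)]
    ys = [dose(x) for x in xs]
    err = 0.0
    for j in range(n_check + 1):
        x = j * d_max / n_check
        i = min(int(x / h), n_points - 2)          # bracketing interval
        t = (x - xs[i]) / (xs[i + 1] - xs[i])
        approx = (1 - t) * ys[i] + t * ys[i + 1]
        err = max(err, abs(approx - dose(x)))
    return err

def points_needed(tol=1e-3, start=5):
    """Refine the grid (doubling) until the interpolation error converges."""
    n = start
    while max_interp_error(n) > tol:
        n *= 2
    return n
```

    For the first uncertainty source (ray count N), the analogous trade-off is that statistical uncertainty falls roughly as 1/sqrt(N) while cost grows linearly in N, which is what makes a cost-benefit optimum possible.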

  12. Teachers' Views on Digital Educational Tools in English Language Learning: Benefits and Challenges in the Turkish Context

    ERIC Educational Resources Information Center

    Çelik, Servet; Aytin, Kübra

    2014-01-01

    Despite the clear benefits provided by digital educational tools, Turkish teachers of English as a foreign language (EFL) are often seen as failing to take advantage of computing technologies in the classroom. Deficiencies in terms of teachers' digital literacies are often faulted for this omission. The majority of studies concerning Turkish EFL…

  13. CVal: a spreadsheet tool to evaluate the direct benefits and costs of carbon sequestration contracts for managed forests

    Treesearch

    E.M. Bilek; Peter Becker; Tim. McAbee

    2009-01-01

    This documentation is meant to accompany CVal, a downloadable spreadsheet tool. CVal was constructed for foresters, other land management advisors, landowners, and carbon credit aggregators to evaluate the direct benefits and costs of entering into contracts for carbon sequestered in managed forests and forest plantations. CVal was designed to evaluate Exchange...

  14. Peer-Assisted Learning in the Athletic Training Clinical Setting

    PubMed Central

    Henning, Jolene M; Weidner, Thomas G; Jones, James

    2006-01-01

    Context: Athletic training educators often anecdotally suggest that athletic training students enhance their learning by teaching their peers. However, peer-assisted learning (PAL) has not been examined within athletic training education in order to provide evidence for its current use or as a pedagogic tool. Objective: To describe the prevalence of PAL in athletic training clinical education and to identify students' perceptions of PAL. Design: Descriptive. Setting: “The Athletic Training Student Seminar” at the National Athletic Trainers' Association 2002 Annual Meeting and Clinical Symposia. Patients or Other Participants: A convenience sample of 138 entry-level male and female athletic training students. Main Outcome Measure(s): Students' perceptions regarding the prevalence and benefits of and preferences for PAL were measured using the Athletic Training Peer-Assisted Learning Assessment Survey. The Survey is a self-report tool with 4 items regarding the prevalence of PAL and 7 items regarding perceived benefits and preferences. Results: A total of 66% of participants practiced a moderate to large amount of their clinical skills with other athletic training students. Sixty percent of students reported feeling less anxious when performing clinical skills on patients in front of other athletic training students than in front of their clinical instructors. Chi-square analysis revealed that 91% of students enrolled in Commission on Accreditation of Allied Health Education Programs–accredited athletic training education programs learned a minimal to small amount of clinical skills from their peers compared with 65% of students in Joint Review Committee on Educational Programs in Athletic Training–candidacy schools (χ²(3) = 14.57, P < .01). Multiple analysis of variance revealed significant interactions between sex and academic level on several items regarding benefits and preferences. 
Conclusions: According to athletic training students, PAL is occurring in the athletic training clinical setting. Entry-level students are utilizing their peers as resources for practicing clinical skills and report benefiting from the collaboration. Educators should consider deliberately integrating PAL into athletic training education programs to enhance student learning and collaboration. PMID:16619102

  15. Habitat Equivalency Analysis: A Potential Tool for Estimating Environmental Benefits

    DTIC Science & Technology

    2008-01-01

    of animal tissue produced per year. It is assumed that the dredged areas will be permanently taken out of production and recovery time for the...from the mine seriously contaminated 40 km (25 miles) of nearby Panther Creek, a tributary of the Salmon River. Water quality, benthic fauna, and...restore the biological health of Panther Creek. Trustees responsible for monitoring the cleanup determined that the abundance of Chinook salmon was

  16. The use of COSMIC NASTRAN in an integrated conceptual design environment

    NASA Technical Reports Server (NTRS)

    White, Gil

    1989-01-01

    Changes in both software and hardware are rapidly bringing conceptual engineering tools like finite element analysis into mainstream mechanical design. Systems that integrate all phases of the manufacturing process provide the most cost benefits. The application of programming concepts such as object-oriented programming allows for the encapsulation of intelligent data within the design geometry. This, combined with the declining cost of per-seat hardware, brings new alternatives to the user.

  17. REopt Improves the Operations of Alcatraz's Solar PV-Battery-Diesel Hybrid System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olis, Daniel R; Walker, H. A; Van Geet, Otto D

    This poster identifies operations improvement strategies for a photovoltaic (PV)-battery-diesel hybrid system at the National Park Service's Alcatraz Island using NREL's REopt analysis tool. The current 'cycle charging' strategy results in significant curtailment of energy production from the PV array and requires excessive diesel use, while also incurring high wear on the batteries without the benefit of improved efficiency. A simple 'load following' strategy results in near-optimal operating-cost reduction.
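    A minimal sketch of the 'load following' idea described above, on a toy PV-battery-diesel system: PV serves the load first, the battery charges only from PV that would otherwise be curtailed, and diesel covers only the remaining deficit. All numbers (load profile, PV profile, battery size, efficiency) are invented for illustration; REopt's actual dispatch optimization is far richer:

```python
def load_following(load, pv, cap=50.0, soc=25.0, eff=0.9):
    """Serve load from PV first, then battery, then diesel; charge the
    battery only from PV surplus. Returns (diesel kWh, curtailed kWh)."""
    diesel = curtailed = 0.0
    for l, p in zip(load, pv):
        if p >= l:
            surplus = p - l
            # charge limited by battery headroom (accounting for losses)
            charge = min(surplus, (cap - soc) / eff)
            soc += charge * eff
            curtailed += surplus - charge
        else:
            deficit = l - p
            discharge = min(deficit, soc)
            soc -= discharge
            diesel += deficit - discharge   # diesel covers what remains
    return diesel, curtailed

# 24 hourly steps: flat 10 kW load, midday PV peak (illustrative only)
load = [10.0] * 24
pv = [0] * 6 + [5, 15, 25, 30, 30, 30, 30, 25, 15, 5] + [0] * 8
diesel_kwh, curtailed_kwh = load_following(load, pv)
```

    Under a cycle-charging strategy the generator would also charge the battery while running, which is what drives the extra diesel use and battery wear noted in the poster.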

  18. Characterization of Tactical Departure Scheduling in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Capps, Alan; Engelland, Shawn A.

    2011-01-01

    This paper discusses and analyzes current-day utilization and performance of the tactical departure scheduling process in the National Airspace System (NAS) to understand the benefits of improving this process. The analysis used operational air traffic data from over 1,082,000 flights during January 2011. Specific metrics included the frequency of tactical departure scheduling, site-specific variances in the technology's utilization, departure time prediction compliance used in the tactical scheduling process, and the accuracy with which the current system can predict the airborne slot into which aircraft are scheduled from the airport surface. Operational data analysis described in this paper indicates that significant room for improvement exists in the current system, primarily in the area of reduced departure time prediction uncertainty. Results indicate that a significant number of tactically scheduled aircraft did not meet their scheduled departure slot because of departure time uncertainty. In addition to missed slots, the operational data analysis identified increased controller workload associated with tactical departures that were subject to traffic management manual re-scheduling or controller swaps. An analysis of achievable levels of departure time prediction accuracy obtained by a new integrated surface and tactical scheduling tool is provided to assess the benefit it may offer as a solution to the identified shortfalls. A list of NAS facilities likely to receive the greatest benefit from the integrated surface and tactical scheduling technology is provided.

  19. Analysis of laser therapy and assessment methods in the rehabilitation of temporomandibular disorder: a systematic review of the literature

    PubMed Central

    Herpich, Carolina Marciela; Amaral, Ana Paula; Leal-Junior, Ernesto Cesar Pinto; Tosato, Juliana de Paiva; Gomes, Cid Andre Fidelis de Paula; Arruda, Éric Edmur Camargo; Glória, Igor Phillip dos Santos; Garcia, Marilia Barbosa Santos; Barbosa, Bruno Roberto Borges; Rodrigues, Monique Sampaio; Silva, Katiane Lima; El Hage, Yasmin; Politti, Fabiano; Gonzalez, Tabajara de Oliveira; Bussadori, Sandra Kalil; Biasotto-Gonzalez, Daniela Aparecida

    2015-01-01

    The aim of the present study was to perform a systematic review of the literature on the effects of low-level laser therapy in the treatment of TMD, and to analyze the use of different assessment tools. [Subjects and Methods] Searches were carried out of the BIREME, MEDLINE, PubMed and SciELO electronic databases by two independent researchers for papers published in English and Portuguese using the terms: “temporomandibular joint laser therapy” and “TMJ laser treatment”. [Results] Following the application of the eligibility criteria, 11 papers were selected for in-depth analysis. The papers analyzed exhibited considerable methodological differences, especially with regard to the number of sessions, anatomic site and duration of low-level laser therapy irradiation, as well as irradiation parameters, diagnostic criteria and assessment tools. [Conclusion] Further studies are needed, especially randomized clinical trials, to establish the exact dose and ideal parameters for low-level laser therapy and define the best assessment tools in this promising field of research that may benefit individuals with signs and symptoms of TMD. PMID:25642095

  20. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low-cost avionics; cost estimation and benefits; computer-aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.

  1. Reliability and validity of Champion's Health Belief Model Scale for breast cancer screening among Malaysian women.

    PubMed

    Parsa, P; Kandiah, M; Mohd Nasir, M T; Hejar, A R; Nor Afiah, M Z

    2008-11-01

    Breast cancer is the leading cause of cancer deaths in Malaysian women, and the use of breast self-examination (BSE), clinical breast examination (CBE) and mammography remain low in Malaysia. Therefore, there is a need to develop a valid and reliable tool to measure the beliefs that influence breast cancer screening practices. The Champion's Health Belief Model Scale (CHBMS) is a valid and reliable tool to measure beliefs about breast cancer and screening methods in the Western culture. The purpose of this study was to translate the use of CHBMS into the Malaysian context and validate the scale among Malaysian women. A random sample of 425 women teachers was taken from 24 secondary schools in Selangor state, Malaysia. The CHBMS was translated into the Malay language, validated by an expert's panel, back translated, and pretested. Analyses included descriptive statistics of all the study variables, reliability estimates, and construct validity using factor analysis. The mean age of the respondents was 37.2 (standard deviation 7.1) years. Factor analysis yielded ten factors for BSE with eigenvalue greater than 1 (four factors more than the original): confidence 1 (ability to differentiate normal and abnormal changes in the breasts), barriers to BSE, susceptibility for breast cancer, benefits of BSE, health motivation 1 (general health), seriousness 1 (fear of breast cancer), confidence 2 (ability to detect size of lumps), seriousness 2 (fear of long-term effects of breast cancer), health motivation 2 (preventive health practice), and confidence 3 (ability to perform BSE correctly). For CBE and mammography scales, seven factors each were identified. Factors for CBE scale include susceptibility, health motivation 1, benefits of CBE, seriousness 1, barriers of CBE, seriousness 2 and health motivation 2. 
For mammography, the scale includes benefits of mammography, susceptibility, health motivation 1, seriousness 1, barriers to mammography, seriousness 2, and health motivation 2. Cronbach's alpha reliability coefficients ranged from 0.774 to 0.939 for the subscales. The translated version of the CHBMS was found to be a valid and reliable tool for use with Malaysian women. It can be used easily to evaluate health beliefs about breast cancer, BSE, CBE, and mammography and for planning interventions. For greater applicability, it is recommended that this tool be tested among ethnically diverse populations.
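    The reliability statistic reported above, Cronbach's alpha, is computed from respondent-by-item scores as α = k/(k−1) · (1 − Σ item variances / total-score variance). A minimal sketch with made-up Likert ratings (not data from this study):

```python
def cronbach_alpha(scores):
    """scores: list of respondents, each a list of item ratings."""
    k = len(scores[0])          # number of items in the subscale

    def var(xs):                # sample variance (n-1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Five respondents rating a hypothetical 4-item subscale on a 1-5 scale
ratings = [
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]
alpha = cronbach_alpha(ratings)
```

    Values near 1 indicate that items in the subscale covary strongly, which is the sense in which the 0.774-0.939 range above supports reliability.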

  2. [Personnel marketing in anesthesiology. Perception, use and evaluation by the target group].

    PubMed

    Berlet, T

    2015-09-01

    The human resources situation in the healthcare system is characterized by a manpower shortage. Recruiting medical staff is of great importance for hospitals, particularly in anesthesiology. Approaching and recruiting staff usually happens through external personnel marketing (PM); however, until now the efficacy of these PM measures has barely been empirically investigated. The goal of this empirical study was to examine how familiar hospital physicians at varying career levels are with the different tools employed in external PM, how frequently they use these tools, and how they rate their benefits. Based on this information, the preferences of medical staff regarding the design of the hospital physician workplace, as well as factors contributing to a hospital's attractiveness as an employer, were evaluated. Another aim was to derive recommendations on how to optimize the marketing instruments used for external PM in the healthcare system. In an internet-based survey, 154 female and male physicians were questioned about their knowledge, use, and benefit assessment of a total of 43 PM tools. Conventional methods of addressing applicants were commonly used but ranked behind more personal and direct targeting tools in terms of benefit assessments. Internet-based tools with a conceptual affinity to conventional methods were also highly rated in terms of benefits. In contrast, unconventional methods of addressing applicants were hardly known and were not viewed as useful. The PM tools from the field of "overall conditions for cooperation in the company" mainly received high to very high benefit assessments. These referred primarily to non-monetary factors, human resource development measures, and also to aspects of remuneration. 
Image-promoting PM tools were rarely assessed as being useful, with the exception of measures aimed at creating personal contact between the hospital or unit/department and applicants or those allowing personal insight into the department's range of activities. The correlation between familiarity with PM tools and positive benefit assessments was low. The results of this PM study can contribute to the development of an effective and efficient conception of PM measures by hospitals. Addressing applicants through traditional job and internet-based advertisements should be supplemented by personalized targeting of potential applicants and innovative targeting instruments must be systematically developed and promoted in order to become effective. It remains questionable whether the target group of external PM can be reached with exclusively or even predominantly image-cultivating measures on behalf of the healthcare company, such as image campaigns. The most effective PM tools create good working conditions and develop factors contributing to the employer's attractiveness. These two tools should be given priority in human resource development also taking material incentives into consideration, all of which support the effective set-up of an employer branding.

  3. Value Tools in Managed Care Decision Making: Current Hurdles and Future Opportunities.

    PubMed

    Schafer, Jeremy; Galante, Dominic; Shafrin, Jason

    2017-06-01

    Organizations such as the National Comprehensive Cancer Network, American Society of Clinical Oncology, Institute for Clinical and Economic Review, and Memorial Sloan Kettering have created distinct tools to help different stakeholders assess the value of oncology treatments. However, the oncology value tools were not necessarily created for payers, and it is unclear whether payers are using these tools as part of their drug management process. To understand what value tools payers are using in oncology management and what benefits and shortcomings the tools may have from the payer perspective. A survey targeting drug coverage decision makers at health plans was conducted in August 2016. Respondents attesting to using 2 or more value tools in drug management were eligible for an additional in-depth interview to understand the respondents' perceived benefits and shortcomings of current value tools. Respondents also were asked to describe desired attributes of a hypothetical payer-centric value tool. A total of 28 respondents representing approximately 160 million commercially insured medical lives completed the survey. Twenty respondents (71%) reported using at least 1 value tool in their drug management process. Twelve respondents (43%) used at least 2 tools, and 4 respondents (14%) used at least 3 tools. A total of 6 respondents were selected for in-depth interviews. Interviewees praised value tools for advancing the discussion on drug value and incorporating clinical evidence. However, interviewees felt available value tools varied on providing firm recommendations and relevant price benchmarks. Respondents most commonly recommended the following attributes of a proposed payer-centric value framework: taking a firm position on product value; product comparisons in lieu of comparative clinical trials; web-based tool access; and tool updates at least quarterly. 
Interview respondents also expressed some support for allowing manipulation of inputs and inclusion of quality-of-life and patient-reported outcome data. Although nearly half of payers surveyed use 2 or more value tools in the drug management process, payers identified a number of areas where the tools could be revised to increase their utility to payers. No outside funding or assistance of any kind was used for this research or in manuscript preparation. Schafer and Galante are employed by Precision for Value, a payer ad marketing agency that works exclusively with life science companies. Shafrin is employed by Precision Health Economics, a consulting company to insurance and life science industries. Shafer, along with Galante and Shafrin, contributed to study design, data collection, and manuscript preparation. The authors contributed equally to data analysis and interpretation and manuscript revision.

  4. Investigation on Effect of Material Hardness in High Speed CNC End Milling Process.

    PubMed

    Dhandapani, N V; Thangarasu, V S; Sureshkannan, G

    2015-01-01

    This research paper analyzes the effects of material properties on surface roughness, material removal rate, and tool wear in high speed CNC end milling of various ferrous and nonferrous materials. The challenge of making material-specific decisions on the process parameters of spindle speed, feed rate, depth of cut, coolant flow rate, cutting tool material, and type of coating for the cutting tool, for the required quality and quantity of production, is addressed. Generally, decisions made by the operator on the floor are based on values suggested by the tool manufacturer or on trial and error. This paper describes the effect of various parameters on the surface roughness characteristics of the precision-machined part. The suggested prediction method is based on experimental analyses of parameters in different compositions of input conditions, which would benefit industry by standardizing high speed CNC end milling processes. The results show a basis for selecting parameters to obtain better surface roughness values, as predicted by the case study results.

  5. Investigation on Effect of Material Hardness in High Speed CNC End Milling Process

    PubMed Central

    Dhandapani, N. V.; Thangarasu, V. S.; Sureshkannan, G.

    2015-01-01

    This research paper analyzes the effects of material properties on surface roughness, material removal rate, and tool wear in high speed CNC end milling of various ferrous and nonferrous materials. The challenge of making material-specific decisions on the process parameters of spindle speed, feed rate, depth of cut, coolant flow rate, cutting tool material, and type of coating for the cutting tool, for the required quality and quantity of production, is addressed. Generally, decisions made by the operator on the floor are based on values suggested by the tool manufacturer or on trial and error. This paper describes the effect of various parameters on the surface roughness characteristics of the precision-machined part. The suggested prediction method is based on experimental analyses of parameters in different compositions of input conditions, which would benefit industry by standardizing high speed CNC end milling processes. The results show a basis for selecting parameters to obtain better surface roughness values, as predicted by the case study results. PMID:26881267

  6. Measuring the costs and benefits of conservation programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Einhorn, M.A.

    1985-07-25

    A step-by-step analysis of the effects of utility-sponsored conservation-promoting programs begins by identifying several factors which will reduce a program's effectiveness. The framework for measuring cost savings and designing a conservation program needs to consider the size of appliance subsidies, what form incentives should take, and how customer behavior will change as a result of incentives. Continual reevaluation is necessary to determine whether to change the size of rebates or whether to continue the program. Analytical tools for making these determinations are improving as conceptual breakthroughs in econometrics permit more rigorous analysis. 5 figures.

  7. Integration of Molecular Dynamics Based Predictions into the Optimization of De Novo Protein Designs: Limitations and Benefits.

    PubMed

    Carvalho, Henrique F; Barbosa, Arménio J M; Roque, Ana C A; Iranzo, Olga; Branco, Ricardo J F

    2017-01-01

    Recent advances in de novo protein design have gained considerable insight from the intrinsic dynamics of proteins, through the integration of molecular dynamics simulation protocols into the state-of-the-art de novo protein design protocols in use today. With this protocol we illustrate how to set up and run a molecular dynamics simulation followed by a functional protein dynamics analysis. New users will be introduced to some useful open-source computational tools, including the GROMACS molecular dynamics simulation software package and ProDy for protein structural dynamics analysis.

  8. Research Costs Investigated: A Study Into the Budgets of Dutch Publicly Funded Drug-Related Research.

    PubMed

    van Asselt, Thea; Ramaekers, Bram; Corro Ramos, Isaac; Joore, Manuela; Al, Maiwenn; Lesman-Leegte, Ivonne; Postma, Maarten; Vemer, Pepijn; Feenstra, Talitha

    2018-01-01

    The costs of performing research are an important input in value of information (VOI) analyses but are difficult to assess. The aim of this study was to investigate the costs of research, serving two purposes: (1) estimating research costs for use in VOI analyses; and (2) developing a costing tool to support reviewers of grant proposals in assessing whether the proposed budget is realistic. For granted study proposals from the Netherlands Organization for Health Research and Development (ZonMw), the type of study, potential cost drivers, proposed budget, and general characteristics were extracted. Regression analysis was conducted in an attempt to generate a 'predicted budget' for certain combinations of cost drivers, for implementation in the costing tool. Of 133 drug-related research grant proposals, 74 were included for complete data extraction. Because an association between cost drivers and budgets was not confirmed, we could not generate a predicted budget based on regression analysis, but only historic reference budgets given certain study characteristics. The costing tool was designed accordingly, i.e. given a set of selection criteria, the tool returns the range of budgets in comparable studies. This range can be used in VOI analysis to estimate whether the expected net benefit of sampling will be positive, and hence to decide upon the net value of future research. The absence of an association between study characteristics and budgets may indicate inconsistencies in the budgeting or granting process. Nonetheless, the tool generates useful information on historical budgets, and the option to formally relate VOI to budgets. To our knowledge, this is the first attempt at creating such a tool, which can be complemented with new studies being granted, enlarging the underlying database and keeping estimates up to date.
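
    The core lookup behavior the abstract describes — given selection criteria, return the range of budgets observed in comparable historic studies — can be sketched as follows. The study records, field names, and criteria below are hypothetical illustrations, not drawn from the ZonMw data.

```python
# Sketch of a costing-tool lookup: filter historic study budgets by
# selection criteria and report the observed budget range.
# Records and field names are illustrative, not the actual ZonMw data.

def budget_range(studies, **criteria):
    """Return (min, max) of budgets among studies matching all criteria,
    or None if no comparable study exists."""
    matches = [s["budget"] for s in studies
               if all(s.get(k) == v for k, v in criteria.items())]
    if not matches:
        return None
    return (min(matches), max(matches))

# Hypothetical historic reference budgets (in euros).
historic = [
    {"type": "RCT", "multicenter": True,  "budget": 450_000},
    {"type": "RCT", "multicenter": False, "budget": 280_000},
    {"type": "RCT", "multicenter": True,  "budget": 610_000},
    {"type": "observational", "multicenter": False, "budget": 120_000},
]

print(budget_range(historic, type="RCT", multicenter=True))  # (450000, 610000)
```

    A reviewer could then check a proposed budget against this historic range, which is exactly the "is the budget realistic" question the tool supports.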

  9. "Her Story Was Complex": A Twine Workshop for Ten- to Twelve-Year-Old Girls

    ERIC Educational Resources Information Center

    Tran, Kelly M.

    2016-01-01

    In this study, I discuss the need to increase girls' involvement with game design due to the numerous benefits that engaging in this practice might have. In particular, I discuss the tool Twine, an accessible and relatively easy-to-use platform for creating text-based games. I provide an overview of the tool and its potential benefits for…

  10. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.

  11. A cost-efficiency and health benefit approach to improve urban air quality.

    PubMed

    Miranda, A I; Ferreira, J; Silveira, C; Relvas, H; Duque, L; Roebeling, P; Lopes, M; Costa, S; Monteiro, A; Gama, C; Sá, E; Borrego, C; Teixeira, J P

    2016-11-01

    When ambient air quality standards established in the EU Directive 2008/50/EC are exceeded, Member States are obliged to develop and implement Air Quality Plans (AQP) to improve air quality and health. Notwithstanding the achievements in emission reductions and air quality improvement, additional efforts need to be undertaken to improve air quality in a sustainable way - i.e. through a cost-efficiency approach. This work was developed in the scope of the recently concluded MAPLIA project "Moving from Air Pollution to Local Integrated Assessment", and focuses on the definition and assessment of emission abatement measures and their associated costs, air quality and health impacts and benefits by means of air quality modelling tools, health impact functions and cost-efficiency analysis. The MAPLIA system was applied to the Grande Porto urban area (Portugal), addressing PM10 and NOx as the most important pollutants in the region. Four different measures to reduce PM10 and NOx emissions were defined and characterized in terms of emissions and implementation costs, and combined into 15 emission scenarios, simulated by the TAPM air quality modelling tool. Air pollutant concentration fields were then used to estimate health benefits in terms of avoided costs (external costs), using dose-response health impact functions. Results revealed that, among the 15 scenarios analysed, the scenario including all 4 measures leads to a total net benefit of 0.3 M€·y(-1). The largest net benefit is obtained for the scenario considering the conversion of 50% of open fireplaces into heat recovery wood stoves. Although the implementation costs of this measure are high, the benefits outweigh the costs. Research outcomes confirm that the MAPLIA system is useful for policy decision support on air quality improvement strategies, and could be applied to other urban areas where AQP need to be implemented and monitored. Copyright © 2016. Published by Elsevier B.V.
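
    The scenario comparison underlying this kind of cost-efficiency analysis — implementation costs set against avoided external (health) costs, ranked by net benefit — can be sketched as below. The scenario names and figures are invented placeholders, not the MAPLIA study's actual values.

```python
# Sketch of a cost-benefit scenario ranking: net benefit = avoided
# external (health) costs minus implementation costs, per year.
# Scenario names and figures are illustrative, not the MAPLIA results.

scenarios = {
    "wood_stove_conversion": {"cost": 2.1, "avoided_external": 2.6},  # M EUR/yr
    "low_emission_buses":    {"cost": 0.8, "avoided_external": 0.7},
    "industrial_filters":    {"cost": 1.5, "avoided_external": 1.9},
}

def net_benefit(s):
    return s["avoided_external"] - s["cost"]

ranked = sorted(scenarios.items(), key=lambda kv: net_benefit(kv[1]), reverse=True)
for name, s in ranked:
    print(f"{name}: net benefit {net_benefit(s):+.1f} M EUR/yr")
```

    A ranking like this makes the study's headline point visible at a glance: a high-cost measure can still top the list when the benefits it produces outweigh its costs.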

  12. Making Scientific Data Usable and Useful

    NASA Astrophysics Data System (ADS)

    Satwicz, T.; Bharadwaj, A.; Evans, J.; Dirks, J.; Clark Cole, K.

    2017-12-01

    Transforming geological data into information that has broad scientific and societal impact is a process fraught with barriers. Data sets and tools are often reported to have poor user experiences (UX) that make scientific work more challenging than it needs to be. While many other technical fields have benefited from ongoing improvements to the UX of their tools (e.g., healthcare and financial services), scientists are faced with using tools that are labor intensive and not intuitive. Our research team has been involved in a multi-year effort to understand and improve the UX of scientific tools and data sets. We use a User-Centered Design (UCD) process that involves naturalistic behavioral observation and other qualitative research methods adopted from Human-Computer Interaction (HCI) and related fields. Behavioral observation involves having users complete common tasks on data sets, tools, and websites to identify usability issues and understand the severity of the issues. We measure how successfully they complete tasks and diagnose the cause of any failures. Behavioral observation is paired with in-depth interviews where users describe their process for generating results (from initial inquiry to final results). By asking detailed questions we unpack common patterns and challenges scientists experience while working with data. We've found that tools built using the UCD process can have a large impact on scientists' workflows and greatly reduce the time it takes to process data before analysis. It is often challenging to understand the organization and nuances of data across scientific fields. By better understanding how scientists work we can create tools that make routine tasks less labor-intensive, data easier to find, and solve common issues with discovering new data sets and engaging in interdisciplinary research.
There is a tremendous opportunity for advancing scientific knowledge and helping the public benefit from that work by creating intuitive, interactive, and powerful tools and resources for generating knowledge. The pathway to achieving that is through building a detailed understanding of users and their needs, then using this knowledge to inform the design of the data products, tools, and services scientists and non-scientists use to do their work.

  13. National Water Quality Benefits

    EPA Science Inventory

    This project will provide the basis for advancing the goal of producing tools in support of quantifying and valuing changes in water quality for EPA regulations. It will also identify specific data and modeling gaps and Improve benefits estimation for more complete benefit-cost a...

  14. Cryogenic Boil-Off Reduction System

    NASA Astrophysics Data System (ADS)

    Plachta, David W.; Guzik, Monica C.

    2014-03-01

    A computational model of the cryogenic boil-off reduction system being developed by NASA as part of the Cryogenic Propellant Storage and Transfer technology maturation project has been applied to a range of propellant storage tank sizes for high-performing in-space cryogenic propulsion applications. This effort focuses on the scaling of multi-layer insulation (MLI), cryocoolers, broad area cooling shields, radiators, solar arrays, and tanks for liquid hydrogen storage, with tank diameters ranging from 2 to 10 m. Component scaling equations were incorporated into the Cryogenic Analysis Tool, a spreadsheet-based tool used to perform system-level parametric studies. The primary addition in this updated tool is the integration of a scaling method for reverse turbo-Brayton cycle cryocoolers, as well as the development and inclusion of Self-Supporting Multi-Layer Insulation. Mass, power, and sizing relationships are traded parametrically to establish the loiter period beyond which this boil-off reduction system reduces mass. The projected benefit compares passive thermal control to active thermal control, where active thermal control is evaluated for reduced boil-off with a 90 K shield, zero boil-off with a single heat interception stage at the tank wall, and zero boil-off with a second interception stage at a 90 K shield. Parametric studies show a benefit over passive storage at loiter durations under one month, in addition to showing a benefit for two-stage zero boil-off in terms of reducing power and mass as compared to single-stage zero boil-off. Furthermore, active cooling reduces the effect of varied multi-layer insulation performance, which, historically, has been shown to be significant.
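
    The loiter-duration trade described above — cumulative passive boil-off mass growing with time versus the fixed added mass of an active-cooling system — reduces, in its simplest form, to a crossover calculation. The boil-off rate and hardware mass below are made-up placeholders, not outputs of the Cryogenic Analysis Tool.

```python
# Sketch of the passive-vs-active trade: passive storage loses propellant
# at a steady boil-off rate, while (ideal) active cooling adds fixed
# hardware mass but eliminates boil-off. The crossover loiter duration is
# where cumulative boil-off mass equals the active system's added mass.
# All numbers are illustrative placeholders, not CAT outputs.

def crossover_days(boiloff_rate_kg_per_day, active_system_mass_kg):
    """Loiter duration beyond which active cooling saves total mass."""
    return active_system_mass_kg / boiloff_rate_kg_per_day

# e.g. 6 kg/day passive boil-off vs 180 kg of cryocooler/shield/array mass
print(crossover_days(6.0, 180.0))  # 30.0 (days)
```

    A crossover on the order of a month is consistent with the abstract's finding that the active system pays off for loiter durations beyond roughly that timescale.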

  15. Analysis of the Capacity Potential of Current Day and Novel Configurations for New York's John F. Kennedy Airport

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Tamburro, Ralph; Lee, Paul

    2016-01-01

    In 2015, a series of systems analysis studies were conducted on John F. Kennedy Airport in New York (NY) in a collaborative effort between NASA and the Port Authority of New York and New Jersey (PANYNJ). This work was performed to build a deeper understanding of NY airspace and operations, to determine the improvements possible through operational changes with tools currently available, and to identify where new technology is required for additional improvement. The analysis was conducted using tool-based mathematical analyses, video inspection and evaluation of recorded arrival/departure/surface traffic captured by the Aerobahn tool (used by Kennedy Airport for surface metering), and aural data archives available publicly through the web to inform the video segments. A discussion of the impacts of trajectory and operational choices on capacity is presented, including runway configuration and usage (parallel, converging, crossing, shared, independent, staggered), arrival and departure route characteristics (fix sharing, merges, splits), and how compression of traffic is staged. The authorization in March of 2015 for New York to use reduced spacing under the Federal Aviation Administration (FAA) Wake Turbulence Recategorization (RECAT) also offers significant capacity benefit for New York airports when fully transitioned to the new spacing requirements, and the impact of these changes for New York is discussed. Arrival and departure capacity results are presented for each of the current-day Kennedy Airport configurations. While the tools allow many variations of user-selected conditions, the analysis for these studies used arrival priority, no winds, an additional safety buffer of 5% over the required minimum spacing, and a traffic mix typical for Kennedy. Two additional "novel" configurations were evaluated. 
These configurations are of interest to Port Authority and to their airline customers, and are believed to offer near-term capacity benefit with minimal operational and equipage changes. One of these is the addition of an Optimized Profile Descent (OPD) route to runways 22L and 22R, and the other is the simultaneous use of 4 runways, which is not currently done at Kennedy. The background and configuration for each of these is described, and the capacity results are presented along with a discussion of drawbacks and enablers for each.

  16. A decision support tool to determine cost-to-benefit of a family-centered in-home program for at-risk adolescents.

    PubMed

    Wilson, Fernando A; Araz, Ozgur M; Thompson, Ronald W; Ringle, Jay L; Mason, W Alex; Stimpson, Jim P

    2016-06-01

    Family-centered program research has demonstrated its effectiveness in improving adolescent outcomes. However, given current fiscal constraints faced by governmental agencies, a recent report from the Institute of Medicine and National Research Council highlighted the need for cost-benefit analyses to inform decision making by policymakers. Furthermore, performance management tools such as balanced scorecards and dashboards do not generally include cost-benefit analyses. In this paper, we describe the development of an Excel-based decision support tool that can be used to evaluate a selected family-based program for at-risk children and adolescents relative to a comparison program or the status quo. This tool incorporates the use of an efficient, user-friendly interface with results provided in concise tabular and graphical formats that may be interpreted without need for substantial training in economic evaluation. To illustrate, we present an application of this tool to evaluate use of Boys Town's In-Home Family Services (IHFS) relative to detention and out-of-home placement in New York City. Use of the decision support tool can help mitigate the need for programs to contract experts in economic evaluation, especially when there are financial or time constraints. Copyright © 2016 Elsevier Ltd. All rights reserved.
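
    A cost-to-benefit comparison of the kind this decision support tool automates — program costs set against monetized outcome benefits, discounted over time — can be sketched as below. The dollar figures, time horizon, and discount rate are hypothetical, not values from the Boys Town IHFS evaluation or the tool itself.

```python
# Sketch of a program-vs-comparison cost-benefit summary with discounting.
# All figures (costs, annual benefits, horizon, rate) are hypothetical.

def npv(annual_amount, years, rate):
    """Present value of a constant annual amount over `years` at `rate`."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

def evaluate(cost, annual_benefit, years=5, rate=0.03):
    pv_benefits = npv(annual_benefit, years, rate)
    return {"net_benefit": pv_benefits - cost,
            "bc_ratio": pv_benefits / cost}

in_home   = evaluate(cost=9_000,  annual_benefit=4_000)   # family-based program
placement = evaluate(cost=60_000, annual_benefit=5_000)   # out-of-home placement

print(in_home["bc_ratio"] > placement["bc_ratio"])  # True
```

    Presenting results as a net benefit and a benefit-cost ratio, as here, is what lets a non-economist reader interpret the output without training in economic evaluation.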

  17. Different Strokes for Different Folks: Visual Presentation Design between Disciplines

    PubMed Central

    Gomez, Steven R.; Jianu, Radu; Ziemkiewicz, Caroline; Guo, Hua; Laidlaw, David H.

    2015-01-01

    We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard “chalk talks”. We found design differences in slideshows using two methods – coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant’s own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information. PMID:26357149

  18. TRIZ Tool for Optimization of Airport Runway

    NASA Astrophysics Data System (ADS)

    Rao, K. Venkata; Selladurai, V.; Saravanan, R.

    The TRIZ tool is used for the conceptual design and layout of a novel ascending and descending runway model for the effective utilization of short-length airports. Handling bigger aircraft at smaller airports has become a necessity for economic reasons and for the benefit of airliners and the aspiring air travelers of the region. The authors’ proposed ascending and descending runway would meet the operational needs of wide-body aircraft such as the Boeing 747 and Airbus A380-800. Negotiating take-off and landing of bigger aircraft on runways shorter than 10,000 feet is treated as an optimization problem. The conceptual model and the theoretical design with its layout are dealt with in this paper as Part I. The computer-aided design and analysis using MATLAB with the Simulink toolbox, to confirm the adequacy of the runway length for bigger aircraft at smaller airports, is dealt with in subsequent papers.

  19. Different Strokes for Different Folks: Visual Presentation Design between Disciplines.

    PubMed

    Gomez, S R; Jianu, R; Ziemkiewicz, C; Guo, Hua; Laidlaw, D H

    2012-12-01

    We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard "chalk talks". We found design differences in slideshows using two methods - coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant's own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information.

  20. Status of Low Thrust Work at JSC

    NASA Technical Reports Server (NTRS)

    Condon, Gerald L.

    2004-01-01

    High-performance low-thrust (solar electric, nuclear electric, variable specific impulse magnetoplasma rocket) propulsion offers a significant benefit to NASA missions beyond low Earth orbit. As NASA (e.g., the Prometheus Project) endeavors to develop these propulsion systems and associated power supplies, it becomes necessary to develop a refined trajectory design capability that will allow engineers to develop future robotic and human mission designs that take advantage of this new technology. This ongoing work addresses the development of a trajectory design and optimization tool for assessing low-thrust (and other) trajectories. This work aims to advance the state of the art, enable future NASA missions, enable science drivers, and enhance education. This presentation provides a summary of the low-thrust-related JSC activities under the ISP program and, specifically, a look at a new release of a multi-gravity, multi-spacecraft trajectory optimization tool (Copernicus), along with analysis performed using this tool over the past year.

  1. Analysis of Decisions Made Using the Analytic Hierarchy Process

    DTIC Science & Technology

    2013-09-01

    country petroleum pipelines (Dey, 2003), deciding how best to manage U.S. watersheds (De Steiguer, Duberstein, and Lopes, 2003), and the U. S. Army...many benefits to its use. Primarily these fall under the heading of managing chaos. Specifically, the AHP is a tool that can be used to simplify and...originally. The commonly used scenario is this: the waiter asks if you want chicken or fish, and you reply fish. The waiter then remembers that steak is
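
    The AHP mechanics behind applications like these — deriving priority weights from a matrix of pairwise comparisons — can be sketched as follows. The comparison values are an invented three-criterion example, and the weights are computed with the common row geometric-mean approximation rather than the exact principal-eigenvector method.

```python
# Sketch of AHP priority weights from a pairwise-comparison matrix,
# using the row geometric-mean approximation (not the exact eigenvector).
# The comparison values are an invented three-criterion example.

import math

def ahp_weights(matrix):
    """Normalized priority weights via row geometric means."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# a[i][j] = how much more important criterion i is than criterion j,
# on Saaty's 1-9 scale; reciprocals fill the lower triangle.
pairwise = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]

w = ahp_weights(pairwise)
print([round(x, 3) for x in w])  # weights sum to 1, first criterion dominates
```

    Decomposing a decision into such comparisons and recombining the weights is the "managing chaos" simplification the report credits the AHP with.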

  2. Environmental and Economic Implications of Distributed Additive Manufacturing: The Case of Injection Mold Tooling: Environmental Implications of Additive Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Runze; Riddle, Matthew E.; Graziano, Diane

    Additive manufacturing (AM) holds great potential for enabling superior engineering functionality, streamlining supply chains, and reducing life cycle impacts compared to conventional manufacturing (CM). This study estimates the net changes in supply-chain lead time, life cycle primary energy consumption, greenhouse gas (GHG) emissions, and life cycle costs (LCC) associated with AM technologies for the case of injection molding, to shed light on the environmental and economic advantages of a shift from international or onshore CM to AM in the United States. A systems modeling framework is developed, integrating lead-time analysis, life cycle inventory analysis, an LCC model, and scenarios considering design differences, supply-chain options, production, maintenance, and AM technological developments. AM yields a reduction potential of 3% to 5% in primary energy, 4% to 7% in GHG emissions, 12% to 60% in lead time, and 15% to 35% in cost over 1 million cycles of injection molding production, depending on future AM technology advancement. The economic advantages indicate the significant role of AM technology in raising the global manufacturing competitiveness of local producers, while the relatively small environmental benefits highlight the need to consider trade-offs between environmental and economic performance when AM is adopted in the tooling industry. The results also help pinpoint the technological innovations in AM that could lead to broader benefits in the future.

  3. Environmental and Economic Implications of Distributed Additive Manufacturing: The Case of Injection Mold Tooling: Environmental Implications of Additive Manufacturing

    DOE PAGES

    Huang, Runze; Riddle, Matthew E.; Graziano, Diane; ...

    2017-08-26

    Additive manufacturing (AM) holds great potential for enabling superior engineering functionality, streamlining supply chains, and reducing life cycle impacts compared to conventional manufacturing (CM). This study estimates the net changes in supply-chain lead time, life cycle primary energy consumption, greenhouse gas (GHG) emissions, and life cycle costs (LCC) associated with AM technologies for the case of injection molding, to shed light on the environmental and economic advantages of a shift from international or onshore CM to AM in the United States. A systems modeling framework is developed, integrating lead-time analysis, life cycle inventory analysis, an LCC model, and scenarios considering design differences, supply-chain options, production, maintenance, and AM technological developments. AM yields a reduction potential of 3% to 5% in primary energy, 4% to 7% in GHG emissions, 12% to 60% in lead time, and 15% to 35% in cost over 1 million cycles of injection molding production, depending on future AM technology advancement. The economic advantages indicate the significant role of AM technology in raising the global manufacturing competitiveness of local producers, while the relatively small environmental benefits highlight the need to consider trade-offs between environmental and economic performance when AM is adopted in the tooling industry. The results also help pinpoint the technological innovations in AM that could lead to broader benefits in the future.

  4. Mobile instant messaging for rural community health workers: a case from Malawi.

    PubMed

    Pimmer, Christoph; Mhango, Susan; Mzumara, Alfred; Mbvundula, Francis

    2017-01-01

    Mobile instant messaging (MIM) tools, such as WhatsApp, have transformed global communication practice. In the field of global health, MIM is an increasingly used, but little understood, phenomenon. It remains unclear how MIM can be used by rural community health workers (CHWs) and their facilitators, and what the associated benefits and constraints are. To address this gap, WhatsApp groups were implemented and researched in a rural setting in Malawi. The multi-site case study research triangulated interviews and focus groups of CHWs and facilitators with the thematic qualitative analysis of the actual conversations on WhatsApp. A survey with open questions and the quantitative analysis of WhatsApp conversations were used as supplementary triangulation sources. The use of MIM was differentiated according to instrumental (e.g. mobilising health resources) and participatory purposes (e.g. the enactment of empathic ties). The identified benefits were centred on the enhanced ease and quality of communication of a geographically distributed health workforce, and the heightened connectedness of a professionally isolated health workforce. Alongside minor technical and connectivity issues, the main challenge for the CHWs was to negotiate divergent expectations regarding the social versus the instrumental use of the space. Despite some challenges and constraints, the implementation of WhatsApp was received positively by the CHWs and it was found to be a useful tool to support distributed rural health work.

  5. New Sepsis Definition (Sepsis-3) and Community-acquired Pneumonia Mortality. A Validation and Clinical Decision-Making Study.

    PubMed

    Ranzani, Otavio T; Prina, Elena; Menéndez, Rosario; Ceccato, Adrian; Cilloniz, Catia; Méndez, Raul; Gabarrus, Albert; Barbeta, Enric; Bassi, Gianluigi Li; Ferrer, Miquel; Torres, Antoni

    2017-11-15

    The Sepsis-3 Task Force updated the clinical criteria for sepsis, excluding the need for systemic inflammatory response syndrome (SIRS) criteria. The clinical implications of the proposed flowchart including the quick Sequential (Sepsis-related) Organ Failure Assessment (qSOFA) and SOFA scores are unknown. To perform a clinical decision-making analysis of Sepsis-3 in patients with community-acquired pneumonia. This was a cohort study including adult patients with community-acquired pneumonia from two Spanish university hospitals. SIRS, qSOFA, the Confusion, Respiratory Rate and Blood Pressure (CRB) score, modified SOFA (mSOFA), the Confusion, Urea, Respiratory Rate, Blood Pressure and Age (CURB-65) score, and Pneumonia Severity Index (PSI) were calculated with data from the emergency department. We used decision-curve analysis to evaluate the clinical usefulness of each score, and the primary outcome was in-hospital mortality. Of 6,874 patients, 442 (6.4%) died in-hospital. SIRS presented the worst discrimination, followed by qSOFA, CRB, mSOFA, CURB-65, and PSI. Overall, overestimation of in-hospital mortality and miscalibration were more evident for qSOFA and mSOFA. SIRS had lower net benefit than qSOFA and CRB, significantly increasing the risk of over-treatment and being comparable with the "treat-all" strategy. PSI had higher net benefit than mSOFA and CURB-65 for mortality, whereas mSOFA seemed more applicable when considering mortality/intensive care unit admission. The Sepsis-3 flowchart resulted in better identification of patients at high risk of mortality. qSOFA and CRB outperformed SIRS and presented better clinical usefulness as prompt tools for patients with community-acquired pneumonia in the emergency department. Among the tools for a comprehensive patient assessment, PSI had the best decision-aid tool profile.
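
    Decision-curve analysis, as used here, compares scores by their net benefit at a chosen risk threshold. A minimal sketch of the standard net-benefit formula follows; the counts are made up for illustration and are not the cohort's actual numbers.

```python
# Sketch of decision-curve net benefit at threshold probability pt:
#   NB = TP/N - (FP/N) * pt / (1 - pt)
# A score beats "treat all" when its NB at pt is higher.
# Counts below are made up, not the study cohort's actual numbers.

def net_benefit(tp, fp, n, pt):
    return tp / n - (fp / n) * pt / (1 - pt)

# Hypothetical triage rule flagging 300 of 1000 patients, 60 true positives.
nb_rule = net_benefit(tp=60, fp=240, n=1000, pt=0.10)
# "Treat all": every patient flagged; with 6.4% mortality, tp=64, fp=936.
nb_treat_all = net_benefit(tp=64, fp=936, n=1000, pt=0.10)

print(round(nb_rule, 4), round(nb_treat_all, 4))
```

    The pt/(1 - pt) weighting encodes how many false positives a clinician would accept per true positive at that threshold, which is what makes net benefit clinically interpretable.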

  6. reSpect: Software for Identification of High and Low Abundance Ion Species in Chimeric Tandem Mass Spectra

    PubMed Central

    Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R.; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W.; Moritz, Robert L.

    2016-01-01

    Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contribute to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), that enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search-engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the following iterations. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website. PMID:26419769
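
    The attenuation step described — scaling down fragment peaks explained by an identified peptide in proportion to the identification's confidence, before the spectrum is searched again — can be sketched as follows. The peak lists, matching tolerance, and linear scaling rule are simplified assumptions, not the actual reSpect/TPP implementation.

```python
# Sketch of confidence-weighted attenuation of explained fragment peaks.
# Peaks matched (within a tolerance) to the identified peptide's predicted
# fragment m/z values are scaled down by the identification probability,
# leaving the residual spectrum for further search rounds.
# Values and the linear rule are simplified assumptions, not TPP code.

def attenuate(spectrum, explained_mz, probability, tol=0.02):
    """spectrum: list of (mz, intensity); returns the altered spectrum."""
    out = []
    for mz, inten in spectrum:
        if any(abs(mz - e) <= tol for e in explained_mz):
            inten *= (1.0 - probability)   # high-confidence IDs remove more
        out.append((mz, inten))
    return out

spec = [(175.119, 900.0), (302.174, 400.0), (512.300, 350.0)]
altered = attenuate(spec, explained_mz=[175.12, 512.29], probability=0.95)
print(altered)  # matched peaks shrink to ~5% intensity; 302.174 untouched
```

    Because only intensities are altered, the residual spectrum can be handed to any search engine in the next iteration, which is what keeps the approach search-engine-agnostic.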

  7. reSpect: software for identification of high and low abundance ion species in chimeric tandem mass spectra.

    PubMed

    Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W; Moritz, Robert L

    2015-11-01

    Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; however, in reality mass spectrometers often isolate more than one peptide ion within the window of isolation that contribute to additional peptide fragment peaks in many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), which enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence searching or spectral library search are attenuated based on the confidence of the identification, and then the altered spectrum is subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search engine processing step where only fragment ion intensities are altered. This enables the application of any search engine combination in the iterations that follow. Thus, reSpect is compatible with all other protein sequence database search engines as well as peptide spectral library search engines that are supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification and lead to additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection only benefit from such processing at the level of a few percent. We demonstrate a technique that facilitates the determination of the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website.

  8. reSpect: Software for Identification of High and Low Abundance Ion Species in Chimeric Tandem Mass Spectra

    NASA Astrophysics Data System (ADS)

    Shteynberg, David; Mendoza, Luis; Hoopmann, Michael R.; Sun, Zhi; Schmidt, Frank; Deutsch, Eric W.; Moritz, Robert L.

    2015-11-01

    Most shotgun proteomics data analysis workflows are based on the assumption that each fragment ion spectrum is explained by a single species of peptide ion isolated by the mass spectrometer; in reality, however, mass spectrometers often isolate more than one peptide ion within the isolation window, and these co-isolated ions contribute additional fragment peaks to many spectra. We present a new tool called reSpect, implemented in the Trans-Proteomic Pipeline (TPP), which enables an iterative workflow whereby fragment ion peaks explained by a peptide ion identified in one round of sequence or spectral library searching are attenuated according to the confidence of the identification, and the altered spectrum is then subjected to further rounds of searching. The reSpect tool is not implemented as a search engine, but rather as a post-search processing step in which only fragment ion intensities are altered. This enables the application of any combination of search engines in the iterations that follow; reSpect is thus compatible with all protein sequence database search engines and peptide spectral library search engines supported by the TPP. We show that while some datasets are highly amenable to chimeric spectrum identification, yielding additional peptide identification boosts of over 30% with as many as four different peptide ions identified per spectrum, datasets with narrow precursor ion selection benefit from such processing only at the level of a few percent. We also demonstrate a technique that facilitates determining the degree to which a dataset would benefit from chimeric spectrum analysis. The reSpect tool is free and open source, provided within the TPP and available at the TPP website.
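    The iterative attenuation idea described above can be sketched in a few lines. This is an illustrative toy, not the TPP implementation: the fixed m/z tolerance, the peak list, and the confidence value are all invented for the example.

```python
# Toy sketch of confidence-weighted peak attenuation (not the reSpect code).
TOL = 0.5  # fragment-match tolerance in m/z units (assumed value)

def attenuate(spectrum, matched_mz, probability, tol=TOL):
    """Scale down peaks explained by an identified peptide.

    spectrum    -- list of (mz, intensity) pairs
    matched_mz  -- fragment m/z values explained by the identification
    probability -- confidence of the identification, in [0, 1]
    """
    residual = []
    for mz, inten in spectrum:
        if any(abs(mz - m) <= tol for m in matched_mz):
            inten *= 1.0 - probability  # high confidence -> strong attenuation
        residual.append((mz, inten))
    return residual

peaks = [(200.1, 1000.0), (300.2, 500.0), (400.3, 800.0)]
residual = attenuate(peaks, [200.1, 400.3], 0.9)
print([(mz, round(i, 1)) for mz, i in residual])
# -> [(200.1, 100.0), (300.2, 500.0), (400.3, 80.0)]
```

    The residual spectrum would then be handed to the next round of database or spectral library searching, and the loop repeated until no further confident identifications emerge.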

  9. A practical approach to communicating benefit-risk decisions of medicines to stakeholders.

    PubMed

    Leong, James; Walker, Stuart; Salek, Sam

    2015-01-01

    The importance of a framework for systematic, structured assessment of benefits and risks has been established; in addition, benefit-risk decisions and the processes used to derive them must be documented and communicated to various stakeholders for accountability. There is therefore a need for tools that enhance communication between regulators and other stakeholders in a manner that upholds transparency, consistency and standards. A retrospective, non-comparative study was conducted to determine the applicability and practicality of a summary template for documenting benefit-risk assessment and communicating the benefit-risk balance and conclusions of reviewers to other stakeholders. The benefit-risk (BR) Summary Template and its User Manual were evaluated by 12 reviewers within a regulatory agency in Singapore, the Health Sciences Authority (HSA). The BR Summary Template was found to be adequate for documenting benefits, risks, relevant summaries and conclusions, while the User Manual was useful in guiding reviewers through the template. The BR Summary Template was also considered a useful tool for communicating benefit-risk decisions to a variety of stakeholders. The use of a template may thus be of value for communicating the benefit-risk assessment of medicines to stakeholders.

  10. Digital telephony analysis model and issues

    NASA Astrophysics Data System (ADS)

    Keuthan, Lynn M.

    1995-09-01

    Experts in the fields of digital telephony and communications security have stated the need for an analytical tool for evaluating complex issues. Some important policy issues discussed by experts recently include implementing digital wire-taps, implementation of the 'Clipper Chip', required registration of encryption/decryption keys, and export control of cryptographic equipment. Associated with the implementation of these policies are direct costs resulting from implementation, indirect cost benefits from implementation, and indirect costs resulting from the risks of implementation or factors reducing cost benefits. Presented herein is a model for analyzing digital telephony policies and systems and their associated direct costs and indirect benefit and risk factors. In order to present the structure of the model, issues of national importance and business-related issues are discussed. The various factors impacting the implementation of the associated communications systems and communications security are summarized, and various implementation tradeoffs are compared based on economic benefits/impact. The importance of the issues addressed herein, as well as other digital telephony issues, has greatly increased with the enormous increases in communication system connectivity due to the advance of the National Information Infrastructure.

  11. Opportunities and pitfalls in clinical proof-of-concept: principles and examples.

    PubMed

    Chen, Chao

    2018-04-01

    Clinical proof-of-concept trials crucially inform major resource deployment decisions. This paper discusses several mechanisms for enhancing their rigour and efficiency. The importance of careful consideration when using a surrogate endpoint is illustrated; situational effectiveness of run-in patient enrichment is explored; a versatile tool is introduced to ensure a strong pharmacological underpinning; the benefits of dose-titration are revealed by simulation; and the importance of adequately scheduled observations is shown. The general process of model-based trial design and analysis is described and several examples demonstrate the value in historical data, simulation-guided design, model-based analysis and trial adaptation informed by interim analysis. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Completely automated modal analysis procedure based on the combination of different OMA methods

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and all its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved becoming more robust and giving as results only the real natural frequencies, damping ratios and mode shapes of the system. The effect of the temperature can be taken into account as well, leading to the creation of a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building present in the laboratories of Politecnico di Milano.

  13. Protein mass spectra data analysis for clinical biomarker discovery: a global review.

    PubMed

    Roy, Pascal; Truntzer, Caroline; Maucort-Boulch, Delphine; Jouve, Thomas; Molinari, Nicolas

    2011-03-01

    The identification of new diagnostic or prognostic biomarkers is one of the main aims of clinical cancer research. In recent years there has been a growing interest in using high throughput technologies for the detection of such biomarkers. In particular, mass spectrometry appears as an exciting tool with great potential. However, to extract any benefit from the massive potential of clinical proteomic studies, appropriate methods, improvement and validation are required. To better understand the key statistical points involved with such studies, this review presents the main data analysis steps of protein mass spectra data analysis, from the pre-processing of the data to the identification and validation of biomarkers.
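    Two of the pre-processing steps commonly applied before biomarker identification, baseline subtraction and total-ion-current normalization, can be illustrated on a toy spectrum. This is a deliberately minimal sketch; the pipelines reviewed in such studies add smoothing, peak detection and alignment, and the intensity values here are invented.

```python
# Minimal pre-processing sketch: baseline subtraction followed by
# total-ion-current (TIC) normalization of a toy intensity vector.
def preprocess(intensities):
    baseline = min(intensities)                 # crude constant baseline
    corrected = [i - baseline for i in intensities]
    tic = sum(corrected)                        # total ion current
    return [i / tic for i in corrected]         # intensities now sum to 1

spec = [12.0, 52.0, 32.0, 12.0, 92.0]
print([round(v, 3) for v in preprocess(spec)])
# -> [0.0, 0.286, 0.143, 0.0, 0.571]
```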

  14. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  15. Window Selection Tool | Efficient Windows Collaborative

    Science.gov Websites


  16. Value for money - recasting the problem in terms of dynamic access prioritisation.

    PubMed

    Taylor, William J; Laking, George

    2010-01-01

    To develop an approach for achieving value for money in rehabilitation based on dynamic prioritisation of access to services according to individual capacity to benefit. A critical review of economic evaluation, and adaptation to a rehabilitation context of a prioritisation method used in determining access to elective surgical services in New Zealand. The cost-effectiveness frontier is not straight but curved, suggesting that some people benefit more from a given intervention than others. An approach that identifies those most likely to benefit from inpatient rehabilitation following stroke (as an example) and enables access in order of capacity to benefit is presented in the context of a quality improvement programme. The approach is operationalised as a prioritisation tool that is dynamic in the sense that it can be reapplied subject to changes in the patient's clinical status. The steps proposed to develop such a tool include qualitative research with expert clinicians, pair-wise comparison of alternative scenarios (1000Minds survey), construction of an economic model of the tool's operation, and an observational cohort study to help populate the model and calibrate the tool. A dynamic prioritisation approach to guide access to scarce health-care resources (such as inpatient rehabilitation following stroke) offers a transparent and equitable way of achieving value for money in the delivery of rehabilitation services.

  17. Study on the separation effect of high-speed ultrasonic vibration cutting.

    PubMed

    Zhang, Xiangyu; Sui, He; Zhang, Deyuan; Jiang, Xinggang

    2018-07-01

    High-speed ultrasonic vibration cutting (HUVC) has proven significantly effective for turning Ti-6Al-4V alloy in recent studies. Besides breaking through the cutting speed restriction of the conventional ultrasonic vibration cutting (UVC) method, HUVC achieves reduced cutting force and improved surface quality and cutting efficiency in the high-speed machining field. These benefits all result from the separation effect that occurs during the HUVC process. Although the influences of vibration and cutting parameters have been discussed in previous studies, the separation behaviour of HUVC should be analyzed in detail for real cutting situations, with tool geometry parameters also taken into account. In this paper, three situations are investigated in detail: (1) cutting without negative transient clearance angle and without tool wear, (2) cutting with negative transient clearance angle and without tool wear, and (3) cutting with tool wear. Complete separation, partial separation and continuous cutting states are then deduced from real cutting processes. Analysis of these situations demonstrates that tool-workpiece separation takes place only if appropriate cutting parameters, vibration parameters, and tool geometry parameters are set. The best separation effect was obtained with a low feedrate and a phase shift approaching 180 degrees. Moreover, flank face interference, resulting from the negative transient clearance angle and tool wear, contributes to an improved separation effect that allows the workpiece and tool to separate even at zero phase shift. Finally, axial and radial transient cutting forces are obtained for the first time to verify the separation effect of HUVC, and the cutting chips are collected to assess the influence of flank face interference. Copyright © 2018 Elsevier B.V. All rights reserved.
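    A commonly used back-of-the-envelope condition in the UVC/HUVC literature is that tool-workpiece separation can only occur when the cutting speed stays below the peak vibration speed, v_c < 2πfA. The check below is only a sketch of that criterion with invented parameter values; it does not reproduce the paper's fuller analysis, which also accounts for phase shift, feedrate and tool geometry.

```python
import math

def can_separate(v_cut, freq, amplitude):
    """True if the peak vibration speed 2*pi*f*A exceeds the cutting speed."""
    return v_cut < 2 * math.pi * freq * amplitude

f = 20_000.0   # vibration frequency, Hz (assumed)
A = 10e-6      # vibration amplitude, m (assumed)
print(f"critical speed: {2 * math.pi * f * A:.2f} m/s")  # ~1.26 m/s
print(can_separate(1.0, f, A))  # below critical speed -> True
print(can_separate(2.0, f, A))  # above critical speed -> False
```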

  18. FISH Finder: a high-throughput tool for analyzing FISH images

    PubMed Central

    Shirley, James W.; Ty, Sereyvathana; Takebayashi, Shin-ichiro; Liu, Xiuwen; Gilbert, David M.

    2011-01-01

    Motivation: Fluorescence in situ hybridization (FISH) is used to study the organization and positioning of specific DNA sequences within the cell nucleus. Analyzing the data from FISH images is a tedious process that involves an element of subjectivity. Automated FISH image analysis offers savings in time as well as the benefit of objective data analysis. While several FISH image analysis software tools have been developed, they often use a threshold-based segmentation algorithm for nucleus segmentation. As fluorescence signal intensities can vary significantly from experiment to experiment, from cell to cell, and within a cell, threshold-based segmentation is inflexible and often insufficient for automatic image analysis, leading to additional manual segmentation and potential subjective bias. To overcome these problems, we developed a graphical software tool called FISH Finder to automatically analyze FISH images that vary significantly. By posing nucleus segmentation as a classification problem, a compound Bayesian classifier is employed so that contextual information is utilized, resulting in reliable classification and boundary extraction. This makes it possible to analyze FISH images efficiently and objectively without adjustment of input parameters. Additionally, FISH Finder was designed to analyze the distances between differentially stained FISH probes. Availability: FISH Finder is a standalone MATLAB application and platform-independent software. The program is freely available from: http://code.google.com/p/fishfinder/downloads/list Contact: gilbert@bio.fsu.edu PMID:21310746

  19. A descriptive framework for country-level analysis of health care financing arrangements.

    PubMed

    Kutzin, J

    2001-06-01

    Health financing policies are marked by confusion between policy tools and policy objectives, especially in low and middle income countries. This paper attempts to address this problem by providing a conceptual framework that is driven by the normative objective of enhancing the 'insurance function' (access to needed care without financial impoverishment) of health care systems. The framework is proposed as a tool for descriptive analysis of the key functions, policies, and interactions within an existing health care system, and equally as a tool to assist the identification and preliminary assessment of policy options. The aim is to help to clarify the policy levers that are available to enhance the insurance function for the population as efficiently as possible, given the 'starting point' of a country's existing institutional and organizational arrangements. Analysis of health care financing systems using this framework highlights the interactions of various policies and the need for a coherent package of coordinated reforms, rather than a focus on particular organizational forms of 'health insurance'. The content of each main health care system function (revenue collection, pooling of funds, purchasing of services, provision of services) and the market structure with which the implementation of each is organized are found to be particularly important, as are policies with respect to the benefit package and user fees.

  20. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785

  1. Sports teams as superorganisms: implications of sociobiological models of behaviour for research and practice in team sports performance analysis.

    PubMed

    Duarte, Ricardo; Araújo, Duarte; Correia, Vanda; Davids, Keith

    2012-08-01

    Significant criticisms have emerged on the way that collective behaviours in team sports have been traditionally evaluated. A major recommendation has been for future research and practice to focus on the interpersonal relationships developed between team players during performance. Most research has typically investigated team game performance in subunits (attack or defence), rather than considering the interactions of performers within the whole team. In this paper, we offer the view that team performance analysis could benefit from the adoption of biological models used to explain how repeated interactions between grouping individuals scale to emergent social collective behaviours. We highlight the advantages of conceptualizing sports teams as functional integrated 'super-organisms' and discuss innovative measurement tools, which might be used to capture the superorganismic properties of sports teams. These tools are suitable for revealing the idiosyncratic collective behaviours underlying the cooperative and competitive tendencies of different sports teams, particularly their coordination of labour and the most frequent channels of communication and patterns of interaction between team players. The principles and tools presented here can serve as the basis for novel approaches and applications of performance analysis devoted to understanding sports teams as cohesive, functioning, high-order organisms exhibiting their own peculiar behavioural patterns.

  2. Spark and HPC for High Energy Physics Data Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc

    A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming, so intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to change the traditional ways of doing data analyses. Use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.

  3. SCD-HeFT: Use of RR Interval Statistics for Long-term Risk Stratification for Arrhythmic Sudden Cardiac Death

    PubMed Central

    Au-yeung, Wan-tai M.; Reinhall, Per; Poole, Jeanne E.; Anderson, Jill; Johnson, George; Fletcher, Ross D.; Moore, Hans J.; Mark, Daniel B.; Lee, Kerry L.; Bardy, Gust H.

    2015-01-01

    Background In SCD-HeFT, a significant fraction of the congestive heart failure (CHF) patients ultimately did not die suddenly from arrhythmic causes. CHF patients would benefit from better tools to identify whether ICD therapy is needed. Objective To identify predictor variables from baseline SCD-HeFT patients' RR intervals that correlate with arrhythmic sudden cardiac death (SCD) and mortality, and to design an ICD therapy screening test. Methods Ten predictor variables were extracted from pre-randomization Holter data from 475 patients enrolled in the SCD-HeFT ICD arm using novel and traditional heart rate variability methods. All variables were correlated with SCD using the Mann-Whitney-Wilcoxon test and receiver operating characteristic analysis. ICD therapy screening tests were designed by minimizing the cost of false classifications. Survival analysis, including the log-rank test and Cox models, was also performed. Results α1 and α2 from detrended fluctuation analysis, the ratio of low- to high-frequency power, the number of PVCs per hour and heart rate turbulence slope are all statistically significant for predicting the occurrence of SCD (p<0.001) and survival (log-rank p<0.01). The most powerful multivariate predictor using Cox proportional hazards was α2, with a hazard ratio of 0.0465 (95% CI: 0.00528-0.409, p<0.01). Conclusion Predictor variables from RR intervals correlate with the occurrence of SCD and distinguish survival among SCD-HeFT ICD patients. We believe SCD prediction models should incorporate Holter-based RR interval analysis to refine ICD patient selection, especially in removing patients who are unlikely to benefit from ICD therapy. PMID:26096609
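    As background to the Holter-derived predictors above, two standard time-domain heart rate variability metrics computed from an RR-interval series are sketched below. These are illustrative only; the study's significant predictors (detrended fluctuation analysis exponents, LF/HF power ratio, heart rate turbulence slope) are more elaborate, but consume the same kind of input.

```python
import math
from statistics import pstdev

def sdnn(rr):
    """SDNN: standard deviation of all RR intervals (ms)."""
    return pstdev(rr)

def rmssd(rr):
    """RMSSD: root mean square of successive RR differences (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795]                  # toy RR series in ms
print(round(sdnn(rr), 2), round(rmssd(rr), 2))  # -> 7.07 14.36
```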

  4. Engine Icing Data - An Analytics Approach

    NASA Technical Reports Server (NTRS)

    Fitzgerald, Brooke A.; Flegel, Ashlie B.

    2017-01-01

    Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible, and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient: the same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python and hosted in a Jupyter notebook that vastly simplifies the analysis process. Given the file path of any folder containing time series datasets, the tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with the click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across datasets and integrating video data (both extremely difficult with spreadsheet-style programs) is greatly simplified by this tool. The tool has also gathered interest outside the engine icing branch and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
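    The batch-loading step described above can be sketched with the standard library alone. This is not NASA's actual tool; the file names and columns are invented, and the real tool adds parallel processing and interactive plotting widgets.

```python
import csv
import tempfile
from pathlib import Path

def load_folder(folder):
    """Load every CSV in `folder` into a dict keyed by file name."""
    data = {}
    for path in sorted(Path(folder).glob("*.csv")):
        with path.open(newline="") as fh:
            data[path.name] = list(csv.DictReader(fh))
    return data

# Demo with two temporary time-series datasets.
tmp = Path(tempfile.mkdtemp())
(tmp / "run1.csv").write_text("t,temp\n0,15.0\n1,14.2\n")
(tmp / "run2.csv").write_text("t,temp\n0,16.1\n1,15.8\n")

runs = load_folder(tmp)
print(sorted(runs))         # -> ['run1.csv', 'run2.csv']
print(runs["run1.csv"][0])  # first row of the first dataset
```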

  5. Validation of the World Health Organization tool for situational analysis to assess emergency and essential surgical care at district hospitals in Ghana.

    PubMed

    Osen, Hayley; Chang, David; Choo, Shelly; Perry, Henry; Hesse, Afua; Abantanga, Francis; McCord, Colin; Chrouser, Kristin; Abdullah, Fizan

    2011-03-01

    The World Health Organization (WHO) Tool for Situational Analysis to Assess Emergency and Essential Surgical Care (hereafter called the WHO Tool) has been used in more than 25 countries and represents the largest effort to assess surgical care in the world. However, it has not yet been independently validated. Test-retest reliability is one way to validate the degree to which test instruments are free from random error. The aim of the present field study was to determine the test-retest reliability of the WHO Tool. The WHO Tool was mailed to 10 district hospitals in Ghana. Written instructions were provided along with a letter from the Ghana Health Services requesting the hospital administrator to complete the survey tool. After ensuring delivery and completion of the forms, the study team readministered the WHO Tool during an on-site visit less than 1 month later. The results of the two tests were compared to calculate kappa statistics for each of the 152 questions in the WHO Tool. The kappa statistic is a statistical measure of the degree of agreement above what would be expected by chance alone. Ten hospitals were thus surveyed twice over a short interval (i.e., less than 1 month), and weighted and unweighted kappa statistics were calculated for 152 questions. The median unweighted kappa for the entire survey was 0.43 (interquartile range 0-0.84). The infrastructure section (24 questions) had a median kappa of 0.81; the human resources section (13 questions) had a median kappa of 0.77; the surgical procedures section (67 questions) had a median kappa of 0.00; and the emergency surgical equipment section (48 questions) had a median kappa of 0.81. Hospital capacity survey questions related to infrastructure characteristics had high reliability, whereas questions related to process of care had poor reliability and may benefit from supplemental data gathered by direct observation. Limitations of the study include the small sample size: 10 district hospitals in a single country. The consistent and high correlations calculated from the field testing within the present analysis suggest that the WHO Tool for Situational Analysis is reliable where it measures structure and setting, but it should be revised for measuring process of care.
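    For a single yes/no survey question answered in both administrations, the unweighted kappa reported above can be computed as follows. The paired ratings here are invented, not data from the Ghana study.

```python
def cohens_kappa(a, b):
    """Unweighted Cohen's kappa: agreement above chance for paired ratings."""
    n = len(a)
    categories = sorted(set(a) | set(b))
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n)        # chance agreement
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

first  = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
second = ["yes", "yes", "no", "yes", "yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(first, second), 2))  # -> 0.58
```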

  6. The role of promotion in alcoholism treatment marketing.

    PubMed

    Jones, M A; Self, D R; Owens, C A; Kline, T A

    1988-01-01

    This article is an overview of the promotion function as a part of the ATM's marketing mix. It approaches various promotion decision areas from a managerial perspective, focusing upon some key components of promotion planning. Rather than provide specific operational or implementation details (how to write a brochure) it is more conceptual in nature and offers a framework for promotion planners. The article addresses promotion management, promotion objectives, analysis for promotion planning, the promotion mix, and addresses the benefits and limitations of some specific promotion tools available to the ATM manager. It treats ATMs as a service and reveals specific implications for promotion strategy dictated by services. The article also reports promotion tools employed by Alabama ATMs citing data from the Alabama study.

  7. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate how well existing GHG quantification tools capture GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After a cluster analysis to identify different farm typologies, GHG quantification was performed using the VCS SALM methodology complemented with IPCC livestock emission factors, and using the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation in the magnitude of the estimated GHG emissions per ha both between different smallholder farm typologies and between the two accounting tools applied. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals: the mitigation benefits range between 4 and 6.5 tCO2 ha-1 yr-1, with significantly different mitigation benefits depending on the typologies of the crop-livestock systems, their agricultural practices, and adoption rates of improved practices. However, the inherent uncertainty in the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty in activity data, the assessment confirms the high variability both within farm types and between the parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.
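    The core accounting pattern behind such tools is activity data multiplied by an emission factor. The sketch below illustrates that pattern for enteric methane from a small herd; the emission factor and herd size are invented for the example (the CH4 global warming potential of 25 is the IPCC AR4 100-year value).

```python
EF_ENTERIC_CH4 = 41.0  # kg CH4 per head per year (assumed, not an IPCC value)
GWP_CH4 = 25.0         # 100-year GWP of CH4 (IPCC AR4)

def enteric_emissions_tco2e(n_cattle, ef=EF_ENTERIC_CH4, gwp=GWP_CH4):
    """Annual enteric CH4 emissions of a herd, in tonnes CO2e."""
    return n_cattle * ef * gwp / 1000.0  # kg -> tonnes

print(enteric_emissions_tco2e(3))  # -> 3.075 tCO2e for a 3-head farm
```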

  8. EXO-DAT: AN INFORMATION SYSTEM IN SUPPORT OF THE CoRoT/EXOPLANET SCIENCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deleuil, M.; Meunier, J. C.; Moutou, C.

    2009-08-15

    Exo-Dat is a database and an information system created primarily in support of the exoplanet program of the COnvection ROtation and planetary Transits (CoRoT) mission. In the directions of CoRoT pointings, it provides a unified interface to several sets of data: published stellar catalogs, photometric and spectroscopic data obtained during the mission preparation, results from the mission and from follow-up observations, and several mission-specific technical parameters. The new photometric data constitute the subcatalog Exo-Cat and give consistent 4-color photometry of 14.0 million stars, complete to 19th magnitude in the r-filter. It covers several zones of the galactic plane around CoRoT pointings, with a total area of 209 square degrees. The Exo-Dat information system provides essential technical support to the ongoing CoRoT light-curve analyses and ground-based follow-up by supplying complementary information such as prior knowledge of a star's fundamental parameters or its contamination level inside the large CoRoT photometric mask. The database is fully interfaced with VO tools and thus benefits from existing visualization and analysis tools like TOPCAT or ALADIN. It is accessible to the CoRoT community through the Web, and will be gradually opened to the public. It is the ideal tool with which to prepare the foreseen statistical studies of the properties of exoplanetary systems. As a VO-compliant system, such analyses can benefit from the most up-to-date classifier tools.

  9. Evaluating opportunities to improve material and energy impacts in commodity supply chains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanes, Rebecca J.; Carpenter, Alberta

    When evaluated at the scale of individual processes, next-generation technologies may be more energy and emissions intensive than current technology. Furthermore, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Materials Flow through Industry (MFI) supply chain modeling tool. The MFI tool is a cradle-to-gate linear network model of the US industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology and increases in industrial energy efficiency. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing three lightweight vehicle supply chains to the supply chain of a conventional, standard weight vehicle. Several of the lightweight vehicle supply chains are evaluated under manufacturing scenarios that include next-generation production technologies and next-generation materials. Results indicate that producing lightweight vehicles is more energy and emission intensive than producing the non-lightweight vehicle, but the fuel saved during vehicle use offsets this increase. In this case study, greater reductions in supply chain energy and emissions were achieved through the application of the next-generation technologies than from application of energy efficiency increases.
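
    A "cradle-to-gate linear network model" of the kind the abstract describes can be illustrated with a Leontief-style total-requirements calculation: solve (I - A)x = d for the gross output x needed to satisfy a final demand d, then weight x by direct energy intensities. The three-sector coefficients below are invented for illustration and are not MFI tool data:

```python
import numpy as np

# Toy 3-sector economy (steel, aluminum, assembly); coefficients are illustrative.
# A[i, j] = units of sector i needed per unit of sector j's output.
A = np.array([
    [0.05, 0.02, 0.30],   # steel inputs
    [0.01, 0.10, 0.15],   # aluminum inputs
    [0.00, 0.00, 0.02],   # assembly inputs
])
e = np.array([20.0, 60.0, 5.0])   # direct energy use per unit output (MJ)
d = np.array([0.0, 0.0, 1.0])     # final demand: one unit from assembly

# Leontief total requirements: x = (I - A)^-1 d captures all upstream output,
# so e @ x is cradle-to-gate (direct + indirect) supply chain energy.
x = np.linalg.solve(np.eye(3) - A, d)
supply_chain_energy = e @ x
print(round(float(supply_chain_energy), 1))
```

    Changing a coefficient in A (a new production technology) or an entry in e (an efficiency gain) and re-solving is exactly the kind of scenario comparison such a tool performs.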

  10. Metabonomics: its potential as a tool in toxicology for safety assessment and data integration.

    PubMed

    Griffin, J L; Bollard, M E

    2004-10-01

    The functional genomic techniques of transcriptomics and proteomics promise unparalleled global information during the drug development process. However, if these technologies are used in isolation the large multivariate data sets produced are often difficult to interpret, and have the potential of missing key metabolic events (e.g. as a result of experimental noise in the system). To better understand the significance of these megavariate data the temporal changes in phenotype must be described. High resolution 1H NMR spectroscopy used in conjunction with pattern recognition provides one such tool for defining the dynamic phenotype of a cell, organ or organism in terms of a metabolic phenotype. In this review the benefits of this metabonomics/metabolomics approach to problems in toxicology will be discussed. One of the major benefits of this approach is its high throughput nature and cost effectiveness on a per sample basis. Using such a method the consortium for metabonomic toxicology (COMET) are currently investigating approximately 150 model liver and kidney toxins. This investigation will allow the generation of expert systems where liver and kidney toxicity can be predicted for model drug compounds, providing a new research tool in the field of drug metabolism. The review will also include how metabonomics may be used to investigate co-responses with transcripts and proteins involved in metabolism and stress responses, such as during drug induced fatty liver disease. By using data integration to combine metabolite analysis and gene expression profiling key perturbed metabolic pathways can be identified and used as a tool to investigate drug function.
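
    The "pattern recognition" coupled to 1H NMR data in metabonomics is very commonly principal component analysis of binned spectra. The sketch below, on synthetic spectra with one perturbed resonance, shows how scores on the first component separate dosed from control samples; all numbers are invented for illustration:

```python
import numpy as np

# Synthetic stand-in for binned 1H NMR spectra (rows = samples, columns =
# chemical-shift bins); real metabonomic data would come from the spectrometer.
rng = np.random.default_rng(42)
controls = rng.normal(1.0, 0.05, (20, 50))
dosed = rng.normal(1.0, 0.05, (20, 50))
dosed[:, 10] += 0.8   # simulated toxin effect on one metabolite resonance

spectra = np.vstack([controls, dosed])
centered = spectra - spectra.mean(axis=0)

# PCA via SVD: scores on the first principal component capture the dominant
# source of variance, here the dose-related metabolic change.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U[:, 0] * s[0]
print(round(float(abs(scores[:20].mean() - scores[20:].mean())), 2))
```

    In a real study the scores plot is what lets a toxin's metabolic signature be compared against a database of reference toxins.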

  11. Evaluating opportunities to improve material and energy impacts in commodity supply chains

    DOE PAGES

    Hanes, Rebecca J.; Carpenter, Alberta

    2017-01-10

    When evaluated at the scale of individual processes, next-generation technologies may be more energy and emissions intensive than current technology. Furthermore, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Materials Flow through Industry (MFI) supply chain modeling tool. The MFI tool is a cradle-to-gate linear network model of the US industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology and increases in industrial energy efficiency. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing three lightweight vehicle supply chains to the supply chain of a conventional, standard weight vehicle. Several of the lightweight vehicle supply chains are evaluated under manufacturing scenarios that include next-generation production technologies and next-generation materials. Results indicate that producing lightweight vehicles is more energy and emission intensive than producing the non-lightweight vehicle, but the fuel saved during vehicle use offsets this increase. In this case study, greater reductions in supply chain energy and emissions were achieved through the application of the next-generation technologies than from application of energy efficiency increases.

  12. Development of a metrics dashboard for monitoring involvement in the 340B Drug Pricing Program.

    PubMed

    Karralli, Rusol; Tipton, Joyce; Dumitru, Doina; Scholz, Lisa; Masilamani, Santhi

    2015-09-01

    An electronic tool to support hospital organizations in monitoring and addressing financial and compliance challenges related to participation in the 340B Drug Pricing Program is described. In recent years there has been heightened congressional and regulatory scrutiny of the federal 340B program, which provides discounted drug prices on Medicaid-covered drugs to safety net hospitals and other 340B-eligible healthcare organizations, or "covered entities." Historically, the 340B program has lacked a metrics-driven reporting framework to help covered entities capture the value of 340B program involvement, community benefits provided to underserved populations, and costs associated with compliance with 340B eligibility requirements. As part of an initiative by a large health system to optimize its 340B program utilization and regulatory compliance efforts, a team of pharmacists led the development of an electronic dashboard tool to help monitor 340B program activities at the system's 340B-eligible facilities. After soliciting input from an array of internal and external 340B program stakeholders, the team designed the dashboard and associated data-entry tools to facilitate the capture and analysis of 340B program-related data in four domains: cost savings and revenue, program maintenance costs, community benefits, and compliance. A large health system enhanced its ability to evaluate and monitor 340B program-related activities through the use of a dashboard tool capturing key metrics on cost savings achieved, maintenance costs, and other aspects of program involvement. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  13. Study on Ultra-deep Azimuthal Electromagnetic Resistivity LWD Tool by Influence Quantification on Azimuthal Depth of Investigation and Real Signal

    NASA Astrophysics Data System (ADS)

    Li, Kesai; Gao, Jie; Ju, Xiaodong; Zhu, Jun; Xiong, Yanchun; Liu, Shuai

    2018-05-01

    This paper proposes a new tool design for ultra-deep azimuthal electromagnetic (EM) resistivity logging while drilling (LWD) for deeper geosteering and formation evaluation, which can benefit hydrocarbon exploration and development. First, a forward numerical simulation of azimuthal EM resistivity LWD is created based on the fast Hankel transform (FHT) method, and its accuracy is confirmed under classic formation conditions. Then, a reasonable range of tool parameters is designed by analyzing the logging response. However, current technological limitations pose challenges to selecting appropriate tool parameters for ultra-deep azimuthal detection under detectable signal conditions. Therefore, this paper uses grey relational analysis (GRA) to quantify the influence of the tool parameters on voltage and azimuthal investigation depth. After analyzing thousands of simulation data points under different environmental conditions, a random forest is used to fit the data and identify an optimal combination of tool parameters, owing to its high efficiency and accuracy. Finally, the structure of the ultra-deep azimuthal EM resistivity LWD tool is designed with a theoretical azimuthal investigation depth of 27.42-29.89 m in different classic isotropic and anisotropic formations. This design serves as a reliable theoretical foundation for efficient geosteering and formation evaluation in high-angle and horizontal (HA/HZ) wells in the future.
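
    Grey relational analysis, which the paper uses to rank the influence of tool parameters, reduces to a short formula: point-wise deviations from a reference sequence are mapped to coefficients in (0, 1] and averaged into a grade. A minimal sketch with made-up response data (not the paper's simulations):

```python
import numpy as np

def grey_relational_grade(reference, candidates, rho=0.5):
    """Grey relational grade of each candidate series against a reference.

    rho is the distinguishing coefficient; 0.5 is the conventional choice."""
    delta = np.abs(candidates - reference)            # point-wise deviations
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)   # grey relational coefficients
    return xi.mean(axis=1)                            # grade = mean coefficient

# Illustrative data: how strongly two tool-parameter series track a normalized
# voltage response across five design points (values are made up).
voltage = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
params = np.array([
    [0.25, 0.45, 0.55, 0.85, 0.95],   # tracks the voltage closely
    [0.90, 0.10, 0.70, 0.30, 0.50],   # weak relation
])
grades = grey_relational_grade(voltage, params)
print(grades.round(3))   # higher grade = stronger influence ranking
```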

  14. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5 ). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
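
    The sensitivity of mortality estimates to the risk coefficient, which the article illustrates quantitatively, follows directly from the log-linear health impact function commonly used in PM2.5 benefits analyses. A sketch with illustrative inputs (these are examples, not BenMAP defaults):

```python
import math

def excess_deaths(beta, delta_pm, baseline_rate, population):
    """Log-linear health impact function of the form used in PM2.5 benefits
    analyses: delta_y = y0 * (1 - exp(-beta * delta_C)) * pop."""
    return baseline_rate * (1 - math.exp(-beta * delta_pm)) * population

# beta is the log-linear risk coefficient from an epidemiological study
# (per ug/m3); spanning a range of betas exposes the epistemic uncertainty.
pop, y0, d_pm = 1_000_000, 0.008, 2.0
for beta in (0.004, 0.006, 0.012):
    print(beta, round(excess_deaths(beta, d_pm, y0, pop)))
```

    Because the exponent is small, the estimate scales almost linearly with beta, so a threefold spread in published coefficients produces roughly a threefold spread in estimated deaths.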

  15. Data Access Services that Make Remote Sensing Data Easier to Use

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2010-01-01

    This slide presentation reviews some of the processes that NASA uses to make the remote sensing data easy to use over the World Wide Web. This work involves much research into data formats, geolocation structures and quality indicators, often to be followed by coding a preprocessing program. Only then are the data usable within the analysis tool of choice. The Goddard Earth Sciences Data and Information Services Center is deploying a variety of data access services that are designed to dramatically shorten the time consumed in the data preparation step. On-the-fly conversion to the standard network Common Data Form (netCDF) format with Climate-Forecast (CF) conventions imposes a standard coordinate system framework that makes data instantly readable through several tools, such as the Integrated Data Viewer, Gridded Analysis and Display System, Panoply and Ferret. A similar benefit is achieved by serving data through the Open Source Project for a Network Data Access Protocol (OPeNDAP), which also provides subsetting. The Data Quality Screening Service goes a step further in filtering out data points based on quality control flags, based on science team recommendations or user-specified criteria. Further still is the Giovanni online analysis system which goes beyond handling formatting and quality to provide visualization and basic statistics of the data. This general approach of automating the preparation steps has the important added benefit of enabling use of the data by non-human users (i.e., computer programs), which often make sub-optimal use of the available data due to the need to hard-code data preparation on the client side.
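
    The quality-screening step described above can be pictured as a simple flag-based mask applied before any statistic is computed; the values and flag convention below are mock data, not an actual product's:

```python
import numpy as np

# Mock retrieval: values with a per-pixel quality flag (0 = best, 3 = worst),
# mimicking the kind of screening a quality screening service applies.
values = np.array([0.12, 0.55, 9.99, 0.31, 0.48, 7.5])
quality = np.array([0, 1, 3, 0, 1, 2])

max_acceptable = 1   # science-team-style recommendation: keep flags 0-1 only
screened = np.where(quality <= max_acceptable, values, np.nan)

print(round(float(np.nanmean(screened)), 3))   # statistics ignore screened pixels
```

    Doing this on the server side, before delivery, is what spares each user from re-implementing the filtering logic.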

  16. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    PubMed

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit to the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but is exceptionally time consuming and is not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), the Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).

  17. Leveraging the power of pooled data for cancer outcomes research.

    PubMed

    Hugh-Yeun, Kiara; Cheung, Winson Y

    2016-08-02

    Clinical trials continue to be the gold standard for determining the efficacy of novel cancer treatments, but they may also expose participants to the potential risks of unpredictable or severe toxicities. The development of validated tools that better inform patients of the benefits and risks associated with clinical trial participation can facilitate the informed consent process. The design and validation of such instruments are strengthened when we leverage the power of pooled data analysis for cancer outcomes research. In a recent study published in the Journal of Clinical Oncology entitled "Determinants of early mortality among 37,568 patients with colon cancer who participated in 25 clinical trials from the adjuvant colon cancer endpoints database," using a large pooled analysis of over 30,000 study participants who were enrolled in clinical trials of adjuvant therapy for early-stage colon cancer, we developed and validated a nomogram depicting the predictors of early cancer mortality. This database of pooled individual-level data allowed for a comprehensive analysis of poor prognostic factors associated with early death; furthermore, it enabled the creation of a nomogram that was able to reliably capture and quantify the benefit-to-risk profile for patients who are considering clinical trial participation. This tool can facilitate treatment decision-making discussions. As China and other Asian countries continue to conduct oncology clinical trials, efforts to collate patient-level information from these studies into a large data repository should be strongly considered since pooled data can increase future capacity for cancer outcomes research, which, in turn, can enhance patient-physician discussions and optimize clinical care.

  18. Vehicle Thermal Management Models and Tools | Transportation Research |

    Science.gov Websites

    The National Renewable Energy Laboratory's (NREL's) vehicle thermal management modeling tools allow researchers to assess the trade-offs and calculate the potential benefits of thermal design options.

  19. A free tool integrating GIS features and workflows to evaluate sediment connectivity in alpine catchments

    NASA Astrophysics Data System (ADS)

    Crema, Stefano; Schenato, Luca; Goldin, Beatrice; Marchi, Lorenzo; Cavalli, Marco

    2014-05-01

    The increased interest in sediment connectivity has led the geomorphologists' community to focus on sediment fluxes as a key process (Cavalli et al., 2013; Heckmann and Schwanghart, 2013). The challenge of dealing with erosion-related processes in alpine catchments is of primary relevance for different fields of investigation and application, including, but not limited to, natural hazards, hydraulic structure design, ecology and stream restoration. The present work focuses on the development of a free tool for sediment connectivity assessment as described in Cavalli et al. (2013), introducing some novel improvements. The choice of free software is motivated by the need to widen access and improve participation beyond the restrictions on algorithm customization typical of commercial software. Two features further enhance the tool: being completely free and equipped with a user-friendly interface, it targets researchers as well as stakeholders (e.g., local managers and civil protection authorities in charge of planning intervention priorities in the territory); and being written in the Python programming language, it can benefit from optimized algorithms for handling high-resolution DEMs (Digital Elevation Models) and for implementing propagation workflows. These two factors make the tool computationally competitive with the most recent commercial GIS products. The overall goal of this tool is to support the analysis of sediment connectivity while widening, as much as possible, the users' community among scientists and stakeholders. This aspect is crucial, as future improvements of this tool will benefit from user feedback aimed at refining the quantitative assessment of sediment connectivity as a major input for the optimal management of mountain areas. References: Cavalli, M., Trevisani, S., Comiti, F., Marchi, L., 2013. Geomorphometric assessment of spatial sediment connectivity in small Alpine catchments. Geomorphology 188, 31-41. Heckmann, T., Schwanghart, W., 2013. Geomorphic coupling and sediment connectivity in an alpine catchment - Exploring sediment cascades using graph theory. Geomorphology 182, 89-103.
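
    The index of connectivity assessed by Cavalli et al. (2013) compares an upslope component with the impedance accumulated along the downstream flow path. A minimal sketch of the formula, with made-up weights, slopes and path lengths:

```python
import math

def connectivity_index(mean_weight, mean_slope, area, downstream_path):
    """Index of connectivity in the spirit of Cavalli et al. (2013):
    IC = log10(Dup / Ddn), with Dup = W * S * sqrt(A) and
    Ddn = sum(d_i / (w_i * s_i)) along the downstream flow path."""
    d_up = mean_weight * mean_slope * math.sqrt(area)
    d_dn = sum(d / (w * s) for d, w, s in downstream_path)
    return math.log10(d_up / d_dn)

# Illustrative cell: 5000 m2 of contributing area, and a downstream path given
# as (length, weight, slope) triples per traversed cell (made-up numbers).
ic = connectivity_index(
    mean_weight=0.6, mean_slope=0.3, area=5000.0,
    downstream_path=[(10.0, 0.6, 0.25), (10.0, 0.5, 0.30)],
)
print(round(ic, 2))   # low IC: the cell is poorly coupled to the outlet
```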

  20. Systems scenarios: a tool for facilitating the socio-technical design of work systems.

    PubMed

    Hughes, Helen P N; Clegg, Chris W; Bolton, Lucy E; Machon, Lauren C

    2017-10-01

    The socio-technical systems approach to design is well documented. Recognising the benefits of this approach, organisations are increasingly trying to work with systems, rather than their component parts. However, few tools attempt to analyse the complexity inherent in such systems in ways that generate useful, practical outputs. In this paper, we outline the 'Systems Scenarios Tool' (SST), a novel, applied methodology that can be used by designers, end-users, consultants or researchers to help design or re-design work systems. The paper introduces the SST using examples of its application, and describes the potential benefits of its use, before reflecting on its limitations. Finally, we discuss potential opportunities for the tool, and describe sets of circumstances in which it might be used. Practitioner Summary: The paper presents a novel, applied methodological tool, named the 'Systems Scenarios Tool'. We believe this tool can be used as a point of reference by designers, end-users, consultants or researchers to help design or re-design work systems. Included in the paper are two worked examples demonstrating the tool's application.

  1. A simple way to unify multicriteria decision analysis (MCDA) and stochastic multicriteria acceptability analysis (SMAA) using a Dirichlet distribution in benefit-risk assessment.

    PubMed

    Saint-Hilary, Gaelle; Cadour, Stephanie; Robert, Veronique; Gasparini, Mauro

    2017-05-01

    Quantitative methodologies have been proposed to support decision making in drug development and monitoring. In particular, multicriteria decision analysis (MCDA) and stochastic multicriteria acceptability analysis (SMAA) are useful tools for assessing the benefit-risk ratio of medicines according to the performances of the treatments on several criteria, accounting for the preferences of the decision makers regarding the relative importance of these criteria. However, even in its probabilistic form, MCDA requires exact elicitation of the criterion weights by the decision makers, which may be difficult to achieve in practice. SMAA allows for more flexibility and can be used with unknown or partially known preferences, but it is less popular due to its increased complexity and the high degree of uncertainty in its results. In this paper, we propose a simple model that generalizes both MCDA and SMAA by applying a Dirichlet distribution to the criterion weights and varying its parameters. This single model encompasses both MCDA and SMAA, and allows for a more extended exploration of the benefit-risk assessment of treatments. The precision of its results depends on the precision parameter of the Dirichlet distribution, which can be naturally interpreted as the strength of confidence of the decision makers in their elicitation of preferences. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
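
    The proposal is straightforward to simulate: draw criterion weights from a Dirichlet distribution centred on the elicited weights, compute each treatment's weighted score per draw, and estimate the probability that a treatment ranks first. The treatments, scores and parametrization below are invented for illustration and may differ from the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical benefit-risk table: two treatments scored on three criteria
# (e.g. efficacy, tolerability, safety), normalized so higher is better.
scores = np.array([
    [0.8, 0.6, 0.4],   # treatment A: strong efficacy, more toxicity
    [0.6, 0.7, 0.7],   # treatment B: balanced profile
])
w0 = np.array([0.5, 0.3, 0.2])   # elicited mean weights for the criteria

# Concentration alpha encodes the decision makers' confidence: large alpha
# approaches fixed-weight MCDA, small alpha approaches preference-free SMAA.
results = {}
for alpha in (1.0, 100.0):
    weights = rng.dirichlet(alpha * 3 * w0, size=10_000)
    utilities = weights @ scores.T                            # (draws, treatments)
    results[alpha] = (utilities.argmax(axis=1) == 0).mean()   # P(A ranks first)
    print(alpha, round(results[alpha], 2))
```

    As alpha grows, the acceptability of the better-at-mean-weights treatment sharpens toward the deterministic MCDA conclusion.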

  2. Sounding Out Science: Incorporating Audio Technology to Assist Students with Learning Differences in Science Education

    NASA Astrophysics Data System (ADS)

    Gomes, Clement V.

    With the current focus on having all students reach scientific literacy in the U.S., there exists a need to support marginalized students, such as those with Learning Disabilities/Differences (LD), to reach the same educational goals as their mainstream counterparts. This dissertation examines the benefits of using audio assistive technology on the iPad to support LD students in achieving comprehension of science vocabulary and semantics. The dissertation is composed of two papers, both of which include qualitative information supported by quantified data. The first paper, titled Using Technology to Overcome Fundamental Literacy Constraints for Students with Learning Differences to Achieve Scientific Literacy, provides quantified evidence from pretest and posttest analysis that audio technology can be beneficial for seventh-grade LD students when learning new and unfamiliar science content. Analysis of observations and student interviews supports the findings. The second paper, titled Time, Energy, and Motivation: Utilizing Technology to Ease Science Understanding for Students with Learning Differences, supports the importance of creating technology that is clear, audible, and easy for students to use, so that they benefit from and want to use the learning tool. Multiple correlation analysis of Likert survey data was used to identify four major items, supported by analysis of observations of and interviews with students, parents, and educators. This study provides useful information to support the rising number of identified LD students and their parents and teachers by presenting the benefits of using audio assistive technology to learn science.

  3. Image guidance improves localization of sonographically occult colorectal liver metastases

    NASA Astrophysics Data System (ADS)

    Leung, Universe; Simpson, Amber L.; Adams, Lauryn B.; Jarnagin, William R.; Miga, Michael I.; Kingham, T. Peter

    2015-03-01

    Assessing the therapeutic benefit of surgical navigation systems is a challenging problem in image-guided surgery. The exact clinical indications for patients who may benefit from these systems are not always clear, particularly for abdominal surgery, where image-guidance systems have failed to take hold in the same way as in orthopedic and neurosurgical applications. We report interim analysis of a prospective clinical trial for localizing small colorectal liver metastases using the Explorer system (Path Finder Technologies, Nashville, TN). Colorectal liver metastases are small lesions that can be difficult to identify with conventional intraoperative ultrasound due to echogenicity changes in the liver as a result of chemotherapy and other preoperative treatments. Interim analysis with eighteen patients shows that 9 of 15 (60%) of these occult lesions could be detected with image guidance. Image guidance changed intraoperative management in 3 (17%) cases. These results suggest that image guidance is a promising tool for localization of small occult liver metastases and that the indications for image-guided surgery are expanding.

  4. Global alliances effect in coalition forming

    NASA Astrophysics Data System (ADS)

    Vinogradova, Galina; Galam, Serge

    2014-11-01

    Coalition forming is investigated among countries, which are coupled by short-range interactions under the influence of externally set, opposing global alliances. The model extends a recent Natural Model of coalition forming inspired by Statistical Physics, where instabilities are a consequence of decentralized maximization of the individual benefits of actors. In contrast to physics, where spins can only evaluate the immediate cost/benefit of a flip of orientation, countries have a long horizon of rationality, which is associated with the ability to envision a path to a better configuration even at the cost of passing through intermediate losing states. The stabilizing effect is produced through polarization by the global alliances of either a particular unique global interest factor or multiple simultaneous ones. This model provides a versatile theoretical tool for the analysis of real cases and the design of novel strategies. Such analysis is provided for several real cases, including the Eurozone. The results shed new light on the understanding of the complex phenomenon of planned stabilization in coalition forming.

  5. Introduction of new process technology into the wastewater treatment sector.

    PubMed

    Parker, Denny S

    2011-06-01

    Innovative wastewater treatment technologies are developed to respond to changing regulatory requirements, increase efficiency, and enhance sustainability or to reduce capital or operating costs. Drawing from experience of five successful new process introductions from both the inventor/developer's and adopter's viewpoints coupled with the application of marketing analysis tools (an S curve), the phases of new technology market penetration can be identified along with the influence of market drivers, marketing, patents and early adopters. The analysis is used to identify measures that have increased the capture of benefits from new technology introduction. These have included funding by the government for research and demonstrations, transparency of information, and the provision of independent technology evaluations. To reduce the barriers and speed the introduction of new technology, and thereby harvest the full benefits from it, our industry must develop mechanisms for sharing risks and any consequences of failure more broadly than just amongst the early adopters. WEF and WERF will continue to have the central role in providing reliable information networks and independent technology evaluations.

  6. Oversight and Community Connections: Building Support for Data Collection by Making it Meaningful

    NASA Astrophysics Data System (ADS)

    Halpern, M.

    2017-12-01

    The continued collection and availability of federal government data becomes more vulnerable when few people know it exists or appreciate its utility. Policymakers will only fight for continued investment in data collection if they can see tangible benefits and feel pressure to do so. Many datasets and analysis tools exist that could benefit from more publicity. Through multiple case studies, we will explore methods that experts can use to connect with local communities and institutions on data sharing and analysis projects that assist with community development and resilience while demonstrating the importance of federal data to people's lives. We will discuss the types of collaborations that are most likely to result in successful outcomes. We will suggest ways that scientists can communicate their successes to policymakers and coordinate with other scientists across the country to ensure that data collection and availability continue to be a national priority and that any attempts to reduce capacity are met with effective resistance.

  7. Ethical Issues in the Use of Animal Models for Tissue Engineering: Reflections on Legal Aspects, Moral Theory, Three Rs Strategies, and Harm-Benefit Analysis.

    PubMed

    Liguori, Gabriel R; Jeronimus, Bertus F; de Aquinas Liguori, Tácia T; Moreira, Luiz Felipe P; Harmsen, Martin C

    2017-12-01

    Animal experimentation requires a solid and rational moral foundation. Objective and empathic decision-making and protocol evaluation by researchers and ethics committees remain a difficult and sensitive matter. This article presents three perspectives that facilitate a consideration of the minimally acceptable standard for animal experiments, in particular in tissue engineering (TE) and regenerative medicine. First, we review the boundaries provided by law and public opinion in America and Europe. Second, we review contemporary moral theory to introduce the Neo-Rawlsian contractarian theory to objectively evaluate the ethics of animal experiments. Third, we introduce the importance of available reduction, replacement, and refinement strategies, which should be accounted for in moral decision-making and protocol evaluation of animal experiments. The three perspectives are integrated into an algorithmic and graphic harm-benefit analysis tool based on the most relevant aspects of animal models in TE. We conclude with a consideration of future avenues to improve animal experiments.

  8. Baseline characteristics predict risk of progression and response to combined medical therapy for benign prostatic hyperplasia (BPH).

    PubMed

    Kozminski, Michael A; Wei, John T; Nelson, Jason; Kent, David M

    2015-02-01

    To better risk stratify patients, using baseline characteristics, to help optimise decision-making for men with moderate-to-severe lower urinary tract symptoms (LUTS) secondary to benign prostatic hyperplasia (BPH) through a secondary analysis of the Medical Therapy of Prostatic Symptoms (MTOPS) trial. After review of the literature, we identified potential baseline risk factors for BPH progression. Using bivariate tests in a secondary analysis of MTOPS data, we determined which variables retained prognostic significance. We then used these factors in Cox proportional hazard modelling to: i) more comprehensively risk stratify the study population based on pre-treatment parameters and ii) to determine which risk strata stood to benefit most from medical intervention. In all, 3047 men were followed in MTOPS for a mean of 4.5 years. We found varying risks of progression across quartiles. Baseline BPH Impact Index score, post-void residual urine volume, serum prostate-specific antigen (PSA) level, age, American Urological Association Symptom Index score, and maximum urinary flow rate were found to significantly correlate with overall BPH progression in multivariable analysis. Using baseline factors permits estimation of individual patient risk for clinical progression and the benefits of medical therapy. A novel clinical decision tool based on these analyses will allow clinicians to weigh patient-specific benefits against possible risks of adverse effects for a given patient. © 2014 The Authors. BJU International © 2014 BJU International.
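
    Risk stratification of this kind rests on the Cox model's linear predictor: each baseline factor contributes beta_k * x_k to the log relative hazard, and patients are binned by the resulting score. The coefficients and simulated patients below are purely illustrative, not the MTOPS estimates:

```python
import numpy as np

# Hypothetical coefficients for a BPH-progression risk score; the secondary
# analysis fit a Cox model, but these numbers are illustrative only.
coef = {"psa": 0.25, "age": 0.03, "symptom_score": 0.08, "qmax": -0.10}

rng = np.random.default_rng(3)
n = 400
patients = {
    "psa": rng.lognormal(0.5, 0.6, n),            # serum PSA (ng/mL)
    "age": rng.normal(65, 8, n),                  # years
    "symptom_score": rng.integers(8, 30, n).astype(float),
    "qmax": rng.normal(10, 3, n),                 # max flow rate (mL/s)
}

# Cox linear predictor: log relative hazard = sum(beta_k * x_k); patients are
# then split at the quartiles of the predictor to form four risk strata.
lp = sum(coef[k] * patients[k] for k in coef)
quartiles = np.quantile(lp, [0.25, 0.5, 0.75])
strata = np.digitize(lp, quartiles)   # 0 = lowest risk, 3 = highest risk
print(np.bincount(strata))
```

    Comparing event rates (and treatment effects) across such strata is what reveals which patients stand to benefit most from therapy.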

  9. Analysis of eye-tracking experiments performed on a Tobii T60

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banks, David C

    2008-01-01

    Commercial eye-gaze trackers have the potential to be an important tool for quantifying the benefits of new visualization techniques. The expense of such trackers has made their use relatively infrequent in visualization studies. As such, it is difficult for researchers to compare multiple devices: obtaining several demonstration models is impractical in cost and time, and quantitative measures from real-world use are not readily available. In this paper, we present a sample protocol to determine the accuracy of a gaze-tracking device.
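
    Accuracy in such a protocol is typically reported as the angular error between a known fixation target and the reported gaze position. A sketch under assumed display geometry and viewing distance (the paper's actual parameters are not given here):

```python
import math

# Hypothetical protocol sketch: a participant fixates known on-screen targets
# while the tracker reports gaze samples. Accuracy = mean angular error
# between target and gaze, converted from pixels to degrees of visual angle.

SCREEN_PX_PER_CM = 1280 / 34.0   # assumed display resolution and width
VIEW_DIST_CM = 60.0              # assumed viewing distance

def angular_error_deg(target_px, gaze_px):
    """Angle subtended at the eye by the target-to-gaze offset."""
    dx = gaze_px[0] - target_px[0]
    dy = gaze_px[1] - target_px[1]
    offset_cm = math.hypot(dx, dy) / SCREEN_PX_PER_CM
    return math.degrees(math.atan2(offset_cm, VIEW_DIST_CM))

# One fixation target and three hypothetical gaze samples:
target = (640, 512)
samples = [(648, 510), (655, 520), (642, 505)]
errors = [angular_error_deg(target, s) for s in samples]
print(f"mean accuracy: {sum(errors) / len(errors):.2f} deg")
```

    Averaging such errors over a grid of targets gives the device-level accuracy figure a protocol like this would report.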

  10. A Longitudinal Analysis of the Acceptance Rates of the Navy’s Voluntary Separation Incentive/Special Separation Benefit (VSI/SSB) Program

    DTIC Science & Technology

    1993-09-23

    Authorization act, as one of the most visible policy tools in its current strategy to downsize the military. The program has been fairly successful in...as substantial reenlistment bonuses to keep quality personnel. These policies have been successful. Today’s military is the most senior of any in the...last 50 years. Ironically, it is the successes of manpower planners in developing these policies, coupled with their increased understanding of the

  11. Disease management with ARIMA model in time series.

    PubMed

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.
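
    An ARIMA(p,d,q) analysis differences the series d times, then fits autoregressive and moving-average terms. As a minimal stand-in for a full ARIMA package, the sketch below fits an ARIMA(1,1,0), i.e. an AR(1) on first differences, by ordinary least squares on synthetic data:

```python
import numpy as np

# Minimal ARIMA(1,1,0) sketch: difference the series once (the "I" step),
# then fit an AR(1) coefficient by ordinary least squares (the "AR" step).
# A production analysis would use a full ARIMA implementation instead.

def fit_arima_110(series):
    diff = np.diff(series)              # d = 1: first differences
    x, y = diff[:-1], diff[1:]          # lag-1 pairs
    phi = float(x @ y / (x @ x))        # AR(1) coefficient by OLS
    return phi

def forecast_next(series, phi):
    diff = np.diff(series)
    return series[-1] + phi * diff[-1]  # undo the differencing for the forecast

# Synthetic weekly case counts with trend and noise (illustration only).
rng = np.random.default_rng(1)
t = np.arange(120)
series = 50 + 0.8 * t + rng.normal(0, 2, 120).cumsum() * 0.3

phi = fit_arima_110(series)
print(f"estimated AR(1) coefficient on differences: {phi:.3f}")
print(f"one-step forecast: {forecast_next(series, phi):.1f}")
```

    In an intervention study, the model is fit to the pre-intervention segment and the post-intervention observations are compared against its forecasts.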

  12. A Collaborative Reasoning Maintenance System for a Reliable Application of Legislations

    NASA Astrophysics Data System (ADS)

    Tamisier, Thomas; Didry, Yoann; Parisot, Olivier; Feltz, Fernand

    Decision support systems are nowadays used to disentangle all kinds of intricate situations and perform sophisticated analysis. Moreover, they are applied in areas where the knowledge can be heterogeneous, partially un-formalized, implicit, or diffuse. The representation and management of this knowledge become the key point to ensure the proper functioning of the system and keep an intuitive view upon its expected behavior. This paper presents a generic architecture for implementing knowledge-based systems used in collaborative business, where the knowledge is organized into different databases, according to the usage, persistence and quality of the information. This approach is illustrated with Cadral, a customizable automated tool built on this architecture and used for processing family benefits applications at the National Family Benefits Fund of the Grand-Duchy of Luxembourg.

  13. A Preliminary Approach to Adding Indicators of Ecosystem Service Benefits to a Wetlands Functional Assessment Tool

    EPA Science Inventory

    State-level managers and restoration advocates have expressed a desire for approaches that address wetlands services and benefits for two purposes: to demonstrate the benefits of money budgeted for restoration, and to compare proposals when awarding restoration funds for specific...

  14. Reaping the benefits of an open systems approach: getting the commercial approach right

    NASA Astrophysics Data System (ADS)

    Pearson, Gavin; Dawe, Tony; Stubbs, Peter; Worthington, Olwen

    2016-05-01

    Critical to reaping the benefits of an Open System Approach within Defence, or any other sector, is the ability to design the appropriate commercial model (or framework). This paper reports on the development and testing of a commercial strategy decision support tool. The tool set comprises a number of elements, including a process model, and provides business intelligence insights into likely supplier behaviour. The tool has been developed by subject matter experts and has been tested with a number of UK Defence procurement teams. The paper will present the commercial model framework, the elements of the toolset and the results of testing.

  15. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  16. Economic feasibility study for improving drinking water quality: a case study of arsenic contamination in rural Argentina.

    PubMed

    Molinos-Senante, María; Perez Carrera, Alejo; Hernández-Sancho, Francesc; Fernández-Cirelli, Alicia; Sala-Garrido, Ramón

    2014-12-01

    Economic studies are essential in evaluating the potential external investment support and/or internal tariffs available to improve drinking water quality. Cost-benefit analysis (CBA) is a useful tool to assess the economic feasibility of such interventions, i.e. to take some form of action to improve the drinking water quality. CBA should involve the market and non-market effects associated with the intervention. An economic framework was proposed in this study, which estimated the health avoided costs and the environmental benefits for the net present value of reducing the pollutant concentrations in drinking water. We conducted an empirical application to assess the economic feasibility of removing arsenic from water in a rural area of Argentina. Four small-scale methods were evaluated in our study. The results indicated that the inclusion of non-market benefits was integral to supporting investment projects. In addition, the application of the proposed framework will provide water authorities with more complete information for the decision-making process.
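
    The core of such a CBA is a net-present-value calculation over discounted annual flows, computed with and without the non-market benefits. The figures below are invented placeholders (not the Argentine case-study values), chosen so that including the non-market term flips the sign of the result, mirroring the paper's conclusion that non-market benefits can be integral to supporting investment:

```python
# Hypothetical cost-benefit sketch for a water-treatment intervention:
# NPV = sum_t (benefits_t - costs_t) / (1 + r)^t. All figures are invented
# placeholders, not values from the Argentine case study.

def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

capex = 200_000                      # year-0 investment
annual_cost = 8_000                  # operation and maintenance per year
health_avoided = 25_000              # avoided health costs per year (market)
environmental = 6_000                # non-market environmental benefit per year

years = 15
rate = 0.05

flows = [-capex] + [(health_avoided + environmental) - annual_cost] * years
with_nonmarket = npv(flows, rate)

flows_market_only = [-capex] + [health_avoided - annual_cost] * years
market_only = npv(flows_market_only, rate)

print(f"NPV with non-market benefits: {with_nonmarket:,.0f}")
print(f"NPV market benefits only:     {market_only:,.0f}")
```

    With these placeholder numbers the project is only justified once the environmental benefit is monetized and included.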

  17. Cost-benefit analysis of water-reuse projects for environmental purposes: a case study for Spanish wastewater treatment plants.

    PubMed

    Molinos-Senante, M; Hernández-Sancho, F; Sala-Garrido, R

    2011-12-01

    Water reuse is an emerging and promising non-conventional water resource. Feasibility studies are essential tools in the decision-making process for the implementation of water-reuse projects. However, the methods used to assess economic feasibility tend to focus on internal costs, while external impacts are relegated to unsubstantiated statements about the advantages of water reuse. Using the concept of shadow prices for undesirable outputs of water reclamation, the current study developed a theoretical methodology to assess internal and external economic impacts. The proposed methodological approach is applied to 13 wastewater treatment plants in the Valencia region of Spain that reuse effluent for environmental purposes. Internal benefit analyses indicated that only a proportion of projects were economically viable, whereas all projects were economically viable once external benefits were incorporated. In conclusion, economic feasibility assessments of water-reuse projects should quantitatively evaluate economic, environmental and resource-availability impacts. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Implementation of Haccp in the Mexican Poultry Processing Industry

    NASA Astrophysics Data System (ADS)

    Maldonado-Siman, Ema; Martínez-Hernández, Pedro Arturo; Ruíz-Flores, Agustín; García-Muñiz, José G.; Cadena-Meneses, José A.

    Hazard Analysis and Critical Control Point (HACCP) is a safety and quality management tool that has become a major issue in international and domestic food trade. However, detailed information on the costs and benefits of HACCP implementation is needed to provide appropriate advice to food processing plants. This paper reports on the perceptions of costs and benefits held by Mexican poultry processing plants and their sale destinations. The results suggest that the major costs of implementing and operating HACCP within poultry processing plants are record keeping and external technical advice. The main benefit indicated by the majority of processing plants is a reduction in microbial counts. Over 39% of poultry production is sent to nation-wide chains of supermarkets, and less than 13% is sent to international markets. It was concluded that the adoption of HACCP by the Mexican poultry processing sector is driven by the concern to increase and keep the domestic market, rather than to compete in the international market.

  19. Development of a shared decision-making tool to assist patients and clinicians with decisions on oral anticoagulant treatment for atrial fibrillation.

    PubMed

    Kaiser, Karen; Cheng, Wendy Y; Jensen, Sally; Clayman, Marla L; Thappa, Andrew; Schwiep, Frances; Chawla, Anita; Goldberger, Jeffrey J; Col, Nananda; Schein, Jeff

    2015-12-01

    Decision aids (DAs) are increasingly used to operationalize shared decision-making (SDM) but their development is not often described. Decisions about oral anticoagulants (OACs) for atrial fibrillation (AF) involve a trade-off between lowering stroke risk and increasing OAC-associated bleeding risk, and consideration of how treatment affects lifestyle. The benefits and risks of OACs hinge upon a patient's risk factors for stroke and bleeding and how they value these outcomes. We present the development of a DA about AF that estimates patients' risks for stroke and bleeding and assesses their preferences for outcomes. Based on a literature review and expert discussions, we identified stroke and major bleeding risk prediction models and embedded them into risk assessment modules. We identified the most important factors in choosing OAC treatment (warfarin used as the default reference OAC) through focus group discussions with AF patients who had used warfarin and clinician interviews. We then designed preference assessment and introductory modules accordingly. We integrated these modules into a prototype AF SDM tool and evaluated its usability through interviews. Our tool included four modules: (1) introduction to AF and OAC treatment risks and benefits; (2) stroke risk assessment; (3) bleeding risk assessment; and (4) preference assessment. Interactive risk calculators estimated patient-specific stroke and bleeding risks; graphics were developed to communicate these risks. After cognitive interviews, the content was improved. The final AF tool calculates patient-specific risks and benefits of OAC treatment and couples these estimates with patient preferences to improve clinical decision-making. The AF SDM tool may help patients choose whether OAC treatment is best for them and represents a patient-centered, integrative approach to educate patients on the benefits and risks of OAC treatment. Future research is needed to evaluate this tool in a real-world setting. 
The development process presented can be applied to similar SDM tools.
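
    A risk-assessment module of this kind scores a patient's baseline factors. The abstract does not name the embedded prediction models, so the widely used CHA2DS2-VASc stroke-risk score serves here as an illustrative stand-in:

```python
# Sketch of a stroke-risk scoring module. CHA2DS2-VASc is a standard
# stroke-risk score in atrial fibrillation; the abstract does not specify
# which models the tool embeds, so treat this as an illustrative stand-in.

def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A
    score += 1 if female else 0                            # Sc (sex category)
    score += 1 if chf else 0                               # C
    score += 1 if hypertension else 0                      # H
    score += 1 if diabetes else 0                          # D
    score += 2 if prior_stroke_tia else 0                  # S2
    score += 1 if vascular_disease else 0                  # V
    return score

# Example patient: 72-year-old woman with hypertension and diabetes.
score = cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                     diabetes=True, prior_stroke_tia=False,
                     vascular_disease=False)
print(f"CHA2DS2-VASc score: {score}")  # -> 4
```

    A complete decision aid would pair such a stroke-risk score with a bleeding-risk score and the patient's elicited preferences, as the four modules described above do.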

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pennock, Kenneth; Makarov, Yuri V.; Rajagopal, Sankaran

    The need for proactive closed-loop integration of uncertainty information into system operations and probability-based controls is widely recognized, but rarely implemented in system operations. Proactive integration for this project means that the information concerning expected uncertainty ranges for net load and balancing requirements, including required balancing capacity, ramping and ramp duration characteristics, will be fed back into the generation commitment and dispatch algorithms to modify their performance so that potential shortages of these characteristics can be prevented. This basic, yet important, premise is the motivating factor for this project. The achieved project goal is to demonstrate the benefit of such a system. The project quantifies future uncertainties, predicts additional system balancing needs including the prediction intervals for capacity and ramping requirements of future dispatch intervals, evaluates the impacts of uncertainties on transmission including the risk of overloads and voltage problems, and explores opportunities for intra-hour generation adjustments helping to provide more flexibility for system operators. The resulting benefits culminate in more reliable grid operation in the face of increased system uncertainty and variability caused by solar power. The project identifies that solar power does not require special separate penetration level restrictions or penalization for its intermittency. Ultimately, the collective consideration of all sources of intermittency distributed over a wide area, unified with the comprehensive evaluation of various elements of the balancing process, i.e. capacity, ramping, and energy requirements, helps system operators more robustly and effectively balance generation against load and interchange. This project showed that doing so can facilitate more solar and other renewable resources on the grid without compromising reliability and control performance.
Efforts during the project included developing and integrating advanced probabilistic solar forecasts, including distributed PV forecasts, into closed-loop decision-making processes. Additionally, new uncertainty quantification methods and tools for the direct integration of uncertainty and variability information into grid operations at the transmission and distribution levels were developed and tested. During Phase 1, project work focused heavily on the design, development and demonstration of a set of processes and tools that could reliably and efficiently incorporate solar power into California’s grid operations. In Phase 2, connectivity between the ramping analysis tools and the market applications software was completed, multiple dispatch scenarios demonstrated a successful reduction of overall uncertainty, an analysis quantified increases in system operator reliability, and the transmission and distribution system uncertainty prediction tool was introduced to system operation engineers in a live webinar. The project met its goals: the experiments showed that the advancements to methods and tools, working together, benefit not only the California Independent System Operator but are also transferable to other system operators in the United States.
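
    One common way to quantify the uncertainty ranges described above is to form empirical prediction intervals from historical forecast errors; the project's actual method may differ. A sketch on synthetic errors:

```python
import numpy as np

# Sketch: derive a capacity-uncertainty range for a dispatch interval from
# historical net-load forecast errors, using empirical quantiles as a
# prediction interval. The error sample here is synthetic, not project data.

rng = np.random.default_rng(42)
errors_mw = rng.normal(0, 150, 2000)   # historical forecast errors (MW)

# 95% prediction interval from the 2.5% and 97.5% empirical quantiles.
lo, hi = np.quantile(errors_mw, [0.025, 0.975])

forecast_mw = 12_000
print(f"net-load forecast: {forecast_mw} MW")
print(f"95% range: {forecast_mw + lo:.0f} to {forecast_mw + hi:.0f} MW")
```

    Feeding such a range back into commitment and dispatch is what lets the scheduler hold enough capacity and ramping to cover the plausible worst case rather than just the point forecast.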

  1. Collaboration pathway(s) using new tools for optimizing `operational' climate monitoring from space

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2015-09-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a long-term solution requires transforming scientific missions into an optimized, robust `operational' constellation that addresses the collective needs of policy makers, scientific communities and global academic users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent rule-based expert system (RBES) optimization modeling of the intended NPOESS architecture becomes a surrogate for global operational climate monitoring architecture(s). These rule-based system tools provide valuable insight for global climate architectures through comparison and evaluation of alternatives and the sheer range of trade space explored. Optimization of climate monitoring architecture(s) for a partial list of ECVs (essential climate variables) is explored and described in detail, with dialogue on appropriate rule-based valuations. These optimization tools suggest global collaboration advantages and elicit responses from the audience and climate science community. This paper focuses on recent research exploring joint requirement implications of the high-profile NPOESS architecture and extends the research and tools to optimization for a climate-centric case study. This reflects work from SPIE RS Conferences 2013 and 2014, abridged for simplification [30, 32]. First, the heavily scrutinized NPOESS architecture inspired the recent research question: was complexity (as a cost/risk factor) overlooked when considering the benefits of aggregating different missions onto a single platform? Now, years later, a complete reversal: should agencies consider disaggregation as the answer? We'll discuss what some academic research suggests.
Second, we use the GCOS requirements of earth climate observations via ECVs (essential climate variables), many collected from space-based sensors, and accept their definitions of global coverage, intended to ensure that the needs of major global and international organizations (UNFCCC and IPCC) are met, as a core objective. We consider how new optimization tools such as rule-based engines (RBES) offer alternative methods of evaluating collaborative architectures and constellations, and what the trade space of optimized operational climate monitoring architectures of ECVs would look like. Third, using the RBES toolkit (2014), we demonstrate a climate-centric rule-based decision engine that optimizes architectural trades of earth observation satellite systems, allowing comparisons to existing architectures and yielding insights for global collaborative architectures. How difficult is it to pull together an optimized climate case study, utilizing, for example, 12 climate-based instruments on multiple existing platforms and a nominal handful of orbits, for the best cost and performance benefits against the collection requirements of a representative set of ECVs? How much effort and resources would an organization expect to invest to realize these analysis and utility benefits?

  2. Can we replace curation with information extraction software?

    PubMed

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs. © The Author(s) 2016. Published by Oxford University Press.

  3. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. 
The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
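
    The task-farm pattern that TaskDL manages, i.e. many independent tasks with no mid-computation communication, can be sketched outside IDL as well; here in Python with a thread pool and a stand-in task (mpiDL addresses the complementary case where tasks must exchange data mid-computation):

```python
from concurrent.futures import ThreadPoolExecutor

# Task-farm sketch: many independent tasks, no inter-task communication,
# results gathered at the end. The task body is a hypothetical stand-in
# for an independent data-analysis job, e.g. processing one image frame.

def process_frame(frame_id):
    return frame_id, sum(i * i for i in range(frame_id * 1000))

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order even though tasks run concurrently.
    results = dict(pool.map(process_frame, range(1, 9)))

print(f"processed {len(results)} frames")
```

    The appeal of the pattern, and of TaskDL, is exactly this simplicity: because tasks are independent, scaling out to a cluster requires no algorithmic changes.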

  4. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Technical Reports Server (NTRS)

    Monell, Donald W.; Piland, William M.

    1999-01-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g. manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment.
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  5. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Technical Reports Server (NTRS)

    Monell, Donald W.; Piland, William M.

    2000-01-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operation). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment.
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  6. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Astrophysics Data System (ADS)

    Monell, Donald W.; Piland, William M.

    2000-07-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  7. Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D

    NASA Astrophysics Data System (ADS)

    Bales, Ben; Pollock, Tresa; Petzold, Linda

    2017-06-01

    Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
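
    The histogram-of-oriented-gradients descriptor bins gradient orientations weighted by gradient magnitude, so shape statistics can be computed without ever segmenting the image. A minimal sketch of that core idea (real HOG adds cell/block structure and block normalization):

```python
import numpy as np

# Minimal HOG-style sketch: compute image gradients, then histogram their
# orientations weighted by magnitude. No segmentation step is involved.

def orientation_histogram(image, n_bins=9):
    gy, gx = np.gradient(image.astype(float))             # row- and col-gradients
    magnitude = np.hypot(gx, gy)
    angle = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)  # unsigned orientation
    bins = np.minimum((angle / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=magnitude.ravel(), minlength=n_bins)
    return hist / (hist.sum() + 1e-12)                     # normalized descriptor

# A vertical edge: gradients point horizontally, so the 0-degree bin dominates.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
hist = orientation_histogram(img)
print(f"dominant orientation bin: {int(np.argmax(hist))}")
```

    Precipitate shape anisotropy shows up directly in the spread of such a histogram, which is what makes the descriptor usable as a segmentation-free shape statistic.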

  8. An Economic Evaluation of Food Safety Education Interventions: Estimates and Critical Data Gaps.

    PubMed

    Zan, Hua; Lambea, Maria; McDowell, Joyce; Scharff, Robert L

    2017-08-01

    The economic evaluation of food safety interventions is an important tool that practitioners and policy makers use to assess the efficacy of their efforts. These evaluations are built on models that depend on accurate estimation of numerous input variables. In many cases, however, no data are available to determine input values, and expert opinion is used to generate estimates. This study uses a benefit-cost analysis of the food safety component of the adult Expanded Food and Nutrition Education Program (EFNEP) in Ohio as a vehicle for demonstrating how results based on variable values that are not objectively determined may be sensitive to alternative assumptions. In particular, the focus here is on how reported behavioral change is translated into economic benefits. Current gaps in the literature make it impossible to know with certainty how many people are protected by the education (what are the spillover effects?), the length of time education remains effective, and the level of risk reduction from changes in behavior. Based on EFNEP survey data, food safety education led 37.4% of participants to improve their food safety behaviors. Under reasonable default assumptions, benefits from this improvement significantly outweigh costs, yielding a benefit-cost ratio of between 6.2 and 10.0. Incorporating a sensitivity analysis using alternative estimates yields a much wider range of estimates (0.2 to 56.3), which highlights the importance of future research aimed at filling these gaps. Nevertheless, most reasonable assumptions lead to estimates of benefits that justify the costs.
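    The sensitivity the abstract describes can be illustrated with a toy benefit-cost ratio. The model structure, parameter names, and all numeric values below are hypothetical, chosen only to show how uncertain inputs (spillover, duration, risk reduction) scale the ratio; they are not the EFNEP study's actual model or data.

```python
def benefit_cost_ratio(participants, improve_rate, cost,
                       spillover, years_effective, risk_reduction,
                       illness_cost_per_person_year):
    """Toy benefit-cost ratio for a food-safety education program.

    Benefits scale linearly in each uncertain input, so doubling an
    assumption such as the spillover factor doubles the ratio --
    which is why unmeasured inputs drive the wide (0.2 to 56.3) range
    reported in the abstract.
    """
    protected = participants * improve_rate * spillover
    benefits = (protected * years_effective * risk_reduction
                * illness_cost_per_person_year)
    return benefits / cost
```

    A usage example with invented numbers: 1000 participants, the study's 37.4% improvement rate, and one alternative spillover assumption.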

  9. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. 
This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, or group behavior, and did not interact with gender.

  10. Effective use of outcomes data in cardiovascular surgery

    NASA Astrophysics Data System (ADS)

    Yasnoff, William A.; Page, U. S.

    1994-12-01

    We have established the Merged Cardiac Registry (MCR), containing over 100,000 cardiovascular surgery cases from 47 sites in the U.S. and Europe. MCR outcomes data are used by the contributors for clinical quality improvement. A tool for prospective prediction of mortality and stroke for coronary artery bypass graft surgery (83% of the cases), known as RiskMaster, has been developed using a Bayesian model based on 40,819 patients who had their surgery from 1988-92, and tested on 4,244 patients from 1993. In patients with mortality risks of 10% or less (92% of cases), the average risk prediction is identical to the actual 30-day mortality (p > 0.37), while risk is overestimated in higher-risk patients. The receiver operating characteristic curve area for mortality prediction is 0.76 +/- 0.02. The RiskMaster prediction tool is now available online or as a standalone software package. MCR data also show that average mortality risk is identical for a given body surface area regardless of gender. Outcomes data measure the benefits of health care and are therefore an essential element in cost/benefit analysis. We believe their cost is justified by their use for the rational assessment of treatment alternatives.
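    The ROC curve area quoted for RiskMaster (0.76 +/- 0.02) has a simple probabilistic reading: it is the probability that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor. A sketch of that rank-based (Mann-Whitney) computation, for illustration only and not the RiskMaster model itself:

```python
def roc_auc(risks, outcomes):
    """Area under the ROC curve via the Mann-Whitney statistic.

    `risks` are predicted probabilities; `outcomes` are 0/1 events
    (1 = event occurred). Ties count as half a win.
    """
    pos = [r for r, y in zip(risks, outcomes) if y == 1]
    neg = [r for r, y in zip(risks, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.5 corresponds to chance-level discrimination, 1.0 to perfect separation of events from non-events.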

  11. Social, ethical and legal barriers to e-health.

    PubMed

    Anderson, James G

    2007-01-01

    Information technology such as electronic medical records (EMRs), electronic prescribing, and decision support systems are recognized as essential tools in Europe, the U.S., Canada, Australia, and New Zealand. However, significant barriers impede wide-scale adoption of these tools, especially EMR systems. The objectives of this study were to investigate the present status of information technology in health care and the benefits and barriers perceived by primary care physicians. Literature analysis and survey data from primary care physicians on adoption of information technology are reviewed. The U.S. trails European countries as well as Canada, Australia, and New Zealand in the use of information technology in primary care. The results of the study indicate that physicians in general perceive benefits to information technology, but also cite major barriers to its implementation in their practices. These barriers include lack of access to capital by health care providers, complex systems, lack of data standards that permit exchange of clinical data, privacy concerns, and legal barriers. Overcoming these barriers will require subsidies and performance incentives by payers and government; certification and standardization of vendor applications that permit clinical data exchange; removal of legal barriers; and greater security of medical data to convince practitioners and patients of the value of EMRs.

  12. Characterization of Apps and Other e-Tools for Medication Use: Insights Into Possible Benefits and Risks.

    PubMed

    van Kerkhof, Linda Wilhelmina Maria; van der Laar, Catharina Walthera Egbertha; de Jong, Charlie; Weda, Marjolein; Hegger, Ingrid

    2016-04-06

    In the past years, an enormous increase in the number of available health-related applications (apps) has occurred, from approximately 5800 in 2011 to over 23,000 in 2013, in the iTunes store. However, little is known regarding the use, possible effectiveness, and risks of these applications. In this study, we focused on apps and other e-tools related to medicine use. A large subset of the general population uses medicines and might benefit from tools that aid in the use of medicine. The aim of the present study was to gain more insight into the characteristics, possible risks, and possible benefits of health apps and e-tools related to medication use. We first made an inventory of apps and other e-tools for medication use (n=116). Tools were coded by two independent researchers, based on the information available in the app stores and websites. Subsequently, for one type of often-downloaded app (aimed at people with diabetes), we investigated users' experiences using an online questionnaire. Results of the inventory show that many apps for medication use are available and that they mainly offer simple functionalities. In line with this, the benefit most often experienced by users of apps for regulating blood glucose levels in the online questionnaire was "information quick and conveniently available". Other frequently experienced benefits were improved health and self-reliance. The inventory also shows that a minority of the apps for medication use carry potentially high risks and that for many of the apps it is unclear whether and how personal data are stored. In contrast, the online questionnaire among users of apps for blood glucose regulation indicates that they hardly ever experience problems or doubts concerning reliability and/or privacy. Respondents do, however, report disadvantages arising from incomplete apps and apps with poor ease of use. 
Respondents not using apps indicate that they might use them in the future if the reliability of the apps, and the instructions on how to use them, were clearer. This study shows that a small subset of apps and e-tools related to medicine use might involve relatively high risks. For the large group of non-medical-device apps, risks are lower but lie in the enormous availability and low level of regulation. In addition, both users and nonusers indicated that the overall quality of apps (ease of use, completeness, good functionalities) is an issue. Considering that many of the respondents using apps for regulating blood glucose levels experience important benefits (eg, improved health and self-reliance), improving the reliability and quality of apps is likely to yield substantial benefits. In addition, raising awareness of which apps exist and how to use them will likely help more people use them properly, enhancing the benefits of these tools.

  13. Characterization of Apps and Other e-Tools for Medication Use: Insights Into Possible Benefits and Risks

    PubMed Central

    van der Laar, Catharina Walthera Egbertha; de Jong, Charlie; Weda, Marjolein; Hegger, Ingrid

    2016-01-01

    Background In the past years, an enormous increase in the number of available health-related applications (apps) has occurred, from approximately 5800 in 2011 to over 23,000 in 2013, in the iTunes store. However, little is known regarding the use, possible effectiveness, and risks of these applications. In this study, we focused on apps and other e-tools related to medicine use. A large subset of the general population uses medicines and might benefit from tools that aid in the use of medicine. Objective The aim of the present study was to gain more insight into the characteristics, possible risks, and possible benefits of health apps and e-tools related to medication use. Methods We first made an inventory of apps and other e-tools for medication use (n=116). Tools were coded by two independent researchers, based on the information available in the app stores and websites. Subsequently, for one type of often-downloaded app (aimed at people with diabetes), we investigated users’ experiences using an online questionnaire. Results Results of the inventory show that many apps for medication use are available and that they mainly offer simple functionalities. In line with this, the benefit most often experienced by users of apps for regulating blood glucose levels in the online questionnaire was “information quick and conveniently available”. Other frequently experienced benefits were improved health and self-reliance. The inventory also shows that a minority of the apps for medication use carry potentially high risks and that for many of the apps it is unclear whether and how personal data are stored. In contrast, the online questionnaire among users of apps for blood glucose regulation indicates that they hardly ever experience problems or doubts concerning reliability and/or privacy. Respondents do, however, report disadvantages arising from incomplete apps and apps with poor ease of use. 
Respondents not using apps indicate that they might use them in the future if the reliability of the apps, and the instructions on how to use them, were clearer. Conclusions This study shows that a small subset of apps and e-tools related to medicine use might involve relatively high risks. For the large group of non-medical-device apps, risks are lower but lie in the enormous availability and low level of regulation. In addition, both users and nonusers indicated that the overall quality of apps (ease of use, completeness, good functionalities) is an issue. Considering that many of the respondents using apps for regulating blood glucose levels experience important benefits (eg, improved health and self-reliance), improving the reliability and quality of apps is likely to yield substantial benefits. In addition, raising awareness of which apps exist and how to use them will likely help more people use them properly, enhancing the benefits of these tools. PMID:27052946

  14. LoRTE: Detecting transposon-induced genomic variants using low coverage PacBio long read sequences.

    PubMed

    Disdero, Eric; Filée, Jonathan

    2017-01-01

    Population genomic analysis of transposable elements (TEs) has greatly benefited from recent advances in sequencing technologies. However, the short size of the reads and the propensity of transposable elements to nest in highly repeated regions of genomes limit the efficiency of bioinformatic tools when Illumina or 454 technologies are used. Fortunately, long-read sequencing technologies generating read lengths that may span the entire length of full transposons are now available. However, existing TE population genomics software was not designed to handle long reads, and the development of new dedicated tools is needed. LoRTE is the first tool able to use PacBio long-read sequences to identify transposon deletions and insertions between a reference genome and genomes of different strains or populations. Tested against simulated and genuine Drosophila melanogaster PacBio datasets, LoRTE appears to be a reliable and broadly applicable tool to study the dynamics and evolutionary impact of transposable elements using low-coverage, long-read sequences. LoRTE is an efficient and accurate tool to identify structural genomic variants caused by TE insertion or deletion. LoRTE is available for download at http://www.egce.cnrs-gif.fr/?p=6422.
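    The advantage of reads that span a whole transposon can be sketched with a toy classifier: compare the distance between two unique flanking anchors in the reference with the distance observed in a long read spanning the same locus. This is only an illustration of the spanning-read logic behind tools like LoRTE; the function, its parameters, and the tolerance are invented here, and the real tool handles alignment, repeats, and sequencing error.

```python
def classify_te_locus(ref_flank_gap, read_flank_gap, te_length, tol=50):
    """Toy insertion/deletion call from flank-to-flank distances (bp).

    If the read's gap exceeds the reference gap by roughly one TE
    length, the read carries an extra TE copy (insertion); if it is
    shorter by one TE length, the reference TE is absent (deletion).
    """
    delta = read_flank_gap - ref_flank_gap
    if abs(delta) <= tol:
        return "no_change"
    if abs(delta - te_length) <= tol:
        return "insertion"
    if abs(delta + te_length) <= tol:
        return "deletion"
    return "ambiguous"
```

    Short reads cannot support this comparison directly because both flanks rarely fall in one read, which is the limitation the abstract attributes to Illumina and 454 data.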

  15. Chemometric Strategies for Peak Detection and Profiling from Multidimensional Chromatography.

    PubMed

    Navarro-Reig, Meritxell; Bedia, Carmen; Tauler, Romà; Jaumot, Joaquim

    2018-04-03

    The increasing complexity of omics research has encouraged the development of new instrumental technologies able to deal with these challenging samples. The rise of multidimensional separations, in particular, should be highlighted because of the massive amounts of information they provide and the enhanced analyte determination they enable. Both proteomics and metabolomics benefit from the higher separation capacity achieved when different chromatographic dimensions are combined, either in LC or GC. However, this vast quantity of experimental information requires the application of chemometric data analysis strategies to retrieve the hidden knowledge, especially in the case of nontargeted studies. In this work, the most common chemometric tools and approaches for the analysis of this multidimensional chromatographic data are reviewed. First, different options for data preprocessing and enhancement of the instrumental signal are introduced. Next, the most used chemometric methods for the detection of chromatographic peaks and the resolution of chromatographic and spectral contributions (profiling) are presented. The description of these data analysis approaches is complemented with enlightening examples from omics fields that demonstrate the exceptional potential of combining multidimensional separation techniques with chemometric data analysis tools. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
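    At its simplest, the peak-detection step the review covers amounts to finding local maxima above a noise threshold in a chromatographic trace. The sketch below is deliberately naive and is not one of the reviewed chemometric methods, which resolve overlapping peaks with multivariate techniques rather than scalar maximum-picking.

```python
def find_peaks(signal, threshold):
    """Detect peaks as strict local maxima at or above a threshold.

    `signal` is a 1-D chromatographic trace; returns the indices of
    detected peak apexes. Endpoints are ignored.
    """
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] >= threshold and signal[i - 1] < signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks
```

    On multidimensional data, the same idea generalizes to surfaces and to decomposing each peak into chromatographic and spectral profiles, which is where the chemometric tools discussed in the review come in.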

  16. Experimental strain modal analysis for beam-like structure by using distributed fiber optics and its damage detection

    NASA Astrophysics Data System (ADS)

    Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo

    2017-07-01

    Modal analysis is commonly considered an effective tool for obtaining the intrinsic characteristics of structures, including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) can be regarded as an effective numerical tool for performing modal analysis. In this paper, experimental strain modal analysis based on the CMIF is introduced. Moreover, a distributed fiber-optic sensor, as a dense measuring device, has been applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes could be obtained. In order to test the effectiveness of the method, a lumped mass, treated as a linear damage component, was attached to the surface of the beam, and damage detection based on strain mode shapes was carried out. The results show that strain modal parameters can be estimated effectively by utilizing the CMIF, based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.

  17. Systematically evaluating interfaces for RNA-seq analysis from a life scientist perspective.

    PubMed

    Poplawski, Alicia; Marini, Federico; Hess, Moritz; Zeller, Tanja; Mazur, Johanna; Binder, Harald

    2016-03-01

    RNA-sequencing (RNA-seq) has become an established way for measuring gene expression in model organisms and humans. While methods development for refining the corresponding data processing and analysis pipeline is ongoing, protocols for typical steps have been proposed and are widely used. Several user interfaces have been developed for making such analysis steps accessible to life scientists without extensive knowledge of command line tools. We performed a systematic search and evaluation of such interfaces to investigate to what extent these can indeed facilitate RNA-seq data analysis. We found a total of 29 open source interfaces, and six of the more widely used interfaces were evaluated in detail. Central criteria for evaluation were ease of configuration, documentation, usability, computational demand and reporting. No interface scored best in all of these criteria, indicating that the final choice will depend on the specific perspective of users and the corresponding weighting of criteria. Considerable technical hurdles had to be overcome in our evaluation. For many users, this will diminish potential benefits compared with command line tools, leaving room for future improvement of interfaces. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  18. PsychoPy--Psychophysics software in Python.

    PubMed

    Peirce, Jonathan W

    2007-05-15

    The vast majority of studies into visual processing are conducted using computer display technology. The current paper describes a new free suite of software tools designed to make this task easier, using the latest advances in hardware and software. PsychoPy is a platform-independent experimental control system written in the Python interpreted language using entirely free libraries. PsychoPy scripts are designed to be extremely easy to read and write, while retaining complete power for the user to customize the stimuli and environment. Tools are provided within the package to allow everything from stimulus presentation and response collection (from a wide range of devices) to simple data analysis such as psychometric function fitting. Most importantly, PsychoPy is highly extensible and the whole system can evolve via user contributions. If a user wants to add support for a particular stimulus, analysis or hardware device they can look at the code for existing examples, modify them and submit the modifications back into the package so that the whole community benefits.
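    One of the analysis tasks the abstract mentions, psychometric function fitting, can be sketched without any PsychoPy-specific code. The logistic form and the grid-search fit below are a generic illustration under stated assumptions (2AFC task, chance level 0.5); PsychoPy's own fitting utilities have a different API, and the function names here are invented.

```python
import math

def logistic_psychometric(x, threshold, slope):
    """P(correct) for stimulus intensity x in a 2AFC-style task,
    rising from 0.5 (chance) toward 1.0."""
    return 0.5 + 0.5 / (1.0 + math.exp(-slope * (x - threshold)))

def fit_threshold(xs, ps, slopes, thresholds):
    """Least-squares fit of (threshold, slope) by grid search.

    xs: stimulus intensities; ps: observed proportions correct.
    Returns the best (threshold, slope) pair from the grids.
    """
    best, best_err = None, float("inf")
    for t in thresholds:
        for s in slopes:
            err = sum((logistic_psychometric(x, t, s) - p) ** 2
                      for x, p in zip(xs, ps))
            if err < best_err:
                best, best_err = (t, s), err
    return best
```

    With data generated from the model itself, the grid search recovers the generating parameters exactly, which is a convenient sanity check before fitting real response data.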

  19. The role of failure modes and effects analysis in showing the benefits of automation in the blood bank.

    PubMed

    Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew

    2013-05-01

    Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.
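    The arithmetic behind the RPNs in this study is standard FMEA practice: each failure mode is scored for severity, occurrence, and detectability, and the scores are multiplied. The sketch below uses invented failure modes and scores, not the blood banks' data, to show how automation (lower occurrence and better detection) drives the total down.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number for one failure mode (each score 1-10)."""
    return severity * occurrence * detection

def process_rpn(failure_modes):
    """Total RPN for a process, summed over (severity, occurrence,
    detection) triples, one per failure mode."""
    return sum(rpn(s, o, d) for s, o, d in failure_modes)
```

    For example, a manual process with modes scored (8, 5, 6) and (7, 4, 5) totals 380, while the same modes after automation, scored (8, 2, 3) and (7, 1, 2), total 62; severity is unchanged, but occurrence and detection improve.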

  20. PsychoPy—Psychophysics software in Python

    PubMed Central

    Peirce, Jonathan W.

    2007-01-01

    The vast majority of studies into visual processing are conducted using computer display technology. The current paper describes a new free suite of software tools designed to make this task easier, using the latest advances in hardware and software. PsychoPy is a platform-independent experimental control system written in the Python interpreted language using entirely free libraries. PsychoPy scripts are designed to be extremely easy to read and write, while retaining complete power for the user to customize the stimuli and environment. Tools are provided within the package to allow everything from stimulus presentation and response collection (from a wide range of devices) to simple data analysis such as psychometric function fitting. Most importantly, PsychoPy is highly extensible and the whole system can evolve via user contributions. If a user wants to add support for a particular stimulus, analysis or hardware device they can look at the code for existing examples, modify them and submit the modifications back into the package so that the whole community benefits. PMID:17254636

  1. The advanced software development workstation project

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  2. Model based systems engineering for astronomical projects

    NASA Astrophysics Data System (ADS)

    Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.

    2014-08-01

    Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).

  3. Optimum Design of LLC Resonant Converter using Inductance Ratio (Lm/Lr)

    NASA Astrophysics Data System (ADS)

    Palle, Kowstubha; Krishnaveni, K.; Ramesh Reddy, Kolli

    2017-06-01

    The main benefits of the LLC resonant dc/dc converter over conventional series and parallel resonant converters are its light-load regulation, lower circulating currents, larger bandwidth for zero-voltage switching, and less tuning of the switching frequency for controlled output. A unique analytical tool, called fundamental harmonic approximation with peak gain adjustment, is used for designing the converter. In this paper, an optimum design of the converter is proposed by considering three different design criteria with different values of the inductance ratio (Lm/Lr) to achieve good efficiency at high input voltage. The optimum design includes analysis of the operating range, switching frequency range, primary-side switch losses, and stability. The analysis is carried out with simulation using software tools such as MATLAB and PSIM. The performance of the optimized design is demonstrated for a design specification of 12 V, 5 A output operating over an input voltage range of 300-400 V using the FSFR 2100 IC from Texas Instruments.
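    The fundamental harmonic approximation (FHA) the authors use reduces the LLC tank to a textbook gain expression in the normalized frequency fn = fs/fr, quality factor Q, and the inductance ratio Ln = Lm/Lr that the paper sweeps. The sketch below is the standard FHA gain formula, shown for illustration, not the authors' exact design procedure or their peak-gain adjustment.

```python
import math

def llc_gain(fn, q, ln):
    """Normalized LLC voltage gain from the first-harmonic
    approximation.

    fn: switching frequency normalized to series resonance (fs/fr)
    q:  quality factor, sqrt(Lr/Cr) / Rac
    ln: inductance ratio Lm/Lr
    """
    a = 1.0 + (1.0 / ln) * (1.0 - 1.0 / fn**2)
    b = q * (fn - 1.0 / fn)
    return 1.0 / math.sqrt(a * a + b * b)
```

    At fn = 1 the gain is unity regardless of load, and below resonance the converter can boost (gain above 1), which is the region where the choice of Lm/Lr trades peak gain against circulating current.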

  4. A priori analysis: an application to the estimate of the uncertainty in course grades

    NASA Astrophysics Data System (ADS)

    Lippi, G. L.

    2014-07-01

    A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual, but striking, application is presented when teaching the data-treatment section of a laboratory course, to illustrate the characteristics of the APA and its potential for widespread use beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out, and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper.
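    One simple instance of a priori error propagation applied to grades: if a course grade is a weighted sum of assessments, G = sum(w_i * g_i), and each assessment carries an independent uncertainty sigma_i, standard propagation gives sigma_G = sqrt(sum((w_i * sigma_i)^2)). This is an illustrative special case, not the paper's full treatment.

```python
import math

def grade_uncertainty(weights, sigmas):
    """Standard deviation of a weighted course grade
    G = sum(w_i * g_i), assuming independent per-assessment
    uncertainties sigma_i."""
    return math.sqrt(sum((w * s) ** 2 for w, s in zip(weights, sigmas)))
```

    For two equally weighted assessments each uncertain by 2 grade points, the combined uncertainty is sqrt(2) points, i.e. less than either input because the weights average the errors down.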

  5. Integrative evaluation for sustainable decisions of urban wastewater system management under uncertainty

    NASA Astrophysics Data System (ADS)

    Hadjimichael, A.; Corominas, L.; Comas, J.

    2017-12-01

    With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable. It is present both in the stochastic interpretation of the system and in its ever-shifting natural behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework presented in the literature effectively addresses all these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions into the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that the installation of bigger volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience. 
Population numbers appear to affect the estimated benefits most, followed by electricity prices and climate change projections. The presented framework is expected to be a valuable tool for the next generation of UWS decision-making and the application demonstrates a novel and valuable integration of metrics and methods for UWS analysis.
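    The cost-benefit step of such a framework rests on discounting yearly benefit and cost streams to a net present value. The sketch below shows only that core arithmetic, with invented numbers; the framework's actual analysis monetizes environmental benefits and damages in far more detail.

```python
def npv(benefits, costs, rate):
    """Net present value of yearly benefit and cost streams.

    benefits, costs: per-year amounts, index 0 = present year.
    rate: annual discount rate (e.g. 0.10 for 10%).
    """
    return sum((b - c) / (1.0 + rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))
```

    For example, a 100-unit capital cost today repaid by a 110-unit benefit next year exactly breaks even at a 10% discount rate; any larger future benefit makes the NPV positive, which is the sense in which bigger storage volumes can justify larger capital costs.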

  6. TranscriptomeBrowser 3.0: introducing a new compendium of molecular interactions and a new visualization tool for the study of gene regulatory networks.

    PubMed

    Lepoivre, Cyrille; Bergon, Aurélie; Lopez, Fabrice; Perumal, Narayanan B; Nguyen, Catherine; Imbert, Jean; Puthier, Denis

    2012-01-31

    Deciphering gene regulatory networks by in silico approaches is a crucial step in the study of the molecular perturbations that occur in diseases. The development of regulatory maps is a tedious process requiring the comprehensive integration of various evidences scattered over biological databases. Thus, the research community would greatly benefit from having a unified database storing known and predicted molecular interactions. Furthermore, given the intrinsic complexity of the data, the development of new tools offering integrated and meaningful visualizations of molecular interactions is necessary to help users draw new hypotheses without being overwhelmed by the density of the subsequent graph. We extend the previously developed TranscriptomeBrowser database with a set of tables containing 1,594,978 human and mouse molecular interactions. The database includes: (i) predicted regulatory interactions (computed by scanning vertebrate alignments with a set of 1,213 position weight matrices), (ii) potential regulatory interactions inferred from systematic analysis of ChIP-seq experiments, (iii) regulatory interactions curated from the literature, (iv) predicted post-transcriptional regulation by micro-RNA, (v) protein kinase-substrate interactions and (vi) physical protein-protein interactions. In order to easily retrieve and efficiently analyze these interactions, we developed InteractomeBrowser, a graph-based knowledge browser that comes as a plug-in for TranscriptomeBrowser. The first objective of InteractomeBrowser is to provide a user-friendly tool to get new insight into any gene list by providing a context-specific display of putative regulatory and physical interactions. To achieve this, InteractomeBrowser relies on a "cell compartments-based layout" that makes use of a subset of the Gene Ontology to map gene products onto relevant cell compartments. 
This layout is particularly powerful for visual integration of heterogeneous biological information and is a productive avenue for generating new hypotheses. The second objective of InteractomeBrowser is to fill the gap between interaction databases and dynamic modeling. It is thus compatible with the network analysis software Cytoscape and with the Gene Interaction Network simulation software (GINsim). We provide examples underlining the benefits of this visualization tool for large gene set analysis related to thymocyte differentiation. The InteractomeBrowser plugin is a powerful tool for quick access to a knowledge database that includes both predicted and validated molecular interactions. InteractomeBrowser is available through the TranscriptomeBrowser framework and can be found at: http://tagc.univ-mrs.fr/tbrowser/. Our database is updated on a regular basis.
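The "cell compartments-based layout" described above can be illustrated with a minimal sketch: map each gene product to cell compartments via a small subset of Gene Ontology cellular-component annotations, then group an input gene list by compartment. The gene symbols and annotations below are hypothetical placeholders, not data drawn from the TranscriptomeBrowser database.

```python
# Hypothetical mapping from GO cellular-component terms to compartments.
GO_COMPARTMENT = {
    "GO:0005634": "nucleus",
    "GO:0005886": "plasma membrane",
    "GO:0005737": "cytoplasm",
}

# Hypothetical per-gene GO annotations (gene symbols are placeholders).
GENE_ANNOTATIONS = {
    "TF1": ["GO:0005634"],                  # transcription factor -> nucleus
    "RCPT1": ["GO:0005886"],                # receptor -> plasma membrane
    "KIN1": ["GO:0005737", "GO:0005634"],   # kinase seen in two compartments
}

def layout_by_compartment(genes):
    """Group an input gene list by the compartments its GO terms map to."""
    layout = {}
    for gene in genes:
        for go_id in GENE_ANNOTATIONS.get(gene, []):
            compartment = GO_COMPARTMENT.get(go_id)
            if compartment is not None:
                layout.setdefault(compartment, []).append(gene)
    return layout

print(layout_by_compartment(["TF1", "RCPT1", "KIN1"]))
# {'nucleus': ['TF1', 'KIN1'], 'plasma membrane': ['RCPT1'], 'cytoplasm': ['KIN1']}
```

A real implementation would query the database's annotation tables rather than hard-coded dictionaries, but the grouping logic is the essence of the layout.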

  7. The New, Improved 2016 SmartWay Truck Carrier Tool

    EPA Pesticide Factsheets

    This EPA presentation provides information on the SmartWay Transport Partnership Program, including key information about EPA, Partners' roles, benefits, tools, partner recognition, awards, and brand value. Transcript available

  8. The psychometric properties of exercise benefits/barriers scale among women

    PubMed Central

    Farahani, Leila Amiri; Parvizy, Soroor; Mohammadi, Eesa; Asadi-Lari, Mohsen; Kazemnejad, Anoshiravan; Hasanpoor-Azgahdy, Seyede Batool; Taghizadeh, Ziba

    2017-01-01

    Background and objective Despite the numerous health benefits of regular physical activity (PA), physical inactivity is a major health issue among women. The goal of the current study was to assess the validity and reliability of the exercise benefits/barriers scale among women between the ages of 18 and 65 years. This study was carried out among women residing in the Khoramroudi neighborhood in Tehran between December 2013 and February 2014. Methods In this descriptive, methodological study, 278 women completed three questionnaires, including the demographic data form and the Exercise Benefits/Barriers Scale. The construct validity, internal consistency, and stability of the scale were assessed by confirmatory factor analysis, Cronbach’s alpha, and the Spearman-Brown correlation coefficient, using SPSS 21 and LISREL 8.80, respectively. Results The confirmatory factor analysis showed that the Persian version of the EBBS was well structured. The Cronbach’s alpha coefficients for the total scale and its subscales were 0.927, 0.94 and 0.82, respectively. The Spearman-Brown correlation coefficient also showed good test-retest reliability. Conclusion The results of this study verified the reliability and validity of the applied instrument and introduced it as a tool to measure the benefits and barriers of physical activity among Iranian women. PMID:28894535

  9. The psychometric properties of exercise benefits/barriers scale among women.

    PubMed

    Farahani, Leila Amiri; Parvizy, Soroor; Mohammadi, Eesa; Asadi-Lari, Mohsen; Kazemnejad, Anoshiravan; Hasanpoor-Azgahdy, Seyede Batool; Taghizadeh, Ziba

    2017-07-01

    Despite the numerous health benefits of regular physical activity (PA), physical inactivity is a major health issue among women. The goal of the current study was to assess the validity and reliability of the exercise benefits/barriers scale among women between the ages of 18 and 65 years. This study was carried out among women residing in the Khoramroudi neighborhood in Tehran between December 2013 and February 2014. In this descriptive, methodological study, 278 women completed three questionnaires, including the demographic data form and the Exercise Benefits/Barriers Scale. The construct validity, internal consistency, and stability of the scale were assessed by confirmatory factor analysis, Cronbach's alpha, and the Spearman-Brown correlation coefficient, using SPSS 21 and LISREL 8.80, respectively. The confirmatory factor analysis showed that the Persian version of the EBBS was well structured. The Cronbach's alpha coefficients for the total scale and its subscales were 0.927, 0.94 and 0.82, respectively. The Spearman-Brown correlation coefficient also showed good test-retest reliability. The results of this study verified the reliability and validity of the applied instrument and introduced it as a tool to measure the benefits and barriers of physical activity among Iranian women.
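The reliability statistics reported in this abstract rest on two short formulas: Cronbach's alpha for internal consistency and the Spearman-Brown coefficient for stepped-up split-half (or test-retest style) reliability. The following is a plain-Python sketch with invented item scores, not the SPSS/LISREL procedure the authors used.

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score lists
    (one inner list per item, one entry per respondent)."""
    k = len(items)
    n = len(items[0])

    def sample_variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(sample_variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / sample_variance(totals))

def spearman_brown(r_half):
    """Step a half-test correlation up to full-test reliability."""
    return 2 * r_half / (1 + r_half)

# Three items scored by four respondents (made-up data).
items = [[1, 2, 3, 4], [2, 2, 3, 5], [1, 3, 3, 4]]
print(round(cronbach_alpha(items), 3))  # 0.947
print(round(spearman_brown(0.8), 3))    # 0.889
```

With real data the item lists would come from the questionnaire responses; values near the paper's reported 0.927 total-scale alpha indicate strong internal consistency.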

  10. Terminal - Tactical Separation Assured Flight Environment (T-TSafe)

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora; Tang, Huabin; Ballinger, Debbi

    2011-01-01

    The Tactical Separation Assured Flight Environment (TSAFE) has previously been tested as a conflict detection and resolution tool in the en-route phase of flight. Fast-time simulations of a terminal version of this tool, called Terminal TSAFE (T-TSAFE), have shown promise over current conflict detection tools: it has been shown to produce fewer false alerts (as low as 2 per hour) and better prediction of conflict time than Conflict Alert. The tool will be tested in the simulated terminal area of Los Angeles International Airport in a human-in-the-loop experiment to identify controller procedures and information requirements. The simulation will include comparisons of T-TSAFE with NASA's version of Conflict Alert. Other variables, such as altitude entry by the controller, which improve T-TSAFE's predictions for conflict detection, will also be tested. T-TSAFE integrates features of current conflict detection tools such as the Automated Terminal Proximity Alert used to alleviate compression errors in the final approach phase. Based on fast-time simulation analysis, the anticipated benefits of T-TSAFE over Conflict Alert include reduced false/missed alerts and increased time to predicted loss of separation. Other metrics that will be used to evaluate the tool's impact on the controller include controller intervention, workload, and situation awareness.

  11. Development of a tool to improve the quality of decision making in atrial fibrillation

    PubMed Central

    2011-01-01

    Background Decision-making about appropriate therapy to reduce the stroke risk associated with non-valvular atrial fibrillation (NVAF) involves the consideration of trade-offs among the benefits, risks, and inconveniences of different treatment options. The objective of this paper is to describe the development of a decision support tool for NVAF based on the provision of individualized risk estimates for stroke and bleeding and on preparing patients to communicate with their physicians about their values and potential treatment options. Methods We developed a tool based on the principles of the International Patient Decision Aids Standards. The tool focuses on the patient-physician dyad as the decision-making unit and emphasizes improving the interaction between the two. It is built on the recognition that the application of patient values to a specific treatment decision is complex and that the final treatment choice is best made through a process of patient-clinician communication. Results The tool provides education incorporating patients' illness perceptions to explain the relationship between NVAF and stroke, and then presents individualized risk estimates, derived using separate risk calculators for stroke and bleeding over a clinically meaningful time period (5 years), associated with no treatment, aspirin, and warfarin. Sequelae of both stroke and bleeding outcomes are also described. Patients are encouraged to verbalize how they value the incremental risks and benefits associated with each option and to write down specific concerns to address with their physician. A physician prompt to encourage patients to discuss their opinions is included as part of the decision support tool. In pilot testing with 11 participants (mean age 78 ± 9 years, 64% with ≤ high-school education), 8 (72%) rated ease of completion as "very easy," and 9 (81%) rated amount of information as "just right." 
Conclusions The risks and benefits of different treatment options for reduction of stroke in NVAF vary widely according to patients' comorbidities. This tool facilitates the provision of individualized outcome data and encourages patients to communicate with their physicians about these risks and benefits. Future studies will examine whether use of the tool is associated with improved quality of decision making. PMID:21977943

  12. Utility of an emulation and simulation computer model for air revitalization system hardware design, development, and test

    NASA Technical Reports Server (NTRS)

    Yanosy, J. L.; Rowell, L. F.

    1985-01-01

    Efforts to make increasing use of suitable computer programs in the design of hardware have the potential to reduce expenditures. In this context, NASA has evaluated the benefits provided by software tools through an application to the Environmental Control and Life Support (ECLS) system. The present paper is concerned with the benefits obtained by employing simulation tools in the case of the Air Revitalization System (ARS) of a Space Station life support system. Attention is given to the ARS functions and components, a computer program overview, a SAND (solid amine water desorbed) bed model description, a model validation, and details regarding the simulation benefits.

  13. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    PubMed

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, namely in human health, where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.
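The distinction drawn above can be made concrete with a minimal sketch: cost-benefit analysis compares monetised benefits to costs (a benefit-cost ratio), while cost-effectiveness analysis compares incremental cost to a natural unit of effect, such as cases averted (an incremental cost-effectiveness ratio). All figures below are invented for illustration.

```python
def benefit_cost_ratio(benefits, costs):
    """CBA summary measure: monetised benefits per unit of cost."""
    return benefits / costs

def incremental_cost_effectiveness(cost_a, effect_a, cost_b, effect_b):
    """ICER of programme B vs. A: extra cost per extra unit of effect."""
    return (cost_b - cost_a) / (effect_b - effect_a)

# CBA: a programme yielding $120k of monetised benefits for $80k of cost.
print(benefit_cost_ratio(120_000, 80_000))  # 1.5

# CEA: programme B averts 50 more cases than A for $25k of extra cost.
print(incremental_cost_effectiveness(100_000, 200, 125_000, 250))  # 500.0
```

The ICER's advantage, as the abstract notes, is that the denominator stays in natural units (cases averted, welfare scores), so no monetary value needs to be assigned to the health outcome itself.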

  14. Physical Activity: A Tool for Improving Health (Part 1--Biological Health Benefits)

    ERIC Educational Resources Information Center

    Gallaway, Patrick J.; Hongu, Nobuko

    2015-01-01

    Extension educators have been promoting and incorporating physical activities into their community-based programs and improving the health of individuals, particularly those with limited resources. This article is the first of a three-part series describing the benefits of physical activity for human health: 1) biological health benefits of…

  15. Physical Activity: A Tool for Improving Health (Part 2-Mental Health Benefits)

    ERIC Educational Resources Information Center

    Gallaway, Patrick J.; Hongu, Nobuko

    2016-01-01

    By promoting physical activities and incorporating them into their community-based programs, Extension professionals are improving the health of individuals, particularly those with limited resources. This article is the second in a three-part series describing the benefits of physical activity for human health: (1) biological health benefits of…

  16. The Production Effect: Costs and Benefits in Free Recall

    ERIC Educational Resources Information Center

    Jones, Angela C.; Pyc, Mary A.

    2014-01-01

    The production effect, the memorial benefit for information read aloud versus silently, has been touted as a simple memory improvement tool. The current experiments were designed to evaluate the relative costs and benefits of production using a free recall paradigm. Results extend beyond prior work showing a production effect only when production…

  17. Clinical Data Warehouse: An Effective Tool to Create Intelligence in Disease Management.

    PubMed

    Karami, Mahtab; Rahimi, Azin; Shahmirzadi, Ali Hosseini

    Clinical business intelligence tools such as the clinical data warehouse enable health care organizations to objectively assess the disease management programs that affect patients' quality of life and public well-being. The purpose of these programs is to reduce disease occurrence, improve patient care, and decrease health care costs. Therefore, applying a clinical data warehouse can be effective in generating useful information about aspects of patient care to facilitate budgeting, planning, research, process improvement, external reporting, benchmarking, and trend analysis, as well as to enable the decisions needed to prevent the progression or appearance of illness, in line with maintaining the health of the population. The aim of this review article is to describe the benefits of clinical data warehouse applications in creating intelligence for disease management programs.

  18. Using the Social Web to Supplement Classical Learning

    NASA Astrophysics Data System (ADS)

    Trausan-Matu, Stefan; Posea, Vlad; Rebedea, Traian; Chiru, Costin

    The paper describes a complex e-learning experiment that involved over 700 students who attended the Human-Computer Interaction course at the “Politehnica” University of Bucharest during the last 4 years. The experiment consisted of using social web technologies such as blogs and chat conferences to engage students in collaborative learning. The paper presents the learning scenario, the problems encountered, and the tools developed for solving these problems and assisting tutors in evaluating the activity of the students. The results of the experiment and of using the blog and chat analysis tools are also covered. Moreover, we show the benefits of using such a scenario for the learning community formed by the students who attended this course in order to supplement the classical teaching and learning paradigm.

  19. A service-based BLAST command tool supported by cloud infrastructures.

    PubMed

    Carrión, Abel; Blanquer, Ignacio; Hernández, Vicente

    2012-01-01

    Notwithstanding the benefits of distributed-computing infrastructures for empowering bioinformatics analysis tools with the needed computing and storage capability, actual use of these infrastructures is still low. Learning curves and deployment difficulties have reduced the impact on the wider research community. This article presents a porting strategy for BLAST based on a multiplatform client and a service that provides the same interface as sequential BLAST, thus reducing the learning curve with minimal impact on integration into existing workflows. The porting has been done using the execution and data access components from the EC project Venus-C and the Windows Azure infrastructure provided in this project. The results obtained demonstrate a low overhead on the global execution framework and reasonable speed-up and cost-efficiency with respect to a sequential version.

  20. Benefit from NASA

    NASA Image and Video Library

    2001-08-01

    Apollo-era technology spurred the development of cordless products that we take for granted every day. In the 1960s, NASA asked Black & Decker to develop a special drill that would be powerful enough to cut through hard layers of the lunar surface yet be lightweight, compact, and able to operate under its own power source, allowing Apollo astronauts to collect lunar samples farther away from the Lunar Excursion Module. In response, Black & Decker developed a computer program that analyzed and optimized drill motor operations. From this analysis, engineers were able to design a motor that was powerful yet required minimal battery power to operate. Since those first days of cordless products, Black & Decker has continued to refine this technology and now sells its rechargeable products worldwide (e.g., the Dustbuster, cordless tools for home and industrial use, and medical tools).

  1. Knee Arthroscopy Simulation: A Randomized Controlled Trial Evaluating the Effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) Tool.

    PubMed

    Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M

    2017-10-04

    Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool that utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants, using the Likert rating scale, subjectively rated the tool. 
The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); as a percentage of the maximum score, this corresponds to 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the control group, and an improvement of 23.4% (8.9 of 38). All participants agreed that the cognitive task analysis learning tool was a useful training adjunct to learning in the operating room. To our knowledge, this is the first cognitive task analysis in diagnostic knee arthroscopy that is user-friendly and inexpensive and has demonstrated significant benefits in training. The IKACTA will provide trainees with a demonstrably strong foundation in diagnostic knee arthroscopy that will flatten learning curves in both technical skills and decision-making.

  2. Contributions to the AIAA Guidance, Navigation and Control Conference

    NASA Technical Reports Server (NTRS)

    Campbell, S. D. (Editor)

    2002-01-01

    This report contains six papers presented by the Lincoln Laboratory Air Traffic Control Systems Group at the American Institute of Aeronautics & Astronautics (AIAA) Guidance, Navigation and Control (GNC) conference on 6-9 August 2001 in Montreal, Canada. The work reported was sponsored by the NASA Advanced Air Transportation Technologies (AATT) program and the FAA Free Flight Phase 1 (FFP1) program. The papers are based on studies completed at Lincoln Laboratory in collaboration with staff at NASA Ames Research Center. These papers were presented in the Air Traffic Automation Session of the conference and fall into three major areas: Traffic Analysis & Benefits Studies, Weather/Automation Integration, and Surface Surveillance. In the first area, a paper by Andrews & Robinson presents an analysis of the efficiency of runway operations at Dallas/Ft. Worth using a tool called PARO, and a paper by Welch, Andrews & Robinson presents delay benefit results for the Final Approach Spacing Tool (FAST). In the second area, a paper by Campbell et al. describes a new weather distribution system for the Center/TRACON Automation System (CTAS) that allows ingestion of multiple weather sources, and a paper by Vandevenne, Lloyd & Hogaboom describes the use of the NOAA Eta model as a backup wind data source for CTAS. Also in this area, a paper by Murphy & Campbell presents initial steps towards integrating weather-impacted routes into FAST. In the third area, a paper by Welch, Bussolari and Atkins presents an initial operational concept for using surface surveillance to reduce taxi delays.

  3. Mobile instant messaging for rural community health workers: a case from Malawi

    PubMed Central

    Pimmer, Christoph; Mhango, Susan; Mzumara, Alfred; Mbvundula, Francis

    2017-01-01

    ABSTRACT Background: Mobile instant messaging (MIM) tools, such as WhatsApp, have transformed global communication practice. In the field of global health, MIM is an increasingly used, but little understood, phenomenon. Objectives: It remains unclear how MIM can be used by rural community health workers (CHWs) and their facilitators, and what are the associated benefits and constraints. To address this gap, WhatsApp groups were implemented and researched in a rural setting in Malawi. Methods: The multi-site case study research triangulated interviews and focus groups of CHWs and facilitators with the thematic qualitative analysis of the actual conversations on WhatsApp. A survey with open questions and the quantitative analysis of WhatsApp conversations were used as supplementary triangulation sources. Results: The use of MIM was differentiated according to instrumental (e.g. mobilising health resources) and participatory purposes (e.g. the enactment of empathic ties). The identified benefits were centred on the enhanced ease and quality of communication of a geographically distributed health workforce, and the heightened connectedness of a professionally isolated health workforce. Alongside minor technical and connectivity issues, the main challenge for the CHWs was to negotiate divergent expectations regarding the social versus the instrumental use of the space. Conclusions: Despite some challenges and constraints, the implementation of WhatsApp was received positively by the CHWs and it was found to be a useful tool to support distributed rural health work. PMID:28914165

  4. Functional Analysis for an Integrated Capability of Arrival/Departure/Surface Management with Tactical Runway Management

    NASA Technical Reports Server (NTRS)

    Phojanamongkolkij, Nipa; Okuniek, Nikolai; Lohr, Gary W.; Schaper, Meilin; Christoffels, Lothar; Latorella, Kara A.

    2014-01-01

    The runway is a critical resource of any air transport system. It is used for arrivals, departures, and for taxiing aircraft, and is universally acknowledged as a constraining factor to capacity for both surface and airspace operations. It follows that investigation of the effective use of runways, in terms of selection and assignment as well as the timing and sequencing of traffic, is paramount to efficient traffic flows. Both the German Aerospace Center (DLR) and NASA have developed concepts and tools to improve atomic aspects of coordinated arrival/departure/surface management operations and runway configuration management. In December 2012, NASA entered into a Collaborative Agreement with DLR. Four collaborative work areas were identified, one of which is called "Runway Management." As part of collaborative research in the "Runway Management" area, which is conducted with the DLR Institute of Flight Guidance, located in Braunschweig, the goal is to develop an integrated system composed of the three DLR tools - arrival, departure, and surface management (collectively referred to as A/D/S-MAN) - and NASA's tactical runway configuration management (TRCM) tool. To achieve this goal, it is critical to prepare a concept of operations (ConOps) detailing how the NASA runway management and DLR arrival, departure, and surface management tools will function together to the benefit of each. To assist with the preparation of the ConOps, the integrated NASA and DLR tools are assessed through a functional analysis method described in this report. The report first provides the high-level operational environments for air traffic management (ATM) in Germany and in the U.S., and descriptions of DLR's A/D/S-MAN and NASA's TRCM tools at the level of detail necessary to complement the purpose of the study. Functional analyses of each tool and a completed functional analysis of an integrated system design are presented next in the report. 
Future efforts to fully develop the ConOps will include: developing scenarios to fully test environmental, procedural, and data availability assumptions; executing the analysis by a walk-through of the integrated system using these scenarios; defining the appropriate role of operators in terms of their monitoring requirements and decision authority; executing the analysis by a walk-through of the integrated system with operator involvement; characterizing the environmental, system data requirements, and operator role assumptions for the ConOps.

  5. Benefits, challenges, and best practices for involving audiences in the development of interactive coastal risk communication tools: Professional communicators' experiences

    NASA Astrophysics Data System (ADS)

    Stephens, S. H.; DeLorme, D.

    2017-12-01

    To make scientific information useful and usable to audiences, communicators must understand audience needs, expectations, and future applications. This presentation synthesizes benefits, challenges, and best practices resulting from a qualitative social science interview study of nine professionals on their experiences developing interactive visualization tools for communicating about coastal environmental risks. Online interactive risk visualization tools, such as flooding maps, are used to provide scientific information about the impacts of coastal hazards. These tools have a wide range of audiences and purposes, including time-sensitive emergency communication, infrastructure and natural resource planning, and simply starting a community conversation about risks. Thus, the science, purposes, and audiences of these tools require a multifaceted communication strategy. In order to make these tools usable and accepted by their audiences, many professional development teams solicit target end-user input or incorporate formal user-centered design into the development process. This presentation will share results of seven interviews with developers of U.S. interactive coastal risk communication tools, ranging from state-level to international in scope. Specific techniques and procedures for audience input that were used in these projects will be discussed, including ad-hoc conversations with users, iterative usability testing with project stakeholder groups, and other participatory mechanisms. The presentation will then focus on benefits, challenges, and recommendations for best practice that the interviewees disclosed about including audiences in their development projects. 
Presentation attendees will gain an understanding of different procedures and techniques that professionals employ to involve end-users in risk tool development projects, as well as important considerations and recommendations for effectively involving audiences in science communication design.

  6. Simulation Tools for Power Electronics Courses Based on Java Technologies

    ERIC Educational Resources Information Center

    Canesin, Carlos A.; Goncalves, Flavio A. S.; Sampaio, Leonardo P.

    2010-01-01

    This paper presents interactive power electronics educational tools. These interactive tools make use of the benefits of Java language to provide a dynamic and interactive approach to simulating steady-state ideal rectifiers (uncontrolled and controlled; single-phase and three-phase). Additionally, this paper discusses the development and use of…

  7. PROTOTYPE TOOL FOR EVALUATING THE COST AND EFFECTIVENESS OF GREENHOUSE GAS MITIGATION TECHNOLOGIES

    EPA Science Inventory

    The paper introduces the structure of a tool, being developed by the U.S. EPA's Office of Research and Development, that will be able to analyze the benefits of new technologies and strategies for controlling greenhouse gas (GHG) emissions. When completed, the tool will be able ...

  8. The Animated Library

    ERIC Educational Resources Information Center

    Brewer, Jim; Dyal, Donald H.; Sweet, Robert

    2009-01-01

    Libraries have always been poised at the crossroads of access tools and content. Librarians, their personnel, and supporters have worked for generations to create tools to store and utilize content for the benefit of patrons. Libraries house materials and the tools to unlock them; their staffers teach patrons to use the materials and associated…

  9. The Virtual Intercultural Team Tool

    ERIC Educational Resources Information Center

    Rus, Calin

    2010-01-01

    This article describes the Virtual Intercultural Team Tool (VITT) and discusses its processes and benefits. VIIT is a virtual platform designed with the aim of assisting European project teams to improve intercultural communication and build on their cultural diversity for effective implementation of their projects. It is a process-focused tool,…

  10. 76 FR 4904 - Agency Information Collection Request; 30-Day Public Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-27

    ... datasets that are not specific to individual's personal health information to improve decision making by... making health indicator datasets (data that is not associated with any individuals) and tools available.../health . These datasets and tools are anticipated to benefit development of applications, web-based tools...

  11. How Much Professional Development Is Enough? Meeting the Needs of Independent Music Teachers Learning to Use a Digital Tool

    ERIC Educational Resources Information Center

    Upitis, Rena; Brook, Julia

    2017-01-01

    Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…

  12. Tool Weighs Benefits, Risks of Raloxifene or Tamoxifen to Prevent Breast Cancer

    Cancer.gov

    Researchers have developed a benefit-risk index to help guide decisions on whether postmenopausal women at increased risk of developing breast cancer should take raloxifene or tamoxifen to reduce that risk.

  13. Communicating Value in Simulation: Cost-Benefit Analysis and Return on Investment.

    PubMed

    Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu

    2018-02-01

    Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and the economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost-effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes," our breakout session critically evaluated the cost-benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost-benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program. © 2017 by the Society for Academic Emergency Medicine.
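Two of the economic measures named above, return on investment and a benefit-cost ratio, can be sketched as follows for a hypothetical simulation training programme. The dollar figures are invented, and the formulas are the standard textbook definitions, not a method specific to this paper.

```python
def roi(benefits, costs):
    """Return on investment as a fraction: (benefits - costs) / costs."""
    return (benefits - costs) / costs

def benefit_cost_ratio(benefits, costs):
    """Monetised benefits per dollar of cost; > 1 means net positive."""
    return benefits / costs

programme_cost = 250_000  # simulators, staff time, space (hypothetical)
avoided_costs = 400_000   # e.g. monetised reduction in adverse events (hypothetical)

print(roi(avoided_costs, programme_cost))                 # 0.6, i.e. a 60% ROI
print(benefit_cost_ratio(avoided_costs, programme_cost))  # 1.6
```

The hard part in practice, as the paper emphasizes, is not the arithmetic but credibly monetising the benefits (avoided adverse events, reduced training time) that go into the numerator.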

  14. Patient behavior and the benefits of artificial intelligence: the perils of "dangerous" literacy and illusory patient empowerment.

    PubMed

    Schulz, Peter J; Nakamoto, Kent

    2013-08-01

    Artificial intelligence can provide important support of patient health. However, limits to realized benefits can arise as patients assume an active role in their health decisions. Distinguishing the concepts of health literacy and patient empowerment, we analyze conditions that bias patient use of the Internet and limit access to and impact of artificial intelligence. Improving health literacy in the face of the Internet requires significant guidance. Patients must be directed toward the appropriate tools and also provided with key background knowledge enabling them to use the tools and capitalize on the artificial intelligence technology. Benefits of tools employing artificial intelligence to promote health cannot be realized without recognizing and addressing the patients' desires, expectations, and limitations that impact their Internet behavior. In order to benefit from artificial intelligence, patients need a substantial level of background knowledge and skill in information use, i.e., health literacy. It is critical that health professionals respond to patient search for information on the Internet, first by guiding their search to relevant, authoritative, and responsive sources, and second by educating patients about how to interpret the information they are likely to encounter. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian MPC as a tool for Statistical Process Control (SPC). Methods: Varian’s MPC tool was used on three Truebeam linacs and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control plots, with the analysis performed in Matlab. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker’s honorarium from Varian.
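
    The Shewhart control-chart scheme described above is straightforward to reproduce. Below is a minimal sketch in Python (not the authors' Matlab code; the baseline readings and the z=1.96 limit are illustrative assumptions):

```python
import numpy as np

def shewhart_limits(baseline, z=1.96):
    """Centre line and control limits from a baseline period.

    z=1.96 gives ~95% limits, matching the level quoted in the abstract.
    """
    mu = np.mean(baseline)
    sigma = np.std(baseline, ddof=1)
    return mu, mu - z * sigma, mu + z * sigma

def out_of_control(values, lcl, ucl):
    """Indices of measurements falling outside the control limits."""
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

# Hypothetical daily beam-output readings (% of nominal)
baseline = [100.1, 99.8, 100.0, 99.9, 100.2, 100.0, 99.7, 100.1]
cl, lcl, ucl = shewhart_limits(baseline)

new_readings = [100.0, 99.9, 101.5]   # last value has drifted
print(out_of_control(new_readings, lcl, ucl))  # → [2]
```

    A centre line and limits are derived from a baseline period, and later readings are flagged when they fall outside the limits, mirroring the out-of-specification alerts the abstract describes.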

  16. A proof of concept for epidemiological research using structured reporting with pulmonary embolism as a use case.

    PubMed

    Daniel, Pinto Dos Santos; Sonja, Scheibl; Gordon, Arnhold; Aline, Maehringer-Kunz; Christoph, Düber; Peter, Mildenberger; Roman, Kloeckner

    2018-05-10

    This paper studies the possibilities of an integrated IT-based workflow for epidemiological research in pulmonary embolism using freely available tools and structured reporting. We included a total of 521 consecutive cases which had been referred to the radiology department for computed tomography pulmonary angiography (CTPA) with suspected pulmonary embolism (PE). Free-text reports were transformed into structured reports using a freely available IHE-MRRT-compliant reporting platform. D-dimer values were retrieved from the hospital's laboratory results system. All information was stored in the platform's database and visualized using freely available tools. For further analysis, we directly accessed the platform's database with an advanced analytics tool (RapidMiner). We were able to develop an integrated workflow for epidemiological statistics from reports obtained in clinical routine. The report data allowed for automated calculation of epidemiological parameters. Prevalence of pulmonary embolism was 27.6%. The mean age in patients with and without pulmonary embolism did not differ (62.8 years and 62.0 years, respectively, p=0.987). As expected, there was a significant difference in mean D-dimer values (10.13 mg/L FEU and 3.12 mg/L FEU, respectively, p<0.001). Structured reporting can make data obtained from clinical routine more accessible. Designing practical workflows is feasible using freely available tools and allows for the calculation of epidemiological statistics on a near real-time basis. Therefore, radiologists should push for the implementation of structured reporting in clinical routine. Advances in knowledge: Theoretical benefits of structured reporting have long been discussed, but practical implementation demonstrating those benefits has been lacking. Here we present a first experience providing proof that structured reporting will make data from clinical routine more accessible.
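
    The automated epidemiological calculations described here amount to simple aggregation over the structured-report database. A minimal pure-Python sketch with hypothetical toy records (the study itself queried the reporting platform's database with RapidMiner):

```python
# Each record: (pe_positive, d_dimer in mg/L FEU) — toy values, not study data
records = [(True, 9.8), (True, 10.5), (False, 3.0), (False, 3.2), (False, 3.1)]

positives = [d for pe, d in records if pe]
negatives = [d for pe, d in records if not pe]

prevalence = len(positives) / len(records)
mean_pos = sum(positives) / len(positives)
mean_neg = sum(negatives) / len(negatives)

print(f"prevalence: {prevalence:.1%}")
print(f"mean D-dimer, PE+: {mean_pos:.2f} mg/L FEU; PE-: {mean_neg:.2f} mg/L FEU")
```

    With reports in structured form, statistics like the 27.6% prevalence and the group-wise D-dimer means reported above reduce to queries of this kind, which is what enables the near real-time analysis the authors describe.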

  17. The perspectives of Iranian physicians and patients towards patient decision aids: a qualitative study.

    PubMed

    Rashidian, Hamideh; Nedjat, Saharnaz; Majdzadeh, Reza; Gholami, Jaleh; Haghjou, Leila; Abdollahi, Bahar Sadeghi; Davatchi, Fereydoun; Rashidian, Arash

    2013-09-25

    Patient preference is one of the main components of clinical decision making, which has led to the development of patient decision aids. The goal of this study was to describe physicians' and patients' viewpoints on the barriers and limitations of using patient decision aids in Iran, their proposed solutions, and the benefits of using these tools. This qualitative study was conducted in 2011 in Iran by holding in-depth interviews with 14 physicians and 8 arthritis patients. Interviewees were selected through purposeful and maximum variation sampling. As an example, a patient decision aid on the treatment of knee arthritis was developed based on literature review and expert opinion, and was presented at the time of interview. Thematic analysis was conducted to analyze the data using the OpenCode software. The results were summarized into three categories and ten codes. The extracted categories were the perceived benefits of using the tools, as well as the patient-related and physician-related barriers to using decision aids. The following barriers to using patient decision aids were identified in this study: lack of patient and physician training in shared decision making, a lack of specialists per capita, low treatment tariffs, and the lack of an exact evaluation system for patient participation in decision making. No doubt these barriers demand the health authorities' special attention. Hence, despite patients' and physicians' inclination toward using patient decision aids, these problems have hindered the practical usage of these tools in Iran, a developing country.

  18. Software applications for flux balance analysis.

    PubMed

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user-interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform basic features of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide the user friendly interfaces in model handling. In terms of software architecture, FBA-SimVis and OptFlux have the flexible environments as they enable the plug-in/add-on feature to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services such as central model repository and assistance to collaborative efforts was observed among the web-based applications with the help of advanced web-technologies. Furthermore, most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications was made for the benefit of potential tool developers.
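
    At its core, every FBA tool surveyed above solves the same linear program: maximise an objective c·v subject to the steady-state constraint Sv = 0 and flux bounds. A minimal sketch using SciPy's linprog on a hypothetical three-reaction toy network (the network and bounds are illustrative assumptions, not taken from any of the tools reviewed):

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions v1..v3)
# v1: uptake -> A;  v2: A -> B;  v3: B -> biomass (export)
S = np.array([[1, -1,  0],
              [0,  1, -1]])
b = np.zeros(2)                            # steady state: S v = 0
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake flux capped at 10 units
c = [0, 0, -1]                             # maximise v3 => minimise -v3

res = linprog(c, A_eq=S, b_eq=b, bounds=bounds)
print(res.x)      # optimal flux distribution
print(-res.fun)   # maximal biomass flux; the uptake cap makes this 10.0
```

    All fluxes saturate at the uptake limit because the toy network is a single linear pathway; real genome-scale models have thousands of reactions, but the LP formulation is the same.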

  19. Managerial Coaching

    ERIC Educational Resources Information Center

    Bommelje, Rick

    2015-01-01

    This chapter explores how coaching equips managers and supervisors to be successful in the 21st-century workplace. Coaching has benefited these professionals by providing them with viable tools to enhance the leadership and managerial tools they already possess.

  20. Green Power Partner Resources

    EPA Pesticide Factsheets

    EPA Green Power Partners can access tools and resources to help promote their green power commitments. Partners use these tools to communicate the benefits of their green power use to their customers, stakeholders, and the general public.

  1. Health economics in public health.

    PubMed

    Ammerman, Alice S; Farrelly, Matthew A; Cavallo, David N; Ickes, Scott B; Hoerger, Thomas J

    2009-03-01

    Economic analysis is an important tool in deciding how to allocate scarce public health resources; however, there is currently a dearth of such analysis by public health researchers. Public health researchers and practitioners were surveyed to determine their current use of health economics and to identify barriers to use as well as potential strategies to decrease those barriers in order to allow them to more effectively incorporate economic analyses into their work. Data collected from five focus groups informed survey development. The survey included a demographic section and 14 multi-part questions. Participants were recruited in 2006 from three national public health organizations through e-mail; 294 academicians, practitioners, and community representatives answered the survey. Survey data were analyzed in 2007. Despite an expressed belief in the importance of health economics, more than half of the respondents reported very little or no current use of health economics in their work. Of those using health economics, cost-benefit and cost-effectiveness analysis and determination of public health costs were cited as the measures used most frequently. The most important barriers were lack of expertise, funding, time, tools, and data, as well as discomfort with economic theory. The resource deemed most important to using health economics was collaboration with economists or those with economic training. Respondents indicated a desire to learn more about health economics and tools for performing economic analysis. Given the importance of incorporating economic analysis into public health interventions, and the desire of survey respondents for more collaboration with health economists, opportunities for such collaborations should be increased.

  2. Cluster tool solution for fabrication and qualification of advanced photomasks

    NASA Astrophysics Data System (ADS)

    Schaetz, Thomas; Hartmann, Hans; Peter, Kai; Lalanne, Frederic P.; Maurin, Olivier; Baracchi, Emanuele; Miramond, Corinne; Brueck, Hans-Juergen; Scheuring, Gerd; Engel, Thomas; Eran, Yair; Sommer, Karl

    2000-07-01

    The reduction of wavelength in optical lithography, together with phase-shift technology and optical proximity correction (OPC), requires rapid, cost-effective qualification of photomasks. Knowledge about CD variation, loss of pattern fidelity (especially for OPC patterns) and mask defects, in terms of their impact at wafer level, is becoming a key issue for mask quality assessment. As part of the European Community supported ESPRIT project 'Q-CAP', a new cluster concept has been developed which allows the combination of hardware tools as well as software tools via network communication. It is designed to be open for any tool manufacturer and mask house. The bi-directional network access allows the exchange of all relevant mask data, including grayscale images, measurement results, lithography parameters, defect coordinates, layout data, process data, etc., and its storage to a SQL database. The system uses SEMI format descriptions as well as standard network hardware and software components for the client-server communication. Each tool is used mainly to perform its specific application without spending expensive time on optional analysis, but the availability of the database allows each component to share the full data set gathered by all components. Therefore, the cluster can be considered as one single virtual tool. The paper shows the advantages of the cluster approach, the benefits of the tools linked together already, and a vision of a mask house in the near future.

  3. Lot quality assurance sampling to monitor supplemental immunization activity quality: an essential tool for improving performance in polio endemic countries.

    PubMed

    Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W

    2014-11-01

    Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus. © Crown copyright 2014.
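
    The LQAS decision rule itself is simple: sample n children per lot and classify the lot as acceptable when no more than d are found unvaccinated; the operating characteristics follow from the binomial distribution. A short sketch (the n=60, d=3 design and the miss rates are illustrative assumptions, not parameters from the paper):

```python
from math import comb

def accept_probability(n, d, p):
    """P(lot accepted) = P(at most d missed children in a sample of n),
    under a binomial model with true miss rate p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

# Illustrative design: sample n=60 children, accept the lot if <=3 are missed.
n, d = 60, 3
for p in (0.05, 0.10, 0.20):
    print(f"true miss rate {p:.0%}: P(accept) = {accept_probability(n, d, p):.3f}")
```

    Plotting P(accept) against p gives the operating characteristic curve, which is how the interpretation thresholds mentioned above are chosen to separate high-coverage from low-coverage areas with statistical reliability.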

  4. CEOS visualization environment (COVE) tool for intercalibration of satellite instruments

    USGS Publications Warehouse

    Kessler, P.D.; Killough, B.D.; Gowda, S.; Williams, B.R.; Chander, G.; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of international and domestic space agencies and organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration planning efforts, whether those efforts require past, present, or future predictions. This paper provides a brief overview of the COVE tool, its validation, accuracies, and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.

  5. Prostate cancer diagnostics: Clinical challenges and the ongoing need for disruptive and effective diagnostic tools.

    PubMed

    Sharma, Shikha; Zapatero-Rodríguez, Julia; O'Kennedy, Richard

    The increased incidence and the significant health burden associated with carcinoma of the prostate have led to substantial changes in its diagnosis over the past century. Despite technological advancements, the management of prostate cancer has become progressively more complex and controversial for both early and late-stage disease. The limitations and potential harms associated with the use of prostate-specific antigen (PSA) as a diagnostic marker have stimulated significant investigation of numerous novel biomarkers that demonstrate varying capacities to detect prostate cancer and can decrease unnecessary biopsies. However, only a few of these markers have been approved for specific clinical settings while the others have not been adequately validated for use. This review systematically and critically assesses ongoing issues and emerging challenges in the current state of prostate cancer diagnostic tools and the need for disruptive next-generation tools, based on analysis of combinations of these biomarkers, to enhance predictive accuracy, which will benefit clinical diagnostics and patient welfare. Copyright © 2016. Published by Elsevier Inc.

  6. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  7. CEOS Visualization Environment (COVE) Tool for Intercalibration of Satellite Instruments

    NASA Technical Reports Server (NTRS)

    Kessler, Paul D.; Killough, Brian D.; Gowda, Sanjay; Williams, Brian R.; Chander, Gyanesh; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of space agencies and of international and domestic organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration efforts. This paper provides a brief overview of the COVE tool, its validation, accuracies and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.

  8. Benefits Analysis of Multi-Center Dynamic Weather Routes

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; McNally, David; Morando, Alexander; Clymer, Alexis; Lock, Jennifer; Petersen, Julien

    2014-01-01

    Dynamic weather routes are flight plan corrections that can provide airborne flights more than a user-specified number of minutes of flying-time savings, compared to their current flight plan. These routes are computed from the aircraft's current location to a flight plan fix downstream (within a predefined limit region), while avoiding forecasted convective weather regions. The Dynamic Weather Routes automation has been continuously running with live air traffic data for a field evaluation at the American Airlines Integrated Operations Center in Fort Worth, TX since July 31, 2012, where flights within the Fort Worth Air Route Traffic Control Center are evaluated for time savings. This paper extends the methodology to all Centers in the United States and presents a benefits analysis of the Dynamic Weather Routes automation as if it were implemented in multiple airspace Centers, individually and concurrently. The current computation of dynamic weather routes requires a limit rectangle so that a downstream capture fix can be selected, preventing very large route changes spanning several Centers. In this paper, first, a method of computing a limit polygon (as opposed to the rectangle used for Fort Worth Center) is described for each of the 20 Centers in the National Airspace System. The Future ATM Concepts Evaluation Tool, a nationwide simulation and analysis tool, is used for this purpose. After a comparison of results with the Center-based Dynamic Weather Routes automation in Fort Worth Center, results are presented for 11 Centers in the contiguous United States. These Centers are generally the most impacted by convective weather. A breakdown of individual Center and airline savings is presented, and the results indicate an overall average savings of about 10 minutes of flying time per flight.

  9. Improving the decision-making process for nonprescription drugs: a framework for benefit-risk assessment.

    PubMed

    Brass, E P; Lofstedt, R; Renn, O

    2011-12-01

    Nonprescription drugs pose unique challenges to regulators. The fact that the barriers to access are lower for nonprescription drugs as compared with prescription drugs may permit additional consumers to obtain effective drugs. However, the use of these drugs by consumers in the absence of supervision by a health-care professional may result in unacceptable rates of misuse and suboptimal clinical outcomes. A value-tree method is proposed that defines important benefit and risk domains relevant to nonprescription drugs. This value tree can be used to comprehensively identify product-specific attributes in each domain and can also support formal benefit-risk assessment using a variety of tools. This is illustrated here, using a modification of the International Risk Governance Council (IRGC) framework, a flexible tool previously applied in a number of fields, which systematizes an approach to issue review, early alignment of stakeholders, evaluation, and risk mitigation/management. The proposed approach has the potential to provide structured, transparent tools for regulatory decision making for nonprescription drugs.

  10. Mentorship programs for faculty development in academic general pediatric divisions.

    PubMed

    Takagishi, Jennifer; Dabrow, Sharon

    2011-01-01

    Introduction. Mentoring relationships have been shown to support academicians in areas of research, work/life balance, and promotion. Methods. General pediatric division chiefs accessed an electronic survey asking about mentorship relationships, their ability to create a mentorship program, and resources needed. Results. Dyadic mentorship programs were available at 53% of divisions. Peer mentorship programs were available at 27% of divisions. Overall, 84% of chiefs believed that dyadic mentorship would benefit their faculty. 91% of chiefs believed that peer mentorship would benefit their faculty. Chiefs were interested in starting peer (57%) or dyadic (55%) mentorship programs. Few divisions had a peer mentorship program available, whereas 24% already had a dyadic program. 43% of chiefs felt that they had the tools to start a program. Many tools are needed to create a program. Discussion. General pediatric division chiefs acknowledge the benefits of mentoring relationships, and some have programs in place. Many need tools to create them. Pediatric societies could facilitate this critical area of professional development.

  11. Immersion lithography defectivity analysis at DUV inspection wavelength

    NASA Astrophysics Data System (ADS)

    Golan, E.; Meshulach, D.; Raccah, N.; Yeo, J. Ho.; Dassa, O.; Brandl, S.; Schwarz, C.; Pierson, B.; Montgomery, W.

    2007-03-01

    Significant effort has been directed in recent years towards the realization of immersion lithography at 193nm wavelength. Immersion lithography is likely a key enabling technology for the production of critical layers for 45nm and 32nm design rule (DR) devices. In spite of the significant progress in immersion lithography technology, several key technology issues remain, a critical one being immersion lithography process-induced defects. The benefits in optical resolution and depth of focus made possible by immersion lithography are well understood. Yet, these benefits cannot come at the expense of increased defect counts and decreased production yield. Understanding the impact of the immersion lithography process parameters on wafer defect formation and defect counts, together with the ability to monitor, control and minimize the defect counts down to acceptable levels, is imperative for successful introduction of immersion lithography for production of advanced DRs. In this report, we present experimental results of immersion lithography defectivity analysis focused on topcoat layer thickness parameters and resist bake temperatures. Wafers were exposed on the 1150i-α immersion scanner and 1200B scanner (ASML); defect inspection was performed using a DUV inspection tool (UVision™, Applied Materials). Higher sensitivity was demonstrated at DUV through detection of small defects not detected at the visible wavelength, indicating the potential high-sensitivity benefits of DUV inspection for this layer. The analysis indicates that certain types of defects are associated with different immersion process parameters. This type of analysis at DUV wavelengths enables the optimization of immersion lithography processes, thus enabling the qualification of immersion processes for volume production.

  12. Development of a web-based toolkit to support improvement of care coordination in primary care.

    PubMed

    Ganz, David A; Barnard, Jenny M; Smith, Nina Z Y; Miake-Lye, Isomi M; Delevan, Deborah M; Simon, Alissa; Rose, Danielle E; Stockdale, Susan E; Chang, Evelyn T; Noël, Polly H; Finley, Erin P; Lee, Martin L; Zulman, Donna M; Cordasco, Kristina M; Rubenstein, Lisa V

    2018-05-23

    Promising practices for the coordination of chronic care exist, but how to select and share these practices to support quality improvement within a healthcare system is uncertain. This study describes an approach for selecting high-quality tools for an online care coordination toolkit to be used in Veterans Health Administration (VA) primary care practices. We evaluated tools in three steps: (1) an initial screening to identify tools relevant to care coordination in VA primary care, (2) a two-clinician expert review process assessing tool characteristics (e.g. frequency of problem addressed, linkage to patients' experience of care, effect on practice workflow, and sustainability with existing resources) and assigning each tool a summary rating, and (3) semi-structured interviews with VA patients and frontline clinicians and staff. Of 300 potentially relevant tools identified by searching online resources, 65, 38, and 18 remained after steps one, two and three, respectively. The 18 tools cover seven topics: managing referrals to specialty care, medication management, patient after-visit summary, patient activation materials, agenda setting, patient pre-visit packet, and provider contact information for patients. The final toolkit provides access to the 18 tools, as well as detailed information about tools' expected benefits, and resources required for tool implementation. Future care coordination efforts can benefit from systematically reviewing available tools to identify those that are high quality and relevant.

  13. The Benefits of College Marching Bands for Students and Universities: A Review of the Literature

    ERIC Educational Resources Information Center

    Cumberledge, Jason P.

    2017-01-01

    College marching bands are a large and visible part of American music education. Institutions of higher learning have benefited from the existence of marching bands, as they serve as a powerful recruitment tool and an essential public relations vehicle for music departments and universities. The benefit students may receive from marching band…

  14. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. 
This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.
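
    The core comparison that gives gap analysis its name, overlaying a species' habitat with areas under environmental protection, can be sketched as a small function. This is an illustrative simplification only: the actual GLGap platform operates on georeferenced stream-segment databases, and `gap_score` and its cell-ID inputs are hypothetical.

```python
def gap_score(habitat_cells, protected_cells):
    """Fraction of a species' habitat that falls outside protected areas
    (the 'gap' in gap analysis). Cell IDs stand in for map units such as
    stream segments or grid cells."""
    habitat = set(habitat_cells)
    if not habitat:
        return 0.0  # no mapped habitat, no measurable gap
    unprotected = habitat - set(protected_cells)
    return len(unprotected) / len(habitat)
```

    A score near 1.0 flags a species whose habitat is largely unprotected, a candidate for conservation planning attention.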

  15. Optimal water management and conflict resolution: The Middle East Water Project

    NASA Astrophysics Data System (ADS)

    Fisher, Franklin M.; Arlosoroff, Shaul; Eckstein, Zvi; Haddadin, Munther; Hamati, Salem G.; Huber-Lee, Annette; Jarrar, Ammar; Jayyousi, Anan; Shamir, Uri; Wesseling, Hans

    2002-11-01

    In many situations, actual water markets will not allocate water resources optimally, largely because of the perceived social value of water. It is possible, however, to build optimizing models which, taking account of demand as well as supply considerations, can substitute for actual markets. Such models can assist the formation of water policies, taking into account user-supplied values and constraints. They provide powerful tools for the system-wide cost-benefit analysis of infrastructure; this is illustrated by an analysis of the need for desalination in Israel and the cost and benefits of adding a conveyance line. Further, the use of such models can facilitate cooperation in water, yielding gains that can be considerably greater than the value of the disputed water itself. This can turn what appear to be zero-sum games into win-win situations. The Middle East Water Project has built such a model for the Israeli-Jordanian-Palestinian region. We find that the value of the water in dispute in the region is very small and the possible gains from cooperation are relatively large. Analysis of the scarcity value of water is a crucial feature.
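
    The notion of a scarcity value, the gain from one additional unit of supply, can be illustrated with a toy merit-order allocation. This is only a sketch of the economic idea: the project's actual model is a full optimization over demand curves, conveyance infrastructure, and user-supplied constraints, and the sector names and numbers below are hypothetical.

```python
def allocate(supply, demands):
    """Serve the highest-value uses first from a fixed supply.
    demands: list of (user, quantity, value_per_unit) tuples.
    Returns the allocation and the scarcity value, i.e. the per-unit
    value of the highest-value demand left unserved."""
    alloc = {}
    remaining = supply
    scarcity = 0.0
    for user, qty, value in sorted(demands, key=lambda d: -d[2]):
        take = min(qty, remaining)
        alloc[user] = take
        remaining -= take
        if take < qty and scarcity == 0.0:
            scarcity = value  # marginal value of one more unit of supply
    return alloc, scarcity
```

    When the scarcity value is small relative to the cost of new infrastructure or the stakes of a dispute, as the article argues for the region studied, cooperation is cheap and desalination or conveyance investments must be justified carefully.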

  16. Application of 57Fe Mössbauer spectroscopy as a tool for mining exploration of bornite (Cu5FeS4) copper ore

    NASA Astrophysics Data System (ADS)

    Gainov, R. R.; Vagizov, F. G.; Golovanevskiy, V. A.; Ksenofontov, V. A.; Klingelhöfer, G.; Klekovkina, V. V.; Shumilova, T. G.; Pen'kov, I. N.

    2014-04-01

    Nuclear resonance methods, including Mössbauer spectroscopy, are considered unique techniques suitable for remote, on-line mineralogical analysis. The employment of these methods offers potentially significant commercial benefits for the mining industry. As applied to copper sulfide ores, Mössbauer spectroscopy is well suited to such analysis. Bornite (formally Cu5FeS4) is a significant component of copper ore, and identification of its properties is important for the economic exploitation of commercial copper ore deposits. A series of natural bornite samples was studied by 57Fe Mössbauer spectroscopy. Two aspects were considered: reexamination of the 57Fe Mössbauer properties of natural bornite samples and their stability irrespective of origin, and the potential use of the miniaturized Mössbauer spectrometer MIMOS II for in-situ bornite identification. The results obtained show a number of potential benefits of introducing the available portable Mössbauer equipment into the mining industry for rapid mineralogical analysis. In addition, results of some preliminary 63,65Cu nuclear quadrupole resonance (NQR) studies of bornite are reported, and their merits relative to Mössbauer techniques for bornite detection are discussed.

  17. Development and initial evaluation of the Clinical Information Systems Success Model (CISSM).

    PubMed

    Garcia-Smith, Dianna; Effken, Judith A

    2013-06-01

    Most clinical information systems (CIS) today are technically sound, but the number of successful implementations of these systems is low. The purpose of this study was to develop and test a theoretically based integrated CIS Success Model (CISSM) from the nurse perspective. Model predictors of CIS success were taken from existing research on information systems acceptance, user satisfaction, use intention, user behavior and perceptions, as well as clinical research. Data collected online from 234 registered nurses in four hospitals were used to test the model. Each nurse had used the Cerner Power Chart Admission Health Profile for at least 3 months. Psychometric testing and factor analysis of the 23-item CISSM instrument established its construct validity and reliability. Initial analysis showed nurses' satisfaction with and dependency on CIS use predicted their perceived CIS use Net Benefit. Further analysis identified Social Influence and Facilitating Conditions as other predictors of CIS user Net Benefit. The level of hospital CIS integration may account for the role of CIS Use Dependency in the success of CIS. Based on our experience, CISSM provides a formative as well as summative tool for evaluating CIS success from the nurse's perspective. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Tool for analyzing the vulnerability of buildings to flooding: the case of Switzerland

    NASA Astrophysics Data System (ADS)

    Choffet, Marc; Bianchi, Renzo; Jaboyedoff, Michel; Kölz, Ehrfried; Lateltin, Olivier; Leroi, Eric; Mayis, Arnaud

    2010-05-01

    Whatever measures are used to protect property exposed to flooding, a residual risk remains; feedback from past floods shows this. This residual risk is linked, on the one hand, with the possibility that the protection measures may fail or may not work as intended and, on the other hand, with the possibility that a flood exceeds the chosen level of protection. In many European countries, governments and insurance companies are thinking in terms of vulnerability reduction. This publication presents a new tool for evaluating the vulnerability of buildings in a flooding context. The tool is being developed by the project "Analysis of the vulnerability of buildings to flooding", which is funded by the Foundation for Prevention of Cantonal insurances, Switzerland. It is composed of three modules and aims to provide a method for reducing the vulnerability of buildings to flooding. The first two modules identify and list all the elements composing the building. The third module is dedicated to choosing efficient risk-reducing measures on the basis of cost-benefit analyses. The diagnostic tool for different parts of the building is being developed to allow real estate appraisers, insurance companies, and homeowners to rapidly assess the vulnerability of buildings in flood-prone areas. The tool works with several databases built from the collection and analysis of data, information, standards, and feedback from risk management, hydrology, architecture, construction, materials engineering, insurance, and the economics of construction. A method for determining the local hazard is also proposed, to determine the height of potential floods threatening a building based on a back analysis of Swiss hazard maps. To calibrate the model, seven cantonal insurance institutions participate in the study by providing data such as the amount of damage in flooded areas. 
The poster will present results from the development of the tool, such as the amount of damage to buildings and the analysis possibilities the tool offers. Furthermore, analysis of data from the insurance companies revealed trends in the costs of flood damage. Graphics will be presented in the poster to illustrate the tool's design. It will be shown that the tool allows for a census of buildings and raises awareness of their vulnerability to flooding. An explanation of the database development concerning remediation costs and damage costs is also provided. Simple and innovative remedial measures could be shown in the poster. With the help of several examples, it is shown that the tool opens up interesting perspectives for the development of insurance strategies for building stocks in flood-prone areas.
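
    The cost-benefit logic behind the third module can be illustrated with an expected-annual-damage calculation. This is a hypothetical sketch of the standard approach; the project's actual databases and damage functions are far more detailed, and the figures below are invented.

```python
def expected_annual_damage(events):
    """Expected annual damage (EAD) from representative flood scenarios.
    events: list of (annual exceedance probability, damage cost) pairs."""
    return sum(p * d for p, d in events)

def benefit_cost_ratio(ead_before, ead_after, annual_cost):
    """Annual damage avoided by a protective measure per unit of its
    annualized cost; ratios above 1 favour adopting the measure."""
    return (ead_before - ead_after) / annual_cost
```

    For example, if a sealing measure cuts the EAD from 7000 to 2500 currency units for an annualized cost of 1500, its benefit-cost ratio is 3.0 and the measure passes the screen.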

  19. Panoramic ECG display versus conventional ECG: ischaemia detection by critical care nurses.

    PubMed

    Wilson, Nick; Hassani, Aimen; Gibson, Vanessa; Lightfoot, Timothy; Zizzo, Claudio

    2012-01-01

    To compare accuracy and certainty of diagnosis of cardiac ischaemia using the Panoramic ECG display tool plus conventional 12-lead electrocardiogram (ECG) versus 12-lead ECG alone by UK critical care nurses who were members of the British Association of Critical Care Nurses (BACCN). Critically ill patients are prone to myocardial ischaemia. Symptoms may be masked by sedation or analgesia, and ECG changes may be the only sign. Critical care nurses have an essential role in detecting ECG changes promptly. Despite this, critical care nurses may lack expertise in interpreting ECGs and myocardial ischaemia often goes undetected by critical care staff. British Association of Critical Care Nurses (BACCN) members were invited to complete an online survey to evaluate the analysis of two sets of eight ECGs displayed alone and with the new display device. Data from 82 participants showed diagnostic accuracy improved from 67·1% reading ECG traces alone, to 96·0% reading ECG plus Panoramic ECG display tool (P < 0·01, significance level α = 0·05). Participants' diagnostic certainty score rose from 41·7% reading ECG alone to 66·8% reading ECG plus Panoramic ECG display tool (P < 0·01, α = 0·05). The Panoramic ECG display tool improves both accuracy and certainty of detecting ST segment changes among critical care nurses, when compared to conventional 12-lead ECG alone. This benefit was greatest with early ischaemic changes. Critical care nurses who are least confident in reading conventional ECGs benefit the most from the new display. Critical care nurses have an essential role in the monitoring of critically ill patients. However, nurses do not always have the expertise to detect subtle ischaemic ECG changes promptly. Introduction of the Panoramic ECG display tool into clinical practice could lead to patients receiving treatment for myocardial ischaemia sooner with the potential for reduction in morbidity and mortality. © 2012 The Authors. 
Nursing in Critical Care © 2012 British Association of Critical Care Nurses.

  20. Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support ongoing advocacy efforts. The Power and Propulsion Office is committed to understanding how advancements in space technologies could benefit future NASA missions. It supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage; it will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool. Work is also being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model. I decided to learn and evaluate Palisade's @Risk and Risk Optimizer software and utilize their capabilities for the Electric Power System (EPS) model. I also looked at similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers and evaluated them. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions. We also had to define the simulation space and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.
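
    The Monte Carlo approach described, sampling technology characteristics from probability distributions to get a probabilistic system assessment, can be sketched in a few lines. The distributions, parameter names, and sizing relations below are illustrative assumptions, not actual NASA figures or the branch's real model.

```python
import random

def simulate_eps_mass(power_kw, n=10000, seed=42):
    """Monte Carlo estimate of electric power system mass (kg) for a
    required power level. Technology parameters are drawn from
    illustrative triangular distributions (low, high, mode)."""
    rng = random.Random(seed)
    masses = []
    for _ in range(n):
        array_w_per_kg = rng.triangular(60, 120, 80)   # solar array specific power
        batt_wh_per_kg = rng.triangular(80, 200, 120)  # battery specific energy
        eclipse_hours = 0.6                            # energy storage sizing case
        array_mass = power_kw * 1000 / array_w_per_kg
        batt_mass = power_kw * 1000 * eclipse_hours / batt_wh_per_kg
        masses.append(array_mass + batt_mass)
    masses.sort()
    return sum(masses) / n, masses[int(0.95 * n)]  # mean and 95th percentile
```

    Re-running the simulation with shifted distributions (say, a higher mode for battery specific energy) quantifies the mass benefit of maturing that technology, which is the kind of investment-strategy question the tool targets.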

  1. Staff perceptions of using outcome measures in stroke rehabilitation.

    PubMed

    Burton, Louisa-Jane; Tyson, Sarah; McGovern, Alison

    2013-05-01

    The use of standardised outcome measures is an integral part of stroke rehabilitation and is widely recommended as good practice. However, little is known about how measures are actually used or their impact. This study aimed to identify current clinical practice: how healthcare professionals working in stroke rehabilitation use outcome measures, and their perceptions of the benefits of and barriers to their use. Eighty-four healthcare professionals and 12 service managers and commissioners working in stroke services across a large UK county were surveyed by postal questionnaire. Ninety-six percent of clinical respondents used at least one measure; however, fewer than half used measures regularly during a patient's stay. The mean number of tools used was 3.2 (SD = 1.9). Eighty-one different tools were identified, 16 of which were unpublished and unvalidated. Perceived barriers to using outcome measures in day-to-day clinical practice included lack of resources (time and training) and lack of knowledge of appropriate measures. Benefits identified were demonstrating the effectiveness of rehabilitation interventions and monitoring patients' progress. Although the use of outcome measures is prevalent in clinical practice, there is little consistency in the tools utilised. The term "outcome measures" is used, but staff rarely used the measures at appropriate time points to formally assess and evaluate outcome. The term "measurement tool" more accurately reflects the purposes to which they were put and their potential benefits. Further research is needed to overcome the barriers to using standardised measurement tools and to evaluate the impact of implementation on clinical practice. • Health professionals working in stroke rehabilitation should work together to agree when and how outcome measures can be most effectively used in their service. 
• Efforts should be made to ensure that standardised tools are used to measure outcome at set time-points during rehabilitation, in order to achieve the anticipated benefits. • Communication between service providers and commissioners could be improved to highlight the barriers in using standardised measures of outcome.

  2. EERE's State & Local Energy Data Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shambarger, Erick; DeCesaro, Jennifer

    2014-06-23

    EERE's State and Local Energy Data (SLED) Tool provides basic energy market information that can help state and local governments plan and implement clean energy projects, including electricity generation; fuel sources and costs; applicable policies, regulations, and financial incentives; and renewable energy resource potential. Watch this video to learn more about the tool and hear testimonials from real users about the benefits of using this tool.

  3. EERE's State & Local Energy Data Tool

    ScienceCinema

    Shambarger, Erick; DeCesaro, Jennifer

    2018-05-30

    EERE's State and Local Energy Data (SLED) Tool provides basic energy market information that can help state and local governments plan and implement clean energy projects, including electricity generation; fuel sources and costs; applicable policies, regulations, and financial incentives; and renewable energy resource potential. Watch this video to learn more about the tool and hear testimonials from real users about the benefits of using this tool.

  4. Recent Advances and Future Challenges in Modified Mycotoxin Analysis: Why HRMS Has Become a Key Instrument in Food Contaminant Research

    PubMed Central

    Righetti, Laura; Paglia, Giuseppe; Galaverna, Gianni; Dall’Asta, Chiara

    2016-01-01

    Mycotoxins are secondary metabolites produced by pathogenic fungi in crops worldwide. These compounds can undergo modification in plants, leading to the formation of a large number of possible modified forms, whose toxicological relevance and occurrence in food and feed are still largely unexplored. The analysis of modified mycotoxins by liquid chromatography–mass spectrometry remains a challenge because of their chemical diversity, the large number of isomeric forms, and the lack of analytical standards. Here, the potential benefits of high-resolution and ion mobility mass spectrometry as tools for the separation and structure confirmation of modified mycotoxins are reviewed. PMID:27918432

  5. Chipster: user-friendly analysis software for microarray and other high-throughput data.

    PubMed

    Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I

    2011-10-14

    The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  6. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  7. Monitoring an Online Course with the GISMO Tool: A Case Study

    ERIC Educational Resources Information Center

    Mazza, Riccardo; Botturi, Luca

    2007-01-01

    This article presents GISMO, a novel, open source, graphic student-tracking tool integrated into Moodle. GISMO represents a further step in information visualization applied to education, and also a novelty in the field of learning management systems applications. The visualizations of the tool, its uses and the benefits it can bring are…

  8. WMOST: A tool for assessing cost-benefits of watershed management decisions affecting coastal resilience

    EPA Science Inventory

    The Watershed Management Optimization Support Tool (WMOST v.1) was released by the US Environmental Protection Agency in December 2013 (http://www2.epa.gov/exposure-assessment-models/wmost-10-download-page). The objective of WMOST is to serve as a public-domain screening tool th...

  9. EBOOK.EXE: A Desktop Authoring Tool for HURAA.

    ERIC Educational Resources Information Center

    Hu, Xiangen; Mathews, Eric; Graesser, Arthur C.; Susarla, Suresh

    The development of authoring tools for intelligent systems is an important step for creating, maintaining, and structuring content in a quick and easy manner. It has the benefit of allowing for a rapid change to new domains or topics for tutoring. The development of such tools requires functional control, access protection, ease of learning, and…

  10. A consumer guide: tools to manage vegetation and fuels.

    Treesearch

    David L. Peterson; Louisa Evers; Rebecca A. Gravenmier; Ellen Eberhardt

    2007-01-01

    Current efforts to improve the scientific basis for fire management on public lands will benefit from more efficient transfer of technical information and tools that support planning, implementation, and effectiveness of vegetation and hazardous fuel treatments. The technical scope, complexity, and relevant spatial scale of analytical and decision support tools differ...

  11. Weighing the Anti-Ischemic Benefits and Bleeding Risks from Aspirin Therapy: a Rational Approach.

    PubMed

    Dugani, Sagar; Ames, Jeffrey M; Manson, JoAnn E; Mora, Samia

    2018-02-21

    The role of aspirin in secondary cardiovascular prevention is well understood; however, the role in primary prevention is less clear, and requires careful balancing of potential benefits with risks. Here, we summarize the evidence base on the benefits and risks of aspirin therapy, discuss clinical practice guidelines and decision support tools to assist in initiating aspirin therapy, and highlight ongoing trials that may clarify the role of aspirin in cardiovascular disease prevention. In 2016, the USPSTF released guidelines on the use of aspirin for primary prevention. Based on 11 trials (n = 118,445), aspirin significantly reduced all-cause mortality and nonfatal myocardial infarction, and in 7 trials that evaluated aspirin ≤ 100 mg/day, there was significant reduction in nonfatal stroke. The USPSTF recommends individualized use of aspirin based on factors including age, 10-year atherosclerotic cardiovascular disease risk score, and bleeding risk. Several ongoing trials are evaluating the role of aspirin in primary prevention, secondary prevention, and in combination therapy for atrial fibrillation. Evidence-based approaches to aspirin use should consider the anti-ischemic benefits and bleeding risks from aspirin. In this era of precision medicine, tools that provide the personalized benefit to risk assessment, such as the freely available clinical decision support tool (Aspirin-Guide), can be easily incorporated into the electronic health record and facilitate more informed decisions about initiating aspirin therapy for primary prevention. Aspirin has a complex matrix of benefits and risks, and its use in primary prevention requires individualized decision-making. Results from ongoing trials may guide healthcare providers in identifying appropriate candidates for aspirin therapy.

  12. Patient Understanding of the Risks and Benefits of Biologic Therapies in Inflammatory Bowel Disease: Insights from a Large-scale Analysis of Social Media Platforms.

    PubMed

    Martinez, Bibiana; Dailey, Francis; Almario, Christopher V; Keller, Michelle S; Desai, Mansee; Dupuy, Taylor; Mosadeghi, Sasan; Whitman, Cynthia; Lasch, Karen; Ursos, Lyann; Spiegel, Brennan M R

    2017-07-01

    Few studies have examined inflammatory bowel disease (IBD) patients' knowledge and understanding of biologic therapies outside traditional surveys. Here, we used social media data to examine IBD patients' understanding of the risks and benefits associated with biologic therapies and how this affects decision-making. We collected posts from Twitter and e-forum discussions from >3000 social media sites posted between June 27, 2012 and June 27, 2015. Guided by natural language processing, we identified posts with specific IBD keywords that discussed the risks and/or benefits of biologics. We then manually coded the resulting posts and performed qualitative analysis using ATLAS.ti software. A hierarchical coding structure was developed based on the keyword list and relevant themes were identified through manual coding. We examined 1598 IBD-related posts, of which 452 (28.3%) centered on the risks and/or benefits of biologics. There were 5 main themes: negative experiences and concerns with biologics (n = 247; 54.6%), decision-making surrounding biologic use (n = 169; 37.4%), positive experiences with biologics (n = 168; 37.2%), information seeking from peers (n = 125; 27.7%), and cost (n = 38; 8.4%). Posts describing negative experiences primarily commented on side effects from biologics, concerns about potential side effects and increased cancer risk, and pregnancy safety concerns. Posts on decision-making focused on nonbiologic treatment options, hesitation to initiate biologics, and concerns about changing or discontinuing regimens. Social media reveals a wide range of themes governing patients' experience and choice with IBD biologics. The complexity of navigating their risk-benefit profiles suggests merit in creating online tailored decision tools to support IBD patients' decision-making with biologic therapies.
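
    The keyword-guided filtering step described, identifying posts that mention IBD, a biologic, and a risk or benefit theme before manual coding, can be sketched as below. The keyword lists are hypothetical stand-ins, not the study's actual dictionaries, and the real pipeline used natural language processing across far larger corpora.

```python
# Hypothetical keyword dictionaries; the study's actual lists are not reproduced here.
IBD_TERMS = {"crohn", "colitis", "ibd"}
BIOLOGIC_TERMS = {"infliximab", "adalimumab", "vedolizumab", "biologic"}
RISK_BENEFIT_TERMS = {"side effect", "risk", "cancer", "remission", "helped"}

def flag_post(text):
    """Return True if a post mentions IBD, a biologic therapy, and a
    risk/benefit theme, marking it for manual qualitative coding."""
    t = text.lower()
    return (any(k in t for k in IBD_TERMS)
            and any(k in t for k in BIOLOGIC_TERMS)
            and any(k in t for k in RISK_BENEFIT_TERMS))
```

    Posts passing such a filter would then be coded by hand against a hierarchical theme structure, as the study did with ATLAS.ti.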

  13. COMETBOARDS Can Optimize the Performance of a Wave-Rotor-Topped Gas Turbine Engine

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.

    1997-01-01

    A wave rotor, which acts as a high-technology topping spool in gas turbine engines, can increase the effective pressure ratio as well as the turbine inlet temperature in such engines. The wave rotor topping, in other words, may significantly enhance engine performance by increasing shaft horsepower while reducing specific fuel consumption. This performance enhancement requires optimum selection of the wave rotor's adjustable parameters for speed, surge margin, and temperature constraints specified on different engine components. To examine the benefit of the wave rotor concept in engine design, researchers soft-coupled NASA Lewis Research Center's multidisciplinary optimization tool COMETBOARDS and the NASA Engine Performance Program (NEPP) analyzer. The COMETBOARDS-NEPP combined design tool has been successfully used to optimize wave-rotor-topped engines. For illustration, the design of a subsonic gas turbine wave-rotor-enhanced engine with four ports for 47 mission points (which are specified by Mach number, altitude, and power-setting combinations) is considered. The engine performance analysis, constraints, and objective formulations were carried out through NEPP, and COMETBOARDS was used for the design optimization. So that the benefits that accrue from wave rotor enhancement could be examined, most baseline variables and constraints were declared to be passive, whereas important parameters directly associated with the wave rotor were considered to be active for the design optimization. The engine thrust was considered as the merit function. The wave rotor engine design, which became a sequence of 47 optimization subproblems, was solved successfully by using a cascade strategy available in COMETBOARDS. The graph depicts the optimum COMETBOARDS solutions for the 47 mission points, which were normalized with respect to standard results. 
As shown, the combined tool produced higher thrust for all mission points than did the other solution, with maximum benefits around mission points 11, 25, and 31. Such improvements can become critical, especially when engines are sized for these specific mission points.

  14. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among other advantages. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, so each has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with substantial user communities: the Konstanz Information Miner, an engine we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. 
Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
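
    The Common Tool Descriptor idea described above, a platform-free structured record of a command-line tool's parameters, inputs, and outputs, can be illustrated with a minimal sketch; the element and attribute names below are simplified assumptions rather than the exact CTD schema:

```python
# Sketch of a CTD-style descriptor for a command-line tool.
# Element and attribute names are illustrative assumptions, not the real CTD schema.
import xml.etree.ElementTree as ET

ctd_xml = """
<tool name="PeakPicker" version="1.0">
  <description>Detects peaks in mass-spectrometry data</description>
  <parameters>
    <param name="in"  type="input-file"  description="Input data file"/>
    <param name="out" type="output-file" description="Picked-peak output"/>
    <param name="signal_to_noise" type="double" value="1.0"
           description="Minimal signal-to-noise ratio"/>
  </parameters>
</tool>
"""

def tool_summary(xml_text):
    """Parse a descriptor and return (tool name, {parameter name: type})."""
    root = ET.fromstring(xml_text)
    params = {p.get("name"): p.get("type") for p in root.find("parameters")}
    return root.get("name"), params

name, params = tool_summary(ctd_xml)
print(name, params)
```

    Engines that agree on such a shared descriptor can each generate their native tool wrappers from it, which is the basis of the workflow interoperability the abstract describes.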

  15. Managing bundled payments.

    PubMed

    Draper, Andrew

    2011-04-01

Results of Medicare's ACE demonstration project and Geisinger Health System's ProvenCare initiative provide insight into the challenges hospitals will face as bundled payment proliferates. An early analysis of these results suggests that hospitals would benefit from using clinical IT tools to fully automate their efforts to meet these challenges. Other important factors contributing to success include board and physician leadership, organizational structure, pricing methodology for bidding, evidence-based medical practice guidelines, supply cost management, process efficiency management, proactive and aggressive case management, business development and marketing strategy, and the financial management system.

  16. Information systems analysis approach in hospitals: a national survey.

    PubMed

    Wong, B K; Sellaro, C L; Monaco, J A

    1995-03-01

    A survey of 216 hospitals reveals that some hospitals do not conduct cost-benefit analyses or analyze possible adverse effects in feasibility studies. In determining and analyzing system requirements, external factors that initiate the transaction are not examined, and computer-aided software engineering (CASE) tools are seldom used. Some hospitals do not investigate the advantages and disadvantages of using in-house-developed software versus purchased software packages in the evaluation of alternatives. The survey finds that, overall, most hospitals follow the traditional systems development life cycle (SDLC) approach in analyzing information systems.
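
    The cost-benefit analysis the survey finds missing from many feasibility studies typically compares discounted benefits against discounted costs; a minimal sketch, with all cash flows and the discount rate hypothetical:

```python
# Minimal discounted cost-benefit sketch for a feasibility study.
# All cash flows and the discount rate below are hypothetical.

def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

costs    = [500_000, 50_000, 50_000, 50_000, 50_000]   # upfront build + maintenance
benefits = [0, 200_000, 220_000, 240_000, 260_000]     # ramping operational savings

rate = 0.05
bcr = npv(benefits, rate) / npv(costs, rate)  # benefit-cost ratio
print(round(bcr, 2))  # a ratio above 1 means discounted benefits exceed costs
```

    The same calculation, run once per alternative (in-house development versus a purchased package, for instance), supports the evaluation-of-alternatives step the survey examines.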

  17. Valuing investments in sustainable land management using an integrated modelling framework to support a watershed conservation scheme in the Upper Tana River, Kenya

    NASA Astrophysics Data System (ADS)

    Hunink, Johannes E.; Bryant, Benjamin P.; Vogl, Adrian; Droogers, Peter

    2015-04-01

    We analyse the multiple impacts of investments in sustainable land use practices on ecosystem services in the Upper Tana basin (Kenya) to support a watershed conservation scheme (a "water fund"). We apply an integrated modelling framework, building on previous field-based and modelling studies in the basin, and link biophysical outputs to economic benefits for the main actors in the basin. The first step in the modelling workflow is the use of a high-resolution spatial prioritization tool (Resource Investment Optimization System, RIOS) to allocate the type and location of conservation investments in the different subbasins, subject to budget constraints and stakeholder concerns. We then run the Soil and Water Assessment Tool (SWAT) using the RIOS-identified investment scenarios to produce spatially explicit scenarios that simulate changes in water yield and suspended sediment. Finally, in close collaboration with downstream water users (urban water supply and hydropower), we link those biophysical outputs to monetary metrics, including reduced water treatment costs, increased hydropower production, and crop yield benefits for upstream farmers in the conservation area. We explore how different budgets and different spatial targeting scenarios influence the return of the investments and the effectiveness of the water fund scheme. This study is novel in that it presents an integrated analysis targeting interventions in a decision context that takes into account local environmental and socio-economic conditions, and then relies on detailed, process-based, biophysical models to demonstrate the economic return on those investments. We conclude that the approach allows for an analysis on different spatial and temporal scales, providing conclusive evidence to stakeholders and decision makers on the contribution and benefits of the land-based investments in this basin.
This is serving as foundational work to support the implementation of the Upper Tana-Nairobi Water Fund, a public-private partnership to safeguard ecosystem service provision and food security.
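
    The final step of the workflow described above, linking biophysical model outputs to monetary metrics for downstream users, can be sketched as follows; every coefficient below is a hypothetical placeholder, not a value from the study:

```python
# Hypothetical valuation step linking model outputs to monetary benefits.
# All coefficients are illustrative placeholders, not values from the study.

def annual_benefits(sediment_reduction_t, extra_water_yield_m3,
                    treatment_cost_per_t=2.5,  # USD avoided per tonne of sediment
                    kwh_per_m3=0.35,           # extra generation per m3 of yield
                    price_per_kwh=0.08):       # USD per kWh
    """Return a breakdown of annual monetary benefits (USD)."""
    avoided_treatment = sediment_reduction_t * treatment_cost_per_t
    hydropower = extra_water_yield_m3 * kwh_per_m3 * price_per_kwh
    return {"avoided_treatment": avoided_treatment,
            "hydropower": hydropower,
            "total": avoided_treatment + hydropower}

print(annual_benefits(10_000, 1_000_000))
```

    Running such a valuation for each RIOS budget scenario is what allows the returns of alternative spatial targeting strategies to be compared in monetary terms.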

  18. On a learning curve for shared decision making: Interviews with clinicians using the knee osteoarthritis Option Grid.

    PubMed

    Elwyn, Glyn; Rasmussen, Julie; Kinsey, Katharine; Firth, Jill; Marrin, Katy; Edwards, Adrian; Wood, Fiona

    2018-02-01

    Tools used in clinical encounters to illustrate to patients the risks and benefits of treatment options have been shown to increase shared decision making. However, we do not have good information about how these tools are viewed by clinicians and how clinicians think patients would react to their use. Our aim was to examine clinicians' views about the possible and actual use of tools designed to support patients and clinicians to collaborate and deliberate about treatment options, namely, Option Grid decision aids. We conducted a thematic analysis of qualitative interviews embedded in the intervention phase of a trial of an Option Grid decision aid for osteoarthritis of the knee. Interviews were conducted with 6 participating clinicians before they used the tool and again after clinicians had used the tool with 6 patients. In the first interview, clinicians voiced concerns that the tool would lead to an increase in encounter duration, patient resistance regarding involvement in decision making, and potential information overload. At the second interview, after minimal training, the clinicians reported that the tool had changed their usual way of communicating, and it was generally acceptable and helpful to integrate it into practice. After experiencing the use of Option Grids, clinicians became more willing to use the tools in their clinical encounters with patients. How best to introduce Option Grids to clinicians and adopt their use into practice will need careful consideration of context, workflow, and clinical pathways. © 2016 John Wiley & Sons, Ltd.

  19. J-Earth: An Essential Resource for Terrestrial Remote Sensing and Data Analysis

    NASA Astrophysics Data System (ADS)

    Dunn, S.; Rupp, J.; Cheeseman, S.; Christensen, P. R.; Prashad, L. C.; Dickenshied, S.; Anwar, S.; Noss, D.; Murray, K.

    2011-12-01

    There is a need for a software tool that can display and analyze various types of earth science and social data through a simple, user-friendly interface. The J-Earth software tool has been designed to be easily accessible for download and intuitive to use, regardless of the technical background of the user base. The tool requires no courses or textbooks to learn, yet is powerful enough to allow a more general community of users to perform complex data analysis. Professionals who will benefit from this tool range from geologists, geographers, and climatologists to sociologists, economists, and ecologists, as well as policy makers. J-Earth was developed by the Arizona State University Mars Space Flight Facility as part of the JMARS (Java Mission-planning and Analysis for Remote Sensing) suite of open-source tools. The program is a Geographic Information Systems (GIS) application used for viewing and processing satellite and airborne remote sensing data. While the functionality of JMARS has historically focused on the research needs of the planetary science community, J-Earth has been designed for a much broader Earth-based user audience. NASA instrument products accessible within J-Earth include data from ASTER, GOES, Landsat, MODIS, and TIMS. While J-Earth contains exceptionally comprehensive and high-resolution satellite-derived data and imagery, this tool also includes many socioeconomic data products from projects led by international organizations and universities. Datasets used in J-Earth take the form of grids, rasters, remote sensor "stamps", maps, and shapefiles. Global datasets in high demand that are available within J-Earth include five levels of administrative/political boundaries, climate data for current conditions as well as models for future climates, population counts and densities, land cover/land use, and poverty indicators.
While this application shares the powerful functionality of JMARS, J-Earth's appearance is enhanced for much easier data analysis. J-Earth utilizes a layering system to view data from different sources, which can then be exported, scaled, colored, and superimposed for quick comparisons. Users may now perform spatial analysis over several diverse datasets with respect to a defined geographic area or the entire globe. In addition, several newly acquired global datasets contain a temporal dimension which, when accessed through J-Earth, makes this a unique and powerful tool for spatial analysis over time. The functionality and ease of use set J-Earth apart from other terrestrial GIS software packages and enable endless social, political, and scientific possibilities.

  20. Intelligent transportation systems benefits : 1999 update

    DOT National Transportation Integrated Search

    1999-05-28

    The purpose of this report is to provide the Joint Program Office (JPO) with a tool to transmit existing knowledge of ITS benefits to the transportation profession. Also, this report is intended to provide the research community with information abou...

  1. Decision Support Tool for Nighttime Construction and Air Quality - User’s Guide

    DOT National Transportation Integrated Search

    2017-11-01

    The Texas Department of Transportation (TxDOT) Research Project 0-6864 Investigate the Air Quality Benefits of Nighttime Construction in Non-attainment Counties investigated the potential air quality benefits of shifting construction/maintenance acti...

  2. Measurement properties of self-report physical activity assessment tools in stroke: a protocol for a systematic review

    PubMed Central

    Martins, Júlia Caetano; Aguiar, Larissa Tavares; Nadeau, Sylvie; Scianni, Aline Alvim; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais

    2017-01-01

    Introduction Self-report physical activity assessment tools are commonly used for the evaluation of physical activity levels in individuals with stroke. A great variety of these tools have been developed and widely used in recent years, which justifies the need to examine their measurement properties and clinical utility. Therefore, the main objectives of this systematic review are to examine the measurement properties and clinical utility of self-report measures of physical activity and discuss the strengths and limitations of the identified tools. Methods and analysis A systematic review of studies that investigated the measurement properties and/or clinical utility of self-report physical activity assessment tools in stroke will be conducted. Electronic searches will be performed in five databases: Medical Literature Analysis and Retrieval System Online (MEDLINE) (PubMed), Excerpta Medica Database (EMBASE), Physiotherapy Evidence Database (PEDro), Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS) and Scientific Electronic Library Online (SciELO), followed by hand searches of the reference lists of the included studies. Two independent reviewers will screen all retrieved titles, abstracts, and full texts, according to the inclusion criteria, and will also extract the data. A third reviewer will be consulted to resolve any disagreements. A descriptive summary of the included studies will cover the design and participants, as well as the characteristics, measurement properties, and clinical utility of the self-report tools. The methodological quality of the studies will be evaluated using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, and the clinical utility of the identified tools will be assessed considering predefined criteria. This systematic review will follow the Preferred Reporting Items for Systematic Review and Meta-Analyses (PRISMA) statement.
Discussion This systematic review will provide an extensive review of the measurement properties and clinical utility of self-report physical activity assessment tools used in individuals with stroke, which would benefit clinicians and researchers. Trial registration number PROSPERO CRD42016037146. PMID:28193848

  3. The Benefits and Complexities of Operating Geographic Information Systems (GIS) in a High Performance Computing (HPC) Environment

    NASA Astrophysics Data System (ADS)

    Shute, J.; Carriere, L.; Duffy, D.; Hoy, E.; Peters, J.; Shen, Y.; Kirschbaum, D.

    2017-12-01

    The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center is building and maintaining an Enterprise GIS capability for its stakeholders, including NASA scientists, industry partners, and the public. This platform is powered by three GIS subsystems operating in a highly-available, virtualized environment: 1) the Spatial Analytics Platform is the primary NCCS GIS and provides users discoverability of the vast DigitalGlobe/NGA raster assets within the NCCS environment; 2) the Disaster Mapping Platform provides mapping and analytics services to NASA's Disaster Response Group; and 3) the internal (Advanced Data Analytics Platform/ADAPT) enterprise GIS provides users with the full suite of Esri and open source GIS software applications and services. All systems benefit from NCCS's cutting-edge infrastructure, including an InfiniBand network for high-speed data transfers; a mixed/heterogeneous environment featuring seamless sharing of information between Linux and Windows subsystems; and in-depth system monitoring and warning systems. Due to its co-location with the NCCS Discover High Performance Computing (HPC) environment and the Advanced Data Analytics Platform (ADAPT), the GIS platform has direct access to several large NCCS datasets including DigitalGlobe/NGA, Landsat, MERRA, and MERRA2. Additionally, the NCCS ArcGIS Desktop Windows virtual machines utilize existing NetCDF and OPeNDAP assets for visualization, modelling, and analysis, thus eliminating the need for data duplication. With the advent of this platform, Earth scientists have full access to vast data repositories and the industry-leading tools required for successful management and analysis of these multi-petabyte, global datasets. The full system architecture and integration with scientific datasets will be presented.
Additionally, key applications and scientific analyses will be explained, including the NASA Global Landslide Catalog (GLC) Reporter crowdsourcing application, the NASA GLC Viewer discovery and analysis tool, the DigitalGlobe/NGA Data Discovery Tool, the NASA Disaster Response Group Mapping Platform (https://maps.disasters.nasa.gov), and support for NASA's Arctic - Boreal Vulnerability Experiment (ABoVE).

  4. Tamoxifen therapy benefit for patients with 70-gene signature high and low risk.

    PubMed

    van 't Veer, Laura J; Yau, Christina; Yu, Nancy Y; Benz, Christopher C; Nordenskjöld, Bo; Fornander, Tommy; Stål, Olle; Esserman, Laura J; Lindström, Linda Sofie

    2017-11-01

    Breast cancer molecular prognostic tools that predict recurrence risk have mainly been established on endocrine-treated patients and thus are not optimal for the evaluation of benefit from endocrine therapy. The Stockholm tamoxifen (STO-3) trial, which randomized postmenopausal node-negative patients to 2-year tamoxifen (followed by an optional randomization for an additional 3-year tamoxifen vs nil) versus no adjuvant treatment, provides a unique opportunity to evaluate long-term 20-year benefit of endocrine therapy within prognostic risk classes of the 70-gene prognosis signature that was developed on adjuvantly untreated patients. We assessed by Kaplan-Meier analysis 20-year breast cancer-specific survival (BCSS) and 10-year distant metastasis-free survival (DMFS) for 538 estrogen receptor (ER)-positive, STO-3 trial patients with retrospectively ascertained 70-gene prognosis classification. Multivariable analysis of long-term (20 years) BCSS by STO-3 trial arm in the 70-gene high-risk and low-risk subgroups was performed using Cox proportional hazard modeling, adjusting for classical patient and tumor characteristics. Tamoxifen-treated, 70-gene low- and high-risk patients had 20-year BCSS of 90 and 83%, as compared to 80 and 65% for untreated patients, respectively (log-rank p < 0.0001). Notably, there is equivalent tamoxifen benefit in both high (HR 0.42 (0.21-0.86), p = 0.018) and low (HR 0.46 (0.25-0.85), p = 0.013) 70-gene risk categories, even after adjusting for clinico-pathological factors for BCSS. Limited tamoxifen exposure as given in the STO-3 trial provides persistent benefit for 10-15 years after diagnosis in a time-varying analysis. 10-year DMFS was 93 and 85% for low- and high-risk tamoxifen-treated, versus 83 and 70% for low- and high-risk untreated patients, respectively (log-rank p < 0.0001).
Patients with ER-positive breast cancer, regardless of high or low 70-gene risk classification, receive significant survival benefit lasting over 10 years from adjuvant tamoxifen therapy, even when given for a relatively short duration.
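
    Kaplan-Meier survival estimates of the kind reported above follow the product-limit formula S(t) = prod(1 - d_i/n_i) over the event times; a minimal sketch on made-up (time, event) data, not the STO-3 trial data:

```python
# Minimal Kaplan-Meier (product-limit) estimator on synthetic data.
# The (time, event) pairs below are made up, not the STO-3 trial data.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time; event=1 means death, 0 censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = leaving = 0
        while i < len(order) and times[order[i]] == t:  # group tied times
            deaths += events[order[i]]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk  # product-limit update
            curve.append((t, surv))
        at_risk -= leaving  # both deaths and censored subjects leave the risk set
    return curve

times  = [2, 3, 3, 5, 8, 8, 12]
events = [1, 1, 0, 1, 1, 0, 0]
print(kaplan_meier(times, events))
```

    Comparing such curves between trial arms (with a log-rank test, as in the abstract) is what quantifies the survival benefit of treatment.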

  5. Design Considerations | Efficient Windows Collaborative

    Science.gov Websites

    Foundry Foundry New Construction Windows Window Selection Tool Selection Process Design Guidance Installation Replacement Windows Window Selection Tool Assessing Options Selection Process Design Guidance Installation Understanding Windows Benefits Design Considerations Measuring Performance Performance Standards

  6. Gas Fills | Efficient Windows Collaborative

    Science.gov Websites

  7. Understanding Windows | Efficient Windows Collaborative

    Science.gov Websites

  8. Books & Publications | Efficient Windows Collaborative

    Science.gov Websites

  9. Efficient Windows Collaborative | Home

    Science.gov Websites

  10. An automated testing tool for traffic signal controller functionalities.

    DOT National Transportation Integrated Search

    2010-03-01

    The purpose of this project was to develop an automated tool that facilitates testing of traffic controller functionality using controller interface device (CID) technology. Benefits of such automated testers to traffic engineers include reduced test...

  11. Resources | Efficient Windows Collaborative

    Science.gov Websites

  12. Provide Views | Efficient Windows Collaborative

    Science.gov Websites

  13. Links | Efficient Windows Collaborative

    Science.gov Websites

  14. Reducing Condensation | Efficient Windows Collaborative

    Science.gov Websites

  15. Reduced Fading | Efficient Windows Collaborative

    Science.gov Websites

  16. EWC Membership | Efficient Windows Collaborative

    Science.gov Websites

  17. Visible Transmittance | Efficient Windows Collaborative

    Science.gov Websites

  18. EWC Members | Efficient Windows Collaborative

    Science.gov Websites

  19. Financing & Incentives | Efficient Windows Collaborative

    Science.gov Websites

  20. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs used for finite element structural analysis, developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility programs that communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from a combination of the NICE command language with SPAR computational modules.

Top