Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but faulty resolution appears to be one of the major causes of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Wei; Reddy, T. A.; Gurian, Patrick
2007-01-31
A companion paper to Jiang and Reddy that presents a general and computationally efficient methodology for dynamic scheduling and optimal control of complex primary HVAC&R plants using a deterministic engineering optimization approach.
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. It is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
A Review of Citation Analysis Methodologies for Collection Management
ERIC Educational Resources Information Center
Hoffmann, Kristin; Doucette, Lise
2012-01-01
While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…
Application of Bayesian and cost benefit risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.
2016-03-01
Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined is a scaled parabolic fine variation in terms of over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code. Such an application, and in such detail, represents a new contribution. The results indicate that the uncertainty in the probability is the driving issue that determines the optimal decision with each methodology, and, depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers to examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
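The two approaches described in this abstract can be illustrated, under heavy simplification, with a short Python sketch: a Beta-Binomial update of the probability of a violation-free audit, followed by an expected-cost comparison of the two actions. All figures below (the prior, the audit record, and the monetary values) are invented for illustration and are not taken from the paper.

```python
# Beta(a, b) prior on p = P(violation-free audit); assumed prior and data.
a, b = 2, 2
successes, trials = 8, 10          # hypothetical record of past audits
a_post, b_post = a + successes, b + (trials - successes)
p = a_post / (a_post + b_post)     # posterior mean of p

# Hypothetical economics of the reservoir decision (illustrative only).
build_cost = 1.2e6                 # reservoir construction cost
fine_per_audit = 3.0e5             # fine when an audit finds over-pumping
n_audits = 10                      # audits over the planning horizon

cost_build = build_cost                                 # building removes the risk
cost_no_build = n_audits * (1 - p) * fine_per_audit     # expected fines otherwise
decision = "build" if cost_build < cost_no_build else "do not build"
```

Under these assumed numbers the expected fines fall short of the construction cost, so not building minimizes expected cost; a different prior or fine structure can flip the choice, which is exactly the sensitivity to probability uncertainty the note emphasizes.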
Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik
2009-12-01
The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. This article is an overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
Human/Automation Trade Methodology for the Moon, Mars and Beyond
NASA Technical Reports Server (NTRS)
Korsmeyer, David J.
2009-01-01
It is possible to create a consistent trade methodology that can characterize operations model alternatives for crewed exploration missions. For example, a trade-space organized around the objective of maximizing Crew Exploration Vehicle (CEV) independence would take as input a classification of the category of analysis to be conducted or decision to be made, and a commitment to a detailed point in a mission profile during which the analysis or decision is to be made. For example, does the decision have to do with crew activity planning, or life support? Is the mission phase trans-Earth injection, cruise, or lunar descent? Different kinds of decision analysis of the trade-space between human and automated decisions will occur at different points in a mission's profile. The necessary objectives at a given point in time during a mission will call for different kinds of response with respect to where and how computers and automation are expected to help provide an accurate, safe, and timely response. In this paper, a consistent methodology for assessing the trades between human and automated decisions on-board will be presented and various examples discussed.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible uses of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela
2018-05-01
Health technology assessments (HTAs) are often difficult to conduct because the decision procedures of the HTA algorithm are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. In addition, the multi-criteria analysis is implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely, two biological prostheses for incisional infected hernias, is analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
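The AHP step at the heart of such an analysis can be sketched in a few lines of Python. The pairwise comparison matrix below is a made-up example for three hypothetical criteria and is not data from the prosthesis evaluation; the weights are derived with the standard geometric-mean approximation and checked with Saaty's consistency ratio.

```python
from math import prod

# Assumed 3x3 pairwise comparison matrix for three hypothetical criteria
# (e.g. cost, safety, durability); A[i][j] is how much more important
# criterion i is than criterion j on Saaty's 1-9 scale.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
n = len(A)

# Priority weights: geometric mean of each row, normalized to sum to 1.
g = [prod(row) ** (1.0 / n) for row in A]
w = [gi / sum(g) for gi in g]

# Approximate the principal eigenvalue lambda_max from A @ w.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n

CI = (lam - n) / (n - 1)   # consistency index
CR = CI / 0.58             # random index RI = 0.58 for n = 3
# CR < 0.1 is conventionally taken as acceptable consistency
```

With this matrix the first criterion dominates (roughly 0.64 of the weight) and the consistency ratio stays well under the conventional 0.1 threshold.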
A Mixed Methodological Analysis of the Role of Culture in the Clinical Decision-Making Process
ERIC Educational Resources Information Center
Hays, Danica G.; Prosek, Elizabeth A.; McLeod, Amy L.
2010-01-01
Even though literature indicates that particular cultural groups receive more severe diagnoses at disproportionate rates, there has been minimal research that addresses how culture interfaces specifically with clinical decision making. This mixed methodological study of 41 counselors indicated that cultural characteristics of both counselors and…
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
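The decision-tree formulation described above can be illustrated with a minimal rollback (backward-induction) computation. The tree, probabilities, and outcome values below are invented for illustration, standing in for a hypothetical quarantine choice; they do not come from the Jupiter Orbiter analysis.

```python
# A node is a terminal value, a chance node ("C", [(prob, child), ...])
# resolved by expected value, or a decision node ("D", [(label, child), ...])
# resolved by choosing the child with the highest expected value.

def rollback(node):
    if isinstance(node, (int, float)):
        return node
    kind, branches = node
    if kind == "C":
        return sum(p * rollback(child) for p, child in branches)
    return max(rollback(child) for _, child in branches)

# Hypothetical quarantine decision: sterilize the probe (certain cost,
# very low contamination risk) or skip sterilization (no cost, higher risk).
tree = ("D", [
    ("sterilize",        ("C", [(0.999, -10.0), (0.001, -1000.0)])),
    ("no sterilization", ("C", [(0.95,    0.0), (0.05,  -1000.0)])),
])
best_value = rollback(tree)   # expected value of the optimal strategy
```

Under these assumed numbers, sterilization has the higher expected value (-10.99 versus -50.0), so the rollback selects it; this is the kind of sequential optimization the proposed methodology performs over full mission decision trees.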
Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results
NASA Technical Reports Server (NTRS)
Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason
2001-01-01
This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographical information systems (GIS), making it possible to represent these value systems through criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of environmental, economic and socio-cultural aspects of wind energy projects and, thus, a sustainable-development vision of wind projects. This vision is analyzed using the 16 sustainable development principles included in Quebec's Sustainable Development Act. Four specific objectives were defined to ensure a logical progression of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology on a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective allowed us to obtain a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of a co-constructed decision in which all stakeholders are the focus of the methodology. TIMED is composed of four modules: multi-criteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows for the analysis of different implementation scenarios of wind turbines in order to choose the best one based on a participatory and transparent decision-making process that takes into account stakeholders' concerns.
The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. This test allowed us to analyze the current situation in which wind projects are developed in Quebec. The concerns of some stakeholders regarding situations that are not considered in the current context were explored through a third objective, which allowed us to run simulations taking into account assumptions at the strategic level; examples of such strategic-level assumptions are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From these analyses, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes into account socio-cultural, environmental and economic variables; conducting reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; the applicability of the proposed methodology to renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy projects development, renewable energy, social participation, robustness concern, SWOT analysis.
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Decision support for redesigning wastewater treatment technologies.
McConville, Jennifer R; Künzle, Rahel; Messmer, Ulrike; Udert, Kai M; Larsen, Tove A
2014-10-21
This paper offers a methodology for structuring the design space for innovative process engineering technology development. The methodology is exemplified in the evaluation of a wide variety of treatment technologies for source-separated domestic wastewater within the scope of the Reinvent the Toilet Challenge. It offers a methodology for narrowing down the decision-making field based on a strict interpretation of treatment objectives for undiluted urine and dry feces and macroenvironmental factors (STEEPLED analysis) which influence decision criteria. Such an evaluation identifies promising paths for technology development, such as focusing on space-saving processes or the need for more innovation in low-cost, energy-efficient urine treatment methods. Critical macroenvironmental factors, such as housing density, transportation infrastructure, and climate conditions, were found to affect technology decisions regarding reactor volume, weight of outputs, energy consumption, atmospheric emissions, investment cost, and net revenue. The analysis also identified a number of qualitative factors that should be carefully weighed when pursuing technology development, such as availability of O&M resources, health and safety goals, and other ethical issues. Use of this methodology allows for coevolution of innovative technology within context constraints; however, for full-scale technology choices in the field, only very mature technologies can be evaluated.
Evaluation of stormwater harvesting sites using multi criteria decision methodology
NASA Astrophysics Data System (ADS)
Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.
2018-07-01
Selection of suitable urban stormwater harvesting sites and associated project planning are often complex due to spatial, temporal, economic, environmental and social factors, and various other related variables. This paper develops a comprehensive methodology framework for evaluating stormwater harvesting (SWH) sites in urban areas using Multi Criteria Decision Analysis (MCDA). In the first phase, the framework selects potential SWH sites using spatial characteristics in a GIS environment. In the second phase, an MCDA methodology is used for evaluating and ranking SWH sites in a multi-objective and multi-stakeholder environment. The paper briefly describes the first phase of the framework and focuses chiefly on the second phase. The application of the methodology is demonstrated on a case study comprising the local government area of the City of Melbourne (CoM), Australia, for the benefit of the wider community of water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites used to demonstrate the developed methodology. To reflect stakeholder interests, four stakeholder participant groups were identified, namely, water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis methodology broadly consisted of deriving PROMETHEE II rankings of the eight alternative SWH sites in the CoM case study under two distinct group decision-making scenarios. The major innovation of this work is the development and application of a comprehensive methodology framework that assists in the selection of potential SWH sites and facilitates their ranking in a multi-objective and multi-stakeholder environment.
It is expected that the proposed methodology will assist the water professionals and managers with better knowledge that will reduce the subjectivity in the selection and evaluation of SWH sites.
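The PROMETHEE II ranking step used in such an analysis can be sketched as follows. The score matrix, the weights, and the simple "usual" (0/1) preference function are illustrative assumptions, not values from the City of Melbourne case study.

```python
# X[i][k]: score of candidate site i on performance measure k
# (higher is better); w[k]: stakeholder weight of measure k. Assumed data.
X = [
    [7.0, 5.0, 8.0],
    [6.0, 7.0, 6.0],
    [8.0, 4.0, 5.0],
]
w = [0.5, 0.3, 0.2]
n = len(X)

# Aggregated preference index pi[i][j]: weighted share of measures on
# which site i strictly beats site j ("usual" preference function).
pi = [[sum(wk for wk, xi, xj in zip(w, X[i], X[j]) if xi > xj) if i != j else 0.0
       for j in range(n)] for i in range(n)]

phi_plus = [sum(pi[i]) / (n - 1) for i in range(n)]                        # leaving flow
phi_minus = [sum(pi[i][j] for i in range(n)) / (n - 1) for j in range(n)]  # entering flow
phi = [p - m for p, m in zip(phi_plus, phi_minus)]                         # net flow
ranking = sorted(range(n), key=lambda i: -phi[i])                          # best site first
```

The complete ranking follows by sorting sites on descending net flow; real applications typically use richer preference functions (linear, V-shape, Gaussian) with indifference and preference thresholds per criterion.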
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. It is arguably true, nonetheless, that decision-level fusion is equally beneficial when applied to integrated health monitoring systems. Several decisions at low levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. Thorough understanding of the characteristics of decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial-intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their application is extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods, and their performance is tested on decisions generated from synthetic data and from experimental data. Also in this paper, a modeling methodology, i.e., the cloud model, for generating synthetic decisions is presented and used. Using the cloud model, both types of uncertainty involved in real decision-making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide for a fair performance comparison of the selected decision-fusion algorithms.
For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported in this paper.
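One of the evidential methods the abstract names, Dempster-Shafer, has a compact core: Dempster's rule of combination. The following is a minimal illustrative sketch, not the authors' implementation; the two-state frame ({"healthy", "damaged"}) and the mass values are hypothetical.

```python
# Hedged sketch of Dempster's rule of combination for fusing two
# decision-level mass functions in a structural health monitoring setting.
# Frame of discernment and numbers are illustrative assumptions.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Normalize by 1 - K, redistributing the conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

H, D = frozenset(["healthy"]), frozenset(["damaged"])
theta = H | D  # full ignorance
m1 = {H: 0.6, D: 0.1, theta: 0.3}  # decision-maker 1
m2 = {H: 0.7, D: 0.2, theta: 0.1}  # decision-maker 2
fused = dempster_combine(m1, m2)
```

The fused masses again sum to one, and agreement between the two sources sharpens the belief in "healthy" beyond either source alone.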
A multicriteria decision making model for assessment and selection of an ERP in a logistics context
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Ferreira, Fernanda A.
2017-07-01
The aim of this work is to apply a decision-support methodology based on a multicriteria decision analysis (MCDA) model that allows the assessment and selection of an Enterprise Resource Planning (ERP) system in a Portuguese logistics company by a Group Decision Maker (GDM). A Decision Support System (DSS) implementing an MCDA model, the Multicriteria Methodology for the Assessment and Selection of Information Systems / Information Technologies (MMASSI/IT), is used, chosen for its features and the ease with which the model can be changed and adapted to a given scope. Using this DSS, the information system best suited to the decision context was obtained, and the result was evaluated through a sensitivity and robustness analysis.
Application of risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, Emmanouil; Palogos, Ioannis
2017-04-01
A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually consider short-term fines. Such an application, in such detail, is novel. The results indicate that probability uncertainty is the driving issue that determines the optimal decision under each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers (stakeholders) examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause within an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits. The tool is implemented as a web service for easier stakeholder access.
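The core of such a Bayesian decision analysis is an expected-cost comparison of the two actions under a prior belief about violations. The sketch below uses entirely hypothetical figures (the paper's actual costs and fines are not reproduced here) to show how the assumed violation probability drives the optimal choice.

```python
# Illustrative sketch, hypothetical numbers: expected-cost comparison of
# "build reservoir" vs "no reservoir, pay over-pumping fines", in the spirit
# of the Bayesian decision analysis described above.

def expected_cost(action, p_violation):
    """Expected cost = fixed project cost + P(violation) * fine."""
    fixed, fine = action
    return fixed + p_violation * fine

# (fixed cost, fine if the water-table limit is violated), in currency units
actions = {
    "build_reservoir": (5.0e6, 0.2e6),  # high capital, small residual fine
    "no_reservoir":    (0.5e6, 8.0e6),  # low capital, large expected fines
}

def best_action(p_violation):
    return min(actions, key=lambda a: expected_cost(actions[a], p_violation))

low_risk_choice = best_action(0.45)   # modest prior violation probability
high_risk_choice = best_action(0.80)  # pessimistic prior
```

With these (assumed) numbers the decision flips as the violation probability grows, which mirrors the abstract's finding that probability uncertainty determines the optimal decision.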
NASA Astrophysics Data System (ADS)
Moglia, Magnus; Sharma, Ashok K.; Maheepala, Shiroma
2012-07-01
Planning of regional and urban water resources, in particular with Integrated Urban Water Management approaches, often considers inter-relationships between human uses of water, the health of the natural environment, and the cost of various management strategies. Decision makers hence typically need to consider a combination of social, environmental and economic goals. The types of strategies employed can include water efficiency measures, water sensitive urban design, stormwater management, or catchment management. Therefore, decision makers need to choose between different scenarios and to evaluate them against a number of criteria. This type of problem has a discipline devoted to it, Multi-Criteria Decision Analysis, which has often been applied in water management contexts. This paper describes the application of Subjective Logic in a basic Bayesian Network to a Multi-Criteria Decision Analysis problem. In doing so, it outlines a novel methodology that explicitly incorporates uncertainty and information reliability. Application of the methodology to a known case study context allows for exploration. By making the uncertainty and reliability of assessments explicit, it allows the risks of various options to be assessed, which may help alleviate cognitive biases and move towards a well-formulated risk management policy.
Angelis, Aris; Kanavos, Panos
2016-05-01
In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
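The MAVT methods the authors build on typically reduce, in their simplest additive form, to a weighted sum of normalized partial values. The sketch below is a minimal illustration of that additive model only; the criteria, weights, and scores are hypothetical placeholders, not the paper's proposed framework.

```python
# Minimal additive multi-attribute value (MAVT) sketch for an HTA-style
# comparison. Criteria names, weights, and partial values are assumptions
# for illustration, not taken from the paper.

def mavt_value(scores, weights):
    """Overall value = sum over criteria of weight * partial value in [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

weights = {"efficacy": 0.4, "safety": 0.3, "unmet_need": 0.2, "cost": 0.1}
alternatives = {  # partial values already scaled to [0, 1]
    "drug_A": {"efficacy": 0.8, "safety": 0.6, "unmet_need": 0.9, "cost": 0.3},
    "drug_B": {"efficacy": 0.6, "safety": 0.9, "unmet_need": 0.4, "cost": 0.7},
}
ranking = sorted(alternatives,
                 key=lambda a: mavt_value(alternatives[a], weights),
                 reverse=True)
```

The weight elicitation and the scaling of partial values are exactly where the paper's "desired properties" of criteria and attributes matter; a violation (e.g., double-counted criteria) makes such a sum misleading.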
Medical Problem-Solving: A Critique of the Literature.
ERIC Educational Resources Information Center
McGuire, Christine H.
1985-01-01
Prescriptive, decision-analysis of medical problem-solving has been based on decision theory that involves calculation and manipulation of complex probability and utility values to arrive at optimal decisions that will maximize patient benefits. The studies offer a methodology for improving clinical judgment. (Author/MLW)
Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar
2017-03-01
In the context of the underground coal mining industry, increased economic pressures around implementing additional safety measure systems, along with growing public awareness of the need to ensure a high level of worker safety, have put great pressure on managers to find alternatives that are both safe and economically viable. A risk-based decision support system plays an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology is proposed for selecting an appropriate safety measure system for an underground coal mining operation with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory to model the vagueness and subjectivity in the estimates of fuzzy risk ratings. It is based on aggregative fuzzy risk analysis and multi-criteria decision making. Selection decisions are made within the context of understanding the total integrated risk likely to be incurred in adopting a particular safety system alternative. The effectiveness of the proposed methodology has been validated through a real-time case study, and the resulting final priority ranking appears fairly consistent.
NASA Astrophysics Data System (ADS)
McKinney, D. C.; Cuellar, A. D.
2015-12-01
Climate change has accelerated glacial retreat in high altitude glaciated regions of Nepal, leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOF) are sudden events triggered by an earthquake, moraine failure or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris rapidly flooding downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, the probability of an event, or the costs of mitigation projects, in part because this information is unknown or uncertain. This work presents a demonstration of a decision making methodology developed to rationally analyze the risks posed by Imja Lake and the various adaptation projects proposed, using available information. In this work the authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding, and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project, given the cost of the project and the number of lives saved, to determine which project is the most efficient. In contrast, the decision analysis methodology requires fatalities to be assigned a cost but allows the inclusion of uncertainty in the decision making process.
We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters including project cost, value of a statistical life, and time to a GLOF event.
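In its simplest reading, the implied-value-of-a-statistical-life comparison described above is a cost-per-life-saved ratio across projects. The sketch below illustrates only that ratio logic; the project names, costs, and fatality estimates are invented for illustration and are not the study's data.

```python
# Hedged sketch, hypothetical figures: compare adaptation projects by the
# value of a statistical life each implies (project cost / lives saved),
# echoing the efficiency comparison described above.

projects = {  # name: (cost in USD, expected fatalities averted)
    "lake_lowering_3m":  (3.0e6, 120),
    "early_warning":     (0.8e6, 60),
    "lake_lowering_10m": (9.5e6, 150),
}

def implied_vsl(cost, lives_saved):
    """Cost per statistical life saved implied by funding this project."""
    return cost / lives_saved

# The most efficient project implies the lowest cost per life saved
most_efficient = min(projects, key=lambda p: implied_vsl(*projects[p]))
```

A full DEA treatment would solve a linear program per project; this ratio form is the degenerate single-input, single-output case.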
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. The framework unites the hierarchical character of decision making, the multiple decision makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, and both quantitative/empirical and qualitative/normative/judgmental aspects. The methodological components consist essentially of hierarchical-multiobjective coordination, the risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.
Gilabert-Perramon, Antoni; Torrent-Farnell, Josep; Catalan, Arancha; Prat, Alba; Fontanet, Manel; Puig-Peiró, Ruth; Merino-Montero, Sandra; Khoury, Hanane; Goetghebeur, Mireille M; Badia, Xavier
2017-01-01
The aim of this study was to adapt and assess the value of a Multi-Criteria Decision Analysis (MCDA) framework (EVIDEM) for the evaluation of orphan drugs in Catalonia (Catalan Health Service). The standard evaluation and decision-making procedures of CatSalut were compared with the EVIDEM methodology and contents. The EVIDEM framework was adapted to the Catalan context, focusing on the evaluation of orphan drugs (PASFTAC program), during a workshop with sixteen PASFTAC members. The criteria weighting was done using two different techniques (nonhierarchical and hierarchical). Reliability was assessed by re-test. The EVIDEM framework and methodology were found useful and feasible for orphan drug evaluation and decision making in Catalonia. All the criteria considered for the development of the CatSalut Technical Reports and decision making were considered in the framework. Nevertheless, the framework could improve the reporting of some of these criteria (i.e., "unmet needs" or "nonmedical costs"). Some contextual criteria were removed (i.e., "mandate and scope of healthcare system", "environmental impact") or adapted ("population priorities and access") for CatSalut purposes. Independently of the weighting technique considered, the most important evaluation criteria identified for orphan drugs were "disease severity", "unmet needs" and "comparative effectiveness", while the "size of the population" had the lowest relevance for decision making. Test-retest analysis showed weight consistency among techniques, supporting reliability over time. MCDA (the EVIDEM framework) could be a useful tool to complement the current evaluation methods of CatSalut, contributing to standardization and pragmatism, providing a method to tackle ethical dilemmas and facilitating discussions related to decision making.
Swift and Smart Decision Making: Heuristics that Work
ERIC Educational Resources Information Center
Hoy, Wayne K.; Tarter, C. J.
2010-01-01
Purpose: The aim of this paper is to examine the research literature on decision making and identify and develop a set of heuristics that work for school decision makers. Design/methodology/approach: This analysis is a synthesis of the research on decision-making heuristics that work. Findings: A set of nine rules for swift and smart decision…
DOT National Transportation Integrated Search
2016-09-01
This project applies a decision analytic methodology that takes considerations of extreme weather events to quantify and assess canopy investment options. The project collected data for two cases studies in two different transit agencies: Chicago Tra...
Cost-Utility Analysis: Current Methodological Issues and Future Perspectives
Nuijten, Mark J. C.; Dubois, Dominique J.
2011-01-01
The use of cost–effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from both a scientific and a policy point of view. There is a lack of consensus on the main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (Cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127
Strategic Technology Investment Analysis: An Integrated System Approach
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Weisbin, C. R.
2010-01-01
Complex technology investment decisions within NASA are increasingly difficult to make such that the end results are satisfying the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Comparison between two methodologies for urban drainage decision aid.
Moura, P M; Baptista, M B; Barraud, S
2006-01-01
The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology was developed in Brazil and is based on performance-cost analysis, the second one is ELECTRE III. Both methodologies were applied to a case study. Sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies have equivalent results, and present low sensitivity and high robustness. These results prove that the Brazilian methodology is consistent and can be used safely in order to select a good solution or a small set of good solutions that could be compared with more detailed methods afterwards.
Mass Conflagration: An Analysis and Adaptation of the Shipboard Damage Control Organization
1991-03-01
the span of control narrows, as each supervisor is able to better monitor the actions and environment of his subordinates. (6) Communication and... computed decision is reached by the decision makers, often based on a prior formal doctrine or methodology. [Ref. 4:p. 364] While no decision process
Assumptions Underlying Curriculum Decisions in Australia: An American Perspective.
ERIC Educational Resources Information Center
Willis, George
An analysis of the cultural and historical context in which curriculum decisions are made in Australia and a comparison with educational assumptions in the United States is the purpose of this paper. Methodology is based on personal teaching experience and observation in Australia. Seven factors are identified upon which curricular decisions in…
A multicriteria-based methodology for site prioritisation in sediment management.
Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos
2009-08-01
Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, a first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allow ranking of these management units according to their need for remediation. The proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. The methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
Development of risk-based decision methodology for facility design.
DOT National Transportation Integrated Search
2014-06-01
This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides : illustrative examples for the use of the proposed framework. An overview of the current practices and applications to : illustrate t...
Decision Making Analysis: Critical Factors-Based Methodology
2010-04-01
the pitfalls associated with current wargaming methods such as assuming a western view of rational values in decision-making regardless of the cultures...Utilization theory slightly expands the rational decision-making model as it states that "actors try to maximize their expected utility by weighing the...items to categorize the decision-making behavior of political leaders which tend to demonstrate either a rational or cognitive leaning. Leaders
Moore, Bethany; Bone, Eric A
2017-01-01
The concept of triage in healthcare has been around for centuries and continues to be applied today so that scarce resources are allocated according to need. A business impact analysis (BIA) is a form of triage in that it identifies which processes are most critical, which to address first and how to allocate limited resources. On its own, however, the BIA provides only a roadmap of the impacts and interdependencies of an event. When disaster strikes, organisational decision-makers often face difficult decisions with regard to allocating limited resources between multiple 'mission-critical' functions. Applying the concept of triage to business continuity provides those decision-makers navigating a rapidly evolving and unpredictable event with a path that protects the fundamental priorities of the organisation. A business triage methodology aids decision-makers in times of crisis by providing a simplified framework for decision-making based on objective, evidence-based criteria, which is universally accepted and understood. When disaster strikes, the survival of the organisation depends on critical decision-making and quick actions to stabilise the incident. This paper argues that organisations need to supplement BIA processes with a decision-making triage methodology that can be quickly applied during the chaos of an actual event.
Role of scientific data in health decisions.
Samuels, S W
1979-01-01
The distinction between reality and models or methodological assumptions is necessary for an understanding of the use of data--economic, technical or biological--in decision-making. The traditional modes of analysis used in decisions are discussed historically and analytically. Utilitarian-based concepts such as cost-benefit analysis and cannibalistic concepts such as "acceptable risk" are rejected on logical and moral grounds. Historical reality suggests the concept of socially necessary risk determined through the dialectic process in democracy. PMID:120251
An Analysis of Category Management of Service Contracts
2017-12-01
management teams a way to make informed, data-driven decisions. Data-driven decisions derived from clustering not only align with Category...savings. Furthermore, this methodology provides a data-driven visualization to inform sound business decisions on potential Category Management ...Category Management initiatives. The Maptitude software will allow future research to collect data and develop visualizations to inform Category
The Aeronautical Data Link: Decision Framework for Architecture Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2003-01-01
A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system level performance issues and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches will be discussed that maximize the effectiveness of architectural designs.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
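The loss-function step of such a methodology can be illustrated with the classic Taguchi quadratic loss, one of the function families the abstract names. The sketch below uses invented coefficients and a hypothetical temperature-deviation scenario, not the paper's models or data.

```python
# Hedged sketch: a Taguchi-style quadratic loss function maps a process
# deviation from its target to an economic loss, then losses across a
# scenario are integrated (here, simply summed). All numbers are
# illustrative assumptions.

def quadratic_loss(x, target, k):
    """Taguchi quadratic loss: L(x) = k * (x - target)^2."""
    return k * (x - target) ** 2

# Hypothetical scenario: reactor temperature drifting around a 350 K setpoint
target_temp = 350.0
k = 120.0  # assumed cost per squared kelvin of deviation
observed = [348.0, 350.0, 355.0]

losses = [quadratic_loss(t, target_temp, k) for t in observed]
total_loss = sum(losses)  # the "integration of losses" step
```

The paper's revised Taguchi and modified inverted normal functions refine this shape (e.g., bounding the loss), but the deviation-to-cost mapping is the common core.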
ERIC Educational Resources Information Center
Damboeck, Johanna
2012-01-01
Purpose: The aim of this article is to provide an analysis of the features that have shaped the state's decision-making process in the United Nations, with regard to the humanitarian intervention in Darfur from 2003 onwards. Design/methodology/approach: The methodological approach to the study is a review of political statement papers grounded in…
A methodology for stochastic analysis of share prices as Markov chains with finite states.
Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey
2014-01-01
Price volatilities make stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investor confidence in evaluating exchange markets without using time-series methodology, we specify equity price change as a stochastic process assumed to possess Markov dependency, with state transition probability matrices defined over the identified state space (i.e. decrease, stable or increase). We establish that the identified states communicate, and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We develop a methodology for determining the expected mean return time for stock price increases and also establish criteria for improving investment decisions based on the highest transition probabilities, lowest mean return time and highest limiting distributions. We further develop an R algorithm for running the methodology introduced. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
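The pipeline the abstract describes (estimate a three-state transition matrix, find the limiting distribution, derive mean return times) can be sketched compactly. The toy state sequence below is invented for illustration, not Ghana Stock Exchange data, and the authors' own algorithm is in R, not Python.

```python
# Hedged sketch of the Markov-chain methodology summarized above:
# 3 states (0=decrease, 1=stable, 2=increase), row-normalized transition
# counts, power-iteration limiting distribution, and mean return times.

STATES = ["decrease", "stable", "increase"]

def transition_matrix(seq, n_states=3):
    """Row-normalized counts of observed state-to-state transitions."""
    counts = [[0.0] * n_states for _ in range(n_states)]
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1.0
    return [[c / sum(row) for c in row] for row in counts]

def limiting_distribution(P, iters=500):
    """Power iteration: pi <- pi P converges for an ergodic chain."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Toy weekly price-change states (illustrative only)
seq = [0, 1, 2, 2, 1, 0, 2, 1, 2, 0, 1, 2, 2, 0, 1, 2, 1, 1, 2, 0]
P = transition_matrix(seq)
pi = limiting_distribution(P)
mean_return_times = [1.0 / p for p in pi]  # expected steps to revisit a state
```

The decision criteria in the abstract then read directly off these objects: prefer equities whose "increase" state has high transition probability into itself, low mean return time, and high limiting probability.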
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING
Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong
2017-01-01
Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
Decision-theoretic methodology for reliability and risk allocation in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
1985-01-01
This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues such as generic allocation and preference assessment are discussed.
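The noninferior (Pareto) solution set that the abstract centers on has a simple definition for two minimized objectives: keep every candidate that no other candidate beats in both objectives at once. The sketch below illustrates only that filter; the candidate (risk, cost) points are hypothetical, not drawn from the PRA model.

```python
# Hedged sketch of the noninferiority concept: filter candidate reliability
# allocations down to the Pareto set for two minimized objectives
# (risk, cost). Candidate points are illustrative assumptions.

def noninferior(points):
    """Return the noninferior (Pareto) subset for two minimized objectives."""
    keep = []
    for p in points:
        dominated = any(
            q != p
            and q[0] <= p[0] and q[1] <= p[1]      # q no worse in both
            and (q[0] < p[0] or q[1] < p[1])       # and strictly better in one
            for q in points
        )
        if not dominated:
            keep.append(p)
    return keep

# (risk, cost) of candidate allocations, both to be minimized
candidates = [(0.9, 1.0), (0.5, 2.0), (0.6, 1.8), (0.2, 5.0), (0.6, 3.0)]
pareto = noninferior(candidates)
```

Preference assessment, as the abstract notes, then happens only over this reduced set, which is what makes the trade-off between risk reduction and reliability cost tractable for the decision maker.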
The speed-accuracy tradeoff: history, physiology, methodology, and behavior
Heitz, Richard P.
2014-01-01
There are few behavioral effects as ubiquitous as the speed-accuracy tradeoff (SAT). From insects to rodents to primates, the tendency for decision speed to covary with decision accuracy seems an inescapable property of choice behavior. Recently, the SAT has received renewed interest, as neuroscience approaches begin to uncover its neural underpinnings and computational models are compelled to incorporate it as a necessary benchmark. The present work provides a comprehensive overview of SAT. First, I trace its history as a tractable behavioral phenomenon and the role it has played in shaping mathematical descriptions of the decision process. Second, I present a “users guide” of SAT methodology, including a critical review of common experimental manipulations and analysis techniques and a treatment of the typical behavioral patterns that emerge when SAT is manipulated directly. Finally, I review applications of this methodology in several domains. PMID:24966810
ERIC Educational Resources Information Center
Clemens, Rachael Annette
2017-01-01
This qualitative and interpretive inquiry explores the information behavior of birthmothers surrounding the processes of decision-making, coping, and living with the act of child relinquishment to adoption. An interpretative phenomenological analysis methodology is used to reveal the phenomenon as experienced by eight birthmothers, women who…
Life Cycle Assessment Software for Product and Process Sustainability Analysis
ERIC Educational Resources Information Center
Vervaeke, Marina
2012-01-01
In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.
2016-12-01
As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources that effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merge topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads ranging from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study highlighting multi-criteria spatial decision making and uncertainty comparisons.
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
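Bootstrapping a judge in Meehl's sense can be illustrated with a toy single-cue linear model: fit an equation to the judge's past ratings, then use the equation in place of the judge. The cue, scores, and ratings below are invented for illustration.

```python
# Fit a least-squares line to a judge's ratings of a single cue
# ("bootstrapping" the judge), then query the model for a new case.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx  # (slope, intercept)

test_scores   = [55, 60, 70, 80, 90]
judge_ratings = [2.9, 3.2, 3.4, 4.1, 4.4]   # judge is roughly linear, plus noise

b, a = fit_line(test_scores, judge_ratings)

def model_rating(x):
    return a + b * x

print(round(model_rating(75), 2))  # the model's "judgment" for a new case
```

The model strips out the judge's unsystematic noise, which is one common explanation of why bootstrap models tend to outperform the judges they were fitted to.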
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included, of which 50% (n = 59) are methodological. A rapidly accumulating literature base on VOI is observed from 1999 onwards for methodological papers and from 2005 onwards for applied papers. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include high computational demands, dealing with non-linear models and interdependency between parameters, estimation of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI, analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
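The simplest VOI quantity, the expected value of perfect information (EVPI), can be sketched by Monte Carlo as EVPI = E[max_d NB(d, θ)] − max_d E[NB(d, θ)]. The two-option net-benefit model and all numbers below are assumptions for illustration; EVSI, the method highlighted above, adds a further inner layer that simulates specific study data.

```python
# Monte Carlo estimate of EVPI for a toy two-option decision whose
# payoff depends on one uncertain effectiveness parameter theta.
import random

random.seed(1)

def net_benefit(decision, theta):
    # decision 0: standard care (certain benefit); decision 1: new
    # therapy whose benefit scales with the uncertain parameter theta
    return 10.0 if decision == 0 else 25.0 * theta

thetas = [random.uniform(0.0, 1.0) for _ in range(100_000)]

# value of deciding now, under current uncertainty
e_nb = [sum(net_benefit(d, t) for t in thetas) / len(thetas) for d in (0, 1)]
# value of deciding after theta is revealed (perfect information)
e_max = sum(max(net_benefit(0, t), net_benefit(1, t)) for t in thetas) / len(thetas)

evpi = e_max - max(e_nb)
print(round(evpi, 2))  # per-patient value of resolving theta first (≈ 2.0 here)
```

The high computational demands noted in the review come precisely from nesting such simulation loops inside probabilistic sensitivity analyses of non-linear models.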
Triple Value System Dynamics Modeling to Help Stakeholders Engage with Food-Energy-Water Problems
Triple Value (3V) Community scoping projects and Triple Value Simulation (3VS) models help decision makers and stakeholders apply systems-analysis methodology to complex problems related to food production, water quality, and energy use. 3VS models are decision support tools that...
NASA Astrophysics Data System (ADS)
Brennan-Tonetta, Margaret
This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock, and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Policy recommendations, based on the results of the decision support tool, that will help guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste.
Instead, it operates one level below that, with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors enabling the analysis of broader energy policy issues, such as those mentioned above, is recommended for future research efforts.
NASA Technical Reports Server (NTRS)
Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.
1993-01-01
Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
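Decision tree analysis of the kind listed above can be sketched by folding back expected costs and QALYs for each strategy and computing the incremental cost-effectiveness ratio (ICER). The probabilities, costs, and QALY values below are invented for illustration.

```python
# Fold back a one-level decision tree: each strategy is a chance node
# with (probability, cost, QALYs) leaves; compare strategies by ICER.
def fold_back(branches):
    """branches: list of (probability, cost, qalys) leaves."""
    cost = sum(p * c for p, c, q in branches)
    qalys = sum(p * q for p, c, q in branches)
    return cost, qalys

# invented outcomes: (P(outcome), cost, QALYs)
standard = fold_back([(0.7, 1000.0, 8.0), (0.3, 5000.0, 5.0)])
new_drug = fold_back([(0.9, 3000.0, 8.0), (0.1, 7000.0, 5.0)])

icer = (new_drug[0] - standard[0]) / (new_drug[1] - standard[1])
print(round(icer))  # incremental cost per extra QALY (→ 2000 here)
```

A decision maker would then compare the ICER against a willingness-to-pay threshold per QALY; Markov modeling extends the same expected-value logic to outcomes that unfold over repeated cycles.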
Cost-effectiveness analyses and their role in improving healthcare strategies.
Rodriguez, Maria I; Caughey, Aaron B
2013-12-01
In this era of healthcare reform, attention is focused on increasing the quality of care and access to services, while simultaneously reducing the cost. Economic evaluations can play an important role in translating research to evidence-based practice and policy. Cost-effectiveness analysis (CEA) and its utility for clinical and policy decision making among U.S. obstetricians and gynecologists is reviewed. Three case examples demonstrating the value of this methodology in decision making are considered. A discussion of the methodologic principles of CEA, the advantages, and the limitations of the methodology are presented. CEA can play an important role in evidence-based decision making, with value for clinicians and policy makers alike. These studies are of particular interest in the field of obstetrics and gynecology, in which uncertainty from epidemiologic or clinical trials exists, or multiple perspectives need to be considered (maternal, neonatal, and societal). As with all research, it is essential that economic evaluations are conducted according to established methodologic standards. Interpretation and application of results should occur with a clear understanding of both the value and the limitations of economic evaluations.
2012-01-01
Clinical decision rules are an increasingly common presence in the biomedical literature and represent one strategy for enhancing clinical decision making with the goal of improving the efficiency and effectiveness of healthcare delivery. In the context of rehabilitation research, clinical decision rules have been predominantly aimed at classifying patients by predicting their treatment response to specific therapies. Traditionally, recommendations for developing clinical decision rules propose a multistep process (derivation, validation, impact analysis) using defined methodology. Research efforts aimed at developing a “diagnosis-based clinical decision rule” have departed from this convention. Recent publications in this line of research have used the modified terminology “diagnosis-based clinical decision guide.” Modifications to terminology and methodology surrounding clinical decision rules can make it more difficult for clinicians to recognize the level of evidence associated with a decision rule and to understand how this evidence should be implemented to inform patient care. We provide a brief overview of clinical decision rule development in the context of the rehabilitation literature and two specific papers recently published in Chiropractic and Manual Therapies. PMID:22726639
Five Steps for Improving Evaluation Reports by Using Different Data Analysis Methods.
ERIC Educational Resources Information Center
Thompson, Bruce
Although methodological integrity is not the sole determinant of the value of a program evaluation, decision-makers do have a right, at a minimum, to be able to expect competent work from evaluators. This paper explores five areas where evaluators might improve methodological practices. First, evaluation reports should reflect the limited…
Kolasa, Katarzyna; Zah, Vladimir; Kowalczyk, Marta
2018-04-29
As budget constraints become more and more visible, there is growing recognition of the need for greater transparency and greater stakeholder engagement in pharmaceutical pricing and reimbursement (P&R) decision making. New frameworks for assessing drugs' value are being sought. Among them, multi-criteria decision analysis (MCDA) is receiving more and more attention. In 2014, ISPOR established a Task Force to provide methodological recommendations for MCDA utilization in health care decision making. Still, little is known about real-life experience with MCDA's adaptation to P&R processes. Areas covered: A systematic literature review was performed to understand the rationale for MCDA adaptation, the methodology used, and its impact on P&R outcomes. Expert commentary: In total, 102 hits were found through the database search, out of which 18 publications were selected. Although limited in scope, the review highlighted how MCDA can improve decision-making processes not only in pricing and reimbursement but also in risk-benefit assessment and the optimization of treatment outcomes. Still, none of the reviewed studies reported how MCDA results actually impacted real-life settings.
Conjoint analysis: using a market-based research model for healthcare decision making.
Mele, Nancy L
2008-01-01
Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
ERIC Educational Resources Information Center
Liu, Shiang-Yao; Lin, Chuan-Shun; Tsai, Chin-Chung
2011-01-01
This study aims to test the nature of the assumption that there are relationships between scientific epistemological views (SEVs) and reasoning processes in socioscientific decision making. A mixed methodology that combines both qualitative and quantitative approaches of data collection and analysis was adopted not only to verify the assumption…
ERIC Educational Resources Information Center
Sims, Wendy L.; Lordo, Jackie; Phelps, Cynthia Williams
2016-01-01
The primary purpose of this study was to investigate characteristics of manuscripts submitted to the "Journal of Research in Music Education" (JRME) representing various research methodologies. A database was compiled comprising all manuscripts that received a publication decision from February 2009 through March 2014 (N = 506). Only…
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders its further development. It is therefore urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies for minimizing waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic, knowledge-based qualitative WM decision analysis and support methodology, and a fundamental, knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating, allowing a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
The unique contribution of this research is that, for the first time, the electroplating industry can (i) systematically use available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work forms a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
NASA Astrophysics Data System (ADS)
Tabibzadeh, Maryam
According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Yet the results of a comprehensive study of more than 600 well-documented major failures in offshore structures from 1988 to 2005 show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its NPT is discussed. The risk analysis methodology consists of three approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, against the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures.
The second approach is a conceptual risk assessment framework for analyzing the causal factors of the identified mismatches, which are the main contributors to negative pressure test misinterpretation. Finally, a rational decision-making model is introduced to quantify a section of the conceptual framework and to analyze the impact of different decision-making biases on negative pressure test results. Along with corroborating the findings of previous studies, the analysis of the conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. Notably, the organizational factors captured in the framework are not specific to the NPT: most have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations, as well as high-risk operations in other industries. In addition, the proposed rational decision-making model introduces a quantitative structure for analyzing the results of a conducted NPT. This model provides a structure and parametric derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test. Moreover, it enables analysts to assess the different decision-making biases involved in interpreting a conducted negative pressure test, as well as the root organizational factors of those biases.
In general, although the proposed integrated research methodology is developed for assessing the contribution of human and organizational factors to negative pressure test misinterpretation, it can be generalized and is potentially useful for other well control situations, both offshore and onshore (e.g., fracking). In addition, this methodology can be applied to the analysis of any high-risk operation, not only in the oil and gas industry but also in other industries such as nuclear power plants, aviation, and transportation.
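The cut-off-point idea described above can be illustrated, under heavy assumptions, as an expected-cost-minimizing threshold on a test reading. The distributions, prior, and costs below are hypothetical and not taken from the dissertation; they only show the shape of such a rule.

```python
# Pick the threshold on a test reading that minimizes expected cost,
# given assumed reading distributions for "well sealed" (low readings)
# vs "well leaking" (high readings) and assumed misclassification costs.
from math import exp, pi, sqrt

STEP = 0.05              # grid resolution for the Riemann sums
P_LEAK = 0.05            # assumed prior probability the well is leaking
C_MISS = 1000.0          # assumed cost of accepting a leaking well
C_FALSE_ALARM = 10.0     # assumed cost of rejecting a sealed well

def normal_pdf(x, mu, sigma):
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def expected_cost(cutoff, grid):
    # readings at or below the cutoff are accepted, above it rejected
    p_miss = sum(normal_pdf(x, 8.0, 1.0) for x in grid if x <= cutoff) * STEP
    p_fa = sum(normal_pdf(x, 2.0, 1.0) for x in grid if x > cutoff) * STEP
    return P_LEAK * C_MISS * p_miss + (1 - P_LEAK) * C_FALSE_ALARM * p_fa

grid = [i * STEP for i in range(300)]   # candidate readings 0.0 .. 14.95
best = min(grid, key=lambda c: expected_cost(c, grid))
print(best)  # cut-off separating "accept" from "reject"
```

The optimum sits where the cost-weighted likelihoods of the two well states cross, so changing the prior or the cost ratio shifts the cut-off, which is how such a model can expose the effect of decision-making biases.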
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-22
…and investigations, committee meetings, agency decisions and rulings, delegations of authority… Interest Functions Product Service Codes (PSCs) studied by the Agency; (2) The methodology used by the…
A customisable framework for the assessment of therapies in the solution of therapy decision tasks.
Manjarrés Riesco, A; Martínez Tomás, R; Mira Mira, J
2000-01-01
In current medical research, growing interest can be observed in the definition of a global therapy-evaluation framework that integrates considerations such as patients' preferences and quality-of-life results. In this article, we propose using the research results in this domain as a source of knowledge in the design of support systems for therapy decision analysis, in particular with a view to application in oncology. We discuss the incorporation of these considerations in the definition of the therapy-assessment methods involved in the solution of a generic therapy decision task, described in the context of AI software development methodologies such as CommonKADS. The goal of the therapy decision task is to identify the ideal therapy for a given patient in accordance with a set of objectives of a diverse nature. The assessment methods applied are based either on data obtained from statistics or on the specific idiosyncrasies of each patient, as identified from their responses to a suite of psychological tests. In the analysis of the therapy decision task, we emphasise the importance, from a methodological perspective, of using a rigorous approach to the modelling of domain ontologies and domain-specific data. To this aim we make extensive use of the semi-formal object-oriented analysis notation UML to describe the domain level.
A method for studying decision-making by guideline development groups.
Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan
2009-08-05
Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best-suited to capturing influences on GDG decision-making. A research team comprised of three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three of NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.
Hill, Sarah R; Vale, Luke; Hunter, David; Henderson, Emily; Oluboyede, Yemi
2017-12-01
Public health interventions have unique characteristics compared to health technologies, which present additional challenges for economic evaluation (EE). High-quality EEs that are able to address the particular methodological challenges are important for public health decision-makers. In England, they are even more pertinent given the transition of public health responsibilities in 2013 from the National Health Service to local government authorities, where new agents are shaping policy decisions. Addressing alcohol misuse is a globally prioritised public health issue. This article provides a systematic review of EE and priority-setting studies for interventions to prevent and reduce alcohol misuse published internationally over the past decade (2006-2016). This review appraises the EE and priority-setting evidence to establish whether it is sufficient to meet the informational needs of public health decision-makers. A total of 619 studies were identified via database searches; seven additional studies were identified via hand-searching journals, grey literature and reference lists. Twenty-seven met the inclusion criteria. Methods identified included cost-utility analysis (18), cost-effectiveness analysis (6), cost-benefit analysis (CBA) (1), cost-consequence analysis (CCA) (1) and return-on-investment (1). The review identified a lack of consideration of the methodological challenges associated with evaluating public health interventions and limited use of methods such as CBA and CCA, which have been recommended as potentially useful for EE in public health. No studies using other specific priority-setting tools were identified. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
A stochastic conflict resolution model for trading pollutant discharge permits in river systems.
Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram
2009-07-01
This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems, considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm technique known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is defined using the Young conflict resolution theory, which considers the utility functions of the decision-makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the trading discharge permit policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in the northern part of Iran.
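The core step NSGA-II contributes here is sorting candidate solutions into successive non-dominated fronts; the first front is the trade-off curve between objectives. A minimal sketch of that step (an illustration only, not the authors' implementation; the cost/risk pairs below are hypothetical), minimizing two objectives such as total treatment cost and fuzzy risk:

```python
def dominates(a, b):
    """True if point a dominates b under minimization:
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_fronts(points):
    """Partition points into successive non-dominated fronts, NSGA-II style.
    Returns lists of indices; fronts[0] is the Pareto-optimal trade-off curve."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Hypothetical (treatment cost, fuzzy risk) pairs for five permit allocations.
points = [(10, 0.90), (20, 0.50), (30, 0.20), (25, 0.60), (40, 0.25)]
fronts = pareto_fronts(points)
```

Here index 3 (25, 0.60) is dominated by (20, 0.50), and index 4 (40, 0.25) by (30, 0.20), so the first front contains indices 0, 1 and 2; a conflict resolution rule such as Young's theory would then pick one solution from that front.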
Introduction to SIMRAND: Simulation of research and development project
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1982-01-01
SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis techniques of management science, and computer simulation in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing information for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; and (ii) studied the application of these decision-making models and methodologies to practical problems, such as those…
Cost-benefit analysis of space technology
NASA Technical Reports Server (NTRS)
Hein, G. F.; Stevenson, S. M.; Sivo, J. N.
1976-01-01
A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in structuring a decision-making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumer surplus for measuring benefits is also presented.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
Management of contaminated marine marketable resources after oil and HNS spills in Europe.
Cunha, Isabel; Neuparth, Teresa; Moreira, Susana; Santos, Miguel M; Reis-Henriques, Maria Armanda
2014-03-15
Different risk evaluation approaches have been used to face oil and hazardous and noxious substances (HNS) spills all over the world. To minimize health risks and mitigate economic losses due to a long-term ban on the sale of sea products after a spill, it is essential to preemptively set risk evaluation criteria and standard methodologies based on previous experience and appropriate scientifically sound criteria. Standard methodologies are analyzed and proposed in order to improve the definition of criteria for reintegrating previously contaminated marine marketable resources into the commercialization chain in Europe. The criteria used in former spills for the closing of, and lifting of bans on, fisheries and harvesting are analyzed. European legislation was identified regarding food sampling, food chemical analysis and maximum levels of contaminants allowed in seafood, which ought to be incorporated in the standard methodologies for the evaluation of the decision criteria defined for oil and HNS spills in Europe. A decision flowchart is proposed that opens the current decision criteria to new material that may be incorporated in the decision process. Decision criteria are discussed and compared among countries and incidents. An a priori definition of risk criteria and an elaboration of action plans are proposed to speed up actions that will lead to prompt final decisions. These decisions, based on the best available scientific data and leading to the lifting or imposition of bans on economic activity, will tend to be better understood and respected by citizens. Copyright © 2014 Elsevier Ltd. All rights reserved.
Case based reasoning in criminal intelligence using forensic case data.
Ribaux, O; Margot, P
2003-01-01
A model that is based on the knowledge of experienced investigators in the analysis of serial crime is suggested to bridge a gap between technology and methodology. Its purpose is to provide a solid methodology for the analysis of serial crimes that supports decision making in the deployment of resources, either by guiding proactive policing operations or helping the investigative process. Formalisation has helped to derive a computerised system that efficiently supports the reasoning processes in the analysis of serial crime. This novel approach fully integrates forensic science data.
Postoptimality Analysis in the Selection of Technology Portfolios
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Shelton, Kacie; Elfes, Alberto; Weisbin, Charles R.
2006-01-01
This slide presentation reviews a process of postoptimality analysis applied to the selection of technology portfolios. The rationale for the analysis stems from the need for consistent, transparent and auditable decision-making processes and tools. The methodology is used to assure that project investments are selected through an optimization of net mission value. The main intent of the analysis is to gauge the degree of confidence in the optimal solution and to provide the decision maker with an array of viable selection alternatives which take into account input uncertainties and possibly satisfy non-technical constraints. A few examples of the analysis are reviewed. The goal of the postoptimality study is to enhance and improve the decision-making process by providing additional qualifications and substitutes for the optimal solution.
Methodological Issues in the Study of Air Force Organizational Structures,
Descriptors: motivation, morale, performance (human), leadership, skills, management planning and control, model theory, symposia, … resource management, *human resources, *manpower utilization, *job analysis, *organizations, structures, personnel management, decision making.
District Heating Systems Performance Analyses. Heat Energy Tariff
NASA Astrophysics Data System (ADS)
Ziemele, Jelena; Vigants, Girts; Vitolins, Valdis; Blumberga, Dagnija; Veidenbergs, Ivars
2014-12-01
The paper addresses an important element of the European energy sector: the evaluation of district heating (DH) system operations from the standpoint of increasing energy efficiency and increasing the use of renewable energy resources. This has been done by developing a new methodology for the evaluation of the heat tariff. The paper presents an algorithm of this methodology, which includes not only a database and calculation equation systems, but also an integrated multi-criteria analysis module using MADM/MCDM (Multi-Attribute Decision Making / Multi-Criteria Decision Making) based on TOPSIS (Technique for Order Preference by Similarity to Ideal Solution). The results of the multi-criteria analysis are used to set the tariff benchmarks. The evaluation methodology has been tested for Latvian heat tariffs, and the obtained results show that only half of the heating companies reach the benchmark value of 0.5 for the closeness-to-the-ideal-solution indicator. This means that the proposed evaluation methodology would not only allow companies to determine how they perform with regard to the proposed benchmark, but also to identify their need to restructure so that they may reach the level of a low-carbon business.
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
ERIC Educational Resources Information Center
Watkins, Arthur Noel
The purpose of this study was to identify and describe the decision-making processes in senior high schools that were implementing programs of individualized schooling. Field methodology, including interviews, observations, and analysis of documents, was used to gather data in six senior high schools of varying size located throughout the country,…
ERIC Educational Resources Information Center
Sambodo, Leonardo A. A. T.; Nuthall, Peter L.
2010-01-01
Purpose: This study traced the origins of subsistence Farmers' technology adoption attitudes and extracted the critical elements in their decision making systems. Design/Methodology/Approach: The analysis was structured using a model based on the Theory of Planned Behaviour (TPB). The role of a "bargaining process" was particularly…
ERIC Educational Resources Information Center
Hadad, Yossi; Keren, Baruch; Ben-Yair, Avner
2010-01-01
In this paper we will demonstrate how the productivity and improvement rate of urban organizational units (also called Decision Making Units, DMUs) may be assessed when measured along several time periods. The assessment and subsequent ranking of cities is achieved by means of the Data Envelopment Analysis (DEA) methodology to determine DMU's…
The application of decision analysis to life support research and technology development
NASA Technical Reports Server (NTRS)
Ballin, Mark G.
1994-01-01
Applied research and technology development is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Decision making regarding which technologies to advance and what resources to devote to them is a challenging but essential task. In the application of life support technology to future manned space flight, new technology concepts typically are characterized by nonexistent data and rough approximations of technology performance, uncertain future flight program needs, and a complex, time-intensive process to develop technology to a flight-ready status. Decision analysis is a quantitative, logic-based discipline that imposes formalism and structure on complex problems. It also accounts for the limits of knowledge that may be available at the time a decision is needed. The utility of decision analysis for life support technology R&D was evaluated by applying it to two case studies. The methodology was found to provide insight that is not possible from more traditional analysis approaches.
NASA Astrophysics Data System (ADS)
Huda, J.; Kauneckis, D. L.
2013-12-01
Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.
Optimal use of human and machine resources for Space Station assembly operations
NASA Technical Reports Server (NTRS)
Parrish, Joseph C.
1988-01-01
This paper investigates the issues involved in determining the best mix of human and machine resources for assembly of the Space Station. It presents the current Station assembly sequence, along with descriptions of the available assembly resources. A number of methodologies for optimizing the human/machine tradeoff problem have been developed, but the Space Station assembly offers some unique issues that have not yet been addressed. These include a strong constraint on available EVA time for early flights and a phased deployment of assembly resources over time. A methodology for incorporating the previously developed decision methods to the special case of the Space Station is presented. This methodology emphasizes an application of multiple qualitative and quantitative techniques, including simulation and decision analysis, for producing an objective, robust solution to the tradeoff problem.
A review of costing methodologies in critical care studies.
Pines, Jesse M; Fager, Samuel S; Milzman, David P
2002-09-01
Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.
Social Media Participation in Urban Planning: a New way to Interact and Take Decisions
NASA Astrophysics Data System (ADS)
López-Ornelas, E.; Abascal-Mena, R.; Zepeda-Hernández, S.
2017-09-01
Social media participation can be very valuable when an important decision must be made about a topic related to urban planning. Textual analysis to identify sentiment about a topic, or community detection and user analysis to identify the actors involved in a discussion, can be of great value to the persons or institutions that must make the decision. In this paper we propose a methodological design to analyse participation in social media. We study the installation of a new airport in Mexico City as a case study to highlight the importance of conducting a study of this nature.
Khadam, Ibrahim; Kaluarachchi, Jagath J
2003-07-01
Decision analysis in subsurface contamination management is generally carried out from a traditional engineering-economic viewpoint. However, new advances in human health risk assessment, namely probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process require decision analysis methodologies that are capable of accommodating non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. In order to provide a perspective on how these limitations may be avoided in future formulations of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability to subsurface remediation management is discussed. Finally, in light of the success of the application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.
A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery
ERIC Educational Resources Information Center
Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh
2012-01-01
The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…
Situating methodology within qualitative research.
Kramer-Kile, Marnie L
2012-01-01
Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.
Schumm, Walter R
2012-11-01
Every social science researcher must make a number of methodological decisions when planning and implementing research projects. Each such decision carries with it both advantages and limitations. The decisions faced and made by Regnerus (2012) are discussed here in the wider context of social science literature regarding same-sex parenting. Even though the apparent outcomes of Regnerus's study were unpopular, the methodological decisions he made in the design and implementation of the New Family Structures Survey were not uncommon among social scientists, including many progressive, gay and lesbian scholars. These decisions and the research they produced deserve considerable and continued discussion, but criticisms of the underlying ethics and professionalism are misplaced because nearly every methodological decision that was made has ample precedents in research published by many other credible and distinguished scholars. Copyright © 2012 Elsevier Inc. All rights reserved.
Outline of cost-benefit analysis and a case study
NASA Technical Reports Server (NTRS)
Kellizy, A.
1978-01-01
The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
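The computational core of such a cost-benefit comparison is discounting the cost and benefit streams of each alternative to present value. A minimal sketch of that arithmetic (an illustration only, not the program in the report's appendix; the discount rate and cash flows are hypothetical):

```python
def npv(rate, cashflows):
    """Net present value of a stream, where cashflows[t] occurs at end of year t
    (t = 0 is today and is not discounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted costs; > 1 favours the alternative."""
    return npv(rate, benefits) / npv(rate, costs)

# Two hypothetical alternatives compared at a 10% discount rate over three
# operating years following an up-front investment.
rate = 0.10
ratio_a = benefit_cost_ratio(rate, [0, 60, 60, 60], [100, 10, 10, 10])
ratio_b = benefit_cost_ratio(rate, [0, 40, 40, 40], [50, 5, 5, 5])
```

Both alternatives clear the break-even ratio of 1 here, and the cheaper alternative yields the higher benefit-cost ratio; a full study would add the sensitivity and constraint considerations the report discusses.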
Authors' response: the primacy of conscious decision making.
Shanks, David R; Newell, Ben R
2014-02-01
The target article sought to question the common belief that our decisions are often biased by unconscious influences. While many commentators offer additional support for this perspective, others question our theoretical assumptions, empirical evaluations, and methodological criteria. We rebut in particular the starting assumption that all decision making is unconscious, and that the onus should be on researchers to prove conscious influences. Further evidence is evaluated in relation to the core topics we reviewed (multiple-cue judgment, deliberation without attention, and decisions under uncertainty), as well as priming effects. We reiterate a key conclusion from the target article, namely, that it now seems to be generally accepted that awareness should be operationally defined as reportable knowledge, and that such knowledge can only be evaluated by careful and thorough probing. We call for future research to pay heed to the different ways in which awareness can intervene in decision making (as identified in our lens model analysis) and to employ suitable methodology in the assessment of awareness, including the requirements that awareness assessment must be reliable, relevant, immediate, and sensitive.
Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn
2006-09-01
Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.
Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen
2017-10-01
The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria and quality was assessed using the critical appraisal tool, AMSTAR. Regression analyses were performed to determine factors associated with a higher score of quality. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality while systematic reviews were found to be of less than fair quality. Factors associated with higher scores of quality in the multivariable analysis were including primary studies consisting of randomized control trials, performing a meta-analysis, and applying a recommended guideline related to establishing a systematic review protocol and/or reporting. Systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making based on AMSTAR. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine and researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.
E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence
2018-03-01
actor violence and a set of predefined decision-making protocols. This research included running four simulations using the Monte Carlo technique, and a "runs test" was used to determine whether a temporal pattern exists in lone-actor violence.
The BCD of response time analysis in experimental economics.
Spiliopoulos, Leonidas; Ortmann, Andreas
2018-01-01
For decisions in the wild, time is of the essence. Available decision time is often cut short through natural or artificial constraints, or is impinged upon by the opportunity cost of time. Experimental economists have only recently begun to conduct experiments with time constraints and to analyze response time (RT) data, in contrast to experimental psychologists. RT analysis has proven valuable for the identification of individual and strategic decision processes including identification of social preferences in the latter case, model comparison/selection, and the investigation of heuristics that combine speed and performance by exploiting environmental regularities. Here we focus on the benefits, challenges, and desiderata of RT analysis in strategic decision making. We argue that unlocking the potential of RT analysis requires the adoption of process-based models instead of outcome-based models, and discuss how RT in the wild can be captured by time-constrained experiments in the lab. We conclude that RT analysis holds considerable potential for experimental economics, deserves greater attention as a methodological tool, and promises important insights on strategic decision making in naturally occurring environments.
Defender-Attacker Decision Tree Analysis to Combat Terrorism.
Garcia, Ryan J B; von Winterfeldt, Detlof
2016-12-01
We propose a methodology, called defender-attacker decision tree analysis, to evaluate defensive actions against terrorist attacks in a dynamic and hostile environment. Like most game-theoretic formulations of this problem, we assume that the defenders act rationally by maximizing their expected utility or minimizing their expected costs. However, we do not assume that attackers maximize their expected utilities. Instead, we encode the defender's limited knowledge about the attacker's motivations and capabilities as a conditional probability distribution over the attacker's decisions. We apply this methodology to the problem of defending against possible terrorist attacks on commercial airplanes, using one of three weapons: infrared-guided MANPADS (man-portable air defense systems), laser-guided MANPADS, or visually targeted RPGs (rocket propelled grenades). We also evaluate three countermeasures against these weapons: DIRCMs (directional infrared countermeasures), perimeter control around the airport, and hardening airplanes. The model includes deterrence effects, the effectiveness of the countermeasures, and the substitution of weapons and targets once a specific countermeasure is selected. It also includes a second stage of defensive decisions after an attack occurs. Key findings are: (1) due to the high cost of the countermeasures, not implementing countermeasures is the preferred defensive alternative for a large range of parameters; (2) if the probability of an attack and the associated consequences are large, a combination of DIRCMs and ground perimeter control are preferred over any single countermeasure. © 2016 Society for Risk Analysis.
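The evaluation step in such a defender-attacker decision tree reduces to comparing each defensive option's expected total cost, with the attacker's behaviour encoded as a conditional probability distribution given the defence rather than as an optimizing opponent. A minimal sketch (hypothetical illustrative numbers, not the paper's calibrated model):

```python
def expected_cost(defense_cost, attacks):
    """Defense cost plus the probability-weighted consequences of each attack.
    attacks maps an attack name to (p_attack_given_defense, p_success, loss)."""
    return defense_cost + sum(p * q * loss for p, q, loss in attacks.values())

def best_defense(options):
    """Pick the defensive alternative with the lowest expected total cost.
    options maps a defense name to (defense_cost, attacks)."""
    scored = {d: expected_cost(c, a) for d, (c, a) in options.items()}
    return min(scored.items(), key=lambda kv: kv[1])

# Hypothetical: one attack mode, an expensive countermeasure vs. doing nothing.
# The countermeasure deters (lower attack probability) and protects (lower
# success probability), but its fixed cost outweighs the risk reduction.
options = {
    "no countermeasure": (0.0,   {"MANPADS": (0.05, 0.50, 1000.0)}),
    "DIRCM":             (100.0, {"MANPADS": (0.02, 0.10, 1000.0)}),
}
choice, cost = best_defense(options)
```

With these numbers doing nothing has expected cost 25 versus 102 for the countermeasure, mirroring the paper's finding that high countermeasure cost can make "no countermeasure" the preferred alternative over a large parameter range.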
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
Saving the Lost Boys: Narratives of Discipline Disproportionality
ERIC Educational Resources Information Center
Gray, Mariama Smith
2016-01-01
In this article, I explore how discriminatory adult practices disproportionately involve Latino boys in the juvenile justice system. I use the critical methodologies of critical ethnography, critical discourse analysis and Critical Race Theory (CRT) to provide a race-centered analysis of decision-making in student discipline. My findings reveal…
A comparative analysis of protected area planning and management frameworks
Per Nilsen; Grant Tayler
1997-01-01
A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...
An Integrated Approach to Life Cycle Analysis
NASA Technical Reports Server (NTRS)
Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.
2006-01-01
Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle to grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms, from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (i.e., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing managers to make risk-informed decisions and increasing the likelihood of meeting mission success criteria.
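The "coupled metric" idea in the abstract — e.g., the cost of system unreliability — is just a reliability figure converted into an expected cost and folded into the life-cycle total. A minimal sketch, with all dollar figures and flight counts invented:

```python
# Hypothetical coupled LCA metric: expected cost of unreliability over a
# campaign, added to a direct cost. All figures are illustrative.

def cost_of_unreliability(mission_reliability, flights, loss_cost):
    """Per-flight failure probability times the cost consequence of a
    failure, summed over the planned number of flights."""
    return flights * (1.0 - mission_reliability) * loss_cost

dev_cost = 2_500.0  # direct impact: development cost ($M, invented)
expected_loss = cost_of_unreliability(0.98, flights=50, loss_cost=400.0)
life_cycle_cost = dev_cost + expected_loss
```

The point of such a metric is comparative: a design that is cheaper to develop but less reliable can still lose on life-cycle cost once the coupled term is included.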
Gascón, Fernando; de la Fuente, David; Puente, Javier; Lozano, Jesús
2007-11-01
The aim of this paper is to develop a methodology that is useful for analyzing, from a macroeconomic perspective, the aggregate demand and the aggregate supply features of the market of pharmaceutical generics. In order to determine the potential consumption and the potential production of pharmaceutical generics in different countries, two fuzzy decision support systems are proposed. Both are based on the Mamdani model. These systems, generated by Matlab Toolbox 'Fuzzy' (v. 2.0), are able to determine the potential of a country for the manufacturing or the consumption of pharmaceutical generics. The systems make use of three macroeconomic input variables. In an empirical application of our proposed methodology, the potential towards consumption and manufacturing in Holland, Sweden, Italy and Spain has been estimated from national indicators. Cross-country comparisons are made and graphical surfaces are analyzed in order to interpret the results. The main contribution of this work is the development of a methodology that is useful for analyzing aggregate demand and aggregate supply characteristics of pharmaceutical generics. The methodology is valid for carrying out a systematic analysis of the potential that generics have at the macro level in different countries. The main advantages of the use of fuzzy decision support systems in the context of pharmaceutical generics are the flexibility in the construction of the system, the speed of interpreting the results offered by the inference and surface maps, and the ease with which a sensitivity analysis of the potential behavior of a given country may be performed.
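For readers unfamiliar with Mamdani-style inference, the pipeline is: fuzzify crisp inputs via membership functions, fire rules with min implication, aggregate with max, and defuzzify by centroid. The hand-rolled sketch below uses one invented input ("market openness", 0-10) and one invented output ("generics potential", 0-100); the paper's actual systems use three macroeconomic inputs and the Matlab Fuzzy Toolbox, not this code.

```python
# Minimal Mamdani fuzzy inference: triangular membership functions,
# min implication, max aggregation, centroid defuzzification.
# Variables and breakpoints are invented for illustration.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def generics_potential(market_openness):
    """Map a 0-10 'market openness' score to a 0-100 potential score."""
    # Rule 1: IF openness is low  THEN potential is low
    # Rule 2: IF openness is high THEN potential is high
    w_low = tri(market_openness, -1.0, 0.0, 5.0)
    w_high = tri(market_openness, 5.0, 10.0, 11.0)
    ys = [y * 0.5 for y in range(201)]  # discretized output universe 0..100
    agg = [max(min(w_low, tri(y, -10.0, 0.0, 50.0)),
               min(w_high, tri(y, 50.0, 100.0, 110.0))) for y in ys]
    area = sum(agg)
    return sum(y * a for y, a in zip(ys, agg)) / area if area else 50.0

low_country, high_country = generics_potential(2.0), generics_potential(8.0)
```

The surface maps mentioned in the abstract are exactly this mapping evaluated over a grid of input values.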
What lies behind crop decisions? Coming to terms with revealing farmers' preferences
NASA Astrophysics Data System (ADS)
Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.
2016-12-01
The paper offers a fully fledged applied revealed-preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions, such as profits, income security and management easiness. The recursive two-stage method is proposed as an alternative that copes with the methodological problems inherent in common-practice positive mathematical programming (PMP) methodologies. Unlike PMP, in the model proposed in this paper the non-linear costs that are required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad hoc assumptions about costs and thus recovers the potential of economic analysis to explain the rationale behind observed and forecasted farmers' decisions, enhancing the model's ability to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set out, section two presents the theoretical model, section three develops its empirical application to a Spanish irrigation district, and section four concludes and makes suggestions for further research.
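The modeling problem the abstract alludes to is easy to see in miniature: under Leontief technology and fixed prices, each crop has a constant gross margin, so a profit-only linear program runs to a corner solution (all land to the best crops). The sketch below shows that corner behavior; crop names, margins and bounds are invented, and the preference terms the paper actually estimates (income security, management easiness) are deliberately omitted.

```python
# Toy linear crop-allocation model: maximize sum(margin * area) subject to
# a land constraint and per-crop upper bounds. With one linear constraint
# and box bounds, the optimum is greedy by margin (fractional knapsack).
# All numbers are invented.

def allocate(margins, upper_bounds, land):
    """Greedy optimal plan for the single-constraint linear program."""
    plan = {}
    for crop in sorted(margins, key=margins.get, reverse=True):
        area = min(upper_bounds[crop], land)
        plan[crop] = area
        land -= area
    return plan

margins = {"maize": 900.0, "barley": 400.0, "alfalfa": 650.0}     # EUR/ha
upper_bounds = {"maize": 60.0, "barley": 100.0, "alfalfa": 40.0}  # ha
plan = allocate(margins, upper_bounds, land=120.0)
```

Observed farm plans are rarely corner solutions, which is why PMP adds calibrating non-linear costs and why the paper instead recovers additional objectives from revealed preferences.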
A Narrative in Search of a Methodology.
Treloar, Anna; Stone, Teresa Elizabeth; McMillan, Margaret; Flakus, Kirstin
2015-07-01
Research papers present us with the summaries of scholars' work; what we readers do not see are the struggles behind the decision to choose one methodology over another. A student's mental health portfolio contained a narrative that led to an exploration of the most appropriate methodology for a projected study of clinical anecdotes about mental health nursing, told by nurses who work in mental health settings to undergraduates and new recruits. This paper describes that process of struggle, beginning with the student's account and then posing a number of questions that had to be answered before the most appropriate methodology could be chosen. After discussing the case for literary analysis, discourse analysis, symbolic interactionism, hermeneutics, and narrative research, we argue that case study research is the methodology of choice: it is frequently used in educational research and is sufficiently flexible to allow for an exploration of the phenomenon. © 2014 Wiley Periodicals, Inc.
Development of policies for Natura 2000 sites: a multi-criteria approach to support decision makers.
Cortina, Carla; Boggia, Antonio
2014-08-01
The aim of this study is to present a methodology to support decision makers in the choice of Natura 2000 sites needing an appropriate management plan to ensure a sustainable socio-economic development. In order to promote sustainable development in the Natura 2000 sites compatible with nature preservation, conservation measures or management plans are necessary. The main issue is to decide when only conservation measures can be applied and when the sites need an appropriate management plan. We present a case study for the Italian Region of Umbria. The methodology is based on a multi-criteria approach to identify the biodiversity index (BI), and on the development of a human activities index (HAI). By crossing the two indexes for each site on a Cartesian plane, four groups of sites were identified. Each group corresponds to a specific need for an appropriate management plan. Sites in the first group with a high level both of biodiversity and human activities have the most urgent need of an appropriate management plan to ensure sustainable development. The proposed methodology and analysis is replicable in other regions or countries by using the data available for each site in the Natura 2000 standard data form. A multi-criteria analysis is especially suitable for supporting decision makers when they deal with a multidimensional decision process. We found the multi-criteria approach particularly sound in this case, due to the concept of biodiversity itself, which is complex and multidimensional, and to the high number of alternatives (Natura 2000 sites) to be assessed. Copyright © 2014 Elsevier Ltd. All rights reserved.
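The crossing of the two indexes on a Cartesian plane described above amounts to a quadrant classification, which can be sketched directly. The threshold values, example sites, and the labels for groups 2-4 below are invented; the abstract only specifies that the high-BI, high-HAI group has the most urgent need for a management plan.

```python
# Quadrant classification of Natura 2000 sites by biodiversity index (BI)
# and human activities index (HAI). Thresholds, site values, and the
# group-2/3/4 labels are hypothetical.

def classify(bi, hai, bi_cut=0.5, hai_cut=0.5):
    if bi >= bi_cut and hai >= hai_cut:
        return "1: management plan urgent"        # high biodiversity, high pressure
    if bi >= bi_cut:
        return "2: conservation priority"
    if hai >= hai_cut:
        return "3: monitor human activities"
    return "4: conservation measures sufficient"

sites = {"site_a": (0.8, 0.7), "site_b": (0.9, 0.2),
         "site_c": (0.3, 0.9), "site_d": (0.2, 0.1)}
groups = {name: classify(bi, hai) for name, (bi, hai) in sites.items()}
```

Because the inputs come from the Natura 2000 standard data form, the same classification can be re-run unchanged for any region's sites, which is the replicability claim in the abstract.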
Bayesian outcome-based strategy classification.
Lee, Michael D
2016-03-01
Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principles that balances both the goodness-of-fit and complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
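The essence of outcome-based Bayesian strategy classification is computing, for each candidate decision model, the posterior probability that a participant used it given their observed choices. The toy below uses two invented stand-in models with fixed trial-by-trial predictions; the paper's JAGS graphical model additionally infers error-rate parameters rather than fixing them.

```python
# Minimal Bayesian strategy classification: posterior over candidate
# decision models given observed binary choices. Models and their
# predicted choice probabilities are invented stand-ins for TTB and WADD.
from math import prod

def posteriors(choices, predictions, prior=None):
    """choices: list of 0/1 observed picks. predictions[model]: P(choice=1)
    per trial. Returns normalized P(model | choices), uniform prior by default."""
    prior = prior or {m: 1.0 / len(predictions) for m in predictions}
    like = {
        m: prod(p if c == 1 else 1.0 - p for c, p in zip(choices, ps))
        for m, ps in predictions.items()
    }
    z = sum(prior[m] * like[m] for m in like)
    return {m: prior[m] * like[m] / z for m in like}

predictions = {
    "TTB-like":  [0.9, 0.9, 0.1, 0.9],
    "WADD-like": [0.9, 0.1, 0.9, 0.9],
}
post = posteriors([1, 1, 0, 1], predictions)
```

Here the observed choices match the "TTB-like" predictions on every trial, so its posterior dominates; with more trials the classification sharpens, which is the behavior full MCMC-based inference generalizes.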
Development of a support tool for complex decision-making in the provision of rural maternity care.
Hearns, Glen; Klein, Michael C; Trousdale, William; Ulrich, Catherine; Butcher, David; Miewald, Christiana; Lindstrom, Ronald; Eftekhary, Sahba; Rosinski, Jessica; Gómez-Ramírez, Oralia; Procyk, Andrea
2010-02-01
Decisions in the organization of safe and effective rural maternity care are complex, difficult, value laden and fraught with uncertainty, and must often be based on imperfect information. Decision analysis offers tools for addressing these complexities in order to help decision-makers determine the best use of resources and to appreciate the downstream effects of their decisions. To develop a maternity care decision-making tool for the British Columbia Northern Health Authority (NH) for use in low birth volume settings. Based on interviews with community members, providers, recipients and decision-makers, and employing a formal decision analysis approach, we sought to clarify the influences affecting rural maternity care and develop a process to generate a set of value-focused objectives for use in designing and evaluating rural maternity care alternatives. Four low-volume communities with variable resources (with and without on-site births, with or without caesarean section capability) were chosen. Physicians (20), nurses (18), midwives and maternity support service providers (4), local business leaders, economic development officials and elected officials (12), First Nations (women [pregnant and non-pregnant], chiefs and band members) (40), social workers (3), pregnant women (2) and NH decision-makers/administrators (17). We developed a Decision Support Manual to assist with assessing community needs and values, context for decision-making, capacity of the health authority or healthcare providers, identification of key objectives for decision-making, developing alternatives for care, and a process for making trade-offs and balancing multiple objectives. The manual was deemed an effective tool for the purpose by the client, NH. Beyond assisting the decision-making process itself, the methodology provides a transparent communication tool to assist in making difficult decisions. 
While the manual was specifically intended to deal with rural maternity issues, the NH decision-makers feel the method can be easily adapted to assist decision-making in other contexts in medicine where there are conflicting objectives, values and opinions. Decisions on the location of new facilities or infrastructure, or enhancing or altering services such as surgical or palliative care, would be examples of complex decisions that might benefit from this methodology.
Yap, H Y; Nixon, J D
2015-12-01
Energy recovery from municipal solid waste plays a key role in sustainable waste management and energy security. However, there are numerous technologies that vary in suitability for different economic and social climates. This study sets out to develop and apply a multi-criteria decision making methodology that can be used to evaluate the trade-offs between the benefits, opportunities, costs and risks of alternative energy from waste technologies in both developed and developing countries. The technologies considered are mass burn incineration, refuse derived fuel incineration, gasification, anaerobic digestion and landfill gas recovery. By incorporating qualitative and quantitative assessments, a preference ranking of the alternative technologies is produced. The effect of variations in decision criteria weightings are analysed in a sensitivity analysis. The methodology is applied principally to compare and assess energy recovery from waste options in the UK and India. These two countries have been selected as they could both benefit from further development of their waste-to-energy strategies, but have different technical and socio-economic challenges to consider. It is concluded that gasification is the preferred technology for the UK, whereas anaerobic digestion is the preferred technology for India. We believe that the presented methodology will be of particular value for waste-to-energy decision-makers in both developed and developing countries. Copyright © 2015 Elsevier Ltd. All rights reserved.
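A stripped-down version of the ranking exercise above is a weighted sum over normalized criteria scores, with sensitivity checked by re-ranking under different weights. The technology scores and weights below are invented (and deliberately chosen so the toy reproduces the abstract's headline rankings); the paper's benefits-opportunities-costs-risks structure is richer than this.

```python
# Illustrative weighted-sum multi-criteria ranking of energy-from-waste
# options, with country-specific weights as a crude sensitivity analysis.
# All scores and weights are hypothetical.

def rank(scores, weights):
    totals = {t: sum(weights[c] * s[c] for c in weights) for t, s in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

scores = {  # normalized 0-1, higher is better (cost/risk criteria pre-inverted)
    "gasification":        {"benefit": 0.8, "cost": 0.4, "risk": 0.6},
    "anaerobic_digestion": {"benefit": 0.6, "cost": 0.8, "risk": 0.7},
    "landfill_gas":        {"benefit": 0.3, "cost": 0.9, "risk": 0.8},
}
uk_weights = {"benefit": 0.6, "cost": 0.2, "risk": 0.2}
india_weights = {"benefit": 0.3, "cost": 0.5, "risk": 0.2}
uk_rank, india_rank = rank(scores, uk_weights), rank(scores, india_weights)
```

Shifting weight from benefits toward costs flips the top choice, which is exactly the kind of weight sensitivity the paper's analysis examines.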
Optimizing Force Deployment and Force Structure for the Rapid Deployment Force
1984-03-01
[OCR-garbled table-of-contents and abstract fragments; recoverable text follows.] Contents include: Analysis; Experimental Design; IX. Use of a Flexible Response Surface. "... programming methodology, where the required system ... is input and the model optimizes the number, type, cargo ... to obtain new computer outputs" (Ref 38:23). The methodology can be used with any decision model, linear or nonlinear.
Tsalatsanis, Athanasios; Barnes, Laura E; Hozo, Iztok; Djulbegovic, Benjamin
2011-12-23
Despite the well-documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority receiving hospice care either too early or too late. Decision systems to improve the hospice referral process are sorely needed. We present a novel theoretical framework, based on well-established methodologies of prognostication and decision analysis, to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed from the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere with internet access. We present a theoretical framework to facilitate the hospice referral process.
Further rigorous clinical evaluation including testing in a prospective randomized controlled trial is required and planned.
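The "threshold probability" step in the abstract above follows the classic Pauker-Kassirer threshold model: the patient is indifferent at the probability where expected harms and benefits balance. The sketch below uses that standard formula; the specific harm/benefit weights and the predicted probability are invented (the paper derives the latter from the SUPPORT model, which is not reimplemented here).

```python
# Threshold-probability decision rule, Pauker-Kassirer style.
# Weights and probabilities below are illustrative.

def threshold_probability(harm, benefit):
    """Pt = harm / (harm + benefit): act when the predicted probability
    of the outcome exceeds Pt."""
    return harm / (harm + benefit)

def recommend(p_death, harm, benefit):
    if p_death > threshold_probability(harm, benefit):
        return "refer to hospice"
    return "continue treatment"

# A patient who rates a premature referral as one-third as bad as a
# missed one has Pt = 1/(1+3) = 0.25:
decision = recommend(p_death=0.45, harm=1.0, benefit=3.0)
```

Eliciting the harm/benefit trade-off from the patient (step one in the abstract) is what personalizes Pt, so two patients with the same prognosis can correctly receive different recommendations.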
Contaminated site cleanups involving complex activities may benefit from a detailed environmental footprint analysis to inform decision-making about application of suitable best management practices for greener cleanups.
Methodological Quality of Consensus Guidelines in Implant Dentistry.
Faggion, Clovis Mariano; Apaza, Karol; Ariza-Fritas, Tania; Málaga, Lilian; Giannakopoulos, Nikolaos Nikitas; Alarcón, Marco Antonio
2017-01-01
Consensus guidelines are useful to improve clinical decision making. Therefore, the methodological evaluation of these guidelines is of paramount importance. Low-quality information may lead to inadequate or harmful clinical decisions. To evaluate the methodological quality of consensus guidelines published in implant dentistry using a validated methodological instrument. The six implant dentistry journals with impact factors were scrutinised for consensus guidelines related to implant dentistry. Two assessors independently selected consensus guidelines, and four assessors independently evaluated their methodological quality using the Appraisal of Guidelines for Research & Evaluation (AGREE) II instrument. Disagreements in the selection and evaluation of guidelines were resolved by consensus. First, the consensus guidelines were analysed alone. Then, systematic reviews conducted to support the guidelines were included in the analysis. Non-parametric statistics for dependent variables (Wilcoxon signed rank test) was used to compare both groups. Of 258 initially retrieved articles, 27 consensus guidelines were selected. Median scores in four domains (applicability, rigour of development, stakeholder involvement, and editorial independence), expressed as percentages of maximum possible domain scores, were below 50% (median, 26%, 30.70%, 41.70%, and 41.70%, respectively). The consensus guidelines and consensus guidelines + systematic reviews data sets could be compared for 19 guidelines, and the results showed significant improvements in all domain scores (p < 0.05). Methodological improvement of consensus guidelines published in major implant dentistry journals is needed. The findings of the present study may help researchers to better develop consensus guidelines in implant dentistry, which will improve the quality and trustworthiness of the information needed to make proper clinical decisions.
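The domain percentages reported above are AGREE II scaled scores: each item is rated 1-7 by each appraiser, and the domain total is rescaled between its minimum and maximum possible values. The formula is standard AGREE II; the appraiser counts and ratings below are invented.

```python
# AGREE II scaled domain score. Example ratings are hypothetical.

def scaled_domain_score(ratings):
    """ratings: one list of 1-7 item ratings per appraiser, for one domain.
    Returns (obtained - min possible) / (max possible - min possible) * 100."""
    n_appraisers, n_items = len(ratings), len(ratings[0])
    obtained = sum(sum(r) for r in ratings)
    minimum = 1 * n_items * n_appraisers
    maximum = 7 * n_items * n_appraisers
    return 100.0 * (obtained - minimum) / (maximum - minimum)

# Four appraisers rating a hypothetical three-item domain:
score = scaled_domain_score([[4, 5, 3], [4, 4, 4], [5, 5, 4], [3, 4, 5]])
```

Rescaling by the possible range is what makes domains with different item counts comparable, which is why the study can report all four problem domains against the same 50% benchmark.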
Supply chain optimization for pediatric perioperative departments.
Davis, Janice L; Doyle, Robert
2011-09-01
Economic challenges compel pediatric perioperative departments to reduce nonlabor supply costs while maintaining the quality of patient care. Optimization of the supply chain introduces a framework for decision making that drives fiscally responsible decisions. The cost-effective supply chain is driven by implementing a value analysis process for product selection, being mindful of product sourcing decisions to reduce supply expense, creating logistical efficiency that will eliminate redundant processes, and managing inventory to ensure product availability. The value analysis approach is an analytical methodology for product selection that involves product evaluation and recommendation based on consideration of clinical benefit, overall financial impact, and revenue implications. Copyright © 2011 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Wolfslehner, Bernhard; Seidl, Rupert
2010-12-01
The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.
NASA Technical Reports Server (NTRS)
Christie, Vanessa L.; Landess, David J.
2012-01-01
In the international arena, decision makers are often swayed away from fact-based analysis by their own individual cultural and political bias. Modeling and Simulation-based training can raise awareness of individual predisposition and improve the quality of decision making by focusing solely on fact vice perception. This improved decision making methodology will support the multinational collaborative efforts of military and civilian leaders to solve challenges more effectively. The intent of this experimental research is to create a framework that allows decision makers to "come to the table" with the latest and most significant facts necessary to determine an appropriate solution for any given contingency.
Field of Study Choice: Using Conjoint Analysis and Clustering
ERIC Educational Resources Information Center
Shtudiner, Ze'ev; Zwilling, Moti; Kantor, Jeffrey
2017-01-01
Purpose: The purpose of this paper is to measure students' preferences regarding various attributes that affect their decision process while choosing a higher education area of study. Design/Methodology/Approach: The paper exhibits two different models which shed light on the perceived value of each examined area of study: conjoint analysis and…
The US EPA's ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Automatic Target Recognition Classification System Evaluation Methodology
2002-09-01
Excerpts: testing set of two-class XOR data (250 samples); decision analysis process flow chart; ROC curve meta-analysis, i.e., the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies; the technique can be very effective in sensitivity analysis, i.e., determining which data points have the most effect on the solution.
INDOOR AIR ASSESSMENT - A REVIEW OF INDOOR AIR QUALITY RISK CHARACTERIZATION
Risk assessment methodologies provide a mechanism for incorporating scientific evidence and judgments into the risk management decision process. A risk characterization framework has been developed to provide a systematic approach for analysis and presentation of risk characterizati...
Close Combat Missile Methodology Study
2010-10-14
Modeling: Industrial Applications of DEX.” Informatica 23 (1999): 487-491. Bohanec, Marko, Blaz Zupan, and Vladislav Rajkovic. “Applications of... Lisec. “Multi-attribute Decision Analysis in GIS: Weighted Linear Combination and Ordered Weighted Averaging.” Informatica 33 (2009): 459-474.
Probabilistic Flood Maps to support decision-making: Mapping the Value of Information
NASA Astrophysics Data System (ADS)
Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.
2016-02-01
Floods are among the most frequent and disruptive natural hazards affecting humans, and significant flood damage is documented worldwide every year. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is therefore required. In this paper, a novel methodology based on decision-making under uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
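The core VOI idea above can be sketched with a toy calculation: for a single floodplain cell, the Expected Value of Perfect Information (EVPI) is the gap between the expected cost of the best action chosen under the probabilistic flood map alone and the expected cost when the flood outcome is known in advance. The two-action setting (protect vs. do nothing) and all numbers are illustrative assumptions, not the paper's actual model.

```python
def evpi(p_flood, damage, protection_cost):
    """Expected Value of Perfect Information for one floodplain cell.

    Without extra information the planner picks the action with the
    lowest expected cost; with perfect information the best action is
    chosen separately for the flood / no-flood outcomes.
    """
    # Expected cost of the best action under the probabilistic flood map alone
    cost_no_info = min(protection_cost, p_flood * damage)
    # Expected cost if the flood outcome were known before acting
    # (no flood -> do nothing at zero cost)
    cost_perfect = p_flood * min(protection_cost, damage)
    return cost_no_info - cost_perfect

# A "VOI map" over hypothetical cells: information is most valuable where
# the decision is genuinely ambiguous (p * damage close to the protection cost).
cells = [0.05, 0.3, 0.5, 0.9]
voi_map = [evpi(p, damage=100.0, protection_cost=40.0) for p in cells]
```

As the cell values show, VOI peaks where expected damage and protection cost nearly balance, which is exactly where extra survey or monitoring data would change the planning decision.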
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, Regulatory Agencies require tools to evaluate outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
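The multi-objective formulation described above can be illustrated with a minimal sketch: a single decision variable (fractional precursor emission reduction), a surrogate nonlinear source-receptor function for the PM10 indicator, a convex abatement cost function, and a feasibility constraint. The functional forms and coefficients are assumptions for illustration only, not the identified GAMES-based models.

```python
import math

def pm10_index(r):
    # Surrogate nonlinear source-receptor relation (assumed form):
    # PM10 indicator falls off exponentially with the reduction fraction r.
    return 50.0 * math.exp(-2.0 * r)

def abatement_cost(r):
    # Assumed technology cost curve: rises steeply toward the feasibility limit.
    return 100.0 * r / (1.0 - 0.9 * r)

R_MAX = 0.8  # constraint: maximum feasible precursor emission reduction
grid = [i * R_MAX / 80 for i in range(81)]
points = [(pm10_index(r), abatement_cost(r)) for r in grid]

def pareto_front(points):
    """Keep only non-dominated (PM10 index, cost) pairs (both minimized)."""
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)]

front = pareto_front(points)
```

Because the indicator decreases and the cost increases monotonically in r, every feasible point here is efficient: the front is exactly the trade-off curve handed to the decision maker.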
Strategic Decision Making Cycle in Higher Education: Case Study of E-Learning
ERIC Educational Resources Information Center
Divjak, Blaženka; Redep, Nina Begicevic
2015-01-01
This paper presents the methodology for strategic decision making in higher education (HE). The methodology is structured as a cycle of strategic decision making with four phases, and it is focused on institutional and national perspective, i.e. on decision making that takes place at institutions of HE and relevant national authorities, in case…
Risk-based economic decision analysis of remediation options at a PCE-contaminated site.
Lemming, Gitte; Friis-Hansen, Peter; Bjerg, Poul L
2010-05-01
Remediation methods for contaminated sites cover a wide range of technical solutions with different remedial efficiencies and costs. Additionally, they may vary in their secondary impacts on the environment, i.e. the potential impacts generated by emissions and resource use caused by the remediation activities. Increasing attention is being given to these secondary environmental impacts when evaluating remediation options. This paper presents a methodology for an integrated economic decision analysis which combines assessments of remediation costs, health risk costs and potential environmental costs. The health risk costs are associated with the residual contamination left at the site and its migration to groundwater used for drinking water. A probabilistic exposure model using first- and second-order reliability methods (FORM/SORM) is used to estimate the contaminant concentrations at a downstream groundwater well. Potential environmental impacts on the local, regional and global scales due to the site remediation activities are evaluated using life cycle assessment (LCA). The potential impacts on health and environment are converted to monetary units using a simplified cost model. A case study based upon the developed methodology is presented in which the following remediation scenarios are analyzed and compared: (a) no action, (b) excavation and off-site treatment of soil, (c) soil vapor extraction and (d) thermally enhanced soil vapor extraction by electrical heating of the soil. Ultimately, the developed methodology facilitates societal cost estimations of remediation scenarios which can be used for internal ranking of the analyzed options. Despite the inherent uncertainties of placing a value on health and environmental impacts, the presented methodology is believed to be valuable in supporting decisions on remedial interventions. Copyright 2010 Elsevier Ltd. All rights reserved.
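A simplified sketch of the integrated cost comparison follows, with hypothetical costs and a crude Monte Carlo stand-in for the FORM/SORM exposure analysis (the beta-distributed exceedance probabilities, scenario names and all monetary values are invented for illustration, not taken from the paper):

```python
import random

def societal_cost(remediation_cost, env_cost, p_exceed_samples, health_cost_if_exceeded):
    """Total expected societal cost of one remediation scenario (illustrative).

    p_exceed_samples: Monte Carlo samples of the probability that the
    downstream-well concentration exceeds the drinking-water criterion
    (in the paper this would come from the FORM/SORM reliability analysis).
    """
    mean_p = sum(p_exceed_samples) / len(p_exceed_samples)
    return remediation_cost + env_cost + mean_p * health_cost_if_exceeded

random.seed(1)  # reproducible draws for the sketch
scenarios = {
    # no action: cheap now, but high residual health risk
    "no action":        societal_cost(0.0,   0.0, [random.betavariate(8, 2) for _ in range(1000)], 5e6),
    # excavation: expensive, heavy secondary (LCA) impacts, near-zero residual risk
    "excavation":       societal_cost(2e6, 4e5, [random.betavariate(1, 99) for _ in range(1000)], 5e6),
    # vapor extraction: intermediate cost and residual risk
    "vapor extraction": societal_cost(1e6, 2e5, [random.betavariate(1, 19) for _ in range(1000)], 5e6),
}
ranking = sorted(scenarios, key=scenarios.get)  # cheapest societal cost first
```

The point of the sketch is the internal ranking: once health and environmental impacts are monetized, all three cost components can be summed and the scenarios ordered on a single axis.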
Cross-Border Healthcare Requests to Publicly Funded Healthcare Insurance: Empirical Analysis.
Stewart Ferreira, Lydia
2016-02-01
Despite the legal authority to confirm, override or modify healthcare insurance decisions made by physicians and government officials, health tribunal decisions have not been empirically analyzed. Using a novel quantitative methodology, all 387 Health Services Appeal and Review Board written and publicly available electronic decisions released over a five-year time period were statistically analyzed with respect to Ontario public health insurance requests for global cross-border healthcare. The statistical results found that patients knew their diagnosis prior to requesting cross-border healthcare, and 84% of patients requested specific northern US facilities for specific treatment. Two specific healthcare facilities in the US were requested for either surgery or assessments. A significant number of patients were seeking cross-border healthcare for pain treatment. This research challenges the assumption that cross-border treatment requests result only from domestic delay; instead, patients are seeking specific treatments at specific facilities. This novel quantitative research methodology and data source of written and publicly available electronic Health Services Appeal and Review Board decisions should be used to inform policy decisions regarding the utilization and evaluation of Canada's healthcare system and publicly funded healthcare insurance. Copyright © 2016 Longwoods Publishing.
Hartz, Susanne; John, Jürgen
2008-01-01
Economic evaluation as an integral part of health technology assessment is today mostly applied to established technologies. Evaluating healthcare innovations in their early stages of development has recently attracted attention. Although it offers several benefits, it also holds methodological challenges. The aim of our study was to investigate the possible contributions of economic evaluation to industry's decision making early in product development and to confront the results with the actual use of early data in economic assessments. We conducted a literature search to detect methodological contributions as well as economic evaluations that used data from early phases of product development. Economic analysis can be beneficially used in early phases of product development for various purposes including early market assessment, R&D portfolio management, and first estimations of pricing and reimbursement scenarios. Analytical tools available for these purposes have been identified. Numerous empirical works were detected, but most do not disclose any concrete decision context and could not be directly matched with the suggested applications. Industry can benefit from starting economic evaluation early in product development in several ways. Empirical evidence suggests that there is still potential left unused.
Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E.; Harper, Martin
2015-01-01
This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451
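The idea of deriving a Bayesian prior from a semi-quantitative banding scheme can be sketched as follows. The band-to-lognormal mapping and the geometric standard deviation below are invented for illustration and do not reproduce the actual SSA or COSHH Essentials tables; the exposure categories loosely follow the AIHA-style bands used in Bayesian decision analysis tools.

```python
import math
import random

# Hypothetical mapping from a control band to a lognormal prior on
# exposure, expressed as a fraction of the occupational exposure limit (OEL).
BAND_GM = {"low": 0.05, "medium": 0.3, "high": 1.2}  # geometric means (assumed)
GSD = 2.5                                            # geometric SD (assumed)

def prior_category_probs(band, n=20000, seed=7):
    """Monte Carlo prior probability that exposure falls in AIHA-style
    categories: <10% OEL, 10-50%, 50-100%, >100% of the OEL."""
    random.seed(seed)
    gm, s = BAND_GM[band], math.log(GSD)
    counts = [0, 0, 0, 0]
    for _ in range(n):
        x = math.exp(random.gauss(math.log(gm), s))  # lognormal draw
        if x < 0.1:
            counts[0] += 1
        elif x < 0.5:
            counts[1] += 1
        elif x < 1.0:
            counts[2] += 1
        else:
            counts[3] += 1
    return [c / n for c in counts]

probs = prior_category_probs("medium")
```

The resulting category probabilities would then serve as the prior distribution that exposure measurements update via Bayes' rule.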
NASA Technical Reports Server (NTRS)
Weissenberger, S. (Editor)
1973-01-01
A systems engineering approach is reported for the problem of reducing the number and severity of California's wildland fires. Prevention methodologies are reviewed and cost-benefit models are developed for making preignition decisions.
FY 1998 Proposed Rail Improvement Program Supplement Update - Bloomington II
DOT National Transportation Integrated Search
1997-07-01
The purpose of this amendment to the FY 1998 Rail Improvement Program Supplement : is to present an analysis which has been formulated, using prescribed methodology, to assist in an investment decision concerning track rehabilitation and new track co...
DOT National Transportation Integrated Search
2009-10-28
Transportation agencies strive to maintain their systems in good condition and also to provide : acceptable levels of service to users. However, funding is often inadequate to meet the needs : of system preservation and expansion, and thus performanc...
FY 1998 Proposed Rail Improvement Program Supplement Update - Pontiac
DOT National Transportation Integrated Search
1997-07-01
The purpose of this amendment to the FY 1998 Rail Improvement Program Supplement is to present an analysis which has been formulated, using prescribed methodology, to assist in an investment decision concerning new track construction at the Prairie C...
Analysis of maintenance costing with emphasis on contracting versus using state forces.
DOT National Transportation Integrated Search
1982-01-01
The authors present the findings of a study to develop a methodology for analyzing decisions of whether to perform ordinary maintenance, maintenance replacement, and incidental construction with state forces or to let them to contract. In developing ...
Simulation analysis of route diversion strategies for freeway incident management : final report.
DOT National Transportation Integrated Search
1995-02-01
The purpose of this project was to investigate whether simulation models could : be used as decision aids for defining traffic diversion strategies for effective : incident management. A methodology was developed for using such a model to : determine...
[Research Biomedical Ethics and Practical Wisdom].
Vergara, Oscar
2015-01-01
As is well known, several methodological proposals have been put forward in the field of biomedical ethics. They aim to provide guidelines for making sound decisions. These methodologies are quite useful insofar as they supply reasons for action, but they are essentially insufficient. In fact, making a good decision requires a special skill that goes beyond mere technique, and this skill is traditionally called practical wisdom: not in the usual, more peripheral sense of sheer caution, but in the more central one of phronesis or prudentia. Although it is not a new notion, it usually appears blurred in biomedical decision-making theory, playing the wrong role, or appearing only in a marginal or indefinite way. From this starting point, we undertake a twofold analysis. First, we try to show the need for a proper understanding of the core role that phronesis plays in decision making. Second, we try to recover the original meaning of Aristotelian phronesis. For reasons of space, the second question is addressed only partially in this paper.
Analysis of methods of processing of expert information by optimization of administrative decisions
NASA Astrophysics Data System (ADS)
Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.
2018-03-01
This paper proposes a methodology for defining measures in the expert evaluation of the quality and reliability of applied software products. Methods for aggregating expert estimates are described using the example of a collective choice among candidate tool-support projects for the development of special-purpose software for institutional needs. Results from an interactive decision-support system are presented, together with an algorithm that solves the selection task on the basis of the analytic hierarchy process. The developed algorithm can be applied in building expert systems for a wide class of problems that involve a multi-criteria choice.
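The analytic hierarchy process (AHP) step mentioned in the abstract can be sketched with the standard geometric-mean approximation of the priority vector; the Saaty-scale pairwise judgments below are hypothetical, not taken from the paper.

```python
import math

def ahp_weights(matrix):
    """Geometric-mean approximation of AHP priority weights from a
    reciprocal pairwise-comparison matrix."""
    n = len(matrix)
    # Geometric mean of each row, then normalize to sum to 1.
    gms = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical judgments for three candidate tools on the Saaty 1-9 scale:
# A is moderately preferred to B (3) and strongly preferred to C (5);
# B is slightly preferred to C (2). Lower-triangle entries are reciprocals.
pairwise = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(pairwise)  # highest weight -> preferred alternative
```

In a full AHP application the same computation is repeated per criterion and the results combined up the hierarchy; a consistency-ratio check on each matrix is also customary.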
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)
2002-01-01
This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
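A minimal sketch of blending parametric, analogy, and expert-opinion estimates follows. The parametric coefficients loosely echo published COCOMO II nominal values but are used purely illustratively, and the blending weights, reference cost, and expert figure are invented assumptions.

```python
def parametric_cost(kloc, a=2.94, b=1.0997):
    """COCOMO-style parametric effort estimate (person-months) from
    software size in KLOC. Coefficients are nominal illustrative values."""
    return a * kloc ** b

def analogy_cost(reference_cost, scale_factor):
    """Analogy estimate: scale the known cost of a similar, already
    fielded decision support tool."""
    return reference_cost * scale_factor

# Blend the three methods as a weighted opinion pool (weights are assumptions).
estimates = {
    "parametric": parametric_cost(50.0),        # 50 KLOC tool (assumed size)
    "analogy":    analogy_cost(180.0, 1.2),     # 20% larger than a prior DST
    "expert":     250.0,                        # elicited judgment (assumed)
}
weights = {"parametric": 0.5, "analogy": 0.3, "expert": 0.2}
lcc_estimate = sum(weights[k] * estimates[k] for k in estimates)
```

Which blend weights to use is itself a judgment call; the paper's three-axis matrix (coverage, estimation, timing) would determine which method dominates at each life-cycle phase.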
Francisco Rodríguez y Silva; Armando González-Cabán
2016-01-01
We propose an economic analysis using utility, productivity, and efficiency theories to provide fire managers a decision support tool to determine the most efficient fire management program levels. By incorporating managers' accumulated fire suppression experiences (capitalized experience) in the analysis we help fire managers...
Policy Implications Analysis: A Methodological Advancement for Policy Research and Evaluation.
ERIC Educational Resources Information Center
Madey, Doren L.; Stenner, A. Jackson
Policy Implications Analysis (PIA) is a tool designed to maximize the likelihood that an evaluation report will have an impact on decision-making. PIA was designed to help people planning and conducting evaluations tailor their information so that it has optimal potential for being used and acted upon. This paper describes the development and…
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia
ERIC Educational Resources Information Center
Gucev, Gligor V.
2012-01-01
Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…
Radiation Assurance for the Space Environment
NASA Technical Reports Server (NTRS)
Barth, Janet L.; LaBel, Kenneth A.; Poivey, Christian
2004-01-01
The space radiation environment can lead to extremely harsh operating conditions for spacecraft electronic systems. A hardness assurance methodology must be followed to assure that the space radiation environment does not compromise the functionality and performance of space-based systems during the mission lifetime. The methodology includes a definition of the radiation environment, assessment of the radiation sensitivity of parts, worst-case analysis of the impact of radiation effects, and part acceptance decisions which are likely to include mitigation measures.
Efficient Computation of Info-Gap Robustness for Finite Element Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.
2012-07-05
A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems, using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
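For a scalar stand-in of the Ax = b setting, the info-gap robustness function can even be written in closed form, which makes the accuracy-robustness trade-off visible without any optimization. The fractional-uncertainty model and the performance requirement below are assumptions for illustration, not the report's finite element formulation.

```python
def robustness(k_nominal, x, y_required, tolerance):
    """Info-gap robustness of the prediction y = k * x.

    Uncertainty model: the true coefficient k lies in
    [k_nominal * (1 - alpha), k_nominal * (1 + alpha)].
    Returns the largest horizon alpha for which the worst-case
    prediction error stays within the performance tolerance.
    """
    nominal_error = abs(k_nominal * x - y_required)
    if nominal_error > tolerance:
        return 0.0  # even the nominal model fails the requirement
    # Worst-case error over the interval occurs at an endpoint:
    #   max error = |k_nominal*x - y_required| + alpha * |k_nominal*x|
    # Setting it equal to the tolerance and solving for alpha:
    return (tolerance - nominal_error) / abs(k_nominal * x)
```

The closed form exhibits the core trade-off: demanding more accuracy (a smaller tolerance) shrinks robustness, and a model that only just meets the requirement has zero robustness to assumptions.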
Zozaya, Néboa; Martínez-Galdeano, Lucía; Alcalá, Bleric; Armario-Hita, Jose Carlos; Carmona, Concepción; Carrascosa, Jose Manuel; Herranz, Pedro; Lamas, María Jesús; Trapero-Bertran, Marta; Hidalgo-Vega, Álvaro
2018-06-01
Multi-criteria decision analysis (MCDA) is a tool that systematically considers multiple factors relevant to health decision-making. The aim of this study was to use an MCDA to assess the value of dupilumab for severe atopic dermatitis compared with secukinumab for moderate to severe plaque psoriasis in Spain. Following the EVIDEM (Evidence and Value: Impact on DEcision Making) methodology, the estimated value of both interventions was obtained by means of an additive linear model that combined the individual weighting (between 1 and 5) of each criterion with the individual scoring of each intervention in each criterion. Dupilumab was evaluated against placebo, while secukinumab was evaluated against placebo, etanercept and ustekinumab. A retest was performed to assess the reproducibility of weights, scores and value estimates. The overall MCDA value estimate for dupilumab versus placebo was 0.51 ± 0.14. This value was higher than those obtained for secukinumab: 0.48 ± 0.15 versus placebo, 0.45 ± 0.15 versus etanercept and 0.39 ± 0.18 versus ustekinumab. The highest-value contribution was reported by the patients' group, followed by the clinical professionals and the decision makers. A fundamental element that explained the difference in the scoring between pathologies was the availability of therapeutic alternatives. The retest confirmed the consistency and replicability of the analysis. Under this methodology, and assuming similar economic costs per patient for both treatments, the results indicated that the overall value estimated of dupilumab for severe atopic dermatitis was similar to, or slightly higher than, that of secukinumab for moderate to severe plaque psoriasis.
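The EVIDEM-style additive linear model described above can be sketched directly: normalized criterion weights are multiplied by normalized criterion scores and summed. The four criteria, the weights (on the 1-5 scale the abstract mentions), the scores, and the 0-3 score range are all hypothetical assumptions for illustration.

```python
def mcda_value(weights, scores, score_max=3.0):
    """EVIDEM-style additive linear value estimate in [0, 1]:
    normalized weights (1-5 scale) combined with normalized scores
    (assumed here to lie on a 0..score_max scale)."""
    total_w = sum(weights)
    return sum((w / total_w) * (s / score_max) for w, s in zip(weights, scores))

# Hypothetical 4-criterion appraisal of an intervention vs. its comparator:
# disease severity, comparative efficacy, safety, cost consequences.
weights = [5, 4, 3, 3]       # elicited importance weights (assumed)
scores = [2.5, 2.0, 1.0, 1.0]  # panel scores for the intervention (assumed)
value_estimate = mcda_value(weights, scores)
```

Reproducibility is checked in the paper by a retest: re-eliciting weights and scores and confirming the value estimate is stable, which this linear form makes straightforward to audit criterion by criterion.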
Energy-Water Nexus: Balancing the Tradeoffs between Two-Level Decision Makers
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
2016-09-03
The energy-water nexus has substantially increased in importance in recent years. Synergistic approaches based on systems analysis and mathematical models are critical for helping decision makers better understand the interrelationships and tradeoffs between energy and water. In energy-water nexus management, various decision makers with different goals and preferences, which are often conflicting, are involved. These decision makers may have different controlling power over the management objectives and the decisions. They make decisions sequentially from the upper level to the lower level, challenging decision making in the energy-water nexus. In order to address such planning issues, a bi-level decision model is developed, which improves upon the existing studies by integration of bi-level programming into energy-water nexus management. The developed model represents a methodological contribution to the challenge of sequential decision-making in the energy-water nexus through provision of an integrated modeling framework/tool. An interactive fuzzy optimization methodology is introduced to seek a satisfactory solution to meet the overall satisfaction of the two-level decision makers. The tradeoffs between the two-level decision makers in energy-water nexus management are effectively addressed and quantified. Application of the proposed model to a synthetic example problem has demonstrated its applicability in practical energy-water nexus management. Optimal solutions for electricity generation, fuel supply, water supply including groundwater, surface water and recycled water, capacity expansion of the power plants, and GHG emission control are generated. In conclusion, these analyses are capable of helping decision makers or stakeholders adjust their tolerances to make informed decisions to achieve the overall satisfaction of energy-water nexus management where a bi-level sequential decision-making process is involved.
ERIC Educational Resources Information Center
Koro-Ljungberg, Mirka; Yendol-Hoppey, Diane; Smith, Jason Jude; Hayes, Sharon B.
2009-01-01
This article explores epistemological awareness and instantiation of methods, as well as uninformed ambiguity, in qualitative methodological decision making and research reporting. The authors argue that efforts should be made to make the research process, epistemologies, values, methodological decision points, and argumentative logic open,…
10 CFR 300.11 - Independent verification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...
10 CFR 300.11 - Independent verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...
22 CFR 124.2 - Exemptions for training and military service.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., software source code, design methodology, engineering analysis or manufacturing know-how such as that... underlying engineering methods and design philosophy utilized (i.e., the “why” or information that explains the rationale for particular design decision, engineering feature, or performance requirement...
22 CFR 124.2 - Exemptions for training and military service.
Code of Federal Regulations, 2014 CFR
2014-04-01
..., software source code, design methodology, engineering analysis or manufacturing know-how such as that... underlying engineering methods and design philosophy utilized (i.e., the “why” or information that explains the rationale for particular design decision, engineering feature, or performance requirement...
22 CFR 124.2 - Exemptions for training and military service.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., software source code, design methodology, engineering analysis or manufacturing know-how such as that... underlying engineering methods and design philosophy utilized (i.e., the “why” or information that explains the rationale for particular design decision, engineering feature, or performance requirement...
22 CFR 124.2 - Exemptions for training and military service.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., software source code, design methodology, engineering analysis or manufacturing know-how such as that... underlying engineering methods and design philosophy utilized (i.e., the “why” or information that explains the rationale for particular design decision, engineering feature, or performance requirement...
Risk methodology overview. [for carbon fiber release
NASA Technical Reports Server (NTRS)
Credeur, K. R.
1979-01-01
Some considerations of risk estimation, how risk is measured, and how risk analysis decisions are made are discussed. Specific problems of carbon fiber release are discussed by reviewing the objective, describing the main elements, and giving an example of the risk logic and outputs.
An innovative and shared methodology for event reconstruction using images in forensic science.
Milliet, Quentin; Jendly, Manon; Delémont, Olivier
2015-09-01
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Subagadis, Y. H.; Schütze, N.; Grundmann, J.
2014-09-01
The conventional methods used to solve multi-criteria multi-stakeholder problems are weakly formulated: they normally incorporate only homogeneous information at a time and aggregate the objectives of different decision-makers while ignoring water-society interactions. In this contribution, Multi-Criteria Group Decision Analysis (MCGDA) using a fuzzy-stochastic approach is proposed to rank a set of alternatives in water management decisions, incorporating heterogeneous information under uncertainty. The decision-making framework takes hydrologically, environmentally, and socio-economically motivated conflicting objectives into consideration. The criteria related to the performance of the physical system are optimized using multi-criteria simulation-based optimization, and fuzzy linguistic quantifiers are used to evaluate subjective criteria and to assess stakeholders' degree of optimism. The proposed methodology is applied to find effective and robust intervention strategies for the management of a coastal hydrosystem affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. Preliminary results show that the MCGDA based on a fuzzy-stochastic approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.
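As a minimal illustration of the kind of aggregation such a framework relies on, the sketch below ranks alternatives with an Ordered Weighted Averaging (OWA) operator derived from a fuzzy linguistic quantifier such as "most". The alternatives, criterion scores and quantifier exponent are invented and are not taken from the study.

```python
# Sketch of OWA aggregation: weights are derived from a regular increasing
# monotone (RIM) quantifier Q(r) = r**alpha; alpha > 1 loosely models "most".
# All names and scores below are hypothetical illustration values.

def quantifier(r, alpha=2.0):
    return r ** alpha

def owa_weights(n, alpha=2.0):
    # w_i = Q(i/n) - Q((i-1)/n), i = 1..n; weights always sum to 1
    return [quantifier(i / n, alpha) - quantifier((i - 1) / n, alpha)
            for i in range(1, n + 1)]

def owa(scores, alpha=2.0):
    ordered = sorted(scores, reverse=True)  # best criterion score first
    w = owa_weights(len(scores), alpha)
    return sum(wi * si for wi, si in zip(w, ordered))

# hypothetical interventions scored on hydrologic, environmental and
# socio-economic criteria (memberships in [0, 1])
alternatives = {
    "managed aquifer recharge": [0.8, 0.6, 0.7],
    "reduced abstraction":      [0.9, 0.8, 0.4],
    "desalination supply":      [0.5, 0.7, 0.9],
}
ranking = sorted(alternatives, key=lambda a: owa(alternatives[a]), reverse=True)
print(ranking)
```

With alpha > 1 the operator emphasizes the weaker criterion scores, so alternatives that satisfy "most" criteria reasonably well beat those that excel on one criterion but fail another.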
Collaborative Strategic Decision Making in School Districts
ERIC Educational Resources Information Center
Brazer, S. David; Rich, William; Ross, Susan A.
2010-01-01
Purpose: The dual purpose of this paper is to determine how superintendents in US school districts work with stakeholders in the decision-making process and to learn how different choices superintendents make affect decision outcomes. Design/methodology/approach: This multiple case study of three school districts employs qualitative methodology to…
2009-03-01
making process (Skinner, 2001, 9). According to Clemen, before we can begin to apply any methodology to a specific decision problem, the analyst...it is possible to work with them to determine the values and objectives that relate to the decision in question (Clemen, 2001, 21). Clemen...value hierarchy is constructed, Clemen and Reilly suggest that a trade-off is made between varying objectives. They introduce weights to determine
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
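As a far simpler stand-in for the latent class analysis, loglinear modeling and Bayesian networking described above, the sketch below ranks categorical variables by their "lift" on a severe outcome, i.e. how much more likely the outcome becomes when the factor is present. The incident records and factor names are invented.

```python
# Crude empirical ranking of categorical risk factors by lift:
# lift(f) = P(severe | f present) / P(severe). Records are hypothetical.

incidents = [
    {"container_failure": 1, "human_error": 0, "severe": 1},
    {"container_failure": 1, "human_error": 1, "severe": 1},
    {"container_failure": 0, "human_error": 1, "severe": 0},
    {"container_failure": 0, "human_error": 0, "severe": 0},
    {"container_failure": 1, "human_error": 0, "severe": 1},
    {"container_failure": 0, "human_error": 1, "severe": 1},
]

def lift(records, factor, outcome="severe"):
    base = sum(r[outcome] for r in records) / len(records)
    with_f = [r for r in records if r[factor]]
    return (sum(r[outcome] for r in with_f) / len(with_f)) / base

factors = ["container_failure", "human_error"]
ranked = sorted(factors, key=lambda f: lift(incidents, f), reverse=True)
print(ranked)
```

In this toy data, container failure has the higher lift, echoing (but not reproducing) the paper's finding that container-related variables were most influential.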
A Methodology to Support Decision Making in Flood Plan Mitigation
NASA Astrophysics Data System (ADS)
Biscarini, C.; di Francesco, S.; Manciola, P.
2009-04-01
The focus of the present document is on specific decision-making aspects of flood risk analysis. A flood is the result of runoff from rainfall in quantities too great to be confined in the low-water channels of streams. Little can be done to prevent a major flood, but we may be able to minimize damage within the flood plain of the river. This broad definition encompasses many possible mitigation measures. Floodplain management considers the integrated view of all engineering, nonstructural, and administrative measures for managing (minimizing) losses due to flooding on a comprehensive scale. The structural measures are the flood-control facilities designed according to flood characteristics and they include reservoirs, diversions, levees or dikes, and channel modifications. Flood-control measures that modify the damage susceptibility of floodplains are usually referred to as nonstructural measures and may require minor engineering works. On the other hand, those measures designed to modify the damage potential of permanent facilities are called non-structural and allow reducing potential damage during a flood event. Technical information is required to support the tasks of problem definition, plan formulation, and plan evaluation. The specific information needed and the related level of detail are dependent on the nature of the problem, the potential solutions, and the sensitivity of the findings to the basic information. Actions performed to set up and lay out the study are preliminary to the detailed analysis. They include: defining the study scope and detail, the field data collection, a review of previous studies and reports, and the assembly of needed maps and surveys. Risk analysis can be viewed as having many components: risk assessment, risk communication and risk management. 
Risk assessment comprises an analysis of the technical aspects of the problem, risk communication deals with conveying the information, and risk management involves the decision process. In the present paper we propose a novel methodology for supporting priority setting in the assessment of such issues, beyond the typical "expected value" approach. Scientific contributions and management aspects are merged to create a simplified method for basin plan implementation, based on risk and economic analyses. However, the economic evaluation is not the sole criterion for selecting a flood-damage reduction plan. Among the different criteria that are relevant to the decision process, safety and quality of human life, economic damage, expenses related to the chosen measures, and environmental issues should play a fundamental role in the decisions made by the authorities. Numerical indices, taking into account administrative, technical, economic and risk aspects, are defined and combined in a mathematical formula that defines a Priority Index (PI). In particular, the priority index defines a ranking of priority interventions, thus allowing the formulation of the investment plan. The research is mainly focused on the technical factors of risk assessment, providing quantitative and qualitative estimates of possible alternatives, together with measures of the risk associated with those alternatives. Moreover, the issues of risk management are analyzed, in particular with respect to the role of decision making in the presence of risk information. A great effort is devoted to making this index easy to formulate and effective, allowing a clear and transparent comparison between the alternatives. 
Summarizing, this document describes the major steps for incorporating risk analysis into the decision-making process: framing the problem in terms of risk analysis, applying appropriate tools and techniques to obtain quantified results, and using the quantified results in the choice of structural and non-structural measures. To demonstrate the reliability of the proposed methodology and to show how risk-based information can be incorporated into a flood analysis process, its application to several river basins in central Italy is presented. The methodology assessment is performed by comparing different scenarios and showing that the optimal decision stems from a feasibility evaluation.
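A priority index of the general kind described (the paper's actual PI formula is not reproduced here) can be sketched as a weighted combination of normalized sub-indices. The weights, intervention names and scores below are invented for illustration.

```python
# Hypothetical Priority Index (PI): weighted sum of normalized administrative,
# technical, economic and risk indices, used to rank flood-mitigation
# interventions into an investment plan. All numbers are invented.

def priority_index(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[k] * scores[k] for k in weights)

weights = {"administrative": 0.15, "technical": 0.25, "economic": 0.3, "risk": 0.3}

interventions = {
    "levee upgrade":        {"administrative": 0.6, "technical": 0.8, "economic": 0.5, "risk": 0.9},
    "channel modification": {"administrative": 0.7, "technical": 0.6, "economic": 0.7, "risk": 0.6},
    "floodplain zoning":    {"administrative": 0.9, "technical": 0.5, "economic": 0.9, "risk": 0.5},
}

# investment plan: highest-priority intervention first
plan = sorted(interventions,
              key=lambda i: priority_index(interventions[i], weights),
              reverse=True)
print(plan)
```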
Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.
1997-01-01
This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data, including video recording of health care workers as they interact with systems such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively based usability analyses are detailed, and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620
Initial Analysis of and Predictive Model Development for Weather Reroute Advisory Use
NASA Technical Reports Server (NTRS)
Arneson, Heather M.
2016-01-01
In response to severe weather conditions, traffic management coordinators specify reroutes to route air traffic around affected regions of airspace. Providing analysis and recommendations of available reroute options would assist the traffic management coordinators in making more efficient rerouting decisions. These recommendations can be developed by examining historical data to determine which previous reroute options were used in similar weather and traffic conditions, essentially using previous information to inform future decisions. This paper describes the initial steps and methodology used towards this goal. A method to extract relevant features from the large volume of weather data, quantifying the convective weather scenario during a particular time range, is presented. Similar routes are clustered. An algorithm to identify which cluster of reroute advisories was actually followed by pilots is described. Models built for fifteen of the top twenty most frequently used reroute clusters correctly predict the use of the cluster for over 60% of the test examples. Results are preliminary but indicate that the methodology is worth pursuing, with modifications based on insight gained from this analysis.
Event-scale power law recession analysis: quantifying methodological uncertainty
NASA Astrophysics Data System (ADS)
Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.
2017-01-01
The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. 
In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.
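A minimal sketch of the event-scale fitting problem discussed, assuming the common power-law recession model -dQ/dt = a·Q^b and one particular set of methodological choices (midpoint finite differences and ordinary least squares in log-log space). The streamflow series is synthetic; as the study stresses, other choices of event definition and fitting technique would yield somewhat different parameters.

```python
import math

# Fit -dQ/dt = a * Q**b to one recession event by least squares in log space.
# The choices made here (midpoint finite differences, OLS on logs) are only
# one of the many methodological options whose sensitivity the study examines.

def fit_recession(q, dt=1.0):
    x, y = [], []
    for i in range(len(q) - 1):
        dq = (q[i] - q[i + 1]) / dt          # -dQ/dt, positive on recession
        q_mid = 0.5 * (q[i] + q[i + 1])
        if dq > 0 and q_mid > 0:
            x.append(math.log(q_mid))
            y.append(math.log(dq))
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = math.exp(my - b * mx)
    return a, b

# synthetic recession generated from dQ/dt = -0.1 * Q**1.5 (Euler steps)
q, series = 10.0, []
for _ in range(30):
    series.append(q)
    q -= 0.1 * q ** 1.5

a, b = fit_recession(series)
print(round(a, 3), round(b, 3))   # b should recover roughly the exponent 1.5
```

Even in this clean synthetic case, the finite-difference scheme introduces a small bias in b, which is exactly the sort of method-dependence the paper quantifies.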
Deciding to Come Out to Parents: Toward a Model of Sexual Orientation Disclosure Decisions.
Grafsky, Erika L
2017-08-16
The purpose of this study was to conduct research to understand nonheterosexual youths' decision to disclose their sexual orientation information to their parents. The sample for this study includes 22 youth between the ages of 14 and 21. Constructivist grounded theory guided the qualitative methodology and data analysis. The findings from this study posit an emerging model of sexual orientation disclosure decisions comprised of four interrelated factors that influence the decision to disclose or not disclose, as well as a description of the mechanism through which disclosure either does or does not occur. Clinical implications and recommendations for further research are provided. © 2017 Family Process Institute.
Fuzzy approaches to supplier selection problem
NASA Astrophysics Data System (ADS)
Ozkok, Beyza Ahlatcioglu; Kocken, Hale Gonce
2013-09-01
Supplier selection is a multi-criteria decision-making problem that includes both qualitative and quantitative factors. In the selection process many criteria may conflict with each other, so the decision-making process becomes complicated. In this study, we handled the supplier selection problem under uncertainty. In this context, we used the minimum criterion, arithmetic mean criterion, regret criterion, optimistic criterion, geometric mean and harmonic mean. The membership functions were created with the help of the characteristics of the criteria used, and we aimed to provide consistent supplier selection decisions by using these memberships to evaluate alternative suppliers. A strong aspect of the methodology is that no expert opinion is needed during the analysis.
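Several of the aggregation criteria named above can be sketched directly on fuzzy membership degrees. The suppliers and membership values below are invented for illustration; the regret and optimistic criteria are omitted for brevity.

```python
import math

# Evaluate suppliers from fuzzy membership degrees under four of the
# aggregation criteria the study names. Data are hypothetical.

suppliers = {
    "supplier A": [0.9, 0.6, 0.8],   # e.g. quality, price, delivery memberships
    "supplier B": [0.7, 0.7, 0.7],
    "supplier C": [0.95, 0.3, 0.9],
}

aggregators = {
    "minimum":    min,                                       # pessimistic
    "arithmetic": lambda m: sum(m) / len(m),
    "geometric":  lambda m: math.prod(m) ** (1 / len(m)),
    "harmonic":   lambda m: len(m) / sum(1 / x for x in m),
}

for name, agg in aggregators.items():
    best = max(suppliers, key=lambda s: agg(suppliers[s]))
    print(f"{name:>10}: best = {best}")
```

Note how the criteria can disagree: the pessimistic minimum criterion favors the balanced supplier B, while the mean-based criteria favor supplier A. Choosing among such criteria is itself a modeling decision.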
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heo, Yeonsook; Augenbroe, Godfried; Graziano, Diane
2015-05-01
The increasing interest in retrofitting of existing buildings is motivated by the need to make a major contribution to enhancing building energy efficiency and reducing energy consumption and CO2 emission by the built environment. This paper examines the relevance of calibration in model-based analysis to support decision-making for energy and carbon efficiency retrofits of individual buildings and portfolios of buildings. The authors formulate a set of real retrofit decision-making situations and evaluate the role of calibration by using a case study that compares predictions and decisions from an uncalibrated model with those of a calibrated model. The case study illustrates both the mechanics and outcomes of a practical alternative to the expert- and time-intense application of dynamic energy simulation models for large-scale retrofit decision-making under uncertainty.
Cost-effectiveness modelling in diagnostic imaging: a stepwise approach.
Sailer, Anna M; van Zwam, Wim H; Wildberger, Joachim E; Grutters, Janneke P C
2015-12-01
Diagnostic imaging (DI) is the fastest growing sector in medical expenditures and takes a central role in medical decision-making. The increasing number of various and new imaging technologies induces a growing demand for cost-effectiveness analysis (CEA) in imaging technology assessment. In this article we provide a comprehensive framework of direct and indirect effects that should be considered for CEA in DI, suitable for all imaging modalities. We describe and explain the methodology of decision analytic modelling in six steps aiming to transfer theory of CEA to clinical research by demonstrating key principles of CEA in a practical approach. We thereby provide radiologists with an introduction to the tools necessary to perform and interpret CEA as part of their research and clinical practice. • DI influences medical decision making, affecting both costs and health outcome. • This article provides a comprehensive framework for CEA in DI. • A six-step methodology for conducting and interpreting cost-effectiveness modelling is proposed.
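One elementary building block of such cost-effectiveness modelling is the incremental cost-effectiveness ratio (ICER). The sketch below compares two hypothetical imaging strategies; the costs, QALYs and willingness-to-pay threshold are invented and carry no clinical meaning.

```python
# Incremental cost-effectiveness ratio between two strategies:
# ICER = (C_new - C_old) / (E_new - E_old), e.g. cost per extra QALY.
# All inputs below are hypothetical illustration values.

def icer(cost_new, effect_new, cost_old, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

# strategy: (expected cost per patient, expected QALYs per patient)
ct_first  = (400.0, 8.10)
mri_first = (650.0, 8.15)

ratio = icer(*mri_first, *ct_first)
threshold = 20000.0   # assumed willingness to pay per QALY
print(ratio, "cost-effective" if ratio <= threshold else "not cost-effective")
```

Here the more expensive strategy buys its extra effectiveness at 5000 per QALY, below the assumed threshold; a full CEA would add uncertainty analysis around such point estimates.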
Kohli, R; Tan, J K; Piontek, F A; Ziege, D E; Groot, H
1999-08-01
Changes in health care delivery, reimbursement schemes, and organizational structure have required health organizations to manage the costs of providing patient care while maintaining high levels of clinical and patient satisfaction outcomes. Today, cost information, clinical outcomes, and patient satisfaction results must become more fully integrated if strategic competitiveness and benefits are to be realized in health management decision making, especially in multi-entity organizational settings. Unfortunately, traditional administrative and financial systems are not well equipped to cater to such information needs. This article presents a framework for the acquisition, generation, analysis, and reporting of cost information with clinical outcomes and patient satisfaction in the context of evolving health management and decision-support system technology. More specifically, the article focuses on an enhanced costing methodology for determining and producing improved, integrated cost-outcomes information. Implementation issues and areas for future research in cost-information management and decision-support domains are also discussed.
Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick
2013-01-01
Introduction Cost-effectiveness analyses (CEA) can provide useful information on how to invest limited funds; however, they are less useful if different analyses of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods A systematic review of studies was conducted with particular focus on study quality and the variability in inputs used in models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify the impact on predicted results of observed differences in model inputs taken from the studies identified. Results Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA and TST test strategies were compared using our common decision analysis model, predicted effectiveness largely overlapped. Implications Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412
Some Dimensions of Simulation.
ERIC Educational Resources Information Center
Beck, Isabel; Monroe, Bruce
Beginning with definitions of "simulation" (a methodology for testing alternative decisions under hypothetical conditions), this paper focuses on the use of simulation as an instructional method, pointing out the relationships and differences between role playing, games, and simulation. The term "simulation games" is explored with an analysis of…
Eye-gaze control of the computer interface: Discrimination of zoom intent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-10-01
An analysis methodology and associated experiment were developed to assess whether definable and repeatable signatures of eye-gaze characteristics are evident preceding a decision to zoom in, zoom out, or not to zoom at a computer interface. This user intent discrimination procedure can have broad application in disability aids and telerobotic control. Eye-gaze data were collected from 10 subjects in a controlled experiment requiring zoom decisions. The eye-gaze data were clustered, then fed into a multiple discriminant analysis (MDA) for optimal definition of heuristics separating the zoom-in, zoom-out, and no-zoom conditions. Confusion matrix analyses showed that a number of variable combinations classified at a statistically significant level, but practical significance was more difficult to establish. Composite contour plots demonstrated the regions in parameter space consistently assigned by the MDA to unique zoom conditions. Peak classification occurred at about 1200-1600 msec. Improvements in the methodology to achieve practical real-time zoom control are considered.
Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T H; Seppelt, Ralf
2010-12-01
This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of 'what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.
Methodological quality of economic evaluations of new pharmaceuticals in The Netherlands.
Hoomans, Ties; Severens, Johan L; van der Roer, Nicole; Delwel, Gepke O
2012-03-01
In the Netherlands, decisions about the reimbursement of new pharmaceuticals are based on cost effectiveness, as well as therapeutic value and budget impact. Since 1 January 2005, drug manufacturers are formally required to substantiate the cost effectiveness of drugs that have therapeutic added value in comparison with existing ones through pharmacoeconomic evaluations. Dutch guidelines for pharmacoeconomic research provide methods guidance, ensuring consistency in both the evidence and the decision-making process about drug reimbursement. This study reviewed the methodological quality of all 21 formally required pharmacoeconomic evaluations of new pharmaceuticals between 1 January 2005 and 1 October 2008, and verified whether these evaluations complied with pharmacoeconomic guidelines. Data on the quality of the pharmacoeconomic evaluations were extracted from the pharmacoeconomic reports published by the Dutch Health Care Insurance Board (CVZ). The Board's newsletters provided information on the advice to, and reimbursement decisions made by, the Dutch Minister of Health. All data extraction was carried out by two independent reviewers, and descriptive analyses were conducted. The methodological quality was sound in only 8 of the 21 pharmacoeconomic evaluations. In most cases, the perspective of analysis, the comparator drugs, and the reporting of both total and incremental costs and effects were correct. However, drug indication, form (i.e. cost utility/cost effectiveness) and time horizon of the evaluations were frequently flawed. Moreover, the costs and effects of the pharmaceuticals were not always analysed correctly, and modelling studies were often non-transparent. Twelve drugs were reimbursed, and nine were not. The compliance with pharmacoeconomic guidelines in economic evaluations of new pharmaceuticals can be improved. 
This would improve the methodological quality of the pharmacoeconomic evaluations and ensure consistency in the evidence and the decision-making process for drug reimbursement in the Netherlands.
Use of multicriteria decision analysis to address conservation conflicts.
Davies, A L; Bryce, R; Redpath, S M
2013-10-01
Conservation conflicts are increasing on a global scale and instruments for reconciling competing interests are urgently needed. Multicriteria decision analysis (MCDA) is a structured, decision-support process that can facilitate dialogue between groups with differing interests and incorporate human and environmental dimensions of conflict. MCDA is a structured and transparent method of breaking down complex problems and incorporating multiple objectives. The value of this process for addressing major challenges in conservation conflict management is that MCDA helps in setting realistic goals; entails a transparent decision-making process; and addresses mistrust, differing world views, cross-scale issues, patchy or contested information, and inflexible legislative tools. Overall we believe MCDA provides a valuable decision-support tool, particularly for increasing awareness of the effects of particular values and choices for working toward negotiated compromise, although an awareness of the effect of methodological choices and the limitations of the method is vital before applying it in conflict situations. © 2013 Society for Conservation Biology.
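The caution about the effect of methodological choices can be made concrete with a toy sketch: under a weighted-sum MCDA, the same two hypothetical management options swap rank when the criterion weights change. The options, criteria and scores are invented.

```python
# Toy weighted-sum MCDA showing rank reversal under different stakeholder
# weightings. All options, criteria and numbers are hypothetical.

options = {
    "predator protection": {"conservation": 0.9, "livelihoods": 0.3},
    "licensed control":    {"conservation": 0.4, "livelihoods": 0.8},
}

def rank(weights):
    score = lambda o: sum(weights[c] * options[o][c] for c in weights)
    return sorted(options, key=score, reverse=True)

print(rank({"conservation": 0.7, "livelihoods": 0.3}))
print(rank({"conservation": 0.3, "livelihoods": 0.7}))
```

The reversal illustrates why eliciting and openly negotiating weights is central to using MCDA in conflict situations rather than a technical afterthought.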
Patel, Vaishali N; Riley, Anne W
2007-10-01
A multiple case study was conducted to examine how staff in child out-of-home care programs used data from an Outcomes Management System (OMS) and other sources to inform decision-making. Data collection consisted of thirty-seven semi-structured interviews with clinicians, managers, and directors from two treatment foster care programs and two residential treatment centers, and individuals involved with developing the OMS; and observations of clinical and quality management meetings. Case study and grounded theory methodology guided analyses. The application of qualitative data analysis software is described. Results show that although staff rarely used data from the OMS, they did rely on other sources of systematically collected information to inform clinical, quality management, and program decisions. Analyses of how staff used these data suggest that improving the utility of OMS will involve encouraging staff to participate in data-based decision-making, and designing and implementing OMS in a manner that reflects how decision-making processes operate.
Optical diagnosis of cervical cancer by higher order spectra and boosting
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.
2017-03-01
In this contribution, we report the application of higher order statistical moments with decision tree and ensemble based learning methodology for the development of diagnostic algorithms for optical diagnosis of cancer. The classification results were compared with those obtained from independent feature extractors such as linear discriminant analysis (LDA). The methodology using higher order statistics as classifier inputs with boosting achieved higher specificity and sensitivity while being much faster than other time-frequency domain based methods.
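The abstract names higher-order statistical moments as the classifier inputs. A minimal illustrative sketch (not the authors' code) of extracting such moment features from a 1-D optical signal with NumPy:

```python
import numpy as np

def hos_features(x):
    """Return low- and higher-order statistical moments of a 1-D signal:
    mean, variance, skewness (3rd standardized moment), and kurtosis
    (4th standardized moment)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = x.std()
    z = (x - mu) / sigma
    return np.array([mu, sigma**2, (z**3).mean(), (z**4).mean()])

# Smoke test on synthetic data: a right-skewed sample has a clearly
# positive third moment, a near-Gaussian sample does not.
rng = np.random.default_rng(0)
sym = hos_features(rng.normal(size=10_000))        # roughly symmetric
skewed = hos_features(rng.exponential(size=10_000))  # right-skewed
```

In the reported pipeline these feature vectors would feed a boosted decision-tree ensemble; here they are only computed and sanity-checked on synthetic signals.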
Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology
NASA Technical Reports Server (NTRS)
Woods, Stephen
2009-01-01
This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazards assessment, the types of hazard assessment that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.
ERIC Educational Resources Information Center
Mason, Robert M.; And Others
This document presents a research effort intended to improve the economic information available for formulating policies and making decisions related to Information Analysis Centers (IACs) and IAC services. The project used a system of IAC information activities to analyze the functional aspects of IAC services, calculate the present value of net…
Using multi-criteria decision analysis to appraise orphan drugs: a systematic review.
Friedmann, Carlotta; Levy, Pierre; Hensel, Paul; Hiligsmann, Mickaël
2018-04-01
Multi-criteria decision analysis (MCDA) could potentially solve current methodological difficulties in the appraisal of orphan drugs. Areas covered: We provide an overview of the existing evidence regarding the use of MCDA in the appraisal of orphan drugs worldwide. Three databases (Pubmed, Embase, Web of Science) were searched for English, French and German literature published between January 2000 and April 2017. Full-text articles were supplemented with conference abstracts. A total of seven articles and six abstracts were identified. Expert commentary: The literature suggests that MCDA is increasingly being used in the context of appraising orphan drugs. It has shown itself to be a flexible approach with the potential to assist in decision-making regarding reimbursement for orphan drugs. However, further research regarding its application must be conducted.
A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.
Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe
2011-05-30
Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
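The Monte Carlo core of stochastic multicriteria acceptability analysis (SMAA) can be sketched in a few lines: sample criterion weights uniformly from the simplex and tally how often each alternative comes out best. The alternatives, scores, and all numbers below are invented for illustration and are not data from the cited study:

```python
import numpy as np

# Hypothetical per-criterion scores (rows: alternatives, columns: criteria,
# already scaled so that higher is better). Illustrative values only.
scores = np.array([
    [0.70, 0.50, 0.60],   # "drug A"
    [0.80, 0.40, 0.55],   # "drug B"
    [0.30, 0.90, 0.90],   # "placebo"
])

def smaa_first_rank_acceptability(scores, n_samples=20_000, seed=1):
    """Sample weight vectors uniformly from the simplex (gaps between
    sorted uniforms) and record how often each alternative attains the
    best weighted score: the first-rank acceptability index."""
    rng = np.random.default_rng(seed)
    u = np.sort(rng.random((n_samples, scores.shape[1] - 1)), axis=1)
    w = np.diff(np.concatenate([np.zeros((n_samples, 1)), u,
                                np.ones((n_samples, 1))], axis=1), axis=1)
    winners = (w @ scores.T).argmax(axis=1)
    return np.bincount(winners, minlength=len(scores)) / n_samples

acceptability = smaa_first_rank_acceptability(scores)
```

A high acceptability index means many plausible weightings favour that alternative; a full SMAA model additionally samples the criterion measurements themselves to capture clinical uncertainty.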
Bryan, Stirling; Williams, Iestyn; McIver, Shirley
2007-02-01
Resource scarcity is the raison d'être for the discipline of economics. Thus, the primary purpose of economic analysis is to help decision-makers when addressing problems arising due to the scarcity problem. The research reported here was concerned with how cost-effectiveness information is used by the National Institute for Health & Clinical Excellence (NICE) in national technology coverage decisions in the UK, and how its impact might be increased. The research followed a qualitative case study methodology with semi-structured interviews, supported by observation and analysis of secondary sources. Our research highlights that the technology appraisal function of NICE represents an important progression for the UK health economics community: new cost-effectiveness work is commissioned for each technology and that work directly informs national health policy. However, accountability in policy decisions necessitates that the information upon which decisions are based (including cost-effectiveness analysis, CEA) is accessible. This was found to be a serious problem and represents one of the main ongoing challenges. Other issues highlighted include perceived weaknesses in analysis methods and the poor alignment between the health maximisation objectives assumed in economic analyses and the range of other objectives facing decision-makers in reality. Copyright (c) 2006 John Wiley & Sons, Ltd.
Ritrovato, Matteo; Faggiano, Francesco C; Tedesco, Giorgia; Derrico, Pietro
2015-06-01
This article outlines the Decision-Oriented Health Technology Assessment: a new implementation of the European network for Health Technology Assessment Core Model, integrating multicriteria decision analysis by using the analytic hierarchy process to introduce a standardized methodological approach as a valued and shared tool to support health care decision making within a hospital. Following the Core Model as guidance (European network for Health Technology Assessment. HTA core model for medical and surgical interventions. Available from: http://www.eunethta.eu/outputs/hta-core-model-medical-and-surgical-interventions-10r. [Accessed May 27, 2014]), it is possible to apply the analytic hierarchy process to break down a problem into its constituent parts and identify priorities (i.e., assigning a weight to each part) in a hierarchical structure. Thus, it quantitatively compares the importance of multiple criteria in assessing health technologies and how the alternative technologies perform in satisfying these criteria. The verbal ratings are translated into a quantitative form by using the Saaty scale (Saaty TL. Decision making with the analytic hierarchy process. Int J Serv Sci 2008;1:83-98). An eigenvector analysis is used for deriving the weight systems (i.e., local and global weights) that reflect the importance assigned to the criteria and the priorities related to the performance of the alternative technologies. Compared with the Core Model, this methodological approach supplies more timely and contextualized evidence for a specific technology, making it possible to obtain data that are more relevant and easier to interpret, and therefore more useful for decision makers to make investment choices with greater awareness.
We reached the conclusion that although there may be scope for improvement, this implementation is a step forward toward the goal of building a "solid bridge" between the scientific evidence and the final decision maker's choice. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
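The AHP machinery the article relies on (Saaty-scale pairwise comparisons, principal-eigenvector weights, consistency checking) can be sketched briefly. The comparison matrix below is hypothetical, not taken from the assessment:

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparison matrix for three criteria:
# entry [i, j] states how much more important criterion i is than j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

def ahp_weights(A, iters=100):
    """Principal right eigenvector via power iteration, normalized to
    sum to 1: the AHP local weight vector."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

def consistency_ratio(A, w):
    """Saaty's consistency ratio CR = CI / RI; CR < 0.1 is the usual
    acceptability threshold. RI values are Saaty's random indices
    (tabulated here only for n = 3, 4, 5)."""
    n = A.shape[0]
    lam = ((A @ w) / w).mean()          # estimate of principal eigenvalue
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]
    return ci / ri

w = ahp_weights(A)
cr = consistency_ratio(A, w)
```

For this matrix the dominant criterion receives roughly 64% of the weight and the judgments pass the consistency check, so the weights could be used as the local weights of one node in the hierarchy.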
Discriminant forest classification method and system
Chen, Barry Y.; Hanley, William G.; Lemmond, Tracy D.; Hiller, Lawrence J.; Knapp, David A.; Mugge, Marshall J.
2012-11-06
A hybrid machine learning methodology and system for classification that combines classical random forest (RF) methodology with discriminant analysis (DA) techniques to provide enhanced classification capability. A DA technique which uses feature measurements of an object to predict its class membership, such as linear discriminant analysis (LDA) or Andersen-Bahadur linear discriminant technique (AB), is used to split the data at each node in each of its classification trees to train and grow the trees and the forest. When training is finished, a set of n DA-based decision trees of a discriminant forest is produced for use in predicting the classification of new samples of unknown class.
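A much-simplified sketch of the patent's central ingredient, replacing full DA-split trees with an ensemble of bootstrap-trained Fisher LDA stumps combined by majority vote; everything below is illustrative, not the patented implementation:

```python
import numpy as np

def fisher_stump(X, y):
    """Two-class Fisher discriminant: direction w ∝ Sw^-1 (mu1 - mu0),
    threshold at the midpoint of the projected class means."""
    X0, X1 = X[y == 0], X[y == 1]
    sw = np.cov(X0.T) + np.cov(X1.T)          # pooled within-class scatter
    w = np.linalg.solve(sw, X1.mean(0) - X0.mean(0))
    t = (X0.mean(0) @ w + X1.mean(0) @ w) / 2
    return w, t

def train_forest(X, y, n_trees=25, seed=0):
    """Bag DA-based learners: each stump is fit on a bootstrap resample,
    mimicking (at depth 1) the forest of DA-split trees."""
    rng = np.random.default_rng(seed)
    return [fisher_stump(*(lambda i: (X[i], y[i]))(rng.integers(0, len(X), len(X))))
            for _ in range(n_trees)]

def predict(stumps, X):
    """Majority vote over the ensemble."""
    votes = sum((X @ w > t).astype(int) for w, t in stumps)
    return (votes * 2 > len(stumps)).astype(int)

# Two well-separated Gaussian blobs as a smoke test.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.repeat([0, 1], 100)
acc = (predict(train_forest(X, y), X) == y).mean()
```

The patented method grows full trees with an LDA or Andersen-Bahadur split at every node; depth-1 stumps are used here only to keep the sketch short.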
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
Montserrat, A; Bosch, Ll; Kiser, M A; Poch, M; Corominas, Ll
2015-02-01
Using low-cost sensors, data can be collected on the occurrence and duration of overflows in each combined sewer overflow (CSO) structure in a combined sewer system (CSS). The collection and analysis of real data can be used to assess, improve, and maintain CSSs in order to reduce the number and impact of overflows. The objective of this study was to develop a methodology to evaluate the performance of CSSs using low-cost monitoring. This methodology includes (1) assessing the capacity of a CSS using overflow duration and rain volume data, (2) characterizing the performance of CSO structures with statistics, (3) evaluating the compliance of a CSS with government guidelines, and (4) generating decision tree models to provide support to managers for making decisions about system maintenance. The methodology is demonstrated with a case study of a CSS in La Garriga, Spain. The rain volume breaking point from which CSO structures started to overflow ranged from 0.6 mm to 2.8 mm. The structures with the best and worst performance in terms of overflow (overflow probability, order, duration and CSO ranking) were characterized. Most of the obtained decision trees to predict overflows from rain data had accuracies ranging from 70% to 83%. The results obtained from the proposed methodology can greatly support managers and engineers dealing with real-world problems, improvements, and maintenance of CSSs. Copyright © 2014 Elsevier B.V. All rights reserved.
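The "rain volume breaking point" from which a CSO structure starts to overflow is essentially the threshold of a depth-1 decision tree. A minimal sketch on invented observations (the numbers are not the La Garriga data):

```python
import numpy as np

# Illustrative (rain_mm, overflow) observations for one CSO structure.
rain = np.array([0.2, 0.5, 0.8, 1.1, 1.5, 2.0, 2.6, 3.4, 4.1, 5.0])
overflow = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1])

def best_breakpoint(rain, overflow):
    """Depth-1 decision tree: scan candidate thresholds (midpoints
    between sorted rain volumes) and keep the one with the fewest
    training misclassifications."""
    order = np.argsort(rain)
    r, y = rain[order], overflow[order]
    candidates = (r[:-1] + r[1:]) / 2
    errors = [np.sum((r > c) != y.astype(bool)) for c in candidates]
    return candidates[int(np.argmin(errors))]

threshold = best_breakpoint(rain, overflow)
```

On this toy series the fitted breakpoint falls between the last dry event (1.1 mm) and the first overflow (1.5 mm), analogous to the 0.6-2.8 mm breakpoints reported per structure.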
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention
Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-01-01
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928
External Dynamics Influencing Tattooing among College Students: A Qualitative Analysis
ERIC Educational Resources Information Center
Firmin, Michael; Tse, Luke; Foster, Janna; Angelini, Tammy
2012-01-01
The study utilized qualitative research methodology to assess external dynamics and their influences on tattooing practices among college students. Twenty-four undergraduates supplied in-depth interviews regarding the external variables related to college students' decisions to tattoo. The present research follows (Tse, Firmin, Angelini, &…
ERIC Educational Resources Information Center
Iivari, Juhani; Hirschheim, Rudy
1996-01-01
Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…
Campolina, Alessandro Gonçalves; Soárez, Patrícia Coelho De; Amaral, Fábio Vieira do; Abe, Jair Minoro
2017-10-26
Multi-criteria decision analysis (MCDA) is an emerging tool that allows the integration of relevant factors for health technology assessment (HTA). This study aims to present a summary of the methodological characteristics of MCDA: definitions, approaches, applications, and implementation stages. A case study was conducted in the São Paulo State Cancer Institute (ICESP) in order to understand the perspectives of decision-makers in the process of drafting a recommendation for the incorporation of technology in the Brazilian Unified National Health System (SUS), through a report by the Brazilian National Commission for the Incorporation of Technologies in the SUS (CONITEC). Paraconsistent annotated evidential logic Eτ was the methodological approach adopted in the study, since it can serve as an underlying logic for constructs capable of synthesizing objective information (from the scientific literature) and subjective information (from experts' values and preferences in the area of knowledge). It also allows the incorporation of conflicting information (contradictions), as well as vague and even incomplete information in the valuation process, resulting from imperfection of the available scientific evidence. The method has the advantages of allowing explicit consideration of the criteria that influenced the decision, facilitating follow-up and visualization of process stages, allowing assessment of the contribution of each criterion separately, and in aggregate, to the decision's outcome, facilitating the discussion of diverging perspectives by different stakeholder groups, and increasing the understanding of the resulting recommendations. The use of an explicit MCDA approach should facilitate conflict mediation and optimize participation by different stakeholder groups.
Hermans, C.; Erickson, J.; Noordewier, T.; Sheldon, A.; Kline, M.
2007-01-01
Multicriteria decision analysis (MCDA) provides a well-established family of decision tools to aid stakeholder groups in arriving at collective decisions. MCDA can also function as a framework for the social learning process, serving as an educational aid in decision problems characterized by a high level of public participation. In this paper, the framework and results of a structured decision process using the outranking MCDA methodology preference ranking organization method of enrichment evaluation (PROMETHEE) are presented. PROMETHEE is used to frame multi-stakeholder discussions of river management alternatives for the Upper White River of Central Vermont, in the northeastern United States. Stakeholders met over 10 months to create a shared vision of an ideal river and its services to communities, develop a list of criteria by which to evaluate river management alternatives, and elicit preferences to rank and compare individual and group preferences. The MCDA procedure helped to frame a group process that made stakeholder preferences explicit and substantive discussions about long-term river management possible. © 2006 Elsevier Ltd. All rights reserved.
Hermans, Caroline; Erickson, Jon; Noordewier, Tom; Sheldon, Amy; Kline, Mike
2007-09-01
Multicriteria decision analysis (MCDA) provides a well-established family of decision tools to aid stakeholder groups in arriving at collective decisions. MCDA can also function as a framework for the social learning process, serving as an educational aid in decision problems characterized by a high level of public participation. In this paper, the framework and results of a structured decision process using the outranking MCDA methodology preference ranking organization method of enrichment evaluation (PROMETHEE) are presented. PROMETHEE is used to frame multi-stakeholder discussions of river management alternatives for the Upper White River of Central Vermont, in the northeastern United States. Stakeholders met over 10 months to create a shared vision of an ideal river and its services to communities, develop a list of criteria by which to evaluate river management alternatives, and elicit preferences to rank and compare individual and group preferences. The MCDA procedure helped to frame a group process that made stakeholder preferences explicit and substantive discussions about long-term river management possible.
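The PROMETHEE outranking calculation behind such a process is compact: pairwise preference degrees are aggregated into positive, negative, and net flows. The alternatives, scores, and weights below are invented for illustration, not the Upper White River data:

```python
import numpy as np

# Hypothetical evaluation table: rows are river-management alternatives,
# columns are stakeholder criteria (higher is better).
evals = np.array([
    [7.0, 4.0, 8.0],   # e.g. "restore floodplain"
    [5.0, 6.0, 6.0],   # e.g. "armor banks"
    [3.0, 8.0, 2.0],   # e.g. "no action"
])
weights = np.array([1/3, 1/3, 1/3])

def promethee_net_flows(evals, weights):
    """PROMETHEE with the 'usual' preference function P(d) = 1 if d > 0
    else 0: build the weighted preference index pi[i, j], then derive
    positive, negative, and net outranking flows."""
    n = len(evals)
    pi = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d = evals[i] - evals[j]
            pi[i, j] = weights @ (d > 0).astype(float)
    phi_plus = pi.sum(axis=1) / (n - 1)    # how strongly i outranks others
    phi_minus = pi.sum(axis=0) / (n - 1)   # how strongly i is outranked
    return phi_plus - phi_minus

net = promethee_net_flows(evals, weights)
ranking = np.argsort(-net)                 # best alternative first
```

Real applications replace the step preference function with per-criterion functions (linear, Gaussian, with indifference thresholds) elicited from stakeholders, which is where the group process enters.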
NASA Astrophysics Data System (ADS)
Neuville, R.; Pouliot, J.; Poux, F.; Hallot, P.; De Rudder, L.; Billen, R.
2017-10-01
This paper deals with the establishment of a comprehensive methodological framework that defines 3D visualisation rules and its application in a decision support tool. Whilst the use of 3D models grows in many application fields, their visualisation remains challenging with respect to the mapping and rendering choices needed to suitably support the decision-making process. Indeed, a great number of 3D visualisation techniques exist but, as far as we know, a decision support tool that facilitates the production of an efficient 3D visualisation is still missing. This is why a comprehensive methodological framework is proposed in order to build decision tables for specific data, tasks and contexts. Based on the second-order logic formalism, we define a set of functions and propositions among and between two collections of entities: on one hand static retinal variables (hue, size, shape…) and 3D environment parameters (directional lighting, shadow, haze…) and on the other hand their effect(s) regarding specific visual tasks. This makes it possible to define 3D visualisation rules according to four categories: consequence, compatibility, potential incompatibility and incompatibility. In this paper, the application of the methodological framework is demonstrated for an urban visualisation at high density considering a specific set of entities. On the basis of our analysis and the results of many studies conducted in 3D semiotics, which refers to the study of symbols and how they relay information, the truth values of propositions are determined. 3D visualisation rules are then extracted for the considered context and set of entities and are presented in a decision table with a colour coding. Finally, the decision table is implemented in a plugin developed with three.js, a cross-browser JavaScript library.
The plugin consists of a sidebar and warning windows that help the designer in the use of a set of static retinal variables and 3D environment parameters.
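A decision table of this kind reduces, at its core, to a lookup from (retinal variable, environment parameter) pairs to one of the four rule categories. A toy sketch with invented entities and verdicts, purely to show the structure:

```python
# Toy decision table in the spirit of the paper's four rule categories;
# the pairs and their categories are invented for illustration.
RULES = {
    ("hue", "directional_lighting"): "compatibility",
    ("size", "haze"): "potential incompatibility",
    ("shape", "shadow"): "incompatibility",
    ("hue", "shadow"): "consequence",
}

def check(variable, parameter):
    """Return the rule category for a (static retinal variable,
    3D environment parameter) pair; pairs absent from the table are
    treated as unconstrained, i.e. 'compatibility'."""
    return RULES.get((variable, parameter), "compatibility")
```

In the plugin, such lookups would drive the warning windows shown to the designer when an incompatible combination is selected.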
Decision analysis in formulary decision making.
Schechter, C B
1993-06-01
Although decision making about what drugs to include in an institutional formulary appears to lend itself readily to quantitative techniques such as decision analysis and cost-benefit analysis, a review of the literature reveals that very little has been published in this area. Several of the published decision analyses use non-standard techniques that are, at best, of unproved validity, and may seriously distort the underlying issues through covert under-counting or double-counting of various drug attributes. Well executed decision analyses have contributed to establishing that drug acquisition costs are not an adequate measure of the total economic impact of formulary decisions and that costs of labour and materials associated with drug administration must be calculated on an institution-specific basis to reflect unique staffing patterns, bulk purchasing practices, and the availability of surplus capacity within the institution which might be mobilised at little marginal cost. Clinical studies of newly introduced drugs frequently fail to answer the questions that weigh most heavily on the structuring of a formal assessment of a proposed formulary acquisition. Studies comparing a full spectrum of therapeutically equivalent drugs are rarely done, and individual studies of particular pairs of drugs can rarely be used together because of differences in methodology or patient populations studied. Gathering of institution-specific economic and clinical data is a daunting, labour-intensive task. In many institutions, incentive and reward structures discourage behaviour that takes the broad institutional perspective that is intrinsic to a good decision analysis.(ABSTRACT TRUNCATED AT 250 WORDS)
Navigating the grounded theory terrain. Part 1.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.
Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos
2015-08-01
Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology, respectively. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard, DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Category (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such update is not necessary.
A multi-criteria decision aid methodology to design electric vehicles public charging networks
NASA Astrophysics Data System (ADS)
Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz
2015-05-01
This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time and policy based scenarios, considering offer and demand and the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's already known characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as inspiration and base to create this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
Development of an evidence-based decision pathway for vestibular schwannoma treatment options.
Linkov, Faina; Valappil, Benita; McAfee, Jacob; Goughnour, Sharon L; Hildrew, Douglas M; McCall, Andrew A; Linkov, Igor; Hirsch, Barry; Snyderman, Carl
To integrate multiple sources of clinical information with patient feedback to build an evidence-based decision support model to facilitate treatment selection for patients suffering from vestibular schwannomas (VS). This was a mixed methods study utilizing focus group and survey methodology to solicit feedback on factors important for making treatment decisions among patients. Two 90-minute focus groups were conducted by an experienced facilitator. Previously diagnosed VS patients were recruited by clinical investigators at the University of Pittsburgh Medical Center (UPMC). Classical content analysis was used for focus group data analysis. Providers were recruited from practices within the UPMC system and were surveyed using Delphi methods. This information can provide a basis for a multi-criteria decision analysis (MCDA) framework to develop a treatment decision support system for patients with VS. Eight themes were derived from these data (focus group + surveys): doctor/health care system, side effects, effectiveness of treatment, anxiety, mortality, family/other people, quality of life, and post-operative symptoms. These data, as well as feedback from physicians, were utilized in building a multi-criteria decision model. The study illustrated steps involved in the development of a decision support model that integrates evidence-based data and patient values to select treatment alternatives. Studies focusing on the actual development of the decision support technology for this group of patients are needed, as decisions are highly multifactorial. Such tools have the potential to improve decision making for complex medical problems with alternate treatment pathways. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wanderer, Thomas, E-mail: thomas.wanderer@dlr.de; Herle, Stefan, E-mail: stefan.herle@rwth-aachen.de
2015-04-15
By their spatially very distributed nature, profitability and impacts of renewable energy resources are highly correlated with the geographic locations of power plant deployments. A web-based Spatial Decision Support System (SDSS) based on a Multi-Criteria Decision Analysis (MCDA) approach has been implemented for identifying preferable locations for solar power plants based on user preferences. The designated areas found serve for the input scenario development for a subsequent integrated Environmental Impact Assessment. The capabilities of the SDSS service get showcased for Concentrated Solar Power (CSP) plants in the region of Andalusia, Spain. The resulting spatial patterns of possible power plant sites are an important input to the procedural chain of assessing impacts of renewable energies in an integrated effort. The applied methodology and the implemented SDSS are applicable for other renewable technologies as well. - Highlights: • The proposed tool facilitates well-founded CSP plant siting decisions. • Spatial MCDA methods are implemented in a WebGIS environment. • GIS-based SDSS can contribute to a modern integrated impact assessment workflow. • The conducted case study proves the suitability of the methodology.
Environmental Education in Action: A Discursive Approach to Curriculum Design
ERIC Educational Resources Information Center
Reis, Giuliano; Roth, Wolff-Michael
2007-01-01
Why do the designers of environmental education do what they do towards the environment through education? More importantly, how do they account for their design decisions (plans and actions)? Using the theoretical and methodological framework of discourse analysis, we analyse environmental education designers' discourse in terms of the discursive…
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
Error and Uncertainty Analysis for Ecological Modeling and Simulation
2001-12-01
management (LRAM) accounting for environmental, training, and economic factors. In the ELVS methodology, soil erosion status is used as a quantitative... Monte-Carlo approach. The optimization is realized through economic functions or on decision constraints, such as unit sample cost, number of samples... nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software) Gertner, G., G
Baines, Janis; Cunningham, Judy; Leemhuis, Christel; Hambridge, Tracy; Mackerras, Dorothy
2011-01-01
The approach used by food regulation agencies to examine the literature and forecast the impact of possible food regulations has many features in common with the approach used in nutritional epidemiological research. We outline the Risk Analysis Framework described by FAO/WHO, in which there is formal progression from identification of the nutrient or food chemical of interest, through to describing its effect on health and then assessing whether there is a risk to the population based on dietary exposure estimates. We then discuss some important considerations for the dietary modeling component of the Framework, including several methodological issues that also exist in nutritional epidemiology research. Finally, we give several case studies that illustrate how the different methodological components are used together to inform decisions about how to manage the regulatory problem. PMID:22254081
Toward a methodology for moral decision making in medicine.
Kushner, T; Belliotti, R A; Buckner, D
1991-12-01
The failure of medical codes to provide adequate guidance for physicians' moral dilemmas points to the fact that some rules of analysis, informed by moral theory, are needed to assist in resolving perplexing ethical problems occurring with increasing frequency as medical technology advances. Initially, deontological and teleological theories appear more helpful, but criticisms can be lodged against both, and neither proves to be sufficient in itself. This paper suggests that to elude the limitations of previous approaches, a method of moral decision making must be developed incorporating both coherence methodology and some independently supported theoretical foundations. Wide Reflective Equilibrium is offered, and its process described along with a theory of the person which is used to animate the process. Steps are outlined to be used in the process, leading to the application of the method to an actual case.
Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...
2017-08-23
A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided from the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5 and 1x10^-5.
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Current decision making involves numerous possible combinations of technology elements, safety and health issues, operational aspects, and process considerations to satisfy program goals. Identifying potential risk considerations as part of the management decision-making process provides additional tools to make more informed management decisions. Adapting and using risk assessment methodologies can generate new perspectives on various risk and safety concerns that are not immediately apparent. Safety and operational risks can be identified, and final decisions can balance these considerations with cost and schedule risks. Additional assessments can also show likelihood of event occurrence and event consequence to provide a more informed basis for decision making, as well as cost-effective mitigation strategies. Methodologies available to perform risk assessments range from qualitative identification of risk potential to detailed assessments where quantitative probabilities are calculated. The methodology used should be based on factors that include: 1) type of industry and industry standards, 2) tasks, tools, and environment, 3) type and availability of data, and 4) industry views and requirements regarding risk and reliability. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate, or eliminate costly mistakes or catastrophic failures.
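The qualitative end of the range described above is often a likelihood-consequence matrix. The sketch below is a generic illustration of that idea; the category scales, scoring, and rank thresholds are assumptions for demonstration, not taken from the abstract.

```python
# Hypothetical qualitative risk ranking: likelihood x consequence matrix.
LIKELIHOOD = {"remote": 1, "unlikely": 2, "possible": 3, "likely": 4}
CONSEQUENCE = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def risk_rank(likelihood, consequence):
    """Map a qualitative pair to a rank via an assumed score threshold."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

rank = risk_rank("possible", "catastrophic")  # score 3 * 4 = 12
```

Quantitative assessments would replace the ordinal categories with estimated probabilities and modeled consequences, but the ranking structure is the same.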
Goulart Coelho, Lineker M; Lange, Liséte C; Coelho, Hosmanny Mg
2017-01-01
Solid waste management is a complex domain involving the interaction of several dimensions; thus, its analysis and control impose continuous challenges for decision makers. In this context, multi-criteria decision-making models have become important and convenient supporting tools for solid waste management because they can handle problems involving multiple dimensions and conflicting criteria. However, selecting the multi-criteria decision-making method is a hard task, since there are several multi-criteria decision-making approaches, each with a large number of variants whose applicability depends on information availability and the aim of the study. Therefore, to support researchers and decision makers, the objectives of this article are to present a literature review of multi-criteria decision-making applications used in solid waste management, offer a critical assessment of current practices, and provide suggestions for future work. A brief review of fundamental concepts on this topic is first provided, followed by the analysis of 260 articles related to the application of multi-criteria decision making in solid waste management. These studies were investigated in terms of the methodology, including specific steps such as normalisation, weighting, and sensitivity analysis. In addition, information related to waste type, study objective, and aspects considered was recorded. From the articles analysed, it is noted that studies using multi-criteria decision making in solid waste management predominantly address problems related to municipal solid waste involving facility location or management strategy.
Lifecycle analysis for automobiles: Uses and limitations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaines, L.; Stodolsky, F.
There has been a recent trend toward the use of lifecycle analysis (LCA) as a decision-making tool for the automotive industry. However, the different practitioners' methods and assumptions vary widely, as do the interpretations put on the results. The lack of uniformity has been addressed by such groups as the Society of Environmental Toxicology and Chemistry (SETAC) and the International Organization for Standardization (ISO), but standardization of methodology assures neither meaningful results nor appropriate use of the results. This paper examines the types of analysis that are possible for automobiles, explains possible pitfalls to be avoided, and suggests ways that LCA can be used as part of a rational decision-making procedure. The key to performing a useful analysis is identification of the factors that will actually be used in making the decision. It makes no sense to analyze system energy use in detail if direct financial cost is to be the decision criterion. Criteria may depend on who is making the decision (consumer, producer, regulator). LCA can be used to track system performance for a variety of criteria, including emissions, energy use, and monetary costs, and these can have spatial and temporal distributions. Because optimization of one parameter is likely to worsen another, identification of trade-offs is an important function of LCA.
Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.
Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor
2011-09-01
Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology for integrating available technical information and stakeholder values to support decisions in many fields, and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there has been significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of several papers in which multiple methods were applied in parallel to the same problem indicates that the recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.
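As a concrete illustration of one of the MCDA methods named in the review, the sketch below derives analytic hierarchy process (AHP) criterion weights from a pairwise-comparison matrix using the common row geometric-mean approximation of the priority vector. The criteria and Saaty-scale judgments are invented.

```python
# AHP priority weights via the row geometric-mean approximation.
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector from a reciprocal pairwise matrix."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Invented judgments for three criteria: cost, environment, social.
# matrix[i][j] = relative importance of criterion i over criterion j.
pairwise = [
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
]
w = ahp_weights(pairwise)  # weights sum to 1, ordered cost > environment > social
```

Full AHP implementations use the principal eigenvector and a consistency-ratio check; the geometric-mean form is a standard close approximation for small matrices.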
Spatial planning using probabilistic flood maps
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano
2015-04-01
Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore both desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriately considered when making decisions based on uncertain information.
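A minimal sketch of the Prospect-theory evaluation described above, using the standard Tversky-Kahneman functional forms and parameter values (alpha = beta = 0.88, lambda = 2.25, gamma = 0.61). The flood-plain outcomes and the 5% flood probability are invented numbers, not results from the paper.

```python
# Prospect-theory valuation of a hypothetical floodplain development decision.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value function: concave for gains, steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: overweights small probabilities."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def prospect_value(outcomes):
    """outcomes: list of (monetary outcome, probability) pairs."""
    return sum(weight(p) * value(x) for x, p in outcomes)

# Build on the floodplain: gain if no flood, loss if flooded
# (flood probability taken from a probabilistic flood map, here assumed 5%).
build = prospect_value([(100_000, 0.95), (-80_000, 0.05)])
# Do not build: modelled as no change from the reference point.
no_build = prospect_value([(0, 1.0)])
prefer_build = build > no_build
```

The decision flips as the mapped flood probability or the loss magnitude grows, which is the trade-off the probabilistic map makes visible.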
2010-01-01
Background Decision support in health systems is a highly difficult task, due to the inherent complexity of the processes and structures involved. Method This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) benchmarking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases comparisons with classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by the Intraclass Correlation Coefficient in case 1 and kappa in both cases. Results EbCA is a new methodology composed of 6 steps: 1) data collection and preparation; 2) acquisition of Prior Expert Knowledge (PEK) and design of the Prior Knowledge Base (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledge (IK) might be elicited); 5) incorporation of elicited IK into the PKB, repeating until a satisfactory solution is reached; 6) post-processing of results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess the technical efficiency of small mental health areas and the case-mix of schizophrenia based on functional dependency. Differences from results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA, and had major implications for decision making in both cases.
Discussion This paper presents EbCA and shows the convenience of complementing classical data analysis with PEK as a means to extract relevant knowledge in complex health domains. One of the major benefits of EbCA is the iterative elicitation of IK. Both explicit and tacit or implicit expert knowledge are critical to guide the scientific analysis of very complex decision problems such as those found in health systems research. PMID:20920289
Gibert, Karina; García-Alonso, Carlos; Salvador-Carulla, Luis
2010-09-30
Assessment of New Approaches in Geothermal Exploration Decision Making: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akar, S.; Young, K. R.
Geothermal exploration projects carry a significant amount of risk associated with uncertainties encountered in the discovery of the geothermal resource. Understanding when and how to proceed in an exploration program, and when to walk away from a site, are two of the largest challenges for increased geothermal deployment. Current methodologies for exploration decision making are largely left to subjective expert opinion, which can be incorrectly biased by expertise (e.g. geochemistry, geophysics), geographic location of focus, and the assumed conceptual model. The aim of this project is to develop a methodology for more objective geothermal exploration decision making at a given location, including go/no-go decision points to help developers and investors decide when to give up on a location. In this scope, two different approaches are investigated: 1) value of information analysis (VOIA), which is used for evaluating and quantifying the value of data before they are purchased, and 2) enthalpy-based exploration targeting based on reservoir size, temperature gradient estimates, and internal rate of return (IRR). The first approach, VOIA, aims to identify the value of particular data when making decisions with an uncertain outcome. This approach targets the pre-drilling phase of exploration. The estimated VOIs are highly affected by the size of the project and still involve a high degree of subjectivity in the assignment of probabilities. The second approach, exploration targeting, is focused on decision making during the drilling phase. It starts with a basic geothermal project definition that includes target and minimum required production capacity and initial budgeting for exploration phases. It then uses average temperature gradient, reservoir temperature estimates, and production capacity to define targets and go/no-go limits. The decision analysis in this approach is based on achieving a minimum IRR at each phase of the project.
The second approach was determined to be the less subjective of the two, since it requires fewer subjective input values.
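A hedged sketch of the IRR-based go/no-go check at the heart of the second approach: continue to the next exploration phase only if the projected IRR still clears a hurdle rate. The cash flows, the 15% hurdle, and the bisection root-finder are invented for illustration.

```python
# Go/no-go on a hypothetical geothermal project via IRR against a hurdle rate.

def npv(rate, cashflows):
    """Net present value of yearly cash flows, index 0 = year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.9, hi=10.0, tol=1e-9):
    """Bisection on NPV; assumes a single sign change in the cash flows."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Exploration and drilling capex up front, then net production revenue ($).
flows = [-30e6, -10e6, 8e6, 9e6, 9e6, 9e6, 9e6, 9e6, 9e6, 9e6]
project_irr = irr(flows)
go = project_irr >= 0.15  # hurdle rate: proceed only above 15% IRR
```

With these invented numbers the IRR lands near 12%, below the hurdle, so the check would say walk away; that threshold comparison is the go/no-go limit the abstract describes.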
Hybrid analysis for indicating patients with breast cancer using temperature time series.
Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura
2016-07-01
Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase the chances of cure. The temperature of cancerous tissue is generally higher than that of the healthy surrounding tissue, making thermography an option to be considered in screening strategies for this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to indicate patients at risk of breast cancer, combining unsupervised and supervised machine learning techniques (hence the label hybrid). Dynamic infrared thermography monitors, or quantitatively measures, temperature changes on the examined surface after a thermal stress. During a dynamic infrared thermography examination, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied to these series using various values of k. The clustering formed by the k-means algorithm for each value of k is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision trees were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy.
Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an average accuracy of 95.38% was obtained. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
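A minimal sketch of one stage of the pipeline above: k-means over short temperature time series, with inertia (within-cluster sum of squares) as a simple clustering-quality value of the kind used as a feature. The series values are invented three-point toy curves; the paper works on full thermogram sequences and uses several validity indices.

```python
# Toy k-means over temperature time series, with inertia as a quality measure.
import random

def dist2(a, b):
    """Squared Euclidean distance between two equal-length series."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(series, k, iters=50, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(series, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in series:
            clusters[min(range(k), key=lambda i: dist2(s, centroids[i]))].append(s)
        centroids = [
            [sum(vals) / len(c) for vals in zip(*c)] if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    inertia = sum(
        min(dist2(s, c) for c in centroids) for s in series
    )
    return centroids, inertia

# Two obvious thermal behaviours: fast rewarming vs near-flat series.
series = [[30.1, 31.0, 32.2], [30.0, 31.2, 32.0],
          [30.2, 30.3, 30.4], [30.1, 30.2, 30.3]]
_, inertia_k2 = kmeans(series, k=2)
_, inertia_k1 = kmeans(series, k=1)
```

Running k-means for several values of k and recording such validity values per patient is what turns the unsupervised step into features for the supervised classifiers.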
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
1991-09-01
Baldwin, Robert H. and Thomas V. Daula. "Army Recruit Attrition and Force Manning Costs: Methodology and Analysis," Research in Labor Economics, ed. Ronald... "Methodological Issues and a Proposed Specification," Research in Labor Economics, ed. Ronald Ehrenberg, 7:339-363 (1985b). Baldwin, Robert H. et al. "Military... Manpower Research: An Introduction," Research in Labor Economics, ed. Ronald Ehrenberg, L:257-260 (1985). Dalton, Dan R. et al. "Turnover Overstated: The
Measurement, methods, and divergent patterns: Reassessing the effects of same-sex parents.
Cheng, Simon; Powell, Brian
2015-07-01
Scholars have noted that survey analysis of small subsamples (for example, same-sex parent families) is sensitive to researchers' analytical decisions, and even small differences in coding can profoundly shape empirical patterns. As an illustration, we reassess the findings of a recent article by Regnerus regarding the implications of being raised by gay and lesbian parents. Taking a close look at the New Family Structures Study (NFSS), we demonstrate the potential for misclassifying a non-negligible number of respondents as having been raised by parents who had a same-sex romantic relationship. We assess the implications of these possible misclassifications, along with other methodological considerations, by reanalyzing the NFSS in seven steps. The reanalysis offers evidence that the empirical patterns showcased in the original Regnerus article are fragile: so fragile that they appear largely a function of these possible misclassifications and other methodological choices. Our replication and reanalysis of Regnerus's study offer a cautionary illustration of the importance of double-checking and critically assessing the implications of measurement and other methodological decisions in our and others' research. Copyright © 2015 Elsevier Inc. All rights reserved.
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
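A hedged sketch of the two-part estimate described above: a regression-style base accident frequency from route-dependent variables, scaled by a fuzzy adjustment for a route-independent variable. All coefficients, variables, and membership functions below are invented for illustration; the paper fits its coefficients to the DPS accident database.

```python
# Hypothetical HazMat route accident-frequency estimate:
# log-linear base rate (negative-binomial-regression style mean) x fuzzy factor.
import math

def base_frequency(length_km, traffic_aadt, beta0=-8.0, b_len=0.9, b_tr=0.5):
    """Expected accidents/year from an assumed log-linear mean function."""
    return math.exp(beta0 + b_len * math.log(length_km) + b_tr * math.log(traffic_aadt))

def fuzzy_weather_factor(rain_days):
    """Two memberships ('dry'/'wet'), defuzzified to a rate multiplier."""
    wet = min(1.0, max(0.0, (rain_days - 30) / 90))
    dry = 1.0 - wet
    # dry routes keep the base rate; fully wet routes are scaled up to 1.8x
    return (dry * 1.0 + wet * 1.8) / (dry + wet)

route_freq = base_frequency(120, 20_000) * fuzzy_weather_factor(rain_days=75)
```

The split mirrors the abstract's design: variables with enough accident data enter the regression, while sparse route-independent effects are encoded as expert-defined fuzzy rules.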
Medical technology as a key driver of rising health expenditure: disentangling the relationship
Sorenson, Corinna; Drummond, Michael; Bhuiyan Khan, Beena
2013-01-01
Health care spending has risen steadily in most countries, becoming a concern for decision-makers worldwide. Commentators often point to new medical technology as the key driver for burgeoning expenditures. This paper critically appraises this conjecture, based on an analysis of the existing literature, with the aim of offering a more detailed and considered analysis of this relationship. Several databases were searched to identify relevant literature. Various categories of studies (e.g., multivariate and cost-effectiveness analyses) were included to cover different perspectives, methodological approaches, and issues regarding the link between medical technology and costs. Selected articles were reviewed and relevant information was extracted into a standardized template and analyzed for key cross-cutting themes, i.e., impact of technology on costs, factors influencing this relationship, and methodological challenges in measuring such linkages. A total of 86 studies were reviewed. The analysis suggests that the relationship between medical technology and spending is complex and often conflicting. Findings were frequently contingent on varying factors, such as the availability of other interventions, patient population, and the methodological approach employed. Moreover, the impact of technology on costs differed across technologies, in that some (e.g., cancer drugs, invasive medical devices) had significant financial implications, while others were cost-neutral or cost-saving. In light of these issues, we argue that decision-makers and other commentators should extend their focus beyond costs solely to include consideration of whether medical technology results in better value in health care and broader socioeconomic benefits. PMID:23807855
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
Caughlan, L.
2002-01-01
Natural resource management decisions are complicated by multiple property rights, management objectives, and stakeholders with varying degrees of influence over the decision-making process. In order to make efficient decisions, managers must incorporate the opinions and values of the involved stakeholders as well as understand the complex institutional constraints and opportunities that influence the decision-making process. Often this type of information is not understood until after a decision has been made, which can result in wasted time and effort. The purpose of my dissertation was to show how institutional frameworks and stakeholder involvement influence the various phases of the resource management decision-making process in a public choice framework. The intent was to assist decision makers and stakeholders by developing a methodology for formally incorporating stakeholders' objectives and influence into the resource management planning process and to predict the potential success of rent-seeking activity based on stakeholder preferences and level of influence. Concepts from decision analysis, institutional analysis, and public choice economics were used in designing this interdisciplinary framework. The framework was then applied to an actual case study concerning elk and bison management on the National Elk Refuge and Grand Teton National Park near Jackson, Wyoming. The framework allowed for the prediction of the level of support and conflict for all relevant policy decisions, and the identification of each stakeholder's level of support or opposition for each management decision.
Transmission Planning Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-06-23
Developed to solve a specific problem: assisting transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects six countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge, without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. The Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses, and QV analyses across many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in an effective decision-making process or in the off-line study environment.
[Parameter of evidence-based medicine in health care economics].
Wasem, J; Siebert, U
1999-08-01
In view of the scarcity of resources, economic evaluations in health care, in which not only the effects but also the costs of a medical intervention are examined and an incremental cost-outcome ratio is built, are an important supplement to the program of evidence-based medicine. Outcomes of a medical intervention can be measured by clinical effectiveness, quality-adjusted life years, and monetary valuation of benefits. As far as costs are concerned, direct medical costs, direct non-medical costs, and indirect costs have to be considered in an economic evaluation. Data can be drawn from primary studies or secondary analyses; meta-analysis may be appropriate for synthesizing data. For the calculation of incremental cost-benefit ratios, decision-analytic models (decision tree models, Markov models) are often necessary. Methodological and ethical limits to applying the results of economic evaluations in resource allocation decisions in health care have to be considered: economic evaluations and the calculation of cost-outcome ratios should only support decision making; they cannot replace it.
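The incremental cost-outcome ratio described in this abstract is straightforward to compute; here is a minimal sketch, with invented cost and QALY figures for two hypothetical interventions:

```python
# Illustrative sketch (not from the paper): an incremental
# cost-effectiveness ratio (ICER) comparing a new intervention
# against standard care. All numbers are hypothetical.

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost divided by incremental effect,
    e.g. cost per additional quality-adjusted life year (QALY)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: new therapy costs 12000 and yields 6.0 QALYs;
# standard care costs 8000 and yields 5.5 QALYs.
print(icer(12000, 6.0, 8000, 5.5))
```

A ratio below the decision maker's willingness-to-pay threshold would, in this framework, support funding the new intervention.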
Mixture-based gatekeeping procedures in adaptive clinical trials.
Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji
2018-01-01
Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
Cristy Watkins; Lynne M. Westphal
2015-01-01
In this paper, we describe our application of Ostrom et al.'s ADICO syntax, a grammatical tool based in the Institutional Analysis and Development framework, to a study of ecological restoration decision making in the Chicago Wilderness region. As this method has only been used to look at written policy and/or extractive natural resource management systems, our...
Coastal zone management with stochastic multi-criteria analysis.
Félix, A; Baquerizo, A; Santiago, J M; Losada, M A
2012-12-15
The methodology for coastal management proposed in this study takes into account the physical processes of the coastal system and the stochastic nature of forcing agents. Simulation techniques are used to assess the uncertainty in the performance of a set of predefined management strategies based on different criteria representing the main concerns of interest groups. This statistical information as well as the distribution function that characterizes the uncertainty regarding the preferences of the decision makers is fed into a stochastic multi-criteria acceptability analysis that provides the probability of alternatives obtaining certain ranks and also calculates the preferences of a typical decision maker who supports an alternative. This methodology was applied as a management solution for Playa Granada in the Guadalfeo River Delta (Granada, Spain), where the construction of a dam in the river basin is causing severe erosion. The analysis of shoreline evolution took into account the coupled action of atmosphere, ocean, and land agents and their intrinsic stochastic character. This study considered five different management strategies. The criteria selected for the analysis were the economic benefits for three interest groups: (i) indirect beneficiaries of tourist activities; (ii) beach homeowners; and (iii) the administration. The strategies were ranked according to their effectiveness, and the relative importance given to each criterion was obtained. Copyright © 2012 Elsevier Ltd. All rights reserved.
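The rank-acceptability idea behind stochastic multi-criteria acceptability analysis can be sketched in a few lines. The alternatives, criterion scores, and uniform weight distribution below are illustrative assumptions, not the study's actual model:

```python
import random

# Hedged SMAA sketch: with uncertain criterion weights, estimate how
# often each management alternative attains rank 1. Alternatives and
# per-criterion scores (higher is better) are invented.
scores = {
    "no action":   [0.2, 0.9, 0.8],
    "nourishment": [0.7, 0.5, 0.4],
    "groynes":     [0.9, 0.3, 0.2],
}

def random_weights(k, rng):
    # Uniform weights on the simplex via normalized exponential draws
    ws = [rng.expovariate(1.0) for _ in range(k)]
    total = sum(ws)
    return [w / total for w in ws]

def rank1_acceptability(scores, n=10000, seed=1):
    rng = random.Random(seed)
    wins = {a: 0 for a in scores}
    k = len(next(iter(scores.values())))
    for _ in range(n):
        w = random_weights(k, rng)
        best = max(scores,
                   key=lambda a: sum(wi * si for wi, si in zip(w, scores[a])))
        wins[best] += 1
    return {a: wins[a] / n for a in scores}

print(rank1_acceptability(scores))
```

The fraction of weight draws under which an alternative ranks first is its rank-1 acceptability index, the core SMAA output.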
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
ERIC Educational Resources Information Center
Kalman, Yoram M.
2016-01-01
In an era when novel educational technologies are constantly introduced to the marketplace, often accompanied by hyperbolic claims that these ground-breaking innovations will transform the educational landscape, decision makers in educational institutions need a methodological approach for examining the innovative potential of new educational…
Power Principles for Educational Leaders: Research into Practice
ERIC Educational Resources Information Center
Hoy, Wayne K.; Tarter, C. John
2011-01-01
Purpose: The aim of this article is to examine the empirical literature on irrationality and identify a set of concepts to help administrators cope with irrationality in decision making. Design/methodology/approach: This analysis is a synthesis of the selected research literature on irrationality. Findings: A set of seven concepts and propositions…
ERIC Educational Resources Information Center
Porter, Susan G.; Koch, Steven P.; Henderson, Andrew
2010-01-01
Background: There is a lack of consistent, comprehensible data collection and analysis methods for evaluating teacher preparation program's coverage of required standards for accreditation. Of particular concern is the adequate coverage of standards and competencies that address the teaching of English learners and teachers of students from…
E-Learning Adoption: The Role of Relative Advantages, Trialability and Academic Specialisation
ERIC Educational Resources Information Center
Hsbollah, Hafizah Mohamad; Idris, Kamil Md.
2009-01-01
Purpose: The purpose of this paper is to investigate Universiti Utara Malaysia (UUM) lecturers' perception of the decision regarding adopting e-learning as a teaching tool. Design/methodology/approach: Data were collected from 244 lecturers in Universiti Utara Malaysia. Internal consistency using Cronbach alpha and exploratory factor analysis with…
[Artificial neural networks for decision making in urologic oncology].
Remzi, M; Djavan, B
2007-06-01
This chapter presents a detailed introduction to Artificial Neural Networks (ANNs) and their contribution to modern Urologic Oncology. It includes a description of ANN methodology, points out the differences between artificial intelligence and traditional statistical models in terms of usefulness for patients and clinicians, and highlights their advantages over current statistical analysis.
Salary Equity Studies: The State of the Art. ASHE Annual Meeting 1982 Paper.
ERIC Educational Resources Information Center
Hengstler, Dennis D.; And Others
The strengths and weaknesses of various methodologies in conducting salary equity studies are examined. Particular attention is paid to the problems of identifying appropriate matches in the paired-comparison approach and to the sample, predictor and decision-rule problems associated with the regression analysis approach. In addition, highlights…
Getting State Education Data Right: What We Can Learn from Tennessee
ERIC Educational Resources Information Center
Jones, Joseph; Southern, Kyle
2011-01-01
Federal education policy in recent years has encouraged state and local education agencies to embrace data use and analysis in decision-making, ranging from policy development and implementation to performance evaluation. The capacity of these agencies to make effective and methodologically sound use of collected data for these purposes remains an…
ERIC Educational Resources Information Center
Maxwell, James R.; Gilberti, Anthony F.; Mupinga, Davison M.
2006-01-01
This paper studies some of the problems associated with case studies and makes recommendations for using standard and innovative methodologies effectively. Human resource management (HRM) and resource development cases provide context for analysis and decision-making designs in different industries. In most HRM development and training courses…
Profiling Students for Remediation Using Latent Class Analysis
ERIC Educational Resources Information Center
Boscardin, Christy K.
2012-01-01
While clinical exams using SPs are used extensively across the medical schools for summative purposes and high-stakes decisions, the method of identifying students for remediation varies widely and there is a lack of consensus on the best methodological approach. The purpose of this study is to provide an alternative approach to identification of…
Argumentation: A Methodology to Facilitate Critical Thinking.
Makhene, Agnes
2017-06-20
Caring is a demanding nursing activity that engages the complex nature of a human being and requires complex decision-making and problem solving through the critical thinking process. It is therefore essential that critical thinking be facilitated in education generally, and in nursing education particularly, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory and descriptive design that is contextual was used. A purposive sampling method was used to draw the sample, and Miles and Huberman's methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were applied. Following data analysis, the findings were integrated with the literature, culminating in the formulation of guidelines for using argumentation as a methodology to facilitate critical thinking.
Lichacz, Frederick M J
2008-10-01
The present study represents a preliminary examination of the relationship between situation awareness (SA) and confidence within a distributed information-sharing environment using the calibration methodology. The calibration methodology uses the indices of calibration, resolution, and over/under-confidence to examine the relationship between the accuracy of responses and the degree of confidence one has in those responses, which leads to a measure of an operator's meta-SA. The results of this study revealed that, although the participants were slightly overconfident in their responses, overall they demonstrated good meta-SA. That is, the participants' subjective probability judgements corresponded to their pattern of SA response accuracy. It is concluded that calibration analysis offers a better methodology for expanding our understanding of the relationship between SA and confidence, and ultimately of how this relationship can affect decision-making and performance in applied settings, than examining SA measures alone.
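The over/under-confidence index used in calibration analysis is simply mean stated confidence minus overall accuracy; a minimal sketch with invented responses:

```python
# Hedged sketch of the over/under-confidence index mentioned above:
# mean subjective probability judgement minus overall proportion correct.
# The confidence ratings and correctness data are invented.

def overconfidence(confidences, correct):
    """Positive result => overconfident; negative => underconfident."""
    mean_conf = sum(confidences) / len(confidences)
    accuracy = sum(correct) / len(correct)
    return mean_conf - accuracy

conf = [0.9, 0.8, 0.7, 0.6, 1.0]   # probability judgement per question
hits = [1, 1, 0, 1, 0]             # 1 = response was correct
print(round(overconfidence(conf, hits), 2))
```

In the full calibration methodology this index is complemented by calibration and resolution scores computed within confidence bins, which this sketch omits.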
DECISION-COMPONENTS OF NICE'S TECHNOLOGY APPRAISALS ASSESSMENT FRAMEWORK.
de Folter, Joost; Trusheim, Mark; Jonsson, Pall; Garner, Sarah
2018-01-01
Value assessment frameworks have gained prominence recently in the context of U.S. healthcare. Such frameworks set out a series of factors that are considered in funding decisions. The UK's National Institute for Health and Care Excellence (NICE) is an established health technology assessment (HTA) agency. We present a novel application of text analysis that characterizes NICE's Technology Appraisals in the context of the newer assessment frameworks and presents the results visually. A total of 243 documents of NICE's medicines guidance from 2007 to 2016 were analyzed. Text analysis was used to identify a hierarchical set of decision factors considered in the assessments. The frequency of decision factors stated in the documents was determined, along with their association with terms related to uncertainty. The results were incorporated into visual representations of the hierarchical factors. We identified 125 decision factors and hierarchically grouped these into eight domains: Clinical Effectiveness, Cost Effectiveness, Condition, Current Practice, Clinical Need, New Treatment, Studies, and Other Factors. Textual analysis showed that all domains appeared consistently in the guidance documents. Many factors were commonly associated with terms relating to uncertainty. A series of visual representations was created. This study reveals the complexity and consistency of NICE's decision-making processes and demonstrates that cost effectiveness is not the only decision criterion. The study highlights the importance of processes and methodology that can take both quantitative and qualitative information into account. Visualizations can help effectively communicate this complex information during the decision-making process and subsequently to stakeholders.
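The string-search step described in this abstract can be sketched simply: count how often each decision-factor term appears and how often it co-occurs with uncertainty language in the same sentence. The factor terms, uncertainty terms, and guidance excerpt below are all invented, not taken from NICE documents:

```python
import re

# Hypothetical uncertainty vocabulary for the co-occurrence check
UNCERTAINTY_TERMS = ["uncertain", "unclear", "not robust"]

def factor_counts(text, factor):
    """Return (sentences mentioning the factor,
    sentences where it co-occurs with uncertainty language)."""
    hits = flagged = 0
    for sentence in re.split(r"(?<=[.!?])\s+", text.lower()):
        if factor in sentence:
            hits += 1
            if any(term in sentence for term in UNCERTAINTY_TERMS):
                flagged += 1
    return hits, flagged

guidance = ("The committee noted that the clinical effectiveness evidence "
            "was uncertain. Cost effectiveness estimates were considered "
            "not robust. There is a clear clinical need in this population.")

for factor in ("clinical effectiveness", "cost effectiveness", "clinical need"):
    print(factor, factor_counts(guidance, factor))
```

The paper's actual pipeline also used natural language processing, topic modeling, and machine learning, which a plain substring search does not capture.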
Petrou, Stavros; Kwon, Joseph; Madan, Jason
2018-05-10
Economic analysts are increasingly likely to rely on systematic reviews and meta-analyses of health state utility values to inform the parameter inputs of decision-analytic modelling-based economic evaluations. Beyond the context of economic evaluation, evidence from systematic reviews and meta-analyses of health state utility values can be used to inform broader health policy decisions. This paper provides practical guidance on how to conduct a systematic review and meta-analysis of health state utility values. The paper outlines a number of stages in conducting a systematic review, including identifying the appropriate evidence, study selection, data extraction and presentation, and quality and relevance assessment. The paper outlines three broad approaches that can be used to synthesise multiple estimates of health utilities for a given health state or condition, namely fixed-effect meta-analysis, random-effects meta-analysis and mixed-effects meta-regression. Each approach is illustrated by a synthesis of utility values for a hypothetical decision problem, and software code is provided. The paper highlights a number of methodological issues pertinent to the conduct of meta-analysis or meta-regression. These include the importance of limiting synthesis to 'comparable' utility estimates, for example those derived using common utility measurement approaches and sources of valuation; the effects of reliance on limited or poorly reported published data from primary utility assessment studies; the use of aggregate outcomes within analyses; approaches to generating measures of uncertainty; handling of median utility values; challenges surrounding the disentanglement of utility estimates collected serially within the context of prospective observational studies or prospective randomised trials; challenges surrounding the disentanglement of intervention effects; and approaches to measuring model validity. 
Areas of methodological debate and avenues for future research are highlighted.
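Of the three synthesis approaches outlined above, fixed-effect (inverse-variance) meta-analysis is the simplest. A hedged sketch with invented utility estimates follows; the paper provides its own software code, which this does not reproduce:

```python
# Illustrative fixed-effect (inverse-variance) pooling of health state
# utility values. The utilities and standard errors are invented.

def fixed_effect_pool(estimates, std_errors):
    """Pooled estimate weights each study by the inverse of its variance."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * u for w, u in zip(weights, estimates)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

utilities = [0.72, 0.68, 0.75]   # hypothetical utilities for one health state
ses = [0.02, 0.03, 0.025]        # hypothetical standard errors
pooled, se = fixed_effect_pool(utilities, ses)
print(round(pooled, 3), round(se, 3))
```

Random-effects meta-analysis and mixed-effects meta-regression extend this by adding a between-study variance component, which matters when, as the paper warns, the pooled estimates come from differing measurement approaches.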
How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks.
Bergmo, Trine Strand
2015-11-09
Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed.
[Impact of shared-decision making on patient satisfaction].
Suh, Won S; Lee, Chae Kyung
2010-01-01
The purpose of this research is to analyze the impact of shared decision making on patient satisfaction. The study is significant in that it focuses on developing appropriate methodologies and analyzing data to identify patient preferences, with the goals of optimizing treatment selection and substantiating the relationship between such preferences and their impact on outcomes. A thorough literature review, which produced a framework illustrating the key dimensions of shared decision making, was followed by a quantitative assessment and regression analysis of patient-perceived satisfaction and the degree of shared decision making. A positive association was evident between shared decision making and patient satisfaction. The impact of shared decision making on patient satisfaction was greater than that of other variables, including gender, education, and number of visits. Patients who participate in care-related decisions and who are given an explanation of their health problems are more likely to be satisfied with their care. It would benefit health care organizations to train their medical professionals in this communication method and to include it in their practice guidelines.
NASA Astrophysics Data System (ADS)
Roy, Jean; Breton, Richard; Paradis, Stephane
2001-08-01
Situation Awareness (SAW) is essential for commanders to conduct decision-making (DM) activities. Situation Analysis (SA) is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of SAW for the decision maker. Operational trends in warfare put the situation analysis process under pressure. This emphasizes the need for a real-time computer-based Situation Analysis Support System (SASS) to aid commanders in achieving the appropriate situation awareness, thereby supporting their response to actual or anticipated threats. Data fusion is clearly a key enabler for SA and a SASS. Since data fusion is used for SA in support of dynamic human decision-making, the exploration of SA concepts and the design of data fusion techniques must take human factors into account in order to ensure a cognitive fit of the fusion system with the decision maker. Indeed, the tight integration of the human element with the SA technology is essential. Regarding these issues, this paper provides a description of CODSI (Command Decision Support Interface), an operational-like human-machine interface prototype for investigations in computer-based SA and command decision support. With CODSI, one objective was to apply recent developments in SA theory and information display technology to the problem of enhancing SAW quality. It thus provides a capability to adequately convey tactical information to command decision makers. It also supports the study of human-computer interactions for SA, and methodologies for SAW measurement.
Zhu; Dale
2000-10-01
Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.
Cognitive mapping tools: review and risk management needs.
Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor
2012-08-01
Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.
Li, Daiqing; Zhang, Chen; Pizzol, Lisa; Critto, Andrea; Zhang, Haibo; Lv, Shihai; Marcomini, Antonio
2014-04-01
The rapid industrial development and urbanization that occurred in China over the past 30 years have dramatically increased the consumption of natural resources and raw materials, exacerbating human pressure on environmental ecosystems. As a result, large-scale environmental pollution of soil, natural waters, and urban air has been recorded. The development of effective industrial planning to support sustainable regional economic development has become an issue of serious concern for local authorities, which need to select safe sites for new industrial settlements (i.e. industrial plants) using assessment approaches that consider cumulative impacts, synergistic pollution effects, and the risks of accidental releases. In order to support decision makers in developing efficient and effective regional land-use plans, encompassing the identification of suitable areas for new industrial settlements and of areas in need of intervention measures, this study provides a spatial regional risk assessment methodology which integrates relative risk assessment (RRA) and socio-economic assessment (SEA) and makes use of spatial analysis (GIS) methodologies and multicriteria decision analysis (MCDA) techniques. The proposed methodology was applied to the Chinese region of Hulunbeier, located in the eastern Inner Mongolia Autonomous Region, adjacent to the Republic of Mongolia. The application results demonstrated the effectiveness of the proposed methodology in identifying the most hazardous and risky industrial settlements, the most vulnerable regional receptors, and the regional districts most relevant for intervention measures, since they are characterized by high regional risk and strong socio-economic development conditions. Copyright © 2013 Elsevier Ltd. All rights reserved.
From LCAs to simplified models: a generic methodology applied to wind power electricity.
Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle
2013-02-05
This study presents a generic methodology for producing simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify the key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performances have been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them a robust but simple support tool for assessing the environmental performance of energy systems.
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
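A suboptimal sequential rule of the general kind this abstract describes can be sketched as a likelihood-ratio test that declares "failure" or "no failure" once accumulated evidence crosses a threshold. The Gaussian observation model and the thresholds below are invented for illustration and are not the paper's design:

```python
import math

# Hedged sketch: sequential likelihood-ratio failure detection.
# Observations are modeled N(0, sigma) when healthy and
# N(mu_fail, sigma) when failed; thresholds are hypothetical.

def sequential_decide(observations, mu_fail=1.0, sigma=1.0,
                      upper=math.log(19), lower=math.log(1 / 19)):
    """Accumulate the log-likelihood ratio and decide when it
    crosses a threshold; returns (decision, samples used)."""
    llr = 0.0
    for k, y in enumerate(observations, start=1):
        # LLR increment for N(mu_fail, sigma^2) vs N(0, sigma^2)
        llr += (mu_fail * y - mu_fail ** 2 / 2) / sigma ** 2
        if llr >= upper:
            return "failure", k
        if llr <= lower:
            return "no failure", k
    return "undecided", len(observations)

print(sequential_decide([1.2, 0.9, 1.4, 1.1, 1.3]))
```

A Bayes sequential rule would replace the fixed thresholds with ones derived from priors and decision costs; the fixed-threshold form shown here is one way to reduce the computational requirement, in the spirit of the suboptimal rules discussed.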
A meta-analysis of the social communication questionnaire: Screening for autism spectrum disorder.
Chesnut, Steven R; Wei, Tianlan; Barnard-Brak, Lucy; Richman, David M
2017-11-01
The current meta-analysis examines the previous research on the utility of the Social Communication Questionnaire as a screening instrument for autism spectrum disorder. Previously published reports have highlighted the inconsistencies between Social Communication Questionnaire screening results and formal autism spectrum disorder diagnoses. The variations in accuracy led some researchers to question the validity of the Social Communication Questionnaire. This study systematically examined the accuracy of the Social Communication Questionnaire as a function of the methodological decisions made by researchers screening for autism spectrum disorder over the last 15 years. Findings from this study suggest that the Social Communication Questionnaire is an acceptable screening instrument for autism spectrum disorder (area under the curve = 0.885). Variations in methodological decisions, however, greatly influenced the accuracy of the Social Communication Questionnaire in screening for autism spectrum disorder. Of these methodological variations, using the Current instead of the Lifetime version of the Social Communication Questionnaire resulted in the largest detrimental effect (d = -3.898), followed by using the Social Communication Questionnaire with individuals younger than 4 years of age (d = -2.924) and the type of sample used (d = -4.828 for clinical samples, -2.734 for convenience samples, and -1.422 for community samples). Directions for future research and implications for using the Social Communication Questionnaire to screen for autism spectrum disorder are discussed.
Costing the satellite power system
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1978-01-01
The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, cost estimating relationships, grass-roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
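The Monte Carlo approach to probabilistic costing can be sketched as summing uncertain cost components drawn from assumed distributions. The component names, triangular low/mode/high estimates, and units below are hypothetical, not the paper's figures:

```python
import random

# Illustrative Monte Carlo cost-risk sketch (not the paper's model):
# total system cost as the sum of uncertain components, each drawn
# from a triangular distribution. All estimates are invented.
components = {                      # name: (low, mode, high), arbitrary units
    "solar array": (10, 14, 25),
    "transmitter": (5, 8, 15),
    "launch":      (20, 30, 60),
}

def simulate_total_cost(components, n=20000, seed=42):
    """Return the median and 90th-percentile total cost across n draws."""
    rng = random.Random(seed)
    totals = [sum(rng.triangular(lo, hi, mode)          # note: (low, high, mode)
                  for lo, mode, hi in components.values())
              for _ in range(n)]
    totals.sort()
    return totals[n // 2], totals[int(0.9 * n)]

median, p90 = simulate_total_cost(components)
print(round(median, 1), round(p90, 1))
```

The spread between the median and upper percentiles is the risk information that deterministic point estimates cannot provide.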
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akar, S.; Young, K.
Geothermal exploration projects have a significant amount of risk associated with uncertainties encountered in the discovery of the geothermal resource. Two of the largest challenges for increased geothermal deployment are 1) understanding when and how to proceed in an exploration program, and 2) when to walk away from a site. Current methodologies for exploration decision-making are formulatedby subjective expert opinion which can be incorrectly biased by expertise (e.g. geochemistry, geophysics), geographic location of focus, and the assumed conceptual model. The aim of this project is to develop a methodology for more objective geothermal exploration decision making at a given location,more » including go/no-go decision points to help developers and investors decide when to give up on alocation. In this scope, two different approaches are investigated: 1) value of information analysis (VOIA) which is used for evaluating and quantifying the value of a data before they are purchased, and 2) enthalpy-based exploration targeting based on reservoir size, temperature gradient estimates, and internal rate of return (IRR). The first approach, VOIA, aims to identify the value of aparticular data when making decisions with an uncertain outcome. This approach targets the pre-drilling phase of exploration. These estimated VOIs are highly affected by the size of the project and still have a high degree of subjectivity in assignment of probabilities. The second approach, exploration targeting, is focused on decision making during the drilling phase. It starts with a basicgeothermal project definition that includes target and minimum required production capacity and initial budgeting for exploration phases. Then, it uses average temperature gradient, reservoir temperature estimates, and production capacity to define targets and go/no-go limits. The decision analysis in this approach is based on achieving a minimum IRR at each phase of the project. 
This second approach was determined to be less subjective, since its numerical inputs come from the collected data. It also helps facilitate communication between project managers and exploration geologists in making objective go/no-go decisions throughout the different project phases.
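The IRR-based go/no-go rule described above can be sketched in a few lines. The cash flows, hurdle rate, and bisection solver below are illustrative assumptions, not the project's actual model:

```python
def irr(cashflows, lo=-0.99, hi=10.0):
    """Internal rate of return by bisection.

    cashflows[t] is the net cash flow in period t (index 0 = today).
    Assumes exactly one sign change in NPV over [lo, hi].
    """
    def npv(r):
        return sum(cf / (1.0 + r) ** t for t, cf in enumerate(cashflows))

    for _ in range(200):  # 200 halvings give far more precision than needed
        mid = (lo + hi) / 2.0
        if npv(lo) * npv(mid) <= 0:
            hi = mid  # root lies in [lo, mid]
        else:
            lo = mid  # root lies in [mid, hi]
    return (lo + hi) / 2.0

def go_no_go(cashflows, hurdle_irr):
    """Continue the exploration phase only if projected IRR meets the hurdle."""
    return "go" if irr(cashflows) >= hurdle_irr else "no-go"
```

For example, a phase costing 100 with two projected returns of 60 yields an IRR of about 13%, clearing a 10% hurdle, while returns of 40 would not.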
Saint-Hilary, Gaelle; Cadour, Stephanie; Robert, Veronique; Gasparini, Mauro
2017-05-01
Quantitative methodologies have been proposed to support decision making in drug development and monitoring. In particular, multicriteria decision analysis (MCDA) and stochastic multicriteria acceptability analysis (SMAA) are useful tools to assess the benefit-risk ratio of medicines according to the performances of the treatments on several criteria, accounting for the preferences of the decision makers regarding the relative importance of these criteria. However, even in its probabilistic form, MCDA requires exact elicitation of the weights of the criteria by the decision makers, which may be difficult to achieve in practice. SMAA allows for more flexibility and can be used with unknown or partially known preferences, but it is less popular due to its increased complexity and the high degree of uncertainty in its results. In this paper, we propose a simple model as a generalization of MCDA and SMAA, obtained by applying a Dirichlet distribution to the weights of the criteria and letting its parameters vary. This single model encompasses both MCDA and SMAA, and allows for a more extended exploration of the benefit-risk assessment of treatments. The precision of its results depends on the precision parameter of the Dirichlet distribution, which can be naturally interpreted as the strength of confidence of the decision makers in their elicitation of preferences. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
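A minimal sketch of this idea, assuming a simple additive value model: criterion weights are drawn from a Dirichlet distribution whose mean encodes the elicited preferences and whose precision parameter controls confidence. A very high precision approaches deterministic MCDA; a low precision behaves like SMAA. The function names, scores, and weights below are hypothetical:

```python
import random

def dirichlet(alpha, rng):
    """Sample a Dirichlet vector via normalized Gamma draws."""
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [x / total for x in g]

def acceptability(scores, base_weights, precision, n=2000, seed=0):
    """First-rank acceptability of each treatment.

    scores[t][c]: partial value of treatment t on criterion c (0..1).
    base_weights: elicited mean weights (sum to 1).
    precision:    Dirichlet concentration; large -> near-deterministic MCDA.
    Returns the fraction of sampled weight vectors under which each
    treatment attains the highest weighted score.
    """
    rng = random.Random(seed)
    alpha = [precision * w for w in base_weights]
    wins = [0] * len(scores)
    for _ in range(n):
        w = dirichlet(alpha, rng)
        vals = [sum(wi * si for wi, si in zip(w, row)) for row in scores]
        wins[vals.index(max(vals))] += 1
    return [c / n for c in wins]
```

With confident elicitation (high precision), one treatment dominates almost surely; with vague preferences (low precision), the competing treatment also wins a nontrivial share of samples, SMAA-style.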
Connors, Brenda L.; Rende, Richard; Colton, Timothy J.
2014-01-01
The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic – the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts – and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from movement pattern analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = 0.89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach when deciding between measuring patterning versus discrete behavioral counts for determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns. PMID:24999336
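For illustration, a common two-way random-effects, single-measures ICC (often written ICC(2,1), for absolute agreement) can be computed as below. This is a generic textbook formulation, not necessarily the exact estimator used in the study:

```python
def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    x[i][j] is rater j's score for target i (n targets, k raters).
    Built from the standard ANOVA mean squares: rows (targets),
    columns (raters), and residual error.
    """
    n, k = len(x), len(x[0])
    grand = sum(sum(row) for row in x) / (n * k)
    row_means = [sum(row) / k for row in x]
    col_means = [sum(x[i][j] for i in range(n)) / n for j in range(k)]
    msr = k * sum((r - grand) ** 2 for r in row_means) / (n - 1)
    msc = n * sum((c - grand) ** 2 for c in col_means) / (k - 1)
    sse = sum((x[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfect agreement yields 1.0; a constant offset between raters lowers the coefficient, since ICC(2,1) penalizes absolute disagreement rather than only inconsistency.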
Breaking the Change Barrier: A 40 Year Analysis of Air Force Pilot Retention Solutions
national defense. A problem/solution research methodology using the organizational management theory of path dependence explored the implications of the...exodus is to start the incentive process earlier in the career and prior to the final decision to separate. Path dependent analysis indicates all prior... incentive options and personal involvement in the overall process. The Air Force can annually budget and forecast incentive requirements and personnel
Meyer, Travis S; Muething, Joseph Z; Lima, Gustavo Amoras Souza; Torres, Breno Raemy Rangel; del Rosario, Trystyn Keia; Gomes, José Orlando; Lambert, James H
2012-01-01
Radiological nuclear emergency responders must be able to coordinate evacuation and relief efforts following the release of radioactive material into populated areas. In order to respond quickly and effectively to a nuclear emergency, high-level coordination is needed between a number of large, independent organizations, including police, military, hazmat, and transportation authorities. Given the complexity, scale, time-pressure, and potential negative consequences inherent in radiological emergency responses, tracking and communicating information that will assist decision makers during a crisis is crucial. The emergency response team at the Angra dos Reis nuclear power facility, located outside of Rio de Janeiro, Brazil, presently conducts emergency response simulations once every two years to prepare organizational leaders for real-life emergency situations. However, current exercises are conducted without the aid of electronic or software tools, resulting in possible cognitive overload and delays in decision-making. This paper describes the development of a decision support system employing systems methodologies, including cognitive task analysis and human-machine interface design. The decision support system can aid the coordination team by automating cognitive functions and improving information sharing. A prototype of the design will be evaluated by plant officials in Brazil and incorporated to a future trial run of a response simulation.
Siebert, Uwe; Rochau, Ursula; Claxton, Karl
2013-01-01
Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VoI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VoI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis. 
To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VoI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in how to determine whether or not and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
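The core VOI quantity, the expected value of perfect information (EVPI), is the gap between the expected payoff attainable with perfect knowledge of the uncertain state and the expected payoff of the best decision made now. A minimal sketch with hypothetical payoffs:

```python
def evpi(probs, payoffs):
    """Expected value of perfect information for a discrete choice.

    probs[s]:      probability of state s (sums to 1).
    payoffs[a][s]: net benefit of action a if state s is true.
    """
    # Best action chosen now, under current uncertainty.
    best_under_uncertainty = max(
        sum(p * v for p, v in zip(probs, row)) for row in payoffs)
    # Expected payoff if the true state were revealed before choosing.
    best_with_perfect_info = sum(
        p * max(row[s] for row in payoffs) for s, p in enumerate(probs))
    return best_with_perfect_info - best_under_uncertainty
```

If the EVPI exceeds the cost of a proposed study, gathering further evidence may be worthwhile; parameter- and sample-specific variants (EVPPI, EVSI) refine this for choosing which study to run and at what sample size.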
Approaches to Forecasting Demands for Library Network Services. Report No. 10.
ERIC Educational Resources Information Center
Kang, Jong Hoa
The problem of forecasting monthly demands for library network services is considered in terms of using forecasts as inputs to policy analysis models, and in terms of using forecasts to aid in the making of budgeting and staffing decisions. Box-Jenkins time-series methodology, adaptive filtering, and regression approaches are examined and compared…
Key concepts and methods in social vulnerability and adaptive capacity
Daniel J. Murphy; Carina Wyborn; Laurie Yung; Daniel R. Williams
2015-01-01
National forests have been asked to assess how climate change will impact nearby human communities. To assist their thinking on this topic, we examine the concepts of social vulnerability and adaptive capacity with an emphasis on a range of theoretical and methodological approaches. This analysis is designed to help researchers and decision-makers select appropriate...
Ahmed, Rana; McCaffery, Kirsten J; Aslani, Parisa
2013-04-01
Attention-deficit/hyperactivity disorder (ADHD) is a pediatric psychological condition commonly treated with stimulant medications. Negative media reports and stigmatizing societal attitudes surrounding the use of these medications make it difficult for parents of affected children to accept stimulant treatment, despite it being first-line therapy. The purpose of this study was to identify factors that influence parental decision making regarding stimulant treatment for ADHD. A systematic review of the literature was conducted to identify studies: 1) that employed qualitative methodology, 2) that highlighted treatment decision(s) about stimulant medication, 3) in which the decision(s) were made by the parent of a child with an official ADHD diagnosis, and 4) that examined the factors affecting the decision(s) made. Individual factors influencing parental treatment decision making, and the major themes encompassing these factors, were identified and followed by a thematic analysis. Eleven studies reporting on the experiences of 335 parents of children with ADHD were included. Four major themes encompassing influences on parents' decisions were derived from the thematic analysis performed: confronting the diagnosis, external influences, apprehension regarding therapy, and experience with the healthcare system. The findings of this systematic review reveal that there are multiple factors that influence parents' decisions about stimulant therapy. This information can assist clinicians in enhancing information delivery to parents of children with ADHD, and help reduce parental ambivalence surrounding stimulant medication use. Future work needs to address parental concerns about stimulants, and increase their involvement in shared decision making with clinicians to empower them to make the most appropriate treatment decision for their child.
Wahlster, Philip; Goetghebeur, Mireille; Kriza, Christine; Niederländer, Charlotte; Kolominsky-Rabas, Peter
2015-07-09
The diffusion of health technologies from translational research to reimbursement depends on several factors, including the results of health economic analysis. Recent research identified several flaws in health economic concepts. Additionally, the heterogeneous viewpoints of participating stakeholders are rarely systematically addressed in current decision-making. Multi-criteria Decision Analysis (MCDA) provides an opportunity to tackle these issues. The objective of this study was to review applications of MCDA methods in decisions addressing the trade-off between costs and benefits. Using basic steps of the PRISMA guidelines, a systematic review of the healthcare literature was performed to identify original research articles from January 1990 to April 2014. Medline, PubMed, Springer Link and specific journals were searched. Using predefined categories, bibliographic records were systematically extracted regarding the type of policy applications, MCDA methodology, criteria used and their definitions. 22 studies were included in the analysis. 15 studies (68 %) used direct MCDA approaches and seven studies (32 %) used preference elicitation approaches. Four studies (19 %) focused on technologies in the early innovation process. The majority (18 studies - 81 %) examined reimbursement decisions. Decision criteria used in the studies were obtained from literature research and context-specific studies, expert opinions, and group discussions. The number of criteria ranged from three to 15. The most frequently used criteria were health outcomes (73 %), disease impact (59 %), and implementation of the intervention (40 %). Economic criteria included cost-effectiveness criteria (14 studies, 64 %), and total costs/budget impact of an intervention (eight studies, 36 %). The process of including economic aspects differs considerably among studies. Some studies directly compare costs with other criteria, while some include economic considerations in a second step.
In early innovation processes, MCDA can provide information about stakeholder preferences as well as evidence needs for further development. However, only a minority of these studies include economic features, due to the limited evidence. From a technical perspective, the most important economic criterion, cost-effectiveness, should not be included as a separate criterion, as it is already a composite of costs and benefits. There is a significant lack of consensus in the methodology employed by the various studies, which highlights the need for guidance on the application of MCDA at specific phases of an innovation.
Conflicts in developing countries: a case study from Rio de Janeiro
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bredariol, Celso Simoes; Magrini, Alessandra
In developing countries, environmental conflicts are resolved mainly in the political arena. In the developed nations, approaches favoring structured negotiation support techniques are more common, with methodologies and studies designed especially for this purpose, deriving from Group Communications and Decision Theory. This paper analyzes an environmental dispute in the City of Rio de Janeiro, applying conflict analysis methods and simulating its settlement. It concludes that the use of these methodologies in the developing countries may be undertaken with adaptations, designed to train community groups in negotiating while fostering the democratization of the settlement of these disputes.
Cost accounting methodologies in price setting of acute inpatient services in Hungary.
Gaal, Peter; Stefka, Nóra; Nagy, Júlia
2006-08-01
On the basis of documentary analysis and interviews with decision makers, this paper discusses the cost accounting methodologies used for price setting of inpatient services in the Hungarian health care system, focusing on the sector of acute inpatient care, which has been financed through the Hungarian adaptation of Diagnosis Related Groups since 1993. Hungary has a quite sophisticated DRG system, which has had a deep impact on the efficiency of the acute inpatient care sector. Nevertheless, the system requires continuous maintenance, where the cooperation of hospitals, as well as the minimisation of political influence, are critical success factors.
From data to evidence: evaluative methods in evidence-based medicine.
Landry, M D; Sibbald, W J
2001-11-01
The amount of published information is increasing exponentially, and recent technologic advances have created systems whereby mass distribution of this information can occur at an unprecedented rate. This is particularly true in the broad field of medicine, as the absolute volume of data available to the practicing clinician is creating new challenges in the management of relevant information flow. Evidence-based medicine (EBM) is an information management and learning strategy that seeks to integrate clinical expertise with the best evidence available in order to make effective clinical decisions that will ultimately improve patient care. The systematic approach underlying EBM encourages the clinician to formulate specific and relevant questions, which are answered in an iterative manner through accessing the best available published evidence. The arguments against EBM stem from the idea that there are inherent weaknesses in research methodologies and that emphasis placed on published research may ignore clinical skills and individual patient needs. Despite these arguments, EBM is gaining momentum and is consistently used as a method of learning and improving health care delivery. However, if EBM is to be effective, the clinician needs to have a critical understanding of research methodology in order to judge the value and level of a particular data source. Without critical analysis of research methodology, there is an inherent risk of drawing incorrect conclusions that may affect clinical decision-making. Currently, there is a trend toward using secondary pre-appraised data rather than primary sources as best evidence. We review the qualitative and quantitative methodology commonly used in EBM and argue that it is necessary for the clinician to preferentially use primary rather than secondary sources in making clinically relevant decisions.
A decision model for planetary missions
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.; Brigadier, W. L.
1976-01-01
Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science value), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and cost as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
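As a hedged illustration of the expected-utility idea (not the paper's actual model), one might score each mission alternative by a risk-averse utility of science value, earned only on success, minus a cost penalty. All function names, parameters, and numbers below are hypothetical:

```python
import math

def expected_utility(p_success, science_value, cost, k=0.01, cost_weight=0.01):
    """Expected utility of a mission alternative.

    Concave (exponential) utility of science value models diminishing
    returns and risk aversion; cost enters as a linear penalty.
    """
    u_science = 1.0 - math.exp(-k * science_value)  # in (0, 1)
    return p_success * u_science - cost_weight * cost

def choose(alternatives):
    """alternatives: name -> (p_success, science_value, cost)."""
    return max(alternatives, key=lambda a: expected_utility(*alternatives[a]))
```

Under these illustrative parameters, a reliable flyby returning modest science can outrank a riskier lander promising three times the science, because the concave utility discounts the larger payoff while the lower success probability scales it down further.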
ERIC Educational Resources Information Center
Rupp, André A.
2018-01-01
This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…
ERIC Educational Resources Information Center
Pereira-Leon, Maura J.
2010-01-01
This three-year study examined how participation in a 10-month technology-enhanced professional development program (PDP) influenced K-12 teachers' decisions to integrate technology into their teaching practices or to ignore it. Carspecken's (1996) qualitative research methodology of Critical Ethnography provided the theoretical and methodological framework to…
Education and Training in Ethical Decision Making: Comparing Context and Orientation
ERIC Educational Resources Information Center
Perri, David F.; Callanan, Gerard A.; Rotenberry, Paul F.; Oehlers, Peter F.
2009-01-01
Purpose: The purpose of this paper is to present a teaching methodology for improving the understanding of ethical decision making. This pedagogical approach is applicable in college courses and in corporate training programs. Design/methodology/approach: Participants are asked to analyze a set of eight ethical dilemmas with differing situational…
Adaptive Multi-scale Prognostics and Health Management for Smart Manufacturing Systems
Choo, Benjamin Y.; Adams, Stephen C.; Weiss, Brian A.; Marvel, Jeremy A.; Beling, Peter A.
2017-01-01
The Adaptive Multi-scale Prognostics and Health Management (AM-PHM) is a methodology designed to enable PHM in smart manufacturing systems. In application, PHM information is not yet fully utilized in higher-level decision-making in manufacturing systems. AM-PHM leverages and integrates lower-level PHM information, such as from a machine or component, with hierarchical relationships across the component, machine, work cell, and assembly line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are then made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. To overcome the exponential explosion of complexity associated with describing a large manufacturing system, the AM-PHM methodology takes a hierarchical Markov Decision Process (MDP) approach to describing the system and solving for an optimized policy. A description of the AM-PHM methodology is followed by a simulated industry-inspired example to demonstrate the effectiveness of AM-PHM. PMID:28736651
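A single-machine toy instance of the MDP formulation (not AM-PHM itself; the states, rewards, and transition probabilities are hypothetical) can illustrate how a health-aware policy falls out of plain value iteration:

```python
def value_iteration(states, actions, reward, trans, gamma=0.9, iters=500):
    """Generic finite MDP value iteration; returns values and greedy policy.

    reward[(s, a)]: immediate reward; trans[(s, a)]: dict next_state -> prob.
    """
    def q(s, a, v):
        return reward[(s, a)] + gamma * sum(
            p * v[s2] for s2, p in trans[(s, a)].items())

    v = {s: 0.0 for s in states}
    for _ in range(iters):
        v = {s: max(q(s, a, v) for a in actions) for s in states}
    policy = {s: max(actions, key=lambda a: q(s, a, v)) for s in states}
    return v, policy

# Hypothetical machine-health MDP: producing earns reward, but running a
# degraded machine risks failure; maintenance costs now and restores health.
states = ["healthy", "degraded", "failed"]
actions = ["run", "maintain"]
reward = {("healthy", "run"): 10, ("degraded", "run"): 6, ("failed", "run"): -50,
          ("healthy", "maintain"): -5, ("degraded", "maintain"): -5,
          ("failed", "maintain"): -5}
trans = {("healthy", "run"): {"healthy": 0.9, "degraded": 0.1},
         ("degraded", "run"): {"degraded": 0.7, "failed": 0.3},
         ("failed", "run"): {"failed": 1.0},
         ("healthy", "maintain"): {"healthy": 1.0},
         ("degraded", "maintain"): {"healthy": 1.0},
         ("failed", "maintain"): {"healthy": 1.0}}
```

With these numbers the optimal policy runs the healthy machine but maintains once degradation is detected, since the discounted risk of failure outweighs the short-term production reward — the kind of health-state-conditioned decision the hierarchy propagates upward.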
Fragoulakis, Vasilios; Mitropoulou, Christina; van Schaik, Ron H; Maniadakis, Nikolaos; Patrinos, George P
2016-05-01
Genomic Medicine aims to improve therapeutic interventions and diagnostics and the quality of life of patients, but also to rationalize healthcare costs. To reach this goal, careful assessment and identification of evidence gaps for public health genomics priorities are required so that a more efficient healthcare environment is created. Here, we propose a public health genomics-driven approach that adjusts the classical healthcare decision-making process with an alternative methodological approach of cost-effectiveness analysis, which is particularly helpful for genomic medicine interventions. By combining classical cost-effectiveness analysis with budget constraints, social preferences, and patient ethics, we demonstrate the application of this model, the Genome Economics Model (GEM), based on a previously reported genome-guided intervention from a developing country environment. The model and the attendant rationale provide a practical guide by which all major healthcare stakeholders could ensure the sustainability of funding for genome-guided interventions, their adoption and coverage by health insurance funds, and prioritization of Genomic Medicine research, development, and innovation, given the restriction of budgets, particularly in developing countries and low-income healthcare settings in developed countries. The implications of the GEM for policy makers interested in Genomic Medicine and new health technology and innovation assessment are also discussed.
NASA Astrophysics Data System (ADS)
Valentina, Gallina; Silvia, Torresan; Anna, Sperotto; Elisa, Furlan; Andrea, Critto; Antonio, Marcomini
2014-05-01
Nowadays, the challenge for coastal stakeholders and decision makers is to incorporate climate change in land and policy planning in order to ensure sustainable integrated coastal zone management aimed at preserving coastal environments and socio-economic activities. Consequently, an increasing amount of information on climate variability and its impact on human and natural ecosystems is required. Climate risk services make it possible to bridge the gap between climate experts and decision makers by communicating timely science-based information about impacts and risks related to climate change that can be incorporated into land planning, policy and practice. Within the CLIM-RUN project (FP7), a participatory Regional Risk Assessment (RRA) methodology was applied for the evaluation of water-related hazards in coastal areas (i.e. pluvial flood and sea-level rise inundation risks), taking into consideration future climate change scenarios in the case study of the North Adriatic Sea for the period 2040-2050. Specifically, through the analysis of hazard, exposure, vulnerability and risk and the application of Multi-Criteria Decision Analysis (MCDA), the RRA methodology made it possible to identify and prioritize targets (i.e. residential and commercial-industrial areas, beaches, infrastructures, wetlands, agricultural typology) and sub-areas that are more likely to be affected by pluvial flood and sea-level rise impacts in the same region. From the early stages of the climate risk services' development and application, the RRA followed a bottom-up approach taking into account the needs, knowledge and perspectives of local stakeholders dealing with Integrated Coastal Zone Management (ICZM), by means of questionnaires, workshops and focus groups organized within the project.
Specifically, stakeholders were asked to provide their needs in terms of time scenarios, geographical scale and resolution, choice of receptors, vulnerability factors and thresholds, which were considered in the implementation of the RRA methodology. The main outputs of the analysis are climate risk products produced with the DEcision support SYstem for COastal climate change impact assessment (DESYCO) and represented by GIS-based maps and statistics of hazard, exposure, physical and environmental vulnerability, risk and damage. These maps are useful to transfer information about climate change impacts to stakeholders and decision makers, to allow the classification and prioritization of areas that are likely to be affected by climate change impacts more severely than others in the same region, and therefore to support the identification of suitable areas for infrastructure, economic activities and human settlements toward the development of regional adaptation plans. The climate risk products and the results of the North Adriatic case study are presented and discussed here.
Personalized Clinical Diagnosis in Data Bases for Treatment Support in Phthisiology.
Lugovkina, T K; Skornyakov, S N; Golubev, D N; Egorov, E A; Medvinsky, I D
2016-01-01
Decision-making is a key event in clinical practice. Program products with clinical decision support models in an electronic database, as well as with fixed decision moments of real clinical practice and treatment results, are highly relevant instruments for improving phthisiological practice and may be useful in severe cases caused by resistant strains of Mycobacterium tuberculosis. The methodology for gathering and structuring useful information (critical clinical signals for decisions) is described. Additional coding of clinical diagnosis characteristics was implemented for numeric reflection of personal situations. The methodology created for systematizing and coding clinical events made it possible to improve clinical decision models and achieve better clinical results.
A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation, the Goal-Decision Network (GDN), to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. The negotiation framework is also applied to the problem of planning for cogeneration interconnection. Simulation results are presented to illustrate the cogeneration planning process.
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
Antioch, Kathryn M; Drummond, Michael F; Niessen, Louis W; Vondeling, Hindrik
2017-01-01
Economic evidence is influential in health technology assessment world-wide. Clinical Practice Guidelines (CPG) can enable economists to include economic information on health care provision. Application of economic evidence in CPGs, and its integration into clinical practice and national decision making, is hampered by objections from professions, paucity of economic evidence or lack of policy commitment. The use of state-of-the-art economic methodologies will improve this. Economic evidence can be graded by 'checklists' to establish the best evidence for decision making given methodological rigour. New economic evaluation checklists, Multi-Criteria Decision Analyses (MCDA) and other decision criteria enable health economists to impact on decision making world-wide. We analyse the methodologies for integrating economic evidence into CPG agencies globally, including the Agency for Healthcare Research and Quality (AHRQ) in the USA, the National Health and Medical Research Council (NHMRC) and Australian political reforms. The Guidelines and Economists Network International (GENI) Board members from Australia, UK, Canada and Denmark presented the findings at the conference of the International Health Economics Association (IHEA), and we report conclusions and developments since. The Consolidated Health Economic Evaluation Reporting Standards (CHEERS) 24-item checklist can be used by AHRQ, NHMRC, other CPG and health organisations, in conjunction with the Drummond ten-point checklist and a questionnaire that scores that checklist for grading studies, when assessing economic evidence. Cost-effectiveness analysis (CEA) thresholds, opportunity cost and willingness-to-pay (WTP) are crucial issues for decision rules in CEA generally, including end-of-life therapies. Limitations of inter-rater reliability in checklists can be addressed by including more than one assessor to reach a consensus, especially when impacting on treatment decisions.
We identify priority areas to generate economic evidence for CPGs by NHMRC, AHRQ, and other agencies. The evidence may cover demand-for-care issues such as time involved, logistics, innovation price, price sensitivity, substitutes and complements, WTP, absenteeism and presenteeism. Supply issues may include economies of scale, efficiency changes, and return on investment. Relevant equity and efficiency measures may include cost-of-illness, disease burden, quality-of-life, budget impact, cost-effectiveness ratios, net benefits and disparities in access and outcomes. Priority setting remains essential, and trade-off decisions between policy criteria can be based on MCDA, both in evidence-based clinical medicine and in health planning.
How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks
2015-01-01
Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed. PMID:26552360
Methodological quality of systematic reviews on influenza vaccination.
Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas
2014-03-26
There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on the AMSTAR score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on the AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review should be considered in addition to its content. Copyright © 2014 Elsevier Ltd. All rights reserved.
Angelis, Aris; Kanavos, Panos
2017-09-01
Escalating drug prices have catalysed the generation of numerous "value frameworks" with the aim of informing payers, clinicians and patients on the assessment and appraisal process of new medicines for the purpose of coverage and treatment selection decisions. Although this is an important step towards a more inclusive Value Based Assessment (VBA) approach, aspects of these frameworks are based on weak methodologies and could potentially result in misleading recommendations or decisions. In this paper, a Multiple Criteria Decision Analysis (MCDA) methodological process, based on Multi Attribute Value Theory (MAVT), is adopted for building a multi-criteria evaluation model. A five-stage model-building process is followed, using a top-down "value-focused thinking" approach, involving literature reviews and expert consultations. A generic value tree is structured capturing decision-makers' concerns for assessing the value of new medicines in the context of Health Technology Assessment (HTA) and in alignment with decision theory. The resulting value tree (Advance Value Tree) consists of three levels of criteria (top-level criteria clusters, mid-level criteria, bottom-level sub-criteria or attributes) relating to five key domains that can be explicitly measured and assessed: (a) burden of disease, (b) therapeutic impact, (c) safety profile, (d) innovation level and (e) socioeconomic impact. A number of MAVT modelling techniques are introduced for operationalising (i.e. estimating) the model, for scoring the alternative treatment options, assigning relative weights of importance to the criteria, and combining scores and weights. Overall, the combination of these MCDA modelling techniques for the elicitation and construction of value preferences across the generic value tree provides a new value framework (Advance Value Framework) enabling the comprehensive measurement of value in a structured and transparent way.
Given its flexibility to meet diverse requirements and become readily adaptable across different settings, the Advance Value Framework could be offered as a decision-support tool for evaluators and payers to aid coverage and reimbursement of new medicines. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Korving, H; Clemens, F
2002-01-01
In recent years, decision analysis has become an important technique in many disciplines. It provides a methodology for rational decision-making that allows for uncertainties in the outcomes of the possible actions to be undertaken. An example in urban drainage is the situation in which an engineer has to decide upon a major reconstruction of a system in order to prevent pollution of receiving waters due to CSOs. This paper describes the possibilities of Bayesian decision-making in urban drainage. In particular, the utility of monitoring prior to deciding on the reconstruction of a sewer system to reduce CSO emissions is studied. Our concern is with deciding whether a price should be paid for new information and which source of information is the best choice given the expected uncertainties in the outcome. The influence of specific uncertainties (sewer system data and model parameters) on the probability of CSO volumes is shown to be significant. Using Bayes' rule to combine prior impressions with new observations reduces the risks linked with the planning of sewer system reconstructions.
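The Bayesian updating step this abstract describes can be sketched with a conjugate Beta-Binomial model; the prior, the monitoring data, and the regulatory threshold below are all hypothetical illustrations, not values from the paper:

```python
# Sketch of Bayes' rule applied to a sewer-system decision: a Beta prior on
# the per-storm probability of a CSO (combined sewer overflow) exceedance is
# updated with hypothetical monitoring data. All numbers are illustrative.

def beta_mean(a, b):
    return a / (a + b)

# Prior impression from sewer-system modelling: roughly 20% exceedance rate.
a_prior, b_prior = 2.0, 8.0

# Hypothetical monitoring campaign: 3 exceedances observed in 30 storm events.
k, n = 3, 30

# Conjugate Beta-Binomial update (Bayes' rule).
a_post, b_post = a_prior + k, b_prior + (n - k)

post_p = beta_mean(a_post, b_post)  # updated exceedance probability

# Decision rule: reconstruct only if the updated probability still exceeds
# a (hypothetical) regulatory threshold; otherwise keep monitoring.
threshold = 0.15
decision = "reconstruct" if post_p > threshold else "monitor further"
print(round(post_p, 3), decision)
```

The point of the sketch is the paper's question in miniature: the monitoring data shift the prior impression, and that shift is what the price paid for new information buys.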
Dolan, James G.; Boohaker, Emily; Allison, Jeroan; Imperiale, Thomas F.
2013-01-01
Background Current US colorectal cancer screening guidelines that call for shared decision making regarding the choice among several recommended screening options are difficult to implement. Multi-criteria decision analysis (MCDA) is an established methodology well suited for supporting shared decision making. Our study goal was to determine if a streamlined form of MCDA using rank order based judgments can accurately assess patients’ colorectal cancer screening priorities. Methods We converted priorities for four decision criteria and three sub-criteria regarding colorectal cancer screening obtained from 484 average risk patients using the Analytic Hierarchy Process (AHP) in a prior study into rank order-based priorities using rank order centroids. We compared the two sets of priorities using Spearman rank correlation and non-parametric Bland-Altman limits of agreement analysis. We assessed the differential impact of using the rank order-based versus the AHP-based priorities on the results of a full MCDA comparing three currently recommended colorectal cancer screening strategies. Generalizability of the results was assessed using Monte Carlo simulation. Results Correlations between the two sets of priorities for the seven criteria ranged from 0.55 to 0.92. The proportions of absolute differences between rank order-based and AHP-based priorities that were more than ± 0.15 ranged from 1% to 16%. Differences in the full MCDA results were minimal and the relative rankings of the three screening options were identical more than 88% of the time. The Monte Carlo simulation results were similar. Conclusion Rank order-based MCDA could be a simple, practical way to guide individual decisions and assess population decision priorities regarding colorectal cancer screening strategies. Additional research is warranted to further explore the use of these methods for promoting shared decision making. PMID:24300851
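The rank-order conversion the study relies on has a closed form: among n ranked criteria, the criterion ranked i receives the rank-order-centroid weight w_i = (1/n) * sum over k from i to n of 1/k. A minimal sketch, with hypothetical criterion names (the abstract does not list the actual criteria):

```python
# Rank-order-centroid (ROC) weights: turn a patient's ranking of criteria
# into numerical priorities, as in the streamlined MCDA the study describes.

def roc_weights(n):
    """Weight for rank i (1 = most important) among n criteria:
    w_i = (1/n) * sum_{k=i}^{n} 1/k."""
    return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

# Four hypothetical screening criteria, ranked by a patient.
criteria = ["effectiveness", "avoiding side effects", "test frequency", "cost"]
weights = roc_weights(len(criteria))

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.4f}")

assert abs(sum(weights) - 1.0) < 1e-9  # ROC weights always sum to 1
```

For four criteria this yields weights of roughly 0.52, 0.27, 0.15 and 0.06, which is why a simple ranking can stand in for the more burdensome pairwise AHP judgments.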
The ALMA CONOPS project: the impact of funding decisions on observatory performance
NASA Astrophysics Data System (ADS)
Ibsen, Jorge; Hibbard, John; Filippi, Giorgio
2014-08-01
In times when every penny counts, many organizations face the question of how much scientific impact a budget cut can have or, putting it in more general terms, what the science impact of alternative (less costly) operational modes is. In reply to such questions posed by the governing bodies, the ALMA project had to develop a methodology (ALMA Concepts for Operations, CONOPS) that attempts to measure the impact that alternative operational scenarios may have on the overall scientific production of the Observatory. Although the analysis and the results are ALMA specific, the developed approach is rather general and provides a methodology for a cost-performance analysis of alternatives before any radical alterations to the operations model are adopted. This paper describes the key aspects of the methodology: a) the definition of the Figures of Merit (FoMs) for the assessment of quantitative science performance impacts as well as qualitative impacts, and a methodology using these FoMs to evaluate the cost and impact of the different operational scenarios; b) the definition of a REFERENCE operational baseline; c) the identification of Alternative Scenarios, each replacing one or more concepts in the REFERENCE by a different concept that has a lower cost and some level of scientific and/or operational impact; d) the use of a Cost-Performance plane to graphically combine the effects that the alternative scenarios can have in terms of cost reduction and affected performance. Although this is a first-order assessment, we believe this approach is useful for comparing different operational models and for understanding the cost-performance impact of these choices. It can be used to make decisions to meet budget cuts as well as to evaluate possible new emergent opportunities.
Decision making in asthma exacerbation: a clinical judgement analysis
Jenkins, John; Shields, Mike; Patterson, Chris; Kee, Frank
2007-01-01
Background Clinical decisions which impact directly on patient safety and quality of care are made during acute asthma attacks by individual doctors based on their knowledge and experience. Decisions include administration of systemic corticosteroids (CS) and oral antibiotics, and admission to hospital. Clinical judgement analysis provides a methodology for comparing decisions between practitioners with different training and experience, and improving decision making. Methods Stepwise linear regression was used to select clinical cues based on visual analogue scale assessments of the propensity of 62 clinicians to prescribe a short course of oral CS (decision 1), a course of antibiotics (decision 2), and/or admit to hospital (decision 3) for 60 “paper” patients. Results When compared by specialty, paediatricians' models for decision 1 were more likely to include level of alertness as a cue (54% vs 16%); for decision 2 they were more likely to include presence of crepitations (49% vs 16%) and less likely to include inhaled CS (8% vs 40%), respiratory rate (0% vs 24%) and air entry (70% vs 100%). When compared to other grades, the models derived for decision 3 by consultants/general practitioners were more likely to include wheeze severity as a cue (39% vs 6%). Conclusions Clinicians differed in their use of individual cues and the number included in their models. Patient safety and quality of care will benefit from clarification of decision‐making strategies as general learning points during medical training, in the development of guidelines and care pathways, and by clinicians developing self‐awareness of their own preferences. PMID:17428817
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.
Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony
2005-04-01
Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
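As a rough illustration of what probabilistic sensitivity analysis involves, the sketch below samples uncertain incremental costs and effects, converts them to net monetary benefit, and reports the probability that the technology is cost-effective at a willingness-to-pay threshold. All distributions and values are hypothetical and do not come from NICE guidance:

```python
# Minimal probabilistic sensitivity analysis (PSA) sketch for a
# cost-effectiveness model: uncertain inputs get distributions, the model
# is run many times, and decision uncertainty is summarised as the share
# of runs in which the technology is cost-effective.
import random

random.seed(42)
THRESHOLD = 20_000  # willingness to pay per QALY (hypothetical)

def simulate_once():
    d_cost = random.gauss(5_000, 1_000)  # incremental cost draw
    d_qaly = random.gauss(0.4, 0.1)      # incremental QALY draw
    # Net monetary benefit puts costs and effects on one scale.
    return THRESHOLD * d_qaly - d_cost

runs = [simulate_once() for _ in range(10_000)]
prob_cost_effective = sum(nmb > 0 for nmb in runs) / len(runs)
print(f"P(cost-effective at {THRESHOLD}/QALY): {prob_cost_effective:.2f}")
```

Repeating the calculation across a range of thresholds traces out a cost-effectiveness acceptability curve, which is the kind of decision-uncertainty output the guidance requires.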
Software Cost Estimation Using a Decision Graph Process: A Knowledge Engineering Approach
NASA Technical Reports Server (NTRS)
Stukes, Sherry; Spagnuolo, John, Jr.
2011-01-01
This paper is not a description per se of the efforts of two software cost analysts. Rather, it is an outline of the methodology used for flight software (FSW) cost analysis, presented in a form that may serve as a foundation upon which others can gain insight into how to perform FSW cost analyses for their own problems at hand.
M.J. Conroy; B.R. Noon
1996-01-01
Biodiversity mapping (e.g., the Gap Analysis Program [GAP]), in which vegetative features and categories of land use are mapped at coarse spatial scales, has been proposed as a reliable tool for land use decisions (e.g., reserve identification, selection, and design). This implicitly assumes that species richness data collected at coarse spatiotemporal scales provide a...
ERIC Educational Resources Information Center
Shubik, Martin
The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…
The study of direct-to-consumer advertising for prescription drugs.
Schommer, Jon C; Hansen, Richard A
2005-06-01
The objectives of this article are to (1) identify key methodological issues related to investigating the effects of direct-to-consumer advertising (DTCA) for prescription drugs, (2) highlight opportunities and challenges that these issues pose, and (3) provide suggestions to address these challenges and opportunities from a social and administrative pharmacy perspective. Through a review of existing literature and consultation with research colleagues, we identified 3 broad issues regarding the study of DTCA for prescription drugs: (1) the importance of problem formulation, (2) the role of health behavior and decision-making perspectives, and (3) data collection and data analysis challenges and opportunities. Based upon our findings, we developed recommendations for future research in this area. Clear problem formulation will be instructive for prioritizing research needs and for determining the role that health behavior and decision-making perspectives can serve in DTCA research. In addition, it appears that cluster bias, nonlinear relationships, mediating/moderating effects, time effects, acquiescent response, and case mix are particularly salient challenges for the DTCA research domain. We suggest that problem formulation, selection of sound theories upon which to base research, and data collection and data analysis challenges are key methodological issues related to investigating the effects of DTCA for prescription drugs.
ERIC Educational Resources Information Center
Pearson, Marion L.; Albon, Simon P.; Hubball, Harry
2015-01-01
Individuals and teams engaging in the scholarship of teaching and learning (SoTL) in multidisciplinary higher education settings must make decisions regarding choice of research methodology and methods. These decisions are guided by the research context and the goals of the inquiry. With reference to our own recent experiences investigating…
ERIC Educational Resources Information Center
Suri, Harsh
2013-01-01
Primary research in education and social sciences is marked by a diversity of methods and perspectives. How can we accommodate and reflect such diversity at the level of synthesizing research? What are the critical methodological decisions in the process of a research synthesis, and how do these decisions open up certain possibilities, while…
Issues Related to Measuring and Interpreting Objectively Measured Sedentary Behavior Data
ERIC Educational Resources Information Center
Janssen, Xanne; Cliff, Dylan P.
2015-01-01
The use of objective measures of sedentary behavior has increased over the past decade; however, as is the case for objectively measured physical activity, methodological decisions before and after data collection are likely to influence the outcomes. The aim of this article is to review the evidence on different methodological decisions made by…
The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.
ERIC Educational Resources Information Center
Greer, John T.
Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…
Adaptive Multi-scale PHM for Robotic Assembly Processes
Choo, Benjamin Y.; Beling, Peter A.; LaViers, Amy E.; Marvel, Jeremy A.; Weiss, Brian A.
2017-01-01
Adaptive multiscale prognostics and health management (AM-PHM) is a methodology designed to support PHM in smart manufacturing systems. As a rule, PHM information is not used in high-level decision-making in manufacturing systems. AM-PHM leverages and integrates component-level PHM information with hierarchical relationships across the component, machine, work cell, and production line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. A description of the AM-PHM methodology with a simulated canonical robotic assembly process is presented. PMID:28664161
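One way to picture the hierarchical roll-up AM-PHM describes is a health index propagated up the component, machine, work-cell and line levels; the weakest-link aggregation rule and every name and value below are assumptions for illustration only, not the paper's actual model:

```python
# Sketch of a hierarchical health roll-up: measured component-level health
# indices (0 = failed, 1 = healthy) propagate up the manufacturing
# hierarchy so decisions at each node can see an aggregate health state.

component_health = {"joint_1": 0.9, "gripper": 0.6, "arm_B": 0.95,
                    "conveyor": 0.8}

hierarchy = {
    "line": ["cell_1", "cell_2"],
    "cell_1": ["robot_A"],
    "cell_2": ["robot_B", "conveyor"],
    "robot_A": ["joint_1", "gripper"],
    "robot_B": ["arm_B"],
}

def health(node):
    """Health of a node = worst health among its children (leaf = measured)."""
    if node in component_health:
        return component_health[node]
    return min(health(child) for child in hierarchy[node])

print(health("line"))  # limited by the weakest component (the gripper, 0.6)
```

A real AM-PHM implementation would combine projected health trajectories rather than a simple minimum, but the roll-up direction is the same.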
Novel methodology for pharmaceutical expenditure forecast.
Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
The value appreciation of new drugs across countries today features a disruption that makes the historical data used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis.
This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making.
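The off-patent savings component of such a forecast can be illustrated with a toy calculation combining originator sales, generic price discount, and a penetration ramp to peak sales; all figures below are hypothetical, not from the EU forecast:

```python
# Illustrative off-patent savings model: savings in a given year equal
# originator sales * generic penetration * price discount, with a simple
# linear uptake ramp to peak penetration.

def annual_savings(originator_sales, discount, peak_penetration,
                   years_to_peak, year):
    """Savings in a given year after patent expiry (linear uptake ramp)."""
    penetration = peak_penetration * min(year / years_to_peak, 1.0)
    return originator_sales * penetration * discount

# Hypothetical product: 100m in originator sales, 60% generic discount,
# 80% peak generic penetration reached after 2 years.
five_year = [annual_savings(100e6, 0.60, 0.80, 2, y) for y in range(1, 6)]
print([round(s / 1e6, 1) for s in five_year])  # savings in millions per year
```

In the full methodology each of these inputs carries a distribution rather than a point value, which is what the probabilistic sensitivity analysis then propagates into the expenditure forecast.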
Web-services-based spatial decision support system to facilitate nuclear waste siting
NASA Astrophysics Data System (ADS)
Huang, L. Xinglai; Sheng, Grant
2006-10-01
The availability of spatial web services enables data sharing among managers, decision and policy makers, and other stakeholders in much simpler ways than before, and has consequently created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment in which to explore the decision problem, understand and refine the problem definition, and generate and evaluate multiple alternatives for decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data, and may reside in different computers and locations. The WSDSS includes six key components, namely: database management system, catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing a web-services-based spatial decision support system are described.
NASA Technical Reports Server (NTRS)
Fertis, D. G.; Simon, A. L.
1981-01-01
The requisite methodology to solve linear and nonlinear problems associated with the static and dynamic analysis of rotating machinery, their static and dynamic behavior, and the interaction between the rotating and nonrotating parts of an engine is developed. Linear and nonlinear structural engine problems are investigated by developing solution strategies and interactive computational methods whereby the man and computer can communicate directly in making analysis decisions. Representative examples include modifying structural models, changing material parameters, selecting analysis options, and coupling with an interactive graphical display for pre- and postprocessing capability.
General practitioners' decisions about discontinuation of medication: an explorative study.
Nixon, Michael Simon; Vendelø, Morten Thanning
2016-06-20
Purpose - The purpose of this paper is to investigate how general practitioners' (GPs) decisions about discontinuation of medication are influenced by their institutional context. Design/methodology/approach - In total, 24 GPs were interviewed, three practices were observed and documents were collected. The Gioia methodology was used to analyse data, drawing on a theoretical framework that integrates the sensemaking perspective and institutional theory. Findings - Most GPs who actively consider discontinuation are reluctant to discontinue medication, because the safest course of action for GPs is to continue prescriptions rather than discontinue them. The authors conclude that this is in part due to the ambiguity about the appropriateness of discontinuing medication experienced by the GPs, and in part because the clinical guidelines do not encourage discontinuation of medication, as they offer GPs a weak frame for discontinuation. Three reasons for this are identified: the guidelines provide dominating triggers for prescribing, they provide weak priming for discontinuation as an option, and they underscore a cognitive constraint against discontinuation. Originality/value - The analysis offers new insights about decision making when discontinuing medication. It also offers one of the first examinations of how the institutional context embedding GPs influences their decisions about discontinuation. For policymakers interested in the discontinuation of medication, the findings suggest that de-stigmatising discontinuation on an institutional level may be beneficial, allowing GPs to better justify discontinuation in light of the ambiguity they experience.
NASA Astrophysics Data System (ADS)
Pietrzyk, Mariusz W.; Manning, David J.; Dix, Alan; Donovan, Tim
2009-02-01
Aim: The goal of the study is to determine the spatial frequency characteristics at image locations associated with observers' overt and covert decisions, and to find out whether there are any similarities within observer groups of the same radiological experience or the same accuracy level. Background: The radiological task is described as a visual search and decision-making procedure involving visual perception and cognitive processing. Humans perceive the world through a number of spatial frequency channels, each sensitive to visual information carried by different spatial frequency ranges and orientations. Recent studies have shown that particular physical properties of local and global image-based elements are correlated with the performance and the level of experience of human observers in breast cancer and lung nodule detection. Neurological findings in visual perception were an inspiration for wavelet applications in vision research, because the methodology tries to mimic the brain's processing algorithms. Methods: The wavelet approach to the analysis of a set of postero-anterior chest radiographs has been used to characterize the perceptual preferences of observers with different levels of experience in the radiological task. Psychophysical methodology has been applied to track eye movements over the image, where particular ROIs related to the observers' fixation clusters have been analysed in the spaces framed by Daubechies functions. Results: Significant differences have been found between the spatial frequency characteristics at the locations of different decisions.
Challenges in the estimation of Net SURvival: The CENSUR working survival group.
Giorgi, R
2016-10-01
Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trends analysis and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet challenges and thus provide more adequate tools, to improve the quality and the comparability of cancer survival data, and to promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimations to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. 
The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order to provide useful information for cancer control and cancer policy. A "team science" approach is necessary to address new challenges concerning the estimation of net survival. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Multi-criteria GIS-based siting of an incineration plant for municipal solid waste.
Tavares, Gilberto; Zsigraiová, Zdena; Semiao, Viriato
2011-01-01
Siting a municipal solid waste (MSW) incineration plant requires a comprehensive evaluation to identify the best available location(s) that can simultaneously meet the requirements of regulations and minimise economic, environmental, health, and social costs. A spatial multi-criteria evaluation methodology is presented to assess land suitability for a plant siting and applied to Santiago Island of Cape Verde. It combines the analytical hierarchy process (AHP) to estimate the selected evaluation criteria weights with Geographic Information Systems (GIS) for spatial data analysis that avoids the subjectivity of the judgements of decision makers in establishing the influences between some criteria or clusters of criteria. An innovative feature of the method lies in incorporating the environmental impact assessment of the plant operation as a criterion in the decision-making process itself rather than as an a posteriori assessment. Moreover, a two-scale approach is considered. At a global scale an initial screening identifies inter-municipal zones satisfying the decisive requirements (socio-economic, technical and environmental issues, with weights of 48%, 41%, and 11%, respectively). A detailed suitability ranking inside the previously identified zones is then performed at a local scale in two phases and includes environmental assessment of the plant operation. Those zones are ranked by combining the non-environmental feasibility of Phase 1 (with a weight of 75%) with the environmental assessment of the plant operation impact of Phase 2 (with a weight of 25%). The reliability and robustness of the presented methodology as a decision-support tool is assessed through a sensitivity analysis. The results proved the system's effectiveness in the ranking process. Copyright © 2011 Elsevier Ltd. All rights reserved.
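As a sketch of the AHP step, the weight vector is the normalised principal eigenvector of a pairwise-comparison matrix. The 3×3 matrix below is hypothetical, chosen only to roughly reproduce the socio-economic/technical/environmental ordering reported above (48%/41%/11%); it is not taken from the paper.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons among the three
# criterion groups: socio-economic, technical, environmental.
A = np.array([[1.0,  1.0,  5.0],
              [1.0,  1.0,  4.0],
              [0.2,  0.25, 1.0]])

def ahp_weights(A):
    """Priority weights: normalised principal eigenvector of A."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)              # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(A):
    """Saaty's CR; judgements are usually accepted when CR < 0.10."""
    n = A.shape[0]
    lam = np.max(np.linalg.eigvals(A).real)
    ci = (lam - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    return ci / ri

w = ahp_weights(A)
```

A CR above 0.10 signals that the pairwise judgements should be revisited before the weights are fed into the GIS overlay.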
Brixner, Diana; Maniadakis, Nikos; Kaló, Zoltán; Hu, Shanlian; Shen, Jie; Wijaya, Kalman
2017-09-01
Off-patent pharmaceuticals (OPPs) represent more than 60% of the pharmaceutical market in many emerging countries, where they are frequently evaluated primarily on cost rather than with health technology assessment. OPPs are assumed to be identical to the originators. Branded and unbranded generic versions can, however, vary from the originator in active pharmaceutical ingredients, dosage, consistency, formulation, excipients, manufacturing processes, and distribution, for example. These variables can alter the efficacy and safety of the product, negatively impacting both the anticipated cost savings and the population's health. In addition, many health care systems lack the resources or expertise to evaluate such products, and current assessment methods can be complex and difficult to adapt to a health system's needs. Multicriteria decision analysis (MCDA) simple scoring is an evidence-based health technology assessment methodology for evaluating OPPs, especially in emerging countries in which resources are limited but decision makers still must balance affordability with factors such as drug safety, level of interchangeability, manufacturing site and active pharmaceutical ingredient quality, supply track record, and real-life outcomes. MCDA simple scoring can be applied to pharmaceutical pricing, reimbursement, formulary listing, and drug procurement. In November 2015, a workshop was held at the International Society for Pharmacoeconomics and Outcomes Research Annual Meeting in Milan to refine and prioritize criteria that can be used in MCDA simple scoring for OPPs, resulting in an example MCDA process and 22 prioritized criteria that health care systems in emerging countries can easily adapt to their own decision-making processes. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Jeon, Kwon Chan; Chen, Lei-Shih; Goodson, Patricia
2012-01-01
We performed a systematic review of factors affecting parental decisions to continue or terminate a pregnancy after prenatal diagnosis of a sex chromosome abnormality, as reported in published studies from 1987 to May 2011. Based on the Matrix Method for systematic reviews, 19 studies were found in five electronic databases, meeting specific inclusion/exclusion criteria. Abstracted data were organized in a matrix. Alongside the search for factors influencing parental decisions, each study was judged on its methodological quality and assigned a methodological quality score. Decisions either to terminate or to continue a sex chromosome abnormality-affected pregnancy shared five similar factors: specific type of sex chromosome abnormality, gestational week at diagnosis, parents' age, providers' genetic expertise, and number of children/desire for (more) children. Factors unique to termination decisions included parents' fear/anxiety and directive counseling. Factors uniquely associated with continuation decisions were parents' socioeconomic status and ethnicity. The studies' average methodological quality score was 10.6 (SD = 1.67; range, 8-14). Findings from this review can be useful in adapting and modifying guidelines for genetic counseling after prenatal diagnosis of a sex chromosome abnormality. Moreover, improving the quality of future studies on this topic may allow clearer understanding of the most influential factors affecting parental decisions.
Durif-Bruckert, C; Roux, P; Morelle, M; Mignotte, H; Faure, C; Moumjid-Ferdjaoui, N
2015-07-01
The aim of this study on shared decision-making in the doctor-patient encounter about surgical treatment for early-stage breast cancer, conducted in a regional cancer centre in France, was to further the understanding of patient perceptions on shared decision-making. The study used methodological triangulation to collect data (both quantitative and qualitative) about patient preferences in the context of a clinical consultation in which surgeons followed a shared decision-making protocol. Data were analysed from a multi-disciplinary research perspective (social psychology and health economics). The triangulated data collection methods were questionnaires (n = 132), longitudinal interviews (n = 47) and observations of consultations (n = 26). Methodological triangulation revealed levels of divergence and complementarity between qualitative and quantitative results that suggest new perspectives on the three inter-related notions of decision-making, participation and information. Patients' responses revealed important differences between shared decision-making and participation per se. The authors note that subjecting patients to a normative behavioural model of shared decision-making in an era when paradigms of medical authority are shifting may undermine the patient's quest for what he or she believes is a more important right: a guarantee of the best care available. © 2014 John Wiley & Sons Ltd.
Berne, Rosalyn W; Raviv, Daniel
2004-04-01
This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology), for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem solving knowledge from industry, business, marketing, math, science, engineering, technology, arts, and daily life. It is designed to stimulate innovation by quickly generating unique, "out of the box," unexpected, and high-quality solutions. It gives new insights and thinking strategies to solve everyday problems faced in the workplace, by helping decision makers to see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology, and paper co-author, technology ethicist Rosalyn Berne, suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges which arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers to the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges to engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case offers to decision makers the use of the Eight Dimensional Methodology in considering alternative solutions for how they can proceed in their goals of exploring space. It then follows that same process through the second stage of exploring the ethics of each of those different solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope.
It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project that is being undertaken on land considered to be sacred by the local, native Apache people. As presented, the case features the University of Virginia, and its challenges in consideration of whether and how to join the LBT project consortium.
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
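The EPV/ROC connection can be illustrated with a Monte Carlo sketch under an assumed normal shift model (an illustration, not the paper's dataset): the expected p-value of a one-sided z-test under the alternative equals P(Z0 ≥ Z1), i.e. one minus the AUC separating the alternative statistics from the null statistics, and in this model it has the closed form Φ(−δ/√2).

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

rng = np.random.default_rng(42)
delta, n = 1.5, 100_000                     # illustrative effect size

z_null = np.sort(rng.normal(0.0, 1.0, n))   # test statistic under H0
z_alt = rng.normal(delta, 1.0, n)           # test statistic under H1

# Empirical EPV: average one-sided p-value under H1.  Because the
# p-value of an observed z is P(Z0 >= z), this average is P(Z0 >= Z1),
# i.e. exactly 1 - AUC for separating H1 statistics from H0 statistics.
frac_null_below = np.searchsorted(z_null, z_alt) / n
epv_mc = np.mean(1.0 - frac_null_below)

# Closed form for the normal shift model: EPV = Phi(-delta / sqrt(2)).
epv_exact = norm_cdf(-delta / sqrt(2.0))
```

A smaller EPV indicates a better-performing test, which is what makes EPV minimisation usable as a design criterion.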
Moher, David; Clifford, Tammy J.
2016-01-01
Background Rapid reviews expedite the knowledge synthesis process with the goal of providing timely information to healthcare decision-makers who want to use evidence-informed policy and practice approaches. A range of opinions and viewpoints on rapid reviews is thought to exist; however, no research to date has formally captured these views. This paper aims to explore evidence producer and knowledge user attitudes and perceptions towards rapid reviews. Methods A Q methodology study was conducted to identify central viewpoints about rapid reviews based on a broad topic discourse. Participants rank-ordered 50 text statements and explained their Q-sort in free-text comments. Individual Q-sorts were analysed using Q-Assessor (statistical method: factor analysis with varimax rotation). Factors, or salient viewpoints on rapid reviews, were identified, interpreted and described. Results Analysis of the 11 individual Q sorts identified three prominent viewpoints: Factor A cautions against the use of study design labels to make judgements. Factor B maintains that rapid reviews should be the exception and not the rule. Factor C focuses on the practical needs of the end-user over the review process. Conclusion Results show that there are opposing viewpoints on rapid reviews, yet some unity exists. The three factors described offer insight into how and why various stakeholders act as they do and what issues may need to be resolved before increased uptake of the evidence from rapid reviews can be realized in healthcare decision-making environments. PMID:27761324
Stakeholder analysis methodologies resource book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babiuch, W.M.; Farhar, B.C.
1994-03-01
Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.
Stefanović, Stefica Cerjan; Bolanča, Tomislav; Luša, Melita; Ukić, Sime; Rogošić, Marko
2012-02-24
This paper describes the development of ad hoc methodology for determination of inorganic anions in oilfield water, since their composition often significantly differs from the average (concentration of components and/or matrix). Therefore, fast and reliable method development has to be performed in order to ensure the monitoring of desired properties under new conditions. The method development was based on a computer assisted multi-criteria decision making strategy. The criteria used were: maximal value of the objective functions used, maximal robustness of the separation method, minimal analysis time, and maximal retention distance between the two nearest components. Artificial neural networks were used for modeling of anion retention. The reliability of the developed method was extensively tested by the validation of performance characteristics. Based on validation results, the developed method shows satisfactory performance characteristics, proving the successful application of computer assisted methodology in the described case study. Copyright © 2011 Elsevier B.V. All rights reserved.
A Psychobiographical Study of Intuition in a Writer's Life: Paulo Coelho Revisited
Mayer, Claude-Hélène; Maree, David
2017-01-01
Intuition is defined as a form of knowledge which materialises as awareness of thoughts, feelings and physical sensations. It is a key to a deeper understanding and meaningfulness. Intuition, used as a psychological function, supports the transmission and integration of perceptions from unconscious and conscious realms. This study uses a psychobiographical single case study approach to explore intuition across the life span of Paulo Coelho. Methodologically, the study is based on a single case study, using the methodological frame of Dilthey's modern hermeneutics. The author, Paulo Coelho, was chosen as a subject of research, based on the content analysis of first- and third-person perspective documents. Findings show that Paulo Coelho, as one of the most famous and most read contemporary authors in the world, uses his intuitions as a deeper guidance in life, for decision-making and self-development. Intuitive decision-making is described throughout his life and by referring to selected creative works. PMID:28904596
1991-01-01
games. A leader with limited rationality will make decisions that bear a reasonable relationship to his objectives and values, but they may be...possible reasoning of opponents before or during crisis and conflict. The methodology is intended for use in analysis and defense planning, especially...overconfidence in prediction, failure to hedge, and failure actively to find ways to determine and affect the opponent’s reasoning before it is too late
A simple randomisation procedure for validating discriminant analysis: a methodological note.
Wastell, D G
1987-04-01
Because the goal of discriminant analysis (DA) is to optimise classification, it designedly exaggerates between-group differences. This bias complicates validation of DA. Jack-knifing has been used for validation but is inappropriate when stepwise selection (SWDA) is employed. A simple randomisation test is presented which is shown to give correct decisions for SWDA. The general superiority of randomisation tests over orthodox significance tests is discussed. Current work on non-parametric methods of estimating the error rates of prediction rules is briefly reviewed.
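Such a randomisation test can be sketched by permuting the group labels to build the reference distribution of the resubstitution accuracy. In the sketch below, a nearest-centroid rule stands in for the discriminant classifier (the paper's stepwise selection is not reproduced); all function names are illustrative.

```python
import numpy as np

def accuracy(X, y):
    """Resubstitution accuracy of a nearest-centroid rule
    (a simple stand-in for the discriminant classifier)."""
    labels = np.unique(y)
    cents = np.array([X[y == g].mean(axis=0) for g in labels])
    d = ((X[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
    return np.mean(labels[d.argmin(axis=1)] == y)

def randomisation_test(X, y, n_perm=999, seed=1):
    """P-value: fraction of label permutations that classify at least
    as well as the observed labelling (with the usual +1 correction)."""
    rng = np.random.default_rng(seed)
    obs = accuracy(X, y)
    hits = sum(accuracy(X, rng.permutation(y)) >= obs
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)
```

Because the classifier is refit on every permuted labelling, the selection bias that the abstract warns about is built into the reference distribution rather than ignored.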
Karami, Manoochehr; Khazaei, Salman
2017-12-06
Clinical decision making based on study results requires valid and correct data collection and analysis. However, there are some common methodological and statistical issues that may be ignored by authors. In individually matched case-control designs, bias arises from performing an unconditional analysis instead of a conditional analysis. Using an unconditional logistic model for matched data imposes a large number of nuisance parameters, which may result in seriously biased estimates.
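For 1:1 matched pairs, the conditional analysis has a simple closed form that uses only the discordant pairs; the sketch below is illustrative (the counts are hypothetical, not from the paper). This contrasts with an unconditional logistic fit, whose odds-ratio estimate in 1:1 matched designs is known to tend toward the square of the true odds ratio.

```python
from math import sqrt, exp, log

def matched_pair_or(b, c):
    """Conditional ML odds ratio for 1:1 matched case-control data.
    Only the discordant pairs carry information:
      b = pairs where only the case is exposed
      c = pairs where only the control is exposed
    Returns the OR and an approximate 95% confidence interval."""
    or_hat = b / c
    se_log = sqrt(1.0 / b + 1.0 / c)          # SE of log(OR)
    lo = exp(log(or_hat) - 1.96 * se_log)
    hi = exp(log(or_hat) + 1.96 * se_log)
    return or_hat, (lo, hi)

def mcnemar_chi2(b, c):
    """McNemar chi-square (1 df) for the matched-pair null OR = 1."""
    return (b - c) ** 2 / (b + c)
```

For example, 40 pairs with only the case exposed against 20 with only the control exposed give a conditional odds ratio of 2.0, regardless of how many concordant pairs there are.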
Maguire, Erin; Hong, Paul; Ritchie, Krista; Meier, Jeremy; Archibald, Karen; Chorney, Jill
2016-11-04
To describe the process involved in developing a decision aid prototype for parents considering adenotonsillectomy for their children with sleep disordered breathing. A paper-based decision aid prototype was developed using the framework proposed by the International Patient Decision Aids Standards Collaborative. The decision aid focused on two main treatment options: watchful waiting and adenotonsillectomy. Usability was assessed with parents of pediatric patients and providers with qualitative content analysis of semi-structured interviews, which included open-ended user feedback. A steering committee composed of key stakeholders was assembled. A needs assessment was then performed, which confirmed the need for a decision support tool. A decision aid prototype was developed and modified based on semi-structured qualitative interviews and a scoping literature review. The prototype provided information on the condition, risk and benefits of treatments, and values clarification. The prototype underwent three cycles of accessibility, feasibility, and comprehensibility testing, incorporating feedback from all stakeholders to develop the final decision aid prototype. A standardized, iterative methodology was used to develop a decision aid prototype for parents considering adenotonsillectomy for their children with sleep disordered breathing. The decision aid prototype appeared feasible, acceptable and comprehensible, and may serve as an effective means of improving shared decision-making.
Needs assessment for business strategies of anesthesiology groups' practices.
Scurlock, Corey; Dexter, Franklin; Reich, David L; Galati, Maria
2011-07-01
Progress has been made in understanding strategic decision making influencing anesthesia groups' operating room business practices. However, there has been little analysis of the remaining gaps in our knowledge. We performed a needs assessment to identify unsolved problems in anesthesia business strategy based on Porter's Five Forces Analysis. The methodology was a narrative literature review. We found little previous investigation for 2 of the 5 forces (threat of new entrants and bargaining power of suppliers), modest understanding for 1 force (threat of substitute products or services), and substantial understanding for 2 forces (bargaining power of customers and jockeying for position among current competitors). Additional research in strategic decisions influencing anesthesia groups should focus on the threat of new entrants, bargaining power of suppliers, and the threat of substitute products or services.
A multi-phase network situational awareness cognitive task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
Abstract The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into making certain that we had feedback from network analysts and managers and understand what their genuine needs are. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. This article also provides the details we acquired from the analysts on their processes, goals, concerns, and the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.
Grant, A. M.; Richard, Y.; Deland, E.; Després, N.; de Lorenzi, F.; Dagenais, A.; Buteau, M.
1997-01-01
The Autocontrol methodology has been developed in order to support the optimisation of decision-making and the use of resources in the context of a clinical unit. The theoretical basis relates to quality assurance and information systems and is influenced by management and cognitive research in the health domain. The methodology uses population rather than individual decision making and, because of its dynamic feedback design, promises to have a rapid and profound effect on practice. Most importantly, the health care professional is the principal user of the Autocontrol system. In this methodology we distinguish three types of evidence necessary for practice change: practice-based or internal evidence, best evidence derived from the literature or external evidence concerning the practice in question, and process-based evidence on how to optimise the process of practice change. The software used by the system is of the executive decision support type, which facilitates interrogation of large databases. The Autocontrol system is designed to interrogate the data of the patient medical record; however, the latter often lacks data on concomitant resource use and this must be supplemented. This paper reviews the Autocontrol methodology and gives examples from current studies. PMID:9357733
Social and ethical analysis in health technology assessment.
Tantivess, Sripen
2014-05-01
This paper presents a review of the domestic and international literature on the assessment of the social and ethical implications of health technologies. It gives an overview of the key concepts, principles, and approaches that should be taken into account when conducting a social and ethical analysis within health technology assessment (HTA). Although there is growing consensus among healthcare experts that the social and ethical ramifications of a given technology should be examined before its adoption, the demand for this kind of analysis among policy-makers around the world, including in Thailand, has so far been lacking. Currently, decision-makers base technology adoption decisions mainly on evidence of clinical effectiveness, value for money, and budget impact, while social and ethical aspects have been neglected. Despite the recognized importance of considering equity, justice, and social issues when making decisions regarding health resource allocation, the absence of internationally-accepted principles and methodologies, among other factors, hinders research in these areas. Given that developing internationally agreed standards takes time, it has been recommended that priority be given to defining processes that are justifiable, transparent, and contestable. A discussion of the current situation in Thailand concerning social and ethical analysis of health technologies is also presented.
Use of multicriteria analysis (MCA) for sustainable hydropower planning and management.
Vassoney, Erica; Mammoliti Mochet, Andrea; Comoglio, Claudio
2017-07-01
Multicriteria analysis (MCA) is a decision-making tool applied to a wide range of environmental management problems, including renewable energy planning and management. An interesting field of application of MCA is the evaluation and analysis of the conflicting aspects of hydropower (HP) exploitation, affecting the three pillars of sustainability and involving several different stakeholders. The present study was aimed at reviewing the state of the art of MCA applications to sustainable hydropower production and related decision-making problems, based on a detailed analysis of the scientific papers published over the last 15 years on this topic. The papers were analysed and compared, focusing on the specific features of the MCA methods applied in the described case studies, highlighting the general aspects of the MCA application (purpose, spatial scale, software used, stakeholders, etc.) and the specific operational/technical features of the selected MCA technique (methodology, criteria, evaluation, approach, sensitivity, etc.). Some specific limitations of the analysed case studies were identified, and a set of "quality indexes" for an exhaustive MCA application was suggested as a potential improvement to more effectively support decision-making processes in sustainable HP planning and management problems. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cost-effectiveness analysis: problems and promise for evaluating medical technology
NASA Astrophysics Data System (ADS)
Juday, Timothy R.
1994-12-01
Although using limited financial resources in the most beneficial way is, in principle, a laudable goal, actually developing standards for measuring the cost-effectiveness of medical technologies and incorporating them into the coverage process is a much more difficult proposition. Important methodological difficulties include determining how to compare a technology to its leading alternative, defining costs, incorporating patient preferences, and defining health outcomes. In addition, more practical questions must be addressed. These questions include: who does the analysis? who makes the decisions? which technologies to evaluate? what resources are required? what is the political and legal environment? how much is a health outcome worth? The ultimate question that must be answered is what a health outcome is worth. Cost-effectiveness analysis cannot answer this question; it only enables comparison of cost-effectiveness ratios across technologies. In order to determine whether a technology should be covered, society or individual insurers must determine how much they are willing to pay for the health benefits. Conducting cost-effectiveness analysis will not remove the need to make difficult resource allocation decisions; however, explicitly examining the tradeoffs involved in these decisions should help to improve the process.
Washington, Simon; Oh, Jutaek
2006-03-01
Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but from different regions, states, or countries where a direct generalization may not be appropriate; (3) where the technologies and/or countermeasures are relatively untested, or (4) where costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. The importance of an informed and well-considered decision based on the best possible engineering knowledge and information is imperative due to the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving safety of at grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the author's knowledge the complete methodology is new and has not previously been applied or reported in the literature. 
The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 at grade railroad crossings considered in this analysis, it was found that the top three performing countermeasures for reducing crashes are in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
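As a much-simplified sketch of the Bayesian step described above, the prior and the expert-elicited likelihood can be combined with a conjugate normal model on log(AMF). This stands in for the paper's full treatment of AMF densities; the prior, the expert draws, and all numbers below are hypothetical.

```python
import numpy as np

def posterior_log_amf(prior_mean, prior_sd, expert_log_amfs):
    """Conjugate normal update on log(AMF): combine a prior with
    expert-elicited AMF draws treated as the data likelihood."""
    x = np.asarray(expert_log_amfs, dtype=float)
    n, xbar, s2 = len(x), x.mean(), x.var(ddof=1)
    prec = 1.0 / prior_sd**2 + n / s2         # posterior precision
    mean = (prior_mean / prior_sd**2 + n * xbar / s2) / prec
    sd = np.sqrt(1.0 / prec)
    return mean, sd

# Hypothetical example: vague prior centred at AMF = 1 (log 0),
# ten expert opinions centred near AMF = 0.8 (log ~ -0.223).
rng = np.random.default_rng(7)
experts = rng.normal(np.log(0.8), 0.1, 10)
m, s = posterior_log_amf(0.0, 1.0, experts)
amf_point = np.exp(m)                          # posterior point estimate
ci = np.exp([m - 1.96 * s, m + 1.96 * s])      # 95% credible interval
```

The posterior credible intervals computed this way are what allow candidate countermeasures to be ranked by expected safety return while accounting for uncertainty.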
Using TELOS for the planning of the information system audit
NASA Astrophysics Data System (ADS)
Drljaca, D. P.; Latinovic, B.
2018-01-01
The intent of this paper is to analyse different aspects of information system audit and to synthesise them into a feasibility study report, in order to facilitate decision making and planning of the information system audit process. The TELOS methodology provides a comprehensive and holistic framework for conducting a feasibility study in general. This paper examines the use of TELOS in the identification of possible factors that may influence the decision on implementing an information system audit. The research question is whether TELOS provides sufficient information for decision makers to plan an information system audit. It was found that the TELOS methodology can be successfully applied in the process of approving and planning an information system audit. The five aspects of the feasibility study, if performed objectively, can provide sufficient information for decision makers to commission an information system audit, and also contribute to better planning of the audit. Using the TELOS methodology can assure an evidence-based and cost-effective decision-making process and facilitate planning of the audit. The paper proposes an original approach, not examined until now: TELOS is usually applied for other purposes when a feasibility study is needed, not in the planning of an information system audit. This gives the paper its originality and opens further research questions about evaluation of the feasibility study and possible research on comparative and complementary methodologies.
Methodological individualism in experimental games: not so easily dismissed.
Krueger, Joachim I
2008-06-01
Orthodox game theory and social preference models cannot explain why people cooperate in many experimental games or how they manage to coordinate their choices. The theory of evidential decision making provides a solution, based on the idea that people tend to project their own choices onto others, whatever these choices might be. Evidential decision making preserves methodological individualism, and it works without recourse to social preferences. Rejecting methodological individualism, team reasoning is a thinly disguised resurgence of the group mind fallacy, and the experiments reported by Colman et al. [Colman, A. M., Pulford, B. D., & Rose, J. (this issue). Collective rationality in interactive decisions: Evidence for team reasoning. Acta Psychologica, doi:10.1016/j.actpsy.2007.08.003.] do not offer evidence that uniquely supports team reasoning.
NASA Astrophysics Data System (ADS)
Lowe, Robert; Ziemke, Tom
2010-09-01
The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiroyoshi Ueda; Katsuhiko Ishiguro; Kazumi Kitayama
2007-07-01
NUMO (Nuclear Waste Management Organization of Japan) has a responsibility for implementing geological disposal of vitrified HLW (High-Level radioactive Waste) in the Japanese nuclear waste management programme. Its staged siting procedure was initiated in 2002 by an open call for volunteer sites. Careful management strategy and methodology for the technical decision-making at every milestone are required to prepare for the volunteer site application and the site investigation stages after that. The formal Requirement Management System (RMS) is planned to support the computerized implementation of the specific management methodology, termed the NUMO Structured Approach (NSA). This planned RMS will help for comprehensive management of the decision-making processes in the geological disposal project, change management towards the anticipated project deviations, efficient project driving such as well programmed R and D etc. and structured record-keeping regarding the past decisions, which leads to soundness of the project in terms of the long-term continuity. The system should have handling/management functions for the database including the decisions/requirements in the project in consideration, their associated information and the structures composed of them in every decision-making process. The information relating to the premises, boundary conditions and time plan of the project should also be prepared in the system. Effective user interface and efficient operation on the in-house network are necessary. As a living system for the long-term formal use, flexibility to updating is indispensable. In advance of the formal system development, two-year activity to develop the preliminary RMS was already started. The purpose of this preliminary system is to template the decision/requirement structure, prototype the decision making management and thus show the feasibility of the innovative RMS.
The paper describes the current status of the development, focusing on the initial stage including work analysis/modeling and the system conceptualization. (authors)
Sabharwal, Sanjeeve; Carter, Alexander; Darzi, Lord Ara; Reilly, Peter; Gupte, Chinmay M
2015-06-01
Approximately 76,000 people a year sustain a hip fracture in the UK and the estimated cost to the NHS is £1.4 billion a year. Health economic evaluations (HEEs) are one of the methods employed by decision makers to deliver healthcare policy supported by clinical and economic evidence. The objective of this study was to (1) identify and characterize HEEs for the management of patients with hip fractures, and (2) examine their methodological quality. A literature search was performed in MEDLINE, EMBASE and the NHS Economic Evaluation Database. Studies that met the specified definition for a HEE and evaluated hip fracture management were included. Methodological quality was assessed using the Consensus on Health Economic Criteria (CHEC). Twenty-seven publications met the inclusion criteria of this study and were included in our descriptive and methodological analysis. Domains of methodology that performed poorly included use of an appropriate time horizon (66.7% of studies), incremental analysis of costs and outcomes (63%), future discounting (44.4%), sensitivity analysis (40.7%), declaration of conflicts of interest (37%) and discussion of ethical considerations (29.6%). Publication of HEEs for patients with hip fractures has increased in recent years. Most of these studies fail to adopt a societal perspective and key aspects of their methodology are poor. The development of future HEEs in this field must adhere to established principles of methodology, so that better quality research can be used to inform health policy on the management of patients with a hip fracture. Copyright © 2014 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
Dicks, Sean Glenton; Ranse, Kristen; van Haren, Frank MP; Boer, Douglas P
2017-01-01
Information and compassion assist families of potential organ donors to make informed decisions. However, psychological implications of the in-hospital process are not well described with past research focusing on decision-making. To enhance understanding and improve service delivery, a systematic review was conducted. Inductive analysis and synthesis utilised Grounded Theory Methodology within a systems theory framework and contributed to a model proposing that family and staff form a System of Systems with shared responsibility for process outcomes. This model can guide evaluation and improvement of care and will be tested by means of a longitudinal study of family experiences. PMID:28680696
Miniati, Roberto; Dori, Fabrizio; Cecconi, Giulio; Gusinu, Roberto; Niccolini, Fabrizio; Gentili, Guido Biffi
2013-01-01
A fundamental element of the social and safety function of a health structure is the need to guarantee continuity of clinical activity through the continuity of technology. This paper aims to design a Decision Support System (DSS) for medical technology evaluations based on the use of Key Performance Indicators (KPI) in order to provide a multi-disciplinary evaluation of a technology in a health structure. The methodology used in planning the DSS followed these key steps: the definition of relevant KPIs, the development of a database to calculate the KPIs, the calculation of the defined KPIs and the resulting study report. Finally, the clinical and economic validation of the system was conducted through a case study of Business Continuity applied in the operating department of the Florence University Hospital AOU Careggi in Italy. A web-based support system was designed for HTA in health structures. The case study enabled Business Continuity Management (BCM) to be implemented in a hospital department in relation to aspects of a single technology and the specific clinical process. Finally, an economic analysis of the procedure was carried out. The system is useful for decision makers in that it precisely defines which equipment to include in the BCM procedure, using a scale analysis of the specific clinical process in which the equipment is used. In addition, the economic analysis shows how the cost of the procedure is completely covered by the indirect costs which would result from the expenses incurred from a broken device, hence showing the complete auto-sustainability of the methodology.
Kaimakamis, Evangelos; Tsara, Venetia; Bratsas, Charalambos; Sichletidis, Lazaros; Karvounis, Charalambos; Maglaveras, Nikolaos
2016-01-01
Obstructive Sleep Apnea (OSA) is a common sleep disorder requiring the time/money consuming polysomnography for diagnosis. Alternative methods for initial evaluation are sought. Our aim was the prediction of Apnea-Hypopnea Index (AHI) in patients potentially suffering from OSA based on nonlinear analysis of respiratory biosignals during sleep, a method that is related to the pathophysiology of the disorder. One hundred and thirty-five patients referred to a Sleep Unit underwent full polysomnography. Three nonlinear indices (Largest Lyapunov Exponent, Detrended Fluctuation Analysis and Approximate Entropy) extracted from two biosignals (airflow from a nasal cannula, thoracic movement) and one linear index derived from oxygen saturation provided input to a data mining application with contemporary classification algorithms for the creation of predictive models for AHI. A linear regression model presented a correlation coefficient of 0.77 in predicting AHI. With a cutoff value of AHI = 8, the sensitivity and specificity were 93% and 71.4% in discrimination between patients and normal subjects. The decision tree for the discrimination between patients and normal subjects had sensitivity and specificity of 91% and 60%, respectively. Certain obtained nonlinear values correlated significantly with commonly accepted physiological parameters of people suffering from OSA. We developed a predictive model for the presence/severity of OSA using a simple linear equation and additional decision trees with nonlinear features extracted from 3 respiratory recordings. The accuracy of the methodology is high and the findings provide insight to the underlying pathophysiology of the syndrome. Reliable predictions of OSA are possible using linear and nonlinear indices from only 3 respiratory signals during sleep. The proposed models could lead to a better study of the pathophysiology of OSA and facilitate initial evaluation and follow-up of patients with suspected OSA using a practical, low-cost methodology.
ClinicalTrials.gov NCT01161381.
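The discrimination figures reported above (93% sensitivity, 71.4% specificity at AHI = 8) follow from a standard confusion-matrix calculation once predicted and polysomnography-derived AHI values are dichotomised at the cutoff. A minimal sketch, with hypothetical AHI values rather than the study's data:

```python
def sens_spec(actual, predicted, cutoff=8.0):
    """Sensitivity/specificity of a predicted AHI against polysomnography,
    dichotomised at the given cutoff (AHI >= cutoff counts as OSA-positive)."""
    tp = sum(a >= cutoff and p >= cutoff for a, p in zip(actual, predicted))
    fn = sum(a >= cutoff and p < cutoff for a, p in zip(actual, predicted))
    tn = sum(a < cutoff and p < cutoff for a, p in zip(actual, predicted))
    fp = sum(a < cutoff and p >= cutoff for a, p in zip(actual, predicted))
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical AHI values (events/hour): actual from polysomnography,
# predicted from a regression on nonlinear respiratory indices
actual    = [2, 5, 12, 30, 45, 7, 9, 3, 22, 6]
predicted = [3, 9, 14, 28, 40, 5, 11, 4, 18, 7]
sens, spec = sens_spec(actual, predicted)
```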
Do systematic reviews on pediatric topics need special methodological considerations?
Farid-Kapadia, Mufiza; Askie, Lisa; Hartling, Lisa; Contopoulos-Ioannidis, Despina; Bhutta, Zulfiqar A; Soll, Roger; Moher, David; Offringa, Martin
2017-03-06
Systematic reviews are key tools to enable decision making by healthcare providers and policymakers. Despite the availability of the evidence based Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA-2009 and PRISMA-P 2015) statements that were developed to improve the transparency and quality of reporting of systematic reviews, uncertainty on how to deal with pediatric-specific methodological challenges of systematic reviews impairs decision-making in child health. In this paper, we identify methodological challenges specific to the design, conduct and reporting of pediatric systematic reviews, and propose a process to address these challenges. One fundamental decision at the outset of a systematic review is whether to focus on a pediatric population only, or to include both adult and pediatric populations. Both from the policy and patient care point of view, the appropriateness of interventions and comparators administered to pre-defined pediatric age subgroup is critical. Decisions need to be based on the biological plausibility of differences in treatment effects across the developmental trajectory in children. Synthesis of evidence from different trials is often impaired by the use of outcomes and measurement instruments that differ between trials and are neither relevant nor validated in the pediatric population. Other issues specific to pediatric systematic reviews include lack of pediatric-sensitive search strategies and inconsistent choices of pediatric age subgroups in meta-analyses. In addition to these methodological issues generic to all pediatric systematic reviews, special considerations are required for reviews of health care interventions' safety and efficacy in neonatology, global health, comparative effectiveness interventions and individual participant data meta-analyses. To date, there is no standard approach available to overcome this problem. 
We propose to develop a consensus-based checklist of essential items which researchers should consider when they are planning (PRISMA-PC-Protocol for Children) or reporting (PRISMA-C-reporting for Children) a pediatric systematic review. Available guidelines including PRISMA do not cover the complexity associated with the conduct and reporting of systematic reviews in the pediatric population; they require additional and modified standards for reporting items. Such guidance will facilitate the translation of knowledge from the literature to bedside care and policy, thereby enhancing delivery of care and improving child health outcomes.
Execution Of Systems Integration Principles During Systems Engineering Design
2016-09-01
This thesis discusses integration failures observed in DOD and non-DOD systems, such as inadequate stakeholder analysis, incomplete problem space and design ... design, development, test and deployment of a system. A lifecycle structure consists of phases within a methodology or process model. There are many ... investigate design decisions without the need to commit to physical forms; "experimental investigation using a model yields design or operational
Nadal, Ana; Pons, Oriol; Cuerva, Eva; Rieradevall, Joan; Josa, Alejandro
2018-06-01
Today, urban agriculture is one of the most widely used sustainability strategies to improve the metabolism of a city. Schools can play an important role in the implementation of sustainability master plans, due to their socio-educational activities and their cohesive links with families; all key elements in the development of urban agriculture. Thus, the main objective of this research is to develop a procedure, in compact cities, to assess the potential installation of rooftop greenhouses (RTGs) in schools. The generation of a dynamic assessment tool capable of identifying and prioritizing schools with a high potential for RTGs and their eventual implementation would also represent a significant factor in the environmental, social, and nutritional education of younger generations. The methodology has four stages (Pre-selection criteria; Selection of necessities; Sustainability analysis; and Sensitivity analysis and selection of the best alternative) in which economic, environmental, social and governance aspects all are considered. It makes use of Multi-Attribute Utility Theory and Multi-Criteria Decision Making, through the Integrated Value Model for Sustainability Assessments and the participation of two panels of multidisciplinary specialists, for the preparation of a unified sustainability index that guarantees the objectivity of the selection process. This methodology has been applied and validated in a case study of 11 schools in Barcelona (Spain). The social perspective of the proposed methodology favored the school in the case study with the most staff and the largest parent-teacher association (social and governance indicators), which obtained the highest sustainability index (S11), at a considerable distance (45%) from the worst case (S3) with fewer school staff and less parental support.
Finally, objective decisions may be taken with the assistance of this appropriate, adaptable, and reliable Multi-Criteria Decision-Making tool on the vertical integration and implementation of urban agriculture in schools, in support of the goals of sustainable development and the circular economy. Copyright © 2018 Elsevier B.V. All rights reserved.
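The unified sustainability index described above is, at its core, a Multi-Attribute Utility Theory aggregation: normalised criterion scores combined with panel-derived weights. A minimal sketch, in which the weights, criterion groups and school scores are all hypothetical illustrations rather than the study's actual values:

```python
def sustainability_index(scores, weights):
    """Weighted additive value model (MAUT): criterion scores are assumed
    already normalised to [0, 1], and the weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# hypothetical panel weights over the four aspect groups
weights = {"economic": 0.25, "environmental": 0.25, "social": 0.30, "governance": 0.20}

# hypothetical normalised scores for two schools
s11 = {"economic": 0.6, "environmental": 0.7, "social": 0.9, "governance": 0.8}
s3  = {"economic": 0.5, "environmental": 0.6, "social": 0.3, "governance": 0.3}

ranking = sorted({"S11": sustainability_index(s11, weights),
                  "S3": sustainability_index(s3, weights)}.items(),
                 key=lambda kv: kv[1], reverse=True)
```

With these illustrative numbers the school with stronger social and governance scores tops the ranking, mirroring the pattern the study reports.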
Medilanski, Edi; Chuan, Liang; Mosler, Hans-Joachim; Schertenleib, Roland; Larsen, Tove A
2007-05-01
We conducted a study of the institutional barriers to introducing urine source separation in the urban area of Kunming, China. On the basis of a stakeholder analysis, we constructed stakeholder diagrams showing the relative importance of decision-making power and (positive) interest in the topic. A hypothetical decision-making process for the urban case was derived based on a successful pilot project in a periurban area. All our results were evaluated by the stakeholders. We concluded that although a number of primary stakeholders have a large interest in testing urine source separation also in an urban context, most of the key stakeholders would be reluctant to this idea. However, the success in the periurban area showed that even a single, well-received pilot project can trigger the process of broad dissemination of new technologies. Whereas the institutional setting for such a pilot project is favorable in Kunming, a major challenge will be to adapt the technology to the demands of an urban population. Methodologically, we developed an approach to corroborate a stakeholder analysis with the perception of the stakeholders themselves. This is important not only in order to validate the analysis but also to bridge the theoretical gap between stakeholder analysis and stakeholder involvement. We also show that in disagreement with the assumption of most policy theories, local stakeholders consider informal decision pathways to be of great importance in actual policy-making.
Pasqualini, Vanina; Oberti, Pascal; Vigetta, Stéphanie; Riffard, Olivier; Panaïotis, Christophe; Cannac, Magali; Ferrat, Lila
2011-07-01
Forest management can benefit from decision support tools, including GIS-based multicriteria decision-aiding approaches. In the Mediterranean region, Pinus pinaster forests play a very important role in biodiversity conservation and offer many socioeconomic benefits. However, the conservation of this species is affected by the increase in forest fires and the expansion of Matsucoccus feytaudi. This paper proposes a methodology based on commonly available data for assessing the values and risks of P. pinaster forests and for generating maps to aid in decisions pertaining to fire and phytosanitary risk management. The criteria for assessing the values (land cover type, legislative tools for biodiversity conservation, environmental tourist sites and access routes, and timber yield) and the risks (fire and phytosanitary) of P. pinaster forests were obtained directly or by considering specific indicators, and they were subsequently aggregated by means of GIS-based multicriteria analysis. This approach was tested on the island of Corsica (France), and maps to aid in decisions pertaining to fire risk and phytosanitary risk (M. feytaudi) were obtained for P. pinaster forest management. Study results are used by the technical offices of the local administration, the Corsican Agricultural and Rural Development Agency (ODARC), for planning the conservation of P. pinaster forests with regard to fire prevention and safety and phytosanitary risks. The decision maker took part in the evaluation criteria study (weight, normalization, and classification of the values). Most suitable locations are given to target the public intervention. The methodology presented in this paper could be applied to other species and in other Mediterranean regions.
Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.
The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter’s cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.
NASA Technical Reports Server (NTRS)
Hanagud, S.; Uppaluri, B.
1975-01-01
This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.
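The decision step described above, selecting a maintenance policy by minimum expected cost (risk function), can be illustrated with a bare-bones calculation. The inspection and failure costs, and the failure probabilities implied by each inspection schedule, are hypothetical placeholders, not values from the paper's crack-growth model:

```python
def expected_cost(n_inspections, p_failure, c_inspect=5e3, c_failure=2e6):
    """Risk function for one maintenance policy: inspection outlay plus the
    probability-weighted cost of a fatigue failure over the service life."""
    return n_inspections * c_inspect + p_failure * c_failure

# hypothetical policies: more inspections catch cracks earlier,
# lowering the probability that a crack grows to failure
policies = {4: 0.020, 8: 0.008, 16: 0.005}   # inspections -> P(failure)
best = min(policies, key=lambda n: expected_cost(n, policies[n]))
```

The minimum-risk choice balances inspection cost against the failure probability reduction, which is exactly the trade-off the paper's statistical decision procedure formalises.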
ERIC Educational Resources Information Center
Young, I. Phillip; Fawcett, Paul
2013-01-01
Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specifically addressed are…
A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.
Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew
2016-01-01
While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision-analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
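The wNCB idea described above, a weighted sum of treatment-versus-comparator differences in binary outcome probabilities, can be sketched in a few lines. The outcomes, risk differences and importance weights below are hypothetical illustrations (loosely inspired by the natalizumab setting, where relapse prevention is the benefit and PML a serious harm), not the case study's numbers:

```python
def wncb(risk_diffs, weights):
    """Weighted net clinical benefit: sum of (treatment - comparator)
    outcome-probability differences, weighted by clinical importance.
    Benefits carry positive weight and harms negative weight, so a
    wNCB above zero favours the treatment."""
    return sum(weights[o] * risk_diffs[o] for o in weights)

# hypothetical risk differences and importance weights
risk_diffs = {"relapse_avoided": 0.15, "pml": 0.001}
weights    = {"relapse_avoided": 1.0, "pml": -50.0}
score = wncb(risk_diffs, weights)
```

With these illustrative values the benefit outweighs the weighted harm (score > 0); the weight ratio plays the role of the NNT-style trade-off the abstract mentions.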
Pollock, Michelle; Fernandes, Ricardo M; Hartling, Lisa
2017-03-23
Overviews of reviews (overviews) compile information from multiple systematic reviews (SRs) to provide a single synthesis of relevant evidence for decision-making. It is recommended that authors assess and report the methodological quality of SRs in overviews-for example, using A MeaSurement Tool to Assess systematic Reviews (AMSTAR). Currently, there is variation in whether and how overview authors assess and report SR quality, and limited guidance is available. Our objectives were to: examine methodological considerations involved in using AMSTAR to assess the quality of Cochrane and non-Cochrane SRs in overviews of healthcare interventions; identify challenges (and develop potential decision rules) when using AMSTAR in overviews; and examine the potential impact of considering methodological quality when making inclusion decisions in overviews. We selected seven overviews of healthcare interventions and included all SRs meeting each overview's inclusion criteria. For each SR, two reviewers independently conducted AMSTAR assessments with consensus and discussed challenges encountered. We also examined the correlation between AMSTAR assessments and SR results/conclusions. Ninety-five SRs were included (30 Cochrane, 65 non-Cochrane). Mean AMSTAR assessments (9.6/11 vs. 5.5/11; p < 0.001) and inter-rater reliability (AC1 statistic: 0.84 vs. 0.69; "almost perfect" vs. "substantial" using the Landis & Koch criteria) were higher for Cochrane compared to non-Cochrane SRs. Four challenges were identified when applying AMSTAR in overviews: the scope of the SRs and overviews often differed; SRs examining similar topics sometimes made different methodological decisions; reporting of non-Cochrane SRs was sometimes poor; and some non-Cochrane SRs included other SRs as well as primary studies. Decision rules were developed to address each challenge. We found no evidence that AMSTAR assessments were correlated with SR results/conclusions. 
Results indicate that the AMSTAR tool can be used successfully in overviews that include Cochrane and non-Cochrane SRs, though decision rules may be useful to circumvent common challenges. Findings support existing recommendations that quality assessments of SRs in overviews be conducted independently, in duplicate, with a process for consensus. Results also suggest that using methodological quality to guide inclusion decisions (e.g., to exclude poorly conducted and reported SRs) may not introduce bias into the overview process.
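The inter-rater agreement statistic cited above, the AC1 (Gwet's first-order agreement coefficient), can be computed directly from paired ratings. A minimal Python sketch for the two-rater case (the function name and input format are illustrative, not from the paper; at least two distinct rating categories must be present):

```python
from collections import Counter

def gwet_ac1(pairs):
    """Gwet's AC1 chance-corrected agreement for two raters.

    pairs: list of (rater1, rater2) category labels, one tuple per item.
    Requires at least two distinct categories across all ratings.
    """
    n = len(pairs)
    p_a = sum(a == b for a, b in pairs) / n   # observed agreement
    counts = Counter()                        # category frequencies over both raters
    for a, b in pairs:
        counts[a] += 1
        counts[b] += 1
    k = len(counts)
    # chance agreement: sum over categories of pi_q * (1 - pi_q), scaled by 1/(k-1)
    p_e = sum((c / (2 * n)) * (1 - c / (2 * n)) for c in counts.values()) / (k - 1)
    return (p_a - p_e) / (1 - p_e)
```

Unlike Cohen's kappa, AC1 stays stable when one category dominates, which is one reason it is used for quality-assessment items that most reviews either pass or fail.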
Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.
2000-01-01
Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a preimplementation methodology for evaluating and appraising material, process and human costs and benefits. Based on the users' needs and organizational process analysis, the methodology provides an evaluative set of financial and non-financial indicators which can be integrated into a decision-making and investment evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to extend the method to the entire project. PMID:11079851
Hernandez, Jonathan M; Tsalatsanis, Athanasios; Humphries, Leigh Ann; Miladinovic, Branko; Djulbegovic, Benjamin; Velanovich, Vic
2014-06-01
To use regret decision theory methodology to assess three treatment strategies in pancreatic adenocarcinoma. Pancreatic adenocarcinoma is uniformly fatal without operative intervention. Resection can prolong survival in some patients; however, it is associated with significant morbidity and mortality. Regret theory serves as a novel framework linking both rationality and intuition to determine the optimal course for physicians facing difficult decisions related to treatment. We used the Cox proportional hazards model to predict survival of patients with pancreatic adenocarcinoma and generated a decision model using regret-based decision curve analysis, which integrates both the patient's prognosis and the physician's preferences expressed in terms of regret associated with a certain action. A physician's treatment preferences are indicated by a threshold probability, which is the probability of death/survival at which the physician is uncertain whether or not to perform surgery. The analysis modeled 3 possible choices: perform surgery on all patients; never perform surgery; and act according to the prediction model. The records of 156 consecutive patients with pancreatic adenocarcinoma treated by a single surgeon at a tertiary referral center were retrospectively evaluated. Significant independent predictors of overall survival included preoperative stage [P = 0.005; 95% confidence interval (CI), 1.19-2.27], vitality (P < 0.001; 95% CI, 0.96-0.98), daily physical function (P < 0.001; 95% CI, 0.97-0.99), and pathological stage (P < 0.001; 95% CI, 3.06-16.05). Compared with the "always aggressive" or "always passive" surgical treatment strategies, the survival model was associated with the least amount of regret for a wide range of threshold probabilities.
Regret-based decision curve analysis provides a novel perspective for making treatment-related decisions by incorporating the decision maker's preferences expressed as his or her estimates of benefits and harms associated with the treatment considered.
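Regret-based decision curve analysis builds on ordinary decision curve analysis, in which the strategy "operate if predicted risk >= threshold" is scored by its net benefit across threshold probabilities; the regret formulation re-expresses the same trade-off as an expected loss. A simplified sketch of the underlying net-benefit calculation (a stand-in for illustration, not the authors' exact model; variable names are assumptions):

```python
def net_benefit(probs, events, pt):
    """Net benefit of 'treat if predicted risk >= pt' at threshold probability pt.

    probs  : model-predicted probabilities of the bad outcome
    events : 1 if the bad outcome occurred, else 0
    pt     : threshold probability encoding the harm/benefit trade-off
    """
    n = len(probs)
    tp = sum(1 for p, e in zip(probs, events) if p >= pt and e == 1)  # justified treatments
    fp = sum(1 for p, e in zip(probs, events) if p >= pt and e == 0)  # unnecessary treatments
    return tp / n - (fp / n) * (pt / (1 - pt))
```

Evaluating this for the prediction model, for "always operate" (all probabilities set to 1), and for "never operate" (net benefit 0) over a range of pt values reproduces the three strategies compared in the study; the regret curve is the mirror image of the benefit curve.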
Cheung, Steven W; Aranda, Derick; Driscoll, Colin L W; Parsa, Andrew T
2010-02-01
Complex medical decision making obligates tradeoff assessments among treatment outcomes expectations, but an accessible tool to perform the necessary analysis is conspicuously absent. We aimed to demonstrate methodology and feasibility of adapting conjoint analysis for mapping clinical outcomes expectations to treatment decisions in vestibular schwannoma (VS) management. Study design: prospective. Setting: tertiary medical center and US-based otologists/neurotologists. Treatment preference profiles among VS stakeholders (61 younger and 74 older prospective patients, 61 observation patients, and 60 surgeons) were assessed for the synthetic VS case scenario of a 10-mm tumor in association with useful hearing and normal facial function. Main outcome measure: treatment attribute utility. Conjoint analysis attribute levels were set in accordance with the results of a meta-analysis. Forty-five case series were disaggregated to formulate microsurgery facial nerve and hearing preservation outcomes expectations models. Attribute utilities were computed and mapped to the realistic treatment choices of translabyrinthine craniotomy, middle fossa craniotomy, and gamma knife radiosurgery. Among the treatment attributes (likelihood of causing deafness, temporary facial weakness for 2 months, incurable cancer within 20 years, and recovery time), permanent deafness was less important to tumor surgeons, and temporary facial weakness was more important to tumor surgeons and observation patients (Wilcoxon rank-sum, p < 0.001). Inverse mapping of preference profiles to realistic treatment choices showed all study cohorts were inclined to choose gamma knife radiosurgery. Mapping clinical outcomes expectations to treatment decisions for a synthetic clinical scenario revealed inhomogeneous drivers of choice selection among study cohorts.
Medical decision engines that analyze personal preferences of outcomes expectations for VS and many other diseases may be developed to promote shared decision making among health care stakeholders and transparency in the informed consent process.
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables, according to its nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology makes direct use of the original site information as far as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
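For readers unfamiliar with PROMETHEE, the crisp (non-fuzzy) PROMETHEE II baseline that the paper extends can be sketched in a few lines. With the "usual" preference function, one alternative is preferred over another on a criterion simply when it scores higher (all names below are illustrative assumptions):

```python
def promethee_ii(scores, weights):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference function.

    scores[i][j]: performance of alternative i on criterion j (higher is better)
    weights     : criterion weights summing to 1
    Returns one net flow per alternative; rank alternatives by descending flow.
    """
    n = len(scores)
    phi = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # weighted share of criteria on which a strictly beats b
            pi_ab = sum(w for sa, sb, w in zip(scores[a], scores[b], weights) if sa > sb)
            phi[a] += pi_ab / (n - 1)  # contributes to a's leaving flow
            phi[b] -= pi_ab / (n - 1)  # and to b's entering flow
    return phi
```

The paper's contribution is to let the performance scores carry intervals, distributions, or linguistic terms converted to the 2-tuple domain before this pairwise comparison step.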
Priorities for health economic methodological research: results of an expert consultation.
Tordrup, David; Chouaid, Christos; Cuijpers, Pim; Dab, William; van Dongen, Johanna Maria; Espin, Jaime; Jönsson, Bengt; Léonard, Christian; McDaid, David; McKee, Martin; Miguel, José Pereira; Patel, Anita; Reginster, Jean-Yves; Ricciardi, Walter; Rutten-van Molken, Maureen; Rupel, Valentina Prevolnik; Sach, Tracey; Sassi, Franco; Waugh, Norman; Bertollini, Roberto
2017-01-01
The importance of economic evaluation in decision making is growing with increasing budgetary pressures on health systems. Diverse economic evidence is available for a range of interventions across national contexts within Europe, but little attention has been given to identifying evidence gaps that, if filled, could contribute to more efficient allocation of resources. One objective of the Research Agenda for Health Economic Evaluation project is to determine the most important methodological evidence gaps for the ten highest burden conditions in the European Union (EU), and to suggest ways of filling these gaps. The highest burden conditions in the EU by Disability Adjusted Life Years were determined using the Global Burden of Disease study. Clinical interventions were identified for each condition based on published guidelines, and economic evaluations indexed in MEDLINE were mapped to each intervention. A panel of public health and health economics experts discussed the evidence during a workshop and identified evidence gaps. The literature analysis contributed to identifying cross-cutting methodological and technical issues, which were considered by the expert panel to derive methodological research priorities. The panel suggests a research agenda for health economics which incorporates the use of real-world evidence in the assessment of new and existing interventions; increased understanding of cost-effectiveness according to patient characteristics beyond the "-omics" approach to inform both investment and disinvestment decisions; methods for assessment of complex interventions; improved cross-talk between economic evaluations from health and other sectors; early health technology assessment; and standardized, transferable approaches to economic modeling.
NASA Astrophysics Data System (ADS)
Kucharski, John; Tkach, Mark; Olszewski, Jennifer; Chaudhry, Rabia; Mendoza, Guillermo
2016-04-01
This presentation demonstrates the application of Climate Risk Informed Decision Analysis (CRIDA) at Zambia's principal water treatment facility, the Iolanda Water Treatment Plant. The water treatment plant is prone to unacceptable failures during periods of low hydropower production at the Kafue Gorge Dam Hydroelectric Power Plant. The case study explores approaches to increasing the water treatment plant's ability to deliver acceptable levels of service under the range of current and potential future climate states. The objective of the study is to investigate alternative investments to build system resilience that might have been informed by the CRIDA process, and to evaluate the extra resource requirements for a bilateral donor agency to implement the CRIDA process. The case study begins with an assessment of the water treatment plant's vulnerability to climate change. It does so by following general principles described in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework". By utilizing relatively simple bootstrapping methods, a range of possible future climate states is generated while avoiding the use of more complex and costly downscaling methodologies that are beyond the budget and technical capacity of many teams. The resulting climate vulnerabilities, and the uncertainty in the climate states that produce them, are analyzed as part of a "Level of Concern" analysis. CRIDA principles are then applied to this Level of Concern analysis in order to arrive at a set of actionable water management decisions. The principal goal of water resource management is to transform variable, uncertain hydrology into dependable services (e.g., water supply, flood risk reduction, ecosystem benefits, hydropower production, etc.). Traditional approaches to climate adaptation require the generation of predicted future climate states but do little to guide decision makers on how this information should affect their decisions.
In this context it is not surprising that the increased hydrologic variability and uncertainty produced by many climate risk analyses bedevil water resource decision making. The Climate Risk Informed Decision Analysis (CRIDA) approach builds on work found in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework", which provides guidance on vulnerability assessments. It guides practitioners through a "Level of Concern" analysis in which climate vulnerabilities are analyzed to produce actionable alternatives and decisions.
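The bootstrapping step described above, generating many plausible hydrologic futures by resampling the observed record rather than downscaling climate models, and the subsequent Level of Concern tally can be sketched as follows (trace counts, thresholds, and function names are illustrative assumptions):

```python
import random

def bootstrap_traces(annual_flows, n_traces=1000, horizon=30, seed=0):
    """Resample observed annual flows with replacement to build synthetic futures."""
    rng = random.Random(seed)
    return [[rng.choice(annual_flows) for _ in range(horizon)]
            for _ in range(n_traces)]

def level_of_concern(traces, service_threshold):
    """Fraction of synthetic futures whose mean flow falls below the level
    needed to deliver acceptable service."""
    return sum(1 for t in traces if sum(t) / len(t) < service_threshold) / len(traces)
```

A high level of concern would point the CRIDA process toward more robust or more adaptable investments; a low one supports conventional planning.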
Achillas, Ch; Vlachokostas, Ch; Moussiopoulos, Nu; Banias, G
2010-05-01
Environmentally sound end-of-life management of Electrical and Electronic Equipment is recognised internationally as a top-priority issue, both because of the waste stream's continuously increasing quantities and because of its content of valuable as well as hazardous materials. In an effort to manage Waste Electrical and Electronic Equipment (WEEE), adequate infrastructure in treatment and recycling facilities is considered a prerequisite. A critical number of such plants must be installed in order to: (i) accommodate legislative needs, (ii) decrease transportation costs, and (iii) expand the reverse logistics network to cover more areas. However, WEEE recycling infrastructures require high expenditures, and therefore decision makers need to be particularly cautious. In this context, special care should be given to the viability of the infrastructure, which depends heavily on the facilities' location. To this end, a methodology aiming towards optimal location of Units of Treatment and Recycling is developed, taking into consideration economic as well as social criteria, in an effort to reconcile local acceptance with financial viability. For the decision support system's needs, ELECTRE III is adopted as the multicriteria analysis technique. The methodology's applicability is demonstrated with a real-world case study in Greece. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
The application of seismic risk-benefit analysis to land use planning in Taipei City.
Hung, Hung-Chih; Chen, Liang-Chun
2007-09-01
In the developing countries of Asia, local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology to enable planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan: the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the level of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations, and the authors intend to assist city planners in evaluating the appropriateness of their planning decisions.
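A risk-benefit ratio of the kind described can take a very simple form: annualised loss avoided by a plan per unit of the plan's annualised cost. The paper's exact definition may differ; a minimal illustrative version:

```python
def risk_benefit_ratio(annual_loss_baseline, annual_loss_with_plan, annual_plan_cost):
    """Annualised seismic loss avoided per unit of annualised plan cost.

    Values above 1 suggest the plan's expected loss reduction outweighs
    its cost (illustrative form only, not the paper's exact formula).
    """
    return (annual_loss_baseline - annual_loss_with_plan) / annual_plan_cost
```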
Busch, Hauke; Boerries, Melanie; Bao, Jie; Hanke, Sebastian T; Hiss, Manuel; Tiko, Theodhor; Rensing, Stefan A
2013-01-01
Transcription factors (TFs) often trigger developmental decisions, yet their transcripts are often only moderately regulated and thus not easily detected by conventional statistics on expression data. Here we present a method for identifying such genes based on trajectory analysis of time-resolved transcriptome data. As a proof of principle, we have analysed apical stem cells of filamentous moss (P. patens) protonemata that develop from leaflets upon their detachment from the plant. By our novel correlation analysis of the post-detachment transcriptome kinetics, we predict five out of 1,058 TFs to be involved in the signaling leading to the establishment of pluripotency. Among the predicted regulators is the basic helix-loop-helix TF PpRSL1, which we show to be involved in the establishment of apical stem cells in P. patens. Our methodology is expected to aid analysis of key players of developmental decisions in complex plant and animal systems.
Probabilistic risk analysis and terrorism risk.
Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J
2010-04-01
Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.
Measuring sustainable development using a multi-criteria model: a case study.
Boggia, Antonio; Cortina, Carla
2010-11-01
This paper shows how Multi-criteria Decision Analysis (MCDA) can help in a complex process such as the assessment of the level of sustainability of a certain area. The paper presents the results of a study in which a model for measuring sustainability was implemented to better aid public policy decisions regarding sustainability. In order to assess sustainability in specific areas, a methodological approach based on multi-criteria analysis has been developed. The aim is to rank areas in order to understand the specific technical and/or financial support that they need to develop sustainable growth. The case study presented is an assessment of the level of sustainability in different areas of an Italian Region using the MCDA approach. Our results show that MCDA is a proper approach for sustainability assessment. The results are easy to understand and the evaluation path is clear and transparent. This is what decision makers need to support their decisions. The multi-criteria model for evaluation has been developed in keeping with the economic theory of sustainable development, so that the final results have a clear meaning in terms of sustainability. Copyright 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Barr, B. G.; Martinko, E. A.
1976-01-01
Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; and Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to ensure the continued relevance of the program and maximize the possibility of immediate operational use. Completed projects are briefly discussed.
NASA Astrophysics Data System (ADS)
Sobradelo, Rosa; Martí, Joan; Kilburn, Christopher; López, Carmen
2014-05-01
Understanding the potential evolution of a volcanic crisis is crucial to improving the design of effective mitigation strategies. This is especially the case for volcanoes close to densely-populated regions, where inappropriate decisions may trigger widespread loss of life, economic disruption and public distress. An outstanding goal for improving the management of volcanic crises, therefore, is to develop objective, real-time methodologies for evaluating how an emergency will develop and how scientists communicate with decision makers. Here we present a new model BADEMO (Bayesian Decision Model) that applies a general and flexible, probabilistic approach to managing volcanic crises. The model combines the hazard and risk factors that decision makers need for a holistic analysis of a volcanic crisis. These factors include eruption scenarios and their probabilities of occurrence, the vulnerability of populations and their activities, and the costs of false alarms and failed forecasts. The model can be implemented before an emergency, to identify actions for reducing the vulnerability of a district; during an emergency, to identify the optimum mitigating actions and how these may change as new information is obtained; and after an emergency, to assess the effectiveness of a mitigating response and, from the results, to improve strategies before another crisis occurs. As illustrated by a retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands, BADEMO provides the basis for quantifying the uncertainty associated with each recommended action as an emergency evolves, and serves as a mechanism for improving communications between scientists and decision makers.
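At its core, a Bayesian decision model of this kind weighs the cost of a false alarm against the cost of a failed forecast through the current eruption probability. A deliberately minimal two-action sketch (the real BADEMO handles multiple eruption scenarios and mitigation actions; the names and costs here are illustrative assumptions):

```python
def best_action(p_eruption, cost_false_alarm, cost_missed_eruption):
    """Choose the action with the lower expected loss.

    'alert' incurs cost_false_alarm if no eruption occurs;
    'wait' incurs cost_missed_eruption if an eruption does occur.
    """
    loss_alert = (1 - p_eruption) * cost_false_alarm
    loss_wait = p_eruption * cost_missed_eruption
    return "alert" if loss_alert < loss_wait else "wait"
```

Re-running such a rule as monitoring data update the eruption probability shows how recommended actions can change as an emergency evolves, which is the real-time use case described above.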
Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang
2018-05-01
Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA. We summarize the commonly used regression models and pooling methods in DRMA, and use an example to illustrate how to carry out a DRMA with these methods. Five regression models (linear regression, piecewise regression, natural polynomial regression, fractional polynomial regression, and restricted cubic spline regression) are illustrated for fitting the dose-response relationship, and two pooling approaches (the one-stage approach and the two-stage approach) are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
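The two-stage approach mentioned above fits a dose-response curve within each study and then pools the study-specific coefficients. For the simplest case, a linear trend in log relative risk through the reference dose, a sketch (ignoring the within-study covariance correction of the full Greenland-Longnecker method; names are illustrative):

```python
def study_slope(doses, log_rrs, ses):
    """Stage 1: weighted least-squares slope through the origin for one study.

    doses, log_rrs, ses: non-reference dose levels, their log relative
    risks, and standard errors. Returns (slope, variance of slope).
    """
    w = [1 / s ** 2 for s in ses]
    num = sum(wi * d * y for wi, d, y in zip(w, doses, log_rrs))
    den = sum(wi * d * d for wi, d in zip(w, doses))
    return num / den, 1 / den

def pool_fixed(slopes_and_vars):
    """Stage 2: fixed-effect inverse-variance pooling of study slopes."""
    w = [1 / v for _, v in slopes_and_vars]
    return sum(wi * b for wi, (b, _) in zip(w, slopes_and_vars)) / sum(w)
```

The one-stage approach instead fits a single model to all studies at once, which tends to behave better when individual studies report few dose levels.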
Midwives' clinical reasoning during second stage labour: Report on an interpretive study.
Jefford, Elaine; Fahy, Kathleen
2015-05-01
Clinical reasoning was once thought to be the exclusive domain of medicine, setting it apart from 'non-scientific' occupations like midwifery. Poor assessment, clinical reasoning and decision-making skills are well-known contributors to adverse outcomes in maternity care. Midwifery decision-making models share a common deficit: they are insufficiently detailed to guide reasoning processes for midwives in practice. For these reasons we wanted to explore whether midwives actively engaged in clinical reasoning processes within their clinical practice and, if so, to what extent. The study was conducted using a post-structural, feminist methodology. The research question was: to what extent do midwives engage in clinical reasoning processes when making decisions during second stage labour? Twenty-six practising midwives were interviewed. Feminist interpretive analysis was conducted by two researchers, guided by the steps of a model of the clinical reasoning process. Six narratives were excluded from analysis because they did not sufficiently address the research question. The midwives' narratives were prepared via data reduction, and a theoretically informed analysis and interpretation was conducted. Using a feminist, interpretive approach we created a model of midwifery clinical reasoning grounded in the literature and consistent with the data. Thirteen of the 20 participant narratives demonstrated analytical clinical reasoning abilities, but only nine completed the process and implemented the decision. Seven midwives used non-analytical decision-making without adequately checking against assessment data. Over half of the participants demonstrated the ability to use clinical reasoning skills; less than half demonstrated clinical reasoning as their way of making decisions. The new model of Midwifery Clinical Reasoning includes 'intuition' as a valued way of knowing.
Using intuition, however, should not replace clinical reasoning, which promotes thorough decision-making that can be made transparent and consensually validated. Copyright © 2015 Elsevier Ltd. All rights reserved.
Krauter, Paula; Edwards, Donna; Yang, Lynn; Tucker, Mark
2011-09-01
Decontamination and recovery of a facility or outdoor area after a wide-area biological incident involving a highly persistent agent (e.g., Bacillus anthracis spores) is a complex process that requires extensive information and significant resources, which are likely to be limited, particularly if multiple facilities or areas are affected. This article proposes a systematic methodology for evaluating information to select the decontamination or alternative treatments that optimize use of resources if decontamination is required for the facility or area. The methodology covers a wide range of approaches, including volumetric and surface decontamination, monitored natural attenuation, and seal and abandon strategies. A proposed trade-off analysis can help decision makers understand the relative appropriateness, efficacy, and labor, skill, and cost requirements of the various decontamination methods for the particular facility or area needing treatment, whether alone or as part of a larger decontamination effort. Because the state of decontamination knowledge and technology continues to evolve rapidly, the methodology presented here is designed to accommodate new strategies and materials and changing information.
Modeling Operations Other Than War: Non-Combatants in Combat Modeling
1994-09-01
supposition that non-combatants are an essential feature in OOTW. The model proposal includes a methodology for civilian unit decision making. The model also includes... A numerical example demonstrated that the model appeared to perform in an acceptable manner, in that it produced output within a reasonable range.
A Decision Support Methodology for Space Technology Advocacy.
1984-12-01
determine their parameters. Program control is usually exercised by level-of-effort funding. 63xx is the designator for advanced development programs... designing systems or models that successfully aid the decision-maker. One remedy for this deficiency in the techniques is to increase the... The methodology for use by the Air Force Space Technology Advocate is designed to provide the following features [11:146-147]: meaningful reduction of available
The importance of operations, risk, and cost assessment to space transfer systems design
NASA Technical Reports Server (NTRS)
Ball, J. M.; Komerska, R. J.; Rowell, L. F.
1992-01-01
This paper examines several methodologies which contribute to comprehensive subsystem cost estimation. The example of a space-based lunar space transfer vehicle (STV) design is used to illustrate how including both primary and secondary factors into cost affects the decision of whether to use aerobraking or propulsion for earth orbit capture upon lunar return. The expected dominant cost factor in this decision is earth-to-orbit launch cost driven by STV mass. However, to quantify other significant cost factors, this cost comparison included a risk analysis to identify development and testing costs, a Taguchi design of experiments to determine a minimum mass aerobrake design, and a detailed operations analysis. As a result, the predicted cost advantage of aerobraking, while still positive, was subsequently reduced by about 30 percent compared to the simpler mass-based cost estimates.
Kwajalein Infrastructure Prioritization Methodology
2012-07-01
Kwajalein are falling apart and, if not fixed, they could hinder or ruin the base's ability to execute its mission. The proposed model ranks different ... their perspectives. Multiple Objective Decision Analysis (MODA) was conducted to compare the different value measures together. Since each value ... measure is rated differently, it would be difficult to compare them to one another if there were no way to bring them under one type of measurement or unit
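The normalisation problem described above, value measures rated on different scales, is typically handled in MODA by rescaling each measure to a common 0-1 value scale before taking a weighted sum. A minimal additive sketch (the linear value functions and all names below are illustrative assumptions):

```python
def moda_score(raw, worst, best, weights):
    """Additive multiple-objective decision analysis score.

    raw    : raw ratings of one alternative on each value measure
    worst  : raw level mapped to value 0 for each measure
    best   : raw level mapped to value 1 for each measure
    weights: swing weights summing to 1
    """
    values = [(r - lo) / (hi - lo) for r, lo, hi in zip(raw, worst, best)]
    return sum(w * v for w, v in zip(weights, values))
```

Ranking infrastructure projects by this score is what lets measures in dollars, downtime, and mission impact be compared on one footing.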
Scientific Measure of Africa's Connectivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zennaro, M.; Canessa, E.; Sreenivasan, K.R.
2006-04-24
Data on Internet performance and the analysis of its trend can be useful for decision makers and scientists alike. Such performance measurements are possible using the PingER methodology. We use the data thus obtained to quantify the difference in performance between developed and developing countries, sometimes referred to as the Digital Divide. Motivated by the recent interest of G8 countries in African development, we particularly focus on the African countries.
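PingER-style monitoring derives its performance indicators from batches of ping probes, chiefly packet loss and round-trip time. A sketch of the summary step (the probe collection itself, ICMP pings to remote hosts, is omitted; using `None` to mark a lost probe is an illustrative convention):

```python
def ping_summary(rtts_ms):
    """Summarise one batch of ping probes: packet loss and median RTT.

    rtts_ms: round-trip times in milliseconds, with None for lost probes.
    """
    sent = len(rtts_ms)
    ok = sorted(r for r in rtts_ms if r is not None)
    loss = 1 - len(ok) / sent
    mid = len(ok) // 2
    median = ok[mid] if len(ok) % 2 else (ok[mid - 1] + ok[mid]) / 2
    return {"loss": loss, "median_ms": median}
```

Comparing such summaries between hosts in developed and developing regions is what quantifies the Digital Divide discussed above.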
Near-term hybrid vehicle program, phase 1. Appendix C: Preliminary design data package
NASA Technical Reports Server (NTRS)
1979-01-01
The design methodology, the design decision rationale, the vehicle preliminary design summary, and the advanced technology developments are presented. The detailed vehicle design, the vehicle ride and handling and front structural crashworthiness analysis, the microcomputer control of the propulsion system, the design study of the battery switching circuit, the field chopper, and the battery charger, and the recent program refinements and computer results are presented.
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background: Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective: To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results: The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion: The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
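The molecular-fingerprint comparison underlying this kind of method is typically scored with the Tanimoto (Jaccard) coefficient over each molecule's set of on bits. A minimal sketch (representing fingerprints as Python sets of bit indices is an illustrative simplification):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two fingerprints given as sets of on-bit indices."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)
```

Drugs whose fingerprints score highly against drugs already linked to an ADE become candidates for signal enhancement, as in the rhabdomyolysis example above.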
Decision Making, Models of Mind, and the New Cognitive Science.
ERIC Educational Resources Information Center
Evers, Colin W.
1998-01-01
Explores implications for understanding educational decision making from a cognitive science perspective. Examines three models of mind providing the methodological framework for decision-making studies. The "absent mind" embodies the behaviorist research tradition. The "functionalist mind" underwrites traditional cognitivism…
Preliminary Work Domain Analysis for Human Extravehicular Activity
NASA Technical Reports Server (NTRS)
McGuire, Kerry; Miller, Matthew; Feigh, Karen
2015-01-01
A work domain analysis (WDA) of human extravehicular activity (EVA) is presented in this study. A formative methodology such as Cognitive Work Analysis (CWA) offers a new perspective on the knowledge gained from the past 50 years of living and working in space for the development of future EVA support systems. EVA is a vital component of human spaceflight and provides a case study example of applying a WDA to a complex sociotechnical system. The WDA presented here illustrates how the physical characteristics of the environment, hardware, and life support systems of the domain guide the potential avenues and functional needs of future EVA decision support system development.
Romanelli, Asunción; Massone, Héctor E; Escalante, Alicia H
2011-09-01
This article gives an account of the implementation of a stakeholder analysis framework at La Brava Wetland Basin, Argentina, in a common-pool resource (CPR) management context. Firstly, the context in which the stakeholder framework was implemented is described. Secondly, a four-step methodology is applied: (1) stakeholder identification, (2) stakeholder differentiation-categorization, (3) investigation of stakeholders' relationships, and (4) analysis of social-biophysical interdependencies. This methodology classifies stakeholders according to their level of influence on the system and their potential in the conservation of natural resources. The main influential stakeholders are La Brava Village residents and tourism-related entrepreneurs, who are empowered to make the most important decisions within the planning process of the ecosystem. While these key players are seen as facilitators of change, other groups (residents of the inner basin and fishermen) are seen mainly as key blockers. The stakeholder analysis methodology and the evaluation of social-biophysical interdependencies carried out in this article can serve as an encouraging example for other natural scientists to learn and apply these methods developed in the social sciences. Major difficulties in, and some recommendations for, applying this method in practice by non-experts are discussed.
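Step (2), stakeholder differentiation-categorization, is essentially a grid classification. A minimal sketch, assuming invented 0-1 influence and conservation-potential scores and an illustrative 0.5 threshold (the article's actual scoring is richer):

```python
def categorize(influence, potential):
    """Classify a stakeholder on an influence x conservation-potential grid.
    Labels follow common stakeholder-analysis usage; thresholds are illustrative."""
    hi_inf, hi_pot = influence >= 0.5, potential >= 0.5
    if hi_inf and hi_pot:
        return "key player (potential facilitator of change)"
    if hi_inf:
        return "key blocker"
    if hi_pot:
        return "subject (affected, low influence)"
    return "crowd (monitor only)"

# Invented scores: (influence, conservation potential) in [0, 1].
stakeholders = {
    "La Brava Village residents": (0.9, 0.8),
    "tourism entrepreneurs": (0.8, 0.7),
    "inner-basin residents": (0.7, 0.2),
    "fishermen": (0.6, 0.3),
}
roles = {name: categorize(i, p) for name, (i, p) in stakeholders.items()}
```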
Analytical group decision making in natural resources: methodology and application
Daniel L. Schmoldt; David L. Peterson
2000-01-01
Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups...
NASA Astrophysics Data System (ADS)
Gupta, Mahima; Mohanty, B. K.
2017-04-01
In this paper, we have developed a methodology to derive the level of compensation numerically in multiple criteria decision-making (MCDM) problems under a fuzzy environment. The degree of compensation depends on the tranquility and anxiety levels experienced by the decision-maker while taking the decision. Higher tranquility leads to a higher realisation of the compensation, whereas an increased level of anxiety reduces the amount of compensation in the decision process. This work determines the level of tranquility (or anxiety) using the concept of fuzzy sets and its various level sets. The concepts of indexing of fuzzy numbers, risk barriers, and the tranquility level of the decision-maker are used to derive the decision-maker's risk-prone or risk-averse attitude in each criterion. The aggregation of the risk levels in each criterion gives the amount of compensation in the entire MCDM problem. Inclusion of the compensation leads us to model the MCDM problem as a binary integer programming (BIP) problem. The solution to the BIP gives the compensatory decision for the MCDM problem. The proposed methodology is illustrated through a numerical example.
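The role of compensation can be illustrated with a toy model. Here the fuzzy tranquility derivation is replaced by a single illustrative parameter gamma in [0, 1], and the BIP is solved by brute force over the binary selection variables; this is a sketch of the idea, not the paper's formulation.

```python
from itertools import product

def compensated_score(scores, gamma):
    """Aggregate criterion scores with compensation level gamma in [0, 1]:
    gamma=0 -> strict minimum (no compensation, anxious decision-maker),
    gamma=1 -> arithmetic mean (full compensation, tranquil decision-maker).
    This interpolation is an illustrative stand-in for the fuzzy derivation."""
    return (1 - gamma) * min(scores) + gamma * (sum(scores) / len(scores))

def solve_bip(alternatives, gamma):
    """Tiny 0-1 program: pick exactly one alternative maximising the score."""
    best_i, best_val = None, float("-inf")
    for x in product((0, 1), repeat=len(alternatives)):
        if sum(x) != 1:          # feasibility: exactly one alternative chosen
            continue
        i = x.index(1)
        val = compensated_score(alternatives[i], gamma)
        if val > best_val:
            best_i, best_val = i, val
    return best_i, best_val

alts = [(0.9, 0.2, 0.8),   # strong overall but with one weak criterion
        (0.6, 0.6, 0.6)]   # balanced
```

With no compensation the balanced alternative wins on its worst criterion; with full compensation the strong-but-uneven one wins, which is the behaviour the tranquility level controls.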
Hess, Erik P; Wells, George A; Jaffe, Allan; Stiell, Ian G
2008-01-01
Background Chest pain is the second most common chief complaint in North American emergency departments. Data from the U.S. suggest that 2.1% of patients with acute myocardial infarction and 2.3% of patients with unstable angina are misdiagnosed, with slightly higher rates reported in a recent Canadian study (4.6% and 6.4%, respectively). Information obtained from the history, 12-lead ECG, and a single set of cardiac enzymes cannot identify, with sufficient sensitivity, patients who are safe for early discharge. The 2007 ACC/AHA guidelines for UA/NSTEMI do not identify patients at low risk for adverse cardiac events who can be safely discharged without provocative testing. As a result, large numbers of low-risk patients are triaged to chest pain observation units and undergo provocative testing, at significant cost to the healthcare system. Clinical decision rules use clinical findings (history, physical exam, test results) to suggest a diagnostic or therapeutic course of action. Currently no methodologically robust clinical decision rule identifies patients safe for early discharge. Methods/design The goal of this study is to derive a clinical decision rule which will allow emergency physicians to accurately identify patients with chest pain who are safe for early discharge. The study will utilize a prospective cohort design. Standardized clinical variables will be collected on all patients at least 25 years of age complaining of chest pain prior to provocative testing. Variables strongly associated with the composite outcome of acute myocardial infarction, revascularization, or death will be further analyzed with multivariable analysis to derive the clinical rule. 
Specific aims are to: i) apply standardized clinical assessments to patients with chest pain, incorporating results of early cardiac testing; ii) determine the inter-observer reliability of the clinical information; iii) determine the statistical association between the clinical findings and the composite outcome; and iv) use multivariable analysis to derive a highly sensitive clinical decision rule to guide triage decisions. Discussion The study will derive a highly sensitive clinical decision rule to identify low risk patients safe for early discharge. This will improve patient care, lower healthcare costs, and enhance flow in our busy and overcrowded emergency departments. PMID:18254973
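The derivation target, a highly sensitive rule, can be made concrete with a small sketch. The cohort records, candidate rule, and cut-offs below are entirely hypothetical; only the sensitivity/specificity bookkeeping reflects standard practice.

```python
def evaluate_rule(rule, cohort):
    """Sensitivity/specificity of a candidate rule. 'outcome' is the composite
    (AMI, revascularization, or death); the rule returns True when the patient
    is flagged high risk (i.e., NOT safe for early discharge)."""
    tp = sum(1 for p in cohort if p["outcome"] and rule(p))
    fn = sum(1 for p in cohort if p["outcome"] and not rule(p))
    tn = sum(1 for p in cohort if not p["outcome"] and not rule(p))
    fp = sum(1 for p in cohort if not p["outcome"] and rule(p))
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens, spec

# Hypothetical rule: flag if ischaemic ECG, raised troponin, or age >= 65.
rule = lambda p: p["ecg_ischaemia"] or p["troponin_raised"] or p["age"] >= 65

# Synthetic cohort records (invented fields and values).
cohort = [
    {"age": 70, "ecg_ischaemia": False, "troponin_raised": False, "outcome": True},
    {"age": 50, "ecg_ischaemia": True,  "troponin_raised": False, "outcome": True},
    {"age": 45, "ecg_ischaemia": False, "troponin_raised": False, "outcome": False},
    {"age": 52, "ecg_ischaemia": False, "troponin_raised": True,  "outcome": False},
]
sens, spec = evaluate_rule(rule, cohort)
```

For a discharge rule, sensitivity is the non-negotiable quantity: a missed outcome (false negative) is a patient sent home who later has an adverse cardiac event.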
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraile-Garcia, Esteban, E-mail: esteban.fraile@unirioja.es; Ferreiro-Cabello, Javier, E-mail: javier.ferreiro@unirioja.es; Qualiberica S.L.
The European Committee for Standardization (CEN), through its Technical Committee CEN/TC-350, is developing a series of standards for assessing building sustainability at both the product and building levels. The practical selection (decision making) of structural alternatives made with one-way slabs sits at an intermediate level between the product and the building. The present study therefore addresses this decision-making problem, following the CEN guidelines and incorporating relevant aspects of architectural design into residential construction. A life cycle assessment (LCA) is developed in order to obtain valid information for the decision-making process (the LCA applied the CML methodology, although Ecoindicator99 was also used in order to facilitate the comparison of the values); this information (the carbon footprint values) is contrasted with other databases and with the information from the Environmental Product Declaration (EPD) of one of the lightening materials (expanded polystyrene), in order to validate the results. Solutions of different column dispositions and geometries are evaluated against the three pillars of sustainable residential construction: social, economic, and environmental. The quantitative analysis of the variables used in this study enables and facilitates an objective comparison in the design stage by the responsible technician; applying the proposed methodology reduces the solutions the expert must evaluate to 12.22% of the options for low values of the column index and to 26.67% for the highest values. - Highlights: • Methodology for selection of structural alternatives in buildings with one-way slabs • Adapted to CEN guidelines (CEN/TC-350) for assessing building sustainability • LCA is developed in order to obtain valid information for the decision making process. • Results validated comparing carbon footprint, databases and Environmental Product Declarations • The proposal reduces the solutions to be evaluated to between 12.22% and 26.67%.
Zeeman, Heidi; Kendall, Elizabeth; Whitty, Jennifer A; Wright, Courtney J; Townsend, Clare; Smith, Dianne; Lakhani, Ali; Kennerley, Samantha
2016-03-15
Identifying the housing preferences of people with complex disabilities is a much needed, but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services etc.) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing Analytical Hierarchical Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision making processes to inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. 
It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much needed blueprint for market investment to develop viable, consumer directed inclusive housing options for people with complex disability.
Santatiwongchai, Benjarin; Chantarastapornchit, Varit; Wilkinson, Thomas; Thiboonboon, Kittiphong; Rattanavipapong, Waranya; Walker, Damian G; Chalkidou, Kalipso; Teerawattananon, Yot
2015-01-01
Information generated from economic evaluation is increasingly being used to inform health resource allocation decisions globally, including in low- and middle- income countries. However, a crucial consideration for users of the information at a policy level, e.g. funding agencies, is whether the studies are comparable, provide sufficient detail to inform policy decision making, and incorporate inputs from data sources that are reliable and relevant to the context. This review was conducted to inform a methodological standardisation workstream at the Bill and Melinda Gates Foundation (BMGF) and assesses BMGF-funded cost-per-DALY economic evaluations in four programme areas (malaria, tuberculosis, HIV/AIDS and vaccines) in terms of variation in methodology, use of evidence, and quality of reporting. The findings suggest that there is room for improvement in the three areas of assessment, and support the case for the introduction of a standardised methodology or reference case by the BMGF. The findings are also instructive for all institutions that fund economic evaluations in LMICs and who have a desire to improve the ability of economic evaluations to inform resource allocation decisions.
Is the relationship between pattern recall and decision-making influenced by anticipatory recall?
Gorman, Adam D; Abernethy, Bruce; Farrow, Damian
2013-01-01
The present study compared traditional measures of pattern recall with measures of anticipatory recall and decision-making to examine the underlying mechanisms of expert pattern perception and to address methodological limitations in previous studies, where anticipatory recall has generally been overlooked. Recall performance in expert and novice basketball players was measured by examining the spatial error in recalling player positions both for a target image (traditional recall) and at 40-ms increments following the target image (anticipatory recall). Decision-making performance was measured by comparing the participants' responses with those identified by a panel of expert coaches. Anticipatory recall was observed in the recall task and was significantly more pronounced for the experts, suggesting that traditional methods of spatial recall analysis may not have provided a completely accurate determination of the full magnitude of the experts' superiority. Accounting for anticipatory recall also increased the relative contribution of recall skill to decision-making accuracy, although the gains in explained variance were modest and of debatable functional significance.
[Clinical reasoning in undergraduate nursing education: a scoping review].
Menezes, Sáskia Sampaio Cipriano de; Corrêa, Consuelo Garcia; Silva, Rita de Cássia Gengo E; Cruz, Diná de Almeida Monteiro Lopes da
2015-12-01
This study aimed at analyzing the current state of knowledge on clinical reasoning in undergraduate nursing education. A systematic scoping review was conducted through a search strategy applied to the MEDLINE database; data from the retrieved material were extracted by two independent reviewers, then analyzed and synthesized in a narrative manner. From the 1380 citations retrieved in the search, 23 were kept for review and their contents were summarized into five categories: 1) the experience of developing critical thinking/clinical reasoning/decision-making process; 2) teaching strategies related to the development of critical thinking/clinical reasoning/decision-making process; 3) measurement of variables related to the critical thinking/clinical reasoning/decision-making process; 4) relationship of variables involved in the critical thinking/clinical reasoning/decision-making process; and 5) theoretical development models of critical thinking/clinical reasoning/decision-making process for students. The biggest challenge for developing knowledge on teaching clinical reasoning seems to be finding consistency between theoretical perspectives on the development of clinical reasoning and the methodologies, methods, and procedures used in research initiatives in this field.
Oliva, Juan; Brosa, Max; Espín, Jaime; Figueras, Montserrat; Trapero, Marta
2015-01-01
Economic evaluation of health care interventions has grown strongly over the past decade and is increasingly used as a support tool in decisions on the public funding and pricing of health services in European countries. A necessary condition for using such evaluations is that the agents performing them share minimum, agreed rules on methodological aspects. Although there are methodological issues on which there is a high degree of consensus, there are others on which no such agreement exists, either because they lie closer to the normative field or because they have experienced significant methodological advances in recent years. In this first article of a series of three, we discuss the perspective of the analysis and the assessment of costs in the economic evaluation of health interventions, using the Metaplan technique. Finally, research lines are proposed to overcome the identified discrepancies.
NASA Astrophysics Data System (ADS)
Polatidis, Heracles; Morales, Jan Borràs
2016-11-01
In this paper a methodological framework for increasing the actual implementation potential of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via particular related software or are comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that will exhibit increased actual implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The design of each of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
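The two processes can be illustrated together in a toy example: a parity-space residual from duplicated sensors, followed by a sequential decision rule. The SPRT below is a classical stand-in for the thesis's Bayes sequential formulation (the suboptimal-rule design is not reproduced), and all numbers are illustrative.

```python
def residual(y1, y2):
    """Simplest parity relation for duplicated sensors: r = y1 - y2 is
    zero-mean noise when both are healthy, biased when one has failed."""
    return y1 - y2

def sprt(residuals, mu1=1.0, sigma=0.5, a=-4.0, b=4.0):
    """Sequential probability ratio test: accumulate the log-likelihood ratio
    of 'failed' (residual mean mu1) vs 'healthy' (mean 0), Gaussian noise with
    known sigma, and declare a decision when a threshold is crossed."""
    llr = 0.0
    for t, r in enumerate(residuals, 1):
        llr += (mu1 / sigma**2) * (r - mu1 / 2.0)
        if llr >= b:
            return "failure", t
        if llr <= a:
            return "healthy", t
    return "undecided", len(residuals)

# Deterministic toy residual sequences (illustrative, not simulated noise).
healthy_r = [0.1, -0.2, 0.0, 0.1, -0.1]
faulty_r  = [0.9, 1.1, 1.0, 0.8, 1.2]
verdict_h = sprt(healthy_r)
verdict_f = sprt(faulty_r)
```

The thresholds a and b trade decision delay against false-alarm and missed-detection rates, which is exactly the tension the Bayes sequential formulation makes explicit.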
A Decision Making Methodology in Support of the Business Rules Lifecycle
NASA Technical Reports Server (NTRS)
Wild, Christopher; Rosca, Daniela
1998-01-01
The business rules that underlie an enterprise emerge as a new category of system requirements that represent decisions about how to run the business, and which are characterized by their business-orientation and their propensity for change. In this report, we introduce a decision making methodology which addresses several aspects of the business rules lifecycle: acquisition, deployment and evolution. We describe a meta-model for representing business rules in terms of an enterprise model, and also a decision support submodel for reasoning about and deriving the rules. The possibility for lifecycle automated assistance is demonstrated in terms of the automatic extraction of business rules from the decision structure. A system based on the metamodel has been implemented, including the extraction algorithm. This is the final report for Daniela Rosca's PhD fellowship. It describes the work we have done over the past year, current research and the list of publications associated with her thesis topic.
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed at a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process that will be used to verify analytical data generated throughout the test period; identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed; and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Georgakis, D. Christine; Trace, David A.; Naeymi-Rad, Frank; Evens, Martha
1990-01-01
Medical expert systems require comprehensive evaluation of their diagnostic accuracy. The usefulness of these systems is limited without established evaluation methods. We propose a new methodology for evaluating the diagnostic accuracy and the predictive capacity of a medical expert system. We have adapted to the medical domain measures that have been used in the social sciences to examine the performance of human experts in the decision making process. Thus, in addition to the standard summary measures, we use measures of agreement and disagreement, and Goodman and Kruskal's λ and τ measures of predictive association. This methodology is illustrated by a detailed retrospective evaluation of the diagnostic accuracy of the MEDAS system. In a study using 270 patients admitted to the North Chicago Veterans Administration Hospital, diagnoses produced by MEDAS are compared with the discharge diagnoses of the attending physicians. The results of the analysis confirm the high diagnostic accuracy and predictive capacity of the MEDAS system. Overall, the agreement of the MEDAS system with the “gold standard” diagnosis of the attending physician has reached a 90% level.
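Goodman and Kruskal's λ, one of the predictive-association measures mentioned above, is simple to compute from a cross-classification table. The 2x2 table below is illustrative, not the MEDAS study data.

```python
def goodman_kruskal_lambda(table):
    """Goodman and Kruskal's lambda: proportional reduction in error when
    predicting the column category (e.g., the physician's diagnosis) from
    the row category (e.g., the expert system's diagnosis)."""
    n = sum(sum(row) for row in table)
    col_totals = [sum(col) for col in zip(*table)]
    max_col = max(col_totals)                   # errors of the best blind guess
    sum_row_max = sum(max(row) for row in table)
    return (sum_row_max - max_col) / (n - max_col)

# Illustrative agreement table (rows: system diagnosis, cols: physician diagnosis).
table = [[40, 5],    # system says A
         [10, 45]]   # system says B
lam = goodman_kruskal_lambda(table)
```

Here λ = 0.7: knowing the system's diagnosis reduces the error in predicting the physician's diagnosis by 70% relative to always guessing the modal category.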
Assessing School Readiness for a Practice Arrangement Using Decision Tree Methodology.
ERIC Educational Resources Information Center
Barger, Sara E.
1998-01-01
Questions in a decision-tree address mission, faculty interest, administrative support, and practice plan as a way of assessing arrangements for nursing faculty's clinical practice. Decisions should be based on congruence between the human resource allocation and the reward systems. (SK)
A trainable decisions-in decision-out (DEI-DEO) fusion system
NASA Astrophysics Data System (ADS)
Dasarathy, Belur V.
1998-03-01
Most of the decision fusion systems proposed hitherto in the literature for multiple data source (sensor) environments operate on the basis of pre-defined fusion logic, be they crisp (deterministic), probabilistic, or fuzzy in nature, with no specific learning phase. The fusion systems that are trainable, i.e., ones that have a learning phase, mostly operate in the features-in-decision-out mode, which essentially reduces the fusion process functionally to a pattern classification task in the joint feature space. In this study, a trainable decisions-in-decision-out fusion system is described which estimates a fuzzy membership distribution spread across the different decision choices based on the performance of the different decision processors (sensors) corresponding to each training sample (object) which is associated with a specific ground truth (true decision). Based on a multi-decision space histogram analysis of the performance of the different processors over the entire training data set, a look-up table associating each cell of the histogram with a specific true decision is generated which forms the basis for the operational phase. In the operational phase, for each set of decision inputs, a pointer to the look-up table learnt previously is generated from which a fused decision is derived. This methodology, although primarily designed for fusing crisp decisions from the multiple decision sources, can be adapted for fusion of fuzzy decisions as well if such are the inputs from these sources. Examples, which illustrate the benefits and limitations of the crisp and fuzzy versions of the trainable fusion systems, are also included.
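The training and operational phases described above reduce to building and then consulting a histogram-backed look-up table. A minimal sketch with made-up decision labels:

```python
from collections import Counter, defaultdict

def train_fusion(samples):
    """Training phase: each cell of the multi-decision-space histogram is a
    tuple of the local processors' decisions; the cell is labelled with the
    majority ground truth observed there over the training set."""
    hist = defaultdict(Counter)
    for local_decisions, truth in samples:
        hist[tuple(local_decisions)][truth] += 1
    return {cell: counts.most_common(1)[0][0] for cell, counts in hist.items()}

def fuse(table, local_decisions, default=None):
    """Operational phase: the decisions-in vector is a pointer into the learnt
    look-up table; cells never seen in training fall back to a default."""
    return table.get(tuple(local_decisions), default)

# Toy training set: (decisions from 3 processors, ground truth) - invented labels.
train = [
    (("tank", "tank", "truck"), "tank"),
    (("tank", "tank", "truck"), "tank"),
    (("truck", "tank", "truck"), "truck"),
    (("truck", "truck", "truck"), "truck"),
]
table = train_fusion(train)
fused = fuse(table, ("tank", "tank", "truck"))
unknown = fuse(table, ("jeep", "jeep", "jeep"), default="no decision")
```

The fuzzy variant described in the paper would store a membership distribution per cell rather than a single majority label; the crisp sketch above shows only the table mechanics.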
Novel methodology for pharmaceutical expenditure forecast
Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher
2014-01-01
Background and objective The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical expenditure forecast’; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. 
Results This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. Conclusions This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making. PMID:27226843
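Steps 2) and 5) can be sketched as a small Monte Carlo exercise: a simplified off-patent savings model with uncertain discount, penetration, and brand-price-erosion inputs drawn from invented ranges (the EU forecast's actual model is far more detailed).

```python
import random

def patent_expiry_savings(brand_sales, discount, penetration, brand_price_drop):
    """Yearly savings when a product goes off patent: units switched to the
    generic saved at the discount, plus price erosion on remaining brand units
    (a deliberately simplified model)."""
    generic = brand_sales * penetration * discount
    brand = brand_sales * (1 - penetration) * brand_price_drop
    return generic + brand

def probabilistic_forecast(brand_sales, n=10_000, seed=1):
    """Probabilistic sensitivity analysis: draw the uncertain policy inputs
    from plausible (invented) ranges; report the mean and a 90% interval."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        discount = rng.uniform(0.3, 0.7)       # generic price discount
        penetration = rng.uniform(0.4, 0.9)    # generic uptake
        drop = rng.uniform(0.0, 0.3)           # originator price erosion
        draws.append(patent_expiry_savings(brand_sales, discount, penetration, drop))
    draws.sort()
    return sum(draws) / n, (draws[int(0.05 * n)], draws[int(0.95 * n)])

mean, (lo, hi) = probabilistic_forecast(100.0)   # originator sales, EUR million
```

Reporting the interval rather than a point estimate is what lets a policy decision be tested for robustness against the input uncertainty.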
ERIC Educational Resources Information Center
Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune
2000-01-01
Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)
Thermodynamic analysis of steam-injected advanced gas turbine cycles
NASA Astrophysics Data System (ADS)
Pandey, Devendra; Bade, Mukund H.
2017-12-01
This paper deals with the thermodynamic analysis of the steam-injected gas turbine (STIGT) cycle. To analyse the thermodynamic performance of STIGT cycles, a graphical methodology based on pinch analysis is proposed as a systematic approach to selecting a gas turbine with steam injection. The developed graphs are useful for selecting a STIGT for optimal operation and help designers make appropriate decisions. Depending on the plant designer's objective, the STIGT cycle can be selected either at the minimum steam ratio (ratio of mass flow rate of steam to air) with maximum efficiency or at the maximum steam ratio with maximum net work. Operating the steam-injection-based advanced gas turbine plant at the minimum steam ratio improves efficiency, reducing the pollution caused by the emission of flue gases. On the other hand, operating the plant at the maximum steam ratio yields the maximum work output and hence higher available power.
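The stated selection rule lends itself to a few lines of code. The (steam ratio, efficiency, net work) points below are invented placeholders, not a real STIGT characteristic; only the selection logic follows the text.

```python
def select_operating_point(curve, objective="efficiency"):
    """Pick a steam-injection operating point from a (steam_ratio, efficiency,
    net_work) characteristic, per the selection rule in the text: minimum
    steam ratio at maximum efficiency, or maximum steam ratio at maximum
    net work, depending on the designer's objective."""
    if objective == "efficiency":
        return max(curve, key=lambda p: (p[1], -p[0]))  # max eta, tie -> min ratio
    return max(curve, key=lambda p: (p[2], p[0]))       # max work, tie -> max ratio

# (steam ratio, cycle efficiency, specific net work kJ/kg) - illustrative only.
curve = [(0.00, 0.34, 360.0),
         (0.05, 0.38, 410.0),
         (0.10, 0.38, 450.0),
         (0.15, 0.37, 480.0)]
eff_point = select_operating_point(curve, "efficiency")
work_point = select_operating_point(curve, "net_work")
```

Note the tie-break on the efficiency plateau: of two equally efficient points, the lower steam ratio is preferred, matching the "minimum steam ratio with maximum efficiency" rule.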
Khakzad, Nima; Landucci, Gabriele; Reniers, Genserik
2017-09-01
In the present study, we have introduced a methodology based on graph theory and multicriteria decision analysis for cost-effective fire protection of chemical plants subject to fire-induced domino effects. By modeling domino effects in chemical plants as a directed graph, the graph centrality measures such as out-closeness and betweenness scores can be used to identify the installations playing a key role in initiating and propagating potential domino effects. It is demonstrated that active fire protection of installations with the highest out-closeness score and passive fire protection of installations with the highest betweenness score are the most effective strategies for reducing the vulnerability of chemical plants to fire-induced domino effects. We have employed a dynamic graph analysis to investigate the impact of both the availability and the degradation of fire protection measures over time on the vulnerability of chemical plants. The results obtained from the graph analysis can further be prioritized using multicriteria decision analysis techniques such as the method of reference point to find the most cost-effective fire protection strategy. © 2016 Society for Risk Analysis.
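The two centrality measures can be sketched on a toy escalation graph in pure Python: edges point from an installation whose fire could trigger escalation to the installation it threatens. The five-unit plant below is invented; the point illustrated is that the best initiator candidate (highest out-closeness, a target for active protection) and the best propagator candidate (highest betweenness, a target for passive protection) need not coincide.

```python
from collections import deque

def out_closeness(graph):
    """Out-closeness of each node in a directed escalation graph, with the
    Wasserman-Faust scaling so nodes reaching only part of the graph compare
    fairly. High scorers can quickly reach many units: initiator candidates."""
    n = len(graph)
    scores = {}
    for s in graph:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        reached, total = len(dist) - 1, sum(dist.values())
        scores[s] = (reached / (n - 1)) * (reached / total) if total else 0.0
    return scores

def betweenness(graph):
    """Brandes' betweenness for a directed unweighted graph. High scorers lie
    on many shortest escalation paths: propagator candidates."""
    bc = dict.fromkeys(graph, 0.0)
    for s in graph:
        stack, preds = [], {v: [] for v in graph}
        sigma = dict.fromkeys(graph, 0.0); sigma[s] = 1.0
        dist = dict.fromkeys(graph, -1); dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft(); stack.append(u)
            for v in graph[u]:
                if dist[v] < 0:
                    dist[v] = dist[u] + 1; q.append(v)
                if dist[v] == dist[u] + 1:
                    sigma[v] += sigma[u]; preds[v].append(u)
        delta = dict.fromkeys(graph, 0.0)
        while stack:
            w = stack.pop()
            for u in preds[w]:
                delta[u] += sigma[u] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# Invented plant: T1 can escalate to T2 and T3; everything funnels through T3.
plant = {"T1": ["T2", "T3"], "T2": ["T3"], "T3": ["T4", "T5"], "T4": [], "T5": []}
oc = out_closeness(plant)
bc = betweenness(plant)
```

In this toy plant T1 scores highest on out-closeness (active protection candidate) while the hub T3 scores highest on betweenness (passive protection candidate), mirroring the strategy split described in the abstract.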
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select the input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed on Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure to handle input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
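The entropy-based variable selection of the first phase can be sketched in crisp form (the paper's imprecise/interval version and the acceptability index are not reproduced here); the firm data are invented.

```python
import math

def entropy_weights(matrix):
    """Shannon-entropy weighting of candidate variables: columns whose values
    vary more across firms carry more information and receive larger weights.
    Crisp simplification of the imprecise (interval) entropy method."""
    m, n = len(matrix), len(matrix[0])
    entropies = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        probs = [x / total for x in col]
        e = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
        entropies.append(e)
    divergence = [1 - e for e in entropies]   # 1 - entropy = information content
    s = sum(divergence)
    return [d / s for d in divergence]

# Rows: firms; columns: candidate DEA variables (invented values).
# Column 0 is identical across firms, so it carries no discriminating power.
data = [[10.0, 5.0],
        [10.0, 1.0],
        [10.0, 9.0]]
w = entropy_weights(data)
```

A variable with (near-)zero weight, like the constant first column here, is a candidate for exclusion from the DEA input/output set.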
Evaluative methodology for prioritizing transportation energy conservation strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pang, L.M.G.
An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model, which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, its ease of use, and its favorable reception by decision makers.
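The multiattribute utility core of the strategy evaluation model can be sketched as follows; the attributes, single-attribute utility shapes, weights, and strategy impacts are all invented for illustration.

```python
def strategy_utility(impacts, weights, utils):
    """Additive multiattribute utility of one TEC strategy: each impact is
    mapped to a 0-1 single-attribute utility, then combined with the
    decision-maker's weights (which sum to 1)."""
    return sum(weights[a] * utils[a](impacts[a]) for a in impacts)

# Illustrative attributes: energy saved (TJ/yr), capital cost (M$),
# public acceptance (0-10). Linear utility scalings, invented ranges.
utils = {
    "energy": lambda x: min(x / 50.0, 1.0),       # more savings is better
    "cost": lambda x: max(1.0 - x / 20.0, 0.0),   # cheaper is better
    "acceptance": lambda x: x / 10.0,
}
weights = {"energy": 0.5, "cost": 0.3, "acceptance": 0.2}  # elicited preferences

strategies = {
    "carpool matching":  {"energy": 10.0, "cost": 1.0,  "acceptance": 8.0},
    "transit expansion": {"energy": 45.0, "cost": 18.0, "acceptance": 6.0},
}
ranked = sorted(strategies,
                key=lambda s: strategy_utility(strategies[s], weights, utils),
                reverse=True)
```

Combining a quantitative impact (energy) with qualitative ones (acceptance) on a common 0-1 utility scale is exactly the compatibility concept the model calls for.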
Modeling new coal projects: supercritical or subcritical?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrino, A.J.; Jones, R.B.
Decisions on new-build coal-fired plants are driven by several factors: emissions, fuel logistics, and electric transmission access all impose constraints. The crucial economic decision of whether to build supercritical or subcritical units often depends on assumptions concerning the reliability/availability of each technology, the cost of non-fuel operations including maintenance, the generation efficiencies, and the potential for emissions credits at some future value. Modeling the influence of these key factors requires analysis and documentation to assure that the assets actually meet the projected financial performance. This article addresses some of the issues related to the trade-offs that may be driven by the supercritical/subcritical decision. Solomon Associates has been collecting cost, generation, and reliability data on coal-fired power generation assets for approximately 10 years, using a strict methodology and taxonomy to categorize and compare actual plant operations data. This database provides validated information not only on performance but also on alternative performance scenarios, which can provide useful insights for the pro forma financial analysis and models of new plants. 1 ref., 1 fig., 3 tabs.
NASA Astrophysics Data System (ADS)
Guttikunda, S. K.; Johnson, T. M.; Procee, P.
2004-12-01
Fossil fuel combustion for domestic cooking and heating, power generation, industrial processes, and motor vehicles is the primary source of air pollution in developing-country cities. Over the past twenty years, major advances have been made in understanding the social and economic consequences of air pollution. In both industrialized and developing countries, air pollution from energy combustion has been shown to have detrimental impacts on human health and the environment. Lack of information on sectoral contributions to air pollution, especially fine particulates, is a typical constraint on an effective integrated urban air quality management program. Without such information it is difficult, if not impossible, for decision makers to provide policy advice and make informed investment decisions related to air quality improvements in developing countries. This also raises the need for low-cost ways of determining the principal sources of fine PM for proper planning and decision-making. The project objective is to develop and verify a methodology to assess and monitor the sources of PM using a combination of ground-based monitoring and source apportionment techniques. This presentation focuses on four general tasks: (1) review of the science and current activities in the combined use of monitoring data and modeling for better understanding of PM pollution; (2) review of recent advances in atmospheric source apportionment techniques (e.g., principal component analysis, organic markers, source-receptor modeling techniques); (3) development of a general methodology for using integrated top-down and bottom-up datasets; and (4) review of a series of current case studies from Africa, Asia, and Latin America and the methodologies applied to assess air pollution and its sources.
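Of the source apportionment techniques in task (2), principal component analysis is the simplest to sketch: with receptor data in which several measured species co-vary because they share one emission source, the first component absorbs most of the variance. All data below are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic receptor data: rows = daily samples, columns = PM species
source = rng.normal(5.0, 1.0, 200)          # strength of one shared source
X = np.column_stack([source * f + rng.normal(0.0, 0.1, 200)
                     for f in (1.0, 0.8, 0.5)])   # species emission profile

Xc = X - X.mean(axis=0)                     # center each species
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)             # variance share per component
loadings = Vt[0]                            # species profile of component 1
```

In a receptor-modeling study, the dominant components (and their loadings) are then matched against known source profiles such as traffic, biomass burning, or industry.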
Factors Which Influence The Fish Purchasing Decision: A study on Traditional Market in Riau Mainland
NASA Astrophysics Data System (ADS)
Siswati, Latifa; Putri, Asgami
2018-05-01
The purpose of this research is to analyze and assess the factors that influence fish purchasing by the community in Tenayan Raya district, Pekanbaru. The research methodology used is the survey method, specifically interviews and observation (direct supervision) of markets located in Tenayan Raya district. The sampling location was determined by purposive sampling, and respondents were selected by accidental sampling. Factor analysis was applied to data derived from respondents' opinions on various fish-related variables. The results show that the factors influencing the fish purchasing decision in traditional markets in Tenayan Raya district are product factors, price factors, social factors, and individual factors. Product factors influencing the fish purchasing decision include the condition of the eyes, the nutrition of fresh fish, and the diversity of fish sold. Price factors include the price of fresh fish, a convincing price, and the match between the price and benefits of the fresh fish. Individual factors include education and income levels. Social factors include family, colleagues, and fish-eating habits.
Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz
2017-12-29
Article summaries' information and structure may influence researchers' and clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MAs) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MAs about psoriasis were undertaken in MEDLINE, EMBASE, and the Cochrane database. For each review, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using the Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low bias risk showed higher total PRISMA-A values than reviews with high bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95% CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95% CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95% CI: 1.785-10.98) predicted PRISMA-A variability.
Reviews with a total PRISMA-A score < 6, lacking identification as an SR or MA in the title, and lacking an explanation of the bias risk assessment methods were classified as having low methodological quality. Abstracts with a total PRISMA-A score ≥ 9 that included main outcome results and an explanation of the bias risk assessment method were classified as having low bias risk. The methodological quality and bias risk of SRs may thus be determined by analyzing the quality and completeness of their abstracts. Our proposal aims to facilitate the evaluation of synthesized evidence by clinical professionals lacking methodological skills. External validation is necessary.
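The two reported classification rules amount to a simple abstract-triage function. The sketch below is a simplified reading: the thresholds come from the text, but the remaining branches of the decision trees are collapsed into a default.

```python
def triage_review(prisma_a_total, titled_as_sr_or_ma,
                  reports_main_outcomes, explains_bias_method):
    """Classify a review from its abstract alone.

    Collapses the study's two decision trees into the headline rules:
    a sparse abstract signals low methodological quality, a complete
    one signals low bias risk; everything else needs full-text review.
    """
    if (prisma_a_total < 6 and not titled_as_sr_or_ma
            and not explains_bias_method):
        return "low methodological quality"
    if (prisma_a_total >= 9 and reports_main_outcomes
            and explains_bias_method):
        return "likely low bias risk"
    return "needs full-text appraisal"
```

A reader screening search results could apply such a rule before committing to a full AMSTAR or ROBIS appraisal.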
Kallio, Hanna; Pietilä, Anna-Maija; Johnson, Martin; Kangasniemi, Mari
2016-12-01
To produce a framework for the development of a qualitative semi-structured interview guide. Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Systematic methodological review. We searched PubMed, CINAHL, Scopus and Web of Science for methodological papers on semi-structured interview guides from October 2004-September 2014. Having examined 2,703 titles and abstracts and 21 full texts, we finally selected 10 papers. We analysed the data using the qualitative content analysis method. Our analysis resulted in new synthesized knowledge on the development of a semi-structured interview guide, including five phases: (1) identifying the prerequisites for using semi-structured interviews; (2) retrieving and using previous knowledge; (3) formulating the preliminary semi-structured interview guide; (4) pilot testing the guide; and (5) presenting the complete semi-structured interview guide. Rigorous development of a qualitative semi-structured interview guide contributes to the objectivity and trustworthiness of studies and makes the results more plausible. Researchers should consider using this five-step process to develop a semi-structured interview guide and justify the decisions made during it. © 2016 John Wiley & Sons Ltd.
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve, which is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
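The empirical hazard curve of the first kind can be sketched directly: each simulated event contributes its annual rate to every intensity threshold it exceeds. The event rate and lognormal depths below are synthetic, not taken from the paper.

```python
import numpy as np

def hazard_curve(depths, annual_rate, thresholds):
    """Empirical tsunami hazard curve from simulated events.

    depths: simulated inundation depth at the site, one per event (m)
    annual_rate: mean annual rate of the sampled tsunamigenic events
    Returns the mean annual rate of exceedance for each threshold.
    """
    depths = np.asarray(depths)
    return np.array([annual_rate * np.mean(depths > t) for t in thresholds])

rng = np.random.default_rng(1)
sims = rng.lognormal(mean=0.0, sigma=0.7, size=5000)    # synthetic depths
curve = hazard_curve(sims, annual_rate=0.1, thresholds=[0.5, 1.0, 2.0, 4.0])
```

The Bayesian "robust" variant replaces this empirical count with a parametric fit to the simulated intensities, which is what permits far fewer simulations for the same hazard estimate.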
Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K
2015-08-01
The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
Heuristic decomposition for non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.; Hajela, P.
1991-01-01
Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.
2016-09-01
Some technologies that were not included in the analysis (due to site-level evaluations), but could be added in the future, include: wind turbines ...number of entities involved in the procurement, operation, maintenance, testing, and fueling of the generators, detailed inventory and cost data are difficult to obtain. The DPW is often understaffed, leading to uneven testing and maintenance of the equipment despite their best efforts.
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, provides a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, the method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate the identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance community as a tool for optimizing the biosurveillance enterprise. To demonstrate its utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
Solid Waste Management Planning--A Methodology
ERIC Educational Resources Information Center
Theisen, Hilary M.; And Others
1975-01-01
This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)
Vasconcelos, A G; Almeida, R M; Nobre, F F
2001-08-01
This paper introduces an approach that includes non-quantitative factors in the selection and assessment of multivariate complex models in health. A goodness-of-fit-based methodology combined with a fuzzy multi-criteria decision-making approach is proposed for model selection. Models were obtained using the Path Analysis (PA) methodology in order to explain the interrelationship between health determinants and the post-neonatal component of infant mortality in 59 municipalities of Brazil in the year 1991. Socioeconomic and demographic factors were used as exogenous variables, and environmental, health service, and agglomeration factors as endogenous variables. Five PA models were developed and accepted by statistical goodness-of-fit criteria. These models were then submitted to a group of experts, seeking to characterize their preferences according to predefined criteria that evaluated model relevance and plausibility. Fuzzy set techniques were used to rank the alternative models according to the number of times a model was superior to ("dominated") the others. The best-ranked model explained over 90% of the variation in the endogenous variables and showed the favorable influences of income and education levels on post-neonatal mortality. It also showed the unfavorable effect on mortality of fast population growth, through precarious dwelling conditions and decreased access to sanitation. It was possible to aggregate expert opinions in model evaluation. The proposed procedure for model selection allowed the inclusion of subjective information in a clear and systematic manner.
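A crisp sketch of the ranking step: once expert scores per criterion are aggregated, each model can be ranked by how many rivals it dominates. The scores below are invented, and the fuzzy-set machinery of the actual procedure is reduced to ordinary comparisons.

```python
def dominance_counts(model_scores):
    """Rank models by counting pairwise dominances.

    model_scores: per-model list of criterion scores.  Model a dominates
    model b if it is at least as good on every criterion and strictly
    better on at least one.
    """
    def dominates(a, b):
        return (all(x >= y for x, y in zip(a, b))
                and any(x > y for x, y in zip(a, b)))
    return [sum(dominates(a, b) for b in model_scores if b is not a)
            for a in model_scores]

# three candidate PA models scored on relevance, plausibility, and fit
counts = dominance_counts([[0.9, 0.8, 0.7],
                           [0.6, 0.8, 0.5],
                           [0.7, 0.9, 0.6]])
```

The model with the highest dominance count would be the best-ranked candidate; ties (as between the first and third models here) are exactly where the fuzzy preference information of the full method earns its keep.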
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and systems-of-systems are involved, the behavior of the whole entity is due not only to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not analyze the impact of dependencies at the level of complex systems; or this analysis involves excessive computational cost, or it occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of explicitly modeling the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies.
The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
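A toy version of the operational-domain idea: a system's achievable operability is capped both by its own internal status and by each feeder system's level, attenuated by the strength of the dependency. The three-system chain and all numbers are invented, the graph is assumed acyclic, and the actual Systems Operational Dependency Analysis model is considerably richer than this sketch.

```python
def operability(internal, graph):
    """Propagate operability (0-100) through an acyclic dependency graph.

    internal: each system's self-status, independent of others
    graph: system -> list of (feeder_system, strength in 0-1); a
    strength near 1 transmits the feeder's degradation almost fully.
    """
    level = {}
    def resolve(s):
        if s not in level:
            caps = [100.0 - k * (100.0 - resolve(f))
                    for f, k in graph.get(s, [])]
            level[s] = min([float(internal[s])] + caps)
        return level[s]
    for s in internal:
        resolve(s)
    return level

internal = {"power": 40, "comms": 100, "science": 100}
graph = {"comms": [("power", 0.8)], "science": [("comms", 0.9)]}
levels = operability(internal, graph)   # power degradation cascades downstream
```

The cascade is visible immediately: degraded power drags comms down, which in turn drags down the science capability, with no time-stepped simulation required, which is the kind of quick what-if exploration the methods are meant to support.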
Rogowski, W H; Grosse, S D; Meyer, E; John, J; Palmer, S
2012-05-01
Public decision makers face demands to invest in applied research in order to accelerate the adoption of new genetic tests. However, such an investment is profitable only if the results gained from further investigations have a significant impact on health care practice. An upper limit for the value of additional information aimed at improving the basis for reimbursement decisions is given by the expected value of perfect information (EVPI). This study illustrates the significance of the EVPI concept using a probabilistic cost-effectiveness model of screening for hereditary hemochromatosis among German men. In the present example, population-based screening can barely be recommended at threshold values of 50,000 or 100,000 Euro per life year gained, and the value of additional research that might cause this decision to be overturned is also small: at these threshold values, the EVPI in the German public health care system was approximately 500,000 and 2,200,000 Euro, respectively. An analysis of EVPI by individual parameters or groups of parameters shows that additional research on adherence to preventive phlebotomy could potentially provide the highest benefit. The potential value of further research also depends on methodological assumptions regarding the decision maker's time horizon, as well as on scenarios affecting the number of affected patients and the cost-effectiveness of screening.
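The EVPI definition is compact enough to compute by simulation: it is the expected payoff when the uncertain parameter is revealed before deciding, minus the payoff of the best decision made under uncertainty. The two-action example and its utilities below are invented for illustration and are not the hemochromatosis model.

```python
import random

random.seed(0)

def evpi(actions, sample_state, utility, n=20000):
    """Monte Carlo expected value of perfect information."""
    states = [sample_state() for _ in range(n)]
    # decide after learning the state: pick the best action per draw
    with_info = sum(max(utility(a, s) for a in actions)
                    for s in states) / n
    # decide before learning the state: one action for all draws
    without_info = max(sum(utility(a, s) for s in states) / n
                       for a in actions)
    return with_info - without_info

value = evpi(
    actions=["screen", "defer"],
    sample_state=random.random,          # uncertain net benefit of screening
    utility=lambda a, s: s if a == "screen" else 0.6,
)
```

Analytically this toy example gives EVPI = E[max(s, 0.6)] - max(0.5, 0.6) = 0.68 - 0.60 = 0.08, and by construction EVPI can never be negative: perfect information cannot make the decision worse.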
Evidence-based management - healthcare manager viewpoints.
Janati, Ali; Hasanpoor, Edris; Hajebrahimi, Sakineh; Sadeghi-Bazargani, Homayoun
2018-06-11
Purpose Hospital manager decisions can have a significant impact on service effectiveness and hospital success, so using an evidence-based approach can improve hospital management. The purpose of this paper is to identify evidence-based management (EBMgt) components and challenges. Consequently, the authors provide an improving evidence-based decision-making framework. Design/methodology/approach A total of 45 semi-structured interviews were conducted in 2016. The authors also established three focus group discussions with health service managers. Data analysis followed deductive qualitative analysis guidelines. Findings Four basic themes emerged from the interviews, including EBMgt evidence sources (including sub-themes: scientific and research evidence, facts and information, political-social development plans, managers' professional expertise and ethical-moral evidence); predictors (sub-themes: stakeholder values and expectations, functional behavior, knowledge, key competencies and skill, evidence sources, evidence levels, uses and benefits and government programs); EBMgt barriers (sub-themes: managers' personal characteristics, decision-making environment, training and research system and organizational issues); and evidence-based hospital management processes (sub-themes: asking, acquiring, appraising, aggregating, applying and assessing). Originality/value Findings suggest that most participants have positive EBMgt attitudes. A full evidence-based hospital manager is a person who uses all evidence sources in a six-step decision-making process. EBMgt frameworks are a good tool to manage healthcare organizations. The authors found factors affecting hospital EBMgt and identified six evidence sources that healthcare managers can use in evidence-based decision-making processes.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.
1990-01-01
The results are described of an application of multiattribute analysis to the evaluation of high-leverage prototyping technologies in the automation and robotics (A and R) areas that might contribute to the Space Station Freedom baseline design. An implication is that high-leverage prototyping benefits the Space Station Freedom Program as a means of transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high-value, low-risk technology developments versus high-value, high-risk technology developments. Twenty-one A and R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. The attributes affecting the rankings were initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. Of these, initial cost, operations cost, crew productivity, and safety affected the rankings the most.
The value of a statistical life: a meta-analysis with a mixed effects regression model.
Bellavance, François; Dionne, Georges; Lebeau, Martin
2009-03-01
The value of a statistical life (VSL) is a very controversial topic, but one which is essential to the optimization of governmental decisions. We see great variability in the values obtained from different studies. The source of this variability needs to be understood in order to offer public decision-makers better guidance in choosing a value and to set clearer guidelines for future research on the topic. This article presents a meta-analysis based on 39 observations obtained from 37 studies (from nine different countries), all of which use a hedonic wage method to calculate the VSL. Our meta-analysis is innovative in that it is the first to use the mixed effects regression model [Raudenbush, S.W., 1994. Random effects models. In: Cooper, H., Hedges, L.V. (Eds.), The Handbook of Research Synthesis. Russell Sage Foundation, New York] to analyze studies on the value of a statistical life. We conclude that the variability found in the values studied stems in large part from differences in methodologies.
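A minimal random-effects pooling sketch conveys the core of the approach: between-study heterogeneity is estimated and added to each study's sampling variance before the effects are combined. This DerSimonian-Laird estimator is a simpler stand-in for the paper's mixed effects regression, which additionally models study-level covariates; the VSL numbers are invented.

```python
def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects meta-analysis.

    Returns the pooled effect and tau^2, the estimated
    between-study variance.
    """
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    wr = [1.0 / (v + tau2) for v in variances]      # inflated variances
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return pooled, tau2

# hypothetical VSL estimates (millions) with their sampling variances
pooled, tau2 = random_effects_pool([5.0, 7.0, 9.0], [1.0, 1.0, 1.0])
```

A large tau^2 relative to the sampling variances, as in this toy example, is precisely the "variability across studies" that motivates modeling methodological differences explicitly.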
Ethical Guidelines for Structural Interventions to Small-Scale Historic Stone Masonry Buildings.
Hurol, Yonca; Yüceer, Hülya; Başarır, Hacer
2015-12-01
Structural interventions to historic stone masonry buildings require that both structural and heritage values be considered simultaneously. Neglecting either value system in implementation can be regarded as an unethical professional action. The research objective of this article is to prepare a guideline for ensuring ethical structural interventions to small-scale historic stone masonry buildings in the conservation areas of Northern Cyprus. The methodology covers an analysis of internationally accepted conservation documents and national laws related to the conservation of historic buildings; an analysis of building codes, especially the Turkish building codes that have been used in Northern Cyprus; and an analysis of the structural interventions introduced to a significant historic building in a semi-intact state in the walled city of Famagusta. The guideline covers issues related to whether buildings are intact or ruined, the presence of earthquake risk, the types of structural decisions in an architectural conservation project, and the values to consider during the decision-making phase.
Support Tool in the Diagnosis of Major Depressive Disorder
NASA Astrophysics Data System (ADS)
Nunes, Luciano Comin; Pinheiro, Plácido Rogério; Pequeno, Tarcísio Cavalcante; Pinheiro, Mirian Calíope Dantas
Major Depressive Disorder has been responsible for the temporary, and even permanent, removal of millions of professionals from diverse fields of activity around the world, causing damage to social, financial, productive, and social security systems, and above all to the image of affected individuals and their families. These characteristics leave patients stigmatized and discriminated against in their societies, making their return to the production system difficult. The lack of early diagnosis has led to reactive, late measures, taken only when the professional suffering the psychological disorder is already showing signs of incapacity for work and social relationships. This article aims to assist decision-making in establishing an early diagnosis of these types of psychological disorders. It presents a proposal for a hybrid model composed of an expert system structured with methodologies for decision support (Multi-Criteria Decision Analysis, MCDA) and knowledge representations structured as logical production rules and probabilities (Artificial Intelligence, AI).
Multi-objective game-theory models for conflict analysis in reservoir watershed management.
Lee, Chih-Sheng
2012-05-01
This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assisting decision-making. Game theory is used as an alternative tool for analyzing the strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). A geographic information system is used to concisely illustrate and calculate the areas of various land use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and can be interpreted easily by decision makers. For comparison, the decision-making process using a conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.
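At its core, the game-theoretic step searches a payoff table for strategy pairs from which neither player would deviate unilaterally. The 2x2 payoffs below are invented for illustration; the actual model couples many land use strategies with water-quality responses computed from GIS data.

```python
def pure_nash(payoff_row, payoff_col):
    """Enumerate pure-strategy Nash equilibria of a two-player game.

    payoff_row[i][j]: row player's payoff at strategies (i, j)
    payoff_col[i][j]: column player's payoff at the same cell
    """
    rows, cols = len(payoff_row), len(payoff_row[0])
    return [(i, j)
            for i in range(rows) for j in range(cols)
            if all(payoff_row[i][j] >= payoff_row[k][j] for k in range(rows))
            and all(payoff_col[i][j] >= payoff_col[i][l] for l in range(cols))]

# rows: development intensity (high, low); cols: protection (lax, strict)
econ = [[4, 2], [3, 3]]   # economic-development payoffs
env  = [[1, 3], [2, 4]]   # environmental-protection payoffs
equilibria = pure_nash(econ, env)
```

Here the only stable outcome is (low development, strict protection): each side checks that no alternative strategy improves its own payoff given the other's choice, which is the balancing logic the MOGM exploits.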
Item response theory analysis of the Lichtenberg Financial Decision Screening Scale.
Teresi, Jeanne A; Ocepek-Welikson, Katja; Lichtenberg, Peter A
2017-01-01
The focus of these analyses was to examine the psychometric properties of the Lichtenberg Financial Decision Screening Scale (LFDSS). The purpose of the screen was to evaluate the decisional abilities and vulnerability to exploitation of older adults. Adults aged 60 and over were interviewed by social, legal, financial, or health services professionals who underwent in-person training on the administration and scoring of the scale. Professionals provided a rating of the decision-making abilities of the older adult. The analytic sample included 213 individuals with an average age of 76.9 (SD = 10.1). The majority (57%) were female. Data were analyzed using item response theory (IRT) methodology. The results supported the unidimensionality of the item set. Several IRT models were tested. Ten ordinal and binary items evidenced a slightly higher reliability estimate (0.85) than other versions and better coverage in terms of the range of reliable measurement across the continuum of financial incapacity.
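The IRT machinery behind analyses like this one can be illustrated with the two-parameter logistic model; the parameter values used below are hypothetical, not LFDSS estimates:

```python
import math

def item_prob(theta, a, b):
    """2-parameter logistic IRT model: probability of endorsing an item,
    given latent trait theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information the item contributes at trait level theta;
    summing this over items gives the basis for reliability across the continuum."""
    p = item_prob(theta, a, b)
    return a * a * p * (1.0 - p)
```

At theta equal to the item difficulty, the endorsement probability is 0.5 and the item is maximally informative, which is how "coverage across the continuum" in the abstract is assessed.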
Evaluation of ilmenite serpentine concrete and ordinary concrete as nuclear reactor shielding
NASA Astrophysics Data System (ADS)
Abulfaraj, Waleed H.; Kamal, Salah M.
1994-07-01
The present study adapts a formal decision methodology to the selection among alternative nuclear reactor shielding concretes. Multiattribute utility (MAU) theory is selected to accommodate decision makers' preferences and is employed here to evaluate two appropriate nuclear reactor shielding concretes in terms of effectiveness, in order to determine the optimal choice for meeting radiation protection regulations. These concretes are ordinary concrete (O.C.) and ilmenite serpentine concrete (I.S.C.), a normal-weight concrete and a heavy, heat-resistive concrete, respectively. The effectiveness objective of the nuclear reactor shielding is defined and structured into definite attributes and subattributes to evaluate the best alternative. Factors affecting the decision are the dose received by the reactor's workers, the material properties, and the cost of the concrete shield. A computer program assists in performing the utility analysis. Based on the data, the results show the superiority of ordinary concrete over ilmenite serpentine concrete.
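The additive form of multiattribute utility aggregation can be sketched as below; the weights and single-attribute utilities are illustrative assumptions, not the study's elicited values:

```python
# Hypothetical attribute weights (must sum to 1 in the additive model).
weights = {"worker_dose": 0.5, "material_properties": 0.3, "cost": 0.2}

# Hypothetical single-attribute utilities on a 0-1 scale (1 = best).
alternatives = {
    "ordinary_concrete":            {"worker_dose": 0.7, "material_properties": 0.8, "cost": 0.9},
    "ilmenite_serpentine_concrete": {"worker_dose": 0.8, "material_properties": 0.6, "cost": 0.4},
}

def overall_utility(scores):
    """Weighted additive multiattribute utility."""
    return sum(weights[attr] * u for attr, u in scores.items())

best = max(alternatives, key=lambda alt: overall_utility(alternatives[alt]))
```

The alternative with the highest weighted sum is preferred; with these made-up numbers, ordinary concrete wins on cost despite a slightly worse dose utility.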
Gaming in Nursing Education: A Literature Review.
Pront, Leeanne; Müller, Amanda; Koschade, Adam; Hutton, Alison
The aim of this research was to investigate videogame-based learning in nursing education and to establish how videogames are currently employed and how they link to the development of decision-making, motivation, and other benefits. Although digital game-based learning potentially offers a safe and convenient environment that can support nursing students in developing essential skills, nurse educators have typically been slow to adopt such resources. A comprehensive search of electronic databases was conducted, followed by a thematic analysis of the literature. Evaluations of the identified games found generally positive results regarding the usability and effectiveness of videogames in nursing education. Analysis of the advantages of videogames in nursing education identified potential benefits for decision-making, motivation, and repeated exposure, as well as logistical and financial value. Despite the paucity of games available and the methodological limitations identified, the findings provide evidence to support the potential effectiveness of videogames as a learning resource in nursing education.
"ATLAS" Advanced Technology Life-cycle Analysis System
NASA Technical Reports Server (NTRS)
Lollar, Louis F.; Mankins, John C.; ONeil, Daniel A.
2004-01-01
Making good decisions concerning research and development portfolios - and concerning the best systems concepts to pursue - as early as possible in the life cycle of advanced technologies is a key goal of R&D management. This goal depends upon the effective integration of information from a wide variety of sources, as well as focused, high-level analyses intended to inform such decisions. The presentation provides a summary of the Advanced Technology Life-cycle Analysis System (ATLAS) methodology and tool kit. ATLAS encompasses a wide range of methods and tools. A key foundation for ATLAS is the NASA-created Technology Readiness Level (TRL) scale. The toolkit is largely spreadsheet based (as of August 2003). This product is being funded by the Human and Robotics Technology Program Office, Office of Exploration Systems, NASA Headquarters, Washington D.C., and is being integrated by Dan O'Neil of the Advanced Projects Office, NASA/MSFC, Huntsville, AL
Oshiyama, Natália F; Bassani, Rosana A; D'Ottaviano, Itala M L; Bassani, José W M
2012-04-01
As technology evolves, the role of medical equipment in the healthcare system, as well as of technology management, becomes more important. Although large databases containing management information are now common, extracting useful information from them is still difficult. A useful tool for identifying frequently failing equipment, which increases maintenance cost and downtime, would be classification according to corrective maintenance data. Nevertheless, establishing classes may create inconsistencies, since an item may be equally close to two classes. Paraconsistent logic might help solve this problem, as it allows the existence of inconsistent (contradictory) information without trivialization. In this paper, a methodology for medical equipment classification based on ABC analysis of corrective maintenance data is presented and complemented with a paraconsistent annotated logic analysis, which may enable the decision maker to take into consideration alerts created by the identification of inconsistencies and indeterminacies in the classification.
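The ABC step of such a classification can be sketched as a cumulative-cost cutoff; equipment names, costs, and the 80%/95% cutoffs below are illustrative assumptions, not the paper's data:

```python
def abc_classify(costs, a_cut=0.8, b_cut=0.95):
    """Classify items by cumulative share of total corrective-maintenance cost:
    class A up to a_cut, class B up to b_cut, class C for the remainder."""
    total = sum(costs.values())
    running, classes = 0.0, {}
    for item, cost in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
        running += cost
        share = running / total
        classes[item] = "A" if share <= a_cut else ("B" if share <= b_cut else "C")
    return classes

# Hypothetical annual corrective-maintenance costs per equipment type:
maintenance_cost = {"ventilator": 50000, "infusion_pump": 20000,
                    "monitor": 12000, "scale": 5000, "thermometer": 2000}
classes = abc_classify(maintenance_cost)
```

Items sitting right at a class boundary are exactly the borderline cases the paper's paraconsistent annotation is meant to flag.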
Guimarães, José Maria Ximenes; Jorge, Maria Salete Bessa; Maia, Regina Claudia Furtado; de Oliveira, Lucia Conde; Morais, Ana Patrícia Pereira; Lima, Marcos Paulo de Oliveira; Assis, Marluce Maria Araújo; dos Santos, Adriano Maia
2010-07-01
The article examines the understanding held by professionals working in mental health of the movement to build social participation in the health system of Fortaleza, Ceará State. The methodology adopted is based on a qualitative approach. The study was developed through semi-structured interviews with 17 mental health professionals of the above-mentioned city. The empirical data were analyzed through the technique of thematic content analysis, in which three cores of analysis were identified: social participation as a space of citizenship and policy formulation; social participation oriented to the attention of collective needs; and decision making. The study reveals that social participation represents a possibility of amplifying the relations between civil society and the State, making social intervention in health policy proposals possible. The right to health is highlighted as linked to the consolidation of democracy in attending to needs and collective construction.
Cognitive Task Analysis of Network Analysts and Managers for Network Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
The goal of the project was to create a set of next-generation cyber situational awareness capabilities, with applications to other domains in the long term. The goal is to improve the decision-making process so that decision makers can choose better actions. To this end, we put extensive effort into obtaining feedback from network analysts and managers and understanding what their needs truly were. Consequently, this is the focus of this portion of the research. This paper discusses the methodology we followed to acquire this feedback from the analysts, namely a cognitive task analysis. Additionally, this paper provides the details we acquired from the analysts: their processes, goals, concerns, the data and metadata they analyze, etc. A final result we describe is the generation of a task-flow diagram.
Satellite solar power - Will it pay off
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1977-01-01
A cost analysis is presented for the front-end investments required for the development of a satellite solar power system. The methodology makes use of risk analysis techniques to quantify the present state of knowledge relevant to the construction and operation of a satellite solar power station 20 years in the future. Results are used to evaluate the 'expected value' of a three-year research program providing additional information which will be used as a basis for a decision to either continue development of the concept at an increasing funding level or to terminate or drastically alter the program. The program is costed phase by phase, and a decision tree is constructed. The estimated probability of success for the research and studies phase is 0.540. The expected value of a program leading to the construction of 120 systems at a rate of four per year is 12.433 billion dollars.
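The expected-value arithmetic at a single decision node can be sketched as follows; only the success probability 0.540 comes from the abstract, while the payoff and cost figures are hypothetical placeholders:

```python
def expected_value(p_success, payoff_if_success, payoff_if_failure, phase_cost):
    """Expected value of funding one phase of a staged program:
    probability-weighted payoffs minus the cost of the phase itself."""
    return p_success * payoff_if_success + (1 - p_success) * payoff_if_failure - phase_cost

# Illustrative figures in billions of dollars (only p = 0.540 is from the study):
ev = expected_value(0.540, 23.5, 0.0, 0.25)
```

A full decision tree chains such nodes phase by phase, folding back from the leaves to value the program as a whole.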
Giacomini, Mita; Cook, Deborah; DeJean, Deirdre
2009-04-01
The objective of this study is to identify and appraise qualitative research evidence on the experience of making life-support decisions in critical care. In six databases and supplementary sources, we sought original research published from January 1990 through June 2008 reporting qualitative empirical studies of the experience of life-support decision making in critical care settings. Fifty-three journal articles and monographs were included. Of these, 25 reported prospective studies and 28 reported retrospective studies. We abstracted methodologic characteristics relevant to the basic critical appraisal of qualitative research (prospective data collection, ethics approval, purposive sampling, iterative data collection and analysis, and any method to corroborate findings). Qualitative research traditions represented include grounded theory (n = 15, 28%), ethnography or naturalistic methods (n = 15, 28%), phenomenology (n = 9, 17%), and other or unspecified approaches (n = 14, 26%). All 53 documents describe the research setting; 97% indicate purposive sampling of participants. Studies vary in their capture of multidisciplinary clinician and family perspectives. Thirty-one (58%) report research ethics board review. Only 49% report iterative data collection and analysis, and eight documents (15%) describe an analytically driven stopping point for data collection. Thirty-two documents (60%) indicated a method for corroborating findings. Qualitative evidence often appears outside of clinical journals, with most research from the United States. Prospective, observation-based studies follow life-support decision making directly. These involve a variety of participants and yield important insights into interactions, communication, and dynamics. 
Retrospective, interview-based studies lack this direct engagement, but focus on the recollections of fewer types of participants (particularly patients and physicians), and typically address specific issues (communication and stress). Both designs can provide useful reflections for improving care. Given the diversity of qualitative research in critical care, room for improvement exists regarding both the quality and transparency of reported methodology.
Problem solving using soft systems methodology.
Land, L
This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.
NASA Astrophysics Data System (ADS)
Babovic, Filip; Mijic, Ana; Madani, Kaveh
2017-04-01
Urban areas around the world are growing in size and importance; however, cities experience elevated risks of pluvial flooding due to the prevalence of impermeable land surfaces within them. Urban planners and engineers encounter a great deal of uncertainty when planning adaptations to these flood risks, due to the interaction of multiple factors such as climate change and land use change. This leads to conditions of deep uncertainty. Blue-Green (BG) solutions utilise natural vegetation and processes to absorb and retain runoff while providing a host of other social, economic and environmental services. When utilised in conjunction with Decision Making under Deep Uncertainty (DMDU) methodologies, BG infrastructure provides a flexible and adaptable method of "no-regret" adaptation; resulting in a practical, economically efficient, and socially acceptable solution for flood risk mitigation. This work presents the methodology for analysing the impact of BG infrastructure in the context of the Adaptation Tipping Points approach to protect against pluvial flood risk in an iterative manner. An economic analysis of the adaptation pathways is also conducted in order to better inform decision-makers on the benefits and costs of the adaptation options presented. The methodology was applied to a case study in the Cranbrook Catchment in the North East of London. Our results show that BG infrastructure performs better under conditions of uncertainty than traditional grey infrastructure.
Comprehensible knowledge model creation for cancer treatment decision making.
Afzal, Muhammad; Hussain, Maqbool; Ali Khan, Wajahat; Ali, Taqdir; Lee, Sungyoung; Huh, Eui-Nam; Farooq Ahmad, Hafiz; Jamshed, Arif; Iqbal, Hassan; Irfan, Muhammad; Abbas Hydari, Manzar
2017-03-01
A wealth of clinical data exists in clinical documents in the form of electronic health records (EHRs). This data can be used for developing knowledge-based recommendation systems that can assist clinicians in clinical decision making and education. One of the big hurdles in developing such systems is the lack of automated mechanisms for knowledge acquisition to enable and educate clinicians in informed decision making. An automated knowledge acquisition methodology with a comprehensible knowledge model for cancer treatment (CKM-CT) is proposed. With the CKM-CT, clinical data are acquired automatically from documents. Quality of data is ensured by correcting errors and transforming various formats into a standard data format. Data preprocessing involves dimensionality reduction and missing value imputation. Predictive algorithm selection is performed on the basis of the ranking score of the weighted sum model. The knowledge builder prepares knowledge for knowledge-based services: clinical decisions and education support. Data is acquired from 13,788 head and neck cancer (HNC) documents for 3447 patients, including 1526 patients of the oral cavity site. In the data quality task, 160 staging values are corrected. In the preprocessing task, 20 attributes and 106 records are eliminated from the dataset. The Classification and Regression Trees (CRT) algorithm is selected and provides 69.0% classification accuracy in predicting HNC treatment plans, consisting of 11 decision paths that yield 11 decision rules. Our proposed methodology, CKM-CT, is helpful to find hidden knowledge in clinical documents. In CKM-CT, the prediction models are developed to assist and educate clinicians for informed decision making. The proposed methodology is generalizable to apply to data of other domains such as breast cancer with a similar objective to assist clinicians in decision making and education. Copyright © 2017 Elsevier Ltd. All rights reserved.
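The weighted-sum-model ranking used for predictive-algorithm selection can be sketched as below; the criteria, weights, and scores are illustrative assumptions (only the CRT accuracy of 0.69 echoes the abstract):

```python
# Hypothetical selection criteria and weights.
criteria_weights = {"accuracy": 0.5, "interpretability": 0.3, "speed": 0.2}

# Hypothetical per-criterion scores on a 0-1 scale for candidate algorithms.
algorithm_scores = {
    "CRT":            {"accuracy": 0.69, "interpretability": 0.9, "speed": 0.8},
    "neural_network": {"accuracy": 0.72, "interpretability": 0.2, "speed": 0.5},
    "naive_bayes":    {"accuracy": 0.61, "interpretability": 0.7, "speed": 0.9},
}

def wsm_score(scores):
    """Weighted sum model: linear combination of criterion scores."""
    return sum(criteria_weights[c] * v for c, v in scores.items())

ranking = sorted(algorithm_scores, key=lambda a: wsm_score(algorithm_scores[a]), reverse=True)
```

Under this made-up weighting, an interpretable tree model outranks a slightly more accurate but opaque one, which mirrors why a rule-producing algorithm suits a clinician-facing knowledge model.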
INTEGRATION OF POLLUTION PREVENTION TOOLS
A prototype computer-based decision support system was designed to provide small businesses with an integrated pollution prevention methodology. Preliminary research involved compilation of an inventory of existing pollution prevention tools (i.e., methodologies, software, etc.),...
NASA Astrophysics Data System (ADS)
Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens
2015-04-01
The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges. These complexities are further compounded by multiple actors, frequently with conflicting interests, and multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both the probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement, and stochastic simulation within a general framework. Subsequently, the proposed new approach is applied to a water-scarce coastal arid region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer's sustainability, endangering the associated socio-economic conditions as well as the traditional social structure. Results from the developed method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach has enabled both the probabilistic and fuzzy uncertainties associated with the decision problem to be systematically quantified. Sensitivity analysis applied within the developed tool has shown that the decision makers' risk-averse and risk-taking attitudes may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems, supporting management decisions while serving as a platform for stakeholder participation.
Discretion in Student Discipline: Insight into Elementary Principals' Decision Making
ERIC Educational Resources Information Center
Findlay, Nora M.
2015-01-01
Little research exists that examines the exercise of discretion by principals in their disciplinary decision making. This study sought to understand the application of values by principals as they engage in student disciplinary decision making within legally fixed parameters of their administrative discretion. This qualitative methodology used…
Making Supply Chains Resilient to Floods Using a Bayesian Network
NASA Astrophysics Data System (ADS)
Haraguchi, M.
2015-12-01
Natural hazards distress the global economy by disrupting interconnected supply chain networks. Manufacturing companies have created cost-efficient supply chains by reducing inventories, streamlining logistics, and limiting the number of suppliers. As a result, today's supply chains are profoundly susceptible to systemic risks. In Thailand, for example, the GDP growth rate declined by 76% in 2011 due to prolonged flooding. Thailand incurred economic damage of USD 46.5 billion, approximately 70% of which was caused by major supply chain disruptions in the manufacturing sector. Similar problems occurred after the Great East Japan Earthquake and Tsunami in 2011, the Mississippi River floods and droughts during 2011-2013, and Hurricane Sandy in 2012. This study proposes a methodology for modeling supply chain disruptions using a Bayesian network analysis (BNA) to estimate the expected values of flood countermeasures, such as inventory management, supplier management, and hard infrastructure management. We first performed a spatio-temporal correlation analysis between floods and extreme precipitation data for the last 100 years at a global scale. Then we used a BNA to create synthetic networks that include variables associated with the magnitude and duration of floods, major components of supply chains, and market demands. We also included decision variables for countermeasures that would mitigate potential losses caused by supply chain disruptions. Finally, we conducted a cost-benefit analysis by estimating the expected values of these potential countermeasures while performing a sensitivity analysis. The methodology was applied to the supply chain disruptions caused by the 2011 Thailand floods. Our study demonstrates the data requirements desirable for such an analysis: anonymized supplier network data (i.e., critical dependencies and vulnerability information of suppliers), sourcing data (i.e., locations of suppliers, and production rates and volumes), and data from previous experiences (i.e., companies' risk mitigation strategy decisions).
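A drastically simplified version of the expected-value comparison can be sketched by taking expectations over flood states; a full Bayesian network would factor a joint distribution over flood magnitude, duration, and supplier variables, and all probabilities and losses below are hypothetical:

```python
# Hypothetical marginal probabilities of flood severity in one season.
p_flood = {"none": 0.7, "minor": 0.2, "major": 0.1}

# Hypothetical total loss (USD millions) by flood severity for each
# countermeasure, including the countermeasure's own cost.
loss = {
    "do_nothing":      {"none": 0, "minor": 40, "major": 300},
    "extra_inventory": {"none": 5, "minor": 15, "major": 120},
    "dual_sourcing":   {"none": 8, "minor": 12, "major": 60},
}

def expected_loss(measure):
    """Probability-weighted loss of a countermeasure across flood states."""
    return sum(p_flood[s] * loss[measure][s] for s in p_flood)

best = min(loss, key=expected_loss)
```

The countermeasure minimizing expected loss is preferred; a sensitivity analysis would then perturb `p_flood` to see how robust that preference is.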
Detection and Processing Techniques of FECG Signal for Fetal Monitoring
2009-01-01
Fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis, to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper also discusses some of the hardware implementations that use electrical signals for monitoring the fetal heart rate. This paper opens up a passage for researchers, physicians, and end users to advocate an excellent understanding of FECG signal and its analysis procedures for fetal heart rate monitoring systems. PMID:19495912
van Exel, Job; Baker, Rachel; Mason, Helen; Donaldson, Cam; Brouwer, Werner
2015-02-01
Resources available to the health care sector are finite and typically insufficient to fulfil all the demands for health care in the population. Decisions must be made about which treatments to provide. Relatively little is known about the views of the general public regarding the principles that should guide such decisions. We present the findings of a Q methodology study designed to elicit the shared views in the general public across ten countries regarding the appropriate principles for prioritising health care resources. In 2010, 294 respondents rank-ordered a set of cards, and the results were subjected to by-person factor analysis to identify common patterns in sorting. Five distinct viewpoints were identified: (I) "Egalitarianism, entitlement and equality of access"; (II) "Severity and the magnitude of health gains"; (III) "Fair innings, young people and maximising health benefits"; (IV) "The intrinsic value of life and healthy living"; (V) "Quality of life is more important than simply staying alive". Given the plurality of views on the principles for health care priority setting, no single equity principle can be used to underpin health care priority setting. Hence, the process of decision making becomes more important, in which, arguably, these multiple perspectives in society should be somehow reflected. Copyright © 2014 Elsevier Ltd. All rights reserved.
Guest, James; Harrop, James S; Aarabi, Bizhan; Grossman, Robert G; Fawcett, James W; Fehlings, Michael G; Tator, Charles H
2012-09-01
The North American Clinical Trials Network (NACTN) includes 9 clinical centers funded by the US Department of Defense and the Christopher Reeve Paralysis Foundation. Its purpose is to accelerate clinical testing of promising therapeutics in spinal cord injury (SCI) through the development of a robust interactive infrastructure. This structure includes key committees that serve to provide longitudinal guidance to the Network. These committees include the Executive, Data Management, and Neurological Outcome Assessments Committees, and the Therapeutic Selection Committee (TSC), which is the subject of this manuscript. The NACTN brings unique elements to the SCI field. The Network's stability is not restricted to a single clinical trial. Network members have diverse expertise and include experts in clinical care, clinical trial design and methodology, pharmacology, preclinical and clinical research, and advanced rehabilitation techniques. Frequent systematic communication is assigned a high value, as is democratic process, fairness and efficiency of decision making, and resource allocation. This article focuses on how decision making occurs within the TSC to rank alternative therapeutics according to 2 main variables: quality of the preclinical data set, and fit with the Network's aims and capabilities. This selection process is important because if the Network's resources are committed to a therapeutic, alternatives cannot be pursued. A proposed methodology includes a multicriteria decision analysis that uses a Multi-Attribute Global Inference of Quality matrix to quantify the process. To rank therapeutics, the TSC uses a series of consensus steps designed to reduce individual and group bias and limit subjectivity. Given the difficulties encountered by industry in completing clinical trials in SCI, stable collaborative not-for-profit consortia, such as the NACTN, may be essential to clinical progress in SCI. 
The evolution of the NACTN also offers substantial opportunity to refine decision making and group dynamics. Making the best possible decisions concerning therapeutics selection for trial testing is a cornerstone of the Network's function.
Constenla, Dagna
2015-03-24
Economic evaluations have routinely understated the net benefits of vaccination by not including the full range of economic benefits that accrue over the lifetime of a vaccinated person. Broader approaches for evaluating benefits of vaccination can be used to more accurately calculate the value of vaccination. This paper reflects on the methodology of one such approach - the health investment life course approach - that looks at the impact of vaccine investment on lifetime returns. The role of this approach on vaccine decision-making will be assessed using the malaria health investment life course model example. We describe a framework that measures the impact of a health policy decision on government accounts over many generations. The methodological issues emerging from this approach are illustrated with an example from a recently completed health investment life course analysis of malaria vaccination in Ghana. Beyond the results, various conceptual and practical challenges of applying this framework to Ghana are discussed in this paper. The current framework seeks to understand how disease and available technologies can impact a range of economic parameters such as labour force participation, education, healthcare consumption, productivity, wages or economic growth, and taxation following their introduction. The framework is unique amongst previous economic models in malaria because it considers future tax revenue for governments. The framework is complementary to cost-effectiveness and budget impact analysis. The intent of this paper is to stimulate discussion on how existing and new methodology can add to knowledge regarding the benefits from investing in new and underutilized vaccines. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Palermo, Gianluca; Golkar, Alessandro; Gaudenzi, Paolo
2015-06-01
As small satellites and Sun Synchronous Earth Observation systems are assuming an increased role in today's space activities, including commercial investments, it is of interest to assess how infrastructures could be developed to support the development of such systems and other spacecraft that could benefit from having a data relay service in Low Earth Orbit (LEO), as opposed to traditional Geostationary relays. This paper presents a tradespace exploration study of the architecture of such LEO commercial satellite data relay systems, here defined as Earth Orbiting Support Systems (EOSS). The paper proposes a methodology to formulate architectural decisions for EOSS constellations, and enumerates the corresponding tradespace of feasible architectures. Evaluation metrics are proposed to measure benefits and costs of architectures; lastly, a multicriteria Pareto criterion is used to downselect optimal architectures for subsequent analysis. The methodology is applied to two case studies for sets of 30 and 100 customer spacecraft respectively, representing potential markets for LEO services in Exploration, Earth Observation, Science, and CubeSats. Pareto analysis shows that increased performance of the constellation is always achieved by an increased node size, as measured by the gain of the communications antenna mounted on EOSS spacecraft. On the other hand, nonlinear trends in optimal orbital altitude, number of satellites per plane, and number of orbital planes are found in both cases. An upward trend in individual node memory capacity is found, although never exceeding 256 Gbits of onboard memory for both cases considered, assuming the availability of a polar ground station for EOSS data downlink. System architects can use the proposed methodology to identify optimal EOSS constellations for a given service pricing strategy and customer target, thus identifying alternatives for selection by decision makers.
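The Pareto downselection step can be sketched as a dominance filter over (benefit, cost) pairs; the architecture names and data points below are hypothetical:

```python
# Hypothetical architectures scored as (benefit, cost):
# higher benefit and lower cost are preferred.
architectures = {
    "A": (0.90, 120.0),
    "B": (0.70, 60.0),
    "C": (0.60, 90.0),   # dominated by B: less benefit at higher cost
    "D": (0.95, 200.0),
}

def dominates(x, y):
    """x dominates y if it is no worse on both axes and strictly better on one."""
    (bx, cx), (by, cy) = x, y
    return bx >= by and cx <= cy and (bx > by or cx < cy)

pareto = {name for name, pt in architectures.items()
          if not any(dominates(other, pt)
                     for other in architectures.values() if other != pt)}
```

Only the non-dominated set is carried forward for decision makers; dominated designs like "C" are eliminated regardless of preference weights.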
Ajmera, Puneeta
2017-10-09
Purpose Organizations have to evaluate their internal and external environments in this highly competitive world. Strengths, weaknesses, opportunities and threats (SWOT) analysis is a very useful technique which analyzes the strengths, weaknesses, opportunities and threats of an organization for taking strategic decisions, and it also provides a foundation for the formulation of strategies. But the drawback of SWOT analysis is that it does not quantify the importance of the individual factors affecting the organization; the individual factors are described in brief without being weighed. For this reason, SWOT analysis can be integrated with any multiple attribute decision-making (MADM) technique, such as the technique for order preference by similarity to ideal solution (TOPSIS) or the analytical hierarchy process, to evaluate the best alternative among the available strategic alternatives. The paper aims to discuss these issues. Design/methodology/approach In this study, SWOT analysis is integrated with a multicriteria decision-making technique called TOPSIS to rank different strategies for Indian medical tourism in order of priority. Findings The SO strategy (providing the best facilitation and care to medical tourists, on par with developed countries) is the best strategy; it matches the four elements of S, W, O and T of the SWOT matrix and 35 strategic indicators. Practical implications This paper proposes a solution based on a combined SWOT analysis and TOPSIS approach to help organizations evaluate and select strategies. Originality/value Creating a new technology or administering a new strategy always meets some degree of resistance from employees. To minimize resistance, the author has used TOPSIS, as it involves group thinking, requiring every manager of the organization to analyze and evaluate different alternatives and an average measure of each parameter in the final decision matrix.
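The TOPSIS procedure itself can be sketched in a few steps: normalise and weight the decision matrix, locate the ideal and anti-ideal points, and rank by closeness coefficient. The strategy labels follow the SWOT convention, but the decision matrix and weights below are illustrative, not the paper's 35-indicator data, and all criteria are treated as benefit criteria for brevity:

```python
import math

strategies = ["SO", "ST", "WO", "WT"]
criteria_weights = [0.4, 0.3, 0.3]
matrix = [
    [9.0, 8.0, 7.0],   # SO
    [7.0, 6.0, 8.0],   # ST
    [6.0, 7.0, 5.0],   # WO
    [4.0, 5.0, 6.0],   # WT
]

# 1. Vector-normalise each column, then apply the criterion weight.
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(3)]
v = [[criteria_weights[j] * row[j] / norms[j] for j in range(3)] for row in matrix]

# 2. Ideal (best per criterion) and anti-ideal (worst per criterion) points.
ideal = [max(col) for col in zip(*v)]
anti = [min(col) for col in zip(*v)]

# 3. Closeness coefficient: distance to anti-ideal over total distance.
def closeness(row):
    d_pos = math.dist(row, ideal)
    d_neg = math.dist(row, anti)
    return d_neg / (d_pos + d_neg)

ranked = sorted(strategies, key=lambda s: closeness(v[strategies.index(s)]), reverse=True)
```

With these made-up scores the SO strategy ranks first, consistent with the kind of result the abstract reports; cost-type criteria would flip the max/min roles in step 2.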
Material selection and assembly method of battery pack for compact electric vehicle
NASA Astrophysics Data System (ADS)
Lewchalermwong, N.; Masomtob, M.; Lailuck, V.; Charoenphonphanich, C.
2018-01-01
Battery packs are the key component in electric vehicles (EVs); their main costs are the battery cells and the assembly processes. Battery cell prices are set by battery manufacturers, while the assembly cost depends on the battery pack design. Battery pack designers want the overall cost to be as low as possible while still achieving high performance and safety. Material selection and assembly method, as well as component design, are very important in determining the cost-effectiveness of battery modules and battery packs. Therefore, this work presents a Decision Matrix, which can aid in the decision-making process for component materials and assembly methods in battery module and battery pack design. The aim of this study is to take advantage of incorporating an Architecture Analysis method into decision matrix methods by capturing best practices for conducting design architecture analysis, taking full account of the key design components critical to efficient and effective development of the designs. The methodology also considers the impacts of choice alternatives along multiple dimensions. Various alternatives for materials and assembly techniques of the battery pack are evaluated, and some sample costs are presented. Because the battery pack contains many components, only seven components, including the positive busbar and Z busbar, are presented in this paper to illustrate the decision matrix methods.
Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis
NASA Astrophysics Data System (ADS)
Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter
2013-04-01
Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. The question therefore arises as to how we can identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. 
In this respect, working at the local level provides a clear advantage since, to a large extent, limitations in globally available observational data constrain such a validation on the global scale. Overall, the five steps are outlined in detail in order to facilitate and motivate the application of pattern recognition in other research studies concerned with vulnerability analysis, including future applications to different vulnerability frameworks. Such applications could promote the refinement of mechanisms in specific contexts and advance methodological adjustments. This would further increase the value of identifying typical patterns in the properties of socio-ecological systems for an improved understanding and management of the relation between these systems and particular stresses.
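The cluster-based pattern recognition underlying the five steps can be illustrated with a minimal k-means sketch; the vulnerability indicators and regional data below are hypothetical:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means for grouping regions by vulnerability indicators."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each region to its nearest cluster centre
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical standardized indicators per dryland region:
# [water scarcity, poverty, resource overuse]
X = np.array([[0.9, 0.8, 0.7], [0.8, 0.9, 0.6],   # pattern A regions
              [0.2, 0.3, 0.9], [0.1, 0.2, 0.8]])  # pattern B regions
labels, centers = kmeans(X, k=2)
```

The cluster centres then serve as the "typical patterns"; the robustness and validation steps of the methodology would test how stable these groupings are under resampling and against independent outcome data.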
Mertz, Marcel; Schildmann, Jan
2018-06-01
Empirical bioethics is commonly understood as integrating empirical research with normative-ethical research in order to address an ethical issue. Methodological analyses in empirical bioethics focus mainly on the integration of socio-empirical sciences (e.g. sociology or psychology) and normative ethics. But while there are numerous multidisciplinary research projects combining the life sciences and normative ethics, there is little explicit methodological reflection on how to integrate the two fields, or on the goals and rationales of such interdisciplinary cooperation. In this paper we review some drivers of the tendency of empirical bioethics methodologies to focus on the collaboration of normative ethics with the social sciences in particular. Subsequently, we argue that the ends of empirical bioethics, not the empirical methods, are decisive for the question of which empirical disciplines can contribute to empirical bioethics in a meaningful way. Using already existing types of research integration as a springboard, five possible types of research which encompass life sciences and normative analysis illustrate how such cooperation can be conceptualized from a methodological perspective within empirical bioethics. We conclude with a reflection on the limitations and challenges of empirical bioethics research that integrates the life sciences.
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate on a network level using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks already exist, their evolution over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been developed analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as they were relevant to specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Such a method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size. 
Special network topologies may be more or less computationally difficult, while the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is among the most valuable outputs of such network analysis; however, it may only indicate which bridges to inspect or repair first and little else. High correlations negatively influence such component importance measures. Additionally, a regional approach is not appropriately modelled. To take a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component-based and regional-based optimal decisions using information from both network function and the further effects of infrastructure deterioration.
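The crude Monte Carlo simulation (crude-MCS) mentioned among the network analyses can be sketched for a toy source-terminal network; the topology and edge failure probabilities below are assumed for illustration:

```python
import random

# Hypothetical 4-node network; each edge (e.g. a bridge) fails
# independently with the given probability in a hazard scenario
edges = {("s", "a"): 0.1, ("s", "b"): 0.2,
         ("a", "t"): 0.1, ("b", "t"): 0.2, ("a", "b"): 0.05}

def connected(up_edges, src="s", dst="t"):
    # Depth-first search over the surviving (undirected) edges
    adj = {}
    for u, v in up_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    stack, seen = [src], {src}
    while stack:
        n = stack.pop()
        if n == dst:
            return True
        for m in adj.get(n, []):
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return False

def disconnection_probability(n_samples=100_000, seed=1):
    """Crude-MCS estimate of P(source and terminal are disconnected)."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n_samples):
        up = [e for e, p in edges.items() if rng.random() > p]
        fails += not connected(up)
    return fails / n_samples

print(round(disconnection_probability(), 4))
```

For this tiny network the exact answer (about 0.041) can be enumerated directly; crude-MCS becomes the fallback when the network is too large for bounding methods such as LP-bounds.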
Patty, Nathalie J S; van Dijk, Hanna Maria; Wallenburg, Iris; Bal, Roland; Helmerhorst, Theo J M; van Exel, Job; Cramm, Jane Murray
2017-11-07
Despite the introduction of Human papillomavirus (HPV) vaccination in national immunization programs (NIPs), vaccination rates in most countries remain relatively low. An understanding of the reasons underlying decisions about whether to vaccinate is essential in order to promote wider spread of HPV vaccination. This is particularly important in relation to policies seeking to address shortfalls in current HPV campaigns. The aim of this study was to explore prevailing perspectives concerning HPV vaccination among girls, boys, and parents, and so to identify potential determinants of HPV vaccination decisions in these groups. Perspectives were explored using Q-methodology. Forty-seven girls, 39 boys, and 107 parents in the Netherlands were asked to rank a comprehensive set of 35 statements, assembled based on the health belief model (HBM), according to their agreement with them. By-person factor analysis was used to identify common patterns in these rankings, which were interpreted as perspectives on HPV vaccination. These perspectives were further interpreted and described using data collected with interviews and open-ended questions. The analysis revealed four perspectives: "prevention is better than cure," "fear of unknown side effects," "lack of information and awareness," and "my body, my choice." The first two perspectives and corresponding determinants of HPV vaccination decisions were coherent and distinct; the third and fourth perspectives were more ambiguous and, to some extent, incoherent, involving doubt and lack of awareness and information (perspective 3), and overconfidence (perspective 4). Given the aim of publically funded vaccination programs to minimize the spread of HPV infection and HPV-related disease and the concerns about current uptake levels, our results indicate that focus should be placed on increasing awareness and knowledge, in particular among those in a modifiable phase.
Dynamics of Metabolism and Decision Making During Alcohol Consumption: Modeling and Analysis.
Giraldo, Luis Felipe; Passino, Kevin M; Clapp, John D; Ruderman, Danielle
2017-11-01
Heavy alcohol consumption is considered an important public health issue in the United States, as over 88,000 people die every year from alcohol-related causes. Research is being conducted to understand the etiology of alcohol consumption and to develop strategies to decrease high-risk consumption and its consequences, but there are still important gaps in determining the main factors that influence consumption behaviors throughout the drinking event. There is a need for methodologies that allow us not only to identify such factors but also to gain a comprehensive understanding of how they are connected and how they affect the dynamical evolution of a drinking event. In this paper, we use previous empirical findings from laboratory and field studies to build a mathematical model of the blood alcohol concentration dynamics in individuals during drinking events. We characterize these dynamics as the result of the interaction between a decision-making system and the metabolic process for alcohol. We provide a model of the metabolic process for arbitrary alcohol intake patterns and a characterization of the mechanisms that drive the decision-making process of a drinker during the drinking event. We use computational simulations and Lyapunov stability theory to analyze the effects of the parameters of the model on the blood alcohol concentration dynamics that are characterized. Also, we propose a methodology to inform the model using data collected in situ and to make estimations that provide additional information to the analysis. We show how this model allows us to analyze and predict previously observed behaviors, to design new approaches for the collection of data that improves the construction of the model, and to help design interventions.
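The interaction of intake and metabolism described above can be sketched as a simple simulation. The kinetic parameters, the Michaelis-Menten elimination form and the instantaneous-absorption assumption are illustrative choices, not the paper's model:

```python
# Assumed parameter values (not from the paper):
VMAX = 0.018   # max elimination rate, g/dL per hour
KM = 0.005     # Michaelis constant, g/dL
ABSORB = 0.02  # BAC increment per standard drink, g/dL

def simulate_bac(drink_times, t_end=8.0, dt=0.01):
    """Return (times, bac) for drinks taken at the given hours,
    using Euler integration of saturable (Michaelis-Menten) clearance."""
    times, bac = [], []
    b, t = 0.0, 0.0
    pending = sorted(drink_times)
    while t <= t_end:
        while pending and pending[0] <= t:
            b += ABSORB  # treat absorption of one drink as instantaneous
            pending.pop(0)
        times.append(t)
        bac.append(b)
        b = max(0.0, b - dt * VMAX * b / (KM + b))  # metabolic clearance
        t += dt
    return times, bac

times, bac = simulate_bac([0.0, 1.0, 2.0])
peak = max(bac)
```

Coupling this metabolic half with a decision-making rule (e.g., drink timing that responds to the current BAC) closes the feedback loop the paper analyzes with Lyapunov stability theory.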
2015-01-01
Information generated from economic evaluation is increasingly being used to inform health resource allocation decisions globally, including in low- and middle-income countries. However, a crucial consideration for users of the information at a policy level, e.g. funding agencies, is whether the studies are comparable, provide sufficient detail to inform policy decision making, and incorporate inputs from data sources that are reliable and relevant to the context. This review was conducted to inform a methodological standardisation workstream at the Bill and Melinda Gates Foundation (BMGF) and assesses BMGF-funded cost-per-DALY economic evaluations in four programme areas (malaria, tuberculosis, HIV/AIDS and vaccines) in terms of variation in methodology, use of evidence, and quality of reporting. The findings suggest that there is room for improvement in the three areas of assessment, and support the case for the introduction of a standardised methodology or reference case by the BMGF. The findings are also instructive for all institutions that fund economic evaluations in LMICs and that wish to improve the ability of economic evaluations to inform resource allocation decisions. PMID:25950443
NASA Astrophysics Data System (ADS)
Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan
2017-09-01
Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems depends mainly on the selection of the most relevant features. This becomes harder when the dataset contains missing values for different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology that takes the results of medical tests as input, extracts a reduced-dimensional feature subset, and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using PPCA. PPCA extracts the projection vectors that contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The projection vectors are selected through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to a radial basis function (RBF) kernel-based Support Vector Machine (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison with existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
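The Parallel Analysis (PA) step used to choose how many projection vectors to keep can be sketched with Horn's parallel analysis on synthetic data. The data generation and retention rule are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

def parallel_analysis(X, n_draws=200, seed=0):
    """Horn's parallel analysis: retain components whose correlation-matrix
    eigenvalues exceed the average eigenvalues of same-shaped random data."""
    rng = np.random.default_rng(seed)
    eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.zeros_like(eig)
    for _ in range(n_draws):
        R = rng.standard_normal(X.shape)
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    rand /= n_draws
    return int(np.sum(eig > rand))

# Toy data: 2 latent dimensions hidden in 6 observed features plus noise
rng = np.random.default_rng(1)
Z = rng.standard_normal((300, 2))
W = np.array([[2.0, 1.5, 0.0, 1.0, 0.5, 0.0],
              [0.0, 0.5, 2.0, -1.0, 1.5, 1.0]])
X = Z @ W + 0.1 * rng.standard_normal((300, 6))
print(parallel_analysis(X))  # recovers 2 components for this toy data
```

The retained components would then define the reduced feature subset passed to a downstream classifier such as an RBF-kernel SVM.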
2012-04-18
Rigorous methodological standards help to ensure that medical research produces information that is valid and generalizable, and are essential in patient-centered outcomes research (PCOR). Patient-centeredness refers to the extent to which the preferences, decision-making needs, and characteristics of patients are addressed, and is the key characteristic differentiating PCOR from comparative effectiveness research. The Patient Protection and Affordable Care Act signed into law in 2010 created the Patient-Centered Outcomes Research Institute (PCORI), which includes an independent, federally appointed Methodology Committee. The Methodology Committee is charged to develop methodological standards for PCOR. The 4 general areas identified by the committee in which standards will be developed are (1) prioritizing research questions, (2) using appropriate study designs and analyses, (3) incorporating patient perspectives throughout the research continuum, and (4) fostering efficient dissemination and implementation of results. A Congressionally mandated PCORI methodology report (to be issued in its first iteration in May 2012) will begin to provide standards in each of these areas, and will inform future PCORI funding announcements and review criteria. The work of the Methodology Committee is intended to enable generation of information that is relevant and trustworthy for patients, and to enable decisions that improve patient-centered outcomes.
Program management aid for redundancy selection and operational guidelines
NASA Technical Reports Server (NTRS)
Hodge, P. W.; Davis, W. L.; Frumkin, B.
1972-01-01
Although this criterion was developed specifically for use on the shuttle program, it applies to many other multi-mission programs (e.g., aircraft or mechanisms). The methodology employed is directly applicable even if the tools (nomographs and equations) are for mission-peculiar cases. The redundancy selection criterion was developed to ensure that both design and operational cost impacts (life cycle costs) were considered in selecting the quantity of operational redundancy. These tools were developed as aids to expedite the decision process and were not intended as an automatic decision maker. This approach to redundancy selection is unique in that it enables a pseudo systems analysis to be performed on an equipment basis without waiting for all designs to be hardened.
A Simulation of the Base Civil Engineering Work Request/Work Order System.
1981-09-01
with better information with which to make a decision. For example, if the Chief of R&R wanted to know the effect on work order processing time of... work order processing times for the system. The Q-GERT Analysis Program developed by Pritsker (11) was used to simulate the generation of work... several factors affecting the mean work order processing time. CHAPTER III RESEARCH METHODOLOGY Overview This chapter presents the
ERIC Educational Resources Information Center
Savich, Carl
2008-01-01
The purpose of this paper was to analyze the court case Pontiac v. Spellings, a legal challenge to the No Child Left Behind (NCLB) Act filed in 2005. The methodology was to examine the briefs filed and the court decisions in order to analyze the legal arguments made by the parties to the lawsuit. The results were that the U.S. Circuit Court for…
Automated control of hierarchical systems using value-driven methods
NASA Technical Reports Server (NTRS)
Pugh, George E.; Burke, Thomas E.
1990-01-01
An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.
Theorizing Land Cover and Land Use Change: The Peasant Economy of Colonization in the Amazon Basin
NASA Technical Reports Server (NTRS)
Caldas, Marcellus; Walker, Robert; Arima, Eugenio; Perz, Stephen; Aldrich, Stephen; Simmons, Cynthia
2007-01-01
This paper addresses deforestation processes in the Amazon basin. It deploys a methodology combining remote sensing and survey-based fieldwork to examine, with regression analysis, the impact of household structure and economic circumstances on deforestation decisions made by colonist farmers in the forest frontiers of Brazil. Unlike most previous regression-based studies, the methodology implemented analyzes behavior at the level of the individual property. The regressions correct for endogenous relationships between key variables and for spatial autocorrelation, as necessary. Variables used in the analysis are specified, in part, by a theoretical development integrating the Chayanovian concept of the peasant household with spatial considerations stemming from von Thuenen. The results from the empirical model indicate that demographic characteristics of households, as well as market factors, affect deforestation in the Amazon. Thus, statistical results from studies that do not include household-scale information may be subject to error. From a policy perspective, the results suggest that environmental policies in the Amazon based on market incentives to small farmers may not be as effective as hoped, given the importance of household factors in catalyzing the demand for land. The paper concludes by noting that household decisions regarding land use and deforestation are not independent of broader social circumstances, and that a full understanding of Amazonian deforestation will require insight into why poor families find it necessary to settle the frontier in the first place.
Chari, Ramya; Burke, Thomas A.; White, Ronald H.; Fox, Mary A.
2012-01-01
Susceptibility to chemical toxins has not been adequately addressed in risk assessment methodologies. As a result, environmental policies may fail to meet their fundamental goal of protecting the public from harm. This study examines how characterization of risk may change when susceptibility is explicitly considered in policy development; in particular we examine the process used by the U.S. Environmental Protection Agency (EPA) to set a National Ambient Air Quality Standard (NAAQS) for lead. To determine a NAAQS, EPA estimated air lead-related decreases in child neurocognitive function through a combination of multiple data elements including concentration-response (CR) functions. In this article, we present alternative scenarios for determining a lead NAAQS using CR functions developed in populations more susceptible to lead toxicity due to socioeconomic disadvantage. The use of CR functions developed in susceptible groups resulted in cognitive decrements greater than original EPA estimates. EPA’s analysis suggested that a standard level of 0.15 µg/m3 would fulfill decision criteria, but by incorporating susceptibility we found that options for the standard could reasonably be extended to lower levels. The use of data developed in susceptible populations would result in the selection of a more protective NAAQS under the same decision framework applied by EPA. Results are used to frame discussion regarding why cumulative risk assessment methodologies are needed to help inform policy development. PMID:22690184
Testing the Intelligence of Unmanned Autonomous Systems
2008-01-01
decisions without the operator. The term autonomous is also used interchangeably with intelligent, giving rise to the name unmanned autonomous system (UAS)... For the purposes of this article, UAS describes an unmanned system that makes decisions based on gathered information. Because testers should not... make assumptions about the decision process within a UAS, there is a need for a methodology that completely tests this decision process without biasing
Methodological challenges of validating a clinical decision-making tool in the practice environment.
Brennan, Caitlin W; Daly, Barbara J
2015-04-01
Validating a measurement tool intended for use in the practice environment poses challenges that may not be present when validating a tool intended solely for research purposes. The aim of this article is to describe the methodological challenges of validating a clinical decision-making tool, the Oncology Acuity Tool, which nurses use to make nurse assignment and staffing decisions prospectively each shift. Data were derived from a larger validation study, during which several methodological challenges arose. Revisions to the tool, including conducting iterative feedback cycles with end users, were necessary before the validation study was initiated. The "true" value of patient acuity is unknown, and thus, two approaches to inter-rater reliability assessment were used. Discordant perspectives existed between experts and end users. Balancing psychometric rigor with clinical relevance may be achieved through establishing research-practice partnerships, seeking active and continuous feedback with end users, and weighing traditional statistical rules of thumb with practical considerations. © The Author(s) 2014.
Flores, Walter
2010-01-01
Governance refers to decision-making processes in which power relationships and the particular interests of actors and institutions converge. Situations of consensus and conflict are inherent to such processes. Furthermore, decision-making happens within a framework of ethical principles, motivations and incentives which may be explicit or implicit. Health systems in most Latin-American and Caribbean countries take the principles of equity, solidarity, social participation and the right to health as their guiding principles; such principles must thus rule governance processes. However, this is not always the case, and this is where the importance of investigating governance in health systems lies. Advancing the investigation of governance has conceptual and methodological implications. At the conceptual level, clarifying and integrating normative and analytical approaches is relevant, as both are necessary for an approach seeking to investigate and understand the complexity of social phenomena. At the methodological level, there is a need to expand the range of variables, sources of information and indicators for studying decision-making aimed at greater equity, health citizenship and public-policy efficiency.
Spanish methodological approach for biosphere assessment of radioactive waste disposal.
Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C
2007-10-01
The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.
Got risk? risk-centric perspective for spacecraft technology decision-making
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Cornford, Steven L.; Moran, Kelly
2004-01-01
A risk-based decision-making methodology conceived and developed at JPL and NASA has been used to aid in decision making for spacecraft technology assessment, adoption, development and operation. It takes a risk-centric perspective, through which risks are used as a reasoning step to interpose between mission objectives and risk mitigation measures.
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
77 FR 16319 - Amtrak's Petition for Determination of PRIIA Section 209 Cost Methodology
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-20
... surrounding its development, the Board concludes that the methodology will: (1) Ensure equal treatment in the....stb.dot.gov . This decision will not significantly affect either the human environment or the...
Nutley, Tara; Gnassou, Léontine; Traore, Moussa; Bosso, Abitche Edwige; Mullen, Stephanie
2014-01-01
Improving a health system requires data, but too often they are unused or under-used by decision makers. Without interventions to improve the use of data in decision making, health systems cannot meet the needs of the populations they serve. In 2008, in Côte d'Ivoire, data were largely unused in health decision-making processes. The objective was to implement and evaluate an intervention to improve the use of data in decision making in Côte d'Ivoire. From 2008 to 2012, Côte d'Ivoire sought to improve the use of national health data through an intervention that broadens participation in and builds links between data collection and decision-making processes; identifies information needs; improves data quality; builds capacity to analyze, synthesize, and interpret data; and develops policies to support data use. To assess the results, a Performance of Routine Information System Management Assessment was conducted before and after the intervention using a combination of purposeful and random sampling. In 2008, the sample consisted of the central level, 12 districts, and 119 facilities, and in 2012, the sample consisted of the central level, 20 districts, and 190 health facilities. To assess data use, we developed dichotomous indicators: discussions of analysis findings, decisions taken based on the analysis, and decisions referred to upper management for action. We aggregated the indicators to generate a composite, continuous index of data use. From 2008 to 2012, the district data-use score increased from 40 to 70%; the facility score remained the same at 38%. The central score is not reported because of a methodological difference between the two assessments. The intervention improved the use of data in decision making at the district level in Côte d'Ivoire. This study provides an example of, and guidance for, implementing a large-scale intervention to improve data-informed decision making.
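Aggregating dichotomous indicators into a composite data-use index, as described above, can be sketched as follows; the indicator names and facility records are hypothetical:

```python
# Hypothetical facility records: 1 = practice observed, 0 = not observed
indicators = ["discussed_findings", "decision_based_on_analysis",
              "referred_to_management"]

facilities = [
    {"discussed_findings": 1, "decision_based_on_analysis": 1,
     "referred_to_management": 0},
    {"discussed_findings": 1, "decision_based_on_analysis": 0,
     "referred_to_management": 0},
    {"discussed_findings": 0, "decision_based_on_analysis": 1,
     "referred_to_management": 1},
]

def data_use_score(records, indicators):
    """Composite index: share of all indicator checks that passed, as a %."""
    total = sum(r[i] for r in records for i in indicators)
    return 100.0 * total / (len(records) * len(indicators))

print(round(data_use_score(facilities, indicators)))  # → 56
```

Computing the same index before and after an intervention, on comparable samples, yields the kind of change score (e.g., 40% to 70%) reported in the study.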
Methodology for cost analysis of film-based and filmless portable chest systems
NASA Astrophysics Data System (ADS)
Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.
1996-05-01
Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
Lonsdorf, Tina B; Menz, Mareike M; Andreatta, Marta; Fullana, Miguel A; Golkar, Armita; Haaker, Jan; Heitland, Ivo; Hermann, Andrea; Kuhn, Manuel; Kruse, Onno; Meir Drexler, Shira; Meulders, Ann; Nees, Frauke; Pittig, Andre; Richter, Jan; Römer, Sonja; Shiban, Youssef; Schmitz, Anja; Straube, Benjamin; Vervliet, Bram; Wendt, Julia; Baas, Johanna M P; Merz, Christian J
2017-06-01
The so-called 'replicability crisis' has sparked methodological discussions in many areas of science in general, and in psychology in particular. This has led to recent endeavours to promote the transparency, rigour, and ultimately, replicability of research. Originating from this zeitgeist, the challenge to discuss critical issues on terminology, design, methods, and analysis considerations in fear conditioning research is taken up by this work, which involved representatives from fourteen of the major human fear conditioning laboratories in Europe. This compendium is intended to provide a basis for the development of a common procedural and terminology framework for the field of human fear conditioning. Whenever possible, we give general recommendations. When this is not feasible, we provide evidence-based guidance for methodological decisions on study design, outcome measures, and analyses. Importantly, this work is also intended to raise awareness and initiate discussions on crucial questions with respect to data collection, processing, statistical analyses, the impact of subtle procedural changes, and data reporting specifically tailored to the research on fear conditioning. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.
Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E
2016-06-21
We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
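The probabilistic comparison at the heart of this approach can be illustrated with a minimal Monte Carlo sketch: sample the impact of each alternative under parameter uncertainty and record how often one is preferable. The impact distributions, the 0.9 resolution threshold, and the pavement labels are invented for illustration; the authors' actual models are far richer.

```python
# Illustrative sketch (not the authors' implementation) of a probabilistic
# comparative assessment: draw impacts for two alternatives from assumed
# uncertainty distributions and estimate P(A has lower impact than B).
import random

random.seed(0)

def comparison_indicator(sample_a, sample_b, n=10_000):
    """Fraction of Monte Carlo draws in which alternative A has lower impact."""
    wins = sum(sample_a() < sample_b() for _ in range(n))
    return wins / n

# Hypothetical impact distributions (kg CO2-eq) for two pavement designs.
asphalt = lambda: random.gauss(100, 15)
concrete = lambda: random.gauss(120, 20)

p = comparison_indicator(asphalt, concrete)
resolved = p > 0.9  # a scenario counts as "resolvable" above a chosen threshold
```

When a comparison is not resolvable at the chosen threshold, this is the point at which the method would turn to sensitivity analysis to find the parameters worth refining.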
NASA Astrophysics Data System (ADS)
Pinon, Olivia J.
The increase in the types of airspace users (large aircraft, small and regional jets, very light jets, unmanned aerial vehicles, etc.), as well as the very limited number of future new airport development projects, are some of the factors that will characterize the next decades in air transportation. These factors, associated with persistent growth in air traffic, will worsen the current gridlock situation experienced at some major airports. As airports become the major capacity bottleneck to continued growth in air traffic, it is essential to make the most efficient use of the current, and very often underutilized, airport infrastructure. This research thus proposes to address the increase in air traffic demand and the resulting capacity issues by considering the implementation of operational concepts and technologies at underutilized airports. However, there are many challenges associated with sustaining the development of these airports. First, the need to synchronize evolving technologies with airports' needs and investment capabilities is paramount. Additionally, it was observed that the evolution of secondary airports, and their needs, is tightly linked to the environment in which they operate. In particular, the sensitivity of airports to changes in the dynamics of their environment is important, requiring that the factors that drive the need for technology acquisition be identified and characterized. Finally, the difficulty of evaluating risk and making financially viable decisions, particularly when investing in new technologies, cannot be ignored. This research provides a methodology that addresses these challenges and ensures the sustainability of airport capacity-enhancement investments in a continuously changing environment. In particular, it is articulated around the need to provide decision makers with the capability to valuate and select adaptable technology portfolios to ensure airport financial viability.
Hence, the four-step process developed in this research leverages the benefits yielded by impact assessment techniques, system dynamics modeling, and real options analysis to 1) provide the decision maker with a rigorous, structured, and traceable process for technology selection, 2) assess the combined impact of interrelated technologies, 3) support the translation of technology impact factors into airport performance indicators, and help identify the factors that drive the need for capacity expansion, and finally 4) enable the quantitative assessment of the strategic value of embedding flexibility in the formulation of technology portfolios and investment options. In particular, the development of this methodology highlights the successful implementation of relevance tree analysis, morphological analysis, filters and dependency tables to support the aforementioned process for technology selection. Further, it illustrates the limited capability of Cross Impact Analysis to identify technology relationships for the problem at hand. Finally, this methodology demonstrates, through a change in demand at the airport modeled, the importance of being able to weigh both the technological and strategic performance of the technology portfolios considered. In particular, it illustrates the impact that the level of traffic, the presence of congestion, the timing and sequence of investments, and the number of technologies included, have on the strategic value of a portfolio. Hence, by capturing the time dimension and technology causality impacts in technology portfolio selection, this work helps identify key technologies or technology groupings, and assess their performance on airport metrics. By embedding flexibility in the formulation of investment scenarios, it provides the decision maker with a more accurate picture of the options available to him, as well as the time and sequence under which these should be exercised.
ARCHITECT: The architecture-based technology evaluation and capability tradeoff method
NASA Astrophysics Data System (ADS)
Griendling, Kelly A.
The use of architectures for the design, development, and documentation of system-of-systems engineering has become a common practice in recent years. This practice became mandatory in the defense industry in 2004 when the Department of Defense Architecture Framework (DoDAF) Promulgation Memo mandated that all Department of Defense (DoD) architectures must be DoDAF compliant. Despite this mandate, there has been significant confusion and a lack of consistency in the creation and use of the architecture products. Products are typically created as static documents used for communication and documentation purposes that are difficult to change and do not support engineering design activities and acquisition decision making. At the same time, acquisition guidance has recently been reformed to move from the bottom-up approach of the Requirements Generation System (RGS) to the top-down approach mandated by the Joint Capabilities Integration and Development System (JCIDS), which requires the use of DoDAF to support acquisition. Defense agencies have had difficulty adjusting to this new policy, and are struggling to determine how to meet new acquisition requirements. This research has developed the Architecture-based Technology Evaluation and Capability Tradeoff (ARCHITECT) Methodology to respond to these challenges and address concerns raised about the defense acquisition process, particularly the time required to implement parts of the process, the need to evaluate solutions across capability and mission areas, and the need to use a rigorous, traceable, repeatable method that utilizes modeling and simulation to better substantiate early-phase acquisition decisions.
The objective is to create a capability-based systems engineering methodology for the early phases of design and acquisition (specifically Pre-Milestone A activities) which improves agility in defense acquisition by (1) streamlining the development of key elements of JCIDS and DoDAF, (2) moving the creation of DoDAF products forward in the defense acquisition process, and (3) using DoDAF products for more than documentation by integrating them into the problem definition and analysis of alternatives phases and applying executable architecting. This research proposes and demonstrates the plausibility of a prescriptive methodology for developing executable DoDAF products which will explicitly support decision-making in the early phases of JCIDS. A set of criteria by which CBAs should be judged is proposed, and the methodology is developed with these criteria in mind. The methodology integrates existing tools and techniques for systems engineering and system-of-systems engineering with several new modeling and simulation tools and techniques developed as part of this research to fill gaps noted in prior CBAs. A suppression of enemy air defenses (SEAD) mission is used to demonstrate the application of ARCHITECT and to show the plausibility of the approach. For the SEAD study, metrics are derived and a gap analysis is performed. The study then identifies and quantitatively compares system and operational architecture alternatives for performing SEAD. A series of down-selections is performed to identify promising architectures, and these promising solutions are subject to further analysis where the impacts of force structure and network structure are examined. While the numerical results of the SEAD study are notional and could not be applied to an actual SEAD CBA, the example serves to highlight many of the salient features of the methodology.
The SEAD study presented enabled pre-Milestone A tradeoffs to be performed quantitatively across a large number of architectural alternatives in a traceable and repeatable manner. The alternatives considered included variations on operations, systems, organizational responsibilities (through the assignment of systems to tasks), network (or collaboration) structure, interoperability level, and force structure. All of the information used in the study is preserved in the environment, which is dynamic and allows for on-the-fly analysis. The assumptions used were consistent, which was assured through the use of a single file documenting all inputs, shared across all models. Furthermore, a model was made of the ARCHITECT methodology itself, and was used to demonstrate that even if the steps took twice as long to perform as they did in the SEAD example, the methodology still provides the ability to conduct CBA analyses in less time than prior CBAs. Overall, it is shown that the ARCHITECT methodology results in an improvement over current CBAs on the criteria developed here.
Standards and guidelines for observational studies: quality is in the eye of the beholder.
Morton, Sally C; Costlow, Monica R; Graff, Jennifer S; Dubois, Robert W
2016-03-01
Patient care decisions demand high-quality research. To assist those decisions, numerous observational studies are being performed. Are the standards and guidelines used to assess observational studies consistent and actionable? What policies would help decision makers determine whether an observational study is of high quality and valid to inform treatment decisions? Based on a literature review and input from six experts, we compared and contrasted nine standards/guidelines using 23 methodological elements involved in observational studies (e.g., study protocol, data analysis, and so forth). Fourteen elements (61%) were addressed by at least seven standards/guidelines; on 12 of these elements, the standards/guidelines disagreed in their approach. Nine elements (39%) were addressed by six or fewer standards/guidelines. Ten elements (43%) were not actionable in at least one standard/guideline that addressed the element. The lack of agreement among observational study standards/guidelines may contribute to variation in study conduct; disparities in what is considered credible research; and ultimately, what evidence is adopted. A common set of agreed-on standards/guidelines for conducting observational studies would benefit funders, researchers, journal editors, and decision makers. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Fraser, Kimberly D; Estabrooks, Carole; Allen, Marion; Strang, Vicki
2009-03-01
Case managers make decisions that directly affect the amount and type of services home care clients receive and subsequently affect the overall available health care resources of home care programs. A recent systematic review of the literature identified significant knowledge gaps with respect to resource allocation decision making in home care. Using Spradley's methodology, we designed an ethnographic study of a children's home care program in Western Canada. The sample included 11 case managers and program leaders. Data sources included interviews, card sorts, and participant observation over a 5-month period. Data analyses included open coding and domain, taxonomic, and componential analysis. One of the key findings was a taxonomy of factors that influence case manager resource allocation decisions. The factors were grouped into one of four main categories: system-related, home care program-related, family-related, or client-related. Family-related factors have not previously been reported as influencing case manager resource allocation decision making, nor has the team's role been reported as an influencing factor. The findings of this study are examined in light of Daniels and Sabin's Accountability for Reasonableness framework, which may be useful for future knowledge development in micro-level resource allocation theory.
NASA Astrophysics Data System (ADS)
Rizzi, Jonathan; Torresan, Silvia; Gallina, Valentina; Critto, Andrea; Marcomini, Antonio
2013-04-01
Europe's coast faces a variety of climate change threats from extreme high tides, storm surges, and rising sea levels. In particular, it is very likely that mean sea level rise will contribute to upward trends in extreme coastal high water levels, thus posing higher risks to coastal locations currently experiencing coastal erosion and inundation. In 2007 the European Commission approved the Floods Directive (2007/60/EC), whose main purpose is to establish a framework for the assessment and management of flood risks for inland and coastal areas, thus reducing the adverse consequences for human health, the environment, cultural heritage, and economic activities. Improvements in scientific understanding are thus needed to inform decision making about the best strategies for mitigating and managing storm surge risks in coastal areas. The CLIMDAT project is aimed at improving the understanding of the risks related to extreme storm surge events in the coastal area of the North Adriatic Sea (Italy), considering potential climate change scenarios. The project implements a Regional Risk Assessment (RRA) methodology developed in the FP7 KULTURisk project for the assessment of physical/environmental impacts posed by flood hazards, and employs the DEcision support SYstem for Coastal climate change impact assessment (DESYCO) for the application of the methodology to the case study area. The proposed RRA methodology is aimed at the identification and prioritization of targets and areas at risk from water-related natural hazards in the considered region at the meso-scale. To this aim, it integrates information about extreme storm surges with bio-geophysical and socio-economic information (e.g. vegetation cover, slope, soil type, population density) on the analyzed receptors (i.e. people, economic activities, cultural heritage, natural and semi-natural systems).
Extreme storm surge hazard scenarios are defined using tide gauge time series from 28 stations located along the North Adriatic coast from 1989 to 2011. These data, together with the sea-level rise scenarios for the considered future timeframe, are the input for the application of the Joint Probability method (Pugh and Vassie, 1979), which allows the evaluation of the maximum height of extreme storm surge events with different return periods and the number of extreme events per year. The methodology uses Geographic Information Systems to manage, process, analyse, and visualize data, and employs Multi-Criteria Decision Analysis to integrate stakeholders' preferences and experts' judgments into the analysis in order to obtain a total risk index for the considered region. The final outputs are GIS-based risk maps which allow the communication of the potential consequences of extreme storm surges to decision makers and stakeholders. Moreover, they can support the establishment of relative priorities for intervention through the identification of suitable areas for human settlements, infrastructures, and economic activities. Finally, the produced output can represent a basis for the definition of storm surge hazard and risk management plans according to the Floods Directive. The preliminary results of the RRA application in the CLIMDAT project are presented and discussed.
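The joint probability idea (after Pugh and Vassie), treating tide and surge as independent and combining their empirical distributions to estimate exceedance probabilities of total sea level, can be sketched as below. The discretization and toy data are assumptions for illustration, not values derived from the 28 tide gauge stations.

```python
# Hedged sketch of the joint probability method: convolve an empirical tide
# distribution with an empirical surge-residual distribution to get
# P(tide + surge > level). Real applications use long hourly records.
from collections import Counter

def exceedance_prob(tides, surges, level):
    """P(tide + surge > level), with tide and surge treated as independent."""
    tide_pmf = {v: c / len(tides) for v, c in Counter(tides).items()}
    surge_pmf = {v: c / len(surges) for v, c in Counter(surges).items()}
    return sum(pt * ps
               for t, pt in tide_pmf.items()
               for s, ps in surge_pmf.items()
               if t + s > level)

tides = [0.2, 0.4, 0.6, 0.8]   # hourly tidal heights (m), toy data
surges = [0.0, 0.0, 0.1, 0.5]  # surge residuals (m), toy data
p = exceedance_prob(tides, surges, 1.0)
# For hourly data, the return period of the level is 1 / p hours (if p > 0).
```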
Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis
Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance (BSV) community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
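A minimal additive multi-attribute utility sketch shows the kind of scoring such a framework might perform on candidate data streams. The criteria, weights, and per-stream scores here are hypothetical; the abstract does not state the actual attributes or utility functions used.

```python
# Hypothetical multi-attribute utility scoring: each data stream gets a
# weighted sum of normalized criterion scores (all in [0, 1]), and streams
# are ranked by total utility. Additivity is a common but assumed model form.
def additive_utility(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}  # invented
data_streams = {
    "clinic_reports": {"timeliness": 0.6, "coverage": 0.9, "cost": 0.5},
    "web_queries":    {"timeliness": 0.9, "coverage": 0.5, "cost": 0.9},
}
ranked = sorted(data_streams,
                key=lambda s: additive_utility(data_streams[s], weights),
                reverse=True)
```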
Rothermundt, Christian; Bailey, Alexandra; Cerbone, Linda; Eisen, Tim; Escudier, Bernard; Gillessen, Silke; Grünwald, Viktor; Larkin, James; McDermott, David; Oldenburg, Jan; Porta, Camillo; Rini, Brian; Schmidinger, Manuela; Sternberg, Cora; Putora, Paul M
2015-09-01
With the advent of targeted therapies, many treatment options in the first-line setting of metastatic clear cell renal cell carcinoma (mccRCC) have emerged. Guidelines and randomized trial reports usually do not elucidate the decision criteria for the different treatment options. In order to extract the decision criteria for the optimal therapy for patients, we performed an analysis of treatment algorithms from experts in the field. Treatment algorithms for the treatment of mccRCC from experts of 11 institutions were obtained, and decision trees were deduced. Treatment options were identified and a list of unified decision criteria determined. The final decision trees were analyzed with a methodology based on diagnostic nodes, which allows for an automated cross-comparison of decision trees. The most common treatment recommendations were determined, and areas of discordance were identified. The analysis revealed heterogeneity in most clinical scenarios. The recommendations selected for first-line treatment of mccRCC included sunitinib, pazopanib, temsirolimus, interferon-α combined with bevacizumab, high-dose interleukin-2, sorafenib, axitinib, everolimus, and best supportive care. The criteria relevant for treatment decisions were performance status, Memorial Sloan Kettering Cancer Center risk group, only or mainly lung metastases, cardiac insufficiency, hepatic insufficiency, age, and "zugzwang" (composite of multiple, related criteria). In the present study, we used diagnostic nodes to compare treatment algorithms in the first-line treatment of mccRCC. The results illustrate the heterogeneity of the decision criteria and treatment strategies for mccRCC and how available data are interpreted and implemented differently among experts. 
The data provided in the present report should not be considered to serve as treatment recommendations for the management of treatment-naïve patients with multiple metastases from metastatic clear cell renal cell carcinoma outside a clinical trial; however, the data highlight the different treatment options and the criteria used to select them. The diversity in decision making and how results from phase III trials can be interpreted and implemented differently in daily practice are demonstrated. ©AlphaMed Press.
Towards a Computational Analysis of Status and Leadership Styles on FDA Panels
NASA Astrophysics Data System (ADS)
Broniatowski, David A.; Magee, Christopher L.
Decisions by committees of technical experts are increasingly impacting society. These decision-makers are typically embedded within a web of social relations. Taken as a whole, these relations define an implicit social structure which can influence the decision outcome. Aspects of this structure are founded on interpersonal affinity between parties to the negotiation, on assigned roles, and on the recognition of status characteristics, such as relevant domain expertise. This paper builds upon a methodology aimed at extracting an explicit representation of such social structures using meeting transcripts as a data source. Whereas earlier results demonstrated that the method presented here can identify groups of decision-makers with a contextual affinity (i.e., membership in a given medical specialty or voting clique), we now can extract meaningful status hierarchies and can identify differing facilitation styles among committee chairs. Use of this method is demonstrated on transcripts of U.S. Food and Drug Administration (FDA) advisory panel meetings; nevertheless, the approach presented here is extensible to other domains and requires only a meeting transcript as input.
The SIMRAND methodology - Simulation of Research and Development Projects
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1984-01-01
In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm
NASA Astrophysics Data System (ADS)
Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel
2016-02-01
An innovative methodology based on stringency levels is proposed in this paper to improve the current method of selecting structural materials for demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements, based on a novel deterministic algorithm, to prevent potential failures. We have applied the new methodology to different standardized specifications used in pressure vessel design, such as SA-533 Grade B Cl.1 and SA-508 Cl.3 (issued by the American Society of Mechanical Engineers), DIN 20MnMoNi55 (issued by the German Institute of Standardization), and 16MND5 (issued by the French Nuclear Commission), and determine the influence of design code selection. This study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard has been developed using a deterministic algorithm. Relative weights were assigned by consulting a panel of experts in materials selection for reactor pressure vessels, providing a more objective methodology and greatly simplifying the mathematical calculations for quantitative analysis. The final results show that steel DIN 20MnMoNi55 is the best material option. Additionally, the more recently developed materials DIN 20MnMoNi55, 16MND5, and SA-508 Cl.3 exhibit more stringent mechanical requirements than SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in the selection of materials for a wide range of applications.
Cognitive Systems Modeling and Analysis of Command and Control Systems
NASA Technical Reports Server (NTRS)
Norlander, Arne
2012-01-01
Military operations, counter-terrorism operations and emergency response often oblige operators and commanders to operate within distributed organizations and systems for safe and effective mission accomplishment. Tactical commanders and operators frequently encounter violent threats and critical demands on cognitive capacity and reaction time. In the future they will make decisions in situations where operational and system characteristics are highly dynamic and non-linear, i.e. minor events, decisions or actions may have serious and irreversible consequences for the entire mission. Commanders and other decision makers must manage true real time properties at all levels; individual operators, stand-alone technical systems, higher-order integrated human-machine systems and joint operations forces alike. Coping with these conditions in performance assessment, system development and operational testing is a challenge for both practitioners and researchers. This paper reports on research from which the results led to a breakthrough: An integrated approach to information-centered systems analysis to support future command and control systems research development. This approach integrates several areas of research into a coherent framework, Action Control Theory (ACT). It comprises measurement techniques and methodological advances that facilitate a more accurate and deeper understanding of the operational environment, its agents, actors and effectors, generating new and updated models. This in turn generates theoretical advances. Some good examples of successful approaches are found in the research areas of cognitive systems engineering, systems theory, and psychophysiology, and in the fields of dynamic, distributed decision making and naturalistic decision making.
Web-based health services and clinical decision support.
Jegelevicius, Darius; Marozas, Vaidotas; Lukosevicius, Arunas; Patasius, Martynas
2004-01-01
The purpose of this study was the development of a Web-based e-health service for comprehensive assistance and clinical decision support. The service structure consists of a Web server, a PHP-based Web interface linked to a clinical SQL database, Java applets for interactive manipulation and visualization of signals, and a Matlab server linked with signal and data processing algorithms implemented as Matlab programs. The service provides clinical decision support based on diagnostic signal and image analysis. Using the discussed methodology, a pilot service for pathology specialists for automatic calculation of the proliferation index has been developed. Physicians use a simple Web interface to upload the images under investigation to the server; a Java applet interface is then used for outlining the region of interest and, after processing on the server, the requested proliferation index value is calculated. There is also an "expert corner", where experts can submit their index estimates and comments on particular images, which is especially important for system developers. These expert evaluations are used for optimization and verification of the automatic analysis algorithms. Decision support trials have been conducted for ECG and for ophthalmic ultrasound investigations of intraocular tumor differentiation. Data mining algorithms have been applied and decision support trees constructed. These services, too, are being implemented as a Web-based system. The study has shown that the Web-based structure ensures more effective, flexible, and accessible services compared with standalone programs and is very convenient for biomedical engineers and physicians, especially in the development phase.
Ciplak, Nesli
2015-08-01
The aim of this paper is to identify the best possible health care waste management option in the West Black Sea Region by taking into account economic, social, environmental, and technical aspects within the framework of multi-criteria decision analysis. In the scope of this research, three different health care waste management scenarios consisting of different technology alternatives were developed and compared using decision-making computer software, called Right Choice, by identifying various criteria, measuring them, and ranking their relative importance from the point of view of key stakeholders. The results of the study show that the decentralized autoclave technology option, coupled with disposal through landfilling with energy recovery, has the potential to be the optimum option for the health care waste management system, and that an efficient health care waste segregation scheme should be given more attention by the authorities in the region. Furthermore, the discussion of the results points out the need for a multidisciplinary approach and for an equilibrium between social, environmental, economic, and technical criteria. The methodology used in this research was developed to enable decision makers to gain an increased perception of a decision problem. In general, the results and remarks of this study can be used as a basis for future planning and anticipation of investment needs in the area of health care waste management in the region, and also in developing countries dealing with similar waste management problems.
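The abstract describes scoring scenarios against weighted criteria, which in its simplest form is a weighted-sum multi-criteria comparison. The sketch below shows that mechanism only; the scenario names, the 0-1 scores, and the weights are illustrative assumptions, not figures from the study or from the Right Choice software.

```python
# Minimal weighted-sum multi-criteria decision analysis (MCDA) sketch.
# Scores are normalized to [0, 1] (higher is better); weights sum to 1
# and reflect stakeholder-assigned relative importance.

criteria_weights = {"economic": 0.3, "social": 0.2,
                    "environmental": 0.3, "technical": 0.2}

scenario_scores = {
    "centralized incineration":
        {"economic": 0.5, "social": 0.4, "environmental": 0.3, "technical": 0.7},
    "decentralized autoclave + landfill with energy recovery":
        {"economic": 0.7, "social": 0.6, "environmental": 0.8, "technical": 0.6},
    "off-site microwave treatment":
        {"economic": 0.6, "social": 0.5, "environmental": 0.6, "technical": 0.5},
}

def weighted_score(scores, weights):
    """Aggregate one scenario's criterion scores into a single value."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank scenarios from best to worst aggregate score.
ranking = sorted(scenario_scores,
                 key=lambda s: weighted_score(scenario_scores[s], criteria_weights),
                 reverse=True)
```

A sensitivity analysis on the weights (e.g., shifting importance between environmental and economic criteria) is the usual next step before trusting such a ranking.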
Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.
NASA Astrophysics Data System (ADS)
Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan
2017-09-01
Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated with these interventions into the LCC models. Reliability, Availability, Maintainability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way, using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS into the LCC analysis makes it possible to obtain reliable predictions of system maintenance costs and, through sensitivity analyses, of the dependence of these costs on specific cost drivers. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition- and risk-based planning of maintenance activities, as well as for decision support in long term strategic investment planning.
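The core coupling the abstract describes can be sketched very simply: a RAMS reliability parameter (a failure rate estimated from registered maintenance records) drives the expected corrective-maintenance cost per year, which is then discounted over the planning horizon in the LCC model. This is a generic illustration under stated assumptions (constant failure rate, constant intervention cost), not the INFRALERT methodology itself; all numbers are invented.

```python
# Hedged sketch: RAMS-informed life-cycle maintenance cost.
# Expected yearly cost = failure rate (failures/year) x cost per
# corrective intervention; LCC discounts this over the horizon.

def lcc_maintenance(failure_rate: float,
                    cost_per_intervention: float,
                    years: int,
                    discount_rate: float) -> float:
    """Discounted expected corrective-maintenance cost over the life cycle."""
    return sum(
        failure_rate * cost_per_intervention / (1.0 + discount_rate) ** t
        for t in range(1, years + 1)
    )

# Illustrative asset: 0.8 failures/year, 12,000 per intervention,
# 30-year horizon, 4% discount rate.
total = lcc_maintenance(0.8, 12_000.0, 30, 0.04)
```

A sensitivity analysis would repeat this calculation while perturbing the failure rate or discount rate to expose the cost drivers the abstract mentions.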
Molinos-Senante, M; Garrido-Baserba, M; Reif, R; Hernández-Sancho, F; Poch, M
2012-06-15
The preliminary design and economic assessment of small wastewater treatment plants (fewer than 2000 population equivalent) are issues of particular interest, since wastewaters from most of these agglomerations are not yet covered. This work aims to assess nine different technology set-ups for secondary treatment in this type of facility, embracing both economic and environmental parameters. The main novelty of this work is the combination of an innovative environmental decision support system (EDSS) with a pioneering approach based on the inclusion of the environmental benefits derived from wastewater treatment. The integration of methodologies based on cost-benefit analysis tools with the vast amount of knowledge on treatment technologies contained in the EDSS was applied to nine scenarios comprising different wastewater characteristics and reuse options. Hence, a useful economic feasibility indicator is obtained for each technology, including internal and external costs and, for the first time, the benefits associated with the environmental damage avoided. This new methodology proved to be crucial for supporting the decision process, contributing to the sustainability of new treatment facilities and allowing the selection of the most feasible technologies from a wide set of possibilities.
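The feasibility indicator described above balances internal costs, external costs, and the monetized environmental damage avoided. A minimal sketch of that balance follows; the function name, the per-cubic-metre framing, and all figures are illustrative assumptions rather than values from the EDSS study.

```python
# Hedged sketch of a cost-benefit feasibility indicator for a treatment
# technology: environmental benefit (damage avoided, monetized) minus
# internal plus external costs. Positive values favor the technology.

def feasibility_indicator(internal_cost: float,
                          external_cost: float,
                          environmental_benefit: float) -> float:
    """Net monetary balance per unit of treated wastewater."""
    return environmental_benefit - (internal_cost + external_cost)

# Illustrative example (EUR per m3 treated) for one technology/scenario:
net = feasibility_indicator(internal_cost=0.45,
                            external_cost=0.10,
                            environmental_benefit=0.70)
```

Ranking the nine technology set-ups by such an indicator across the nine scenarios is one way the described selection could proceed.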
ERIC Educational Resources Information Center
Herbst, Patricio; Chazan, Daniel; Kosko, Karl W.; Dimmel, Justin; Erickson, Ander
2016-01-01
This paper describes instruments designed to use multimedia to study at scale the instructional decisions that mathematics teachers make, as well as teachers' recognition of elements of the context of their work that might influence those decisions. This methodological contribution shows how evidence of constructs like instructional norm and…
2007-03-01
[Extraction fragment: remnants of a facility table listing building numbers, names (Congress Facility, Hazardous Material Storage Shed, Aircraft Research Labs, Reserve Forces building, Engineering Admin. Building, Area B Gas Station), and associated cost/usage figures; the surrounding abstract text was not recovered.]
2009-05-01
gangs. Important aspects of these are the concept of micro locations, or "set space", where gangs tend to locate (Tita et al. 2005) and patterns of ... spatial diffusion of gang activity (Cohen and Tita 1999, Tita and Cohen 2004). A particularly promising approach is the combination of concepts from ... matches their social interaction (Tita 2007, Tita and Ridgeway 2007). An illustration of the incorporation of insights from a spatial analysis into
1997-09-01
California has made outcomes research a vital priority, as evidence-based medicine will soon dictate breast cancer practice patterns and insurance coverage ... results reported to date. I also emphasize outcomes research methodology in an attempt to define treatment guidelines from an evidence-based medicine approach ... techniques such as decision analysis, cost-effectiveness, and evidence-based medicine. The goal of the new inpatient service is to optimize the value of
Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.
Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B
2016-01-01
Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database from early 2012 to late 2015 of hospital admissions to Inselspital Bern, the largest Swiss University Hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodologies provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle EHR data with a significant number of missing records (more than 50% missing for some of the studied features), eliminating the need for imputation in order to obtain complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.
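The AUC figures quoted above can be computed without plotting any ROC curve: AUC equals the probability that a randomly drawn positive case (in-hospital death) receives a higher risk score than a randomly drawn negative case (the Mann-Whitney formulation). A minimal pure-Python sketch with illustrative scores, not data from the study:

```python
# Rank-based AUC: the fraction of (positive, negative) pairs where the
# positive case is scored higher; ties count as half a concordant pair.

def auc(scores_pos, scores_neg):
    """Mann-Whitney AUC from raw risk scores of the two outcome groups."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# A perfectly separating scorer gives AUC = 1.0; an uninformative one ~0.5.
example = auc([0.9, 0.8, 0.7], [0.6, 0.5, 0.4, 0.3])
```

Comparing models such as ALaRMS (0.858) and penalized logistic regression (0.872) amounts to comparing exactly this statistic on held-out admissions; the O(n*m) pairwise loop here is for clarity, and production code would use a rank-based O(n log n) computation.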
NASA Astrophysics Data System (ADS)
Winkler, Julie A.; Palutikof, Jean P.; Andresen, Jeffrey A.; Goodess, Clare M.
1997-10-01
Empirical transfer functions have been proposed as a means for `downscaling' simulations from general circulation models (GCMs) to the local scale. However, subjective decisions made during the development of these functions may influence the ensuing climate scenarios. This research evaluated the sensitivity of a selected empirical transfer function methodology to 1) the definition of the seasons for which separate specification equations are derived, 2) adjustments for known departures of the GCM simulations of the predictor variables from observations, 3) the length of the calibration period, 4) the choice of function form, and 5) the choice of predictor variables. A modified version of the Climatological Projection by Model Statistics method was employed to generate control (1 × CO2) and perturbed (2 × CO2) scenarios of daily maximum and minimum temperature for two locations with diverse climates (Alcantarilla, Spain, and Eau Claire, Michigan). The GCM simulations used in the scenario development were from the Canadian Climate Centre second-generation model (CCC GCMII). Variations in the downscaling methodology were found to have a statistically significant impact on the 2 × CO2 climate scenarios, even though the 1 × CO2 scenarios for the different transfer function approaches were often similar. The daily temperature scenarios for Alcantarilla and Eau Claire were most sensitive to the decision to adjust for deficiencies in the GCM simulations, the choice of predictor variables, and the seasonal definitions used to derive the functions (i.e., fixed seasons, floating seasons, or no seasons).
The scenarios were less sensitive to the choice of function form (i.e., linear versus nonlinear) and to an increase in the length of the calibration period. The results of Part I, which identified significant departures of the CCC GCMII simulations of two candidate predictor variables from observations, together with those presented here in Part II, 1) illustrate the importance of detailed comparisons of observed and GCM 1 × CO2 series of candidate predictor variables as an initial step in impact analysis, 2) demonstrate that decisions made when developing the transfer functions can have a substantial influence on the 2 × CO2 scenarios and their interpretation, 3) highlight the uncertainty in the appropriate criteria for evaluating transfer function approaches, and 4) suggest that automation of empirical transfer function methodologies is inappropriate because of differences in the performance of transfer functions between sites and because of spatial differences in the GCM's ability to adequately simulate the predictor variables used in the functions.
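The linear transfer-function idea evaluated above can be illustrated in a few lines: calibrate an ordinary-least-squares relation between an observed local temperature and a large-scale GCM predictor over the calibration period, then apply it to perturbed (2 × CO2) predictor values. The data below are invented for illustration and are not CCC GCMII output.

```python
# Hedged sketch of a linear empirical transfer function for downscaling.

def fit_linear(x, y):
    """OLS intercept and slope for y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Calibration period: observed local Tmax vs a GCM-scale predictor (1 x CO2).
predictor_1xco2 = [10.0, 12.0, 14.0, 16.0, 18.0]
observed_tmax   = [15.0, 16.5, 18.0, 19.5, 21.0]
a, b = fit_linear(predictor_1xco2, observed_tmax)

# Scenario step: downscale a perturbed (2 x CO2) predictor value.
tmax_2xco2 = a + b * 20.0
```

The sensitivities reported above correspond to varying pieces of this pipeline: which predictor goes in `x`, whether its GCM values are bias-adjusted first, how many days enter the calibration set, and whether the fit is done per season or for the whole year.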
Trial-Based Economic Evaluations in Occupational Health
van Wier, Marieke F.; Tompa, Emile; Bongers, Paulien M.; van der Beek, Allard J.; van Tulder, Maurits W.; Bosmans, Judith E.
2014-01-01
To allocate available resources as efficiently as possible, decision makers need information on the relative economic merits of occupational health and safety (OHS) interventions. Economic evaluations can provide this information by comparing the costs and consequences of alternatives. Nevertheless, only a few of the studies that consider the effectiveness of OHS interventions take the extra step of considering their resource implications. Moreover, the methodological quality of those that do is generally poor. Therefore, this study aims to help occupational health researchers conduct high-quality trial-based economic evaluations by discussing the theory and methodology that underlie them, and by providing recommendations for good practice regarding their design, analysis, and reporting. This study also helps consumers of this literature with understanding and critically appraising trial-based economic evaluations of OHS interventions. PMID:24854249
Trial-based economic evaluations in occupational health: principles, methods, and recommendations.
van Dongen, Johanna M; van Wier, Marieke F; Tompa, Emile; Bongers, Paulien M; van der Beek, Allard J; van Tulder, Maurits W; Bosmans, Judith E
2014-06-01
To allocate available resources as efficiently as possible, decision makers need information on the relative economic merits of occupational health and safety (OHS) interventions. Economic evaluations can provide this information by comparing the costs and consequences of alternatives. Nevertheless, only a few of the studies that consider the effectiveness of OHS interventions take the extra step of considering their resource implications. Moreover, the methodological quality of those that do is generally poor. Therefore, this study aims to help occupational health researchers conduct high-quality trial-based economic evaluations by discussing the theory and methodology that underlie them, and by providing recommendations for good practice regarding their design, analysis, and reporting. This study also helps consumers of this literature with understanding and critically appraising trial-based economic evaluations of OHS interventions.
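The central comparison in such a trial-based economic evaluation is the incremental cost-effectiveness ratio (ICER): the extra cost of an intervention relative to an alternative, divided by its extra effect. The sketch below shows this standard calculation; the numbers and the sickness-absence framing are illustrative assumptions, not results from the study.

```python
# Standard ICER calculation used in trial-based economic evaluations:
# incremental cost per incremental unit of effect versus the comparator.

def icer(cost_new: float, effect_new: float,
         cost_old: float, effect_old: float) -> float:
    """Incremental cost-effectiveness ratio of 'new' versus 'old'."""
    d_effect = effect_new - effect_old
    if d_effect == 0:
        raise ZeroDivisionError("no incremental effect; ICER undefined")
    return (cost_new - cost_old) / d_effect

# Illustrative OHS example: an intervention costing 1,200 vs 800 per worker
# that averts 0.10 vs 0.05 sickness-absence episodes per worker.
value = icer(1200.0, 0.10, 800.0, 0.05)  # cost per extra episode averted
```

In practice, uncertainty around such point estimates is characterized with bootstrapped cost-effect pairs and cost-effectiveness acceptability curves, which is part of the good-practice analysis the paper recommends.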
Kemp, Candace L.; Ball, Mary M.; Morgan, Jennifer Craft; Doyle, Patrick J.; Burgess, Elisabeth O.; Dillard, Joy A.; Barmon, Christina E.; Fitzroy, Andrea F.; Helmly, Victoria E.; Avent, Elizabeth S.; Perkins, Molly M.
2018-01-01
In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents’ care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building. PMID:27651072