Decision Analysis Techniques for Adult Learners: Application to Leadership
ERIC Educational Resources Information Center
Toosi, Farah
2017-01-01
Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…
Group decision-making techniques for natural resource management applications
Coughlan, Beth A.K.; Armour, Carl L.
1992-01-01
This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study, and it is applicable to natural resource management issues.
Fuzzy rationality and parameter elicitation in decision analysis
NASA Astrophysics Data System (ADS)
Nikolova, Natalia D.; Tenekedjiev, Kiril I.
2010-07-01
It is widely recognised by decision analysts that real decision-makers always make estimates in interval form. An overview is given of techniques for finding an optimal alternative among alternatives with imprecise, interval probabilities. Scalarisation methods are outlined as the most appropriate. A natural continuation of such techniques is fuzzy rational (FR) decision analysis. A detailed account of the elicitation process as influenced by fuzzy rationality is given. The interval character of the probabilities leads to the introduction of ribbon functions, whose general form and special cases are compared with p-boxes. As demonstrated, the approximation of utilities in FR decision analysis does not depend on the probabilities, but the approximation of probabilities does depend on preferences.
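As a generic illustration of choosing under interval probabilities (a minimal sketch of the underlying idea, not the ribbon-function formalism of this paper), one can bound the expected utility of an alternative over all probability vectors consistent with the given intervals. The extrema of this linear functional occur at vertices of the feasible polytope, where all but one coordinate sits at a bound:

```python
import numpy as np
from itertools import product

def expected_utility_bounds(p_lo, p_hi, utils):
    """Lower and upper expected utility over all probability vectors p
    with p_lo <= p <= p_hi and sum(p) == 1 (interval probabilities).

    Vertices of this polytope have all but one coordinate at a bound;
    enumerate them (fine for a small number of states)."""
    p_lo, p_hi, utils = map(np.asarray, (p_lo, p_hi, utils))
    n = len(utils)
    eus = []
    for free in range(n):
        others = [i for i in range(n) if i != free]
        for combo in product(*[(p_lo[i], p_hi[i]) for i in others]):
            p = np.empty(n)
            p[others] = combo
            p[free] = 1.0 - sum(combo)   # remaining probability mass
            if p_lo[free] - 1e-12 <= p[free] <= p_hi[free] + 1e-12:
                eus.append(float(p @ utils))
    return min(eus), max(eus)

# Three states with illustrative interval probabilities and utilities.
lo_eu, hi_eu = expected_utility_bounds(
    p_lo=[0.2, 0.3, 0.1], p_hi=[0.5, 0.6, 0.3], utils=[1.0, 0.4, 0.0])
```

An alternative whose lower bound exceeds another's upper bound dominates it regardless of which admissible probability vector is "true"; overlapping intervals are where scalarisation methods come into play.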
DECISION ANALYSIS OF INCINERATION COSTS IN SUPERFUND SITE REMEDIATION
This study examines the decision-making process of the remedial design (RD) phase of on-site incineration projects conducted at Superfund sites. Decisions made during RD affect the cost and schedule of remedial action (RA). Decision analysis techniques are used to determine the...
The effect of uncertainties in distance-based ranking methods for multi-criteria decision making
NASA Astrophysics Data System (ADS)
Jaini, Nor I.; Utyuzhnikov, Sergei V.
2017-08-01
Data in multi-criteria decision making are often imprecise and changeable, so it is important to carry out a sensitivity analysis for the multi-criteria decision-making problem. This paper presents a sensitivity analysis for several ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered: the first relates to the input data, while the second concerns the decision maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance method, and the trade-off ranking method. TOPSIS and the relative distance method measure the distance from an alternative to the ideal and anti-ideal solutions. In turn, trade-off ranking calculates the distance of an alternative to the extreme solutions and to the other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
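As a reference point for the distance-based ranking methods this abstract compares, here is a minimal TOPSIS sketch (the decision matrix, weights, and benefit/cost directions below are invented for illustration, not taken from the study):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix : (m, n) array, rows = alternatives, cols = criteria
    weights: (n,) criteria weights summing to 1
    benefit: (n,) bool, True where larger values are better
    """
    m = np.asarray(matrix, dtype=float)
    # Vector-normalise each criterion column, then apply weights.
    v = (m / np.linalg.norm(m, axis=0)) * weights
    # Ideal and anti-ideal solutions, criterion by criterion.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)   # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)              # closeness: higher = better

# Three alternatives, three criteria: cost (lower better), two benefits.
scores = topsis([[250, 16, 12], [200, 16, 8], [300, 32, 16]],
                weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([False, True, True]))
ranking = np.argsort(-scores)   # best alternative first
```

The sensitivity tests the paper describes amount to perturbing `matrix` or `weights` and checking whether `ranking` changes.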
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
ERIC Educational Resources Information Center
Landmesser, John Andrew
2014-01-01
Information technology (IT) investment decision makers are required to process large volumes of complex data. An existing body of knowledge relevant to IT portfolio management (PfM), decision analysis, visual comprehension of large volumes of information, and IT investment decision making suggest Multi-Criteria Decision Making (MCDM) and…
Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard
2016-01-01
In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using the principal component analysis; and the estimation of criteria weights and their descriptive statistics using the variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. The criteria were ranked from 1-5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated as well as the standard deviation and 95% credibility interval. ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights.
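The abstract does not spell out ELICIT's algorithm, but the general idea of deriving weight distributions from a strict rank order via Monte Carlo can be sketched as follows (a generic rank-constrained simulation, assumed for illustration; it is not necessarily ELICIT's variable interdependent analysis):

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_constrained_weights(n_criteria, n_draws=10_000, rng=rng):
    """Monte Carlo weights consistent with a strict ranking 1 > 2 > ... > n.

    Draw points uniformly on the simplex (flat Dirichlet), then sort each
    draw in descending order so component i is the weight of the i-th
    ranked criterion. Returns all draws for summary statistics."""
    draws = rng.dirichlet(np.ones(n_criteria), size=n_draws)
    return -np.sort(-draws, axis=1)   # descending within each draw

# Five criteria ranked 1-5, as in the eye-surgery case study.
draws = rank_constrained_weights(5)
mean_w = draws.mean(axis=0)                       # deterministic weight estimate
sd_w = draws.std(axis=0)                          # standard deviation
ci_w = np.percentile(draws, [2.5, 97.5], axis=0)  # 95% credibility interval
```

Each criterion thus gets a point estimate plus dispersion measures, mirroring the deterministic weight, standard deviation, and 95% credibility interval reported in the abstract.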
Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R; Salek, Sam
2017-01-01
Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have been increasingly using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of the decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing the quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words, and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of the techniques identified in the review. Due to the variation in the studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, of which 7 were developed specifically to assess decision making in medicines development, regulatory review, or HTA; 2 examined corporate decision making; and 4 addressed general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS). Of these, only the QoDoS could be applied to assess the decision making of both individuals and organizations, and it possessed the generalizability to capture issues relevant to companies as well as regulatory authorities.
Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating quality decision making, with no consensus around a gold standard. This review has identified QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines and the next steps would be to further test its validity, sensitivity, and reliability.
The Process of Life Cycle Cost Analysis: Projecting Economic Consequences of Design Decisions
ERIC Educational Resources Information Center
AIA Journal, 1976
1976-01-01
Life-cycle cost analysis deals with both present and future costs and attempts to relate the two as a basis for making decisions. This article lays the groundwork for a better understanding of the techniques of life-cycle cost analysis. (Author/MLF)
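Life-cycle cost analysis relates present and future costs through discounting. A minimal sketch (all dollar figures and the discount rate are invented for illustration):

```python
def present_value(amount, years, rate):
    """Discount a future cost back to its present value."""
    return amount / (1 + rate) ** years

def life_cycle_cost(initial, annual_costs, rate):
    """Initial cost plus the discounted stream of future annual costs.

    annual_costs[t] is the cost incurred at the end of year t+1."""
    return initial + sum(present_value(c, t + 1, rate)
                         for t, c in enumerate(annual_costs))

# Design A: cheap to build, expensive to operate; Design B: the reverse.
lcc_a = life_cycle_cost(100_000, [12_000] * 20, rate=0.05)
lcc_b = life_cycle_cost(160_000, [5_000] * 20, rate=0.05)
better = "A" if lcc_a < lcc_b else "B"
```

With these assumed figures, the design with the higher first cost wins over a 20-year horizon, which is exactly the kind of trade-off the article argues first-cost comparisons miss.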
Decision modeling for fire incident analysis
Donald G. MacGregor; Armando González-Cabán
2009-01-01
This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.
ERIC Educational Resources Information Center
Lynch, James; And Others
1996-01-01
Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…
2015-10-28
techniques such as regression analysis, correlation, and multicollinearity assessment are used to identify the change and error in the inputs to the model… Because of correlation between many of the independent or predictor variables, the issue of multicollinearity may arise [18].
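A standard multicollinearity assessment of the kind this excerpt mentions is the variance inflation factor (VIF). A minimal sketch with synthetic data (the predictors below are made up to show the effect):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of design matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept). Values above ~10 are a
    common rule of thumb for problematic multicollinearity."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)               # independent predictor
vifs = vif(np.column_stack([x1, x2, x3]))
```

The two collinear columns produce very large VIFs while the independent one stays near 1, flagging which inputs inflate the error in the model's coefficient estimates.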
Studying Parental Decision Making with Micro-Computers: The CPSI Technique.
ERIC Educational Resources Information Center
Holden, George W.
A technique for studying how parents think, make decisions, and solve childrearing problems, Computer-Presented Social Interactions (CPSI), is described. Two studies involving CPSI are presented. The first study concerns a common parental cognitive task: causal analysis of an undesired behavior. The task was to diagnose the cause of non-contingent…
User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases
ERIC Educational Resources Information Center
Hartley, Roger; Almuhaidib, Saud M. Y.
2007-01-01
Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…
Multiobjective Decision Analysis With Engineering and Business Applications
NASA Astrophysics Data System (ADS)
Wood, Eric
The last 15 years have witnessed the development of a large number of multiobjective decision techniques. Applying these techniques to environmental, engineering, and business problems has become well accepted. Multiobjective Decision Analysis With Engineering and Business Applications attempts to cover the main multiobjective techniques both in their mathematical treatment and in their application to real-world problems. The book is divided into 12 chapters plus three appendices. The main portion of the book comprises chapters 3-6, where the various approaches are identified, classified, and reviewed. Chapter 3 covers methods for generating nondominated solutions; chapter 4, continuous methods with prior preference articulation; chapter 5, discrete methods with prior preference articulation; and chapter 6, methods of progressive articulation of preferences. In these four chapters, close to 20 techniques are discussed with over 20 illustrative examples. This is both a strength and a weakness; the breadth of techniques and examples provides comprehensive coverage, but in a style too mathematically compact for most readers. By my count, the presentation of the 20 techniques in chapters 3-6 covers 85 pages, an average of about 4.5 pages each; a sound basis in linear algebra and linear programming is therefore required if the reader hopes to follow the material. Chapter 2, "Concepts in Multiobjective Analysis," also assumes such a background.
Predicting Effective Course Conduction Strategy Using Datamining Techniques
ERIC Educational Resources Information Center
Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.
2017-01-01
Data analysis techniques can be used to analyze the pattern of data in different fields. Based on the analysis' results, it is recommended that suggestions be provided to decision making authorities. The data mining techniques can be used in educational domain to improve the outcome of the educational sectors. The authors carried out this research…
Achillas, Charisios; Moussiopoulos, Nicolas; Karagiannidis, Avraam; Banias, Georgias; Perkoulidis, George
2013-02-01
Problems in waste management have become more and more complex in recent decades. The increasing volumes of waste produced and growing social environmental consciousness are prominent drivers pushing environmental managers towards a sustainable waste management scheme. In practice, however, there are many factors, influences, and often mutually conflicting criteria involved in finding solutions to real-life applications. This paper presents a review of the literature on multi-criteria decision aiding in waste management problems for all reported waste streams. Despite limitations, which are clearly stated, most of the work published in this field is reviewed. The review aims to provide environmental managers and decision-makers with a thorough list of practical applications of the multi-criteria decision analysis techniques used to solve real-life waste management problems, as well as the criteria most often employed in such applications according to the nature of the problem under study. Moreover, the paper explores the advantages and disadvantages of using multi-criteria decision analysis techniques in waste management problems in comparison to other available alternatives.
Lo, Benjamin W Y; Fukuda, Hitoshi; Angle, Mark; Teitelbaum, Jeanne; Macdonald, R Loch; Farrokhyar, Forough; Thabane, Lehana; Levine, Mitchell A H
2016-01-01
Classification and regression tree analysis involves the creation of a decision tree by recursive partitioning of a dataset into more homogeneous subgroups. Thus far, there is scarce literature on using this technique to create clinical prediction tools for aneurysmal subarachnoid hemorrhage (SAH). The classification and regression tree analysis technique was applied to the multicenter Tirilazad database (3551 patients) in order to create the decision-making algorithm. In order to elucidate prognostic subgroups in aneurysmal SAH, neurologic, systemic, and demographic factors were taken into account. The dependent variable used for analysis was the dichotomized Glasgow Outcome Score at 3 months. Classification and regression tree analysis revealed seven prognostic subgroups. Neurological grade, occurrence of post-admission stroke, occurrence of post-admission fever, and age represented the explanatory nodes of this decision tree. Split sample validation revealed classification accuracy of 79% for the training dataset and 77% for the testing dataset. In addition, the occurrence of fever at 1-week post-aneurysmal SAH is associated with increased odds of post-admission stroke (odds ratio: 1.83, 95% confidence interval: 1.56-2.45, P < 0.01). A clinically useful classification tree was generated, which serves as a prediction tool to guide bedside prognostication and clinical treatment decision making. This prognostic decision-making algorithm also shed light on the complex interactions between a number of risk factors in determining outcome after aneurysmal SAH.
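The recursive partitioning described in this abstract can be sketched with scikit-learn on synthetic data. The predictors mirror those named in the abstract (neurological grade, post-admission stroke, fever, age), but the data-generating process and effect sizes below are invented for illustration; this is not the Tirilazad dataset:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
grade = rng.integers(1, 6, n)        # neurological grade 1-5
stroke = rng.integers(0, 2, n)       # post-admission stroke (0/1)
fever = rng.integers(0, 2, n)        # post-admission fever (0/1)
age = rng.uniform(20, 90, n)         # age in years

# Synthetic outcome: worse grade, stroke, fever, and older age raise the
# probability of an unfavourable (dichotomised) outcome.
logit = -4 + 0.8 * grade + 1.2 * stroke + 0.7 * fever + 0.03 * age
p = 1 / (1 + np.exp(-logit))
y = (rng.uniform(size=n) < p).astype(int)

X = np.column_stack([grade, stroke, fever, age])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A shallow tree keeps the result interpretable: each leaf is one
# prognostic subgroup, as in the study's seven-subgroup tree.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50,
                              random_state=0)
tree.fit(X_tr, y_tr)
train_acc = tree.score(X_tr, y_tr)
test_acc = tree.score(X_te, y_te)
n_subgroups = tree.get_n_leaves()
```

Constraining `max_depth` and `min_samples_leaf` is what makes the tree usable for bedside prognostication; an unconstrained tree would overfit and produce subgroups too small to interpret.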
Gilabert-Perramon, Antoni; Torrent-Farnell, Josep; Catalan, Arancha; Prat, Alba; Fontanet, Manel; Puig-Peiró, Ruth; Merino-Montero, Sandra; Khoury, Hanane; Goetghebeur, Mireille M; Badia, Xavier
2017-01-01
The aim of this study was to adapt and assess the value of a Multi-Criteria Decision Analysis (MCDA) framework (EVIDEM) for the evaluation of Orphan drugs in Catalonia (Catalan Health Service). The standard evaluation and decision-making procedures of CatSalut were compared with the EVIDEM methodology and contents. The EVIDEM framework was adapted to the Catalan context, focusing on the evaluation of Orphan drugs (PASFTAC program), during a workshop with sixteen PASFTAC members. The criteria weighting was done using two different techniques (nonhierarchical and hierarchical). Reliability was assessed by re-test. The EVIDEM framework and methodology were found useful and feasible for Orphan drug evaluation and decision making in Catalonia. All the criteria considered for the development of the CatSalut Technical Reports and decision making were considered in the framework. Nevertheless, the framework could improve the reporting of some of these criteria (i.e., "unmet needs" or "nonmedical costs"). Some contextual criteria were removed (i.e., "mandate and scope of healthcare system", "environmental impact") or adapted ("population priorities and access") for CatSalut purposes. Independently of the weighting technique considered, the most important evaluation criteria identified for orphan drugs were "disease severity", "unmet needs" and "comparative effectiveness", while the "size of the population" had the lowest relevance for decision making. Test-retest analysis showed weight consistency among techniques, supporting reliability over time. MCDA (the EVIDEM framework) could be a useful tool to complement the current evaluation methods of CatSalut, contributing to standardization and pragmatism, providing a method to tackle ethical dilemmas and facilitating discussions related to decision making.
Decision analysis in formulary decision making.
Schechter, C B
1993-06-01
Although decision making about what drugs to include in an institutional formulary appears to lend itself readily to quantitative techniques such as decision analysis and cost-benefit analysis, a review of the literature reveals that very little has been published in this area. Several of the published decision analyses use non-standard techniques that are, at best, of unproved validity, and may seriously distort the underlying issues through covert under-counting or double-counting of various drug attributes. Well-executed decision analyses have contributed to establishing that drug acquisition costs are not an adequate measure of the total economic impact of formulary decisions and that costs of labour and materials associated with drug administration must be calculated on an institution-specific basis to reflect unique staffing patterns, bulk purchasing practices, and the availability of surplus capacity within the institution which might be mobilised at little marginal cost. Clinical studies of newly introduced drugs frequently fail to answer the questions that weigh most heavily on the structuring of a formal assessment of a proposed formulary acquisition. Studies comparing a full spectrum of therapeutically equivalent drugs are rarely done, and individual studies of particular pairs of drugs can rarely be used together because of differences in methodology or patient populations studied. Gathering of institution-specific economic and clinical data is a daunting, labour-intensive task. In many institutions, incentive and reward structures discourage behaviour that takes the broad institutional perspective that is intrinsic to a good decision analysis. (ABSTRACT TRUNCATED AT 250 WORDS)
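The abstract's point that acquisition cost alone understates total economic impact can be illustrated with a toy expected-cost comparison (all figures and probabilities below are invented, and the model is deliberately minimal):

```python
def expected_total_cost(acquisition, admin_labor, p_failure, failure_cost):
    """Expected per-patient cost of a formulary option: acquisition and
    administration costs plus the chance-weighted cost of treatment
    failure requiring further care."""
    return acquisition + admin_labor + p_failure * failure_cost

# Drug A: cheaper to buy, but costlier to administer and more failures.
drug_a = expected_total_cost(acquisition=120.0, admin_labor=60.0,
                             p_failure=0.12, failure_cost=900.0)
# Drug B: higher acquisition cost, lower downstream costs.
drug_b = expected_total_cost(acquisition=200.0, admin_labor=25.0,
                             p_failure=0.04, failure_cost=900.0)
preferred = "B" if drug_b < drug_a else "A"
```

Under these assumed numbers, the drug with the higher acquisition price has the lower expected total cost, which is the kind of conclusion the abstract says only institution-specific labour and outcome data can support.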
Ajmera, Puneeta
2017-10-09
Purpose Organizations have to evaluate their internal and external environments in this highly competitive world. Strengths, weaknesses, opportunities and threats (SWOT) analysis is a very useful technique that analyzes the strengths, weaknesses, opportunities and threats of an organization for taking strategic decisions, and it also provides a foundation for the formulation of strategies. The drawback of SWOT analysis, however, is that it does not quantify the importance of the individual factors affecting the organization; the individual factors are described briefly without being weighed. For this reason, SWOT analysis can be integrated with any multiple attribute decision-making (MADM) technique, such as the technique for order preference by similarity to ideal solution (TOPSIS) or the analytical hierarchy process, to evaluate the best alternative among the available strategic alternatives. The paper aims to discuss these issues. Design/methodology/approach In this study, SWOT analysis is integrated with a multicriteria decision-making technique called TOPSIS to rank different strategies for Indian medical tourism in order of priority. Findings The SO strategy (providing the best facilitation and care to medical tourists, on par with developed countries) is the best strategy, matching the four elements of S, W, O and T of the SWOT matrix and 35 strategic indicators. Practical implications This paper proposes a solution based on a combined SWOT analysis and TOPSIS approach to help organizations evaluate and select strategies. Originality/value Creating a new technology or administering a new strategy always meets some degree of resistance from employees. To minimize resistance, the author has used TOPSIS, as it involves group thinking, requiring every manager of the organization to analyze and evaluate different alternatives and an average measure of each parameter in the final decision matrix.
E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence
2018-03-01
…actor violence and a set of predefined decision-making protocols. This research included running four simulations using the Monte Carlo technique, and used a "runs test" to determine whether there is a temporal pattern in lone-actor violence.
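The runs test referenced in this excerpt can be sketched as the classic Wald-Wolfowitz z-statistic (a standard formulation, assumed here rather than taken from the report itself; the sequences are invented):

```python
import numpy as np

def runs_test_z(binary_seq):
    """Wald-Wolfowitz runs test: z-score for whether a binary sequence
    has more or fewer runs than expected under randomness."""
    x = np.asarray(binary_seq)
    n1 = int(np.sum(x == 1))
    n2 = int(np.sum(x == 0))
    n = n1 + n2
    runs = 1 + int(np.sum(x[1:] != x[:-1]))   # count of maximal runs
    mu = 2 * n1 * n2 / n + 1                  # expected runs
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n**2 * (n - 1))
    return (runs - mu) / np.sqrt(var)

alternating = [0, 1] * 50        # far too many runs: large positive z
clumped = [0] * 50 + [1] * 50    # far too few runs: large negative z
z_alt, z_clump = runs_test_z(alternating), runs_test_z(clumped)
```

A |z| well above ~2 rejects randomness: a strongly negative z on a time series of incidents would suggest temporal clustering of events.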
Decision curve analysis: a novel method for evaluating prediction models.
Vickers, Andrew J; Elkin, Elena B
2006-01-01
Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
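The net-benefit derivation described in this abstract can be sketched directly: at threshold probability pt, false positives are weighted by the odds pt/(1-pt), and plotting net benefit against pt gives the decision curve. The data below are synthetic (a deliberately crude "model" invented for illustration):

```python
import numpy as np

def net_benefit(y_true, p_hat, thresholds):
    """Net benefit of treating patients whose predicted risk exceeds pt:
        NB(pt) = TP/n - FP/n * pt / (1 - pt)
    where the odds pt/(1-pt) encode how the patient weighs the harm of a
    false positive against a false negative."""
    y = np.asarray(y_true)
    p = np.asarray(p_hat)
    n = len(y)
    nb = []
    for pt in thresholds:
        treat = p >= pt
        tp = np.sum(treat & (y == 1))
        fp = np.sum(treat & (y == 0))
        nb.append(tp / n - fp / n * pt / (1 - pt))
    return np.array(nb)

rng = np.random.default_rng(7)
n = 5000
y = rng.integers(0, 2, n)
# Crude informative "model": predicted risk correlates with the outcome.
p_hat = 0.5 * y + 0.5 * rng.uniform(size=n)

thresholds = np.linspace(0.05, 0.95, 19)
nb_model = net_benefit(y, p_hat, thresholds)         # the model's curve
nb_all = np.array([y.mean() - (1 - y.mean()) * t / (1 - t)
                   for t in thresholds])             # treat-all strategy
# Treat-none has net benefit 0 everywhere; the model is of value in the
# threshold range where its curve lies above both reference strategies.
```

Note that only `y` and `p_hat` are needed: no external cost or preference data, which is the method's central selling point.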
NASA Astrophysics Data System (ADS)
Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia
2007-12-01
To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.
Cost Analysis of Instructional Technology.
ERIC Educational Resources Information Center
Johnson, F. Craig; Dietrich, John E.
Although some serious limitations in the cost analysis technique do exist, the need for cost data in decision making is so great that every effort should be made to obtain accurate estimates. This paper discusses several issues that arise when an attempt is made to make quality, trade-off, or scope decisions based on cost data. Three methods…
The Use of Geoprocessing in Educational Research and Decision Support.
ERIC Educational Resources Information Center
Sexton, Porter
1982-01-01
Discusses geoprocessing, a computer mapping technique used by the Portland (Oregon) School District in which geographic analysis and data processing are combined. Several applications for administrative decision-making are discussed, including bus routing and redistricting. (JJD)
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision-making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and the advantages that company management can derive from them. Numerical results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improving effect on management decisions.
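A minimal single-uncertainty sketch of the kind of binomial tree valuation the abstract builds on: a European option to expand a project, valued by backward induction on a Cox-Ross-Rubinstein tree. All parameters (project value, investment cost, volatility, expansion factor) are illustrative, and the helper is our own simplification, not the paper's modified multi-source method:

```python
import math

def expansion_option_value(V0, invest, factor, r, sigma, T, steps):
    """Value a project holding a European option to expand: at maturity the
    firm may pay `invest` to scale the project value by `factor`."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))   # up move per step
    d = 1 / u                             # down move per step
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Terminal project values and payoffs with the expansion option.
    values = [V0 * u**j * d**(steps - j) for j in range(steps + 1)]
    payoff = [max(v, factor * v - invest) for v in values]
    # Backward induction through the tree.
    for _ in range(steps):
        payoff = [disc * (p * payoff[j + 1] + (1 - p) * payoff[j])
                  for j in range(len(payoff) - 1)]
    return payoff[0]

# Hypothetical project worth 100 today; option to expand 1.5x for 40 in 2 years.
with_option = expansion_option_value(100.0, 40.0, 1.5, 0.05, 0.3, 2.0, 50)
```

Because the tree is built under the risk-neutral measure, the project without the option is worth exactly `V0`; the excess of `with_option` over `V0` is the value of managerial flexibility that a static discounted-cash-flow analysis would miss.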
Group decision making with the analytic hierarchy process in benefit-risk assessment: a tutorial.
Hummel, J Marjan; Bridges, John F P; IJzerman, Maarten J
2014-01-01
The analytic hierarchy process (AHP) has been increasingly applied as a technique for multi-criteria decision analysis in healthcare. The AHP can aid decision makers in selecting the most valuable technology for patients, while taking into account multiple, and even conflicting, decision criteria. This tutorial illustrates the procedural steps of the AHP in supporting group decision making about new healthcare technology, including (1) identifying the decision goal, decision criteria, and alternative healthcare technologies to compare, (2) structuring the decision criteria, (3) judging the value of the alternative technologies on each decision criterion, (4) judging the importance of the decision criteria, (5) calculating group judgments, (6) analyzing the inconsistency in judgments, (7) calculating the overall value of the technologies, and (8) conducting sensitivity analyses. The AHP is illustrated via a hypothetical example, adapted from an empirical AHP analysis on the benefits and risks of tissue regeneration to repair small cartilage lesions in the knee.
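Steps (4)-(6) of the tutorial's procedure can be sketched for criteria weighting: derive weights from a pairwise comparison matrix via power iteration and check the inconsistency of the judgments. The matrix of judgments below is hypothetical, and the helpers are simplified stand-ins for a full AHP implementation:

```python
# Hedged AHP sketch: criteria weights from a pairwise comparison matrix.
def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of the comparison matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]   # renormalize so weights sum to 1
    return w

def consistency_ratio(matrix, w):
    """Saaty's consistency ratio: CR < 0.1 is conventionally acceptable."""
    n = len(matrix)
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n   # estimate of lambda_max
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random indices
    return ci / ri

# Hypothetical judgments for three benefit-risk criteria, on Saaty's 1-9 scale.
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_weights(A)
cr = consistency_ratio(A, w)
```

With these judgments the first criterion dominates, and the near-transitive matrix yields a consistency ratio well under the 0.1 threshold, so the weights would be accepted.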
van Til, Janine; Groothuis-Oudshoorn, Catharina; Lieferink, Marijke; Dolan, James; Goetghebeur, Mireille
2014-01-01
There is an increased interest in the use of multi-criteria decision analysis (MCDA) to support regulatory and reimbursement decision making. The EVIDEM framework was developed to provide pragmatic multi-criteria decision support in health care, to estimate the value of healthcare interventions, and to aid in priority-setting. The objectives of this study were to test 1) the influence of different weighting techniques on the overall outcome of an MCDA exercise, 2) the discriminative power of such techniques in weighting different criteria, and 3) whether different techniques result in similar weights in weighting the criteria set proposed by the EVIDEM framework. A sample of 60 Dutch and Canadian students participated in the study. Each student used an online survey to provide weights for 14 criteria with two different techniques: a five-point rating scale and one of the following techniques, selected randomly: ranking, point allocation, pairwise comparison, and best-worst scaling. The results of this study indicate that there is no effect of differences in weights on value estimates at the group level. On an individual level, considerable differences in criteria weights and rank order occur as a result of the weight elicitation method used and its ability to discriminate in criteria importance. Of the five techniques tested, pairwise comparison of criteria has the highest ability to discriminate in weights when fourteen criteria are compared. When weights are intended to support group decisions, the choice of elicitation technique has negligible impact on criteria weights and the overall value of an innovation. However, when weights are used to support individual decisions, the choice of elicitation technique influences the outcome, and studies that use dissimilar techniques cannot be easily compared.
Weight elicitation through pairwise comparison of criteria is preferred when taking into account its superior ability to discriminate between criteria and respondents' preferences.
ERIC Educational Resources Information Center
Mosier, Nancy R.
Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…
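Three of the four techniques the review names most often can be sketched directly; the cash-flow series and helper names below are illustrative, not drawn from the paper:

```python
# Sketch of payback time, present value, and internal rate of return on a
# hypothetical project: outlay of 1000 now, inflows of 400 for four years.
def payback_time(initial, inflows):
    """Years until cumulative inflows recover the initial outlay."""
    cum = 0.0
    for year, cf in enumerate(inflows, start=1):
        cum += cf
        if cum >= initial:
            return year
    return None  # outlay never recovered

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection (assumes a single sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-1000.0, 400.0, 400.0, 400.0, 400.0]
pb = payback_time(1000.0, flows[1:])   # 3 years
npv10 = npv(0.10, flows)               # positive at a 10% hurdle rate
rate = irr(flows)                      # roughly 22%
```

The techniques can disagree: payback ignores cash flows after the recovery year, which is one reason the review treats them as complementary rather than interchangeable.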
A Market-oriented Approach To Maximizing Product Benefits: Cases in U.S. Forest Products Industries
Vijay S. Reddy; Robert J. Bush; Ronen Roudik
1996-01-01
Conjoint analysis, a decompositional customer preference modelling technique, has seen little application to forest products. However, the technique provides useful information for marketing decisions by quantifying consumer preference functions for multiattribute product alternatives. The results of a conjoint analysis include the contribution of each attribute and...
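The decompositional step can be sketched for the simplest case: with a balanced full-factorial design and main effects only, each level's part-worth is its mean rating minus the grand mean. The attributes, levels, and ratings below are hypothetical, not the study's forest products data:

```python
# Hedged conjoint analysis sketch: main-effect part-worths from one
# respondent's ratings of all 8 profiles in a 2x2x2 design (0-10 scale).
ratings = {
    ("oak", "stained", "low"): 9, ("oak", "stained", "high"): 6,
    ("oak", "painted", "low"): 7, ("oak", "painted", "high"): 4,
    ("pine", "stained", "low"): 6, ("pine", "stained", "high"): 3,
    ("pine", "painted", "low"): 5, ("pine", "painted", "high"): 2,
}
grand = sum(ratings.values()) / len(ratings)

def part_worth(attr_index, level):
    """Mean rating at this level minus the grand mean (balanced design)."""
    vals = [r for profile, r in ratings.items() if profile[attr_index] == level]
    return sum(vals) / len(vals) - grand

worths = {lvl: part_worth(0, lvl) for lvl in ("oak", "pine")}
worths.update({lvl: part_worth(1, lvl) for lvl in ("stained", "painted")})
worths.update({lvl: part_worth(2, lvl) for lvl in ("low", "high")})
# An attribute's relative importance is its part-worth range vs. the total range.
```

Here the price levels span the widest part-worth range, so price would be the most important attribute for this (hypothetical) respondent.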
Classroom Observation Techniques. IDEA Paper No. 4.
ERIC Educational Resources Information Center
Acheson, Keith A.
Techniques for observing the classroom behavior of teachers and students are examined. These techniques provide a framework for analyzing and understanding classroom interaction, for making decisions about what should be happening, and for changing instructional behavior when it is necessary. The observation methods allow collection, analysis, and…
Processes through which ecosystems provide goods to or benefit people can be referred to as "ecosystem services," which may be quantified to clarify decision-making, with techniques including cost-benefit analysis. We are developing an online decision support tool, the Santa Cruz W...
Thokala, Praveen; Devlin, Nancy; Marsh, Kevin; Baltussen, Rob; Boysen, Meindert; Kalo, Zoltan; Longrenn, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Ijzerman, Maarten
2016-01-01
Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting, objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making, and a set of techniques known under the collective heading of multiple criteria decision analysis (MCDA) is useful for this purpose. MCDA methods are widely used in other sectors, and recently there has been an increase in health care applications. In 2014, ISPOR established an MCDA Emerging Good Practices Task Force. It was charged with establishing a common definition for MCDA in health care decision making and developing good practice guidelines for conducting MCDA to aid health care decision making. This initial ISPOR MCDA task force report provides an introduction to MCDA: it defines MCDA; provides examples of its use in different kinds of decision making in health care (including benefit-risk analysis, health technology assessment, resource allocation, portfolio decision analysis, shared patient-clinician decision making, and prioritizing patients' access to services); provides an overview of the principal methods of MCDA; and describes the key steps involved. Upon reviewing this report, readers should have a solid overview of MCDA methods and their potential for supporting health care decision making. Copyright © 2016. Published by Elsevier Inc.
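The scoring step common to most MCDA methods the report surveys is an additive value model: weight each criterion, score each alternative, and sum. The criteria, weights, and scores below are hypothetical:

```python
# Minimal additive MCDA sketch (hypothetical health technology assessment).
weights = {"effectiveness": 0.5, "safety": 0.3, "cost": 0.2}

# Criterion scores already rescaled to 0-1, where 1 is best.
alternatives = {
    "therapy_A": {"effectiveness": 0.9, "safety": 0.6, "cost": 0.4},
    "therapy_B": {"effectiveness": 0.7, "safety": 0.8, "cost": 0.9},
}

def overall_value(scores, weights):
    """Weighted sum of criterion scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(alternatives,
                 key=lambda a: overall_value(alternatives[a], weights),
                 reverse=True)
```

The sensitivity analyses the report recommends amount to re-running this ranking under perturbed weights and scores to see whether the top alternative changes.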
Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos
2015-08-01
Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
Probing the neurochemical correlates of motivation and decision making.
Wassum, Kate M; Phillips, Paul E M
2015-01-21
Online electrochemical detection techniques are the state-of-the-art for evaluating chemical communication in the brain underlying motivated behavior and decision making. In this Viewpoint, we discuss avenues for future technological development, as well as the requirement for increasingly sophisticated and interdisciplinary behavioral analysis.
Direct Allocation Costing: Informed Management Decisions in a Changing Environment.
ERIC Educational Resources Information Center
Mancini, Cesidio G.; Goeres, Ernest R.
1995-01-01
It is argued that colleges and universities can use direct allocation costing to provide quantitative information needed for decision making. This method of analysis requires institutions to modify traditional ideas of costing, looking to the private sector for examples of accurate costing techniques. (MSE)
NASA Technical Reports Server (NTRS)
Souther, J. W.
1981-01-01
The need to teach informational writing as a decision-making process is discussed. Situational analysis, its relationship to decisions in writing, and the need for relevant assignments are considered. Teaching students to ask the right questions is covered. The need to teach writing responsiveness is described. Three steps to get started and four teaching techniques are described. The information needs of the 'expert' and the 'manager' are contrasted.
NASA Astrophysics Data System (ADS)
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-09-01
This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical analysis technique, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.
Factors Which Influence The Fish Purchasing Decision: A study on Traditional Market in Riau Mainland
NASA Astrophysics Data System (ADS)
Siswati, Latifa; Putri, Asgami
2018-05-01
The purposes of the research are to analyze and assess the factors that influence fish purchasing by the community in Tenayan Raya district, Pekanbaru. The research methodology used was a survey method, specifically interviews and direct observation at the markets located in Tenayan Raya district. The sampling location was determined by purposive sampling, and respondents were selected by accidental sampling. Factor analysis was applied to data derived from respondents' opinions on various fish-related variables. The results show that the factors influencing fish purchasing decisions in the traditional market of Tenayan Raya district are product factors, price factors, social factors, and individual factors. Product factors that influence the fish purchasing decision include the condition of the fish's eyes, the nutritional value of fresh fish, and the diversity of fish sold. Price factors include the price of fresh fish, a convincing price, and the match between the price and the benefits of the fresh fish. Individual factors include education and income levels. Social factors include family, colleagues, and fish-eating habits.
Approaches to answering critical CER questions.
Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y
2015-01-01
While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
Practical thoughts on cost-benefit analysis and health services.
Burchell, A; Weeden, R
1982-08-01
Cost-benefit analysis is fast becoming--if it is not already--an essential tool in decision making. It is, however, a complex subject, and one in which few doctors have been trained. This paper offers practical thoughts on the art of cost-benefit analysis, and is written for clinicians and other medical specialists who, though inexpert in the techniques of accountancy, nevertheless wish to carry out their own simple analyses in a manner that will enable them, and others, to take effective decisions.
Evaluating the decision accuracy and speed of clinical data visualizations.
Pieczkiewicz, David S; Finkelstein, Stanley M
2010-01-01
Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available, free, and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.
A method for studying decision-making by guideline development groups.
Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan
2009-08-05
Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best-suited to capturing influences on GDG decision-making. A research team composed of three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.
WeightLifter: Visual Weight Space Exploration for Multi-Criteria Decision Making.
Pajer, Stephan; Streit, Marc; Torsney-Weir, Thomas; Spechtenhauser, Florian; Muller, Torsten; Piringer, Harald
2017-01-01
A common strategy in Multi-Criteria Decision Making (MCDM) is to rank alternative solutions by weighted summary scores. Weights, however, are often abstract to the decision maker and can only be set by vague intuition. While previous work supports a point-wise exploration of weight spaces, we argue that MCDM can benefit from a regional and global visual analysis of weight spaces. Our main contribution is WeightLifter, a novel interactive visualization technique for weight-based MCDM that facilitates the exploration of weight spaces with up to ten criteria. Our technique enables users to better understand the sensitivity of a decision to changes of weights, to efficiently localize weight regions where a given solution ranks high, and to filter out solutions which do not rank high enough for any plausible combination of weights. We provide a comprehensive requirement analysis for weight-based MCDM and describe an interactive workflow that meets these requirements. For evaluation, we describe a usage scenario of WeightLifter in automotive engineering and report qualitative feedback from users of a deployed version as well as preliminary feedback from decision makers in multiple domains. This feedback confirms that WeightLifter increases both the efficiency of weight-based MCDM and the awareness of uncertainty in the ultimate decisions.
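The regional weight-space queries described above can be sketched without any visualization: enumerate a grid over the weight simplex, and for each weight vector record which alternative ranks first. The three-criterion scores below are hypothetical, and the grid scan is a crude stand-in for WeightLifter's interactive exploration:

```python
# Sketch of a weight-space query for weighted-sum MCDM with three criteria:
# where in the weight simplex does alternative "A" rank first, and how large
# is that region?
scores = {
    "A": (0.9, 0.2, 0.5),
    "B": (0.4, 0.8, 0.6),
    "C": (0.5, 0.5, 0.9),
}

def winner(w):
    """Alternative with the highest weighted summary score under weights w."""
    return max(scores, key=lambda a: sum(wi * si for wi, si in zip(w, scores[a])))

# Enumerate a grid over the simplex w1 + w2 + w3 = 1.
step = 0.05
grid = [(w1, w2, round(1 - w1 - w2, 10))
        for w1 in [i * step for i in range(21)]
        for w2 in [i * step for i in range(21)]
        if w1 + w2 <= 1 + 1e-9]
region_A = [w for w in grid if winner(w) == "A"]
share_A = len(region_A) / len(grid)  # fraction of weight space where A wins
```

A small `share_A` signals a decision that is sensitive to the exact weights, which is precisely the uncertainty the paper argues decision makers should see.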
Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.
Merrick, Jason R W; Leclerc, Philip
2016-04-01
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek the rational decision, but we use prospect theory to solve for the attacker's decision, descriptively modeling the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
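The prospect-theory ingredients named above (loss aversion and likelihood insensitivity) can be sketched with the standard Tversky-Kahneman functional forms. The parameters and attack payoffs below are illustrative; the article's actual attacker model is richer than this separable sketch:

```python
# Hedged prospect-theory sketch: a loss-averse value function and an
# inverse-S probability weighting function (Tversky-Kahneman 1992 forms).
def value(x, alpha=0.88, lam=2.25):
    """Concave over gains; convex over losses, scaled by loss aversion lam."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def weight(p, gamma=0.61):
    """Inverse-S weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_value(outcomes):
    """outcomes: list of (probability, payoff) pairs; simple separable form."""
    return sum(weight(p) * value(x) for p, x in outcomes)

# Hypothetical attacker prospect: small chance of a large gain, else a loss.
attack = prospect_value([(0.05, 100.0), (0.95, -10.0)])
no_attack = prospect_value([(1.0, 0.0)])
```

Even though the small success probability is overweighted, loss aversion can still make the attack unattractive here, which illustrates why the defender's optimal screening choice depends on the attacker's descriptive parameters.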
Portfolio Decisions and Brain Reactions via the CEAD method.
Majer, Piotr; Mohr, Peter N C; Heekeren, Hauke R; Härdle, Wolfgang K
2016-09-01
Decision making can be a complex process requiring the integration of several attributes of choice options. Understanding the neural processes underlying (uncertain) investment decisions is an important topic in neuroeconomics. We analyzed functional magnetic resonance imaging (fMRI) data from an investment decision study for stimulus-related effects. We propose a new technique for identifying activated brain regions: cluster, estimation, activation, and decision method. Our analysis is focused on clusters of voxels rather than voxel units. Thus, we achieve a higher signal-to-noise ratio within the unit tested and a smaller number of hypothesis tests compared with the often used General Linear Model (GLM). We propose to first conduct the brain parcellation by applying spatially constrained spectral clustering. The information within each cluster can then be extracted by the flexible dynamic semiparametric factor model (DSFM) dimension reduction technique and finally be tested for differences in activation between conditions. This sequence of Cluster, Estimation, Activation, and Decision admits a model-free analysis of the local fMRI signal. Applying a GLM on the DSFM-based time series resulted in a significant correlation between the risk of choice options and changes in fMRI signal in the anterior insula and dorsomedial prefrontal cortex. Additionally, individual differences in decision-related reactions within the DSFM time series predicted individual differences in risk attitudes as modeled with the framework of the mean-variance model.
Decision strategies to reduce teenage and young adult deaths in the United States.
Keeney, Ralph L; Palley, Asa B
2013-09-01
This article uses decision analysis concepts and techniques to address an extremely important problem to any family with children, namely, how to avoid the tragic death of a child during the high-risk ages of 15-24. Descriptively, our analysis indicates that of the 35,000 annual deaths among this age group in the United States, approximately 20,000 could be avoided if individuals chose readily available alternatives for decisions relating to these deaths. Prescriptively, we develop a decision framework for parents and a child to both identify and proactively pursue decisions that can lower that child's exposure to life-threatening risks and positively alter decisions when facing such risks. Applying this framework for parents and the youth themselves, we illustrate the logic and process of generating proactive alternatives with numerous examples that each could pursue to lower these life-threatening risks and possibly avoid a tragic premature death, and discuss some public policy implications of our findings. © 2013 Society for Risk Analysis.
ERIC Educational Resources Information Center
Knezevich, Stephen J., Ed.
In this era of rapid social change, educational administrators have discovered that new approaches to problem solving and decision making are needed. Systems analysis could afford a promising approach to administrative problems by providing a number of systematic techniques designed to sharpen administrative decision making, enhance efficiency,…
Using decision analysis to choose phosphorus targets for Lake Erie.
Anderson, R M; Hobbs, B F; Koonce, J F; Locci, A B
2001-02-01
Lake Erie water quality has improved dramatically since the degraded conditions of the 1960s. Additional gains could be made, but at the expense of further investment and reductions in fishery productivity. In facing such cross-jurisdictional issues, natural resource managers in Canada and the United States must grapple with conflicting objectives and important uncertainties, while considering the priorities of the public that live in the basin. The techniques and tools of decision analysis have been used successfully to deal with such decision problems in a range of environmental settings, but infrequently in the Great Lakes. The objective of this paper is to illustrate how such techniques might be brought to bear on an important, real decision currently facing Lake Erie resource managers and stakeholders: the choice of new phosphorus loading targets for the lake. The heart of our approach is a systematic elicitation of stakeholder preferences and an investigation of the degree to which different phosphorus-loading policies might satisfy ecosystem objectives. Results show that there are potential benefits to changing the historical policy of reducing phosphorus loads in Lake Erie. Copyright 2001 Springer-Verlag
Time Management and the Military Decision Making Process
1992-12-18
This monograph analyzes the military decision making process in terms of time management in order to determine if a timeline will expedite the...process. The monograph begins by establishing the importance of time and time management in planning. This section provides a general discussion of time, an...Perhaps using some of the techniques that other armies use will facilitate time management.... Keywords: Time management, Decision making, Timeline, Mission analysis, Wargaming, Courses of action, OPORD, Brigade OPS.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
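The Monte Carlo step in the second phase can be sketched at toy scale: perturb the criteria weights repeatedly and measure how often the susceptibility ranking of two locations flips. The criteria, scores, and perturbation spread below are hypothetical, not the study's GIS data:

```python
import random

# Sketch of Monte Carlo weight-uncertainty analysis for a weighted-sum
# susceptibility model with three criteria (e.g. slope, lithology, land cover).
random.seed(42)

base_weights = [0.5, 0.3, 0.2]
site_scores = {"site1": [0.8, 0.4, 0.6], "site2": [0.6, 0.7, 0.5]}

def susceptibility(scores, weights):
    return sum(w * s for w, s in zip(weights, scores))

def perturbed(weights, spread=0.1):
    """Add uniform noise to each weight, then renormalize to sum to 1."""
    raw = [max(1e-6, w + random.uniform(-spread, spread)) for w in weights]
    total = sum(raw)
    return [w / total for w in raw]

base_order = (susceptibility(site_scores["site1"], base_weights) >
              susceptibility(site_scores["site2"], base_weights))
trials, flips = 2000, 0
for _ in range(trials):
    w = perturbed(base_weights)
    order = (susceptibility(site_scores["site1"], w) >
             susceptibility(site_scores["site2"], w))
    if order != base_order:
        flips += 1
flip_rate = flips / trials  # low rate -> ranking robust to weight uncertainty
```

Applied cell by cell over a susceptibility map, the same flip-rate idea yields the kind of spatially explicit uncertainty surface the paper validates against the landslide inventory.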
ERIC Educational Resources Information Center
Hale, Norman; Lindelow, John
Chapter 12 in a volume on school leadership, this chapter cites the work of several authorities concerning problem-solving or decision-making techniques based on the belief that group problem-solving effort is preferable to individual effort. The first technique, force-field analysis, is described as a means of dissecting complex problems into…
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into gathering feedback from network analysts and managers and into understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
Present-value analysis: A systems approach to public decisionmaking for cost effectiveness
NASA Technical Reports Server (NTRS)
Herbert, T. T.
1971-01-01
Decision makers within Governmental agencies and Congress must evaluate competing (and sometimes conflicting) proposals which seek funding and implementation. Present-value analysis can be an effective decision-making tool by enabling the formal evaluation of the effects of competing proposals on efficient national resource utilization. A project's costs are not only its direct disbursements, but its social costs as well: how much does it cost to have those funds diverted from their use and economic benefit by the private sector to the public project? Comparisons of competing projects' social costs allow decision makers to expand their decision bases by quantifying the projects' impacts upon the economy and the efficient utilization of the country's limited national resources. A conceptual model is established for choosing, through this technique, the appropriate discount rate to be used in evaluation decisions.
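The discounting logic described above can be sketched in a few lines. The cash flows, social-cost figures, and 7% discount rate below are illustrative assumptions, not values from the report.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Two hypothetical competing proposals: direct disbursements plus an
# estimated year-0 social cost (funds diverted from private-sector use).
project_a = [-100_000 - 8_000, 30_000, 30_000, 30_000, 30_000]
project_b = [-80_000 - 5_000, 25_000, 25_000, 25_000, 25_000]

discount_rate = 0.07  # assumed social discount rate
for name, flows in [("A", project_a), ("B", project_b)]:
    print(name, round(npv(flows, discount_rate), 2))
```

Under these assumptions, the proposal with the higher net present value makes more efficient use of the diverted resources; the choice of discount rate, as the abstract notes, is itself a modeling decision.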
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlesinger, Adam M.
2013-01-01
We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real time during the development of software-defined radios. This test technique gains importance as modern receivers provide soft-decision symbol synchronization, as radio links are challenged to push more data and more protocol overhead through noisier channels, and as software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
ERIC Educational Resources Information Center
Hollis, Joseph W.; And Others
The LORS technique is a combination of several techniques such as role projection, simulation, psychodrama, feedback, value clarification, role reversal, dramatization, decision making, process analysis, and others. The significant difference is that, when the techniques are used together, each often undergoes changes to the point that the effect…
ERIC Educational Resources Information Center
Bigham, Gary D.; Riney, Mark R.
2017-01-01
To meet the constantly changing needs of schools and diverse learners, educators must frequently monitor student learning, revise curricula, and improve instruction. Consequently, it is critical that careful analyses of student performance data are ongoing components of curriculum decision-making processes. The primary purpose of this study is to…
Problem analysis: application in the development of market strategies for health care organizations.
Martin, J
1988-03-01
The problem analysis technique is an approach to understanding salient customer needs that is especially appropriate under complex market conditions. The author demonstrates the use of the approach in segmenting markets and conducting competitive analysis for positioning strategy decisions in health care.
Asking the Right Questions: Techniques for Collaboration and School Change. 2nd Edition.
ERIC Educational Resources Information Center
Holcomb, Edie L.
This work provides school change leaders with tools, techniques, tips, examples, illustrations, and stories about promoting school change. Tools provided include histograms, surveys, run charts, weighted voting, force-field analysis, decision matrices, and many others. Chapter 1, "Introduction," applies a matrix for asking questions…
Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P
2012-02-01
This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process. Copyright © 2011 Elsevier Ltd. All rights reserved.
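The OWA operator's risk-taking scenarios can be illustrated with a minimal sketch: order weights apply to the ranked criterion values rather than to fixed criteria. The site scores and weight vectors below are hypothetical, not taken from the study.

```python
def owa(values, order_weights):
    """Ordered weighted average: weights apply to ranked values, not criteria."""
    assert abs(sum(order_weights) - 1.0) < 1e-9, "order weights must sum to 1"
    ranked = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(order_weights, ranked))

scores = [0.9, 0.2, 0.6]  # standardized criterion scores for one candidate site
print(owa(scores, [1.0, 0.0, 0.0]))        # optimistic: best criterion dominates -> 0.9
print(owa(scores, [0.0, 0.0, 1.0]))        # pessimistic: worst criterion dominates -> 0.2
print(owa(scores, [1 / 3, 1 / 3, 1 / 3]))  # neutral: plain average
```

Varying the order weights between these extremes generates the spectrum of optimistic-to-pessimistic decision alternatives the abstract describes.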
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
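A common way to compute AHP criterion weights, shown here as a hedged stand-in for the weight-computation phase above, is the normalized row geometric mean of a reciprocal pairwise-comparison matrix. The criteria and judgments below are invented for illustration.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via normalized row geometric means.

    pairwise[i][j] states how much more important criterion i is than j;
    the matrix is reciprocal (pairwise[j][i] == 1 / pairwise[i][j])."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical 3-criterion comparison (e.g. slope vs. land cover vs. rainfall)
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
print([round(w, 3) for w in ahp_weights(matrix)])
```

The geometric-mean approximation closely tracks the principal-eigenvector weights for consistent matrices; a Monte Carlo uncertainty analysis of the kind described above would perturb these judgments and rerun the weighting.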
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolution, but it appears to be one of the major causes of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.
Increasing Effectiveness in Teaching Ethics to Undergraduate Business Students.
ERIC Educational Resources Information Center
Lampe, Marc
1997-01-01
Traditional approaches to teaching business ethics (philosophical analysis, moral quandaries, executive cases) may not be effective in persuading undergraduates of the importance of ethical behavior. Better techniques include values education, ethical decision-making models, analysis of ethical conflicts, and role modeling. (SK)
Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions
ERIC Educational Resources Information Center
Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.
2006-01-01
In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the full profile of all interventions (e.g. efficacy and safety). The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, assumptions, and steps for performing the analysis. PMID:28503228
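The core of indirect comparison, which network meta-analysis generalizes, can be shown with the Bucher adjusted method: effect estimates against a common comparator are differenced, and their variances add. The effect sizes and standard errors below are hypothetical.

```python
import math

def indirect_effect(d_ab, se_ab, d_cb, se_cb):
    """Bucher indirect comparison of A vs. C through common comparator B.

    d_ab, d_cb: effect estimates (e.g. log odds ratios) of A vs. B and C vs. B.
    Variances of independent estimates add, so the indirect estimate is
    less precise than either direct one."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, se_ac

# Hypothetical trial results on the log odds ratio scale
d, se = indirect_effect(-0.5, 0.2, -0.2, 0.15)
print(round(d, 3), round(se, 3))  # -0.3 0.25
```

A full network model fits all such contrasts simultaneously and checks that direct and indirect evidence are consistent before pooling them.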
Keeling, Jonathan W; Pryde, Julie A; Merrill, Jacqueline A
2013-01-01
The nation's 2862 local health departments (LHDs) are the primary means for assuring public health services for all populations. The objective of this study is to assess the effect of organizational network analysis on management decisions in LHDs and to demonstrate the technique's ability to detect organizational adaptation over time. We conducted a longitudinal network analysis in a full-service LHD with 113 employees serving about 187,000 persons. Network survey data were collected from employees at 3 times: months 0, 8, and 34. At time 1 the initial analysis was presented to LHD managers as an intervention with information on evidence-based management strategies to address the findings. At times 2 and 3 interviews documented managers' decision making and events in the task environment. Response rates for the 3 network analyses were 90%, 97%, and 83%. Postintervention (time 2) results showed beneficial changes in network measures of communication and integration. Screening and case identification increased for chlamydia and for gonorrhea. Outbreak mitigation was accelerated by cross-divisional teaming. Network measurements at time 3 showed LHD adaptation to H1N1 and budget constraints with increased centralization. Task redundancy increased dramatically after National Incident Management System training. Organizational network analysis supports LHD management with empirical evidence that can be translated into strategic decisions about communication, allocation of resources, and addressing knowledge gaps. Specific population health outcomes were traced directly to management decisions based on network evidence. The technique can help managers improve how LHDs function as organizations and contribute to our understanding of public health systems.
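Two standard whole-network measures used in organizational network analysis, density and Freeman degree centralization, can be sketched as follows. The five-person communication network is a toy example, not the study's data.

```python
def density(nodes, edges):
    """Proportion of possible undirected ties that are present."""
    possible = len(nodes) * (len(nodes) - 1) / 2
    return len(edges) / possible

def degree_centralization(nodes, edges):
    """Freeman degree centralization: 1.0 for a star, 0.0 for a complete graph."""
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    dmax = max(deg.values())
    n = len(nodes)
    return sum(dmax - d for d in deg.values()) / ((n - 1) * (n - 2))

# Hypothetical communication ties among five staff members; "a" is a hub,
# as in the increased centralization the study observed under budget constraints.
nodes = ["a", "b", "c", "d", "e"]
star = [("a", "b"), ("a", "c"), ("a", "d"), ("a", "e")]
print(density(nodes, star), degree_centralization(nodes, star))  # 0.4 1.0
```

Tracking such measures across survey waves is how changes in communication and integration, like those reported at times 2 and 3, become visible to managers.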
NEW APPROACHES IN RISK ANALYSIS OF ENVIRONMENTAL STRESSORS TO HUMAN AND ECOLOGICAL SYSTEMS
We explore the application of novel techniques for improving and integrating risk analysis of environmental stressors to human and ecological systems. Environmental protection decisions are guided by risk assessments serving as tools to develop regulatory policy and other relate...
Using cognitive task analysis to identify critical decisions in the laparoscopic environment.
Craig, Curtis; Klein, Martina I; Griswold, John; Gaitonde, Krishnanath; McGill, Thomas; Halldorsson, Ari
2012-12-01
The aim of this study was to identify the critical decisions surgeons need to make regarding laparoscopic surgery, the information these decisions are based on, the strategies employed by surgeons to reach their objectives, and the difficulties experienced by novices. Laparoscopic training focuses on the development of technical skills. However, successful surgical outcomes are also dependent on appropriate decisions made during surgery, which are influenced by critical cues and the use of appropriate strategies. Novices might not be as adept at cue detection and strategy use. Participants were eight attending surgeons. The authors employed task-analytic techniques to identify critical decisions inherent in laparoscopy and the cues, strategies, and novice traps associated with these decisions. The authors used decision requirements tables to organize the data into the key decisions made during the preoperative, operative, and postoperative phases as well as the cues, strategies, and novice traps associated with these decisions. Key decisions identified for the preoperative phase included but were not limited to the decision of performing a laparoscopic versus open surgery, necessity to review the literature, practicing the procedure, and trocar placement. Some key decisions identified for the operative phase included converting to open surgery, performing angiograms, cutting tissue or organs, and reevaluation of the approach. Only one key decision was identified for the postoperative phase: whether the surgeon's technique needs to be evaluated and revised. The laparoscopic environment requires complex decision making, and novices are prone to errors in their decisions. The information elicited in this study is applicable to laparoscopic training.
Vetter, Jeffrey S.
2005-02-01
The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
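As a simplified stand-in for the decision-tree classification described above, the sketch below trains a depth-1 tree (a threshold stump) on hypothetical microbenchmark timings labelled efficient or inefficient by construction. The single timing feature and its values are assumptions for illustration, not the system's actual inputs.

```python
def train_stump(samples):
    """Learn a threshold on one feature minimizing misclassifications.

    samples: list of (feature_value, label), label in {"efficient", "inefficient"}.
    Events above the learned threshold are classified "inefficient"."""
    best = None
    values = sorted({v for v, _ in samples})
    # Candidate thresholds lie halfway between consecutive observed values.
    for t in [(a + b) / 2 for a, b in zip(values, values[1:])]:
        errors = sum(1 for v, lab in samples
                     if (lab == "inefficient") != (v > t))
        if best is None or errors < best[1]:
            best = (t, errors)
    return best[0]

# Hypothetical microbenchmark: per-message wait times (ms), labelled by
# whether the benchmark was constructed to communicate efficiently.
train = [(0.1, "efficient"), (0.2, "efficient"), (0.3, "efficient"),
         (2.0, "inefficient"), (2.5, "inefficient"), (3.1, "inefficient")]
threshold = train_stump(train)
classify = lambda v: "inefficient" if v > threshold else "efficient"
print(threshold, classify(1.8))
```

A real decision tree recurses this split selection over many features; training on microbenchmarks, as the abstract notes, is what lets the classifier adapt to each target system's timing characteristics.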
Miller, Matthew James; McGuire, Kerry M.; Feigh, Karen M.
2016-01-01
The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design. PMID:28491008
Miller, Matthew James; McGuire, Kerry M; Feigh, Karen M
2017-06-01
The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design.
NASA Astrophysics Data System (ADS)
Kaur, Parneet; Singh, Sukhwinder; Garg, Sushil; Harmanpreet
2010-11-01
In this paper we study classification algorithms for farm decision support systems (DSS). By applying classification algorithms, i.e. Limited Search, ID3, CHAID, C4.5, Improved C4.5 and One-vs-All Decision Tree, to a common crop data set with a specified class, results are obtained. The tool used to derive results is SPINA. The graphical results obtained from the tool are compared to suggest the best technique for developing a farm decision support system. This analysis would help researchers design effective and fast DSS that let farmers make decisions for enhancing their yield.
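ID3's splitting criterion, information gain, can be sketched directly: the entropy of the class labels minus the weighted entropy remaining after splitting on an attribute. The crop attribute and yield labels below are invented for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, attr, label="yield"):
    """Information gain of splitting rows (dicts) on attribute attr."""
    base = entropy([r[label] for r in rows])
    counts = Counter(r[attr] for r in rows)
    remainder = sum(
        (cnt / len(rows)) * entropy([r[label] for r in rows if r[attr] == v])
        for v, cnt in counts.items())
    return base - remainder

# Hypothetical crop records: soil type vs. a binary yield class
data = [{"soil": "loam", "yield": "high"}, {"soil": "loam", "yield": "high"},
        {"soil": "sand", "yield": "low"},  {"soil": "sand", "yield": "high"}]
print(round(info_gain(data, "soil"), 3))  # 0.311
```

ID3 greedily splits on the attribute with the highest gain and recurses; C4.5 and its variants refine this with gain ratios, continuous attributes, and pruning.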
Decision science: a scientific approach to enhance public health budgeting.
Honoré, Peggy A; Fos, Peter J; Smith, Torney; Riley, Michael; Kramarz, Kim
2010-01-01
The allocation of resources for public health programming is a complicated and daunting responsibility. Financial decision-making processes within public health agencies are especially difficult when not supported with techniques for prioritizing and ranking alternatives. This article presents a case study of a decision analysis software model that was applied to the process of identifying funding priorities for public health services in the Spokane Regional Health District. Results on the use of this decision support system provide insights into how decision science models, which have been used for decades in business and industry, can be successfully applied to public health budgeting as a means of strengthening agency financial management processes.
Applying Subject Matter Expertise (SME) Elicitation Techniques to TRAC Studies
2014-09-30
"…prioritisation, budgeting and resource allocation with multi-criteria decision analysis and decision conferencing". In: Annals of Operations… Typically, in responding to survey items, experts are not expected to elaborate beyond providing responses in the format requested in the… between them, however irrelevant to probability (Kynn and Ayyub). For example, an electronic jamming device might disrupt a cell phone signal at certain…
Automatic rule generation for high-level vision
NASA Technical Reports Server (NTRS)
Rhee, Frank Chung-Hoon; Krishnapuram, Raghu
1992-01-01
A new fuzzy-set-based technique developed for decision making is discussed: a method to generate fuzzy decision rules automatically for image analysis. This paper proposes a method to generate rule-based approaches automatically from training data for problems such as autonomous navigation and image understanding. The proposed method is also capable of filtering irrelevant features and criteria out of the rules.
Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik
2009-12-01
The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
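A minimal decision-analytic model of the kind advocated above compares the expected utility of a test-and-treat policy against treat-all and treat-none, given a test's sensitivity and specificity. The prevalence, test characteristics, and outcome utilities below are hypothetical.

```python
def expected_utilities(prev, sens, spec, u):
    """Expected utility of test-and-treat vs. treat-all vs. treat-none.

    u: utilities keyed by (action, diseased), e.g. u[("treat", True)]."""
    p_tp = prev * sens              # diseased, test positive -> treated
    p_fn = prev * (1 - sens)        # diseased, missed -> untreated
    p_fp = (1 - prev) * (1 - spec)  # healthy, treated unnecessarily
    p_tn = (1 - prev) * spec        # healthy, correctly untreated
    return {
        "test": (p_tp * u[("treat", True)] + p_fn * u[("none", True)]
                 + p_fp * u[("treat", False)] + p_tn * u[("none", False)]),
        "treat_all": prev * u[("treat", True)] + (1 - prev) * u[("treat", False)],
        "treat_none": prev * u[("none", True)] + (1 - prev) * u[("none", False)],
    }

# Hypothetical utilities on a 0-1 scale
u = {("treat", True): 0.9, ("none", True): 0.3,
     ("treat", False): 0.8, ("none", False): 1.0}
print(expected_utilities(prev=0.1, sens=0.9, spec=0.95, u=u))
```

Such a model integrates test characteristics with downstream outcomes, which is exactly the added-value and clinical-outcome evaluation that a bare accuracy study cannot provide.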
Network meta-analysis: an introduction for pharmacists.
Xu, Yina; Amiche, Mohamed Amine; Tadrous, Mina
2018-05-21
Network meta-analysis is a new tool used to summarize and compare studies for multiple interventions, irrespective of whether these interventions have been directly evaluated against each other. Network meta-analysis is quickly becoming the standard in conducting therapeutic reviews and clinical guideline development. However, little guidance is available to help pharmacists review network meta-analysis studies in their practice. Major institutions such as the Cochrane Collaboration, Agency for Healthcare Research and Quality, Canadian Agency for Drugs and Technologies in Health, and National Institute for Health and Care Excellence Decision Support Unit have endorsed utilizing network meta-analysis to establish therapeutic evidence and inform decision making. Our objective is to introduce this novel technique to pharmacy practitioners, and highlight key assumptions behind network meta-analysis studies.
ERIC Educational Resources Information Center
Vivo, Juana-Maria; Franco, Manuel
2008-01-01
This article attempts to present a novel application of a method of measuring accuracy for academic success predictors that could be used as a standard. This procedure is known as the receiver operating characteristic (ROC) curve, which comes from statistical decision techniques. The statistical prediction techniques provide predictor models and…
NASA Astrophysics Data System (ADS)
Prasad, S.; Bruce, L. M.
2007-04-01
There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. 
The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
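The confidence-measure-based fusion idea above, weighting each subspace classifier's vote by its training accuracy, can be sketched as follows. The subspace names, class labels, and accuracies are invented for illustration.

```python
def fuse(decisions, train_acc):
    """Confidence-weighted decision fusion: each classifier votes for its
    predicted class with weight equal to its training accuracy."""
    votes = {}
    for clf, label in decisions.items():
        votes[label] = votes.get(label, 0.0) + train_acc[clf]
    return max(votes, key=votes.get)

# Hypothetical per-subspace classifiers (contiguous band groups) and their
# training accuracies, used as confidence weights for one test pixel.
train_acc = {"bands_1": 0.92, "bands_2": 0.71, "bands_3": 0.65}
decisions = {"bands_1": "target", "bands_2": "clutter", "bands_3": "clutter"}
print(fuse(decisions, train_acc))  # "clutter": 0.71 + 0.65 outweighs 0.92
```

Two weaker classifiers in agreement can thus outvote a single confident one, which is the intended behavior of a multi-classifier fusion scheme over partitioned subspaces.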
Expert systems in civil engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostem, C.N.; Maher, M.L.
1986-01-01
This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.
Mühlbacher, Axel C; Kaczynski, Anika
2016-02-01
Healthcare decision making is usually characterized by a low degree of transparency. The demand for transparent decision processes can be fulfilled only when assessment, appraisal and decisions about health technologies are performed under a systematic construct of benefit assessment. The benefit of an intervention is often multidimensional and, thus, must be represented by several decision criteria. Complex decision problems require an assessment and appraisal of various criteria; therefore, a decision process that systematically identifies the best available alternative and enables an optimal and transparent decision is needed. For that reason, decision criteria must be weighted and goal achievement must be scored for all alternatives. Methods of multi-criteria decision analysis (MCDA) are available to analyse and appraise multiple clinical endpoints and structure complex decision problems in healthcare decision making. By means of MCDA, value judgments, priorities and preferences of patients, insurees and experts can be integrated systematically and transparently into the decision-making process. This article describes the MCDA framework and identifies potential areas where MCDA can be of use (e.g. approval, guidelines and reimbursement/pricing of health technologies). A literature search was performed to identify current research in healthcare. The results showed that healthcare decision making is addressing the problem of multiple decision criteria and is focusing on the future development and use of techniques to weight and score different decision criteria. This article emphasizes the use and future benefit of MCDA.
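The weight-and-score step at the heart of MCDA can be illustrated with a simple weighted-sum model, one of the most common MCDA scoring rules. The endpoints, weights, and therapy scores below are hypothetical.

```python
def mcda_rank(alternatives, weights):
    """Rank alternatives by the weighted sum of their criterion scores."""
    scored = {name: sum(weights[c] * s for c, s in crit.items())
              for name, crit in alternatives.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical appraisal of two therapies on three endpoints (0-1 scores),
# with criterion weights elicited from patients or experts summing to 1.
weights = {"efficacy": 0.5, "safety": 0.3, "cost": 0.2}
alternatives = {
    "therapy_A": {"efficacy": 0.9, "safety": 0.6, "cost": 0.4},
    "therapy_B": {"efficacy": 0.7, "safety": 0.9, "cost": 0.8},
}
print(mcda_rank(alternatives, weights))
```

The transparency the article calls for comes precisely from making these weights and scores explicit: changing the elicited weights changes the ranking in an auditable way.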
NASA Astrophysics Data System (ADS)
Asmar, Joseph Al; Lahoud, Chawki; Brouche, Marwan
2018-05-01
Cogeneration and trigeneration systems can contribute to the reduction of primary energy consumption and greenhouse gas emissions in residential and tertiary sectors, by reducing fossil fuel demand and grid losses with respect to conventional systems. Cogeneration systems are characterized by very high energy efficiency (80 to 90%) as well as a less polluting profile compared to conventional energy production. The integration of these systems into the energy network must simultaneously take into account their economic and environmental challenges. In this paper, a decision-making strategy is introduced in two parts: the first is a strategy based on a multi-objective optimization tool with data analysis, and the second is based on an optimization algorithm. The power dispatching of the Lebanese electricity grid is then simulated and considered as a case study in order to prove the compatibility of the cogeneration power calculated by our decision-making technique. In addition, the thermal energy produced by the cogeneration systems whose capacity is selected by our technique is compatible with the thermal demand for district heating.
Using Boosting Decision Trees in Gravitational Wave Searches triggered by Gamma-ray Bursts
NASA Astrophysics Data System (ADS)
Zuraw, Sarah; LIGO Collaboration
2015-04-01
The search for gravitational wave bursts requires the ability to distinguish weak signals from background detector noise. Gravitational wave bursts are characterized by their transient nature, making them particularly difficult to detect as they are similar to non-Gaussian noise fluctuations in the detector. The Boosted Decision Tree method is a powerful machine learning algorithm which uses Multivariate Analysis techniques to explore high-dimensional data sets in order to distinguish between gravitational wave signal and background detector noise. It does so by training with known noise events and simulated gravitational wave events. The method is tested using waveform models and compared with the performance of the standard gravitational wave burst search pipeline for Gamma-ray Bursts. It is shown that the method is able to effectively distinguish between signal and background events under a variety of conditions and over multiple Gamma-ray Burst events. This example demonstrates the usefulness and robustness of the Boosted Decision Tree and Multivariate Analysis techniques as a detection method for gravitational wave bursts. LIGO, UMass, PREP, NEGAP.
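A boosted decision tree can be sketched from scratch on a one-dimensional toy detection statistic. This is an illustrative AdaBoost with threshold stumps, not the LIGO analysis pipeline, and all training data are synthetic; +1 stands for simulated signal events and -1 for noise events.

```python
import math

def stump_predict(x, threshold, polarity):
    """Depth-1 decision tree: sign depends on which side of the threshold x falls."""
    return polarity if x > threshold else -polarity

def adaboost(samples, rounds=3):
    """Tiny AdaBoost: samples is a list of (x, y) with y in {-1, +1}."""
    w = [1.0 / len(samples)] * len(samples)
    ensemble = []
    xs = sorted(x for x, _ in samples)
    cands = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    for _ in range(rounds):
        best = None
        for t in cands:
            for pol in (+1, -1):
                err = sum(wi for wi, (x, y) in zip(w, samples)
                          if stump_predict(x, t, pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)  # avoid log(0) on separable data
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified events gain weight for the next round.
        w = [wi * math.exp(-alpha * y * stump_predict(x, t, pol))
             for wi, (x, y) in zip(w, samples)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, t, pol) for a, t, pol in ensemble)
    return 1 if score > 0 else -1

# Synthetic 1-D detection statistic: larger values look more signal-like.
train = [(0.2, -1), (0.5, -1), (0.9, -1), (1.8, 1), (2.2, 1), (2.9, 1)]
model = adaboost(train)
print(predict(model, 2.5), predict(model, 0.1))  # 1 -1
```

A production multivariate analysis boosts over many event features at once, but the mechanism is the same: reweighting hard events so successive trees concentrate on the signal/background boundary.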
Establishing Evidence for Internal Structure Using Exploratory Factor Analysis
ERIC Educational Resources Information Center
Watson, Joshua C.
2017-01-01
Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…
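The general steps of EFA can be sketched with a bare-bones eigendecomposition of the correlation matrix and the Kaiser retention criterion. This is an illustrative sketch on simulated data, not the article's full best-practice workflow (no rotation step is shown).

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 6 observed variables driven by 2 latent factors (illustrative
# data; a real EFA would start from survey or test items).
n = 500
f = rng.normal(size=(n, 2))
loadings_true = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                          [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
X = f @ loadings_true.T + 0.4 * rng.normal(size=(n, 6))

# Steps 1-2: correlation matrix of the observed variables.
R = np.corrcoef(X, rowvar=False)

# Step 3: extract factors; eigenvalues > 1 (Kaiser criterion) suggest how
# many factors to retain.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int((eigvals > 1.0).sum())

# Step 4: unrotated loadings for the retained factors.
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])

# Step 5: communalities (variance in each variable explained by the factors).
communalities = (loadings ** 2).sum(axis=1)
```

With a clean two-factor structure, the Kaiser criterion recovers two factors and the communalities show how much of each item's variance the retained factors account for.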
Analysis in Motion Initiative – Summarization Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Dustin; Pirrung, Meg; Jasper, Rob
2017-06-22
Analysts are tasked with integrating information from multiple data sources for important and timely decision making. What if sense making and overall situation awareness could be improved through visualization techniques? The Analysis in Motion initiative is advancing the ability to summarize and abstract multiple streams and static data sources over time.
A Compact Review of Multi-criteria Decision Analysis Uncertainty Techniques
2013-02-01
3.4 PROMETHEE-GAIA Method. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) and the Geometrical Analysis for Interactive Aid (GAIA) give decision makers greater understanding of the importance of their selections. The PROMETHEE method was designed to perform MCDA while accounting for each of these…
Investigation of Capabilities and Technologies Supporting Rapid UAV Launch System Development
2015-06-01
Author: Patrick Alan Livesay. Performing organization: Naval Postgraduate School, Monterey, CA 93943. …to operate. This enabled the launcher design team to more clearly determine and articulate system requirements and performance parameters. Next, an Analytic Hierarchy Process (AHP) was performed to prioritize the capabilities and assist in the decision-making process [1]. The AHP decision-analysis technique is…
A Survey of New Trends in Symbolic Execution for Software Testing and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Visser, Willem
2009-01-01
Symbolic execution is a well-known program analysis technique which represents values of program inputs with symbolic values instead of concrete (initialized) data and executes the program by manipulating program expressions involving the symbolic values. Symbolic execution was proposed over three decades ago, but it has recently found renewed interest in the research community, due in part to progress in decision procedures, the availability of powerful computers, and new algorithmic developments. We provide a survey of some of the new research trends in symbolic execution, with particular emphasis on applications to test generation and program analysis. We first describe an approach that handles complex programming constructs such as input data structures, arrays, and multi-threading. We follow with a discussion of abstraction techniques that can be used to limit the (possibly infinite) number of symbolic configurations that need to be analyzed for the symbolic execution of looping programs. Furthermore, we describe recent hybrid techniques that combine concrete and symbolic execution to overcome some of the inherent limitations of symbolic execution, such as handling native code or the availability of decision procedures for the application domain. Finally, we give a short survey of interesting new applications, such as predictive testing, invariant inference, program repair, analysis of parallel numerical programs and differential symbolic execution.
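The core mechanics of symbolic execution, forking at each branch and accumulating path constraints, can be sketched for a toy language. Real engines hand the collected constraints to a decision procedure to check feasibility and generate concrete inputs; this sketch only enumerates the paths.

```python
# A miniature symbolic executor. Programs are nested tuples:
#   ("if", (op, var, const), then_prog, else_prog)  or  ("ret", value)
# Execution forks at each "if", carrying the accumulated path condition, so
# every program path yields one (constraints, return value) pair.

def execute(prog, path=()):
    kind = prog[0]
    if kind == "ret":
        return [(list(path), prog[1])]
    _, (op, var, const), then_p, else_p = prog
    neg = {">": "<=", "<": ">=", "==": "!="}[op]
    true_paths = execute(then_p, path + (f"{var} {op} {const}",))
    false_paths = execute(else_p, path + (f"{var} {neg} {const}",))
    return true_paths + false_paths


# Classify a symbolic input x: "big" if x > 10, "small" if 0 < x <= 10,
# otherwise "non-positive".
program = ("if", (">", "x", 10),
           ("ret", "big"),
           ("if", (">", "x", 0), ("ret", "small"), ("ret", "non-positive")))

paths = execute(program)
# Each path doubles as a test case: any concrete x satisfying the
# constraints exercises exactly that branch.
```

This is exactly the view that makes symbolic execution useful for test generation: each feasible path condition characterizes one equivalence class of inputs.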
Scholz, Stefan; Mittendorf, Thomas
2014-12-01
Rheumatoid arthritis (RA) is a chronic inflammatory disease with severe effects on the functional ability of patients. With a prevalence of 0.5 to 1.0 percent in western countries, new treatment options are a major concern for decision makers with regard to their budget impact. In this context, cost-effectiveness analyses are a helpful tool for evaluating new treatment options for reimbursement schemes. The aim was to analyze and compare decision analytic modeling techniques and to explore their use in RA with regard to their advantages and shortcomings. A systematic literature review was conducted in PubMED and 58 studies reporting health economics decision models were analyzed with regard to the modeling technique used. Of the 58 reviewed publications, 13 reported decision tree analyses, 25 (cohort) Markov models, 13 individual sampling methods (ISM) and seven discrete event simulations (DES). Of these, 26 studies presented independently developed models and 32 presented adoptions. The modeling techniques used were found to differ in their complexity and in the number of treatment options compared. Methodological features are presented in the article and a comprehensive overview of the cost-effectiveness estimates is given in Additional files 1 and 2. When compared to the other modeling techniques, ISM and DES have advantages in covering patient heterogeneity, and DES is additionally capable of modeling more complex treatment sequences and competing risks in RA patients. Nevertheless, sufficient data must be available to avoid assumptions in ISM and DES exercises that could bias the results. Due to the different settings, time frames and interventions in the reviewed publications, no direct comparison of modeling techniques was possible.
The results from other indications suggest that incremental cost-effectiveness ratios (ICERs) do not differ significantly between Markov and DES models, but DES is able to report more outcome parameters. Given a sufficient data supply, DES is the modeling technique of choice when modeling cost-effectiveness in RA. Otherwise, transparency about the data inputs is crucial for valid results and for informing decision makers about possible biases. With regard to ICERs, Markov models might provide similar estimates to more advanced modeling techniques.
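A minimal (cohort) Markov model of the kind counted in the review can be sketched as follows. All states, transition probabilities, costs, and utilities are invented for illustration; a real model would take them from trials and registries.

```python
import numpy as np

# 3-state Markov cohort model (remission, active disease, dead) comparing a
# hypothetical new RA treatment against standard care. Rows/columns follow
# the state order: remission, active, dead.
P_standard = np.array([[0.85, 0.10, 0.05],
                       [0.10, 0.80, 0.10],
                       [0.00, 0.00, 1.00]])
P_new = np.array([[0.90, 0.06, 0.04],
                  [0.20, 0.72, 0.08],
                  [0.00, 0.00, 1.00]])

cost = {"standard": np.array([1000.0, 3000.0, 0.0]),   # annual cost per state
        "new": np.array([6000.0, 8000.0, 0.0])}
utility = np.array([0.85, 0.50, 0.0])                  # QALY weight per state


def run_cohort(P, state_costs, cycles=20, discount=0.03):
    dist = np.array([0.0, 1.0, 0.0])  # everyone starts with active disease
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t          # discount factor for cycle t
        total_cost += d * dist @ state_costs
        total_qaly += d * dist @ utility
        dist = dist @ P                           # advance the cohort one cycle
    return total_cost, total_qaly


c0, q0 = run_cohort(P_standard, cost["standard"])
c1, q1 = run_cohort(P_new, cost["new"])
icer = (c1 - c0) / (q1 - q0)  # incremental cost per QALY gained
```

The cohort structure is what distinguishes this from ISM and DES: the whole population is propagated as one distribution, so patient heterogeneity and history-dependent sequences cannot be represented directly.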
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Neural net diagnostics for VLSI test
NASA Technical Reports Server (NTRS)
Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.
1990-01-01
This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
Automatic Target Recognition Classification System Evaluation Methodology
2002-09-01
Figures include a testing set of two-class XOR data (250 samples) and a Decision Analysis Process Flow Chart. …ROC curve meta-analysis, which is the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies… The technique can be very effective in sensitivity analysis, in determining which data points have the most effect on the solution, and in…
Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.
Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L
2013-01-01
Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029
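The quaternion-packing idea behind QSA can be sketched as follows: four EEG channels become one quaternion per time sample, so the channels are processed as a single hypercomplex value rather than four independent series. The rotation-angle feature below is an illustrative choice, not the paper's exact feature set, and the EEG window is fake data.

```python
import numpy as np


def to_quaternions(eeg):
    """eeg: array of shape (n_samples, 4) -> unit quaternions (n_samples, 4)."""
    q = np.asarray(eeg, dtype=float)
    norms = np.linalg.norm(q, axis=1, keepdims=True)
    return q / np.where(norms == 0, 1.0, norms)  # guard against zero samples


def rotation_angles(q):
    """Angle between consecutive unit quaternions: 2*arccos(|<q_t, q_{t+1}>|)."""
    dots = np.abs((q[:-1] * q[1:]).sum(axis=1)).clip(0.0, 1.0)
    return 2.0 * np.arccos(dots)


rng = np.random.default_rng(3)
window = rng.normal(size=(256, 4))   # one 4-channel EEG window (fake data)
q = to_quaternions(window)
feature = rotation_angles(q).mean()  # scalar feature to feed a DT/SVM/KNN
```

Scalar features extracted per window like this one are what the abstract's DT, SVM, and KNN classifiers would consume.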
Extraction of decision rules via imprecise probabilities
NASA Astrophysics Data System (ADS)
Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.
2017-05-01
Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from the performance obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and the other based on a non-parametric model. The performance of the method is analyzed using a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.
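The contrast between a classic and a credal split criterion can be sketched for a binary class variable under the Imprecise Dirichlet Model (IDM). This follows the general idea behind credal criteria, scoring a node by the maximum entropy over a probability interval rather than a point estimate, and is not the exact formulation of the paper.

```python
import math

# Under the IDM with parameter s, after observing (a, b) class counts the
# class-1 probability lies in the interval [a/(N+s), (a+s)/(N+s)].  A credal
# criterion scores a node by the MAXIMUM entropy over that interval, which
# automatically penalises small samples.

def entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def classic_score(a, b):
    """Shannon entropy of the empirical class distribution."""
    return entropy(a / (a + b))


def credal_score(a, b, s=1.0):
    """Max entropy over the IDM probability interval for class 1."""
    n = a + b
    lo, hi = a / (n + s), (a + s) / (n + s)
    p = min(max(0.5, lo), hi)  # entropy is maximised at the p closest to 1/2
    return entropy(p)


# Same 80/20 class split, very different sample sizes: the credal score is
# higher (more cautious) for 8-vs-2 than for 800-vs-200, while the classic
# score cannot tell the two sample sizes apart.
small = credal_score(8, 2)
large = credal_score(800, 200)
```

This sample-size sensitivity is what makes credal criteria behave differently from classic ones when growing trees on sparse or noisy nodes.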
Planning effectiveness may grow on fault trees.
Chow, C W; Haddad, K; Mannino, B
1991-10-01
The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
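A fault tree of the kind described reduces, under the common textbook assumption of independent basic events, to simple gate arithmetic: AND gates multiply probabilities, OR gates combine them as one minus the product of complements. The planning scenario and probabilities below are invented for illustration.

```python
# Minimal fault-tree evaluation. Nodes are tuples:
#   ("basic", name)  |  ("and", [children])  |  ("or", [children])
# Top event here: "initiative fails" if funding is lost OR
# (staff shortage AND no partner organisation).

def evaluate(node, probs):
    kind = node[0]
    if kind == "basic":
        return probs[node[1]]
    child_ps = [evaluate(c, probs) for c in node[1]]
    if kind == "and":
        out = 1.0
        for p in child_ps:
            out *= p                # independent events: P(A and B) = P(A)P(B)
        return out
    if kind == "or":
        out = 1.0
        for p in child_ps:
            out *= (1.0 - p)        # P(A or B) = 1 - P(not A)P(not B)
        return 1.0 - out
    raise ValueError(kind)


tree = ("or", [("basic", "funding_lost"),
               ("and", [("basic", "staff_shortage"),
                        ("basic", "no_partner")])])

probs = {"funding_lost": 0.10, "staff_shortage": 0.30, "no_partner": 0.20}
top_event = evaluate(tree, probs)   # P(initiative fails)
```

The value of the exercise for planners is less the number itself than the logical structure: the tree forces an explicit enumeration of the ways the top event can occur.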
Use of Inverse Reinforcement Learning for Identity Prediction
NASA Technical Reports Server (NTRS)
Hayes, Roy; Bao, Jonathan; Beling, Peter; Horowitz, Barry
2011-01-01
We adopt Markov Decision Processes (MDP) to model sequential decision problems, which have the characteristic that the current decision made by a human decision maker has an uncertain impact on future opportunity. We hypothesize that the individuality of decision makers can be modeled as differences in the reward function under a common MDP model. A machine learning technique, Inverse Reinforcement Learning (IRL), was used to learn an individual's reward function based on limited observation of his or her decision choices. This work serves as an initial investigation for using IRL to analyze decision making, conducted through a human experiment in a cyber shopping environment. Specifically, the ability to determine the demographic identity of users is conducted through prediction analysis and supervised learning. The results show that IRL can be used to correctly identify participants, at a rate of 68% for gender and 66% for one of three college major categories.
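The forward half of the IRL setting, a reward that is linear in state features inducing an optimal policy, can be sketched with value iteration on a toy MDP. IRL itself inverts this mapping (searching for weights that explain observed choices) and is not shown; states, features, and numbers are illustrative only.

```python
import numpy as np

n_states, n_actions, gamma = 4, 2, 0.9

# P[a, s, s']: transition probabilities for each action.
P = np.zeros((n_actions, n_states, n_states))
P[0] = np.eye(n_states)                      # action 0: stay put
P[1] = np.roll(np.eye(n_states), 1, axis=1)  # action 1: move to next state

features = np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [1.0, 1.0],
                     [0.0, 0.0]])
weights = np.array([0.2, 1.0])  # an individual's (unknown, in IRL) weights
reward = features @ weights     # state rewards induced by the weights


def value_iteration(P, r, gamma, iters=200):
    V = np.zeros(P.shape[1])
    for _ in range(iters):
        Q = r + gamma * (P @ V)   # Q[a, s]: action-values given current V
        V = Q.max(axis=0)
    return V, Q.argmax(axis=0)    # values and the greedy policy


V, policy = value_iteration(P, reward, gamma)
# State 2 has the highest induced reward (1.2), so the optimal policy stays
# there; from state 1 it is worth moving on toward state 2.
```

In IRL the observed policy (or sampled trajectories) is the data, and the `weights` vector is the unknown; differences between decision makers then appear as differences in recovered weights under the shared MDP.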
Nahid, Abdullah-Al; Mehrabi, Mohamad Ali; Kong, Yinan
2018-01-01
Breast cancer is a serious threat and one of the leading causes of death among women throughout the world. The identification of cancer largely depends on digital biomedical photography analysis, such as analysis of histopathological images by doctors and physicians. Analyzing histopathological images is a nontrivial task, and decisions based on investigating such images always require specialised knowledge. However, Computer Aided Diagnosis (CAD) techniques can help the doctor make more reliable decisions. The state-of-the-art Deep Neural Network (DNN) has recently been introduced for biomedical image analysis. Normally each image contains structural and statistical information. This paper classifies a set of biomedical breast cancer images (BreakHis dataset) using novel DNN techniques guided by structural and statistical information derived from the images. Specifically a Convolutional Neural Network (CNN), a Long-Short-Term-Memory (LSTM), and a combination of CNN and LSTM are proposed for breast cancer image classification. Softmax and Support Vector Machine (SVM) layers have been used for the decision-making stage after extracting features utilising the proposed novel DNN models. In this experiment the best Accuracy value of 91.00% is achieved on the 200x dataset, the best Precision value of 96.00% is achieved on the 40x dataset, and the best F-Measure value is achieved on both the 40x and 100x datasets.
Discriminant forest classification method and system
Chen, Barry Y.; Hanley, William G.; Lemmond, Tracy D.; Hiller, Lawrence J.; Knapp, David A.; Mugge, Marshall J.
2012-11-06
A hybrid machine learning methodology and system for classification that combines classical random forest (RF) methodology with discriminant analysis (DA) techniques to provide enhanced classification capability. A DA technique which uses feature measurements of an object to predict its class membership, such as linear discriminant analysis (LDA) or Andersen-Bahadur linear discriminant technique (AB), is used to split the data at each node in each of its classification trees to train and grow the trees and the forest. When training is finished, a set of n DA-based decision trees of a discriminant forest is produced for use in predicting the classification of new samples of unknown class.
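A loose sketch of the ensemble idea, bootstrap sampling combined with discriminant fits, is below. Note the simplification: the patented system uses LDA/AB splits inside each tree of a forest, whereas here each ensemble member is a single two-class Fisher discriminant, which keeps the sketch short while still illustrating the bootstrap-plus-discriminant combination.

```python
import numpy as np

rng = np.random.default_rng(2)


def fit_lda(X, y):
    """Two-class Fisher discriminant: w = Sw^-1 (mu1 - mu0), midpoint bias."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), mu1 - mu0)
    b = -w @ (mu0 + mu1) / 2.0
    return w, b


def fit_forest(X, y, n_members=25):
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(y), size=len(y))  # bootstrap sample
        members.append(fit_lda(X[idx], y[idx]))
    return members


def predict(members, X):
    votes = sum((X @ w + b > 0).astype(int) for w, b in members)
    return (votes * 2 > len(members)).astype(int)   # majority vote


# Toy data: two Gaussian classes in three dimensions.
X = np.vstack([rng.normal(0, 1, (150, 3)), rng.normal(1.5, 1, (150, 3))])
y = np.hstack([np.zeros(150, dtype=int), np.ones(150, dtype=int)])
forest = fit_forest(X, y)
accuracy = (predict(forest, X) == y).mean()
```

The motivation carried over from the patent is that discriminant-based decisions use all features jointly at each fit, rather than the single-feature threshold splits of a classical random forest.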
Development and evaluation of an automatic labeling technique for spring small grains
NASA Technical Reports Server (NTRS)
Crist, E. P.; Malila, W. A. (Principal Investigator)
1981-01-01
A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analysts provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvements and implications of the test results are discussed.
Marsh, Kevin; IJzerman, Maarten; Thokala, Praveen; Baltussen, Rob; Boysen, Meindert; Kaló, Zoltán; Lönngren, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Devlin, Nancy
2016-01-01
Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making. A set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. In 2014, ISPOR established an Emerging Good Practices Task Force. The task force's first report defined MCDA, provided examples of its use in health care, described the key steps, and provided an overview of the principal methods of MCDA. This second task force report provides emerging good-practice guidance on the implementation of MCDA to support health care decisions. The report includes: a checklist to support the design, implementation and review of an MCDA; guidance to support the implementation of the checklist; the order in which the steps should be implemented; an illustration of how to incorporate budget constraints into an MCDA; an overview of the skills and resources, including available software, required to implement MCDA; and future research directions. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
ATR evaluation through the synthesis of multiple performance measures
NASA Astrophysics Data System (ADS)
Bassham, Christopher B.; Klimack, William K.; Bauer, Kenneth W., Jr.
2002-07-01
This research demonstrates the application of decision analysis (DA) techniques to decisions made within Automatic Target Recognition (ATR) technology development, with the aim of improving the means by which ATR technologies are evaluated. The first step in this research was to create a flexible decision analysis framework that could be applied to several decisions across different ATR programs evaluated by the Comprehensive ATR Scientific Evaluation (COMPASE) Center of the Air Force Research Laboratory (AFRL). For the purposes of this research, a single COMPASE Center representative provided the value, utility, and preference functions for the DA framework. The DA framework employs performance measures collected during ATR classification system (CS) testing to calculate value and utility scores. The authors gathered data from the Moving and Stationary Target Acquisition and Recognition (MSTAR) program to demonstrate how the decision framework could be used to evaluate three different ATR CSs. A decision-maker may use the resultant scores to gain insight into any of the decisions that occur throughout the lifecycle of ATR technologies. Additionally, a means of evaluating ATR CS self-assessment ability is presented. This represents a new criterion that emerged from this study, and no existing evaluation metric is known.
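The additive value model at the heart of such a DA framework can be sketched as follows: each classifier's performance measures are mapped to 0-1 value scores by single-attribute value functions and combined with stakeholder weights. The measures, weights, and value-function shapes below are invented stand-ins for the ones elicited from the COMPASE Center representative.

```python
weights = {"pd": 0.5, "far": 0.3, "time": 0.2}  # stakeholder weights, sum to 1


def value(measure, x):
    """Single-attribute value functions (assumed piecewise-linear shapes)."""
    if measure == "pd":     # probability of detection: more is better
        return x
    if measure == "far":    # false-alarm rate: less is better, 0.2 is worst
        return 1.0 - min(x / 0.2, 1.0)
    if measure == "time":   # seconds per image: less is better, cap at 10 s
        return 1.0 - min(x / 10.0, 1.0)
    raise ValueError(measure)


def score(system):
    """Weighted additive value across the performance measures."""
    return sum(w * value(m, system[m]) for m, w in weights.items())


systems = {
    "ATR-A": {"pd": 0.90, "far": 0.05, "time": 4.0},
    "ATR-B": {"pd": 0.95, "far": 0.15, "time": 2.0},
    "ATR-C": {"pd": 0.80, "far": 0.02, "time": 7.0},
}
ranking = sorted(systems, key=lambda s: score(systems[s]), reverse=True)
```

The weights make the trade-offs explicit: here the high false-alarm rate of ATR-B outweighs its best-in-class detection probability, which is exactly the kind of insight the framework is meant to surface.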
Bevilacqua, M; Ciarapica, F E; Giacchetta, G
2008-07-01
This work is an attempt to apply classification tree methods to data regarding accidents in a medium-sized refinery, so as to identify the important relationships between the variables, which can be considered as decision-making rules when adopting any measures for improvement. The results obtained using the CART (Classification And Regression Trees) method proved to be the most precise and, in general, they are encouraging concerning the use of tree diagrams as preliminary explorative techniques for the assessment of the ergonomic, management and operational parameters which influence high accident risk situations. The Occupational Injury analysis carried out in this paper was planned as a dynamic process and can be repeated systematically. The CART technique, which considers a very wide set of objective and predictive variables, shows new cause-effect correlations in occupational safety which had never been previously described, highlighting possible injury risk groups and supporting decision-making in these areas. The use of classification trees must not, however, be seen as an attempt to supplant other techniques, but as a complementary method which can be integrated into traditional types of analysis.
Paraconsistent Annotated Logic in Viability Analysis: an Approach to Product Launching
NASA Astrophysics Data System (ADS)
Romeu de Carvalho, Fábio; Brunstein, Israel; Abe, Jair Minoro
2004-08-01
In this paper we present an application of the Para-analyzer, a logical analyzer based on the Paraconsistent Annotated Logic Pτ introduced by Da Silva Filho and Abe, in decision-making systems. An example is analyzed in detail, showing how uncertainty, inconsistency and paracompleteness can be elegantly handled with this logical system. As an application of the Para-analyzer in decision-making, we developed the BAM (Baricenter Analysis Method). To make the presentation easier, we present the BAM applied to the viability analysis of product launching. Techniques of Paraconsistent Annotated Logic have been applied in Artificial Intelligence, Robotics, Information Technology (Computer Science), etc.
Park, Myonghwa; Choi, Sora; Shin, A Mi; Koo, Chul Hoi
2013-02-01
The purpose of this study was to develop a prediction model for the characteristics of older adults with depression using the decision tree method. A large dataset from the 2008 Korean Elderly Survey was used and data of 14,970 elderly people were analyzed. Target variable was depression and 53 input variables were general characteristics, family & social relationship, economic status, health status, health behavior, functional status, leisure & social activity, quality of life, and living environment. Data were analyzed by decision tree analysis, a data mining technique using SPSS Window 19.0 and Clementine 12.0 programs. The decision trees were classified into five different rules to define the characteristics of older adults with depression. Classification & Regression Tree (C&RT) showed the best prediction with an accuracy of 80.81% among data mining models. Factors in the rules were life satisfaction, nutritional status, daily activity difficulty due to pain, functional limitation for basic or instrumental daily activities, number of chronic diseases and daily activity difficulty due to disease. The different rules classified by the decision tree model in this study should contribute as baseline data for discovering informative knowledge and developing interventions tailored to these individual characteristics.
History matching through dynamic decision-making
Maschio, Célio; Santos, Antonio Alberto; Schiozer, Denis; Rocha, Anderson
2017-01-01
History matching is the process of modifying the uncertain attributes of a reservoir model to reproduce the real reservoir performance. It is a classical reservoir engineering problem and plays an important role in reservoir management, since the resulting models are used to support decisions in other tasks such as economic analysis and production strategy. This work introduces a dynamic decision-making optimization framework for history matching problems in which new models are generated based on, and guided by, the dynamic analysis of the data of available solutions. The optimization framework follows a 'learning-from-data' approach, and includes two optimizer components that use machine learning techniques, such as unsupervised learning and statistical analysis, to uncover patterns of input attributes that lead to good output responses. These patterns are used to support the decision-making process while generating new, and better, history-matched solutions. The proposed framework is applied to a benchmark model (UNISIM-I-H) based on the Namorado field in Brazil. Results show the potential of the dynamic decision-making optimization framework for improving the quality of history-matching solutions using a substantially smaller number of simulations compared with a previous work on the same benchmark. PMID:28582413
NASA Astrophysics Data System (ADS)
Roy, Jean; Breton, Richard; Paradis, Stephane
2001-08-01
Situation Awareness (SAW) is essential for commanders to conduct decision-making (DM) activities. Situation Analysis (SA) is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of SAW for the decision maker. Operational trends in warfare put the situation analysis process under pressure. This emphasizes the need for a real-time computer-based Situation analysis Support System (SASS) to aid commanders in achieving the appropriate situation awareness, thereby supporting their response to actual or anticipated threats. Data fusion is clearly a key enabler for SA and a SASS. Since data fusion is used for SA in support of dynamic human decision-making, the exploration of the SA concepts and the design of data fusion techniques must take human-factor aspects into account in order to ensure a cognitive fit of the fusion system with the decision maker. Indeed, the tight integration of the human element with the SA technology is essential. Regarding these issues, this paper provides a description of CODSI (Command Decision Support Interface), an operational-like human-machine interface prototype for investigations in computer-based SA and command decision support. With CODSI, one objective was to apply recent developments in SA theory and information display technology to the problem of enhancing SAW quality. It thus provides a capability to adequately convey tactical information to command decision makers. It also supports the study of human-computer interactions for SA, and methodologies for SAW measurement.
A fuzzy MCDM framework based on fuzzy measure and fuzzy integral for agile supplier evaluation
NASA Astrophysics Data System (ADS)
Dursun, Mehtap
2017-06-01
Supply chains need to be agile in order to respond quickly to changes in today's competitive environment. The success of an agile supply chain depends on the firm's ability to select the most appropriate suppliers. This study proposes a multi-criteria decision-making technique, based on a multi-level hierarchical structure and fuzzy logic, for the evaluation of agile suppliers. The ideal and anti-ideal solutions are taken into consideration simultaneously in the developed approach. The proposed decision approach enables decision-makers to use linguistic terms, and thus reduces their cognitive burden in the evaluation process. Furthermore, a hierarchy of evaluation criteria and their related sub-criteria is employed in the presented approach in order to conduct a more effective analysis.
NASA Astrophysics Data System (ADS)
Soltanmohammadi, Hossein; Osanloo, Morteza; Aghajani Bazzazi, Abbas
2009-08-01
This study takes advantage of a previously developed framework for mined land suitability analysis (MLSA), consisting of economic, social, technical and mine-site factors, to achieve a partial and also a complete pre-order of feasible post-mining land-uses. Analysis by an outranking multi-attribute decision-making (MADM) technique called PROMETHEE (preference ranking organization method for enrichment evaluation) was chosen because of its clear advantages in the field of MLSA compared with MADM ranking techniques. Application of the proposed approach to a mined land proceeds through several successive steps. First, the performance of the MLSA attributes is scored locally by each individual decision maker (DM). The assigned performance scores are then normalized and the deviation amplitudes of non-dominated alternatives are calculated. Weights of the attributes are calculated in a separate procedure by another MADM technique, the analytic hierarchy process (AHP). Using the Gaussian preference function together with the weights, the preference indices of the land-use alternatives are obtained. Calculating the outgoing and entering flows of the alternatives and comparing these values pairwise leads to a partial pre-order of the alternatives, while calculating the net flows leads to a ranked preference for each land-use. At the final step, a consensual ranking can be derived using the PROMETHEE group decision support system, which incorporates the judgments of all the DMs. In this paper, the preference order of post-mining land-uses for a hypothetical mined land is derived from the judgments of one DM to demonstrate the applicability of the proposed approach.
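The PROMETHEE flow computations described above can be sketched with a Gaussian preference function. The alternatives (post-mining land-uses), the criteria scores, the AHP-style weights, and the preference parameter are all invented numbers for illustration.

```python
import numpy as np

alternatives = ["farming", "forestry", "recreation", "construction"]
# Rows: alternatives; columns: criteria (higher = better after normalisation).
scores = np.array([[0.7, 0.4, 0.6],
                   [0.8, 0.6, 0.3],
                   [0.5, 0.9, 0.5],
                   [0.3, 0.5, 0.9]])
weights = np.array([0.5, 0.3, 0.2])  # e.g. from a separate AHP exercise
s = 0.3                              # Gaussian preference parameter


def gaussian_pref(d, s):
    """Preference grows with the score difference d; zero when d <= 0."""
    return np.where(d > 0, 1.0 - np.exp(-(d ** 2) / (2 * s ** 2)), 0.0)


n = len(alternatives)
# pi[i, j]: weighted preference index of alternative i over alternative j.
pi = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            d = scores[i] - scores[j]
            pi[i, j] = weights @ gaussian_pref(d, s)

phi_plus = pi.sum(axis=1) / (n - 1)   # outgoing (leaving) flow
phi_minus = pi.sum(axis=0) / (n - 1)  # entering flow
phi = phi_plus - phi_minus            # net flow: complete (PROMETHEE II) ranking
ranking = [alternatives[i] for i in np.argsort(phi)[::-1]]
```

Comparing `phi_plus` and `phi_minus` pairwise gives the partial pre-order (PROMETHEE I); sorting by the net flow `phi`, as here, gives the complete pre-order.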
A platform for proactive, risk-based slope asset management, phase II.
DOT National Transportation Integrated Search
2015-03-01
The lidar visualization technique developed by this project enables highway managers to understand changes in slope characteristics along highways. This change detection and analysis can be the basis of informed decisions for slope inspection and r...
A platform for proactive, risk-based slope asset management, phase II.
DOT National Transportation Integrated Search
2015-08-01
The lidar visualization technique developed by this project enables highway managers to understand changes in slope characteristics along highways. This change detection and analysis can be the basis of informed decisions for slope inspection and...
2006-06-01
Only fragments of this record survive. It concerns a distinction within the CSE community between Cognitive Task Analysis (CTA) and Cognitive Work Analysis, and cites Pirolli, P. and Card, S. (2005), "The sensemaking process and leverage points for analyst technology as identified through cognitive task analysis," and a 2000 chapter on cognitive task analysis as bootstrapping multiple converging techniques (in Schraagen, Chipman, and Shalin, Eds.).
Dotson, G Scott; Hudson, Naomi L; Maier, Andrew
2015-01-01
Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (i.e., preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much-needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management.
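A minimal sketch of the prioritization module's ranking step, assuming a simple weighted-sum MCDA over invented criteria and resources (the article defines its own criteria and resource lists):

```python
# Hedged sketch: rank information resources by a weighted MCDA score.
# Criteria, weights, and resources are invented for illustration.

WEIGHTS = {"phase_relevance": 0.5, "hazard_detail": 0.3, "accessibility": 0.2}

resources = [
    {"name": "skin-notation profiles", "phase_relevance": 5, "hazard_detail": 5, "accessibility": 3},
    {"name": "safety data sheets",     "phase_relevance": 4, "hazard_detail": 3, "accessibility": 5},
    {"name": "field guides",           "phase_relevance": 3, "hazard_detail": 2, "accessibility": 5},
]

def mcda_score(res):
    """Weighted sum of 1-5 criterion ratings, normalized to 0-1."""
    return sum(w * res[c] for c, w in WEIGHTS.items()) / 5.0

# highest-scoring resources surface first for the decision module
ranked = sorted(resources, key=mcda_score, reverse=True)
```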
Modeling paradigms for medical diagnostic decision support: a survey and future directions.
Wagholikar, Kavishwar B; Sundararajan, Vijayraghavan; Deshpande, Ashok W
2012-10-01
The use of computer-based decision tools to aid clinical decision making has been a primary goal of research in biomedical informatics. Research in the last five decades has led to the development of Medical Decision Support (MDS) applications using a variety of modeling techniques, for a diverse range of medical decision problems. This paper surveys the literature on modeling techniques for diagnostic decision support, with a focus on decision accuracy. Trends and shortcomings of research in this area are discussed and future directions are provided. The authors suggest that: (i) improvement in the accuracy of MDS applications may be possible through modeling of vague and temporal data, research on inference algorithms, integration of patient information from diverse sources, and improvement in gene profiling algorithms; (ii) MDS research would be facilitated by public release of de-identified medical datasets and development of open-source data-mining tool kits; (iii) comparative evaluations of different modeling techniques are required to understand their characteristics, which can guide developers in the choice of technique for a particular medical decision problem; and (iv) evaluations of MDS applications in clinical settings are necessary to foster physicians' utilization of these decision aids.
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
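The probabilistic flavor of such a program can be sketched as Monte Carlo sampling over a fatigue-crack-growth law. Everything below (the Paris-law form, the constants, the lognormal scatter) is an illustrative assumption, not the NASA program's actual model:

```python
import math
import random

# Hedged sketch of probabilistic life estimation: Monte Carlo sampling of a
# Paris-law crack-growth life with scatter in the material constant C.
# All constants and distributions are illustrative stand-ins.

def cycles_to_failure(a0, ac, C, m, d_sigma, Y=1.0, da=1e-5):
    """Numerically integrate da/dN = C * (dK)^m from crack size a0 to ac."""
    a, n = a0, 0.0
    while a < ac:
        dK = Y * d_sigma * math.sqrt(math.pi * a)  # stress-intensity range
        n += da / (C * dK ** m)                    # cycles spent on this step
        a += da
    return n

random.seed(0)
lives = sorted(
    cycles_to_failure(1e-3, 1e-2,
                      random.lognormvariate(math.log(1e-11), 0.2),  # scattered C
                      3.0, 100.0)
    for _ in range(200)
)
b10_life = lives[len(lives) // 10]  # life at roughly 10% failure probability
```

A percentile of the sampled life distribution, rather than a single deterministic life, is what supports the statistical risk assessment the abstract describes.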
Saigal, Christopher S; Lambrechts, Sylvia I; Seenu Srinivasan, V; Dahan, Ely
2017-06-01
Many guidelines advocate the use of shared decision making for men with newly diagnosed prostate cancer. Decision aids can facilitate the process of shared decision making. Implicit in this approach is the idea that physicians understand which elements of treatment matter to patients, yet little formal work exists to guide physicians or developers of decision aids in identifying these attributes. We use a mixed-methods technique adapted from marketing science, the 'Voice of the Patient', to describe and identify treatment elements of value for men with localized prostate cancer. We conducted semi-structured interviews with 30 men treated for prostate cancer in the urology clinic of the West Los Angeles Veterans Affairs Medical Center. We used a qualitative analysis to generate themes in patient narratives, and a quantitative approach, agglomerative hierarchical clustering, to identify attributes of treatment that were most relevant to patients making decisions about prostate cancer. We identified five 'traditional' prostate cancer treatment attributes: sexual dysfunction, bowel problems, urinary problems, lifespan, and others' opinions. We further identified two novel treatment attributes: a treatment's ability to validate a sense of proactivity and the need for an incision (separate from the risks of surgery). Application of a successful marketing technique, the 'Voice of the Customer', in a clinical setting elicits non-obvious attributes that highlight unique patient decision-making concerns. Use of this method in the development of decision aids may result in more effective decision support.
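The agglomerative step can be sketched as bottom-up, single-linkage merging of interview items. The items and the dissimilarity matrix below are hypothetical stand-ins for the study's coded transcript data:

```python
# Sketch of agglomerative (bottom-up) clustering of interview items into
# treatment attributes. Items and distances are invented for illustration.

items = ["erection problems", "intimacy", "bowel urgency", "diarrhea", "surgery scar"]
# symmetric dissimilarity matrix (0 = always co-mentioned, 1 = never)
D = [[0.0, 0.2, 0.9, 0.8, 0.7],
     [0.2, 0.0, 0.8, 0.9, 0.8],
     [0.9, 0.8, 0.0, 0.1, 0.9],
     [0.8, 0.9, 0.1, 0.0, 0.8],
     [0.7, 0.8, 0.9, 0.8, 0.0]]

def single_linkage(clusters, a, b):
    """Distance between two clusters = minimum pairwise item distance."""
    return min(D[i][j] for i in clusters[a] for j in clusters[b])

clusters = [[i] for i in range(len(items))]
while len(clusters) > 3:  # stop at 3 candidate attributes
    pairs = [(a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))]
    a, b = min(pairs, key=lambda p: single_linkage(clusters, *p))
    clusters[a] = clusters[a] + clusters[b]  # merge the closest pair
    del clusters[b]

grouped = [sorted(items[i] for i in c) for c in clusters]
```

Items that patients mention together merge early, so the surviving clusters correspond to candidate treatment attributes.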
Online Heterogeneous Transfer by Hedge Ensemble of Offline and Online Decisions.
Yan, Yuguang; Wu, Qingyao; Tan, Mingkui; Ng, Michael K; Min, Huaqing; Tsang, Ivor W
2017-10-10
In this paper, we study the online heterogeneous transfer (OHT) learning problem, where the target data of interest arrive in an online manner, while the source data and auxiliary co-occurrence data are from offline sources and can be easily annotated. OHT is very challenging, since the feature spaces of the source and target domains are different. To address this, we propose a novel technique called OHT by hedge ensemble by exploiting both offline knowledge and online knowledge of different domains. To this end, we build an offline decision function based on a heterogeneous similarity that is constructed using labeled source data and unlabeled auxiliary co-occurrence data. After that, an online decision function is learned from the target data. Last, we employ a hedge weighting strategy to combine the offline and online decision functions to exploit knowledge from the source and target domains of different feature spaces. We also provide a theoretical analysis regarding the mistake bounds of the proposed approach. Comprehensive experiments on three real-world data sets demonstrate the effectiveness of the proposed technique.
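The hedge weighting strategy itself is simple to sketch: keep a multiplicative weight per decision function and discount whichever one errs. The two expert functions and the data stream below are toy stand-ins for the learned offline and online decision functions:

```python
# Minimal hedge-ensemble sketch: two experts (an offline decision function
# and an online one), multiplicative weight updates on mistakes, and a
# weighted-majority combined prediction. Both experts are toy stand-ins.

def offline_expert(x):  # pretend this was trained on source + co-occurrence data
    return 1 if x[0] > 0 else -1

def online_expert(x):   # deliberately weak: always predicts +1
    return 1

BETA = 0.9              # weight discount applied to a wrong expert
weights = {"offline": 1.0, "online": 1.0}
stream = [((0.5,), 1), ((-0.3,), -1), ((-0.8,), -1), ((0.2,), 1), ((-0.1,), -1)]

mistakes = 0
for x, y in stream:
    preds = {"offline": offline_expert(x), "online": online_expert(x)}
    total = sum(weights.values())
    score = sum(weights[e] / total * preds[e] for e in preds)
    y_hat = 1 if score >= 0 else -1       # combined (weighted-majority) decision
    mistakes += y_hat != y
    for e in preds:                       # hedge update: discount wrong experts
        if preds[e] != y:
            weights[e] *= BETA
```

After a few rounds the weaker expert's weight decays, so the combined decision tracks whichever knowledge source (offline or online) is more reliable.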
Critical care physician cognitive task analysis: an exploratory study
Fackler, James C; Watts, Charles; Grome, Anna; Miller, Thomas; Crandall, Beth; Pronovost, Peter
2009-01-01
Introduction: For better or worse, the imposition of work-hour limitations on house-staff has imperiled continuity and/or improved decision-making. Regardless, the workflow of every physician team in every academic medical centre has been irrevocably altered. We explored the use of cognitive task analysis (CTA) techniques, most commonly used in other high-stress and time-sensitive environments, to analyse key cognitive activities in critical care medicine. The study objective was to assess the usefulness of CTA as an analytical tool in order that physician cognitive tasks may be understood and redistributed within the work-hour limited medical decision-making teams. Methods: After approval from each Institutional Review Board, two intensive care units (ICUs) within major university teaching hospitals served as data collection sites for CTA observations and interviews of critical care providers. Results: Five broad categories of cognitive activities were identified: pattern recognition; uncertainty management; strategic vs. tactical thinking; team coordination and maintenance of common ground; and creation and transfer of meaning through stories. Conclusions: CTA within the framework of Naturalistic Decision Making is a useful tool to understand the critical care process of decision-making and communication. The separation of strategic and tactical thinking has implications for workflow redesign. Given the global push for work-hour limitations, such workflow redesign is occurring. Further work with CTA techniques will provide important insights toward rational, rather than random, workflow changes. PMID:19265517
Kateeb, E T; Warren, J J; Gaeth, G J; Momany, E T; Damiano, P C
2016-04-01
When traditional ranking and rating surveys are used to assess dentists' treatment decisions, the patient's source of payment appears to be of little importance. Therefore, this study used the marketing research tool conjoint analysis to investigate the relative impact of source of payment along with the child's age and cooperativeness on pediatric dentists' willingness to use Atraumatic Restorative Treatment (ART) to restore posterior primary teeth. A conjoint survey was completed by 707 pediatric dentists. Three factors (age of the child, cooperativeness, type of insurance) were varied across 3 levels to create 9 patient scenarios. The relative weights that dentists placed on these factors in the restorative treatment decision process were determined by conjoint analysis. "Cooperativeness" (52%) was the most important factor, "age of the child" (26%) the second-most important factor, followed by "insurance status of the child" (22%). For the third factor, insurance, pediatric dentists were least willing to use ART with publicly insured children (-0.082), and this was significantly different from their willingness to use ART with uninsured children (0.010) but not significantly different from their willingness to use ART for children with private insurance (0.073). Unlike traditional ranking and rating tools, conjoint analysis found that the insurance status of the patient appeared to be an important factor in dentists' decisions about different restorative treatment options. When pediatric dentists were forced to make tradeoffs among different patient factors, they were most willing to use the ART technique with young, uncooperative patients when they had no insurance. Knowledge Transfer Statement: The present study suggests the feasibility of using techniques borrowed from marketing research, such as conjoint analysis, to understand dentists' restorative treatment decisions.
Results of this study demonstrate pediatric dentists' willingness to use a particular restorative treatment option (Atraumatic Restorative Treatment in this application) when forced to make tradeoffs in a "conjoined," or holistic, context among different factors presented in real-life patient scenarios. A deeper understanding of dentists' treatment decisions is vital to develop valid practice guidelines and interventions that encourage the use of appropriate restorative treatment modalities.
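The relative-importance percentages above follow from part-worth utility ranges. In the sketch below, the insurance part-worths are the values quoted in the abstract, while the cooperativeness and age part-worths are invented and scaled so the computation reproduces the reported 52/26/22 split:

```python
# Sketch of conjoint relative importance: each factor's importance is the
# range of its part-worth utilities divided by the sum of all ranges.
# Insurance part-worths come from the abstract; the other two sets are
# invented, scaled to reproduce the reported 52/26/22 importances.

part_worths = {
    "cooperativeness": {"cooperative": 0.183, "neutral": 0.0, "uncooperative": -0.183},
    "age": {"6 years": 0.0915, "4 years": 0.0, "2 years": -0.0915},
    "insurance": {"private": 0.073, "uninsured": 0.010, "public": -0.082},
}

ranges = {f: max(v.values()) - min(v.values()) for f, v in part_worths.items()}
total = sum(ranges.values())
importance = {f: round(100 * r / total) for f, r in ranges.items()}
```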
Washington, Karla T.; Oliver, Debra Parker; Gage, L. Ashley; Albright, David L.; Demiris, George
2015-01-01
Background: Much of the existing research on shared decision-making in hospice and palliative care focuses on the provider-patient dyad; little is known about shared decision-making that is inclusive of family members of patients with advanced disease. Aim: We sought to describe shared decision-making as it occurred in hospice interdisciplinary team meetings that included family caregivers as participants using video-conferencing technology. Design: We conducted a multimethod study in which we used content and thematic analysis techniques to analyze video-recordings of hospice interdisciplinary team meetings (n = 100), individual interviews of family caregivers (n = 73) and hospice staff members (n = 78), and research field notes. Setting/participants: Participants in the original studies from which data for this analysis were drawn were hospice family caregivers and staff members employed by one of five different community-based hospice agencies located in the Midwestern United States. Results: Shared decision-making occurred infrequently in hospice interdisciplinary team meetings that included family caregivers. Barriers to shared decision-making included time constraints, communication skill deficits, unaddressed emotional needs, staff absences, and unclear role expectations. The hospice philosophy of care, current trends in health care delivery, the interdisciplinary nature of hospice teams, and the designation of a team leader/facilitator supported shared decision-making. Conclusions: The involvement of family caregivers in hospice interdisciplinary team meetings using video-conferencing technology creates a useful platform for shared decision-making; however, steps must be taken to transform family caregivers from meeting attendees to shared decision-makers. PMID:26281854
Irrigation water policy analysis using a business simulation game
NASA Astrophysics Data System (ADS)
Buchholz, M.; Holst, G.; Musshoff, O.
2016-10-01
Despite numerous studies on farmers' responses to changing irrigation water policies, uncertainties remain about the potential of water pricing schemes and water quotas to reduce irrigation. Thus far, policy impact analysis has predominantly been based on rational choice models that rest on strong behavioral assumptions, such as a perfectly rational, profit-maximizing decision maker. Moreover, the econometric techniques applied can lack internal validity due to uncontrolled field data, and they are not capable of identifying ill-designed policies prior to implementation. With this in mind, we apply a business simulation game for ex ante policy impact analysis of irrigation water policies at the farm level. Our approach has the potential to reveal the policy-induced behavioral change of the participants in a controlled environment. To do so, we investigate how real farmers from Germany respond, in an economic experiment, to a water pricing scheme and a water quota intended to reduce irrigation. In the business simulation game, the participants manage a "virtual" cash-crop farm for which they make crop allocation and irrigation decisions over several production periods while facing uncertain product prices and weather conditions. The results reveal that a water quota is able to reduce mean irrigation applications, while a water pricing scheme does not have an impact, even though both policies exhibit equal income effects for the farmers. However, both policies appear to increase the variation of irrigation applications. Compared to a perfectly rational profit-maximizing decision maker, the participants apply less irrigation on average, both when irrigation is unrestricted and when a water pricing scheme applies. Moreover, the participants' risk attitudes affect the irrigation decisions.
Enrollment Projection within a Decision-Making Framework.
ERIC Educational Resources Information Center
Armstrong, David F.; Nunley, Charlene Wenckowski
1981-01-01
Two methods used to predict enrollment at Montgomery College in Maryland are compared and evaluated, and the administrative context in which they are used is considered. The two methods involve time series analysis (curve fitting) and indicator techniques (yield from components). (MSE)
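The curve-fitting method can be sketched as an ordinary least-squares trend line extrapolated one year ahead. The enrollment figures below are invented; the report's actual data and chosen curve may differ:

```python
# Toy sketch of enrollment projection by curve fitting: least-squares line
# through past fall headcounts, extrapolated one year. Figures are invented.

years = [1976, 1977, 1978, 1979, 1980]
enrollment = [9800, 10400, 10900, 11500, 12100]

n = len(years)
x_mean = sum(years) / n
y_mean = sum(enrollment) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(years, enrollment)) \
      / sum((x - x_mean) ** 2 for x in years)          # students per year
intercept = y_mean - slope * x_mean
projection_1981 = slope * 1981 + intercept             # one-year extrapolation
```

The indicator (yield-from-components) alternative would instead multiply projected feeder populations by historical yield rates, which is why the two methods can disagree.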
A Watershed-scale Design Optimization Model for Stormwater Best Management Practices
U.S. Environmental Protection Agency developed a decision-support system, System for Urban Stormwater Treatment and Analysis Integration (SUSTAIN), to evaluate alternative plans for stormwater quality management and flow abatement techniques in urban and developing areas. SUSTAI...
Recent advances in applying decision science to managing national forests
Marcot, Bruce G.; Thompson, Matthew P.; Runge, Michael C.; Thompson, Frank R.; McNulty, Steven; Cleaves, David; Tomosy, Monica; Fisher, Larry A.; Andrew, Bliss
2012-01-01
Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem analysis (defining alternatives, evaluating likely consequences, identifying key uncertainties, and analyzing tradeoffs), decision point (identifying the preferred alternative), and implementation and monitoring the preferred alternative with adaptive management feedbacks. We list a wide array of models, techniques, and tools available for each stage, and provide three case studies of their selected use in National Forest land management and project plans. Successful use of SDM involves participation by decision-makers, analysts, scientists, and stakeholders. We suggest specific areas for training and instituting SDM to foster transparency, rigor, clarity, and inclusiveness in formal decision processes regarding management of national forests.
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Kawamoto, Masaru
This paper proposes a new technique that constructs strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposes a method that uses prospect theory, from behavioral finance, to set psychological biases for profit and deficit, and attempts to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique produced good results and demonstrated the effectiveness of the trading model through the optimized trading strategy.
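The prospect-theory element can be sketched with the standard Kahneman-Tversky value function, which is concave for gains and convex but steeper for losses. The parameter values below are the commonly cited estimates, not necessarily those used in the paper:

```python
# Sketch of the prospect-theory value function used to weight profit and
# deficit asymmetrically (loss aversion). Parameters are the classic
# Tversky-Kahneman estimates, assumed here for illustration.

ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def prospect_value(outcome, reference=0.0):
    """Concave for gains, convex and steeper (loss-averse) for losses."""
    x = outcome - reference
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

gain = prospect_value(100.0)   # felt value of a 100-unit profit
loss = prospect_value(-100.0)  # felt value of a 100-unit deficit
```

Because a deficit is felt more strongly than an equal profit, a strategy tree evaluated under this function will prefer trades (and strike prices) that limit downside exposure.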
NASA Astrophysics Data System (ADS)
Kolkman, M. J.; Kok, M.; van der Veen, A.
Efforts to solve complex, unstructured problems are beset by policy controversy and dispute, unused and misused knowledge, project delay and failure, and decline of public trust in governmental decisions. Mental model mapping (also called concept mapping) is a technique to analyse these difficulties on a fundamental cognitive level, which can reveal experiences, perceptions, assumptions, knowledge and subjective beliefs of stakeholders, experts and other actors, and can stimulate communication and learning. This article presents the theoretical framework from which the use of mental model mapping techniques to analyse this type of problem emerges as promising. The framework consists of the problem solving or policy design cycle, the knowledge production or modelling cycle, and the (computer) model as interface between the cycles. Literature attributes difficulties in the decision-making process to communication gaps between decision makers, stakeholders and scientists, and to the construction of knowledge within different paradigm groups that leads to different interpretations of the problem situation. Analysis of the decision-making process literature indicates that choices, which are made in all steps of the problem solving cycle, are based on an individual decision maker’s frame of perception. This frame, in turn, depends on the mental model residing in the mind of the individual. Thus we identify three levels of awareness on which the decision process can be analysed. This research focuses on the third level. Mental models can be elicited using mapping techniques. In this way, analysing an individual’s mental model can shed light on decision-making problems. The steps of the knowledge production cycle are, in the same manner, ultimately driven by the mental models of the scientist in a specific discipline. Remnants of this mental model can be found in the resulting computer model.
The characteristics of unstructured problems (complexity, uncertainty and disagreement) can be positioned in the framework, as can the communities of knowledge construction and valuation involved in the solution of these problems (core science, applied science, professional consultancy, and “post-normal” science). Mental model maps, this research hypothesises, are suitable for analysing the above aspects of the problem. This hypothesis is tested for the case of the Zwolle storm surge barrier. Analysis can aid integration between disciplines and participation of public stakeholders, and can stimulate learning processes. Mental model mapping is recommended to visualise the use of knowledge, to analyse difficulties in the problem-solving process, and to aid information transfer and communication. Mental model mapping helps scientists shape their new, post-normal responsibilities with integrity when dealing with unstructured problems in complex, multifunctional systems.
Visualization and Analysis for Near-Real-Time Decision Making in Distributed Workflows
Pugmire, David; Kress, James; Choi, Jong; ...
2016-08-04
Data-driven science is becoming increasingly common and complex, and is placing tremendous stress on visualization and analysis frameworks. Data sources producing 10 GB per second (and more) are becoming increasingly commonplace in simulation, sensor, and experimental sciences. These data sources, which are often distributed around the world, must be analyzed by teams of scientists that are also distributed. Enabling scientists to view, query and interact with such large volumes of data in near-real-time requires a rich fusion of visualization and analysis techniques, middleware and workflow systems. This paper discusses initial research into visualization and analysis of distributed data workflows that enables scientists to make near-real-time decisions about large volumes of time-varying data.
An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty
Langlotz, Curtis P.; Shortliffe, Edward H.
1988-01-01
Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and nonmonotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
Cost-effectiveness Analysis with Influence Diagrams.
Arias, M; Díez, F J
2015-01-01
Cost-effectiveness analysis (CEA) is used increasingly in medicine to determine whether the health benefit of an intervention is worth its economic cost. Decision trees, the standard decision modeling technique for non-temporal domains, can only perform CEA for very small problems. Our objective was to develop a method for CEA in problems involving several dozen variables. We explain how to build influence diagrams (IDs) that explicitly represent cost and effectiveness, and we propose an algorithm for evaluating cost-effectiveness IDs directly, i.e., without expanding an equivalent decision tree. The evaluation of an ID returns a set of intervals for the willingness to pay, separated by cost-effectiveness thresholds, and, for each interval, the cost, the effectiveness, and the optimal intervention. The algorithm that evaluates the ID directly is in general much more efficient than the brute-force method, which is in turn more efficient than the expansion of an equivalent decision tree. Using OpenMarkov, an open-source software tool that implements this algorithm, we have been able to perform CEAs on several IDs whose equivalent decision trees contain millions of branches. IDs can thus perform CEA on large problems that cannot be analyzed with decision trees.
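The interval output described above can be sketched with net monetary benefit: at each willingness-to-pay value, the optimal intervention maximizes NMB = lambda * effectiveness - cost, and the thresholds where the maximizer changes separate the intervals. The interventions and their cost/effectiveness pairs below are invented:

```python
# Sketch of the interval-based CEA output: for each willingness-to-pay
# (lambda), the optimal intervention maximizes net monetary benefit.
# Interventions and their (cost, QALY) pairs are invented for illustration.

interventions = {
    "no treatment": (0.0, 1.0),
    "drug":         (15000.0, 1.5),
    "surgery":      (40000.0, 2.0),
}

def optimal(lam):
    """Intervention with the highest net monetary benefit at lambda."""
    return max(interventions,
               key=lambda name: lam * interventions[name][1] - interventions[name][0])

# thresholds: drug overtakes no-treatment at ICER 15000/0.5 = 30000 per QALY;
# surgery overtakes drug at ICER 25000/0.5 = 50000 per QALY
choices = [optimal(lam) for lam in (10000, 40000, 60000)]
```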
Managing industrial risk--having a tested and proven system to prevent and assess risk.
Heller, Stephen
2006-03-17
Some relatively easy techniques exist to improve the risk picture and help prevent losses. With today's computing resources, risk analysis that focuses on specific aspects of risk through systematic scoring and comparison is relatively easy to achieve. Techniques like these demonstrate how working experience and common sense can be combined mathematically into a flexible risk management tool, or risk model, for analyzing risk. The risk assessment methodology offered by companies today is no longer the ideas and practices of one group or even one company; it reflects the practice of many companies, as well as the ideas and expertise of academia and government regulators. The use of multi-criteria decision making (MCDM) techniques for making critical decisions has been recognized for many years and for a variety of purposes. In today's computer age, the accessibility and user-friendly nature of these techniques make them a favorable choice for the risk assessment environment. New users of these methodologies should find many ideas directly applicable to their needs when approaching risk decision making, and should find those ideas readily adapted, with slight modification, to accurately reflect a specific situation. This makes MCDM techniques an attractive option for assessment and risk modeling. Their main advantage is that they can be applied in the early stages of a risk assessment, when accurate data on industrial risks and failures are lacking and are usually still insufficient for a thorough risk assessment based purely on statistical concepts. The practical advantages of deviating from a strictly data-driven protocol seem to outweigh the drawbacks. Industry failure data often come at a high cost, acquired only when a loss occurs.
We can benefit from this unfortunate acquisition of data by continuously refining our decisions as the new information is incorporated into our assessments. MCDM techniques offer the flexibility to compare items across broad data sets and to weight them by our best estimate of their contribution to the risk picture. This allows the more probable and more consequential issues to be identified accurately; these can later be refined using more intensive risk techniques, while less critical issues are set aside.
Wang, Mingming; Sweetapple, Chris; Fu, Guangtao; Farmani, Raziyeh; Butler, David
2017-10-01
This paper presents a new framework for decision making in sustainable drainage system (SuDS) scheme design. It integrates resilience, hydraulic performance, pollution control, rainwater usage, energy analysis, greenhouse gas (GHG) emissions and costs, and has 12 indicators. The multi-criteria analysis methods of entropy weight and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) were selected to support SuDS scheme selection. The effectiveness of the framework is demonstrated with a SuDS case in China. Indicators used include flood volume, flood duration, a hydraulic performance indicator, cost and resilience. Resilience is an important design consideration, and it supports scheme selection in the case study. The proposed framework will help a decision maker to choose an appropriate design scheme for implementation without subjectivity. Copyright © 2017 Elsevier Ltd. All rights reserved.
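The entropy-weight + TOPSIS selection step used in the framework above can be sketched as follows, assuming three hypothetical SuDS schemes scored on three illustrative benefit-type indicators (the paper uses 12, and a cost-type indicator would flip the ideal/anti-ideal choice).

```python
import math

def entropy_weights(X):
    """Objective criterion weights from the entropy of each column."""
    m, n = len(X), len(X[0])
    w = []
    for j in range(n):
        col = [row[j] for row in X]
        s = sum(col)
        p = [x / s for x in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        w.append(1 - e)                     # degree of divergence
    total = sum(w)
    return [wj / total for wj in w]

def topsis(X, w):
    """Closeness of each alternative to the ideal solution (all benefit criteria)."""
    n = len(X[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in X)) for j in range(n)]
    V = [[w[j] * row[j] / norms[j] for j in range(n)] for row in X]
    ideal = [max(col) for col in zip(*V)]   # best value on each criterion
    anti  = [min(col) for col in zip(*V)]   # worst value on each criterion
    scores = []
    for v in V:
        dp = math.dist(v, ideal)
        dm = math.dist(v, anti)
        scores.append(dm / (dp + dm))       # 1 = ideal, 0 = anti-ideal
    return scores

# rows: candidate SuDS schemes; columns: resilience, pollutant removal,
# rainwater-use indicators (illustrative 0-10 scores, not the paper's data)
X = [[7, 5, 9],    # scheme A
     [6, 8, 4],    # scheme B
     [9, 6, 5]]    # scheme C
w = entropy_weights(X)
scores = topsis(X, w)
best = max(range(len(scores)), key=scores.__getitem__)
print("weights:", [round(v, 3) for v in w], "-> best scheme:", "ABC"[best])
```

The entropy step removes the subjectivity the abstract mentions: criteria whose scores barely differ across schemes receive little weight, with no analyst judgment involved.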
Multicriteria Analysis model for the comparison of different rockfalls protection devices
NASA Astrophysics Data System (ADS)
Mignelli, C.; Pomarico, S.; Peila, D.
2012-04-01
In mountain regions, roads and railways, as well as urbanized areas, can often be endangered by rockfalls and need to be protected against the impact of falling blocks. The effects of rockfall events include damage to roads and vehicles, injury or death of drivers and passengers, and economic loss due to road closure. The cost of a single car accident can be significant, since it can involve the hospitalization of the driver and passengers, the repair of the vehicle, legal costs and compensation. Public administrations must manage roads so as to protect the areas at risk, and must therefore make choices that take into account both technical and social aspects. The fulfillment of safety requirements for routes in mountainside areas is therefore a multidimensional concept that includes socio-economic, environmental, technical and ethical perspectives, and thus leads to issues that are characterized simultaneously by a high degree of conflict, complexity and uncertainty. Multicriteria Analysis (MCA) is an adequate approach for dealing with issues of this kind. It behaves as an umbrella term, since it includes a large series of evaluation techniques able to take several criteria into explicit consideration simultaneously, in order to support the Decision Maker, through a rational approach, in making a comparative assessment of alternative projects. A very large and consolidated body of MCA literature exists, covering a wide range of techniques and application fields such as waste management, transport infrastructure, strategic policy planning, environmental impact assessment of territorial transformations, market and logistics, economics and finance, industrial management and civil engineering.
This paper addresses the problem of rockfall risk on a road using the Analytic Hierarchy Process (AHP), a Multicriteria Analysis technique suitable for complex problems that involve choosing among several alternatives, and which provides a comparison of the considered options. The developed model takes into account five aspects of the decision-making process (economic, environmental, design, transport and social) organized according to the hierarchical framework of the AHP technique. The criteria identified in the analysis, and their weights in the decision-making process, were discussed and determined through dedicated focus groups with technical experts in the geo-engineering field. Three protection devices commonly used against rockfall (embankment, shelter topped by a rockfall barrier, and tunnel) are compared through the AHP method in a specific "geo" environment to show the feasibility of the method. The application of the AHP technique, performed using the Expert Choice software, allowed the most relevant aspects of the decision-making process to be highlighted and shows how the proposed method can be a valuable tool for public administrations. Furthermore, in order to test the robustness of the proposed model, a sensitivity analysis was carried out. The research is original in that it focuses on a participative methodological approach, making the decision process more traceable and reliable.
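The AHP weighting step referred to above can be sketched as follows: derive criterion weights as the principal eigenvector of a pairwise-comparison matrix and check its consistency ratio. The 3×3 matrix below (economic vs. environmental vs. social aspects) is a hypothetical illustration, not the output of the paper's focus groups.

```python
def ahp_weights(A, iters=100):
    """Principal eigenvector of A by power iteration, plus lambda_max."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]              # normalize to sum to 1
    # lambda_max estimated as the mean of (A w)_i / w_i
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return w, lam

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random indices

# Pairwise judgments on Saaty's 1-9 scale; A[i][j] = importance of i over j.
A = [[1,   3,   5],      # economic
     [1/3, 1,   2],      # environmental
     [1/5, 1/2, 1]]      # social

w, lam = ahp_weights(A)
n = len(A)
CR = (lam - n) / (n - 1) / RI[n]            # consistency ratio; < 0.1 is acceptable
print("weights:", [round(x, 3) for x in w], "CR =", round(CR, 3))
```

A consistency ratio above 0.1 would signal contradictory judgments (e.g., A preferred to B, B to C, but C to A), prompting the focus group to revisit its comparisons.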
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlesinger, Adam M.
2012-01-01
Modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more overhead through noisier channels, and software-defined radios use error-correction techniques that approach Shannon's theoretical limit of performance. The authors describe the benefit of closed-loop measurements for a receiver when paired with a counterpart transmitter and representative channel conditions. We also describe a real-time Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real time during the development of software-defined radios.
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam
2013-01-01
We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real time during the development of software-defined radios. This test technique gains importance as modern receivers provide soft decision symbol synchronization, as radio links are challenged to push more data and more protocol overhead through noisier channels, and as software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
Archetypes for Organisational Safety
NASA Technical Reports Server (NTRS)
Marais, Karen; Leveson, Nancy G.
2003-01-01
We propose a framework using system dynamics to model the dynamic behavior of organizations in accident analysis. Most current accident analysis techniques are event-based and do not adequately capture the dynamic complexity and non-linear interactions that characterize accidents in complex systems. In this paper we propose a set of system safety archetypes that model common safety culture flaws in organizations, i.e., the dynamic behavior of organizations that often leads to accidents. As accident analysis and investigation tools, the archetypes can be used to develop dynamic models that describe the systemic and organizational factors contributing to an accident. The archetypes help clarify why safety-related decisions do not always result in the desired behavior, and how independent decisions in different parts of the organization can combine to impact safety.
Directional Slack-Based Measure for the Inverse Data Envelopment Analysis
Abu Bakar, Mohd Rizam; Lee, Lai Soon; Jaafar, Azmi B.; Heydar, Maryam
2014-01-01
This research introduces a novel technique based on the directional slack-based measure for inverse Data Envelopment Analysis (DEA). The inverse directional slack-based measure model is formulated within a new production possibility set, for the case in which the output (input) quantities of an efficient decision making unit (DMU) are modified. Specifically, the efficient DMU is omitted from the present production possibility set and substituted by the same unit with its modified input and output quantities. The efficiency scores of all DMUs are retained under this approach, and the efficiency score can also improve. The proposed approach is investigated with reference to a resource allocation problem, in which increases (decreases) of certain outputs of the efficient DMU can be considered simultaneously. The significance of the model is demonstrated through numerical examples. PMID:24883350
Improving sensor data analysis through diverse data source integration
NASA Astrophysics Data System (ADS)
Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry
2009-05-01
Daily sensor data volumes are increasing from gigabytes to multiple terabytes, but the manpower and resources needed to analyze the data are not growing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed: analysts are left mostly to analyze the individual data sources manually, which is both time consuming and mentally exhausting, and expanding data collections only exacerbate the problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously, to reduce an analyst's decision response time, and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system, built to give analysts the ability to pose integrated queries over diverse live and historical data sources and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.
Seventh symposium on systems analysis in forest resources; 1997 May 28-31; Traverse City, MI.
J. Michael Vasievich; Jeremy S. Fried; Larry A. Leefers
2000-01-01
This international symposium included presentations by representatives from government, academic, and private institutions. Topics covered management objectives; information systems: modeling, optimization, simulation and decision support techniques; spatial methods; timber supply; and economic and operational analyses.
INNOVATIONS IN SOIL SAMPLING AND DATA ANALYSIS
Successful research outcomes from the VOC-in-soils work will provide the Agency with methods and techniques that yield accurate VOC concentrations, so that decisions related to a contaminated site can be made to optimize protection of the environment and human health...
Tradespace Exploration for the Engineering of Resilient Systems
2015-05-01
world scenarios. The types of tools within the SAE set include visualization, decision analysis, and M&S, so it is difficult to categorize this toolset... overpopulated, or questionable. ERS Tradespace Workshop Create predictive models using multiple techniques (e.g., regression, Kriging, neural nets
ERIC Educational Resources Information Center
Decker, Erwin A.; And Others
The pros and cons of decentralization of decision-making authority to the school-site level as a public school management technique are intended to serve as an informational summary for the members of the California State Board of Education, and as a resource for school district governing boards and district administrators to use to determine the…
Analysis of Decision Making Skills for Large Scale Disaster Response
2015-08-21
Capability to influence and collaborate Compassion Teamwork Communication Leadership Provide vision of outcome / set priorities Confidence, courage to make...project evaluates the viability of expanding the use of serious games to augment classroom training, tabletop and full scale exercise, and actual...training, evaluation, analysis, and technology exploration. Those techniques have found successful niches, but their wider applicability faces
ERIC Educational Resources Information Center
Ross, Sarah Gwen
2012-01-01
Response to intervention (RTI) is increasingly being used in educational settings to make high-stakes, special education decisions. Because of this, the accurate use and analysis of single-case designs to monitor intervention effectiveness has become important to the RTI process. Effect size methods for single-case designs provide a useful way to…
A new approach to enhance the performance of decision tree for classifying gene expression data.
Hassan, Md; Kotagiri, Ramamohanarao
2013-12-20
Gene expression data classification is a challenging task due to the large dimensionality and very small number of samples. The decision tree is one of the popular machine learning approaches to such classification problems. However, existing decision tree algorithms use a single gene feature at each node to split the data into its child nodes, and hence may suffer from poor performance, especially when classifying gene expression datasets. We enhance the classification performance of traditional decision tree classifiers with a new decision tree algorithm in which each node of the tree consists of more than one gene. Our method selects suitable genes that are combined using a linear function to form a derived composite feature, and uses the area under the Receiver Operating Characteristic curve (AUC) to determine the structure of the tree. Experimental analysis demonstrates higher classification accuracy for the new decision tree than for the other decision trees in the literature, and comparisons against other well-known decision tree techniques show that our algorithm can substantially boost classification performance.
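The core idea, splitting on a derived composite feature judged by AUC rather than on a single gene, can be sketched as follows. The mean-difference weights and the pairwise gene search are simplifying assumptions for illustration, not the paper's exact gene-selection scheme.

```python
from itertools import combinations

def auc(scores, labels):
    """Rank-based AUC: probability a positive sample outranks a negative one."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def best_composite(X, y):
    """Pick the gene pair whose mean-difference projection maximizes AUC."""
    n_genes = len(X[0])
    best = (0.0, None, None)
    for i, j in combinations(range(n_genes), 2):
        # linear-combination weights = difference of class means (illustrative)
        mu1 = [sum(r[k] for r, t in zip(X, y) if t == 1) / sum(y) for k in (i, j)]
        mu0 = [sum(r[k] for r, t in zip(X, y) if t == 0) / (len(y) - sum(y)) for k in (i, j)]
        w = [a - b for a, b in zip(mu1, mu0)]
        scores = [w[0] * r[i] + w[1] * r[j] for r in X]
        a = auc(scores, y)
        if a > best[0]:
            best = (a, (i, j), w)
    return best

# Tiny synthetic "expression" matrix: 6 samples x 4 genes; genes 0 and 2
# carry class signal, genes 1 and 3 are noise (values are made up).
X = [[1.0, 5.0, 1.2, 3.3], [1.1, 4.1, 0.9, 2.9], [0.8, 6.0, 1.1, 3.1],
     [2.0, 5.2, 2.3, 3.0], [2.2, 4.4, 2.1, 3.2], [1.9, 5.8, 2.4, 2.8]]
y = [0, 0, 0, 1, 1, 1]
a, pair, w = best_composite(X, y)
print(f"best gene pair {pair}, AUC = {a:.2f}")
```

In the paper's method such a composite node would become one split of the tree and the search would recurse on the child partitions; this sketch only shows the node-level selection.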
NASA Astrophysics Data System (ADS)
Lee, K. David; Colony, Mike
2011-06-01
Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for evaluating equipment effectiveness in small combat unit operations is currently limited to representing situation awareness without the many uncertainties of real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties, in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides the required uncertainty modeling through two critical techniques that allow Bayesian network technology to be applied in real time: an object-oriented Bayesian network methodology and an object-oriented inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate, through Monte Carlo simulation, that Army-PRIDE improves the realism of the current constructive simulation's decision processes.
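The kind of source-confidence uncertainty mentioned above can be sketched as a small discrete Bayesian network solved by brute-force enumeration. The network structure and every probability below are hypothetical illustrations, not the Army-PRIDE models.

```python
from itertools import product

# Priors: is a threat present, and is the reporting source reliable?
P_threat = {True: 0.2, False: 0.8}
P_reliable = {True: 0.7, False: 0.3}          # confidence in the source

def p_report(report, threat, reliable):
    """P(report = positive | threat, source reliable)."""
    if reliable:
        p = 0.9 if threat else 0.1            # reliable source: accurate
    else:
        p = 0.6 if threat else 0.4            # unreliable: near coin-flip
    return p if report else 1 - p

def posterior_threat(report):
    """P(threat | report), marginalizing over source reliability."""
    num = den = 0.0
    for threat, reliable in product([True, False], repeat=2):
        joint = (P_threat[threat] * P_reliable[reliable]
                 * p_report(report, threat, reliable))
        den += joint
        if threat:
            num += joint
    return num / den

print(f"P(threat | positive report) = {posterior_threat(True):.3f}")
```

A positive report raises the threat belief well above the 0.2 prior, but far below certainty, because part of the probability mass sits on the unreliable-source branch; this is the sort of graded situation awareness a hard-coded (certain) simulation cannot represent.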
A hybrid method for classifying cognitive states from fMRI data.
Parida, S; Dehuri, S; Cho, S-B; Cacha, L A; Poznanski, R R
2015-09-01
Functional magnetic resonance imaging (fMRI) makes it possible to detect brain activities in order to elucidate cognitive states. The complex nature of fMRI data requires understanding of the analyses applied to produce possible avenues for developing models of cognitive-state classification and improving brain activity prediction. While many models for the classification task in fMRI data analysis have been developed, in this paper we present a novel hybrid technique that combines the best attributes of genetic algorithms (GAs) and an ensemble decision tree technique, and that consistently outperforms the other methods in use for cognitive-state classification. Specifically, this paper illustrates the combined use of a decision-tree ensemble and GAs for feature selection through an extensive simulation study, and discusses the classification performance with respect to fMRI data. We show that our proposed method achieves a significant reduction in the number of features with a clear edge in classification accuracy over an ensemble of decision trees alone.
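The GA feature-selection idea can be sketched with a toy genetic algorithm over feature bitmasks. The fitness function (class separation minus a subset-size penalty) and all parameters are hypothetical simplifications of the paper's GA + decision-tree-ensemble wrapper, which would instead score each subset by ensemble classification accuracy.

```python
import random
random.seed(0)

# Synthetic data: features 0 and 1 separate the two classes, features 2-5 are noise.
DATA = [([2.0, 3.0] + [random.random() for _ in range(4)], 1) for _ in range(10)] + \
       [([0.0, 1.0] + [random.random() for _ in range(4)], 0) for _ in range(10)]
N = 6

def fitness(mask):
    """Sum of |class-mean differences| over selected features, minus a size penalty."""
    sel = [k for k in range(N) if mask[k]]
    if not sel:
        return 0.0
    sep = 0.0
    for k in sel:
        m1 = sum(x[k] for x, y in DATA if y == 1) / 10
        m0 = sum(x[k] for x, y in DATA if y == 0) / 10
        sep += abs(m1 - m0)
    return sep - 0.1 * len(sel)              # penalize large subsets

def evolve(pop_size=20, gens=30):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]         # elitism: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]                 # one-point crossover
            k = random.randrange(N)
            child[k] ^= random.random() < 0.2         # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("selected features:", [k for k in range(N) if best[k]])
```

The GA converges on a small mask dominated by the informative features, which is the feature-reduction behavior the abstract reports.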
Cheramie, G M; Griffin, K M; Morgan, T
2000-02-01
A national survey of specialist school psychologists examined the perceived usefulness of assessment techniques in making decisions regarding eligibility for the educational classification of emotional disturbance and in generating classroom recommendations. Analysis showed measures rated as most useful were interviews with the parent, teacher, and student, observations of the student, and norm-referenced rating scales. Projective techniques were least useful. These findings are important in the context of "best practices" for the multidimensional assessment of emotional disturbance which promotes a more direct link between assessment and intervention.
2003-10-01
Among the procedures developed to identify cognitive processes are Cognitive Task Analysis (CTA) and Cognitive Work Analysis (CWA...of Cognitive Task Design. [11] Potter, S.S., Roth, E.M., Woods, D.D., and Elm, W.C. (2000). Cognitive Task Analysis as Bootstrapping Multiple...Converging Techniques, In Schraagen, Chipman, and Shalin (Eds.). Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum Associates. [12] Roth, E.M
Advanced decision aiding techniques applicable to space
NASA Technical Reports Server (NTRS)
Kruchten, Robert J.
1987-01-01
RADC has had an intensive program to show the feasibility of applying advanced technology to Air Force decision aiding situations. Some aspects of the program, such as Satellite Autonomy, are directly applicable to space systems. For example, RADC has shown the feasibility of decision aids that combine the advantages of laser disks and computer generated graphics; decision aids that interface object-oriented programs with expert systems; decision aids that solve path optimization problems; etc. Some of the key techniques that could be used in space applications are reviewed. Current applications are reviewed along with their advantages and disadvantages, and examples are given of possible space applications. The emphasis is to share RADC experience in decision aiding techniques.
NASA Astrophysics Data System (ADS)
Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.
2004-08-01
The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. Individual and combinational effects arising from the application of technologies within a framework are presently far too complex to evaluate quantitatively at more than a cursory depth. To facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project, and then a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets.
Automatic Conflict Detection on Contracts
NASA Astrophysics Data System (ADS)
Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo
Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts, can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.
20170913 - Systematic Approaches to Biological/Chemical Read-Across for Hazard Identification (EMGS)
Read-across is a well-established data gap filling technique used within chemical category and analogue approaches for regulatory purposes. The category/analogue workflow comprises a number of steps starting from decision context, data gap analysis through to analogue identificat...
Mathematics and mallard management
Cowardin, L.M.; Johnson, D.H.
1979-01-01
Waterfowl managers can effectively use simple population models to aid in making management decisions. We present a basic model of the change in population size as related to survival and recruitment. A management technique designed to increase survival of mallards (Anas platyrhynchos) by limiting harvest on the Chippewa National Forest, Minnesota, is used to illustrate the application of models in decision making. The analysis suggests that the management technique would be of limited effectiveness. In a second example, the change in mallard population in central North Dakota is related to implementing programs to create dense nesting cover with or without supplementary predator control. The analysis suggests that large tracts of land would be required to achieve a hypothetical management objective of increasing harvest by 50% while maintaining a stable population. Less land would be required if predator reduction were used in combination with cover management, but questions about the effectiveness and ecological implications of large-scale predator reduction remain unresolved. The use of models as a guide to planning research responsive to the needs of management is illustrated.
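The basic population-change model described above can be sketched in a few lines: next year's breeding population equals survivors plus new recruits, N[t+1] = N[t] × (S + R), where S is annual survival and R is recruitment per breeding adult. The rates below are hypothetical illustrations, not the paper's Chippewa or North Dakota estimates.

```python
def project(n0, survival, recruitment, years):
    """Project population size year by year under constant rates."""
    n = n0
    traj = [n]
    for _ in range(years):
        n = n * (survival + recruitment)   # growth rate lambda = S + R
        traj.append(n)
    return traj

# lambda = 1.00 keeps the population stable; a management action that
# raises recruitment to 0.50 gives lambda = 1.05, compounding over 10 years
baseline = project(1000, survival=0.55, recruitment=0.45, years=10)
managed  = project(1000, survival=0.55, recruitment=0.50, years=10)

print(f"baseline final: {baseline[-1]:.0f}, managed final: {managed[-1]:.0f}")
```

Even this simple model makes the paper's kind of trade-off visible: a management objective stated in terms of harvest or growth translates directly into a required change in S or R, which can then be compared with what a technique can realistically deliver.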
In search of tools to aid logical thinking and communicating about medical decision making.
Hunink, M G
2001-01-01
To have real-time impact on medical decision making, decision analysts need a wide variety of tools to aid logical thinking and communication. Decision models provide a formal framework to integrate evidence and values, but they are commonly perceived as complex and difficult to understand by those unfamiliar with the methods, especially in the context of clinical decision making. The theory of constraints, introduced by Eliyahu Goldratt in the business world, provides a set of tools for logical thinking and communication that could potentially be useful in medical decision making. The author used the concept of a conflict resolution diagram to analyze the decision to perform carotid endarterectomy prior to coronary artery bypass grafting in a patient with both symptomatic coronary and asymptomatic carotid artery disease. The method enabled clinicians to visualize and analyze the issues, identify and discuss the underlying assumptions, search for the best available evidence, and use the evidence to make a well-founded decision. The method also facilitated communication among those involved in the care of the patient. Techniques from fields other than decision analysis can potentially expand the repertoire of tools available to support medical decision making and to facilitate communication in decision consults.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support
NASA Astrophysics Data System (ADS)
Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.
2016-12-01
Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.
Kolasa, Katarzyna; Zwolinski, Krzysztof Miroslaw; Zah, Vladimir; Kaló, Zoltán; Lewandowski, Tadeusz
2018-04-27
A Multi-Criteria Decision Analysis (MCDA) technique was adopted to reveal the preferences of the Appraisal Body of the Polish HTA agency towards orphan drugs (OMPs). There were 34 positive and 23 negative HTA recommendations out of 54 distinct drug-indication pairs. The MCDA matrix consisted of 13 criteria, seven of which made the most impact on the HTA process. Appraisal of clinical evidence, cost of therapy, and safety considerations were the main contributors to the HTA guidance, whilst advancement of technology and manufacturing costs made the least impact. MCDA can be regarded as a valuable tool for revealing decision makers' preferences in the healthcare sector. Given that only roughly half of all criteria included in the MCDA matrix were deemed to make an impact on the HTA process, there is certainly some room for improvement with respect to the adaptation of a new approach towards the value assessment of OMPs in Poland.
Operationalising uncertainty in data and models for integrated water resources management.
Blind, M W; Refsgaard, J C
2007-01-01
Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use, to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.
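One of the simplest ways to operationalise item (2), parameter uncertainty, is Monte Carlo propagation: sample the uncertain parameter, run the model, and report an interval instead of a point value. The toy rainfall-runoff model and parameter range below are illustrative assumptions, not HarmoniRiB tools:

```python
import random
import statistics

random.seed(42)

def runoff(rainfall_mm, coeff):
    """Toy rainfall-runoff model: runoff = runoff coefficient * rainfall."""
    return coeff * rainfall_mm

# Parameter uncertainty: the runoff coefficient is known only as a range.
samples = [runoff(50.0, random.uniform(0.3, 0.5)) for _ in range(10_000)]

ordered = sorted(samples)
mean = statistics.mean(samples)
lo, hi = ordered[250], ordered[9750]  # approximate 95% interval
print(f"mean {mean:.1f} mm, 95% interval [{lo:.1f}, {hi:.1f}] mm")
```

Reporting the interval rather than the mean alone is exactly the shift from point forecasts to uncertainty information that the paper argues decision support needs.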
NASA Astrophysics Data System (ADS)
de Brito, M. M.; Evers, M.
2015-11-01
This paper provides a review of Multi-Criteria Decision Making (MCDM) applications to flood risk management, seeking to highlight trends and identify research gaps. In total, 128 peer-reviewed papers published from 1995 to June 2015 were systematically analysed and classified into the following application areas: (1) ranking of alternatives for flood mitigation, (2) reservoir flood control, (3) susceptibility, (4) hazard, (5) vulnerability, (6) risk, (7) coping capacity, and (8) emergency management. Additionally, the articles were categorized by publication year, MCDM method, whether or not they were carried out in a participatory process, and whether uncertainty and sensitivity analyses were performed. Results showed that the number of flood MCDM publications has grown exponentially during this period, with over 82% of all papers published since 2009. The Analytical Hierarchy Process (AHP) was the most popular technique, followed by the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW). Although there is growing interest in MCDM, uncertainty analysis remains an issue and is seldom applied in flood-related studies. In addition, participation of multiple stakeholders has generally been fragmented, focusing on particular stages of the decision-making process, especially the definition of criteria weights. Based on the survey, some suggestions for further investigation are provided.
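For readers unfamiliar with AHP, the most popular technique in the survey, the core computation is small: criterion weights are recovered from a pairwise comparison matrix, here approximated by the common row-geometric-mean method. The flood criteria and judgments below are hypothetical:

```python
import math

# Pairwise comparison matrix (Saaty 1-9 scale) for three hypothetical flood
# criteria: hazard, exposure, vulnerability. A[i][j] = importance of i over j.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]

def ahp_weights(A):
    """Approximate the principal eigenvector by row geometric means, normalized."""
    gm = [math.prod(row) ** (1 / len(row)) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(A)
print([round(x, 3) for x in w])  # roughly [0.637, 0.258, 0.105]
```

The geometric-mean approximation is a standard stand-in for the full eigenvector computation and is exact for perfectly consistent matrices.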
NASA Technical Reports Server (NTRS)
Hoffman, Edward J. (Editor); Lawbaugh, William M. (Editor)
1996-01-01
Papers address the following topics: NASA's project management development process; Better decisions through structural analysis; NASA's commercial technology management system; Today's management techniques and tools; Program control in NASA - needs and opportunities; and Resources for NASA managers.
ERIC Educational Resources Information Center
Bronn, Peggy Simcic; Olson, Erik L.
1999-01-01
Illustrates the operationalization of the conjoint analysis multivariate technique for the study of the public relations function within strategic decision making in a crisis situation. Finds that what the theory describes as the strategic way of handling a crisis is also the way each of the managers who were evaluated would prefer to conduct…
Cognitive mapping tools: review and risk management needs.
Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor
2012-08-01
Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.
A rational framework for production decision making in blood establishments.
Ramoa, Augusto; Maia, Salomé; Lourenço, Anália
2012-07-24
SAD_BaSe is blood bank data analysis software created to assist in the management of blood donations and the blood production chain in blood establishments. In particular, the system keeps track of several collection and production indicators, enables the definition of collection and production strategies, and supports the measurement of quality indicators required by the Quality Management System regulating the general operation of blood establishments. This paper describes the general scenario of blood establishments and their main requirements in terms of data management and analysis. It presents the architecture of SAD_BaSe and identifies its main contributions. Specifically, it highlights the generation of customized reports driven by decision-making needs and the use of data mining techniques in the analysis of donor suspensions and donation discards.
Analysis of responses to the implementation of nuclear installation safety culture using AHP-TOPSIS
NASA Astrophysics Data System (ADS)
Situmorang, J.; Kuntoro, I.; Santoso, S.; Subekti, M.; Sunaryo, G. R.
2018-02-01
An analysis of responses to the implementation of nuclear installation safety culture has been carried out using AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution). Safety culture is considered as the collective commitments of the decision-making level, the management level, and the individual level; each level therefore provides a subjective perspective as an alternative approach to implementation. Furthermore, safety culture is described by five characteristics which, in more detail, consist of 37 attributes, and can therefore be expressed as a multi-attribute state. These characteristics and attributes become criteria whose values are difficult to determine, yet they strongly influence the implementation of the corresponding safety culture. The pattern and magnitude of this influence are determined using TOPSIS, which is based on a decision matrix composed of alternatives and criteria. The weight of each criterion is determined by the AHP technique. The data were collected through questionnaires at a 2015 workshop on safety and health. A reliability test of the data gives a Cronbach's alpha of 95.5%, which by the stated criteria is reliable. A validity test using bivariate correlation analysis between attributes shows that the Pearson correlation for all attributes is significant at the 0.01 level. Confirmatory factor analysis gives a Kaiser-Meyer-Olkin measure of sampling adequacy (KMO) of 0.719, greater than the acceptance criterion of 0.5, with a significance level of 0.000, much smaller than 0.05, indicating that further analysis can be performed.
The analysis finds that responses from the decision-making level (second echelon) dominate the preference ranking as the best solution for strengthening nuclear installation safety culture, except for the first characteristic, safety as a clearly recognized value. The preference order obtained is, in sequence, the policy-making level, the management level, and the individual (staff) level.
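The TOPSIS step used above can be sketched compactly. This minimal implementation assumes all criteria are benefit criteria, with a hypothetical decision matrix standing in for the questionnaire aggregates:

```python
import math

# Rows = alternatives (e.g. decision-making, management, individual levels),
# columns = criteria scores (hypothetical aggregates, higher = better).
X = [[8.0, 7.0, 9.0],
     [6.0, 8.0, 7.0],
     [5.0, 6.0, 6.0]]
w = [0.5, 0.3, 0.2]  # criterion weights, e.g. from AHP

def topsis(X, w):
    n, m = len(X), len(X[0])
    # 1. Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(X[i][j] ** 2 for i in range(n))) for j in range(m)]
    V = [[w[j] * X[i][j] / norms[j] for j in range(m)] for i in range(n)]
    # 2. Ideal and anti-ideal solutions (all criteria treated as benefits).
    best = [max(V[i][j] for i in range(n)) for j in range(m)]
    worst = [min(V[i][j] for i in range(n)) for j in range(m)]
    # 3. Closeness coefficient: d(worst) / (d(best) + d(worst)).
    scores = []
    for i in range(n):
        d_best = math.sqrt(sum((V[i][j] - best[j]) ** 2 for j in range(m)))
        d_worst = math.sqrt(sum((V[i][j] - worst[j]) ** 2 for j in range(m)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

print(topsis(X, w))  # the first alternative ranks highest
```

Cost criteria would be handled by swapping `max` and `min` for those columns; the study's 37 attributes simply widen the matrix.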
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
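Temporal abstraction of the kind the authors mention can be illustrated by converting a numeric time series into qualitative episodes. The glucose thresholds and readings below are illustrative assumptions, not the system's actual rules:

```python
# Toy temporal abstraction: turn a blood-glucose series (mg/dL) into
# qualitative episodes, the kind of summary a Medical Unit might receive.
readings = [95, 102, 145, 190, 210, 205, 160, 120, 98, 90]

def state(g):
    if g < 70:
        return "low"
    if g <= 180:
        return "normal"
    return "high"

def abstract(series):
    """Merge consecutive readings with the same qualitative state into episodes."""
    episodes = []
    for i, g in enumerate(series):
        s = state(g)
        if episodes and episodes[-1][0] == s:
            episodes[-1][2] = i          # extend the current episode
        else:
            episodes.append([s, i, i])   # new episode: state, start, end
    return [tuple(e) for e in episodes]

print(abstract(readings))
# [('normal', 0, 2), ('high', 3, 5), ('normal', 6, 9)]
```

An episode summary like "high from sample 3 to 5" is far more actionable for patient advice than the raw samples, which is the point of combining abstraction with time-series analysis.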
Watch for pitfalls of discounted cash flow techniques.
Chow, C W; McNamee, A H
1991-04-01
Discounted cash flow (DCF) techniques can enhance the effectiveness of a healthcare organization's capital budgeting decisions. But a financial manager unaware of common misapplications of DCF techniques may make capital decisions with a hidden bias against long-term projects, an inaccurate evaluation of options, or inappropriate estimations of expected inflation and risk. Social and psychological factors also can impede effective decisions on projects already introduced.
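The long-term bias the authors warn about is easy to reproduce: the same pair of projects can swap rank depending on the discount rate chosen. A minimal NPV sketch with hypothetical cash flows:

```python
# DCF comparison showing the bias hazard: the discount rate can flip the
# ranking between a short-term and a long-term project (figures hypothetical).
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the upfront (usually negative) outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

short_term = [-100, 60, 60]             # pays back quickly
long_term = [-100, 5, 5, 5, 130]        # pays off late

for rate in (0.05, 0.15):
    a, b = npv(rate, short_term), npv(rate, long_term)
    print(f"rate {rate:.0%}: short {a:.1f}, long {b:.1f}")
# At 5% the long-term project has the higher NPV; at 15% the short-term one does.
```

An inflated discount rate, one of the misapplications the article describes, thus systematically penalizes projects whose benefits arrive late.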
Using data mining techniques to predict the severity of bicycle crashes.
Prati, Gabriele; Pietrantoni, Luca; Fraboni, Federico
2017-04-01
To investigate the factors predicting the severity of bicycle crashes in Italy, we used an observational study of official statistics, applying two of the most widely used data mining techniques: the CHAID decision tree technique and Bayesian network analysis. The dataset, provided by the Italian National Institute of Statistics, contains information about road crashes that occurred on the Italian road network from 2011 to 2013. We extracted 49,621 road accidents in which at least one cyclist was injured or killed from the original database of 575,093 road accidents. The CHAID decision tree technique was employed to establish the relationship between severity of bicycle crashes and factors related to crash characteristics (type of collision and opponent vehicle), infrastructure characteristics (type of carriageway, road type, road signage, pavement type, and type of road segment), cyclists (gender and age), and environmental factors (time of day, day of the week, month, pavement condition, and weather). CHAID analysis revealed that the most important predictors were, in decreasing order of importance, road type (0.30), crash type (0.24), age of cyclist (0.19), road signage (0.08), gender of cyclist (0.07), type of opponent vehicle (0.05), month (0.04), and type of road segment (0.02). These eight most important predictors were then included as predictors of the target (severity of bicycle crashes) in Bayesian network analysis, which identified crash type (0.31), road type (0.19), and type of opponent vehicle (0.18) as the most important predictors of severity. Copyright © 2017 Elsevier Ltd. All rights reserved.
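CHAID's core idea, choosing the predictor whose cross-tabulation with the outcome yields the largest chi-square statistic, can be sketched without any library. The contingency tables below are invented for illustration, not the paper's data:

```python
# CHAID grows its tree by picking, at each node, the predictor whose
# cross-tabulation with the outcome has the largest chi-square statistic.
def chi_square(table):
    """Pearson chi-square statistic for a contingency table (rows of counts)."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n
            stat += (obs - exp) ** 2 / exp
    return stat

# Rows: predictor category; columns: (injured, fatal) counts (hypothetical).
by_road_type = [[900, 40], [500, 60]]   # urban vs rural
by_weekday = [[700, 52], [700, 48]]     # weekday vs weekend

print(chi_square(by_road_type))  # larger: road type is the better first split
print(chi_square(by_weekday))
```

A full CHAID implementation would additionally Bonferroni-correct the p-values and merge categories that are not significantly different, but the split criterion is this statistic.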
A review of costing methodologies in critical care studies.
Pines, Jesse M; Fager, Samuel S; Milzman, David P
2002-09-01
Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.
Effects of foveal information processing
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.
1984-01-01
The scanning behavior of pilots must be understood so that cockpit displays can be assembled which will provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior: (1) is subconscious; (2) is situation dependent; and (3) can be disrupted if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast short time series received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data using correlation analysis methods.
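The base method, Brown's double (linear) exponential smoothing, is compact enough to sketch; the fractal-dimension enhancement described in the article is not reproduced here, and the sensor series is hypothetical:

```python
# Brown's double exponential smoothing, a standard short-series forecaster
# (sketch with hypothetical sensor data; the article's fractal-dimension
# improvement is not implemented here).
def brown_forecast(series, alpha=0.4, horizon=1):
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1        # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2       # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

sensor = [10.0, 12.0, 13.5, 15.1, 16.8, 18.2]    # short upward-trending series
print(round(brown_forecast(sensor), 2))
```

Because the level and trend terms are recursive, the method needs only two state variables, which is why it suits streaming sensor data in real-time systems.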
NASA Astrophysics Data System (ADS)
Jha, Madan K.; Chowdary, V. M.; Chowdhury, Alivia
2010-11-01
An approach is presented for the evaluation of groundwater potential using remote sensing, geographic information system, geoelectrical, and multi-criteria decision analysis techniques. The approach divides the available hydrologic and hydrogeologic data into two groups, exogenous (hydrologic) and endogenous (subsurface). A case study in Salboni Block, West Bengal (India), uses six thematic layers of exogenous parameters and four thematic layers of endogenous parameters. These thematic layers and their features were assigned suitable weights which were normalized by analytic hierarchy process and eigenvector techniques. The layers were then integrated using ArcGIS software to generate two groundwater potential maps. The hydrologic parameters-based groundwater potential zone map indicated that the 'good' groundwater potential zone covers 27.14% of the area, the 'moderate' zone 45.33%, and the 'poor' zone 27.53%. A comparison of this map with the groundwater potential map based on subsurface parameters revealed that the hydrologic parameters-based map accurately delineates groundwater potential zones in about 59% of the area, and hence it is dependable to a certain extent. More than 80% of the study area has moderate-to-poor groundwater potential, which necessitates efficient groundwater management for long-term water security. Overall, the integrated technique is useful for the assessment of groundwater resources at a basin or sub-basin scale.
Fuzzy MCDM Technique for Planning the Environment Watershed
NASA Astrophysics Data System (ADS)
Chen, Yi-Chun; Lien, Hui-Pang; Tzeng, Gwo-Hshiung; Yang, Lung-Shih; Yen, Leon
In the real world, decision-making problems are vague and uncertain in a number of ways. Most criteria have interdependent and interactive features, so they cannot be evaluated by conventional measurement methods. To approximate the human subjective evaluation process, it is therefore more suitable to apply a fuzzy method to environment-watershed planning. This paper describes the design of a fuzzy decision support system using a multi-criteria analysis approach for selecting the best plan alternatives or strategies for an environment watershed. The Fuzzy Analytic Hierarchy Process (FAHP) method is used to determine the preference weightings of criteria from decision makers' subjective perceptions. A questionnaire was used to elicit judgments from three related groups comprising fifteen experts. Subjectivity and vagueness in the criteria and alternatives are handled in the selection process and simulations by using fuzzy numbers with linguistic terms. Incorporating the decision makers' attitudes towards preference, the overall performance value of each alternative can be obtained based on the concept of Fuzzy Multiple Criteria Decision Making (FMCDM). An example consisting of five alternatives, solicited from environment-watershed planning work in Taiwan, is used to demonstrate the effectiveness and usefulness of the proposed approach.
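The fuzzy arithmetic underlying FAHP can be illustrated with triangular fuzzy numbers (l, m, u) mapped from linguistic terms. The terms and their numeric mappings below follow a common convention and are assumed rather than taken from the paper:

```python
# Triangular fuzzy numbers (l, m, u) of the kind FAHP uses for linguistic
# judgments; arithmetic and defuzzification sketch (mappings are assumptions).
def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def tfn_scale(a, k):
    return tuple(k * x for x in a)

def defuzzify(a):
    """Graded-mean defuzzification of a triangular fuzzy number."""
    l, m, u = a
    return (l + 4 * m + u) / 6

# Two experts rate a criterion with linguistic terms mapped to TFNs.
moderate = (1.0, 2.0, 3.0)
strong = (2.0, 3.0, 4.0)

aggregate = tfn_scale(tfn_add(moderate, strong), 0.5)  # fuzzy average
print(aggregate, round(defuzzify(aggregate), 3))       # (1.5, 2.5, 3.5) 2.5
```

The spread u minus l carries the experts' vagueness through the aggregation, and defuzzification collapses it to a crisp weight only at the final ranking step.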
A matter of tradeoffs: reintroduction as a multiple objective decision
Converse, Sarah J.; Moore, Clinton T.; Folk, Martin J.; Runge, Michael C.
2013-01-01
Decision making in guidance of reintroduction efforts is made challenging by the substantial scientific uncertainty typically involved. However, a less recognized challenge is that the management objectives are often numerous and complex. Decision makers managing reintroduction efforts are often concerned with more than just how to maximize the probability of reintroduction success from a population perspective. Decision makers are also weighing other concerns such as budget limitations, public support and/or opposition, impacts on the ecosystem, and the need to consider not just a single reintroduction effort, but conservation of the entire species. Multiple objective decision analysis is a powerful tool for formal analysis of such complex decisions. We demonstrate the use of multiple objective decision analysis in the case of the Florida non-migratory whooping crane reintroduction effort. In this case, the State of Florida was considering whether to resume releases of captive-reared crane chicks into the non-migratory whooping crane population in that state. Management objectives under consideration included maximizing the probability of successful population establishment, minimizing costs, maximizing public relations benefits, maximizing the number of birds available for alternative reintroduction efforts, and maximizing learning about the demographic patterns of reintroduced whooping cranes. The State of Florida engaged in a collaborative process with their management partners, first, to evaluate and characterize important uncertainties about system behavior, and next, to formally evaluate the tradeoffs between objectives using the Simple Multi-Attribute Rating Technique (SMART). The recommendation resulting from this process, to continue releases of cranes at a moderate intensity, was adopted by the State of Florida in late 2008. 
Although continued releases did not receive support from the International Whooping Crane Recovery Team, this approach does provide a template for the formal, transparent consideration of multiple, potentially competing, objectives in reintroduction decision making.
Trajectory-Based Performance Assessment for Aviation Weather Information
NASA Technical Reports Server (NTRS)
Vigeant-Langlois, Laurence; Hansman, R. John, Jr.
2003-01-01
Based on an analysis of aviation decision-makers' time-related weather information needs, an abstraction of the aviation weather decision task was developed that involves 4-D intersection testing between aircraft trajectory hypertubes and hazardous weather hypervolumes. The framework builds on the hypothesis that hazardous meteorological fields can be simplified using discrete boundaries of surrogate threat attributes. The abstractions developed in the framework may be useful in studying how to improve the performance of weather forecasts from the trajectory-centric perspective, as well as for developing useful visualization techniques for weather information.
Team formation and breakup in multiagent systems
NASA Astrophysics Data System (ADS)
Rao, Venkatesh Guru
The goal of this dissertation is to pose and solve problems involving team formation and breakup in two specific multiagent domains: formation travel and space-based interferometric observatories. The methodology employed comprises elements drawn from control theory, scheduling theory and artificial intelligence (AI). The original contribution of the work comprises three elements. The first contribution, the partitioned state-space approach, is a technique for formulating and solving co-ordinated motion problems using calculus-of-variations techniques. The approach is applied to obtain optimal two-agent formation travel trajectories on graphs. The second contribution is the class of MixTeam algorithms, a class of team dispatchers that extends classical dispatching by accommodating team formation and breakup and exploration/exploitation learning. The algorithms are applied to observation scheduling and constellation geometry design for interferometric space telescopes. The use of feedback control for team scheduling is also demonstrated with these algorithms. The third contribution is the analysis of the optimality properties of greedy, or myopic, decision-making for a simple class of team dispatching problems. This analysis represents a first step towards the complete analysis of complex team schedulers such as the MixTeam algorithms. The contributions represent an extension to the literature on team dynamics in control theory. The broad conclusions that emerge from this research are that greedy or myopic decision-making strategies for teams perform well when specific parameters in the domain are weakly affected by an agent's actions, and that intelligent systems require a closer integration of domain knowledge in decision-making functions.
Structural Equation Model Trees
ERIC Educational Resources Information Center
Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman
2013-01-01
In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…
Using the Nobel Laureates in Economics to Teach Quantitative Methods
ERIC Educational Resources Information Center
Becker, William E.; Greene, William H.
2005-01-01
The authors show how the work of Nobel Laureates in economics can enhance student understanding and bring them up to date on topics such as probability, uncertainty and decision theory, hypothesis testing, regression to the mean, instrumental variable techniques, discrete choice modeling, and time-series analysis. (Contains 2 notes.)
Risk Management for Weapon Systems Acquisition: A Decision Support System
1985-02-28
includes the program evaluation and review technique (PERT) for network analysis, the PMRM for quantifying risk, an optimization package for generating... Despite the inclusion of uncertainty in time, PERT can at best be considered a tool for quantifying risk with regard to the time element only. Moreover
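PERT's three-point estimate is a one-line formula: expected duration (O + 4M + P)/6 and variance ((P - O)/6)² per activity, summed along a path. A sketch with hypothetical activities:

```python
# PERT three-point estimate: expected duration and variance per activity,
# summed along a path (activities and durations are hypothetical).
def pert(optimistic, most_likely, pessimistic):
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    variance = ((pessimistic - optimistic) / 6) ** 2
    return expected, variance

# (O, M, P) estimates in weeks for activities on the critical path.
path = [(2, 4, 8), (3, 5, 9), (1, 2, 3)]

totals = [pert(*a) for a in path]
expected = sum(e for e, _ in totals)
variance = sum(v for _, v in totals)
print(f"path expected {expected:.2f} weeks, std dev {variance ** 0.5:.2f}")
```

This also illustrates the report's caveat: the only uncertainty PERT quantifies is in the time element, via these variance terms.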
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Data Mining and Knowledge Management in Higher Education -Potential Applications.
ERIC Educational Resources Information Center
Luan, Jing
This paper introduces a new decision support tool, data mining, in the context of knowledge management. The most striking features of data mining techniques are clustering and prediction. The clustering aspect of data mining offers comprehensive characteristics analysis of students, while the predicting function estimates the likelihood for a…
A number of investigators have recently examined the utility of applying probabilistic techniques in the derivation of toxic equivalency factors (TEFs) for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (...
Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.
ERIC Educational Resources Information Center
Catanese, Anthony James
Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programing. The annotated reference sources in this bibliography include those works that have been most influential…
SUSTAIN (System for Urban Stormwater Treatment and Analysis INtegration) is a decision support system to facilitate selection and placement of best management practices (BMPs) and low impact development (LID) techniques at strategic locations in urban watersheds. It was develope...
Teaching Green Engineering: The Case of Ethanol Lifecycle Analysis
ERIC Educational Resources Information Center
Vallero, Daniel A.; Braiser, Chris
2008-01-01
Lifecycle assessment (LCA) is a valuable tool in teaching green engineering and has been used to assess biofuels, including ethanol. An undergraduate engineering course at Duke University has integrated LCA with other interactive teaching techniques to enhance awareness and to inform engineering decision making related to societal issues, such as…
Maintenance Audit through Value Analysis Technique: A Case Study
NASA Astrophysics Data System (ADS)
Carnero, M. C.; Delgado, S.
2008-11-01
The increase in competitiveness, technological change, and rising requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within managerial strategy. There are numerous maintenance activities that must be carried out in a service company, and as a result the maintenance function as a whole may be outsourced. Nevertheless, delegating this work to specialized personnel does not exempt the company from responsibility; rather, it leads to the need for control of each maintenance activity. To achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses potential problems. In this paper a maintenance audit applied to a service company is developed. The methodology is based on expert systems: by means of rules, the expert system uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between variables associated with the specific maintenance functions to obtain the maintenance state by section and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, where maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical of the design of new products, within rule-based expert systems.
Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-06-01
Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) technique for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF over single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points, we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data, so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective and needs to be aligned with objective machine metrics.
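The NASA TLX component mentioned above combines six subscale ratings using weights derived from 15 pairwise comparisons between the subscales. A minimal sketch with hypothetical ratings and tallies:

```python
# NASA TLX-style weighted workload sketch: six subscale ratings (0-100)
# weighted by how often each subscale was chosen across the 15 pairwise
# comparisons (ratings and tallies are hypothetical).
ratings = {"mental": 70, "physical": 20, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 35}
tally = {"mental": 5, "physical": 0, "temporal": 3,
         "performance": 2, "effort": 4, "frustration": 1}  # sums to 15

def tlx_score(ratings, tally):
    assert sum(tally.values()) == 15, "15 pairwise comparisons expected"
    return sum(ratings[k] * tally[k] for k in ratings) / 15

print(round(tlx_score(ratings, tally), 2))
```

The pairwise weighting is what lets the composite reflect which workload sources matter for a given task, rather than averaging all six subscales equally.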
Training conservation practitioners to be better decision makers
Johnson, Fred A.; Eaton, Mitchell J.; Williams, James H.; Jensen, Gitte H.; Madsen, Jesper
2015-01-01
Traditional conservation curricula and training typically emphasize only one part of systematic decision making (i.e., the science), at the expense of preparing conservation practitioners with critical skills in values-setting, working with decision makers and stakeholders, and effective problem framing. In this article we describe how the application of decision science is relevant to conservation problems and suggest how current and future conservation practitioners can be trained to be better decision makers. Though decision-analytic approaches vary considerably, they all involve: (1) properly formulating the decision problem; (2) specifying feasible alternative actions; and (3) selecting criteria for evaluating potential outcomes. Two approaches are available for providing training in decision science, each serving different needs. Formal education is useful for providing simple, well-defined problems that allow demonstrations of the structure, axioms and general characteristics of a decision-analytic approach. In contrast, practical training can offer complex, realistic decision problems requiring more careful structuring and analysis than those used for formal training purposes. Ultimately, the kinds and degree of training necessary depend on the role conservation practitioners play in a decision-making process. Those attempting to facilitate decision-making processes will need advanced training in both the technical aspects of decision science and in facilitation techniques, as well as opportunities to apprentice under decision analysts/consultants. Our primary goal should be to ingrain a discipline for applying clarity of thought to all decisions.
[The role of epidemiology in the process of decision-making].
Prost, A
1997-01-01
Epidemiology is the method of choice for quantifying and interpreting health phenomena, placing them into perspective to allow trend analysis and projections. It is a tool for analysis, evaluation, and forecasting, and is thus indispensable in the decision-making process. However, this comprehensive technique has its limitations, since health is the result of complex interactions: individual requirements do not always correspond to the overall needs of the community; consideration has to be given to solidarity and the necessity for cost-sharing; and the decision process is strongly influenced by social, cultural, religious, and political factors which defy quantification and, on occasion, any rational course of action. Each indicator takes into account only one aspect of the situation, and the pertinent indicator should therefore be carefully selected. At the same time, any choice implicitly signifies value judgements, often unnoticed, which need to be balanced and validated in relation to the ethical values of the community in order to be of any assistance to decision-making. Decision-making is a qualitative political process which, although based on the quantitative analysis supplied by epidemiology, cannot be limited to it. Each approach enhances the other, but they should not be confused if freedom of action is to be preserved from being locked into a mechanical process that is unacceptable both to the individual and to society.
Prasad, Keerthana; Winter, Jan; Bhat, Udayakrishna M; Acharya, Raviraja V; Prabhu, Gopalakrishna K
2012-08-01
This paper describes the development of a decision support system for the diagnosis of malaria using color image analysis. A hematologist has to study around 100 to 300 microscopic views of Giemsa-stained thin blood smear images to detect malaria parasites, evaluate the extent of infection, and identify the species of the parasite. The proposed algorithm picks out the suspicious regions and detects the parasites in images of all the views. The subimages representing all these parasites are put together to form a composite image which can be sent over a communication channel to obtain the opinion of a remote expert for accurate diagnosis and treatment. We demonstrate the use of the proposed technique as a decision support system by developing an Android application which facilitates communication with a remote expert for final confirmation of the decision on malaria treatment. Our algorithm detects around 96% of the parasites with a false positive rate of 20%. The Spearman correlation r was 0.88 with a confidence interval of 0.838 to 0.923, p<0.0001.
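The agreement statistic reported above can be reproduced in miniature: a pure-Python Spearman rank correlation, applied here to invented manual and automated parasite counts (the data are not from the paper).

```python
# Spearman rank correlation: rank both series (average ranks for ties),
# then compute the Pearson correlation of the ranks.
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0        # average 1-based rank for ties
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

manual = [5, 12, 30, 44, 60]   # hypothetical manual parasite counts
auto   = [7, 10, 33, 41, 66]   # hypothetical automated counts
r = spearman(manual, auto)
```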
Korving, H; Clemens, F
2002-01-01
In recent years, decision analysis has become an important technique in many disciplines. It provides a methodology for rational decision-making that allows for uncertainties in the outcomes of the possible actions to be undertaken. An example in urban drainage is the situation in which an engineer has to decide upon a major reconstruction of a system in order to prevent pollution of receiving waters due to CSOs. This paper describes the possibilities of Bayesian decision-making in urban drainage. In particular, the utility of monitoring prior to deciding on the reconstruction of a sewer system to reduce CSO emissions is studied. Our concern is with deciding whether a price should be paid for new information and which source of information is the best choice given the expected uncertainties in the outcome. The influence of specific uncertainties (sewer system data and model parameters) on the probability of CSO volumes is shown to be significant. Using Bayes' rule to combine prior impressions with new observations reduces the risks linked with the planning of sewer system reconstructions.
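A minimal sketch of the Bayes'-rule step described above, assuming a binary exceedance state and a monitoring campaign with hypothetical sensitivity and false-alarm rate (all numbers are invented, not the paper's):

```python
# Combine a prior belief about whether CSO volumes exceed the permitted
# level with the result of a monitoring campaign, via Bayes' rule.
prior_exceed = 0.30          # prior P(CSO volume exceeds limit)
p_pos_given_exceed = 0.90    # monitoring detects a true exceedance
p_pos_given_ok = 0.15        # false-alarm rate

def posterior(prior, p_pos_true, p_pos_false, observed_positive=True):
    if observed_positive:
        num = p_pos_true * prior
        den = p_pos_true * prior + p_pos_false * (1 - prior)
    else:
        num = (1 - p_pos_true) * prior
        den = (1 - p_pos_true) * prior + (1 - p_pos_false) * (1 - prior)
    return num / den

post = posterior(prior_exceed, p_pos_given_exceed, p_pos_given_ok)
```

A positive observation here raises the exceedance probability from 0.30 to 0.72; comparing the decision made with and without this update is exactly the "should a price be paid for new information" question.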
A systematic mapping study of process mining
NASA Astrophysics Data System (ADS)
Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo
2018-05-01
This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers evidenced 'discovery' (71%) as the main type of process mining addressed and 'categorical prediction' (25%) as the main mining task solved. The most applied traditional techniques are the 'graph structure-based' ones (38%). Specifically concerning computational intelligence and machine learning techniques, we concluded that little attention has been given to them; the most applied are 'evolutionary computation' (9%) and 'decision tree' (6%) techniques. Process mining challenges, such as balancing robustness, simplicity, accuracy, and generalization, could benefit from a larger use of such techniques.
Balk, Benjamin; Elder, Kelly
2000-01-01
We model the spatial distribution of snow across a mountain basin using an approach that combines binary decision tree and geostatistical techniques. In April 1997 and 1998, intensive snow surveys were conducted in the 6.9‐km2 Loch Vale watershed (LVWS), Rocky Mountain National Park, Colorado. Binary decision trees were used to model the large‐scale variations in snow depth, while the small‐scale variations were modeled through kriging interpolation methods. Binary decision trees related depth to the physically based independent variables of net solar radiation, elevation, slope, and vegetation cover type. These decision tree models explained 54–65% of the observed variance in the depth measurements. The tree‐based modeled depths were then subtracted from the measured depths, and the resulting residuals were spatially distributed across LVWS through kriging techniques. The kriged estimates of the residuals were added to the tree‐based modeled depths to produce a combined depth model. The combined depth estimates explained 60–85% of the variance in the measured depths. Snow densities were mapped across LVWS using regression analysis. Snow‐covered area was determined from high‐resolution aerial photographs. Combining the modeled depths and densities with a snow cover map produced estimates of the spatial distribution of snow water equivalence (SWE). This modeling approach offers improvement over previous methods of estimating SWE distribution in mountain basins.
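The two-stage model above can be caricatured in a few lines: a single-split regression "tree" on elevation captures the large-scale trend, and an inverse-distance interpolation of the residuals stands in for kriging (a deliberate simplification; real kriging fits a variogram). All data points and the split threshold are invented.

```python
# Sketch of the combined depth model: tree prediction plus spatially
# interpolated residual correction (toy data, one split, IDW not kriging).
points = [  # (x, y, elevation_m, snow_depth_m)
    (0, 0, 3100, 1.1), (1, 0, 3150, 1.3), (0, 1, 3300, 2.0),
    (1, 1, 3350, 2.2), (2, 2, 3400, 2.6),
]
threshold = 3200  # elevation split chosen by eye for this sketch

lo = [p[3] for p in points if p[2] < threshold]
hi = [p[3] for p in points if p[2] >= threshold]
mean_lo, mean_hi = sum(lo) / len(lo), sum(hi) / len(hi)

def tree_pred(elev):
    # one-split "decision tree": leaf means below/above the threshold
    return mean_lo if elev < threshold else mean_hi

residuals = [(x, y, d - tree_pred(e)) for x, y, e, d in points]

def idw_residual(x, y, power=2.0):
    # inverse-distance-weighted residual surface (kriging stand-in)
    num = den = 0.0
    for px, py, r in residuals:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return r                     # exact at a sample point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * r
        den += w
    return num / den

def combined_depth(x, y, elev):
    return tree_pred(elev) + idw_residual(x, y)
```

As in the paper, the residual surface restores the small-scale variation the coarse tree prediction misses, so the combined model reproduces the observations at the survey points.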
In the teeth of the evidence: the curious case of evidence-based medicine.
Davidoff, F
1999-03-01
For a very long time, evidence from research has contributed to clinical decision making. Over the past 50 years, however, the nature of clinical research evidence has drastically changed compared with previous eras: its standards are higher, the tools for assembling and analyzing it are more powerful, and the context in which it is used is less authoritarian. The consequence has been a shift in both the concept and the practice of clinical decision making known as evidence-based medicine. Evidence-based decisions, by definition, use the strongest available evidence, are often more quantitatively informed than decisions made in the traditional fashion, and sometimes run counter to expert opinion. The techniques of evidence-based medicine are also helpful in resolving conflicting opinions. Evidence-based medicine did not simply appear in vacuo; its roots extend back at least as far as the great French Encyclopedia of the 18th century and the subsequent work of Pierre Louis in Paris in the early 19th century. The power of the evidence-based approach has been enhanced in recent years by the development of the techniques of systematic review and meta-analysis. While this approach has its critics, we would all want the best available evidence used in making decisions about our care if we got sick. It is only fair that the patients under our care receive nothing less.
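The meta-analysis technique mentioned above can be illustrated by its core pooling step: a fixed-effect, inverse-variance weighted average of study effect estimates. The effects and standard errors below are made up for the sketch.

```python
# Fixed-effect meta-analysis: weight each study by 1/SE^2, pool, and
# compute the (smaller) standard error of the pooled estimate.
studies = [(-0.40, 0.20), (-0.25, 0.15), (-0.55, 0.30)]  # (effect, SE)

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
```

The pooled standard error is smaller than any single study's, which is the quantitative sense in which systematic review strengthens the evidence base.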
Statistical innovations in the medical device world sparked by the FDA.
Campbell, Gregory; Yue, Lilly Q
2016-01-01
The world of medical devices, while highly diverse, is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English and assessed relevant clinical endpoints, and we summarized their methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis, and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation, and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making.
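Of the modeling approaches listed, Markov state-transition modeling is the easiest to sketch: a three-state cohort model with hypothetical annual transition probabilities, accumulating undiscounted life years. The states and probabilities are invented, not taken from any of the reviewed studies.

```python
# Minimal Markov cohort model: track the fraction of a cohort in each
# state over annual cycles; time alive approximates life expectancy.
states = ["stable", "progression", "dead"]
P = {  # P[from][to]: annual transition probabilities (rows sum to 1)
    "stable":      {"stable": 0.80, "progression": 0.15, "dead": 0.05},
    "progression": {"stable": 0.00, "progression": 0.70, "dead": 0.30},
    "dead":        {"stable": 0.00, "progression": 0.00, "dead": 1.00},
}

cohort = {"stable": 1.0, "progression": 0.0, "dead": 0.0}
life_years = 0.0
for _ in range(50):                        # 50 annual cycles
    life_years += cohort["stable"] + cohort["progression"]
    cohort = {s: sum(cohort[f] * P[f][s] for f in states) for s in states}
```

Attaching a utility weight to each living state instead of 1.0 would turn the same loop into a quality-adjusted life-year (QALY) calculation, one of the outcomes the review tabulates.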
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
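The first step of the event-log analysis described above can be sketched as directly-follows discovery: counting which activity immediately follows which within each case. The toy hospital log below is invented for illustration.

```python
# Derive the directly-follows relation from a (case id, activity) log,
# the starting point of many control-flow discovery algorithms.
event_log = [
    ("case1", "register"), ("case1", "triage"),
    ("case1", "treat"), ("case1", "discharge"),
    ("case2", "register"), ("case2", "treat"), ("case2", "discharge"),
]

traces = {}
for case, activity in event_log:   # log assumed time-ordered per case
    traces.setdefault(case, []).append(activity)

directly_follows = {}
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        directly_follows[(a, b)] = directly_follows.get((a, b), 0) + 1
```

The resulting counts form a directly-follows graph; deviations between that graph and the intended clinical pathway are what flag candidate process improvements.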
Cloud Service Selection Using Multicriteria Decision Analysis
Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena
2014-01-01
Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios. PMID:24696645
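One of the surveyed MCDA techniques, TOPSIS, can be sketched end to end for cloud service selection: vector normalization, weighting, ideal and anti-ideal points, and ranking by relative closeness. The providers, criteria, weights, and scores below are invented for illustration.

```python
# Compact TOPSIS sketch over a toy decision matrix.
import math

providers = ["cloudA", "cloudB", "cloudC"]
# criteria: (name, weight, True if benefit / False if cost)
criteria = [("performance", 0.5, True), ("price", 0.3, False),
            ("uptime", 0.2, True)]
scores = {  # raw decision matrix, rows = providers
    "cloudA": [7.0, 3.0, 9.0],
    "cloudB": [9.0, 6.0, 8.0],
    "cloudC": [6.0, 2.0, 9.5],
}

# vector-normalize each criterion column, then apply the weights
norms = [math.sqrt(sum(scores[p][j] ** 2 for p in providers)) for j in range(3)]
V = {p: [w * scores[p][j] / norms[j]
         for j, (_, w, _) in enumerate(criteria)] for p in providers}

ideal = [max(V[p][j] for p in providers) if benefit
         else min(V[p][j] for p in providers)
         for j, (_, _, benefit) in enumerate(criteria)]
worst = [min(V[p][j] for p in providers) if benefit
         else max(V[p][j] for p in providers)
         for j, (_, _, benefit) in enumerate(criteria)]

def closeness(p):
    d_best = math.sqrt(sum((V[p][j] - ideal[j]) ** 2 for j in range(3)))
    d_worst = math.sqrt(sum((V[p][j] - worst[j]) ** 2 for j in range(3)))
    return d_worst / (d_best + d_worst)

ranking = sorted(providers, key=closeness, reverse=True)
```

Note how the cheap, reliable provider can outrank the fastest one once the cost criterion is inverted; this sensitivity to weights is why MCDA surveys stress weight elicitation.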
Vision Based Autonomous Robotic Control for Advanced Inspection and Repair
NASA Technical Reports Server (NTRS)
Wehner, Walter S.
2014-01-01
The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.
Formal methods for modeling and analysis of hybrid systems
NASA Technical Reports Server (NTRS)
Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)
2009-01-01
A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics
NASA Technical Reports Server (NTRS)
Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela
2015-01-01
Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
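The MC/DC metric mentioned above requires each condition to be shown to independently affect the decision outcome. The toy checker below illustrates the coverage criterion itself for the decision `a and b` (it is not the paper's preprocessing technique).

```python
# MC/DC check for "a and b": for each condition there must be a pair of
# tests differing only in that condition with different decision outcomes.
def decision(a, b):
    return a and b

tests = [(True, True), (True, False), (False, True)]

def achieves_mcdc(tests):
    covered = set()
    for t1 in tests:
        for t2 in tests:
            diff = [i for i in range(len(t1)) if t1[i] != t2[i]]
            if len(diff) == 1 and decision(*t1) != decision(*t2):
                covered.add(diff[0])      # condition diff[0] shown independent
    return covered == set(range(2))

ok = achieves_mcdc(tests)
```

Three tests suffice for a two-condition decision, versus four for exhaustive condition coverage; for n conditions MC/DC needs roughly n+1 tests rather than 2^n, which is why the metric matters for certification-grade testing.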
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nealey, S.M.; Liebow, E.B.
1988-03-01
The US Department of Energy sponsored a one-day workshop to discuss the complex dimensions of risk judgment formation and the assessment of social and economic effects of risk perceptions related to the permanent underground storage of highly radioactive waste from commercial nuclear power plants. Affected parties have publicly expressed concerns about potentially significant risk-related effects of this approach to waste management. A selective review of relevant literature in psychology, decision analysis, economics, sociology, and anthropology was completed, along with an examination of decision analysis techniques that might assist in developing suitable responses to public risk-related concerns. The workshop was organized as a forum in which a set of distinguished experts could exchange ideas and observations about the problems of characterizing the effects of risk judgments. Out of the exchange emerged the following issues and themes: problems with probabilistic risk assessment techniques are evident; differences exist in the way experts and laypersons view risk, which leads to higher levels of public concern than experts feel are justified; experts, risk managers, and decision-makers sometimes err in assessing risk and in dealing with the public; credibility and trust are important contributing factors in the formation of risk judgments; social and economic consequences of perceived risk should be properly anticipated; improvements can be made in informing the public about risk; the role of the public in risk assessment, risk management, and decisions about risk should be reconsidered; and mitigation and compensation are central to resolving conflicts arising from divergent risk judgments.
The Sacramento-San Joaquin Delta Conflict: Strategic Insights for California's Policymakers
NASA Astrophysics Data System (ADS)
Moazezi, M. R.
2013-12-01
The Sacramento-San Joaquin Delta, a major water supply source in California and a unique habitat for many native and invasive species, is on the verge of collapse due to a prolonged conflict over how to manage the Delta. There is an urgent need to expedite the resolution of this conflict, because continuation of the status quo would leave irreversible environmental consequences for the entire state. In this paper, a systematic technique is proposed for providing strategic insights into the Sacramento-San Joaquin Delta conflict. A game-theoretic framework is chosen to systematically analyze the behavioral characteristics of the decision makers, as well as their options in the conflict with respect to their preferences, using a formal mathematical language. The Graph Model for Conflict Resolution (GMCR), a recent game-theoretic technique, is applied to model and analyze the Delta conflict in order to better understand the options, preferences, and behavioral characteristics of the major decision makers. GMCR II, a decision support system based on the GMCR concept, is used to facilitate the analysis of the problem through a range of non-cooperative game-theoretic stability definitions. Furthermore, coalition analysis is conducted to analyze the potential for forming partial coalitions among decision makers, and to investigate how forming a coalition can influence the conflict resolution process. This contribution shows that involvement of the State of California is necessary for developing an environmentally friendly resolution of the Delta conflict. It also indicates that this resolution is only achievable through improving the fragile levee systems and constructing a new water export facility.
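The GMCR-style stability analysis described above can be miniaturized: states, each decision maker's unilateral moves, and ordinal preferences, with Nash stability meaning no unilateral move reaches a preferred state. The decision makers, moves, and preferences below are invented and far simpler than the actual Delta conflict.

```python
# Toy graph-model conflict: find Nash-stable equilibria.
states = ["status_quo", "levee_upgrade", "export_facility", "joint_plan"]

moves = {  # unilateral moves available to each decision maker (DM)
    "state":     {"status_quo": ["levee_upgrade"],
                  "export_facility": ["joint_plan"]},
    "exporters": {"status_quo": ["export_facility"],
                  "levee_upgrade": ["joint_plan"]},
}
prefs = {  # each DM's states, most preferred first (ordinal)
    "state":     ["joint_plan", "levee_upgrade", "status_quo", "export_facility"],
    "exporters": ["joint_plan", "export_facility", "status_quo", "levee_upgrade"],
}

def prefers(dm, s1, s2):
    return prefs[dm].index(s1) < prefs[dm].index(s2)

def nash_stable(dm, state):
    # Nash-stable: no unilateral move leads somewhere this DM prefers
    return not any(prefers(dm, t, state) for t in moves[dm].get(state, []))

equilibria = [s for s in states if all(nash_stable(dm, s) for dm in moves)]
```

GMCR II applies the same idea with richer stability definitions (general metarationality, sequential stability, etc.) that also account for opponents' counter-moves.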
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting-factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
Decision trees in epidemiological research.
Venkatasubramaniam, Ashwini; Wolfson, Julian; Mitchell, Nathan; Barnes, Timothy; JaKa, Meghan; French, Simone
2017-01-01
In many studies, it is of interest to identify population subgroups that are relatively homogeneous with respect to an outcome. The nature of these subgroups can provide insight into effect mechanisms and suggest targets for tailored interventions. However, identifying relevant subgroups can be challenging with standard statistical methods. We review the literature on decision trees, a family of techniques for partitioning the population, on the basis of covariates, into distinct subgroups who share similar values of an outcome variable. We compare two decision tree methods, the popular Classification and Regression Tree (CART) technique and the newer Conditional Inference Tree (CTree) technique, assessing their performance in a simulation study and using data from the Box Lunch Study, a randomized controlled trial of a portion size intervention. Both CART and CTree identify homogeneous population subgroups and offer improved prediction accuracy relative to regression-based approaches when subgroups are truly present in the data. An important distinction between CART and CTree is that the latter uses a formal statistical hypothesis testing framework in building decision trees, which simplifies the process of identifying and interpreting the final tree model. We also introduce a novel way to visualize the subgroups defined by decision trees. Our novel graphical visualization provides a more scientifically meaningful characterization of the subgroups identified by decision trees. Decision trees are a useful tool for identifying homogeneous subgroups defined by combinations of individual characteristics. While all decision tree techniques generate subgroups, we advocate the use of the newer CTree technique due to its simplicity and ease of interpretation.
Ijzerman, Maarten J; van Til, Janine A; Snoek, Govert J
2008-12-01
To present and compare two multi-criteria decision techniques (analytic hierarchy process [AHP] and conjoint analysis [CA]) for eliciting preferences in patients with cervical spinal cord injury (SCI) who are eligible for surgical augmentation of hand function, either with or without implantation of a neuroprosthesis. The methods were compared with respect to attribute weights, overall preference, and practical experience. Two previously designed and administered multi-criteria decision surveys in patients with SCI were compared and further analysed. Attributes and their weights in the AHP experiment were determined by an expert panel, followed by determination of the weights in the patient group. Attributes for the CA were selected and validated using an expert panel, piloted in six patients with SCI, and subsequently administered to the same group of patients as participated in the AHP experiment. Both experiments showed the importance of non-outcome-related factors such as inpatient stay and number of surgical procedures. In particular, patients were less concerned with clinical outcomes in actual decision making. Overall preference in both the AHP and CA was in favor of tendon reconstruction (0.6 vs 0.4 for neuroprosthetic implantation). Both methods were easy to apply, but AHP was less easily explained and understood. Both the AHP and CA methods produced similar outcomes, which may have been caused by the obvious preferences of the patients. CA may be preferred because of its holistic approach of considering all treatment attributes simultaneously and, hence, its power in simulating real market decisions. On the other hand, the AHP method is preferred as a hands-on, easy-to-implement task with immediate feedback to the respondent; this flexibility allows AHP to be used in shared decision making. However, the way the technique is composed results in many inconsistencies. Patients preferred CA but complained about the number of choice tasks.
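The AHP side of the comparison can be sketched with a single pairwise-comparison matrix: priority weights via the geometric-mean approximation, and Saaty's consistency ratio (random index 0.58 for a 3x3 matrix). The attributes and judgments below are hypothetical, not the study's data; the consistency check is exactly what flags the "many inconsistencies" the abstract mentions.

```python
# AHP sketch: weights from a pairwise-comparison matrix, plus the
# consistency ratio (CR < 0.1 is the conventional acceptability bound).
import math

attrs = ["clinical_outcome", "inpatient_stay", "num_procedures"]
A = [  # A[i][j]: how strongly attribute i is preferred over attribute j
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
]
n = len(A)

# geometric-mean approximation to the principal eigenvector
gm = [math.prod(row) ** (1.0 / n) for row in A]
weights = [g / sum(gm) for g in gm]

# approximate principal eigenvalue and consistency ratio (RI = 0.58, n = 3)
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / weights[i] for i in range(n)) / n
CR = ((lam - n) / (n - 1)) / 0.58
```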
Is Regret Theory an alternative basis for estimating the value of healthcare interventions?
Smith, R D
1996-08-01
This paper presents an argument for the existence of 'regret' as an influence on the valuation of alternative outcomes when making treatment decisions in healthcare. It is argued that valuation techniques as currently formulated rely upon the axioms of Expected Utility Theory (transitivity and independence). This potentially leads to a misrepresentation of respondents' true preferences over treatment alternatives, and thus results in the potential for 'irrational' decisions being observed. A modified version of Regret Theory is outlined, and the results of a tentative empirical analysis are provided to illustrate the importance of accounting for regret in the valuation of health states. It is concluded that regret is an important element in individual valuation and decision making in health care.
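The contrast drawn above can be made concrete: over the same payoffs, a minimax-regret choice can diverge from the expected-utility choice. The actions, utilities, and probability below are invented; this is a simple minimax-regret sketch, not the paper's modified Regret Theory model.

```python
# Regret of an action in a state = gap to the best action in that state.
outcomes = {  # action -> utility of outcome in each health state
    "surgery":      {"responds": 0.9, "complication": 0.1},
    "conservative": {"responds": 0.6, "complication": 0.5},
}
states = ["responds", "complication"]

best_in_state = {s: max(outcomes[a][s] for a in outcomes) for s in states}
regret = {a: {s: best_in_state[s] - outcomes[a][s] for s in states}
          for a in outcomes}
max_regret = {a: max(regret[a].values()) for a in outcomes}
regret_choice = min(max_regret, key=max_regret.get)

p_responds = 0.8
expected_utility = {a: p_responds * outcomes[a]["responds"]
                       + (1 - p_responds) * outcomes[a]["complication"]
                    for a in outcomes}
eu_choice = max(expected_utility, key=expected_utility.get)
```

Here expected utility favors surgery while minimax regret favors the conservative option: exactly the kind of divergence the paper argues current valuation techniques misread as 'irrational'.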
Analysis of Alternatives for Dismantling of the Equipment in Building 117/1 at Ignalina NPP - 13278
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poskas, Povilas; Simonis, Audrius; Poskas, Gintautas
2013-07-01
Ignalina NPP operated two RBMK-1500 reactors, which are now under decommissioning. In this paper, dismantling alternatives for the equipment in Building 117/1 are analyzed. After situation analysis and collection of the primary information related to the components' physical and radiological characteristics, location, and other data, two different alternatives for dismantling the equipment are formulated: the first (A1), in which major components (vessels and pipes of the Emergency Core Cooling System, ECCS) are segmented/halved in situ using flame cutting (oxy-acetylene), and the second (A2), in which these components are segmented/halved at the workshop using the CAMC (Contact Arc Metal Cutting) technique. To select the preferable alternative, the MCDA method AHP (Analytic Hierarchy Process) is applied. A hierarchical list of decision criteria, necessary for assessing the performance of the alternatives, is formulated. Quantitative decision criteria values for the alternatives are calculated using the DECRAD software, developed by the Nuclear Engineering Laboratory of the Lithuanian Energy Institute, while qualitative decision criteria are evaluated using expert judgment. The analysis results show that alternative A1 is better than alternative A2.
A review of techniques to determine alternative selection in design for remanufacturing
NASA Astrophysics Data System (ADS)
Noor, A. Z. Mohamed; Fauadi, M. H. F. Md; Jafar, F. A.; Mohamad, N. R.; Yunos, A. S. Mohd
2017-10-01
This paper discusses the techniques used for optimization in manufacturing systems. Although the problem domain is focused on sustainable manufacturing, techniques used to optimize general manufacturing systems are also discussed. Important aspects of Design for Remanufacturing (DFReM) considered include indexes, weighted averages, grey decision making, and Fuzzy TOPSIS. The limitation of the existing techniques is that most of them rely heavily on the decision maker's perspective: different experts may understand a problem differently and consequently scale it differently. Therefore, the objective of this paper is to survey the available techniques and identify the features they lack. Once all the techniques have been reviewed, a new technique will be proposed to address the shortcomings of those discussed. This paper shows that a hybrid computation of Fuzzy Analytic Hierarchy Process (AHP) and Artificial Neural Network (ANN) is suitable and fills the gaps of all the techniques discussed.
School Mapping and Geospatial Analysis of the Schools in Jasra Development Block of India
NASA Astrophysics Data System (ADS)
Agrawal, S.; Gupta, R. D.
2016-06-01
GIS is a collection of tools and techniques that work on geospatial data and are used in analysis and decision making. Education is an inherent part of any civil society, and proper educational facilities generate high-quality human resources for a nation. Therefore, the government needs an efficient system that can help in analysing the current state of education and its progress, and that can support decision making and policy framing. GIS can serve these requirements not only for the government but also for the general public. In order to meet the standards of human development, it is necessary for the government and decision makers to keep a close watch on the existing education policy and the state of its implementation. School mapping plays an important role in this respect: it consists of building a geospatial database of schools that supports infrastructure development, policy analysis, and decision making. The present research work is an attempt to support the Right to Education (RTE) and Sarva Shiksha Abhiyan (SSA) programmes run by the Government of India through the use of GIS. School mapping of the study area is performed, followed by geospatial analysis. This research work will help in assessing the present status of educational infrastructure in the Jasra block of Allahabad district, India.
Primal-dual techniques for online algorithms and mechanisms
NASA Astrophysics Data System (ADS)
Liaghat, Vahid
An offline algorithm is one that knows the entire input in advance; an online algorithm, by contrast, processes its input serially. An online algorithm works in a local fashion and has to make irrevocable decisions without having seen the entire input. Online algorithms are often not optimal, since their irrevocable decisions may turn out to be inefficient once the rest of the input arrives. For a given online problem, the goal is to design algorithms that are competitive against the offline optimal solutions. In a classical offline scenario, it is common to see a dual analysis of problems that can be formulated as a linear or convex program. Primal-dual and dual-fitting techniques have been successfully applied to many such problems. Unfortunately, the usual tricks fall short in an online setting, since an online algorithm must make decisions without knowing even the whole program. In this thesis, we study the competitive analysis of fundamental problems in the literature, such as different variants of online matching and online Steiner connectivity, via online dual techniques. Although there are many generic tools for solving an optimization problem in the offline paradigm, much less is known for tackling online problems. The main focus of this work is to design generic techniques for solving integral linear optimization problems where the solution space is restricted by a set of linear constraints. A general family of these problems are online packing/covering problems. Our work shows that for several seemingly unrelated problems, primal-dual techniques can be successfully applied as a unifying approach to their analysis. We believe this leads to generic algorithmic frameworks for solving online problems. In the first part of the thesis, we show the effectiveness of our techniques in stochastic settings and their applications in Bayesian mechanism design.
In particular, we introduce new techniques for solving a fundamental linear optimization problem, namely, the stochastic generalized assignment problem (GAP). This packing problem generalizes various problems such as online matching, ad allocation, and bin packing. We furthermore show applications of these results in mechanism design by introducing Prophet Secretary, a novel Bayesian model for online auctions. In the second part of the thesis, we focus on covering problems. We develop the framework of "Disk Painting" for a general class of network design problems that can be characterized by proper functions. This class generalizes the node-weighted and edge-weighted variants of several well-known Steiner connectivity problems. We furthermore design a generic technique for solving the prize-collecting variants of these problems whenever a dual analysis exists for the non-prize-collecting counterparts. Hence, we solve the online prize-collecting variants of several network design problems for the first time. Finally, we focus on designing techniques for online problems with mixed packing/covering constraints. We initiate the study of degree-bounded graph optimization problems in the online setting by designing an online algorithm with a tight competitive ratio for the degree-bounded Steiner forest problem. We hope these techniques establish a starting point for the analysis of the important class of online degree-bounded optimization problems on graphs.
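The thesis's algorithms are far richer than this, but the flavor of online competitive analysis is easiest to see in the classic rent-or-buy (ski rental) problem, the textbook warm-up for primal-dual online arguments. The sketch below is illustrative only and is not drawn from the thesis:

```python
def online_rent_or_buy(buy_price, days_skied):
    """Break-even strategy: rent (cost 1 per day) while total rent paid stays
    below the buy price, then buy. This guarantees total cost at most
    2 * the offline optimum, i.e. the strategy is 2-competitive."""
    cost = 0
    for day in range(1, days_skied + 1):
        if day < buy_price:
            cost += 1          # keep renting while rent paid stays below buy_price
        else:
            cost += buy_price  # buy once rent paid reaches buy_price - 1
            break
    return cost

def offline_optimum(buy_price, days_skied):
    # the offline algorithm knows days_skied in advance: rent or buy, whichever is cheaper
    return min(buy_price, days_skied)
```

In the primal-dual view, each rented day raises a dual variable; the purchase is triggered exactly when the accumulated duals pay for it, which is the same mechanism that drives the more general online covering analyses.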
Artificial intelligence within the chemical laboratory.
Winkel, P
1994-01-01
Various techniques within the area of artificial intelligence, such as expert systems and neural networks, may play a role during problem-solving processes within the clinical biochemical laboratory. Neural network analysis provides a non-algorithmic approach to information processing, which gives the computer the ability to form associations and to recognize patterns or classes among data. It belongs to the machine learning techniques, which also include probabilistic techniques such as discriminant function analysis and logistic regression, and information-theoretical techniques. These techniques may be used to extract knowledge from example patients to optimize decision limits and identify clinically important laboratory quantities. An expert system may be defined as a computer program that can give advice in a well-defined area of expertise and is able to explain its reasoning. Declarative knowledge consists of statements about logical or empirical relationships between things. Expert systems typically separate declarative knowledge, residing in a knowledge base, from the inference engine: an algorithm that dynamically directs and controls the system as it searches its knowledge base. A tool is an expert system without a knowledge base; the developer of an expert system uses a tool by entering knowledge into the system. Many, if not the majority, of the problems encountered at the laboratory level are procedural. A problem is procedural if it is possible to write a step-by-step description of the expert's work or if it can be represented by a decision tree. To solve problems of this type, only small expert system tools and/or conventional programming are required. (ABSTRACT TRUNCATED AT 250 WORDS)
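The knowledge-base/inference-engine split described above can be illustrated in a few lines of Python: a forward-chaining engine that fires rules until no new facts emerge. The laboratory rules here are invented examples for illustration, not clinical guidance:

```python
def forward_chain(facts, rules):
    """Minimal inference engine: rules are (premises, conclusion) pairs.
    Repeatedly fires any rule whose premises are all known facts,
    until a fixed point is reached."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

# hypothetical knowledge base for a procedural laboratory decision;
# the engine above is the "tool", this list is the entered knowledge
rules = [
    (frozenset({"glucose_high", "fasting_sample"}), "suspect_hyperglycemia"),
    (frozenset({"suspect_hyperglycemia"}), "recommend_hba1c"),
]
result = forward_chain({"glucose_high", "fasting_sample"}, rules)
```

Because the rules are data, the same engine can explain its reasoning by recording which rule produced each derived fact, matching the definition of an expert system given in the abstract.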
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
Analysis of Multi-State Systems with Multi-State Components Using EVMDDs
2012-05-01
Using Spreadsheet Modeling Techniques for Capital Project Review. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Kaynor, Robert K.
The value of microcomputer modeling tools and spreadsheets to help college institutional researchers analyze proposed capital projects is discussed, along with strengths and weaknesses of different software packages. Capital budgeting is the analysis that supports decisions about the allocation and commitment of funds to long-term capital…
The Case Study Approach to Teaching Languages for Business: Problems and Benefits.
ERIC Educational Resources Information Center
Grosse, Christine Uber
Business case studies, descriptions of management problems or decisions that require students to analyze and decide on an appropriate course of action, are suitable for classroom study of commercial language because the technique emphasizes situational analysis and communicative activities such as role playing. The principles underlying the case…
ERIC Educational Resources Information Center
Maxwell, James R.; Gilberti, Anthony F.; Mupinga, Davison M.
2006-01-01
This paper will study some of the problems associated with case studies and make recommendations using standard and innovative methodologies effectively. Human resource management (HRM) and resource development cases provide context for analysis and decision-making designs in different industries. In most HRM development and training courses…
To Spray or Not To Spray? A Debate Over DDT.
ERIC Educational Resources Information Center
Dinan, Frank J.; Bieron, Joseph F.
2001-01-01
Presents an activity in which students grapple with the complex issues surrounding the use of DDT to control malaria which affects millions of people in developing nations. Considers risk/benefit analysis and the pre-cautionary principle, two techniques used when making policy decisions involving the impact of science and technology on society.…
Evidence for Early Morphological Decomposition in Visual Word Recognition
ERIC Educational Resources Information Center
Solomyak, Olla; Marantz, Alec
2010-01-01
We employ a single-trial correlational MEG analysis technique to investigate early processing in the visual recognition of morphologically complex words. Three classes of affixed words were presented in a lexical decision task: free stems (e.g., taxable), bound roots (e.g., tolerable), and unique root words (e.g., vulnerable, the root of which…
Angelis, Aris; Kanavos, Panos
2017-09-01
Escalating drug prices have catalysed the generation of numerous "value frameworks" with the aim of informing payers, clinicians and patients on the assessment and appraisal process of new medicines for the purpose of coverage and treatment selection decisions. Although this is an important step towards a more inclusive Value Based Assessment (VBA) approach, aspects of these frameworks are based on weak methodologies and could potentially result in misleading recommendations or decisions. In this paper, a Multiple Criteria Decision Analysis (MCDA) methodological process, based on Multi Attribute Value Theory (MAVT), is adopted for building a multi-criteria evaluation model. A five-stage model-building process is followed, using a top-down "value-focused thinking" approach, involving literature reviews and expert consultations. A generic value tree is structured capturing decision-makers' concerns for assessing the value of new medicines in the context of Health Technology Assessment (HTA) and in alignment with decision theory. The resulting value tree (Advance Value Tree) consists of three levels of criteria (top level criteria clusters, mid-level criteria, bottom level sub-criteria or attributes) relating to five key domains that can be explicitly measured and assessed: (a) burden of disease, (b) therapeutic impact, (c) safety profile (d) innovation level and (e) socioeconomic impact. A number of MAVT modelling techniques are introduced for operationalising (i.e. estimating) the model, for scoring the alternative treatment options, assigning relative weights of importance to the criteria, and combining scores and weights. Overall, the combination of these MCDA modelling techniques for the elicitation and construction of value preferences across the generic value tree provides a new value framework (Advance Value Framework) enabling the comprehensive measurement of value in a structured and transparent way. 
Given its flexibility to meet diverse requirements and become readily adaptable across different settings, the Advance Value Framework could be offered as a decision-support tool for evaluators and payers to aid coverage and reimbursement of new medicines. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
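The paper's MAVT operationalisation involves formal preference elicitation; purely as a sketch of the additive aggregation step at the heart of such models, the following uses invented scores and weights for the five Advance Value Tree domains:

```python
def partial_value(x, worst, best):
    """Linear partial value function mapping raw performance onto a 0-100 scale."""
    return 100.0 * (x - worst) / (best - worst)

def additive_value(scores, weights):
    """MAVT additive model: overall value = sum of weight * partial value."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * s for w, s in zip(weights, scores))

# hypothetical 0-100 partial values on: burden of disease, therapeutic impact,
# safety profile, innovation level, socioeconomic impact (illustrative only)
drug_a = [70, 80, 60, 50, 40]
drug_b = [60, 65, 85, 30, 55]
weights = [0.25, 0.30, 0.25, 0.10, 0.10]
value_a = additive_value(drug_a, weights)
value_b = additive_value(drug_b, weights)
```

Real applications would elicit the partial value functions and weights from decision-makers, which is exactly the five-stage model-building process the abstract describes.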
Use of multicriteria analysis (MCA) for sustainable hydropower planning and management.
Vassoney, Erica; Mammoliti Mochet, Andrea; Comoglio, Claudio
2017-07-01
Multicriteria analysis (MCA) is a decision-making tool applied to a wide range of environmental management problems, including renewable energy planning and management. An interesting field of application of MCA is the evaluation and analysis of the conflicting aspects of hydropower (HP) exploitation, which affect the three pillars of sustainability and involve several different stakeholders. The present study was aimed at reviewing the state of the art of MCA applications to sustainable hydropower production and related decision-making problems, based on a detailed analysis of the scientific papers published over the last 15 years on this topic. The papers were analysed and compared, focusing on the specific features of the MCA methods applied in the described case studies, highlighting the general aspects of the MCA application (purpose, spatial scale, software used, stakeholders, etc.) and the specific operational/technical features of the selected MCA technique (methodology, criteria, evaluation, approach, sensitivity, etc.). Some specific limitations of the analysed case studies were identified, and a set of "quality indexes" for an exhaustive MCA application was suggested as a potential improvement to more effectively support decision-making processes in sustainable HP planning and management problems. Copyright © 2017 Elsevier Ltd. All rights reserved.
End-of-life decision making is more than rational.
Eliott, Jaklin A; Olver, Ian N
2005-01-01
Most medical models of end-of-life decision making by patients assume a rational autonomous adult obtaining and deliberating over information to arrive at some conclusion. If the patient is deemed incapable of this, family members are often nominated as substitutes, with assumptions that the family are united and rational. These are problematic assumptions. We interviewed 23 outpatients with cancer about the decision not to resuscitate a patient following cardiopulmonary arrest and examined their accounts of decision making using discourse analytical techniques. Our analysis suggests that participants access two different interpretative repertoires regarding the construct of persons, invoking a 'modernist' repertoire to assert the appropriateness of someone, a patient or family, making a decision, and a 'romanticist' repertoire when identifying either a patient or family as ineligible to make the decision. In determining the appropriateness of an individual to make decisions, participants informally apply 'Sanity' and 'Stability' tests, assessing both an inherent ability to reason (modernist repertoire) and the presence of emotion (romanticist repertoire) which might impact on the decision making process. Failure to pass the tests respectively excludes or excuses individuals from decision making. The absence of the romanticist repertoire in dominant models of patient decision making has ethical implications for policy makers and medical practitioners dealing with dying patients and their families.
Sprecher, D J; Ley, W B; Whittier, W D; Bowen, J M; Thatcher, C D; Pelzer, K D; Moore, J M
1989-07-15
A computer spreadsheet was developed to predict the economic impact of a management decision to use B-mode ultrasonographic ovine pregnancy diagnosis. The spreadsheet design and spreadsheet cell formulas are provided. The program used the partial farm budget technique to calculate net return (NR) or cash flow changes that resulted from the decision to use ultrasonography. Using the program, either simple pregnancy diagnosis or pregnancy diagnosis with the ability to determine singleton or multiple pregnancies may be compared with no flock ultrasonographic pregnancy diagnosis. A wide range of user-selected regional variables are used to calculate the cash flow changes associated with the ultrasonography decisions. A variable may be altered through a range of values to conduct a sensitivity analysis of predicted NR. Example sensitivity analyses are included for flock conception rate, veterinary ultrasound fee, and the price of corn. Variables that influence the number of cull animals and the cost of ultrasonography have the greatest impact on predicted NR. Because the determination of singleton or multiple pregnancies is more time consuming, its economic practicality in comparison with simple pregnancy diagnosis is questionable. The value of feed saved by identifying and separately feeding ewes with singleton pregnancies is not offset by the increased ultrasonography cost.
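The partial-budget logic of the spreadsheet can be sketched in a few lines of code; the flock size, feed savings, and fee figures below are invented placeholders, not the paper's regional variables:

```python
def partial_budget_net_return(added_returns, reduced_costs, added_costs, reduced_returns):
    """Partial farm budget: net return (NR) = positive changes minus negative changes."""
    return (added_returns + reduced_costs) - (added_costs + reduced_returns)

def fee_sensitivity(fee_per_ewe, flock_size, feed_saved, extra_cull_revenue):
    """One-way sensitivity analysis of NR over the ultrasound fee (hypothetical model)."""
    return partial_budget_net_return(
        added_returns=extra_cull_revenue,  # e.g. earlier sale of open ewes
        reduced_costs=feed_saved,          # separate feeding of singleton-pregnant ewes
        added_costs=fee_per_ewe * flock_size,
        reduced_returns=0.0,
    )

# sweep the veterinary fee from $1 to $5 per ewe for a 200-ewe flock
sweep = [fee_sensitivity(fee, 200, feed_saved=600.0, extra_cull_revenue=400.0)
         for fee in (1.0, 2.0, 3.0, 4.0, 5.0)]
```

Holding the other variables fixed and varying one input through a range is exactly the sensitivity-analysis pattern the spreadsheet supports for conception rate, ultrasound fee, and corn price.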
Staged decision making based on probabilistic forecasting
NASA Astrophysics Data System (ADS)
Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris
2016-04-01
Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). Under the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it is motivated by economic values alone and is relatively static (no reasoning, just a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with such situations were analysed and potentially applicable concepts were chosen. Out of this analysis, the concepts of flexibility and robustness emerged as the best fit to the existing method. Instead of taking big decisions with bigger consequences all at once, the idea is that actions and decisions are cut up into smaller pieces, and the final decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding.
The more lead-time there is in flood event management, the more damage can be reduced. With decisions based on probabilistic forecasts, partial decisions can be made earlier in time (at a lower probability) and can be scaled up or down later, when there is more certainty about whether the event will take place. Partial decisions are often cheaper, or shorten the final mitigation time at the moment when there is more certainty. The proposed method is tested on Stonehaven, on the Carron River in Scotland. Decisions to implement demountable defences in the town are currently made at very short lead-times due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation. The decision maker is able to take a lower-regret decision under higher uncertainty, with fewer associated negative consequences. Although it is not possible to quantify intangible effects, reducing them is part of the analysis. Above all, the proposed approach has been shown to be a possible improvement in economic terms and opens up possibilities for more flexible and robust decision making.
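The cost-loss rule at the core of the approach is compact enough to state directly: issue the warning when the cost/loss ratio is at most the forecast probability. A minimal sketch with illustrative numbers (and the simplifying assumption that a warning avoids the full loss):

```python
def should_warn(cost, loss, prob):
    """Cost-loss decision rule: warn iff mitigation cost / avoidable loss <= P(event)."""
    return cost / loss <= prob

def expected_expense(cost, loss, prob):
    """Expected expense of the better of the two actions:
    warn and pay the fixed cost, or don't warn and expect prob * loss."""
    return min(cost, prob * loss)
```

Staged decision making, as proposed in the abstract, applies this comparison repeatedly as lead-time shrinks and the forecast probability sharpens, committing only the cheap partial measures early on.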
GIS Based Multi-Criteria Decision Analysis For Cement Plant Site Selection For Cuddalore District
NASA Astrophysics Data System (ADS)
Chhabra, A.
2015-12-01
India's cement industry is a vital part of its economy, providing employment to more than a million people. On the back of growing demand due to increased construction and infrastructural activity, the cement market in India is expected to grow at a compound annual growth rate (CAGR) of 8.96 percent during the period 2014-2019. In this study, GIS-based spatial Multi Criteria Decision Analysis (MCDA) is used to determine the optimum and alternative sites for setting up a cement plant. This technique uses a set of evaluation criteria which are quantifiable indicators of the extent to which decision objectives are realized. In combination with available GIS (Geographical Information System) and local ancillary data, the outputs of image analysis serve as input for the multi-criteria decision-making system. The criteria were represented as GIS layers, which then underwent GIS analysis to identify several potential sites. Satellite imagery from LANDSAT 8 and ASTER DEM were used for the analysis. Cuddalore District in Tamil Nadu was selected as the study site because limestone mining is already being carried out in that region, which meets the raw-material criterion for cement production. Other criteria considered were land use land cover (LULC) classification (built-up area, river, forest cover, wet land, barren land, harvest land and agriculture land), slope, and proximity to road, railway and drainage networks.
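The GIS-MCDA step that combines criterion layers can be sketched as a weighted overlay on small raster grids. The 2x2 suitability scores, weights, and layer names below are invented; real layers would come from the LANDSAT/ASTER-derived criteria listed above:

```python
def weighted_overlay(layers, weights, exclude=None):
    """Combine equally shaped rasters of 0-1 suitability scores into one
    suitability surface. Cells where exclude is True (e.g. built-up area
    or river) are forced to 0, i.e. ruled out entirely."""
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for layer, w in zip(layers, weights):
        for i in range(rows):
            for j in range(cols):
                out[i][j] += w * layer[i][j]
    if exclude:
        for i in range(rows):
            for j in range(cols):
                if exclude[i][j]:
                    out[i][j] = 0.0
    return out

# hypothetical criterion layers: proximity to limestone, proximity to roads
limestone = [[0.9, 0.5], [0.4, 0.8]]
roads     = [[0.6, 0.7], [0.9, 0.2]]
builtup   = [[False, True], [False, False]]  # constraint mask
suit = weighted_overlay([limestone, roads], [0.7, 0.3], exclude=builtup)
```

Candidate sites are then the cells with the highest combined scores, with the constraint mask playing the role of the exclusion criteria (built-up area, river, forest cover) in the study.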
da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Senna, Kátia Marie Simões e.; Tura, Bernardo Rangel; Goulart, Marcelo Correia
2014-01-01
Objectives The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to a septal percutaneous implant. Methods A decision-analytic model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature, and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion at each branch was taken as the effectiveness outcome. Direct medical costs and probabilities for each event were inserted in the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. Results The decision model demonstrated that the percutaneous implant was the more cost-effective option, at a cost of US$8,936.34, with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. Conclusions The proposed decision model seeks to fill a void in the academic literature. The model includes the outcomes with the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces the physical and psychological distress to patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation. PMID:25302806
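The decision-tree arithmetic underlying such a model reduces to probability-weighted costs and effects per branch, followed by an incremental comparison. The probabilities and costs below are invented placeholders for illustration, not the study's Brazilian data:

```python
def branch_expectations(branches):
    """branches: list of (probability, cost, effect); probabilities sum to 1.
    Returns the expected cost and expected effect of the strategy."""
    assert abs(sum(p for p, _, _ in branches) - 1.0) < 1e-9
    cost = sum(p * c for p, c, _ in branches)
    effect = sum(p * e for p, _, e in branches)
    return cost, effect

def icer(cost_new, eff_new, cost_old, eff_old):
    """Incremental cost-effectiveness ratio of the new strategy vs the old."""
    return (cost_new - cost_old) / (eff_new - eff_old)

# effect = 1 if open surgery is avoided, 0 otherwise (hypothetical numbers):
# implant succeeds with p=0.93; on failure, surgery is added on top of implant cost
implant = [(0.93, 9000.0, 1.0), (0.07, 9000.0 + 12000.0, 0.0)]
surgery = [(1.0, 12000.0, 0.0)]
c_i, e_i = branch_expectations(implant)
c_s, e_s = branch_expectations(surgery)
```

Under these assumed numbers the implant strategy is cheaper and more effective (it "dominates"), so the ICER is negative; a one-way sensitivity analysis would re-run this with the implant price varied, as the study did.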
Development and initial evaluation of a treatment decision dashboard.
Dolan, James G; Veazie, Peter J; Russ, Ann J
2013-04-21
For many healthcare decisions, multiple alternatives are available with different combinations of advantages and disadvantages across several important dimensions. The complexity of current healthcare decisions thus presents a significant barrier to informed decision making, a key element of patient-centered care. Interactive decision dashboards were developed to facilitate decision making in management, a field marked by similarly complicated choices. These dashboards utilize data visualization techniques to reduce the cognitive effort needed to evaluate decision alternatives, and a non-linear flow of information that enables users to review information in a self-directed fashion. Theoretically, both of these features should facilitate informed decision making by increasing user engagement with, and understanding of, the decision at hand. We sought to determine whether the interactive decision dashboard format can be successfully adapted to create a clinically realistic prototype patient decision aid suitable for further evaluation and refinement. We created a computerized, interactive clinical decision dashboard and performed a pilot test of its clinical feasibility and acceptability using a multi-method analysis. The dashboard summarized information about the effectiveness, risks of side effects and drug-drug interactions, out-of-pocket costs, and ease of use of nine analgesic treatment options for knee osteoarthritis. Outcome evaluations included observations of how study participants utilized the dashboard, questionnaires to assess usability, acceptability, and decisional conflict, and an open-ended qualitative analysis. The study sample consisted of 25 volunteers - 7 men and 18 women - with an average age of 51 years. The mean time spent interacting with the dashboard was 4.6 minutes.
Mean evaluation scores on scales ranging from 1 (low) to 7 (high) were: mechanical ease of use 6.1, cognitive ease of use 6.2, emotional difficulty 2.7, decision-aiding effectiveness 5.9, clarification of values 6.5, reduction in decisional uncertainty 6.1, and provision of decision-related information 6.0. Qualitative findings were similarly positive. Interactive decision dashboards can be adapted for clinical use and have the potential to foster informed decision making. Additional research is warranted to more rigorously test the effectiveness and efficiency of patient decision dashboards for supporting informed decision making and other aspects of patient-centered care, including shared decision making.
Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle
2016-02-01
At the hospital level, decisions about purchasing new and oftentimes expensive medical devices must take into account multiple criteria simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, of which a notable example is the Matrix4value model. To develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals - known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria and weights for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.
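As a hedged illustration of deriving criterion weights from pairwise comparisons, the following uses the geometric-mean method familiar from AHP-style analysis; this is not necessarily the elicitation technique used for the IDA tool, and the 3x3 judgement matrix is invented:

```python
import math

def weights_from_pairwise(M):
    """Geometric-mean (AHP-style) weights from a reciprocal pairwise matrix,
    where M[i][j] expresses how many times more important criterion i is
    than criterion j (so M[j][i] == 1 / M[i][j])."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in M]
    total = sum(gms)
    return [g / total for g in gms]

# invented pairwise judgements over three assessment criteria
M = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]
w = weights_from_pairwise(M)
```

Normalizing the row geometric means yields weights summing to 1, which can then feed a risk/value scoring grid of the Matrix4value kind described in the abstract.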
Kovshoff, Hanna; Williams, Sarah; Vrijens, May; Danckaerts, Marina; Thompson, Margaret; Yardley, Lucy; Hodgkins, Paul; Sonuga-Barke, Edmund J S
2012-02-01
Clinical decision making is influenced by a range of factors and constitutes an inherently complex task. Here we present results from the decisions regarding ADHD management (DRAMa) study in which we undertook a thematic analysis of clinicians' experiences and attitudes to assessment, diagnosis and treatment of ADHD. Fifty prescribing child psychiatrists and paediatricians from Belgium and the UK took part in semi-structured interviews about their decisions regarding the assessment, diagnosis and treatment of ADHD. Interviews were transcribed and processed using thematic analysis and the principles of grounded theory. Clinicians described the assessment and diagnostic process as inherently complicated and requiring time and experience to piece together the accounts of children made by multiple sources and through the use of varying information gathering techniques. Treatment decisions were viewed as a shared process between families, children, and the clinician. Published guidelines were viewed as vague, and few clinicians spoke about the use of symptom thresholds or specific impairment criteria. Furthermore, systematic or operationalised criteria to assess treatment outcomes were rarely used. Decision making in ADHD is regarded as a complicated, time consuming process which requires extensive use of clinical impression, and involves a partnership with parents. Clinicians want to separate biological from environmental causal factors to understand the level of impairment and the subsequent need for a diagnosis of ADHD. Clinical guidelines would benefit from revisions to take into account the real-world complexities of clinical decision making for ADHD.
Caird, Jeff K; Edwards, Christopher J; Creaser, Janet I; Horrey, William J
2005-01-01
A modified version of the flicker technique to induce change blindness was used to examine the effects of time constraints on decision-making accuracy at intersections on a total of 62 young (18-25 years), middle-aged (26-64 years), young-old (65-73 years), and old-old (74+ years) drivers. Thirty-six intersection photographs were manipulated so that one object (i.e., pedestrian, vehicle, sign, or traffic control device) in the scene would change when the images were alternated for either 5 or 8 s using the modified flicker method. Young and middle-aged drivers made significantly more correct decisions than did young-old and old-old drivers. Logistic regression analysis of the data indicated that age and/or time were significant predictors of decision performance in 14 of the 36 intersections. Actual or potential applications of this research include driving assessment and crash investigation.
Spatial decision support system for tobacco enterprise based on spatial data mining
NASA Astrophysics Data System (ADS)
Mei, Xin; Liu, Junyi; Zhang, Xuexia; Cui, Weihong
2007-11-01
The tobacco enterprise is a special kind of enterprise with a strong correlation to regional geography. In past research and applications, however, the combination of tobacco and GIS has been limited to using digital maps to assist cigarette distribution. How to comprehensively apply 3S techniques and spatial data mining (SDM) to construct a spatial decision support system (SDSS) for a tobacco enterprise is the main research aspect of this paper. The paper begins by concretely analyzing the GIS requirements of a tobacco enterprise for planning production locations, monitoring production management, and managing product sales. A holistic solution is then presented, and a framework design for tobacco enterprise spatial decision making based on SDM is given. The paper describes how spatial analysis and data mining can be used to realize spatial decision processes such as monitoring tobacco planted acreage and analyzing and planning the cigarette sale network.
The analysis of rapidly developing fog at the Kennedy Space Center
NASA Technical Reports Server (NTRS)
Wheeler, Mark M.; Atchison, Michael K.; Schumann, Robin; Taylor, Greg E.; Yersavich, Ann; Warburton, John D.
1994-01-01
This report documents fog precursors and fog climatology at Kennedy Space Center (KSC), Florida, from 1986 to 1990. The major emphasis of this report is on rapidly developing fog events that would affect the less-than-7-statute-mile visibility rule for End-Of-Mission (EOM) Shuttle landing at KSC (Rule 4-64(A)). The Applied Meteorology Unit's (AMU's) work is to: develop a database for the study of fog-associated weather conditions relating to violations of this landing constraint; develop forecast techniques or rules-of-thumb to determine whether or not current conditions are likely to result in an acceptable condition at landing; validate the forecast techniques; and transition the techniques to operational use. As part of the analysis, the fog events were categorized as either advection, pre-frontal or radiation. As a result of these analyses, the AMU developed a fog climatological database, identified fog precursors and developed forecaster tools and decision trees. The fog climatological analysis indicates that during the fog season (October to April) there is a higher risk of a visibility violation at KSC during the early morning hours (0700 to 1200 UTC), while 95 percent of all fog events have dissipated by 1600 UTC. Most fog events (92 percent) were characterized by a westerly component in the surface wind at KSC, and in 83 percent of events fog developed west of KSC first (up to 2 hours earlier). The AMU developed fog decision trees and forecaster tools that help the forecaster identify fog precursors up to 12 hours in advance. Using the decision trees as process tools ensures the important meteorological data are not overlooked in the forecast process. With these tools and a better understanding of fog formation in the local KSC area, the Shuttle weather support forecaster should be able to give the Launch and Flight Directors a better KSC fog forecast with more confidence.
How to Assess the Value of Medicines?
Simoens, Steven
2010-01-01
This study aims to discuss approaches to assessing the value of medicines. Economic evaluation assesses value by means of the incremental cost-effectiveness ratio (ICER). Health is maximized by selecting medicines with increasing ICERs until the budget is exhausted. The budget size determines the value of the threshold ICER and vice versa. Alternatively, the threshold value can be inferred from pricing/reimbursement decisions, although such values vary between countries. Threshold values derived from the value-of-life literature depend on the technique used. The World Health Organization has proposed a threshold value tied to the national GDP. As decision makers may wish to consider multiple criteria, variable threshold values and weighted ICERs have been suggested. Other approaches (i.e., replacement approach, program budgeting and marginal analysis) have focused on improving resource allocation, rather than maximizing health subject to a budget constraint. Alternatively, the generalized optimization framework and multi-criteria decision analysis make it possible to consider other criteria in addition to value. PMID:21607066
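The ICER mechanics described above can be sketched in a few lines. The costs, effects, and willingness-to-pay threshold below are invented for illustration only; they are not values from the study.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical figures: a new medicine costing 12000 vs. standard care at 7000,
# yielding 6.2 vs. 6.0 QALYs respectively.
ratio = icer(12000, 6.2, 7000, 6.0)
threshold = 30000  # assumed willingness-to-pay per QALY
print(round(ratio))        # 25000
print(ratio <= threshold)  # True -> fund under this threshold
```

Selecting medicines in order of increasing ICER until the budget is exhausted, as the abstract describes, amounts to repeatedly applying this comparison against the threshold implied by the budget.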
Green material selection for sustainability: A hybrid MCDM approach.
Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng
2017-01-01
Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria involved in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern among the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines the decision-making trial and evaluation laboratory (DEMATEL), the analytic network process (ANP), grey relational analysis (GRA) and the technique for order preference by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application to rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection.
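The full hybrid approach chains DEMATEL, ANP, GRA and TOPSIS; the TOPSIS step alone, which ranks alternatives by closeness to an ideal solution, can be sketched as below. The materials, criteria and weights are made-up illustrations, not data from the study.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by TOPSIS closeness to the ideal solution.
    matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if larger is better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each criterion column, then apply criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal (best) and anti-ideal (worst) values per criterion.
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# Hypothetical materials scored on cost (lower is better) and recyclability (higher is better)
materials = [[200, 0.8], [150, 0.6], [300, 0.9]]
closeness = topsis(materials, weights=[0.5, 0.5], benefit=[False, True])
best = max(range(len(materials)), key=lambda i: closeness[i])
print(best)  # 0
```

The closeness coefficient is the anti-ideal distance divided by the sum of both distances, so an alternative scoring 1.0 coincides with the ideal point.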
The application of data mining techniques to oral cancer prognosis.
Tseng, Wan-Ting; Chiang, Wei-Fan; Liu, Shyun-Yeu; Roan, Jinsheng; Lin, Chun-Nan
2015-05-01
This study adopted an integrated procedure that combines the clustering and classification features of data mining technology to determine the differences between the symptoms shown in past cases where patients died from or survived oral cancer. Two data mining tools, namely the decision tree and the artificial neural network, were used to analyze the historical cases of oral cancer, and their performance was compared with that of logistic regression, the popular statistical analysis tool. Both the decision tree and artificial neural network models showed superiority over the traditional statistical model. However, for clinicians, the trees created by the decision tree models are relatively easier to interpret than the artificial neural network models. Cluster analysis also revealed that stage 4 patients who possess the following four characteristics have an extremely low survival rate: pN is N2b, the level of RLNM is level I-III, AJCC-T is T4, and the cell mutation grade (G) is moderate.
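The interpretability advantage of decision trees comes from their explicit split rules. A minimal sketch of how a tree chooses one split by minimising Gini impurity is shown below; the toy features and labels are invented for illustration and are not the study's clinical data.

```python
def gini(labels):
    """Gini impurity of a label list: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(X, y):
    """Find the (feature, threshold) split minimising weighted Gini impurity."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[j] <= t]
            right = [y[i] for i, row in enumerate(X) if row[j] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, j, t)
    return best

# Hypothetical toy data: [tumour_stage, node_level] -> survived (1) or not (0)
X = [[1, 1], [2, 1], [4, 3], [4, 2], [3, 1], [4, 3]]
y = [1, 1, 0, 0, 1, 0]
score, feature, threshold = best_split(X, y)
print(feature, threshold)  # 0 3  (split on stage <= 3 separates the toy classes)
```

Applied recursively to each resulting subset, this greedy split search is the core of CART-style tree induction; the resulting rule ("stage <= 3") is the kind of statement a clinician can read directly.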
Elumalai, Vetrimurugan; Brindha, K; Sithole, Bongani; Lakshmanan, Elango
2017-04-01
Mapping groundwater contaminants and identifying their sources are the initial steps in pollution control and mitigation. Due to the availability of different mapping methods and the large number of emerging pollutants, these methods need to be used together in decision making. The present study aims to map the contaminated areas in Richards Bay, South Africa and compare the results of ordinary kriging (OK) and inverse distance weighted (IDW) interpolation techniques. Statistical methods were also used for identifying contamination sources. The Na-Cl groundwater type was dominant, followed by Ca-Mg-Cl. Data analysis indicates that silicate weathering, ion exchange and fresh water-seawater mixing are the major geochemical processes controlling the presence of major ions in groundwater. Factor analysis helped to confirm these results. Overlay analysis by OK and IDW gave different results: areas where groundwater was unsuitable as a drinking source were 419 and 116 km², respectively. Such divergent results make decision making difficult if only one method were to be used. Three highly contaminated zones within the study area were more accurately identified by OK. If large areas are identified as contaminated, as by IDW in this study, the mitigation measures will be expensive; if the contaminated area is underestimated, management measures, even if taken, will not remain effective for long. Using multiple techniques together, as in this study, helps avoid such misjudgments. Overall, the groundwater quality in this area was poor, and it is essential to identify an alternate drinking water source or treat the groundwater before ingestion.
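Of the two interpolators compared, IDW is the simpler: each unsampled location is estimated as a distance-weighted average of the sampled values (kriging additionally models spatial covariance and is not sketched here). The well coordinates and concentrations below are invented for illustration.

```python
def idw(points, target, power=2):
    """Inverse distance weighted estimate at `target` from (x, y, value) samples."""
    num = den = 0.0
    for x, y, value in points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return value  # exact hit on a sample point
        w = 1.0 / d2 ** (power / 2)  # weight decays as 1 / distance^power
        num += w * value
        den += w
    return num / den

# Hypothetical chloride concentrations (mg/L) at four monitoring wells
wells = [(0, 0, 250.0), (0, 2, 300.0), (2, 0, 180.0), (2, 2, 220.0)]
print(round(idw(wells, (1, 1)), 1))  # 237.5 (equidistant wells -> plain average)
```

Evaluating this estimate on a grid and thresholding it against a drinking-water limit is what produces the kind of "suitable/unsuitable area" overlay the abstract compares between OK and IDW.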
Maimoun, Mousa; Madani, Kaveh; Reinhart, Debra
2016-04-15
Historically, the U.S. waste collection fleet was dominated by diesel-fueled waste collection vehicles (WCVs); the growing need for sustainable waste collection has urged decision makers to incorporate economically efficient alternative fuels while mitigating environmental impacts. The pros and cons of alternative fuels complicate the decision-making process, calling for a comprehensive study that assesses the multiple factors involved. Multi-criteria decision analysis (MCDA) methods allow decision makers to select the best alternatives with respect to selection criteria. In this study, two MCDA methods, the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW), were used to rank fuel alternatives for the U.S. waste collection industry with respect to a multi-level environmental and financial decision matrix. The environmental criteria consisted of life-cycle emissions, tail-pipe emissions, water footprint (WFP), and power density, while the financial criteria comprised vehicle cost, fuel price, fuel price stability, and fueling station availability. The overall analysis showed that conventional diesel is still the best option, followed by hydraulic-hybrid WCVs, landfill gas (LFG) sourced natural gas, fossil natural gas, and biodiesel. The elimination of the WFP and power density criteria from the environmental criteria ranked biodiesel 100 (BD100) as an environmentally better alternative compared to other fossil fuels (diesel and natural gas). This result showed that considering the WFP and power density as environmental criteria can make a difference in the decision process. The elimination of the fueling station and fuel price stability criteria from the decision matrix ranked fossil natural gas second after LFG-sourced natural gas. This scenario was found to represent the status quo of the waste collection industry.
A sensitivity analysis for the status quo scenario showed the overall ranking of diesel and fossil natural gas to be more sensitive to changing fuel prices as compared to other alternatives.
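Of the two MCDA methods used, SAW is the simpler to sketch: normalise each criterion, then take a weighted sum. The fuel names, scores and weights below are made up for illustration and do not reproduce the study's decision matrix.

```python
def saw(matrix, weights, benefit):
    """Simple Additive Weighting: normalise each criterion, then weighted sum.
    Benefit criteria use value/max; cost criteria use min/value."""
    n = len(matrix[0])
    cols = list(zip(*matrix))
    scores = []
    for row in matrix:
        s = 0.0
        for j in range(n):
            if benefit[j]:
                s += weights[j] * row[j] / max(cols[j])
            else:
                s += weights[j] * min(cols[j]) / row[j]
        scores.append(s)
    return scores

# Hypothetical fuels scored on life-cycle emissions (lower better)
# and fueling-station availability (higher better)
fuels = ["diesel", "natural gas", "biodiesel"]
data = [[100, 0.9], [80, 0.5], [60, 0.3]]
scores = saw(data, weights=[0.6, 0.4], benefit=[False, True])
print(fuels[max(range(3), key=lambda i: scores[i])])  # diesel (under these toy numbers)
```

Dropping a criterion, as the abstract's elimination scenarios do, simply means removing its column and weight and re-running the ranking; sensitivity analysis perturbs the weights or scores and checks whether the order changes.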
Dierker, Lisa; Rose, Jennifer; Tan, Xianming; Li, Runze
2010-12-01
This paper describes and compares a selection of available modeling techniques for identifying homogeneous population subgroups in the interest of informing targeted substance use intervention. We present a nontechnical review of the common and unique features of three methods: (a) trajectory analysis, (b) functional hierarchical linear modeling (FHLM), and (c) decision tree methods. Differences among the techniques are described, including required data features, strengths and limitations in terms of the flexibility with which outcomes and predictors can be modeled, and the potential of each technique for helping to inform the selection of targets and timing of substance intervention programs.
Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija
2017-12-02
Developing risk models in medicine is appealing, but it is also associated with many obstacles across the different aspects of predictive model development. Initially, the association of one or more biomarkers with a specific outcome was proven by statistical significance, but novel and demanding questions required the development of new and more complex statistical techniques. The progress of statistical analysis in biomedical research is best observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on a combination of several metrics. When using logistic regression or Cox proportional hazards regression analysis, the calibration test and ROC curve analysis should be mandatory and eliminatory, with a central place taken by some newer statistical techniques. In order to obtain complete information about a new marker in the model, the recent recommendation is to use reclassification tables, calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that customizing and fine-tuning the Framingham risk score initiated much of this development in statistical analysis. A clinically applicable predictive model should be a trade-off between all the abovementioned statistical metrics: a trade-off between calibration and discrimination, accuracy and decision-making, costs and benefits, and the quality and quantity of the patient's life.
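The categorical net reclassification index mentioned above rewards a new marker for moving events up and non-events down in risk category. A minimal sketch follows; the risk categories and outcomes are invented for illustration.

```python
def net_reclassification_index(old_risk, new_risk, events):
    """Categorical NRI: (net upward moves among events) / n_events
    + (net downward moves among non-events) / n_nonevents."""
    up_e = down_e = n_e = up_ne = down_ne = n_ne = 0
    for old, new, event in zip(old_risk, new_risk, events):
        if event:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Hypothetical risk categories (0=low, 1=medium, 2=high) before/after adding a marker
old = [0, 1, 1, 2, 0, 1]
new = [1, 2, 1, 2, 0, 0]
evt = [1, 1, 0, 1, 0, 0]
print(round(net_reclassification_index(old, new, evt), 3))  # 1.0
```

The NRI ranges from -2 to 2; positive values mean the new marker reclassifies patients in the clinically correct direction on balance.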
NASA Astrophysics Data System (ADS)
Chen, Ting-Yu
2012-06-01
This article presents a useful method for relating anchor dependency and accuracy functions to multiple attribute decision-making (MADM) problems in the context of Atanassov intuitionistic fuzzy sets (A-IFSs). Considering anchored judgement with displaced ideals and solution precision with minimal hesitation, several auxiliary optimisation models have been proposed to obtain the optimal weights of the attributes and to acquire the corresponding TOPSIS (the technique for order preference by similarity to the ideal solution) index for alternative rankings. Aside from the TOPSIS index, as a decision-maker's personal characteristics and own perception of self may also influence the direction in the axiom of choice, the evaluation of alternatives is conducted based on distances of each alternative from the positive and negative ideal alternatives, respectively. This article originates from Li's [Li, D.-F. (2005), 'Multiattribute Decision Making Models and Methods Using Intuitionistic Fuzzy Sets', Journal of Computer and System Sciences, 70, 73-85] work, which is a seminal study of intuitionistic fuzzy decision analysis using deduced auxiliary programming models, and deems it a benchmark method for comparative studies on anchor dependency and accuracy functions. The feasibility and effectiveness of the proposed methods are illustrated by a numerical example. Finally, a comparative analysis is presented with computational experiments on averaging accuracy functions, TOPSIS indices, separation measures from positive and negative ideal alternatives, consistency rates of ranking orders, contradiction rates of the top alternative and average Spearman correlation coefficients.
Dranitsaris, George; Leung, Pauline
2004-01-01
Decision analysis is commonly used to perform economic evaluations of new pharmaceuticals. The outcomes of such studies are often reported as an incremental cost per quality-adjusted life year (QALY) gained with the new agent. Decision analysis can also be used in the context of estimating drug cost before market entry. The current study used neurokinin-1 (NK-1) receptor antagonists, a new class of antiemetics for cancer patients, as an example to illustrate the process, using an incremental cost of Can$20,000 per QALY gained as the target threshold. A decision model was developed to simulate the control of acute and delayed emesis after cisplatin-based chemotherapy. The model compared standard therapy with granisetron and dexamethasone to the same protocol with the addition of an NK-1 before chemotherapy, continued twice daily for five days. The rates of complete emesis control were abstracted from a double-blind randomized trial. Costs of standard antiemetics and therapy for breakthrough vomiting were obtained from hospital sources. Utility estimates, characterized as quality-adjusted emesis-free days, were determined by interviewing twenty-five oncology nurses and pharmacists using the Time Trade-Off technique. These data were then used to estimate the unit cost of the new antiemetic using a target threshold of Can$20,000 per QALY gained. A cost of Can$6.60 per NK-1 dose would generate an incremental cost of Can$20,000 per QALY. The sensitivity analysis on the unit cost identified a range from Can$4.80 to Can$10.00 per dose. For the recommended five days of therapy, the total cost should be Can$66.00 (Can$48.00-Can$100.00) for optimal economic efficiency relative to Canada's publicly funded health-care system. The use of decision modeling for estimating drug cost before product launch is a powerful technique to ensure value for money.
Such information can be of value to both drug manufacturers and formulary committees, because it would facilitate negotiations for optimal pricing in a given jurisdiction.
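The price-setting step inverts the usual ICER calculation: given the threshold and the model's estimated QALY gain, solve for the unit price. The QALY gain and dose count below are assumptions chosen so the arithmetic lands near the abstract's Can$6.60 figure; the study's actual model inputs are not reported here.

```python
def max_price_per_dose(threshold, qaly_gain, doses, other_incremental_cost=0.0):
    """Back out the highest unit price keeping the ICER at the target threshold:
    threshold = (doses * price + other_incremental_cost) / qaly_gain."""
    return (threshold * qaly_gain - other_incremental_cost) / doses

# Assumed inputs loosely mirroring the antiemetic example:
# 10 doses over five days (twice daily) and a QALY gain of 0.0033 per patient.
price = max_price_per_dose(threshold=20000, qaly_gain=0.0033, doses=10)
print(round(price, 2))  # 6.6
```

Re-running the same inversion over a plausible range of QALY gains (or adding offsetting costs of breakthrough vomiting) is what generates the sensitivity range on the unit price.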
Influences on corporate executive decision behavior in government acquisitions
NASA Technical Reports Server (NTRS)
Wetherington, J. R.
1986-01-01
This paper presents extensive exploratory research whose primary objective was the discovery and determination of major areas of concern exhibited by U.S. corporate executives in the preparation and submittal of proposals and bids to the Federal government. The existence of numerous unique concerns inherent in corporate strategies within the government market environment was established. The relationships of these concerns to each other were determined using statistical factor analysis techniques, resulting in the identification of major groupings of management concerns. Finally, using analysis of variance, the interrelationships of the factors with corporate demographics were analyzed. The existence of separate and distinct concerns exhibited by corporate executives when contemplating sales and operations in the government marketplace was established. It was also demonstrated that quantifiable relationships exist between such variables and that the decision behavior exhibited by the responsible executives is interrelated with their company's demographics.
NASA Technical Reports Server (NTRS)
Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam
2013-01-01
The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform realtime closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions; its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content in relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and give the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft-decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems).
The SDA continuously monitors correlation results to characterize slips and quadrant change and is capable of performing analysis even when the receiver under test is subjected to conditions where its performance degrades to high error rates (30 percent or beyond). The design incorporates a number of features, such as watchdog triggers that permit the SDA system to recover from large receiver upsets automatically and continue accumulating performance analysis unaided by operator intervention. This accommodates tests that can last in the order of days in order to gain statistical confidence in results and is also useful for capturing snapshots of rare events.
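The correlation-based slip detection can be illustrated with a simplified hard-decision analogue: slide the received stream against the reference over a small offset window and pick the offset with the strongest agreement. The SDA itself correlates multi-bit soft decisions (power or modified Massey correlation); the bit patterns below are invented for illustration.

```python
def slip_offset(reference, received, max_slip=4):
    """Locate a bit slip by correlating the received stream against the
    reference at offsets within +/- max_slip bits (hard-decision sketch)."""
    n = len(reference)
    best_offset, best_score = 0, float("-inf")
    for offset in range(-max_slip, max_slip + 1):
        score = 0
        for i in range(n):
            j = i + offset
            if 0 <= j < len(received):
                # +1 for a match, -1 for a mismatch (antipodal symbols)
                score += 1 if reference[i] == received[j] else -1
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset

# Hypothetical reference pattern; the receiver has slipped by 2 bits
ref = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
rx = [0, 0] + ref[:-2]  # two junk bits prepended, stream delayed by 2
print(slip_offset(ref, rx))  # 2
```

Running this window continuously over the live stream, as the SDA does over 16- to 512-sample correlation depths, lets slips be time-stamped and correlated with frame and codeword loss events.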
Guidi, G; Pettenati, M C; Miniati, R; Iadanza, E
2012-01-01
In this paper we describe a Heart Failure analysis Dashboard that, combined with a handy device for the automatic acquisition of a set of the patient's clinical parameters, supports telemonitoring functions. The Dashboard's intelligent core is a Computer Decision Support System designed to assist the clinical decisions of non-specialist caring personnel, and it is based on three functional parts: Diagnosis, Prognosis, and Follow-up management. Four Artificial Intelligence-based techniques are compared for providing the diagnosis function: a Neural Network, a Support Vector Machine, a Classification Tree and a Fuzzy Expert System whose rules are produced by a Genetic Algorithm. State-of-the-art algorithms are used to support a score-based prognosis function. The patient's Follow-up is used to refine the diagnosis.
EVMDD-Based Analysis and Diagnosis Methods of Multi-State Systems with Multi-State Components
2014-01-01
More Money for More Opportunity: Financial Support of Community College Systems.
ERIC Educational Resources Information Center
Wattenbarger, James L.; Cage, Bob N.
The main thesis presented is that the need for state-level planning for the community college requires increased state-level financial support, yet, at the same time essential local autonomy must be preserved. Areas such as cost-analysis and program-budgeting techniques that govern state support, the way cost-benefit decisions affect the ideal of…
How Do the Different Types of Computer Use Affect Math Achievement?
ERIC Educational Resources Information Center
Flores, Raymond; Inan, Fethi; Lin, Zhangxi
2013-01-01
In this study, the National Educational Longitudinal Study (ELS:2002) dataset was used and a predictive data mining technique, decision tree analysis, was implemented in order to examine which factors, in conjunction to computer use, can be used to predict high or low probability of success in high school mathematics. Specifically, this study…
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
Acquisition and production of skilled behavior in dynamic decision-making tasks
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1990-01-01
Ongoing research investigating perceptual and contextual influences on skilled human performance in dynamic decision making environments is discussed. The research is motivated by two general classes of findings in recent decision making research. First, many studies suggest that the concrete context in which a task is presented has strong influences on the psychological processes used to perform the task and on subsequent performance. Second, studies of skilled behavior in a wide variety of task environments typically implicate the perceptual system as an important contributor to decision-making performance, either in its role as a mediator between the current decision context and stored knowledge, or as a mechanism capable of directly initiating activity through the development of a 'trained eye.' Both contextual and perceptual influences place limits on the ability of traditional utility-theoretic accounts of decision-making to guide display design, as variance in behavior due to contextual factors or the development of a perceptual skill is left unexplained. The author outlines a framework in which to view questions of perceptual and contextual influences on behavior and describes an experimental task and analysis technique which will be used to diagnose the possible role of perception in skilled decision-making performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Still, C.M.
1996-12-01
The primary waste management alternatives are source reduction, recycling, composting, incineration, and landfilling. Often waste management policies are based entirely on technical considerations and ignore that actual disposal practices depend on individuals' attitudes and behaviors. This research formulated a decision analysis model that incorporates social value measures to determine the waste management strategy that maximizes the individuals' willingness to participate. The social values that are important and that were considered in the decision support model to assist with making decisions about solid waste management were convenience, feeling good about reducing waste, feeling good about leaving a good environment for future generations, and the value of recreation programs that can be provided with profit from a recycling program.
NASA Astrophysics Data System (ADS)
Bhattacharyya, Sidhakam; Bandyopadhyay, Gautam
2010-10-01
The council of most Urban Local Bodies (ULBs) has limited scope for decision making in the absence of an appropriate financial control mechanism. Information about the expected amount of own fund during a particular period is of great importance for decision making. Therefore, in this paper we present a set of findings and establish models for estimating receipts from own sources and the payments thereof using multiple regression analysis. Data for sixty months from a reputed ULB in West Bengal have been considered for ascertaining the regression models. This can be used as part of a financial management and control procedure by the council to estimate the effect on own fund. In our study we have considered two models using multiple regression analysis. "Model I" comprises total adjusted receipts as the dependent variable and selected individual receipts as the independent variables. Similarly, "Model II" consists of total adjusted payments as the dependent variable and selected individual payments as independent variables. The resultant of Model I and Model II is the surplus or deficit affecting own fund. This may be applied for decision-making purposes by the council.
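Each model above is an ordinary least squares fit of a total against selected component series. A minimal sketch via the normal equations follows; the receipt heads and figures are invented (and constructed to satisfy an exact linear relation so the recovered coefficients are obvious), not the ULB's data.

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination; each row of X starts with 1 for the intercept."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):  # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Hypothetical monthly data: [1, property_tax, licence_fees] predicting
# total adjusted receipts, generated from an exact linear relation.
rows = [[1, 10, 4], [1, 12, 5], [1, 9, 3], [1, 15, 6], [1, 11, 5]]
total = [2 + 3 * r[1] + 5 * r[2] for r in rows]
print([round(c, 6) for c in ols(rows, total)])  # [2.0, 3.0, 5.0]
```

The fitted coefficients then let the council project total receipts (Model I) and payments (Model II) from expected component values, and the difference of the two projections estimates the surplus or deficit on own fund.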
Engaging stakeholders for adaptive management using structured decision analysis
Irwin, Elise R.; Kennedy, Kathryn D. Mickett
2009-01-01
Adaptive management is different from other types of management in that it includes all stakeholders (versus only policy makers) in the process, uses resource optimization techniques to evaluate competing objectives, and recognizes and attempts to reduce uncertainty inherent in natural resource systems. Management actions are negotiated by stakeholders, monitored results are compared to predictions of how the system should respond, and management strategies are adjusted in a “monitor-compare-adjust” iterative routine. Many adaptive management projects fail because of the lack of stakeholder identification, engagement, and continued involvement. Primary reasons for this vary but are usually related to either stakeholders not having ownership (or representation) in decision processes or disenfranchisement of stakeholders after adaptive management begins. We present an example in which stakeholders participated fully in adaptive management of a southeastern regulated river. Structured decision analysis was used to define management objectives and stakeholder values and to determine initial flow prescriptions. The process was transparent, and the visual nature of the modeling software allowed stakeholders to see how their interests and values were represented in the decision process. The development of a stakeholder governance structure and communication mechanism has been critical to the success of the project.
2012-01-01
Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). 
Results The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. Conclusions The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint. PMID:22962944
Adrion, Christine; Mansmann, Ulrich
2012-09-10
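The model-comparison idea above, scoring competing count models with the mean logarithmic score, can be illustrated outside the INLA machinery. The sketch below uses simple in-sample maximum-likelihood and method-of-moments fits on synthetic overdispersed counts, which is far cruder than the leave-one-out predictive scores the paper computes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Overdispersed counts (variance > mean), as is typical of real-life count data.
y = rng.negative_binomial(n=2, p=0.25, size=500)   # mean 6, variance 24

# Candidate 1: Poisson, with lambda estimated by maximum likelihood (the mean).
lam = y.mean()
log_score_pois = -np.mean(stats.poisson.logpmf(y, lam))

# Candidate 2: negative binomial, fitted by the method of moments.
m, v = y.mean(), y.var()
p = m / v            # moment estimate of the success probability
n = m * p / (1 - p)  # moment estimate of the dispersion parameter
log_score_nb = -np.mean(stats.nbinom.logpmf(y, n, p))

# The mean logarithmic score (smaller is better) favours the model
# that accommodates the overdispersion.
```

As in the trial setting, the naive Poisson choice is penalized heavily by the score once the data are overdispersed.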
Collaborative care: Using six thinking hats for decision making.
Cioffi, Jane Marie
2017-12-01
To apply the six thinking hats technique to decision making in collaborative care. In collaborative partnerships, effective communication must occur in meetings of patients, families, and health care professionals. The effectiveness of these meetings depends on the engagement of participants and the quality of the meeting process. The use of the six thinking hats technique to engage all participants in effective dialogue is proposed. Discussion paper. The electronic databases CINAHL, PubMed, and ScienceDirect were searched for the years 1990 to 2017. Using the six thinking hats technique in patient and family meetings, nurses can guide a process of dialogue that focuses decision making so as to build equal care partnerships inclusive of all participants. Nurses will need to develop the skills for using the technique and provide support to all participants during the meeting process. Collaborative decision making can be augmented by the six thinking hats technique to provide patients, families, and health professionals with opportunities to make informed decisions about care that consider the key issues for all involved. Nurses, who are most often advocates for patients and their families, are in a unique position to lead this initiative in meetings as they network with all health professionals. © 2017 John Wiley & Sons Australia, Ltd.
Three decision-making aids: brainstorming, nominal group, and Delphi technique.
McMurray, A R
1994-01-01
The methods of brainstorming, Nominal Group Technique, and the Delphi technique can be important resources for nursing staff development educators who wish to expand their decision-making skills. Staff development educators may find opportunities to use these methods for such tasks as developing courses, setting departmental goals, and forecasting trends for planning purposes. Brainstorming, Nominal Group Technique, and the Delphi technique provide a structured format that helps increase the quantity and quality of participant responses.
Practical semen analysis: from A to Z
Brazil, Charlene
2010-01-01
Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
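Under the stated independence assumption, the global probability of correct classification follows directly from the per-node rule accuracies along each class's path through the tree. A hypothetical three-class sketch:

```python
import numpy as np

def global_p_correct(priors, path_probs):
    """
    priors[c]: a priori probability of class c.
    path_probs[c]: per-node probabilities that the decision rule at each node
    on class c's path sends a class-c sample down the correct branch.
    Assuming statistical independence of the node decision rules, the
    probability of correctly classifying class c is the product along its path.
    """
    return sum(p * np.prod(path) for p, path in zip(priors, path_probs))

# Hypothetical 3-class tree: the root separates class 0 from {1, 2};
# a second node separates class 1 from class 2.
priors = [0.5, 0.3, 0.2]
path_probs = [
    [0.95],        # class 0 is decided at the root
    [0.90, 0.85],  # class 1 passes the root, then node 2
    [0.90, 0.80],  # class 2 passes the root, then node 2
]
p_correct = global_p_correct(priors, path_probs)
# 0.5*0.95 + 0.3*(0.90*0.85) + 0.2*(0.90*0.80) = 0.8485
```

The per-node accuracies would in practice come from the error matrices computed during the automated design step.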
Cost-benefit analysis of space technology
NASA Technical Reports Server (NTRS)
Hein, G. F.; Stevenson, S. M.; Sivo, J. N.
1976-01-01
A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in structuring a decision-making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology, and its use is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumer surplus for measuring benefits is also presented.
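The core arithmetic of such a cost-benefit evaluation is discounting benefit and cost streams to present value and comparing them. The figures below are purely illustrative, not taken from the NASA study:

```python
def present_value(stream, rate):
    # Discount a yearly stream; stream[t] is received at the end of year t+1.
    return sum(x / (1 + rate) ** (t + 1) for t, x in enumerate(stream))

# Hypothetical comparison for an ion-thruster stationkeeping system:
# up-front development cost, then yearly operating costs and benefits ($M).
benefits = [40, 40, 40, 40, 40]   # e.g. yearly propellant-mass savings
costs    = [120, 10, 10, 10, 10]  # development, then operations
rate = 0.10                       # assumed discount rate

pv_b = present_value(benefits, rate)
pv_c = present_value(costs, rate)
bc_ratio = pv_b / pv_c            # accept if > 1
net_benefit = pv_b - pv_c         # equivalently, accept if positive
```

Note that the ranking of alternatives can flip with the discount rate, which is one of the methodological pitfalls such a discussion addresses.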
ERIC Educational Resources Information Center
Mahon, Michael J.; Bullock, Charles C.
1992-01-01
Study examined the impact of decision-making instruction which incorporated self-control techniques and instruction which provided only encouragement and verbal praise on decision making in leisure (DML) on adolescents with mild mental retardation. Results support the efficacy of the DML model in facilitating thoughtful DML for study subjects. (SM)
ERIC Educational Resources Information Center
Trimmer, Karen
2016-01-01
This paper investigates reasoned risk-taking in decision-making by school principals using a methodology that combines sequential use of psychometric and traditional measurement techniques. Risk-taking is defined as when decisions are made that are not compliant with the regulatory framework, the primary governance mechanism for public schools in…
On avoiding framing effects in experienced decision makers.
Garcia-Retamero, Rocio; Dhami, Mandeep K
2013-01-01
The present study aimed to (a) demonstrate the effect of positive-negative framing on experienced criminal justice decision makers, (b) examine the debiasing effect of visually structured risk messages, and (c) investigate whether risk perceptions mediate the debiasing effect of visual aids on decision making. In two phases, 60 senior police officers estimated the accuracy of a counterterrorism technique in identifying whether a known terror suspect poses an imminent danger and decided whether they would recommend the technique to policy makers. Officers also rated their confidence in this recommendation. When information about the effectiveness of the counterterrorism technique was presented in a numerical format, officers' perceptions of accuracy and recommendation decisions were susceptible to the framing effect: The technique was perceived to be more accurate and was more likely to be recommended when its effectiveness was presented in a positive than in a negative frame. However, when the information was represented visually using icon arrays, there were no such framing effects. Finally, perceptions of accuracy mediated the debiasing effect of visual aids on recommendation decisions. We offer potential explanations for the debiasing effect of visual aids and implications for communicating risk to experienced, professional decision makers.
Information support for decision making on dispatching control of water distribution in irrigation
NASA Astrophysics Data System (ADS)
Yurchenko, I. F.
2018-05-01
Research has been carried out on developing a technique for supporting decision making in the on-line control and operational management of water allocation for interfarm irrigation projects, based on analytical patterns of dispatcher control. This technique increases labour productivity and management quality through an improved level of automation and through optimization of decision making that takes into account diagnosis of the issues, classification of solutions, and the information required by the decision makers.
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-01-01
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-12-15
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo
2016-07-01
Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code can also use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo analysis, and Bayesian Information-Gap Decision Theory. It is implemented in the Julia programming language and has high-performance (parallel) and memory-management capabilities. The code uses a series of third-party modules developed by others; development will also include contributions to these existing third-party Julia modules, which will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL-developed modules written by Dan O'Malley; these modules will also be part of the Mads.jl release. Mads.jl will be released under the GPL v3 license and distributed as Git repositories at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.
A Framework for Achieving Situational Awareness during Crisis based on Twitter Analysis
NASA Astrophysics Data System (ADS)
Zielinski, Andrea; Tokarchuk, Laurissa; Middleton, Stuart; Chaves, Fernando
2013-04-01
Decision Support Systems for Natural Crisis Management increasingly employ Web 2.0 and 3.0 technologies for future collaborative decision making, including the use of social networks like Twitter. However, human sensor data is not readily accessible and interpretable, since the texts are unstructured, noisy and available in various languages. The present work focusses on the detection of crisis events in a multilingual setting as part of the FP7-funded EU project TRIDEC and is motivated by the goal to establish a Tsunami warning system for the Mediterranean. It is integrated into a dynamic spatial-temporal decision making component with a command and control unit's graphical user interface that presents all relevant information to the human operator to support critical decision making. To this end, a tool for the interactive visualization of geospatial data is implemented: all tweets with an exact timestamp or geo-location are monitored on the map in real-time so that the operator on duty can get an overall picture of the situation. Apart from the human sensor data, the seismic sensor data will also appear on the same screen. Signs of abnormal activity in social networks as well as in sensor network devices can then be used to trigger official warning alerts according to the CAP message standard. Whenever a certain threshold of relevant tweets in a HASC region (Hierarchical Administrative Subdivision Code) is exceeded, the twitter activity in this administrative region will be shown on a map. We believe that the following functionalities are crucial for monitoring crises, making use of text mining and network analysis techniques: focussed crawling, trustworthiness analysis, geo-parsing, and multilingual tweet classification. In the first step, the Twitter Streaming API accesses the social data, using an adaptive keyword list (focussed crawling).
Then, tweets are filtered and aggregated to form counts for a certain time span (e.g., an interval of 1-2 minutes). In particular, we investigate the following novel techniques that help to fulfil this task: trustworthiness analysis (linkage analysis and user network analysis), geo-parsing (locating the event in space), and multilingual tweet classification (filtering out noisy tweets for various Mediterranean languages). Lastly, an aberration algorithm looks for spikes in the temporal stream of Twitter data.
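A simple version of the spike-detection step can be sketched as a trailing z-score rule over per-interval counts; the actual TRIDEC aberration algorithm is not specified here, so the window size and threshold below are assumptions:

```python
import numpy as np

def detect_spikes(counts, window=10, z_thresh=3.0):
    """Flag intervals whose count exceeds the trailing mean by z_thresh
    trailing standard deviations (a simple illustrative aberration rule)."""
    counts = np.asarray(counts, dtype=float)
    flags = []
    for t in range(window, len(counts)):
        hist = counts[t - window:t]
        mu, sigma = hist.mean(), hist.std()
        if sigma == 0:
            sigma = 1.0  # avoid division by zero on a flat history
        if (counts[t] - mu) / sigma > z_thresh:
            flags.append(t)
    return flags

# Per-interval tweet counts for one HASC region; a burst begins at t=15.
counts = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 4, 3, 5, 2, 4, 40, 55, 60]
spikes = detect_spikes(counts, window=10, z_thresh=3.0)  # [15, 16]
```

Flagged intervals would then feed the CAP-alert triggering logic described above, alongside the seismic sensor channel.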
Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin
2012-05-30
This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool combines existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). The SMCDA tool may be implemented with a wide range of decision makers' preferences. Its user-friendly interface guides the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal, which also demonstrated the efficiency of the SMCDA tool in the decision-making process for selecting suitable MAR sites. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features that benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to selecting the best locations for MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection.
The new spatial multi-criteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
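The AHP-plus-WLC core of such a tool can be sketched as follows; the criteria, pairwise judgements, and site scores here are hypothetical, and the OWA and non-compensatory screening steps are omitted:

```python
import numpy as np

# AHP pairwise comparison of three site-selection criteria (hypothetical
# judgements): aquifer transmissivity, depth to water table, source distance.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# AHP priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()

# Raw criterion scores for four candidate sites (rows); higher is better.
raw = np.array([[80.0, 40.0, 2.0],
                [60.0, 70.0, 5.0],
                [90.0, 20.0, 8.0],
                [50.0, 90.0, 1.0]])

# Linear (min-max) standardization of each criterion to [0, 1].
std = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))

wlc = std @ weights            # Weighted Linear Combination suitability score
best_site = int(np.argmax(wlc))
```

Cost-direction criteria would be standardized with the scale inverted, and constraint mapping would remove infeasible sites before scoring.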
A rule-based system for real-time analysis of control systems
NASA Astrophysics Data System (ADS)
Larson, Richard R.; Millard, D. Edward
1992-10-01
An approach to automate the real-time analysis of flight critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied for ground verification and validation testing and flight testing monitoring where quick, real-time, safety-of-flight decisions can be very critical. In many cases post processing and manual analysis of flight system data are not required. The processing is described of real-time data for analysis along with the output format which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.
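A rule base of this kind can be reduced to predicates over a frame of real-time data plus a message stack for the display. The parameter names and limits below are hypothetical, not taken from the X-31A system:

```python
# Each rule pairs a message with a predicate over one frame of telemetry.
# When a rule fires, its message is pushed onto the display stack
# (all names and limits here are illustrative).
RULES = [
    ("HYD PRESS LOW", lambda f: f["hyd_press_psi"] < 2800),
    ("EGT OVERTEMP",  lambda f: f["egt_c"] > 950),
    ("AOA LIMIT",     lambda f: abs(f["alpha_deg"]) > 25),
]

def evaluate(frame, stack):
    """Apply every rule to the current frame; the newest message goes on top."""
    for message, predicate in RULES:
        if predicate(frame):
            stack.insert(0, message)
    return stack

stack = []
frame = {"hyd_press_psi": 2650, "egt_c": 910, "alpha_deg": 27.5}
evaluate(frame, stack)   # stack: ["AOA LIMIT", "HYD PRESS LOW"]
```

Running such an evaluation on every incoming frame is what lets safety-of-flight anomalies surface in real time instead of during post-processing.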
A rule-based system for real-time analysis of control systems
NASA Technical Reports Server (NTRS)
Larson, Richard R.; Millard, D. Edward
1992-01-01
NASA Astrophysics Data System (ADS)
Casasent, David; Telfer, Brian
1988-02-01
The storage capacity, noise performance, and synthesis of associative memories for image analysis are considered. Associative memory synthesis is shown to be very similar to that of linear discriminant functions used in pattern recognition. These lead to new associative memories and new associative memory synthesis and recollection vector encodings. Heteroassociative memories are emphasized in this paper, rather than autoassociative memories, since heteroassociative memories provide scene analysis decisions, rather than merely enhanced output images. The analysis of heteroassociative memories has been given little attention. Heteroassociative memory performance and storage capacity are shown to be quite different from those of autoassociative memories, with much more dependence on the recollection vectors used and less dependence on M/N. This allows several different and preferable synthesis techniques to be considered for associative memories. These new associative memory synthesis techniques and new techniques to update associative memories are included. We also introduce a new SNR performance measure that is preferable to conventional noise standard deviation ratios.
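One common heteroassociative synthesis, broadly in the spirit described though not necessarily the authors' exact encodings, pairs key vectors with unit recollection vectors via a pseudoinverse:

```python
import numpy as np
from scipy.linalg import hadamard

# Four orthogonal bipolar key vectors of dimension 16 (columns of X), each
# paired with a unit recollection vector encoding a scene-analysis decision.
X = hadamard(16)[:, :4].astype(float)
Y = np.eye(4)

# Pseudoinverse synthesis: choose M so that M X = Y (exact here because the
# keys are linearly independent; least-squares otherwise).
M = Y @ np.linalg.pinv(X)

# Recall with a corrupted key: flip two of the 16 elements of key 1.
# The recollection output is a decision vector, not an enhanced image,
# so the classification survives moderate input noise.
noisy = X[:, 1].copy()
noisy[0] *= -1
noisy[5] *= -1
decision = int(np.argmax(M @ noisy))   # still decides class 1
```

With orthogonal bipolar keys the cross-talk from two flipped elements is bounded well below the matched response, which is why the argmax decision is unchanged.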
Elements of an integrated health monitoring framework
NASA Astrophysics Data System (ADS)
Fraser, Michael; Elgamal, Ahmed; Conte, Joel P.; Masri, Sami; Fountain, Tony; Gupta, Amarnath; Trivedi, Mohan; El Zarki, Magda
2003-07-01
Internet technologies are increasingly facilitating real-time monitoring of Bridges and Highways. The advances in wireless communications for instance, are allowing practical deployments for large extended systems. Sensor data, including video signals, can be used for long-term condition assessment, traffic-load regulation, emergency response, and seismic safety applications. Computer-based automated signal-analysis algorithms routinely process the incoming data and determine anomalies based on pre-defined response thresholds and more involved signal analysis techniques. Upon authentication, appropriate action may be authorized for maintenance, early warning, and/or emergency response. In such a strategy, data from thousands of sensors can be analyzed with near real-time and long-term assessment and decision-making implications. Addressing the above, a flexible and scalable (e.g., for an entire Highway system, or portfolio of Networked Civil Infrastructure) software architecture/framework is being developed and implemented. This framework will network and integrate real-time heterogeneous sensor data, database and archiving systems, computer vision, data analysis and interpretation, physics-based numerical simulation of complex structural systems, visualization, reliability & risk analysis, and rational statistical decision-making procedures. Thus, within this framework, data is converted into information, information into knowledge, and knowledge into decision at the end of the pipeline. Such a decision-support system contributes to the vitality of our economy, as rehabilitation, renewal, replacement, and/or maintenance of this infrastructure are estimated to require expenditures in the Trillion-dollar range nationwide, including issues of Homeland security and natural disaster mitigation. A pilot website (http://bridge.ucsd.edu/compositedeck.html) currently depicts some basic elements of the envisioned integrated health monitoring analysis framework.
Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine
2018-01-01
Background Advances in magnetic resonance imaging and the introduction of clinical decision support systems has underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. Objective The aim of this study was to design and develop a modular medical image region of interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. Methods The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (cohort of 48 children, with 37 malignant and 11 benign tumors). Mevislab software and Python have been used for the development of MIROR. Regions of interests were drawn around benign and malignant body tumors on different diffusion parametric maps, and extracted information was used to discriminate the malignant tumors from benign tumors. Results Using MIROR, the various histogram parameters derived for each tumor case when compared with the information in the repository provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. Conclusions MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians’ skillset by introducing newer techniques and up-to-date findings to their repertoire and make information from previous cases available to aid decision making. The modular-based format of the tool allows integration of analyses that are not readily available clinically and streamlines the future developments. 
PMID:29720361
Phelps, Charles E; Lakdawalla, Darius N; Basu, Anirban; Drummond, Michael F; Towse, Adrian; Danzon, Patricia M
2018-02-01
The fifth section of our Special Task Force report identifies and discusses two aggregation issues: 1) aggregation of cost and benefit information across individuals to a population level for benefit plan decision making and 2) combining multiple elements of value into a single value metric for individuals. First, we argue that additional elements could be included in measures of value, but such elements have not generally been included in measures of quality-adjusted life-years. For example, we describe a recently developed extended cost-effectiveness analysis (ECEA) that provides a good example of how to use a broader concept of utility. ECEA adds two features: measures of financial risk protection and income distributional consequences. We then discuss a further option for expanding this approach: augmented CEA, which can introduce many value measures. Neither of these approaches, however, provides a comprehensive measure of value. To resolve this issue, we review a technique called multicriteria decision analysis that can provide a comprehensive measure of value. We then discuss budget-setting and prioritization using multicriteria decision analysis, issues not yet fully resolved. Next, we discuss deliberative processes, which represent another important approach for population- or plan-level decisions used by many health technology assessment bodies. These use quantitative information on CEA and other elements, but the group decisions are reached by a deliberative voting process. Finally, we briefly discuss the use of stated preference methods for developing "hedonic" value frameworks, and conclude with some recommendations in this area. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
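The multicriteria aggregation discussed above is, in its simplest form, a weighted additive value model. The sketch below illustrates only the idea; the criteria names, weights, and scores are hypothetical and are not taken from the report:

```python
# Minimal additive multicriteria decision analysis (MCDA) sketch: each
# alternative is scored on several value elements (0-1 scale) and the
# scores are combined with normalized weights into a single value metric.

def mcda_value(scores, weights):
    """Weighted additive aggregation of per-criterion scores."""
    total_w = sum(weights.values())
    return sum(scores[c] * weights[c] / total_w for c in scores)

# Illustrative criteria and weights (hypothetical, for demonstration only).
weights = {"qaly_gain": 0.5, "financial_risk_protection": 0.3, "equity": 0.2}

treatment_a = {"qaly_gain": 0.8, "financial_risk_protection": 0.4, "equity": 0.6}
treatment_b = {"qaly_gain": 0.6, "financial_risk_protection": 0.9, "equity": 0.7}

value_a = mcda_value(treatment_a, weights)
value_b = mcda_value(treatment_b, weights)
print(f"A: {value_a:.3f}  B: {value_b:.3f}")  # B ranks higher here
```

Under this toy weighting, treatment B's stronger financial risk protection outweighs A's larger QALY gain, which is exactly the kind of trade-off a single QALY-based metric would miss.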
James, Lachlan P; Robertson, Sam; Haff, G Gregory; Beckman, Emma M; Kelly, Vincent G
2017-03-01
To determine those performance indicators that have the greatest influence on classifying outcome at the elite level of mixed martial arts (MMA). A secondary objective was to establish the efficacy of decision tree analysis in explaining the characteristics of victory compared with alternative statistical methods. Cross-sectional observational study. Eleven raw performance indicators from male Ultimate Fighting Championship bouts (n=234) from July 2014 to December 2014 were screened for analysis. Each raw performance indicator was also converted to a rate-dependent measure scaled to fight duration. Further, three additional performance indicators were calculated from the dataset and included in the analysis. Cohen's d effect sizes were employed to determine the magnitude of the differences between Wins and Losses, while decision tree (chi-squared automatic interaction detector (CHAID)) and discriminant function analyses (DFA) were used to classify outcome (Win or Loss). Effect size comparisons revealed differences between Wins and Losses across a number of performance indicators. Decision tree (raw: 71.8%; rate-scaled: 76.3%) and DFA (raw: 71.4%; rate-scaled: 71.2%) achieved similar classification accuracies. Grappling and accuracy performance indicators were the most influential in explaining outcome. The decision tree models also revealed multiple combinations of performance indicators leading to victory. The decision tree analyses suggest that grappling activity and technique accuracy are of particular importance in achieving victory in elite-level MMA competition. The DFA results supported the importance of these performance indicators. Decision tree induction represents an intuitive and slightly more accurate approach to explaining bout outcome in this sport compared with DFA. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
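The tree-versus-discriminant comparison above can be sketched in a few lines. This is not the study's pipeline: the data below are synthetic, scikit-learn's CART tree stands in for CHAID (which scikit-learn does not implement), and linear discriminant analysis stands in for DFA:

```python
# Sketch: compare decision-tree and linear-discriminant classification of
# a binary bout outcome, in the spirit of the CHAID-vs-DFA comparison.
# Synthetic data; CART stands in for CHAID, LDA for DFA.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 234                                   # bouts, matching the sample above
X = rng.normal(size=(n, 11))              # 11 performance indicators
# Outcome driven mainly by two "grappling/accuracy" indicators.
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.7, size=n) > 0).astype(int)

tree_acc = cross_val_score(DecisionTreeClassifier(max_depth=3, random_state=0),
                           X, y, cv=5).mean()
dfa_acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"tree: {tree_acc:.3f}  DFA: {dfa_acc:.3f}")
```

Cross-validated accuracy, as used here, is the same criterion the study reports for its raw and rate-scaled indicator sets.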
Threat evaluation for impact assessment in situation analysis systems
NASA Astrophysics Data System (ADS)
Roy, Jean; Paradis, Stephane; Allouche, Mohamad
2002-07-01
Situation analysis is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of situation awareness, for the decision maker. Data fusion is a key enabler to meeting the demanding requirements of military situation analysis support systems. According to the data fusion model maintained by the Joint Directors of Laboratories' Data Fusion Group, impact assessment estimates the effects on situations of planned or estimated/predicted actions by the participants, including interactions between action plans of multiple players. In this framework, the appraisal of actual or potential threats is a necessary capability for impact assessment. This paper reviews and discusses in detail the fundamental concepts of threat analysis. In particular, threat analysis generally attempts to compute some threat value, for the individual tracks, that estimates the degree of severity with which engagement events will potentially occur. Presenting relevant tracks to the decision maker in some threat list, sorted from the most threatening to the least, is clearly in line with the cognitive demands associated with threat evaluation. A key parameter in many threat value evaluation techniques is the Closest Point of Approach (CPA). Along this line of thought, threatening tracks are often prioritized based upon which ones will reach their CPA first. Hence, the Time-to-CPA (TCPA), i.e., the time it will take for a track to reach its CPA, is also a key factor. Unfortunately, a typical assumption for the computation of the CPA/TCPA parameters is that the track velocity will remain constant. When a track is maneuvering, the CPA/TCPA values will change accordingly. These changes will in turn impact the threat value computations and, ultimately, the resulting threat list. This is clearly undesirable from a command decision-making perspective.
In this regard, the paper briefly discusses threat value stabilization approaches based on neural networks and other mathematical techniques.
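Under the constant-velocity assumption discussed above, CPA and TCPA have simple closed forms. A minimal sketch (the function name and 2-D setup are illustrative, not from the paper):

```python
# Closest Point of Approach (CPA) and Time-to-CPA (TCPA) for a track
# relative to a defended asset, under the constant-velocity assumption.
# Positions and velocities are 2-D vectors.
import numpy as np

def cpa_tcpa(track_pos, track_vel, asset_pos, asset_vel):
    r = np.asarray(track_pos, float) - np.asarray(asset_pos, float)  # relative position
    v = np.asarray(track_vel, float) - np.asarray(asset_vel, float)  # relative velocity
    vv = v @ v
    if vv == 0.0:                       # no relative motion: range is constant
        return np.linalg.norm(r), 0.0
    tcpa = max(0.0, -(r @ v) / vv)      # time of minimum separation, clamped to the future
    cpa = np.linalg.norm(r + v * tcpa)  # minimum separation distance
    return cpa, tcpa

# Inbound track 10 km north of the asset, closing at 1 km/s head-on:
cpa, tcpa = cpa_tcpa([0.0, 10.0], [0.0, -1.0], [0.0, 0.0], [0.0, 0.0])
print(cpa, tcpa)   # 0.0 10.0
```

Sorting tracks by ascending TCPA (or by a threat value derived from CPA and TCPA) then yields the prioritized threat list described above; as the paper notes, a maneuvering track invalidates the constant-velocity assumption and destabilizes that list.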
An Integrated Approach to Life Cycle Analysis
NASA Technical Reports Server (NTRS)
Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.
2006-01-01
Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" perspective. Sometimes called life cycle assessment, life cycle approach, or "cradle-to-grave analysis", it represents a rapidly emerging family of tools and techniques designed to serve as a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms, from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper focuses on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. It addresses issues inherent in Life Cycle Analysis, including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (e.g., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing managers to make risk-informed decisions and increasing the likelihood of meeting mission success criteria.
Lee, Saro; Park, Inhye
2013-09-30
Subsidence of ground caused by underground mines poses hazards to human life and property. This study analyzed ground-subsidence hazard using factors that can affect ground subsidence and a decision tree approach in a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data on topography, geology, and various ground-engineering characteristics of the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model was also applied to the GSH mapping for comparison with the probabilistic models. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision trees. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely used for prediction of various spatial events. Copyright © 2013. Published by Elsevier Ltd.
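The AUC validation used above can be computed directly from hazard scores at validated cells via the Mann-Whitney rank-sum identity. A sketch on synthetic scores (the score distributions are invented for illustration):

```python
# Area under the ROC curve (AUC) via the Mann-Whitney identity:
# AUC = P(score of a positive cell > score of a negative cell), ties 1/2.
# Scores and labels below are synthetic, for illustration only.
import numpy as np

def auc(scores, labels):
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, int)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(1)
subsided = rng.normal(1.0, 1.0, 200)   # hazard scores at subsided cells
stable = rng.normal(0.0, 1.0, 200)     # hazard scores at stable cells
scores = np.concatenate([subsided, stable])
labels = np.concatenate([np.ones(200, int), np.zeros(200, int)])
print(f"AUC = {auc(scores, labels):.3f}")  # theoretical value here is ~0.76
```

An AUC of 0.5 means the hazard map ranks subsided and stable cells no better than chance; the study's 0.86-0.94 values correspond to strong separation.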
Mendoza, G A; Prabhu, R
2000-12-01
This paper describes an application of multiple criteria analysis (MCA) in assessing criteria and indicators adapted for a particular forest management unit. The methods include ranking, rating, and pairwise comparisons. These methods were used in a participatory decision-making environment where a team representing various stakeholders and professionals used their expert opinions and judgements in assessing different criteria and indicators (C&I) on the one hand, and their suitability and applicability to a forest management unit on the other. A forest concession located in Kalimantan, Indonesia, was used as the site for the case study. Results from the study show that the multicriteria methods are effective tools that can be used as structured decision aids to evaluate, prioritize, and select sets of C&I for a particular forest management unit. Ranking and rating approaches can be used as a screening tool to develop an initial list of C&I. Pairwise comparison, on the other hand, can be used as a finer filter to further reduce the list. In addition to using these three MCA methods, the study also examines two commonly used group decision-making techniques, the Delphi method and the nominal group technique. Feedback received from the participants indicates that the methods are transparent, easy to implement, and provide a convenient environment for participatory decision-making.
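The pairwise-comparison "finer filter" described above reduces to deriving priority weights from a comparison matrix; the principal-eigenvector method is the standard AHP-style computation (the abstract does not specify which derivation the authors used, and the matrix entries below are illustrative):

```python
# Deriving priority weights for criteria/indicators from a pairwise
# comparison matrix.  A[i, j] records how many times more important
# item i is judged than item j; entries are illustrative only.
import numpy as np

A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real  # Perron eigenvector
weights = principal / principal.sum()                 # normalize to sum to 1
print(np.round(weights, 3))  # most important item receives the largest weight
```

Items with the lowest weights are then candidates for dropping from the C&I list, which is how pairwise comparison acts as a finer filter after ranking and rating.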
Decision Making on Regional Landfill Site Selection in Hormozgan Province Using SMCE
NASA Astrophysics Data System (ADS)
Majedi, A. S.; Kamali, B. M.; Maghsoudi, R.
2015-12-01
Landfill site selection and suitable conditions for burying hazardous wastes are among the most critical issues in modern societies. Taking several factors and limitations into account along with sound decision making requires the application of different decision techniques. To this end, the current paper aims to support decisions about regional landfill site selection in Hormozgan province and utilizes the SMCE technique, combined with qualitative and quantitative criteria, to select the final alternatives. Accordingly, we first describe the existing environmental situation in our study area, set the goals of our study in the framework of SMCE, and analyze the effective factors in regional landfill site selection. The methodological procedure was then conducted using a Delphi approach and questionnaires (in order to determine research validity, Cronbach's alpha (0.94) was used). A spatial multi-criteria analysis model was designed in the form of a criteria tree in SMCE using ILWIS software. Prioritization of the respective spatial alternatives was as follows: Bandar Abbas city, with a total of 4 spatial alternatives (one zone with 1st priority, one zone with 3rd priority and two zones with 4th priority), was considered the first priority; Bastak city, with a total of 3 spatial alternatives (one zone with 2nd priority, one zone with 3rd priority and one zone with 4th priority), was the second priority; and Bandar Abbas, Minab, Jask and Haji Abad cities were considered the third priority.
NASA Astrophysics Data System (ADS)
Lin, Zi-Jing; Li, Lin; Cazzell, Marry; Liu, Hanli
2013-03-01
Functional near-infrared spectroscopy (fNIRS) is a non-invasive imaging technique which measures the hemodynamic changes that reflect brain activity. Diffuse optical tomography (DOT), a variant of fNIRS with multi-channel NIRS measurements, has demonstrated the capability of three-dimensional (3D) reconstruction of hemodynamic changes due to brain activity. The conventional method of DOT image analysis defines brain activation with a paired t-test between two different states, such as resting-state versus task-state. However, it has limitations because the selection of the activation and post-activation periods is relatively subjective. General linear model (GLM) based analysis can overcome this limitation. In this study, we combine 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with the risk decision-making process. Risk decision-making is an important cognitive process and thus is an essential topic in the field of neuroscience. The balloon analogue risk task (BART) is a valid experimental model and has been commonly used in behavioral measures to assess human risk-taking action and tendency while facing risks. We have utilized the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making. Voxel-wise GLM analysis was performed on 18 human participants (10 males and 8 females). In this work, we wish to demonstrate the feasibility of using voxel-wise GLM analysis to image and study cognitive functions in response to risk decision-making by DOT. Results have shown significant changes in the dorsolateral prefrontal cortex (DLPFC) during the active choice mode and a different hemodynamic pattern between genders, which are in good agreement with the published functional magnetic resonance imaging (fMRI) and fNIRS literature.
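The voxel-wise GLM amounts to fitting the same regression at every voxel, with a task regressor in the design matrix, and testing the task coefficient. A minimal sketch on synthetic data (a plain boxcar regressor is used; a real analysis would convolve it with a hemodynamic response function):

```python
# Minimal voxel-wise GLM sketch: fit Y = X @ beta + noise at every voxel
# and compute the task-effect t-statistic.  Synthetic data; no HRF
# convolution, for brevity.
import numpy as np

rng = np.random.default_rng(2)
n_t, n_vox = 120, 500
task = np.tile(np.r_[np.zeros(15), np.ones(15)], 4)  # boxcar: rest/task blocks
X = np.column_stack([np.ones(n_t), task])            # design: intercept + task

beta_true = np.zeros(n_vox)
beta_true[:50] = 1.0                                  # 50 "active" voxels
Y = task[:, None] * beta_true[None, :] + rng.normal(size=(n_t, n_vox))

beta_hat, res_ss, *_ = np.linalg.lstsq(X, Y, rcond=None)
dof = n_t - X.shape[1]
sigma2 = res_ss / dof                                 # per-voxel noise variance
XtX_inv = np.linalg.inv(X.T @ X)
t_task = beta_hat[1] / np.sqrt(sigma2 * XtX_inv[1, 1])  # t-stat per voxel
print("active voxels' mean |t|:", np.abs(t_task[:50]).mean())
```

Because the model is fitted per voxel, no subjective choice of activation versus post-activation windows is needed, which is the advantage over the paired t-test approach noted above.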
Gorawara-Bhat, Rita; O'Muircheartaigh, Siobhan; Mohile, Supriya; Dale, William
2017-09-01
To compare patients' attitudes towards recurrent prostate cancer (PCa) and starting hormone therapy (HT) treatment in two groups: Decision-Aid (DA; intervention) and Standard-of-Care (SoC; control). The research was conducted at three academic clinics: two in the Midwest and one in the Northeast U.S. Patients with biochemical recurrence of PCa (n=26) and follow-up oncology visits meeting inclusion criteria were randomized to either the SoC or DA intervention group prior to their consultation. Analysts were blinded to group assignment. Semi-structured phone interviews with patients were conducted 1 week post consultation. Interviews were audio-taped and transcribed. Qualitative analytic techniques were used to extract salient themes and conduct a comparative analysis of the two groups. Four salient themes emerged: 1) knowledge acquisition, 2) decision-making style, 3) decision-making about timing of HT, and 4) anxiety-coping mechanisms. The comparative analysis showed that patients receiving the DA intervention had a better comprehension of prostate-specific antigen (PSA) testing, an improved understanding of HT treatment implications, an external locus of control, participation in shared decision-making, and support-seeking for anxiety reduction. In contrast, SoC patients displayed worse comprehension of PSA testing and HT treatment implications, an internal locus of control, unilateral involvement in knowledge-seeking and decision-making, and no support-seeking for anxiety-coping. The DA was more effective than SoC in helping PCa patients understand the full implications of PSA testing and treatment, motivating shared decision-making, and encouraging support-seeking for anxiety relief. DA DVD interventions can be a useful patient education tool for bringing higher quality decision-making to prostate cancer care. Copyright © 2017 Elsevier Ltd. All rights reserved.
Predicting decisions in human social interactions using real-time fMRI and pattern classification.
Hollmann, Maurice; Rieger, Jochem W; Baecke, Sebastian; Lützkendorf, Ralf; Müller, Charles; Adolf, Daniela; Bernarding, Johannes
2011-01-01
Negotiation and trade typically require mutual interaction while leaving uncertain which decision the partner will ultimately make at the end of the process. Being able to assess, already during the negotiation, the direction in which one's counterpart tends would provide a tremendous advantage. Recently, neuroimaging techniques combined with multivariate pattern classification of the acquired data have made it possible to discriminate subjective states of mind on the basis of their neuronal activation signature. However, to enable an online assessment of the participant's mental state, both approaches need to be extended to a real-time technique. By combining real-time functional magnetic resonance imaging (fMRI) and online pattern classification techniques, we show that it is possible to predict human behavior during social interaction before the interacting partner communicates a specific decision. Average accuracy reached approximately 70% when we predicted online the decisions of volunteers playing the ultimatum game, a well-known paradigm in economic game theory. Our results demonstrate the successful online analysis of complex emotional and cognitive states using real-time fMRI, which will enable a major breakthrough for social fMRI by providing information about the mental states of partners already during the mutual interaction. Interestingly, an additional whole-brain classification across subjects confirmed the online results: anterior insula, ventral striatum, and lateral orbitofrontal cortex, known to act in emotional self-regulation and reward processing for adjustment of behavior, appeared to be strong determinants of later overt behavior in the ultimatum game. Using whole-brain classification we were also able to discriminate between brain processes related to subjective emotional and motivational states and brain processes related to the evaluation of objective financial incentives.
Discriminating External and Internal Causes for Heading Changes in Freely Flying Drosophila
Sayaman, Rosalyn W.; Murray, Richard M.; Dickinson, Michael H.
2013-01-01
As animals move through the world in search of resources, they change course in reaction to both external sensory cues and internally-generated programs. Elucidating the functional logic of complex search algorithms is challenging because the observable actions of the animal cannot be unambiguously assigned to externally- or internally-triggered events. We present a technique that addresses this challenge by assessing quantitatively the contribution of external stimuli and internal processes. We apply this technique to the analysis of rapid turns (“saccades”) of freely flying Drosophila melanogaster. We show that a single scalar feature computed from the visual stimulus experienced by the animal is sufficient to explain a majority (93%) of the turning decisions. We automatically estimate this scalar value from the observable trajectory, without any assumption regarding the sensory processing. A posteriori, we show that the estimated feature field is consistent with previous results measured in other experimental conditions. The remaining turning decisions, not explained by this feature of the visual input, may be attributed to a combination of deterministic processes based on unobservable internal states and purely stochastic behavior. We cannot distinguish these contributions using external observations alone, but we are able to provide a quantitative bound of their relative importance with respect to stimulus-triggered decisions. Our results suggest that comparatively few saccades in free-flying conditions are a result of an intrinsic spontaneous process, contrary to previous suggestions. We discuss how this technique could be generalized for use in other systems and employed as a tool for classifying effects into sensory, decision, and motor categories when used to analyze data from genetic behavioral screens. PMID:23468601
Soft-decision decoding techniques for linear block codes and their error performance analysis
NASA Technical Reports Server (NTRS)
Lin, Shu
1996-01-01
The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC); the bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability for maximum likelihood decoding of binary linear codes. The fourth and final paper in this report concerns the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel.
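To see what soft-decision maximum-likelihood decoding of a linear block code means concretely, the brute-force version below decodes a (7,4) Hamming code by picking the codeword whose BPSK image is nearest, in Euclidean distance, to the received real-valued vector. This is an exhaustive stand-in for the trellis-based decoder the first paper develops (trellis decoding computes the same minimum efficiently):

```python
# Soft-decision ML decoding of the (7,4) Hamming code by exhaustive
# search: choose the codeword whose BPSK image minimizes Euclidean
# distance to the received real-valued vector.
import itertools
import numpy as np

G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,0,1,1],
              [0,0,1,0,1,1,1],
              [0,0,0,1,1,0,1]])          # (7,4) Hamming generator matrix

codebook = np.array([(np.array(m) @ G) % 2
                     for m in itertools.product([0, 1], repeat=4)])
bpsk = 1.0 - 2.0 * codebook             # bit 0 -> +1, bit 1 -> -1

def ml_decode(r):
    """Return the codeword minimizing Euclidean distance to received r."""
    d2 = ((bpsk - r) ** 2).sum(axis=1)
    return codebook[np.argmin(d2)]

# Transmit the all-zero codeword (+1,...,+1) through mild Gaussian noise:
rng = np.random.default_rng(4)
r = np.ones(7) + rng.normal(scale=0.4, size=7)
print(ml_decode(r))   # with mild noise, ML decoding recovers the codeword
```

Keeping the real-valued channel outputs instead of hard-slicing them to bits first is exactly the "soft-decision" gain these papers analyze.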
NASA Astrophysics Data System (ADS)
Friedl, L.; Macauley, M.; Bernknopf, R.
2013-12-01
Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.
Multiattribute Decision Modeling Techniques: A Comparative Analysis
1988-08-01
Analytic Hierarchy Process (AHP). It is structurally similar to SMART, but elicitation methods are different and there are several algorithms for...reconciliation of inconsistent judgments and for consistency checks that are not available in any of the utility procedures. The AHP has been applied...of commercially available software packages that implement the AHP algorithms. Elicitation Methods. The AHP builds heavily on value trees, which
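The consistency checks this excerpt attributes to the AHP are usually Saaty's consistency ratio: CR = CI / RI with CI = (lambda_max - n) / (n - 1). A sketch (the judgment matrix is illustrative; the RI values are Saaty's published random indices):

```python
# Saaty's consistency check for an AHP pairwise comparison matrix:
# CI = (lambda_max - n) / (n - 1), CR = CI / RI.  CR <= 0.1 is the
# conventional acceptability threshold.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def consistency_ratio(A):
    n = A.shape[0]
    lam_max = np.linalg.eigvals(A).real.max()
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

A = np.array([[1.0,  2.0, 4.0],
              [0.5,  1.0, 2.0],
              [0.25, 0.5, 1.0]])         # perfectly consistent judgments
cr = consistency_ratio(A)
print(round(cr, 4))                      # 0.0 for a consistent matrix
```

For a perfectly consistent reciprocal matrix lambda_max equals n, so CR is zero; inconsistent judgments push lambda_max above n and CR above zero, flagging them for the reconciliation the excerpt mentions.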
Wildlife habitats of the north coast of California: new techniques for extensive forest inventory.
Janet L. Ohmann
1992-01-01
A study was undertaken to develop methods for extensive inventory and analysis of wildlife habitats. The objective was to provide information about the amounts and conditions of wildlife habitats from extensive, sample-based inventories so that wildlife can be better considered in forest planning and policy decisions at the regional scale. The new analytical approach...
Improved Decision Making Through Group Composition
1991-09-01
The design for this research was actually a quasi-experimental design, because equivalent experimental and control groups could not be guaranteed...The Nonequivalent Control Group Design (25:126): O1 X O2 / O3 O4. ...control group after the simulation exercise. Final game scores were collected from O2...
Resource analysis and land use planning with space and high altitude photography
NASA Technical Reports Server (NTRS)
Schrumpf, B. J.
1972-01-01
Photographic scales providing resource data for decision making processes of land use and a legend system for barren lands, water resources, natural vegetation, agricultural, urban, and industrial lands in hierarchical framework are applied to various remote sensing techniques. Two natural vegetation resource and land use maps for a major portion of Maricopa County, Arizona are also produced.
Lynch, Chip M; Abdollahi, Behnaz; Fuqua, Joshua D; de Carlo, Alexandra R; Bartholomai, James A; Balgemann, Rayeanne N; van Berkel, Victor H; Frieboes, Hermann B
2017-12-01
Outcomes for cancer patients have previously been estimated by applying various machine learning techniques to large datasets such as the Surveillance, Epidemiology, and End Results (SEER) program database. For lung cancer in particular, it is not well understood which types of techniques yield more predictive information, and which data attributes should be used to determine this information. In this study, a number of supervised learning techniques are applied to the SEER database to classify lung cancer patients in terms of survival, including linear regression, Decision Trees, Gradient Boosting Machines (GBM), Support Vector Machines (SVM), and a custom ensemble. Key data attributes include tumor grade, tumor size, gender, age, stage, and number of primaries, with the goal of enabling comparison of predictive power between the various methods. The prediction target is treated as continuous, rather than as a classification into categories, as a first step towards improving survival prediction. The results show that the predicted values agree with actual values for low to moderate survival times, which constitute the majority of the data. The best performing technique was the custom ensemble with a Root Mean Square Error (RMSE) value of 15.05. The most influential model within the custom ensemble was GBM, while Decision Trees may be inapplicable as they had too few discrete outputs. The results further show that among the five individual models generated, the most accurate was GBM with an RMSE value of 15.32. Although SVM underperformed with an RMSE value of 15.82, statistical analysis singles out the SVM as the only model that generated a distinctive output. The results of the models are consistent with a classical Cox proportional hazards model used as a reference technique.
We conclude that application of these supervised learning techniques to lung cancer data in the SEER database may be of use to estimate patient survival time with the ultimate goal to inform patient care decisions, and that the performance of these techniques with this particular dataset may be on par with that of classical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
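The comparison above, where several regressors predict a continuous survival time and are ranked by held-out RMSE, can be sketched as follows. The features and target are synthetic stand-ins, not SEER data, and the custom ensemble and SVM of the study are omitted for brevity:

```python
# Sketch of the model comparison above: several regressors predict a
# continuous target and are scored by RMSE on held-out data.  Synthetic
# stand-in attributes; not SEER data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(5)
n = 2000
X = rng.normal(size=(n, 6))                       # 6 patient/tumor attributes
y = 30 + 5*X[:, 0] - 4*X[:, 1] + 2*X[:, 2]**2 + rng.normal(scale=3, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {"GBM": GradientBoostingRegressor(random_state=0),
          "tree": DecisionTreeRegressor(max_depth=5, random_state=0),
          "linear": LinearRegression()}
rmse = {}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    rmse[name] = mean_squared_error(y_te, m.predict(X_te)) ** 0.5
print(rmse)   # GBM typically wins on this nonlinear synthetic target
```

As in the study, treating survival time as a continuous regression target lets a single RMSE number rank otherwise dissimilar model families.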
NASA Astrophysics Data System (ADS)
Smart, A. C.
2014-12-01
Governments are increasingly asking for more evidence of the benefits of investing in geospatial data and infrastructure before investing. They are looking for a clearer articulation of the economic, environmental and social benefits than has been possible in the past. Development of techniques has accelerated in the past five years as governments and industry become more involved in the capture and use of geospatial data. However, evaluation practitioners have struggled to answer these emerging questions. The paper explores the types of questions that decision makers are asking and discusses the different approaches and methods that have been used recently to answer them. It explores the need for better business case models. The emerging approaches are then discussed and their attributes reviewed. These include methods of analysing tangible economic benefits, intangible benefits and societal benefits. The paper explores the use of value chain analysis and real options analysis to better articulate the impacts on international competitiveness and how to value the potential benefits of innovations enabled by the geospatial data that is produced. The paper concludes by illustrating the potential for these techniques in current and future decision making.
Development and initial evaluation of a treatment decision dashboard
2013-01-01
Background For many healthcare decisions, multiple alternatives are available with different combinations of advantages and disadvantages across several important dimensions. The complexity of current healthcare decisions thus presents a significant barrier to informed decision making, a key element of patient-centered care. Interactive decision dashboards were developed to facilitate decision making in Management, a field marked by similarly complicated choices. These dashboards utilize data visualization techniques to reduce the cognitive effort needed to evaluate decision alternatives and a non-linear flow of information that enables users to review information in a self-directed fashion. Theoretically, both of these features should facilitate informed decision making by increasing user engagement with and understanding of the decision at hand. We sought to determine if the interactive decision dashboard format can be successfully adapted to create a clinically realistic prototype patient decision aid suitable for further evaluation and refinement. Methods We created a computerized, interactive clinical decision dashboard and performed a pilot test of its clinical feasibility and acceptability using a multi-method analysis. The dashboard summarized information about the effectiveness, risks of side effects and drug-drug interactions, out-of-pocket costs, and ease of use of nine analgesic treatment options for knee osteoarthritis. Outcome evaluations included observations of how study participants utilized the dashboard, questionnaires to assess usability, acceptability, and decisional conflict, and an open-ended qualitative analysis. Results The study sample consisted of 25 volunteers - 7 men and 18 women - with an average age of 51 years. The mean time spent interacting with the dashboard was 4.6 minutes. 
Mean evaluation scores on scales ranging from 1 (low) to 7 (high) were: mechanical ease of use 6.1, cognitive ease of use 6.2, emotional difficulty 2.7, decision-aiding effectiveness 5.9, clarification of values 6.5, reduction in decisional uncertainty 6.1, and provision of decision-related information 6.0. Qualitative findings were similarly positive. Conclusions Interactive decision dashboards can be adapted for clinical use and have the potential to foster informed decision making. Additional research is warranted to more rigorously test the effectiveness and efficiency of patient decision dashboards for supporting informed decision making and other aspects of patient-centered care, including shared decision making. PMID:23601912
2012-01-01
Background Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Methods Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. Results After modification by dropping two indicators that showed poor measures in the measurement models’ quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of ‘transparency’, ‘participation’, ‘scientific rigour’ and ‘reasonableness’. Conclusions The structural equation model was among the first applications of PLS-PM to coverage decision-making. 
It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies. PMID:22856325
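The core PLS-PM estimation loop described above can be sketched for a single path between two composites (a stand-in for a link such as 'transparency' → 'reasonableness'). The synthetic data, loadings, centroid inner-weighting scheme, and mode-A outer estimation below are illustrative assumptions, not the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two latent composites, the first driving the second (all numbers invented).
lv_x = rng.normal(size=n)
lv_y = 0.6 * lv_x + rng.normal(scale=0.8, size=n)

# Three observed indicators per composite.
X = np.column_stack([0.9 * lv_x + rng.normal(scale=0.4, size=n) for _ in range(3)])
Y = np.column_stack([0.9 * lv_y + rng.normal(scale=0.4, size=n) for _ in range(3)])

def standardize(M):
    return (M - M.mean(axis=0)) / M.std(axis=0)

def score(M, w):
    s = M @ w
    return s / s.std()

X, Y = standardize(X), standardize(Y)
wx, wy = np.ones(3), np.ones(3)

# Alternate inner and outer estimation until the weights stabilise.
for _ in range(100):
    sx, sy = score(X, wx), score(Y, wy)
    sign = np.sign(np.corrcoef(sx, sy)[0, 1])
    zx, zy = sign * sy, sign * sx                  # inner proxies (centroid scheme)
    wx_new, wy_new = X.T @ zx / n, Y.T @ zy / n    # outer update, mode A
    if max(np.abs(wx_new - wx).max(), np.abs(wy_new - wy).max()) < 1e-10:
        wx, wy = wx_new, wy_new
        break
    wx, wy = wx_new, wy_new

# Structural (path) coefficient: OLS of the endogenous score on the exogenous one.
sx, sy = score(X, wx), score(Y, wy)
path = float(np.linalg.lstsq(sx[:, None], sy, rcond=None)[0][0])
```

With strong loadings the estimated path coefficient lands near the correlation between the underlying latent variables; real PLS-PM software adds the reliability and validity diagnostics the abstract mentions.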
Fischer, Katharina E
2012-08-02
Dolan, James G
2010-01-01
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients along with health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine “hard data” with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP) PMID:21394218
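Of the methods listed, direct weighting combined with a weighted sum is the simplest to sketch. The weights and 0-10 ratings below are invented for illustration, not clinical recommendations:

```python
# Criteria weights elicited by direct weighting (hypothetical values summing
# to 1), and 0-10 performance ratings for three hypothetical options.
weights = {"effectiveness": 0.4, "side_effects": 0.3, "cost": 0.2, "ease_of_use": 0.1}
options = {
    "drug A": {"effectiveness": 8, "side_effects": 5, "cost": 4, "ease_of_use": 9},
    "drug B": {"effectiveness": 6, "side_effects": 8, "cost": 7, "ease_of_use": 8},
    "drug C": {"effectiveness": 9, "side_effects": 3, "cost": 2, "ease_of_use": 6},
}

def weighted_sum(ratings, weights):
    # Higher ratings are better on every criterion (side_effects and cost
    # are rated as "freedom from side effects" / "affordability").
    return sum(weights[c] * ratings[c] for c in weights)

scores = {name: weighted_sum(r, weights) for name, r in options.items()}
best = max(scores, key=scores.get)
```

The same table feeds the more elaborate methods: AHP derives the weights from pairwise comparisons, and the even swap method trades off criteria two at a time instead of scoring them all at once.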
Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter
2010-01-01
In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques such as discrete event simulation are used to evaluate systems that include queuing or waiting. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers gain an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example.
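As a toy illustration of the interplay between closed formulas and simulation described above, the sketch below simulates an M/M/1 queue via Lindley's recursion and compares the simulated mean wait with the closed-form result. The rates are arbitrary illustrative values, not clinical data:

```python
import random

random.seed(42)
lam, mu = 0.5, 1.0                 # arrival and service rates (illustrative)
rho = lam / mu                     # utilisation
wq_analytic = rho / (mu - lam)     # closed-form M/M/1 mean wait in queue

# The same quantity by simulation, via Lindley's recursion: the next
# customer's wait equals the current wait plus a service time minus the
# next interarrival time, floored at zero.
wait, total, n = 0.0, 0.0, 200_000
for _ in range(n):
    wait = max(0.0, wait + random.expovariate(mu) - random.expovariate(lam))
    total += wait
wq_simulated = total / n
```

For this simple system the simulation only confirms the formula; its value, as the tutorial argues, is that the same event-by-event logic still works when capacities, priorities, or patient pathways make closed formulas unavailable.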
Kearney, Philip E; Carson, Howie J; Collins, Dave
2018-05-01
This paper explores the approaches adopted by high-level field athletics coaches when attempting to refine an athlete's already well-established technique (long and triple jump and javelin throwing). Six coaches, who had all coached multiple athletes to multiple major championships, took part in semi-structured interviews focused upon a recent example of technique refinement. Data were analysed using a thematic content analysis. The coaching tools reported were generally consistent with those advised by the existing literature, focusing on attaining "buy-in", utilising part-practice, restoring movement automaticity and securing performance under pressure. Five of the six coaches reported using a systematic sequence of stages to implement the refinement, although the number and content of these stages varied between them. Notably, however, there were no formal sources of knowledge (e.g., coach education or training) provided to inform coaches' decision making. Instead, coaches' decisions were largely based on experience both within and outside the sporting domain. Data offer a useful stimulus for reflection amongst sport practitioners confronted by the problem of technique refinement. Certainly the limited awareness of existing guidelines on technique refinement expressed by the coaches emphasises a need for further collaborative work by researchers and coach educators to disseminate best practice.
A model of human decision making in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1982-01-01
Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
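The discriminant-analysis step can be sketched as a two-class linear discriminant that turns monitored-process features into an event probability. The features, class means, and equal-prior assumption below are illustrative, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented training data: a 2-D feature vector per observation of a
# monitored process (say, level and trend), labelled by event occurrence.
normal = rng.normal([0.0, 0.0], 0.5, size=(200, 2))
event = rng.normal([1.5, 1.0], 0.5, size=(200, 2))
X = np.vstack([normal, event])
y = np.array([0] * 200 + [1] * 200)

# Linear discriminant analysis with a pooled covariance and equal priors:
# the posterior P(event | x) is a logistic function of a linear score.
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
pooled = (np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)) / 2
w = np.linalg.inv(pooled) @ (mu1 - mu0)
b = -0.5 * (mu0 + mu1) @ w

def p_event(x):
    """Estimated probability that an event underlies feature vector x."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))
```

The resulting probability estimates are exactly the kind of quantity the model then feeds into its attention-allocation stage.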
A decision analysis approach for risk management of near-earth objects
NASA Astrophysics Data System (ADS)
Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.
2014-10-01
Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. 
Sixth, sensitivity analyses are performed in order to examine the impact of uncertainties. Finally, the need for further analysis, data collection, or refinement is determined. The first steps of defining the problem and the objectives are critical to constructing an informative decision analysis. Such steps must be undertaken with participation from experts, decision-makers, and stakeholders (defined here as "decision participants"). The basic problem here can be framed as: “What is the best strategy to manage risk associated with NEOs?” Some high-level objectives might be to minimize: mortality and injuries, damage to critical infrastructure (e.g., power, communications and food distribution), ecosystem damage, property damage, ungrounded media and public speculation, resources expended, and overall cost. Another valuable objective would be to maximize inter-agency/government coordination. Some of these objectives (e.g., “minimize mortality”) are readily quantified (e.g., deaths and injuries averted). Others are less so (e.g., “maximize inter-agency/government coordination”), but these can be scaled. Objectives may be inversely related: e.g., a strategy that minimizes mortality may cost more. They are also unlikely to be weighted equally. Defining objectives and assessing their relative weight and interactions requires early engagement with decision participants. High-level decisions include whether to deflect a NEO, when to deflect, what is the best alternative for deflection/destruction, and disaster management strategies if an impact occurs. 
Important influences include, for example: NEO characteristics (orbital characteristics, diameter, mass, spin and composition), impact probability and location, interval between discovery and projected impact date, interval between discovery and deflection target date, costs of information collection, costs and technological feasibility of deflection alternatives, risks of deflection campaigns, requirements for inter-agency and international cooperation, and timing of informing the public. The analytical aspects of decision analysis center on estimation of the expected value (i.e. utility) of different alternatives. The expected value of an alternative is a function of the probability-weighted consequences, estimated using Bayesian calculations in a decision tree or influence diagram model. The result is a set of expected-value estimates for all alternatives evaluated that enables a ranking; the higher the expected value, the more preferred the alternative. A common way to include resource limitations is by framing the decision analysis in the context of economics (e.g., cost-effectiveness analysis). An important aspect of decision analysis in the NEO risk management case is the ability, known as sensitivity analysis, to examine the effect of parameter uncertainty upon decisions. The simplest way to evaluate uncertainty associated with the information used in a decision analysis is to adjust the input values one at a time (or simultaneously) to examine how the results change. Monte Carlo simulations can be used to adjust the inputs over ranges or distributions of values; statistical means then are used to determine the most influential variables. These techniques yield a measure known as the expected value of imperfect information. This value is highly informative, because it allows the decision-maker with imperfect information to evaluate the impact of using experiments, tests, or data collection (e.g. Earth-based observations, space-based remote sensing, etc.) 
to refine judgments; and indeed to estimate how much should be spent to reduce uncertainty.
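The expected-value ranking and value-of-information ideas above can be illustrated with a deliberately tiny decision tree. All probabilities and utilities below are invented, and the example uses perfect rather than imperfect information for brevity:

```python
# Two NEO-response alternatives, one uncertain impact event, utilities on
# an arbitrary 0-100 scale (all numbers hypothetical).
p_impact = 0.02

# utility[alternative] = (utility if impact occurs, utility if it does not)
utilities = {
    "deflection campaign": (70, 80),   # costly, but limits damage
    "do nothing":          (0, 100),   # free, but catastrophic on impact
}

def expected_value(u):
    u_impact, u_clear = u
    return p_impact * u_impact + (1 - p_impact) * u_clear

ev = {alt: expected_value(u) for alt, u in utilities.items()}
best_alt = max(ev, key=ev.get)

# Expected value of perfect information: the gain from being able to decide
# *after* learning whether the impact will occur.
ev_perfect = (p_impact * max(u[0] for u in utilities.values())
              + (1 - p_impact) * max(u[1] for u in utilities.values()))
evpi = ev_perfect - ev[best_alt]
```

A positive EVPI bounds what observation campaigns are worth: no amount of data collection about the uncertain event can be worth more than this difference.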
Wonodi, C B; Privor-Dumm, L; Aina, M; Pate, A M; Reis, R; Gadhoke, P; Levine, O S
2012-05-01
The decision-making process to introduce new vaccines into national immunization programmes is often complex, involving many stakeholders who provide technical information, mobilize finance, implement programmes and garner political support. Stakeholders may have different levels of interest, knowledge and motivations to introduce new vaccines. Lack of consensus on the priority, public health value or feasibility of adding a new vaccine can delay policy decisions. Efforts to support country-level decision-making have largely focused on establishing global policies and equipping policy makers with the information to support decision-making on new vaccine introduction (NVI). Less attention has been given to understanding the interactions of policy actors and how the distribution of influence affects the policy process and decision-making. Social network analysis (SNA) is a social science technique concerned with explaining social phenomena using the structural and relational features of the network of actors involved. This approach can be used to identify how information is exchanged and who is included or excluded from the process. For this SNA of vaccine decision-making in Nigeria, we interviewed federal and state-level government officials, officers of bilateral and multilateral partner organizations, and other stakeholders such as health providers and the media. Using data culled from those interviews, we performed an SNA in order to map formal and informal relationships and the distribution of influence among vaccine decision-makers, as well as to explore linkages and pathways to stakeholders who can influence critical decisions in the policy process. Our findings indicate a relatively robust engagement of key stakeholders in Nigeria. 
We hypothesized that economic stakeholders and implementers would be important to ensure sustainable financing and strengthen programme implementation, but some economic and implementation stakeholders did not appear centrally on the map; this may suggest a need to strengthen the decision-making processes by engaging these stakeholders more centrally and earlier.
Wellons, John C; Shannon, Chevis N; Holubkov, Richard; Riva-Cambrin, Jay; Kulkarni, Abhaya V; Limbrick, David D; Whitehead, William; Browd, Samuel; Rozzelle, Curtis; Simon, Tamara D; Tamber, Mandeep S; Oakes, W Jerry; Drake, James; Luerssen, Thomas G; Kestle, John
2017-07-01
OBJECTIVE Previous Hydrocephalus Clinical Research Network (HCRN) retrospective studies have shown a 15% difference in rates of conversion to permanent shunts with the use of ventriculosubgaleal shunts (VSGSs) versus ventricular reservoirs (VRs) as temporization procedures in the treatment of hydrocephalus due to high-grade intraventricular hemorrhage (IVH) of prematurity. Further research in the same study line revealed a strong influence of center-specific decision-making on shunt outcomes. The primary goal of this prospective study was to standardize decision-making across centers to determine true procedural superiority, if any, of VSGS versus VR as a temporization procedure in high-grade IVH of prematurity. METHODS The HCRN conducted a prospective cohort study across 6 centers with an approximate 1.5- to 3-year accrual period (depending on center) followed by 6 months of follow-up. Infants with premature birth, who weighed less than 1500 g, had Grade 3 or 4 IVH of prematurity, and had more than 72 hours of life expectancy were included in the study. Based on a priori consensus, decisions were standardized regarding the timing of initial surgical treatment, upfront shunt versus temporization procedure (VR or VSGS), and when to convert a VR or VSGS to a permanent shunt. Physical examination assessment and surgical technique were also standardized. The primary outcome was the proportion of infants who underwent conversion to a permanent shunt. The major secondary outcomes of interest included infection and other complication rates. RESULTS One hundred forty-five premature infants were enrolled and met criteria for analysis. Using the standardized decision rubrics, 28 infants never reached the threshold for treatment, 11 initially received permanent shunts, 4 were initially treated with endoscopic third ventriculostomy (ETV), and 102 underwent a temporization procedure (36 with VSGSs and 66 with VRs). 
The 2 temporization cohorts were similar in terms of sex, race, IVH grade, head (orbitofrontal) circumference, and ventricular size at temporization. There were statistically significant differences noted between groups in gestational age, birth weight, and bilaterality of clot burden that were controlled for in post hoc analysis. By Kaplan-Meier analysis, the 180-day rates of conversion to permanent shunts were 63.5% for VSGS and 74.0% for VR (p = 0.36, log-rank test). The infection rate for VSGS was 14% (5/36) and for VR was 17% (11/66; p = 0.71). The overall compliance rate with the standardized decision rubrics was noted to be 90% for all surgeons. CONCLUSIONS A standardized protocol was instituted across all centers of the HCRN. Compliance was high. Choice of temporization techniques in premature infants with IVH does not appear to influence rates of conversion to permanent ventricular CSF diversion. Once management decisions and surgical techniques are standardized across HCRN sites, thus minimizing center effect, the observed difference in conversion rates between VSGSs and VRs is mitigated.
Scaling Up Decision Theoretic Planning to Planetary Rover Problems
NASA Technical Reports Server (NTRS)
Meuleau, Nicolas; Dearden, Richard; Washington, Rich
2004-01-01
Because of communication limits, planetary rovers must operate autonomously for extended durations. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are not able to address the challenges of future missions because of several apparent limits. Decision theory, on the other hand, provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure, concurrency, and continuous state variables. We describe two techniques currently under development that specifically address these issues and allow scaling up decision-theoretic solution techniques to planetary rover planning problems involving a small number of goals.
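A baseline decision-theoretic solution technique of the kind such work scales up is value iteration over a Markov decision process. The toy rover-flavoured states, dynamics, and rewards below are invented for illustration:

```python
# actions[state][action] = list of (probability, next_state, reward) outcomes.
actions = {
    "at_base": {"drive": [(0.8, "at_rock", -1.0), (0.2, "at_base", -1.0)]},
    "at_rock": {"sample": [(0.9, "sampled", 10.0), (0.1, "at_rock", -2.0)],
                "drive_back": [(1.0, "at_base", -1.0)]},
    "sampled": {},  # terminal: the science goal is achieved
}
gamma = 0.95  # discount factor

def q_value(V, outcomes):
    # One-step lookahead: probability-weighted reward plus discounted value.
    return sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)

# Value iteration: repeatedly back up the best one-step lookahead value.
V = {s: 0.0 for s in actions}
for _ in range(200):
    V = {s: max((q_value(V, o) for o in acts.values()), default=0.0)
         for s, acts in actions.items()}

# Greedy policy with respect to the converged values.
policy = {s: max(acts, key=lambda a: q_value(V, acts[a]))
          for s, acts in actions.items() if acts}
```

The paper's point is precisely that this flat enumeration of states does not survive contact with structure, concurrency, and continuous variables; the sketch shows only the core backup that the scaled-up techniques preserve.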
NASA Astrophysics Data System (ADS)
Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa
2018-03-01
Landfilling remains the most common systematic technique of solid waste disposal in most developed and developing countries. Finding a suitable site for a landfill is a very challenging task. The landfill site selection process aims to identify suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process rigorous and tedious, because it involves processing large amounts of spatial data, rules and regulations from different agencies, and policy from decision makers. Multi-criteria evaluation (MCE) allows conflicting objectives and decision-maker preferences to be incorporated into spatial decision models. This paper analyzes the MCE method of landfill site selection for solid waste management by means of literature reviews and surveys. The study will help decision makers and waste management authorities choose the most effective method when considering landfill site selection.
Khakzad, Nima; Landucci, Gabriele; Reniers, Genserik
2017-09-01
In the present study, we have introduced a methodology based on graph theory and multicriteria decision analysis for cost-effective fire protection of chemical plants subject to fire-induced domino effects. By modeling domino effects in chemical plants as a directed graph, the graph centrality measures such as out-closeness and betweenness scores can be used to identify the installations playing a key role in initiating and propagating potential domino effects. It is demonstrated that active fire protection of installations with the highest out-closeness score and passive fire protection of installations with the highest betweenness score are the most effective strategies for reducing the vulnerability of chemical plants to fire-induced domino effects. We have employed a dynamic graph analysis to investigate the impact of both the availability and the degradation of fire protection measures over time on the vulnerability of chemical plants. The results obtained from the graph analysis can further be prioritized using multicriteria decision analysis techniques such as the method of reference point to find the most cost-effective fire protection strategy. © 2016 Society for Risk Analysis.
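The two graph measures named above can be computed without special-purpose software. The sketch below builds a small hypothetical escalation graph (an arc u → v meaning "a fire in u can escalate to v"; the plant layout is invented) and computes out-closeness and, via Brandes' algorithm, betweenness:

```python
from collections import deque

# Hypothetical plant: arcs are possible fire-escalation paths.
graph = {
    "tank1": ["tank2", "pump"],
    "tank2": ["tank3"],
    "pump": ["tank3", "tank4"],
    "tank3": ["tank4"],
    "tank4": [],
}

def bfs_dist(g, s):
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in g[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def out_closeness(g, s):
    # Reachability-scaled closeness: likely escalation *initiators* reach
    # more of the plant in fewer steps and score higher.
    d = bfs_dist(g, s)
    reached = len(d) - 1
    if reached == 0:
        return 0.0
    return (reached / (len(g) - 1)) * (reached / sum(d.values()))

def betweenness(g):
    # Brandes' algorithm for unweighted digraphs: likely escalation
    # *propagators* lie on many shortest paths and score higher.
    bc = {v: 0.0 for v in g}
    for s in g:
        stack, preds = [], {v: [] for v in g}
        sigma = {v: 0 for v in g}; sigma[s] = 1
        dist = {v: -1 for v in g}; dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft(); stack.append(u)
            for v in g[u]:
                if dist[v] < 0:
                    dist[v] = dist[u] + 1; q.append(v)
                if dist[v] == dist[u] + 1:
                    sigma[v] += sigma[u]; preds[v].append(u)
        delta = {v: 0.0 for v in g}
        while stack:
            w = stack.pop()
            for u in preds[w]:
                delta[u] += sigma[u] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

initiator = max(graph, key=lambda v: out_closeness(graph, v))
bc = betweenness(graph)
propagator = max(bc, key=bc.get)
```

On the study's reading, the top-closeness node is the prime candidate for active protection and the top-betweenness node for passive protection; dynamic graph analysis then re-evaluates these scores as protection measures degrade over time.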
Role of pharmacoeconomic analysis in R&D decision making: when, where, how?
Miller, Paul
2005-01-01
Pharmacoeconomics is vitally important to drug manufacturers in terms of communicating to external decision-makers (payers, prescribers, patients) the value of their products, achieving regulatory and reimbursement approval and contributing to commercial success. Since development of new drugs is long, costly and risky, and decisions must be made about how to allocate considerable research and development (R&D) resources, pharmacoeconomics also has an essential role informing internal decision-making (within a company) during drug development. The use of pharmacoeconomics in early development phases is likely to enhance the efficiency of R&D resource use and also provide a solid foundation for communicating product value to external decision-makers further downstream, increasing the likelihood of regulatory (reimbursement) approval and commercial success. This paper makes the case for using pharmacoeconomic analyses earlier in the development process and outlines five techniques (clinical trial simulation [CTS], option pricing [OP], investment appraisal [IA], threshold analysis [TA] and value of information [VOI] analysis) that can provide useful input into the design of clinical development programmes, portfolio management and optimal pricing strategy. CTS can estimate efficacy and tolerability profiles before clinical data are available. OP can show the value of different clinical programme designs, sequencing of studies and stop decisions. IA can compare expected net present value (NPV) of different product profiles or study designs. TA can be used to understand development drug profile requirements given partial data. VOI can assist risk management by quantifying uncertainty and assessing the economic viability of gathering further information on the development drug. No amount of pharmacoeconomic data can make a bad drug good; what it can do is enhance the drug developer's understanding of the characteristics of that drug.
Decision-making, in light of this information, is likely to be better than that without it, whether it leads to faster termination of uneconomic projects or the allocation of more appropriate resources to attractive projects.
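The investment-appraisal technique mentioned above can be compressed into a risk-adjusted expected NPV across development phases. Every figure below is invented and the per-phase discounting is deliberately simplified:

```python
def expected_npv(phases, launch_value, discount=0.9):
    """Risk-adjusted NPV: each phase's cost is paid only if it is reached,
    and revenue accrues only if every phase succeeds."""
    npv, p_reach, df = 0.0, 1.0, 1.0
    for _name, p_success, cost in phases:
        npv -= p_reach * df * cost   # cost incurred when the phase starts
        p_reach *= p_success         # must succeed to reach the next phase
        df *= discount               # crude per-phase discount factor
    return npv + p_reach * df * launch_value

# (name, probability of success, cost in $m) -- all figures invented.
base_plan = [("phase I", 0.6, 20), ("phase II", 0.4, 60), ("phase III", 0.7, 150)]
enpv = expected_npv(base_plan, launch_value=800)

# Threshold-style question: how much would a better phase II change things?
better_ph2 = [("phase I", 0.6, 20), ("phase II", 0.6, 60), ("phase III", 0.7, 150)]
enpv_better = expected_npv(better_ph2, launch_value=800)
```

Comparing such expected NPVs across candidate designs is exactly the portfolio question the paper assigns to investment appraisal, with threshold and VOI analyses probing which of the uncertain inputs most deserves further data.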
Vista goes online: Decision-analytic systems for real-time decision-making in mission control
NASA Technical Reports Server (NTRS)
Barry, Matthew; Horvitz, Eric; Ruokangas, Corinne; Srinivas, Sampath
1994-01-01
The Vista project has centered on the use of decision-theoretic approaches for managing the display of critical information relevant to real-time operations decisions. The Vista-I project originally developed a prototype of these approaches for managing flight control displays in the Space Shuttle Mission Control Center (MCC). The follow-on Vista-II project integrated these approaches into a workstation program, which is currently being certified for use in the MCC. To our knowledge, this will be the first application of automated decision-theoretic reasoning techniques for real-time spacecraft operations. We shall describe the development and capabilities of the Vista-II system, and provide an overview of the application of decision-theoretic reasoning techniques to the problems of managing the complexity of flight controller displays. We discuss the relevance of the Vista techniques within the MCC decision-making environment, focusing on the problems of detecting and diagnosing spacecraft electromechanical subsystem component failures with limited information, and the problem of determining what control actions should be taken in high-stakes, time-critical situations in response to a diagnosis performed under uncertainty. Finally, we shall outline our current research directions for follow-on projects.
Application of Multi-Criteria Decision Making (MCDM) Technique for Gradation of Jute Fibres
NASA Astrophysics Data System (ADS)
Choudhuri, P. K.
2014-12-01
Multi-Criteria Decision Making (MCDM) is a branch of Operations Research (OR) with a comparatively short history of about 40 years. It is widely used in engineering, banking, and policy making, and can also be applied to everyday decisions such as selecting a car to purchase or choosing a bride or groom. Various MCDM methods, namely the Weighted Sum Model (WSM), Weighted Product Model (WPM), Analytic Hierarchy Process (AHP), Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Elimination and Choice Translating Reality (ELECTRE), are available to solve decision-making problems, each with its own limitations; it is very difficult to decide which MCDM method is best. MCDM methods are promising quantitative approaches for solving decision problems involving a finite number of alternatives and criteria. Very few research works in textiles have applied this technique, particularly where choosing among several alternatives on the basis of conflicting criteria is the central problem. Gradation of jute fibres on the basis of criteria such as strength, root content, defects, colour, density and fineness is an important task. The MCDM technique provides ample scope for grading jute fibres, or ranking several varieties, with a particular objective in view and on the basis of selected criteria and their relative weightage. The present paper explores the application of the multiplicative AHP method to determine the quality values of selected jute fibres on the basis of the above criteria and to rank them accordingly. A good agreement in ranking is observed between the existing Bureau of Indian Standards (BIS) grading and the proposed method.
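The multiplicative aggregation at the heart of the proposed approach can be sketched as a weighted product model. The criterion weights and fibre ratings below are invented for illustration, not the paper's data:

```python
# Hypothetical criterion weights (summing to 1) and 1-9 ratings per lot,
# where higher ratings always mean better quality on that criterion.
weights = {"strength": 0.35, "root_content": 0.2, "defects": 0.2,
           "colour": 0.15, "fineness": 0.1}
fibres = {
    "lot A": {"strength": 7, "root_content": 6, "defects": 5, "colour": 8, "fineness": 6},
    "lot B": {"strength": 5, "root_content": 8, "defects": 7, "colour": 6, "fineness": 7},
    "lot C": {"strength": 8, "root_content": 5, "defects": 6, "colour": 7, "fineness": 5},
}

def wpm_score(ratings, weights):
    # Multiplicative aggregation: score = product of rating_j ** weight_j.
    # Unlike a weighted sum, a very poor rating on any criterion drags the
    # whole score down sharply.
    score = 1.0
    for c, w in weights.items():
        score *= ratings[c] ** w
    return score

ranking = sorted(fibres, key=lambda f: wpm_score(fibres[f], weights), reverse=True)
```

In the multiplicative AHP proper, both the weights and the ratings would be derived from pairwise-comparison matrices rather than assigned directly as here.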
Computational intelligence techniques for biological data mining: An overview
NASA Astrophysics Data System (ADS)
Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari
2014-10-01
Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted raw biological data gathered from various genome-sequencing projects. These techniques are proving much more effective at overcoming the limitations of traditional in-vitro experiments on the constantly increasing volume of sequence data. The most critical problems that have caught researchers' attention include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, and analysis of microarray gene-expression data. To solve these problems, various machine-learning classification and clustering techniques have been used extensively in the published literature, including neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-means, K-NN, SVM, rough-set classifiers, decision trees and HMM-based algorithms. The major difficulties in applying these algorithms lie in the limitations of existing feature encoding and selection methods: extracting the best features, increasing classification accuracy, and decreasing the running-time overheads of the learning algorithms. This research would be potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.
Stress-induced changes in human decision-making are reversible.
Soares, J M; Sampaio, A; Ferreira, L M; Santos, N C; Marques, F; Palha, J A; Cerqueira, J J; Sousa, N
2012-07-03
Appropriate decision-making relies on the ability to shift between different behavioral strategies according to the context in which decisions are made. A cohort of subjects exposed to prolonged stress, and respective gender- and age-matched controls, performed an instrumental behavioral task to assess their decision-making strategies. The stressed cohort was reevaluated after a 6-week stress-free period. The behavioral analysis was complemented by a functional magnetic resonance imaging (fMRI) study to detect the patterns of activation in corticostriatal networks ruling goal-directed and habitual actions. Using structural MRI, the volumes of the main cortical and subcortical regions implicated in instrumental behavior were determined. Here we show that chronic stress biases decision-making strategies in humans toward habits, as choices of stressed subjects become insensitive to changes in outcome value. Using functional imaging techniques, we demonstrate that prolonged exposure to stress in humans causes an imbalanced activation of the networks that govern decision processes, shifting activation from the associative to the sensorimotor circuits. These functional changes are paralleled by atrophy of the medial prefrontal cortex and the caudate, and by an increase in the volume of the putamina. Importantly, a longitudinal assessment of the stressed individuals showed that both the structural and functional changes triggered by stress are reversible and that decisions become again goal-directed.
Williamson, J; Ranyard, R; Cuthbert, L
2000-05-01
This study is an evaluation of a process-tracing method developed for naturalistic decisions, in this case a consumer choice task. The method is based on Huber et al.'s (1997) Active Information Search (AIS) technique, but develops it by having respondents' questions answered in spoken rather than written form, and by including think-aloud instructions. The technique is used within a conversation-based situation, rather than the respondent thinking aloud 'into an empty space', as is conventional in think-aloud techniques. The method yields a concurrent verbal protocol as respondents make their decisions, and a retrospective report in the form of a post-decision summary. The method was found to be virtually non-reactive relative to think-aloud alone, although the variable Preliminary Attribute Elicitation showed some evidence of reactivity. This was a methodological evaluation, and as such the data reported are essentially descriptive. Nevertheless, the data obtained indicate that the method can produce information about decision processes that could have theoretical importance for evaluating models of decision-making.
A decision-based perspective for the design of methods for systems design
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.
1989-01-01
Topics covered include the organization of material, a definition of decision-based design, a hierarchy of decision-based design, the decision support problem technique, a conceptual model for designs that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions.
The design of aircraft using the decision support problem technique
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Marinopoulos, Stergios; Jackson, David M.; Shupe, Jon A.
1988-01-01
The Decision Support Problem Technique for unified design, manufacturing and maintenance is being developed at the Systems Design Laboratory at the University of Houston. This involves the development of a domain-independent method (and the associated software) that can be used to process domain-dependent information and thereby provide support for human judgment. In a computer assisted environment, this support is provided in the form of optimal solutions to Decision Support Problems.
Teaching ethical analysis in occupational therapy.
Haddad, A M
1988-05-01
Ethical decision making is a cognitive skill requiring education in ethical principles and an understanding of specific ethical issues. It is also a psychodynamic process involving personalities, values, opinions, and perceptions. This article proposes the use of case studies and role-playing techniques in teaching ethics in occupational therapy to supplement conventional methods of presenting ethical theories and principles. These two approaches invite students to discuss and analyze crucial issues in occupational therapy from a variety of viewpoints. The methodology for developing case studies and role-playing exercises is discussed. The techniques are evaluated and their application to the teaching of ethics is examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harold S. Blackman; Ronald Boring; Julie L. Marble
This panel will discuss what new directions are necessary to maximize the usefulness of HRA techniques across different areas of application. HRA has long been a part of Probabilistic Risk Assessment in the nuclear industry, as it offers a superior standard for risk-based decision-making. These techniques are continuing to be adopted by other industries, including oil & gas, cybersecurity, and aviation. Each participant will present his or her ideas concerning industry needs, followed by a discussion of what research is needed and of the necessity of achieving cross-industry collaboration.
Social impact analysis: monetary valuation
Wainger, Lisa A.; Johnston, Robert J.; Bagstad, Kenneth J.; Casey, Frank; Vegh, Tibor
2014-01-01
This section provides basic guidance for using and conducting economic valuation, including criteria for judging whether valuation is appropriate for supporting decisions. It provides an introduction to the economic techniques used to measure changes in social welfare and describes which methods may be most appropriate for use in valuing particular ecosystem services. Rather than providing comprehensive valuation instructions, it directs readers to additional resources. More generally, it establishes that the valuation of ecosystem services is grounded in a long history of non-market valuation and discusses how ecosystem services valuation can be conducted within established economic theory and techniques.
Systems Engineering Techniques for ALS Decision Making
NASA Technical Reports Server (NTRS)
Rodriquez, Luis F.; Drysdale, Alan E.; Jones, Harry; Levri, Julie A.
2004-01-01
The Advanced Life Support (ALS) Metric is the predominant tool for predicting the cost of ALS systems. Metric goals for the ALS Program are daunting, requiring a threefold improvement in the ALS Metric by 2010. Compounding the problem, the rate at which new ALS technologies reach the maturity required for consideration in the ALS Metric, and the rate at which new configurations are developed, are slow; this limits the search space, and the ALS Metric goals may remain elusive. This paper is a sequel to a paper published in the proceedings of the 2003 ICES conference entitled "Managing to the metric: an approach to optimizing life support costs," which concluded that the largest contributors to the ALS Metric should be targeted by ALS researchers and management for maximum metric reductions. Certainly, these areas offer large potential benefits to future ALS missions; however, the ALS Metric is not the only decision-making tool available to the community. To facilitate decision-making within the ALS community, a combination of metrics should be utilized: the Equivalent System Mass (ESM)-based ALS Metric, but also those available through techniques such as life-cycle costing and careful consideration of the sensitivity of the assumed models and data. A lack of data is often cited as the reason these techniques are not utilized. An existing database development effort within the ALS community, known as OPIS, may provide the opportunity to collect the information needed to enable the proposed systems analyses. A review of these additional analysis techniques is provided, focusing on the data necessary to enable them. The discussion concludes by proposing how the data may be utilized by analysts in the future.
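The Equivalent System Mass aggregation mentioned above folds volume, power, cooling, and crew-time costs into a single mass figure via equivalency factors. A hedged sketch of that aggregation; all inputs and equivalency-factor values are invented for illustration, not taken from the ALS Program documents:

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crewtime_hr_per_yr, duration_yr,
                           veq, peq, ceq, cteq):
    """ESM-style aggregation: convert each resource demand to an
    equivalent mass via its equivalency factor, then sum.
    (All factor values used below are illustrative placeholders.)"""
    return (mass_kg
            + volume_m3 * veq                       # kg per m^3
            + power_kw * peq                        # kg per kW
            + cooling_kw * ceq                      # kg per kW rejected
            + crewtime_hr_per_yr * duration_yr * cteq)  # kg per crew-hour

# Hypothetical subsystem on a 2-year mission
esm = equivalent_system_mass(100, 2.0, 1.5, 1.5, 50, 2,
                             veq=9.16, peq=237, ceq=60, cteq=0.4)
```

Comparing such ESM totals across candidate technologies is the basic move behind the metric; the paper's point is that this figure should be supplemented with life-cycle costing and sensitivity analysis.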
Material selection and assembly method of battery pack for compact electric vehicle
NASA Astrophysics Data System (ADS)
Lewchalermwong, N.; Masomtob, M.; Lailuck, V.; Charoenphonphanich, C.
2018-01-01
Battery packs are the key component in electric vehicles (EVs); their main costs are the battery cells and the assembly processes. The battery cell is priced by battery manufacturers, while the assembly cost depends on the battery pack design. Battery pack designers want the overall cost to be as low as possible while still achieving high performance and safety. Material selection and assembly method, as well as component design, are very important in determining the cost-effectiveness of battery modules and battery packs. This work therefore presents a decision matrix that can aid the decision-making process for component materials and assembly methods in battery module and battery pack design. The aim of this study is to incorporate architecture analysis into decision-matrix methods by capturing best practices for conducting design-architecture analysis, taking full account of the key design components critical to efficient and effective development of the designs. The methodology also considers the impacts of choice alternatives along multiple dimensions. Various alternatives for battery pack materials and assembly techniques are evaluated, and some sample costs are presented. Because the battery pack contains many components, only selected components, such as the positive busbar and Z busbar, are used in this paper to illustrate the decision-matrix methods.
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system is analyzed using the available data (containing vagueness, uncertainty, etc.). The involved uncertainties are quantified through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With the fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions, so a decision-maker cannot suggest any specific, influential managerial strategy to prevent unexpected failures and thereby improve complex-system performance. To overcome this problem, the present study utilizes a hybridized technique: fuzzy set theory is used to quantify uncertainties, a fault tree for system modeling, the lambda-tau method to formulate mathematical expressions for the failure/repair rates of the system, and a genetic algorithm to solve the resulting nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow exponential distributions, i.e., have constant failure rates. A sensitivity analysis is also performed, and the impact on system mean time between failures (MTBF) of varying the other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve system performance.
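The data-fuzzification step described above can be illustrated with a minimal triangular-fuzzy-number class: an alpha-cut returns the interval of values with membership at least alpha, and interval arithmetic then propagates the spread of a failure rate into an MTBF interval. All numbers below are invented stand-ins for expert-suggested spreads:

```python
class TriangularFuzzy:
    """Triangular fuzzy number (low, mode, high), e.g. a failure rate
    with expert-suggested spreads (values below are illustrative)."""
    def __init__(self, low, mode, high):
        self.low, self.mode, self.high = low, mode, high

    def alpha_cut(self, alpha):
        """Interval of values whose membership is at least alpha."""
        return (self.low + alpha * (self.mode - self.low),
                self.high - alpha * (self.high - self.mode))

# Fuzzified failure rate (per hour) with a +/-15% spread
lam = TriangularFuzzy(0.85e-4, 1.0e-4, 1.15e-4)
lo, hi = lam.alpha_cut(0.0)           # widest interval (alpha = 0)
mtbf_interval = (1.0 / hi, 1.0 / lo)  # MTBF = 1/lambda; bounds swap
```

The wide MTBF interval produced this way is exactly the "wide range of predictions" the abstract complains about; the paper's hybridized technique uses a genetic algorithm to tighten such bounds.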
Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem
2017-01-01
The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning, to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligence techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligence techniques in leukaemia and to categorize these studies based on leukaemia type and also the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice.
Fertility preservation for social indications: a cost-based decision analysis.
Hirshfeld-Cytron, Jennifer; Grobman, William A; Milad, Magdy P
2012-03-01
Age-related infertility remains a problem that assisted reproductive techniques (ART) have limited ability to overcome. Correspondingly, because an increasing number of women are choosing to delay childbearing, fertility preservation strategies, initially intended for patients undergoing gonadotoxic therapies, are being applied to this group of healthy women. Studies supporting the effectiveness of this practice are lacking. Using decision analytic techniques, we compared the cost-effectiveness, defined as the cost per live birth, of three strategies for women planning to delay childbearing until age 40: oocyte cryopreservation at age 25, ovarian tissue cryopreservation (OTC) at age 25, and no assisted reproduction until spontaneous conception had been attempted. In this analysis, the strategy of foregoing fertility preservation at age 25 and choosing ART only after not spontaneously conceiving at age 40 was the most cost-effective option. OTC was dominated by the other strategies. Sensitivity analyses demonstrated the robustness of the model; in no analysis was OTC undominated by oocyte cryopreservation. Increasing the cost of an IVF cycle beyond $22,000 was the only situation in which oocyte cryopreservation was the preferred strategy. Neither oocyte cryopreservation nor OTC appears to be cost-effective under current circumstances for otherwise healthy women planning delayed childbearing. This analysis should give pause to the current practice of offering fertility preservation based only on the desire for delayed childbearing. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Simmons, T; Goodburn, B; Singhrao, S K
2016-01-01
This feasibility study was undertaken to describe and record the histological characteristics of burnt and unburnt cranial bone fragments from human and non-human bones. Reference series of fully mineralized, transverse sections of cranial bone, from all variables and specimen states, were prepared by manual cutting and semi-automated grinding and polishing methods. A photomicrograph catalogue reflecting differences in burnt and unburnt bone from human and non-humans was recorded and qualitative analysis was performed using an established classification system based on primary bone characteristics. The histomorphology associated with human and non-human samples was, for the main part, preserved following burning at high temperature. Clearly, fibro-lamellar complex tissue subtypes, such as plexiform or laminar primary bone, were only present in non-human bones. A decision tree analysis based on histological features provided a definitive identification key for distinguishing human from non-human bone, with an accuracy of 100%. The decision tree for samples where burning was unknown was 96% accurate, and multi-step classification to taxon was possible with 100% accuracy. The results of this feasibility study strongly suggest that histology remains a viable alternative technique if fragments of cranial bone require forensic examination in both burnt and unburnt states. The decision tree analysis may provide an additional but vital tool to enhance data interpretation. Further studies are needed to assess variation in histomorphology taking into account other cranial bones, ontogeny, species and burning conditions. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Basye, Austin T.
A matrix element method analysis of the Standard Model Higgs boson, produced in association with two top quarks and decaying in the lepton-plus-jets channel, is presented. Based on 20.3 fb⁻¹ of √s = 8 TeV data produced at the Large Hadron Collider and collected by the ATLAS detector, this analysis utilizes multiple advanced techniques to search for ttH signatures with a 125 GeV Higgs boson decaying to two b-quarks. After categorizing selected events based on their jet and b-tag multiplicities, signal-rich regions are analyzed using the matrix element method. The resulting variables are then propagated to two parallel multivariate analyses utilizing Neural Networks and Boosted Decision Trees, respectively. As no significant excess is found, an observed (expected) limit of 3.4 (2.2) times the Standard Model cross-section is determined at 95% confidence, using the CLs method, for the Neural Network analysis. For the Boosted Decision Tree analysis, an observed (expected) limit of 5.2 (2.7) times the Standard Model cross-section is determined at 95% confidence, using the CLs method. Corresponding unconstrained fits of the Higgs boson signal strength to the observed data yield measured ratios of signal cross-section to Standard Model prediction of μ = 1.2 ± 1.3 (total) ± 0.7 (stat.) for the Neural Network analysis and μ = 2.9 ± 1.4 (total) ± 0.8 (stat.) for the Boosted Decision Tree analysis.
Using Decision Trees for Estimating Mode Choice of Trips in Buca-Izmir
NASA Astrophysics Data System (ADS)
Oral, L. O.; Tecim, V.
2013-05-01
Decision makers develop transportation plans and models to provide sustainable transport systems in urban areas. Mode choice is one of the stages in transportation modelling, and data mining techniques can discover the factors affecting it. These techniques can be applied with a knowledge-process approach. In this study, a data mining process model is applied to determine the factors affecting mode choice with decision tree techniques, considering individual trip behaviours from household survey data collected within the Izmir Transportation Master Plan. From this perspective, the transport mode choice problem is solved for a case in the district of Buca, Izmir, Turkey with the CRISP-DM knowledge process model.
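The core of decision-tree induction for mode choice is the impurity-minimizing split: the feature and threshold that best separate the chosen modes surface as the most influential factor. A self-contained sketch of a single Gini-based split; the trip records are invented, not the Izmir survey data:

```python
# Minimal Gini-impurity decision stump, sketching how tree induction
# can surface the factor that best separates mode choices.

def gini(labels):
    """Binary Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(X, y):
    """Return (feature, threshold) minimizing the weighted Gini impurity."""
    best = (None, None, float("inf"))
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]

# features: [household income band (1-5), trip distance (km)]; label 1 = car
X = [[1, 2], [1, 8], [2, 3], [4, 6], [5, 4], [5, 12]]
y = [0, 0, 0, 1, 1, 1]
feature, threshold = best_split(X, y)  # income separates car users here
```

A full CART-style tree simply applies this split search recursively; in a mode-choice study the first few splits identify the dominant behavioural factors.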
A data analysis expert system for large established distributed databases
NASA Technical Reports Server (NTRS)
Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick
1987-01-01
A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query-language vocabulary and data-analysis heuristics; and significant artificial intelligence data-analysis heuristic techniques that allow the system to become progressively and automatically more useful.
Outline of cost-benefit analysis and a case study
NASA Technical Reports Server (NTRS)
Kellizy, A.
1978-01-01
The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
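The discounting at the heart of cost-benefit analysis can be shown in a few lines. A sketch with an invented project cash-flow stream (not the report's solar-cell case study); note that the project looks profitable undiscounted but not at a 10% discount rate, which is precisely why discounting matters for management decisions:

```python
def npv(cashflows, rate):
    """Net present value of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative project: $1000 up-front cost, $400/yr benefit for 3 years
project = [-1000, 400, 400, 400]
value = npv(project, 0.10)  # slightly negative at 10%, though sum is +200
```

Comparing NPVs (or benefit-cost ratios) of alternative approaches at a common discount rate is the basic comparison the report walks a non-economist through.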
Maximizing the Predictive Value of Production Rules
1988-08-31
References include Clancey, W., "Heuristic Classification," Artificial Intelligence 27 (1985) 289-350, and Crawford, S., "Extensions to the CART..." (1988). Contents cover optimality, comparative analysis for normally distributed data, and comparison with alternative machine learning methods; results are reported on data sets previously analyzed in the AI literature using alternative classification techniques.
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
1982-12-01
VAPE was modeled to determine this launch rate and to determine the processing times for an Orbiter at VAPE. This information was then used in the year (node 79 and activity ?1). ETs are then selected to be sent to either KSC or VAPE (node 80). This decision is made (using Ur 8) on the basis of
Identifying environmental features for land management decisions
NASA Technical Reports Server (NTRS)
1982-01-01
The major accomplishments of the Center for Remote Sensing and Cartography are outlined. The analysis and inventory of the Parker Mountain rangeland and the use of multitemporal data to study aspen succession stages are discussed. New and continuing projects are also described including a Salt Lake County land use study, Wasatch-Cache riparian study, and Humboldt River riparian habitat study. Finally, progress in digital processing techniques is reported.
Methods and decision making on a Mars rover for identification of fossils
NASA Technical Reports Server (NTRS)
Eberlein, Susan; Yates, Gigi
1989-01-01
A system for automated fusion and interpretation of image data from multiple sensors, including multispectral data from an imaging spectrometer, is being developed. Classical artificial intelligence techniques and artificial neural networks are employed to make real-time decisions based on current input and known scientific goals. Emphasis is placed on identifying minerals that could indicate past life activity or an environment supportive of life. Multispectral data can be used for geological analysis because different minerals have characteristic spectral reflectances in the visible and near-infrared range. Classification of each spectrum into a broad class, based on overall spectral shape and the locations of absorption bands, is possible in real time using artificial neural networks. The goal of the system is twofold: multisensor and multispectral data must be interpreted in real time, so that potentially interesting sites can be flagged and investigated in more detail while the rover is near them; and the sensed data must be reduced to the most compact form possible without loss of crucial information. Autonomous decision making will allow a rover to achieve maximum scientific benefit from a mission. Both a classical rule-based approach and a decision neural network for making real-time choices are being considered. Neural nets may work well for adaptive decision making. A neural net can be trained to work in two steps: first, the actual input state is mapped to the closest of a number of memorized states; then, after weighing the importance of various input parameters, the net produces an output decision based on the matched memory state. Real-time, autonomous image-data analysis and decision-making capabilities are required to achieve maximum scientific benefit from a rover mission. The system under development will enhance the chances of identifying fossils, or environments capable of supporting life, on Mars.
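The two-step scheme described above (match the input to the closest memorized state, then decide from that state) can be sketched with nearest-prototype matching. The four-band spectra, class prototypes, and actions below are invented stand-ins, not the mission's actual classes:

```python
# Two-step decision sketch: (1) map the sensed spectrum to the nearest
# memorized class prototype, (2) map that class to an action.

def nearest_prototype(x, prototypes):
    """Return the name of the prototype closest to x (squared distance)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(prototypes, key=lambda name: dist2(x, prototypes[name]))

prototypes = {  # crude 4-band reflectance prototypes (illustrative)
    "carbonate": [0.6, 0.5, 0.3, 0.2],
    "basalt":    [0.1, 0.1, 0.2, 0.2],
}
actions = {"carbonate": "flag_site", "basalt": "continue"}

observed = [0.55, 0.45, 0.35, 0.25]
decision = actions[nearest_prototype(observed, prototypes)]
```

A trained network replaces the explicit distance computation with learned weights, but the decision structure, memorized state first and action second, is the same.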
Leadership Strategies for Meeting New Challenges. Marketing.
ERIC Educational Resources Information Center
Knox, Alan B., Ed.
1982-01-01
Illustrates concepts and techniques from marketing and related fields that can enrich continuing education administrators' decision making about marketing: marketing by nonprofit organizations, promotional techniques, highlights from a handbook on the use of direct mail, and the use of decision trees. (CT)
Gender classification of running subjects using full-body kinematics
NASA Astrophysics Data System (ADS)
Williams, Christina M.; Flora, Jeffrey B.; Iftekharuddin, Khan M.
2016-05-01
This paper proposes a novel automated gender classification of subjects engaged in running activity. The machine learning techniques include a preprocessing step using principal component analysis, followed by classification with linear discriminant analysis, nonlinear support vector machines, and a decision stump with AdaBoost. The dataset consists of 49 subjects (25 males, 24 females, two trials each), all equipped with approximately 80 retroreflective markers. The trials capture the subject's entire body moving unrestrained through a capture volume at a self-selected running speed, producing highly realistic data. The classification accuracy using leave-one-out cross-validation for the 49 subjects improves from 66.33% with linear discriminant analysis to 86.74% with the nonlinear support vector machine. Results are further improved to 87.76% by implementing a nonlinear decision stump with an AdaBoost classifier. The experimental findings suggest that linear classification approaches are inadequate for classifying gender in a large dataset of subjects running in a moderately uninhibited environment.
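The pipeline described above (PCA preprocessing, classification, leave-one-out cross-validation) can be sketched end to end with numpy. This uses a nearest-centroid classifier as a simple stand-in for the paper's LDA/SVM classifiers, on synthetic two-class data rather than the marker dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for marker kinematics: 20 subjects x 6 features,
# two classes with shifted means (real motion-capture data is far richer).
X = np.vstack([rng.normal(0.0, 1.0, (10, 6)),
               rng.normal(2.5, 1.0, (10, 6))])
y = np.array([0] * 10 + [1] * 10)

def pca_project(train, test, k):
    """Fit PCA on the training fold only, project both folds to k dims."""
    mu = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mu, full_matrices=False)
    return (train - mu) @ vt[:k].T, (test - mu) @ vt[:k].T

def nearest_centroid(train, labels, test):
    """Predict the class whose training centroid is closest to test."""
    centroids = [train[labels == c].mean(axis=0) for c in (0, 1)]
    return int(np.argmin([np.linalg.norm(test - c) for c in centroids]))

# Leave-one-out cross-validation: refit PCA per fold to avoid leakage
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    tr, te = pca_project(X[mask], X[i:i + 1], k=2)
    correct += nearest_centroid(tr, y[mask], te[0]) == y[i]
accuracy = correct / len(y)
```

Refitting the projection inside each fold, as done here, is the detail that keeps leave-one-out estimates honest.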
DeBrew, Jacqueline Kayler; Lewallen, Lynne Porter
2014-04-01
Making the decision to pass or to fail a nursing student is difficult for nurse educators, yet one that all educators face at some point in time. To make this decision, nurse educators draw from their past experiences and personal reflections on the situation. Using the qualitative method of critical incident technique, the authors asked educators to describe a time when they had to make a decision about whether to pass or fail a student in the clinical setting. The findings describe student and faculty factors important in clinical evaluation decisions, demonstrate the benefits of reflective practice to nurse educators, and support the utility of critical incident technique not only as research methodology, but also as a technique for reflective practice. Copyright © 2013 Elsevier Ltd. All rights reserved.
Hosseinzade, Zeinab; Pagsuyoin, Sheree A; Ponnambalam, Kumaraswamy; Monem, Mohammad J
2017-12-01
The stiff competition for water between agriculture and non-agricultural production sectors makes it necessary to have effective management of irrigation networks in farms. However, the process of selecting flow control structures in irrigation networks is highly complex and involves different levels of decision makers. In this paper, we apply multi-attribute decision making (MADM) methodology to develop a decision analysis (DA) framework for evaluating, ranking and selecting check and intake structures for irrigation canals. The DA framework consists of identifying relevant attributes for canal structures, developing a robust scoring system for alternatives, identifying a procedure for data quality control, and identifying a MADM model for the decision analysis. An application is illustrated through an analysis for automation purposes of the Qazvin irrigation network, one of the oldest and most complex irrigation networks in Iran. A survey questionnaire designed based on the decision framework was distributed to experts, managers, and operators of the Qazvin network and to experts from the Ministry of Power in Iran. Five check structures and four intake structures were evaluated. A decision matrix was generated from the average scores collected from the survey, and was subsequently solved using TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method. To identify the most critical structure attributes for the selection process, optimal attribute weights were calculated using Entropy method. For check structures, results show that the duckbill weir is the preferred structure while the pivot weir is the least preferred. Use of the duckbill weir can potentially address the problem with existing Amil gates where manual intervention is required to regulate water levels during periods of flow extremes. For intake structures, the Neyrpic® gate and constant head orifice are the most and least preferred alternatives, respectively. 
Some advantages of the Neyrpic® gate are ease of operation and capacity to measure discharge flows. Overall, the application to the Qazvin irrigation network demonstrates the utility of the proposed DA framework in selecting appropriate structures for regulating water flows in irrigation canals. This framework systematically aids the decision process by capturing decisions made at various levels (individual farmers to high-level management). It can be applied to other cases where a new irrigation network is being designed, or where changes in irrigation structures need to be identified to improve flow control in existing networks. Copyright © 2017 Elsevier B.V. All rights reserved.
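As a rough illustration of the entropy-weighting and TOPSIS steps described above, here is a minimal Python sketch; the decision matrix below is hypothetical, not the Qazvin survey data:

```python
import numpy as np

def entropy_weights(X):
    """Objective attribute weights via Shannon entropy (more dispersed columns get more weight)."""
    P = X / X.sum(axis=0)                      # column-normalised proportions
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P, where=P > 0, out=np.zeros_like(P))).sum(axis=0)
    d = 1.0 - E                                # degree of diversification per attribute
    return d / d.sum()

def topsis(X, w, benefit):
    """Rank alternatives (rows) on attributes (cols); benefit[j] is True if larger is better."""
    R = X / np.sqrt((X ** 2).sum(axis=0))      # vector-normalised decision matrix
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)             # closeness coefficient, higher is better

# Hypothetical 5 check structures x 3 attributes (average survey scores)
X = np.array([[7.0, 6.5, 8.0],
              [5.5, 7.0, 6.0],
              [8.0, 8.5, 7.5],
              [4.0, 5.0, 5.5],
              [6.0, 6.0, 7.0]])
w = entropy_weights(X)
cc = topsis(X, w, benefit=np.array([True, True, True]))
print(cc.argsort()[::-1])                      # alternatives ranked best-to-worst
```

With these invented scores, the third structure (row 2) dominates on two of three attributes and ranks first.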
Pinkerton, Steven D.; Pearson, Cynthia R.; Eachus, Susan R.; Berg, Karina M.; Grimes, Richard M.
2008-01-01
Summary Maximizing our economic investment in HIV prevention requires balancing the costs of candidate interventions against their effects and selecting the most cost-effective interventions for implementation. However, many HIV prevention intervention trials do not collect cost information, and those that do use a variety of cost data collection methods and analysis techniques. Standardized cost data collection procedures, instrumentation, and analysis techniques are needed to facilitate the task of assessing intervention costs and to ensure comparability across intervention trials. This article describes the basic elements of a standardized cost data collection and analysis protocol and outlines a computer-based approach to implementing this protocol. Ultimately, the development of such a protocol would require contributions and “buy-in” from a diverse range of stakeholders, including HIV prevention researchers, cost-effectiveness analysts, community collaborators, public health decision makers, and funding agencies. PMID:18301128
Empowerment in Latina Immigrant Women Recovering From Interpersonal Violence: A Concept Analysis.
Page, Robin L; Chilton, Jenifer; Montalvo-Liendo, Nora; Matthews, Debra; Nava, Angeles
2017-04-01
Latina immigrant women are vulnerable and may experience higher levels of interpersonal or intimate partner violence (IPV) due to their immigrant status and cultural emphasis on familism. The concept of empowerment within the cultural context of Latina immigrant women experiencing IPV was analyzed using a modified version of Walker and Avant's concept analysis technique. The technique considers usage and definitions in the literature, antecedents, attributes, empirical referents, and the inclusion of a model and contrary case. This analysis encompasses a comparative approach and includes a discussion of how the definition of empowerment compares across the nursing literature. Defining attributes include reciprocal relationships, autonomy, and accountability. Antecedents comprise willingness to learn and motivation to create change. Consequences encompass self-esteem, self-efficacy, and competence for making life decisions. Empowerment has the potential to improve total well-being, having a positive and profound impact on the lives of women experiencing IPV.
NASA Technical Reports Server (NTRS)
Masud, Abu S. M.
1991-01-01
Fellowship activities were directed toward identifying opportunities to apply Multiple Criteria Decision Making (MCDM) techniques in the Space Exploration Initiative (SEI) domain. I identified several application possibilities and proposed demonstration applications in three areas: evaluation and ranking of SEI architectures, space mission planning and selection, and space system design. Here, only the first problem is discussed. The most meaningful result of the analysis is the wide separation between the two top-ranked architectures, indicating a significant preference difference between them. It must also be noted that the final ranking reflects, to some extent, the biases of the evaluators and their understanding of the architectures.
NASA Astrophysics Data System (ADS)
Forney, W.; Raunikar, R. P.; Bernknopf, R.; Mishra, S.
2012-12-01
A production possibilities frontier (PPF) is a graph comparing the production interdependencies for two commodities. In this case, the commodities are defined as the ecosystem services of agricultural production and groundwater quality. This presentation focuses on the refinement of techniques used in an application to estimate the value of remote sensing information. Value of information focuses on the use of uncertain and varying qualities of information within a specific decision-making context for a certain application, which in this case included land use, biogeochemical, hydrogeologic, economic and geospatial data and models. The refined techniques include deriving alternate patterns and processes of ecosystem functions, new estimates of ecosystem service values to construct a PPF, and the extension of this work into decision support systems. We have coupled earth observations of agricultural production with groundwater quality measurements to estimate the value of remote sensing information in northeastern Iowa to be $857M ± $198M (at the 2010 price level) per year. We will present an improved method for modeling crop rotation patterns that includes multiple years of rotation, a reduction in the assumptions associated with optimal land use allocations, and prioritized improvement of the resolution of input data (for example, soil resources and topography). The prioritization focuses on watersheds that were identified at a coarse scale of analysis to have higher intensities of agricultural production and lower probabilities of groundwater survivability (in other words, remaining below a regulatory threshold for nitrate pollution) over time, and that thus require finer-scaled modeling and analysis. These improved techniques, together with the simulation of certain scale-dependent policy and management actions that trade off optimizing crop value against maintaining potable groundwater, provide new estimates for the empirical values of the PPF.
The calculation of a PPF in this way provides a decision maker with a tool to consider the ramifications of different policies, management practices and regional objectives.
Organizational Decision Making
1975-08-01
the lack of formal techniques typically used by large organizations, digress on the advantages of formal over informal... optimization; for example one might do a number of optimization calculations, each time using a different measure of effectiveness as the optimized ...final decision. The next level of computer application involves the use of computerized optimization techniques. Optimization
The Computer in Educational Decision Making. An Introduction and Guide for School Administrators.
ERIC Educational Resources Information Center
Sanders, Susan; And Others
This text provides educational administrators with a working knowledge of the problem-solving techniques of PERT (planning, evaluation, and review technique), Linear Programming, Queueing Theory, and Simulation. The text includes an introduction to decision-making and operations research, four chapters consisting of indepth explanations of each…
Burton, R; Mauk, D
1993-03-01
By integrating customer satisfaction planning and industrial engineering techniques when examining internal costs and efficiencies, materiel managers can better determine which concepts will best meet their customers' needs. Defining your customer(s), applying industrial engineering techniques, completing work sampling studies, itemizing the recommendations and benefits of each alternative, performing feasibility and cost-analysis matrices, and utilizing resources through productivity monitoring will put you on the right path toward selecting concepts to use. This article reviews these procedures as they applied to one hospital's decision-making process in determining whether to incorporate a stockless inventory program. Through an analysis of customer demand, the hospital realized that stockless was the way to go, but not by outsourcing the function: the hospital incorporated an in-house stockless inventory program.
Chung, Eun-Sung; Kim, Yeonjoo
2014-12-15
This study proposed a robust prioritization framework to identify priorities among treated wastewater (TWW) use locations while accounting for the various uncertainties inherent in climate change scenarios and the decision-making process. First, a fuzzy concept was applied because forecast precipitation and the associated hydrological impact analyses displayed significant variance across climate change scenarios and long periods (e.g., 2010-2099). Second, several multi-criteria decision making (MCDM) techniques, including the weighted sum method (WSM), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), and fuzzy TOPSIS, were introduced for robust prioritization, because different MCDM methods embody different decision philosophies. Third, decision-making methods under complete uncertainty (DMCU), including maximin, maximax, minimax regret, Hurwicz, and equal likelihood, were used to derive robust final rankings. The framework was then applied to a Korean urban watershed. Clearly different rankings appeared between fuzzy TOPSIS and the non-fuzzy MCDMs (e.g., WSM and TOPSIS) because only fuzzy TOPSIS considered the inter-annual variability in effectiveness. Robust prioritizations were then derived from 18 rankings spanning nine decadal periods of RCP4.5 and RCP8.5. For still more robust rankings, five DMCU approaches were applied to the rankings from fuzzy TOPSIS. This framework, combining fuzzy TOPSIS with DMCU approaches, should prove less controversial among stakeholders under the complete uncertainty of changing environments. Copyright © 2014 Elsevier Ltd. All rights reserved.
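The five DMCU criteria named above reduce to simple row operations on a score matrix. A minimal Python sketch with hypothetical effectiveness scores (the study's actual watershed data are not reproduced here):

```python
import numpy as np

# Hypothetical effectiveness scores: 4 candidate TWW-use locations (rows)
# under 6 climate-scenario/period combinations (columns).
S = np.array([[0.62, 0.55, 0.70, 0.48, 0.66, 0.58],
              [0.50, 0.72, 0.44, 0.69, 0.53, 0.61],
              [0.57, 0.59, 0.60, 0.56, 0.58, 0.57],
              [0.40, 0.80, 0.35, 0.75, 0.45, 0.70]])

maximin = S.min(axis=1).argmax()                 # best worst-case (pessimistic)
maximax = S.max(axis=1).argmax()                 # best best-case (optimistic)
regret = (S.max(axis=0) - S).max(axis=1)         # worst regret per alternative
minimax_regret = regret.argmin()
hurwicz = (0.5 * S.max(axis=1) + 0.5 * S.min(axis=1)).argmax()  # optimism alpha = 0.5
equal_likelihood = S.mean(axis=1).argmax()       # Laplace criterion

print(maximin, maximax, minimax_regret, hurwicz, equal_likelihood)
```

Note how the criteria disagree: the steady alternative (row 2) wins under maximin and minimax regret, while the high-variance one (row 3) wins under maximax, which is exactly why the study reports all five.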
A multi-criteria model for the comparison of building envelope energy retrofits
NASA Astrophysics Data System (ADS)
Donnarumma, Giuseppe; Fiore, Pierfrancesco
2017-02-01
In light of the current EU guidelines in the energy field, improving building envelope performance cannot be separated from satisfying environmental sustainability requirements, reducing the costs associated with the life cycle of the building, and ensuring economic and financial feasibility. Therefore, identifying the "optimal" energy retrofit solutions requires the simultaneous assessment of several factors and thus becomes a problem of choice between several possible alternatives. To facilitate the work of decision-makers, public or private, adequate decision support tools are of great importance. Starting from this need, a model based on the "AHP" multi-criteria analysis technique is proposed, along with the definition of three synthetic indices associated with the three requirements of "Energy Performance", "Sustainability Performance" and "Cost". From the weighted aggregation of the three indices, a global preference index is obtained that allows decision-makers to "quantify" the satisfaction level of the i-th alternative from the point of view of a particular group of decision-makers. The model is then applied, by way of example, to the case study of the energy redevelopment of a former factory, assuming its functional conversion. Twenty possible alternative interventions on the opaque vertical closures, resulting from the combination of three families of thermal insulators (synthetic, natural and mineral) with four energy retrofitting techniques, are compared, and the results obtained are critically discussed from the point of view of three different groups of decision-makers.
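The core AHP computation such a model relies on, deriving priority weights from a pairwise comparison matrix and checking judgment consistency, can be sketched in a few lines; the comparison values below are illustrative, not the paper's:

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for three criteria
# (e.g., Energy Performance vs. Sustainability Performance vs. Cost) on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3.0, 1.0, 3.0],
              [1 / 5.0, 1 / 3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = eigvecs[:, k].real
w = w / w.sum()                       # principal-eigenvector priority weights

lam_max = eigvals.real.max()
n = A.shape[0]
CI = (lam_max - n) / (n - 1)          # consistency index
CR = CI / 0.58                        # random index RI = 0.58 for n = 3
print(w, CR)                          # CR < 0.10 => judgments acceptably consistent
```

For this matrix the weights come out roughly (0.64, 0.26, 0.10) with CR ≈ 0.03, i.e. an acceptably consistent set of judgments.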
Improved CDMA Performance Using Parallel Interference Cancellation
NASA Technical Reports Server (NTRS)
Simon, Marvin; Divsalar, Dariush
1995-01-01
This report considers a general parallel interference cancellation scheme that significantly reduces the degradation effect of user interference but with a lesser implementation complexity than the maximum-likelihood technique. The scheme operates on the fact that parallel processing simultaneously removes from each user the interference produced by the remaining users accessing the channel in an amount proportional to their reliability. The parallel processing can be done in multiple stages. The proposed scheme uses tentative decision devices with different optimum thresholds at the multiple stages to produce the most reliably received data for generation and cancellation of user interference. The 1-stage interference cancellation is analyzed for three types of tentative decision devices, namely, hard, null zone, and soft decision, and two types of user power distribution, namely, equal and unequal powers. Simulation results are given for a multitude of different situations, in particular, those cases for which the analysis is too complex.
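A one-stage parallel canceller with hard tentative decisions, the simplest of the three devices analyzed above, can be sketched as follows for a toy synchronous CDMA link (codes, powers, and noise level are illustrative, and no threshold optimization is attempted):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synchronous CDMA: K users, random unit-energy spreading codes of length N, equal powers.
K, N = 3, 16
codes = rng.choice([-1.0, 1.0], size=(K, N)) / np.sqrt(N)
bits = rng.choice([-1, 1], size=K)

r = bits @ codes + rng.normal(0.0, 0.3, size=N)   # received chip vector

y = codes @ r                  # conventional matched-filter statistics
tentative = np.sign(y)         # hard tentative decisions (stage 0)

# Stage 1: each user subtracts the interference reconstructed from the
# other users' tentative decisions, then re-detects.
final = np.empty(K)
for k in range(K):
    others = [j for j in range(K) if j != k]
    cleaned = r - tentative[others] @ codes[others]
    final[k] = np.sign(codes[k] @ cleaned)

print(bits, final.astype(int))
```

The null-zone and soft-decision variants in the report replace `np.sign` in the tentative stage with a three-level or clipped-linear device, trading cancellation aggressiveness against the risk of subtracting wrongly reconstructed interference.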
The speed-accuracy tradeoff: history, physiology, methodology, and behavior
Heitz, Richard P.
2014-01-01
There are few behavioral effects as ubiquitous as the speed-accuracy tradeoff (SAT). From insects to rodents to primates, the tendency for decision speed to covary with decision accuracy seems an inescapable property of choice behavior. Recently, the SAT has received renewed interest, as neuroscience approaches begin to uncover its neural underpinnings and computational models are compelled to incorporate it as a necessary benchmark. The present work provides a comprehensive overview of SAT. First, I trace its history as a tractable behavioral phenomenon and the role it has played in shaping mathematical descriptions of the decision process. Second, I present a “users guide” of SAT methodology, including a critical review of common experimental manipulations and analysis techniques and a treatment of the typical behavioral patterns that emerge when SAT is manipulated directly. Finally, I review applications of this methodology in several domains. PMID:24966810
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
IT vendor selection model by using structural equation model & analytical hierarchy process
NASA Astrophysics Data System (ADS)
Maitra, Sarit; Dominic, P. D. D.
2012-11-01
Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can hamper an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process that supports better decision making. The proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The model's five-step framework was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, the what-if analysis technique will be used for model validation.
Kolasa, Katarzyna; Kalo, Zoltan; Zah, Vladimir
2016-08-01
According to some experts, there is still room for improvement with regard to the inclusion of ethical considerations in Health Technology Assessment (HTA). The pros and cons of the introduction of non-economic criteria in the HTA process in Central and Eastern Europe (CEE) are discussed. In comparison to Western Europe, financial considerations are even more important in CEE settings; however, it could also be said that attachment to equity and justice is part of CEE's heritage. Therefore, the trade-off between conflicting principles is evaluated. Expert commentary: To ensure the right balance between equity and efficiency in decision making, the current HTA framework has to be further augmented to allow all conflicting criteria to be addressed to a satisfactory degree. Following other examples, the applicability of multi-criteria decision analysis techniques to CEE settings should be further investigated.
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
A Wireless Sensor Network-Based Approach with Decision Support for Monitoring Lake Water Quality.
Huang, Xiaoci; Yi, Jianjun; Chen, Shaoli; Zhu, Xiaomin
2015-11-19
Online monitoring and water quality analysis of lakes are urgently needed. A feasible and effective approach is to use a Wireless Sensor Network (WSN). Lake water environments, like other real world environments, present many changing and unpredictable situations. To ensure flexibility in such an environment, the WSN node has to be prepared to deal with varying situations. This paper presents a WSN self-configuration approach for lake water quality monitoring. The approach is based on the integration of a semantic framework, where a reasoner can make decisions on the configuration of WSN services. We present a WSN ontology and the relevant water quality monitoring context information, which considers its suitability in a pervasive computing environment. We also propose a rule-based reasoning engine that is used to conduct decision support through reasoning techniques and context-awareness. To evaluate the approach, we conduct usability experiments and performance benchmarks.
Analysis of data mining classification by comparison of C4.5 and ID3 algorithms
NASA Astrophysics Data System (ADS)
Sudrajat, R.; Irianingsih, I.; Krisnawan, D.
2017-01-01
The rapid development of information technology has triggered its intensive use; data mining, for example, is widely used in investment. Among the many techniques that can assist in investment decisions, the method used here for classification is the decision tree. Decision trees can be built with a variety of algorithms, such as C4.5 and ID3, which can generate different models, with different accuracies, from similar data sets. With discrete data, the C4.5 and ID3 algorithms achieve accuracies of 87.16% and 99.83%, respectively, and with numerical data the C4.5 algorithm achieves 89.69%. With discrete data, C4.5 and ID3 correctly classify 520 and 598 customers, respectively, and with numerical data C4.5 correctly classifies 546 customers. From the analysis, both algorithms classify quite well, since the error rate is less than 15%.
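The split criterion underlying ID3, information gain computed from Shannon entropy, fits in a few lines of Python; C4.5 additionally normalizes the gain by split information and handles numeric attributes. The toy "investment" records below are invented for illustration:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """ID3 split criterion: entropy reduction from splitting on attribute `attr`."""
    total, n = entropy(labels), len(labels)
    gain = total
    for value in set(r[attr] for r in rows):
        subset = [y for r, y in zip(rows, labels) if r[attr] == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Hypothetical customer records: attribute dicts plus a buy/hold decision label.
rows = [{"risk": "low", "horizon": "long"},
        {"risk": "low", "horizon": "short"},
        {"risk": "high", "horizon": "long"},
        {"risk": "high", "horizon": "short"}]
labels = ["buy", "buy", "hold", "hold"]

gain_risk = information_gain(rows, labels, "risk")
gain_horizon = information_gain(rows, labels, "horizon")
print(gain_risk, gain_horizon)   # "risk" separates the classes perfectly here
```

ID3 would pick `risk` as the root split (gain 1.0 bit versus 0.0 for `horizon`), then recurse on each branch.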
Guimarães, José Maria Ximenes; Jorge, Maria Salete Bessa; Maia, Regina Claudia Furtado; de Oliveira, Lucia Conde; Morais, Ana Patrícia Pereira; Lima, Marcos Paulo de Oliveira; Assis, Marluce Maria Araújo; dos Santos, Adriano Maia
2010-07-01
The article examines how professionals working in mental health understand the movement to construct social participation in the health system of Fortaleza, Ceará State. The methodology adopted is based on a qualitative approach. The study was developed through semi-structured interviews with 17 mental health professionals of that city. The empirical data were analyzed using the technique of thematic content analysis, through which three cores of analysis were identified: social participation as a space of citizenship and policy formulation; orientation toward attention to collective needs; and decision making. The study reveals that social participation represents a possibility of amplifying the relations between civil society and the State, which makes social intervention in health policy proposals possible. The right to health is highlighted as linked to the consolidation of democracy in attending to needs and collective edification.
Li, Yongping; Huang, Guohe
2009-03-01
In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.
Satellite solar power - Will it pay off
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1977-01-01
A cost analysis is presented for front-end investments required for the development of a satellite solar power system. The methodology used makes use of risk analysis techniques to quantify the present state of knowledge relevant to the construction and operation of a satellite solar power station 20 years in the future. Results are used to evaluate the 'expected value' of a three-year research program providing additional information which will be used as a basis for a decision to either continue development of the concept at an increasing funding level or to terminate or drastically alter the program. The program is costed phase by phase, and a decision tree is constructed. The estimated probability of success for the research and studies phase is 0.540. The expected value of a program leading to the construction of 120 systems at a rate of four per year is $12.433 billion.
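The rollback computation behind such a decision tree, averaging at chance nodes and maximizing at decision nodes, can be sketched as follows; the structure and payoffs below are a simplified illustration, not the study's actual tree:

```python
def ev(node):
    """Fold a decision tree to its expected value: chance nodes take the
    probability-weighted average, decision nodes take the best branch."""
    if "value" in node:                    # terminal node: net payoff
        return node["value"]
    if "p" in node:                        # chance node
        return sum(p * ev(child) for p, child in zip(node["p"], node["children"]))
    return max(ev(child) for child in node["children"])  # decision node

# Hypothetical phased R&D decision: fund a research phase that succeeds with
# probability 0.54; terminal payoffs (in $B) are net of the research cost.
tree = {"children": [                      # decision: fund research vs. terminate
    {"value": 0.0},                        # terminate now
    {"p": [0.54, 0.46],                    # chance: research succeeds / fails
     "children": [
         {"children": [{"value": 24.0},    # decision after success: continue...
                       {"value": -1.0}]},  # ...or stop, eating the research cost
         {"value": -1.0},                  # failure: research cost is sunk
     ]},
]}
print(ev(tree))                            # 0.54 * 24 + 0.46 * (-1) = 12.5
```

Rolling back gives 12.5, so funding the research phase beats terminating; the same max-vs-average logic scales to the multi-phase tree the report constructs.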
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...
2016-02-01
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
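The Monte Carlo step of such an uncertainty analysis, propagating input distributions through an outcome model and summarizing the spread, can be sketched as follows; the toy model and distributions are hypothetical, not the stroke model's:

```python
import random

random.seed(1)

def qaly_gain(incidence, control_effect, horizon):
    """Toy outcome model: QALYs gained per capita by a hypertension-control
    intervention (invented for illustration)."""
    return incidence * control_effect * horizon

# Uncertain inputs drawn from assumed distributions, one tuple per calibrated baseline
draws = []
for _ in range(10_000):
    incidence = random.gauss(0.01, 0.002)    # stroke incidence rate per year
    effect = random.uniform(0.2, 0.4)        # relative risk reduction
    draws.append(qaly_gain(incidence, effect, horizon=20))

draws.sort()
mean = sum(draws) / len(draws)
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"mean={mean:.4f}, 95% interval=({lo:.4f}, {hi:.4f})")
```

Running each candidate intervention through the same loop and comparing the resulting intervals is what lets a conclusion like "hypertension control yields the largest gain" be called robust to parameter uncertainty.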
Giordano, R; Passarella, G; Uricchio, V F; Vurro, M
2007-07-01
The importance of shared decision processes in water management derives from the awareness of the inadequacy of traditional--i.e. engineering--approaches in dealing with complex and ill-structured problems. It is becoming increasingly obvious that traditional problem solving and decision support techniques, based on optimisation and factual knowledge, have to be combined with stakeholder based policy design and implementation. The aim of our research is the definition of an integrated decision support system for consensus achievement (IDSS-C) able to support a participative decision-making process in all its phases: problem definition and structuring, identification of the possible alternatives, formulation of participants' judgments, and consensus achievement. Furthermore, the IDSS-C aims at structuring, i.e. systematising the knowledge which has emerged during the participative process in order to make it comprehensible for the decision-makers and functional for the decision process. Problem structuring methods (PSM) and multi-group evaluation methods (MEM) have been integrated in the IDSS-C. PSM are used to support the stakeholders in providing their perspective of the problem and to elicit their interests and preferences, while MEM are used to define not only the degree of consensus for each alternative, highlighting those where the agreement is high, but also the consensus label for each alternative and the behaviour of individuals during the participative decision-making. The IDSS-C is applied experimentally to a decision process regarding the use of treated wastewater for agricultural irrigation in the Apulia Region (southern Italy).
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
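A typical example of the Bayesian data-assessment methods such a guideline covers is the conjugate Beta-Binomial update for a failure-on-demand probability; the prior and data below are illustrative, not taken from the document:

```python
# Conjugate Beta-Binomial update: with a Beta(alpha, beta) prior on the failure
# probability and a binomial likelihood, the posterior is again a Beta.
def beta_update(alpha, beta, failures, trials):
    """Posterior Beta parameters after observing `failures` in `trials` demands."""
    return alpha + failures, beta + (trials - failures)

# Jeffreys prior Beta(0.5, 0.5); observe 2 failures in 100 demands (hypothetical data).
a, b = beta_update(0.5, 0.5, failures=2, trials=100)
posterior_mean = a / (a + b)
print(a, b, round(posterior_mean, 4))        # posterior mean ~= 0.0248
```

The posterior Beta(2.5, 98.5) summarizes both the prior and the data, and its percentiles feed directly into the risk models the handbook describes.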
Vexler, Albert; Yu, Jihnhee
2018-04-13
A common statistical doctrine, supported by many introductory courses and textbooks, is that t-test type procedures based on normally distributed data points provide a standard in decision-making. To motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-value-based method that takes into account the stochastic nature of p-values. Following the modern statistical literature, we address the expected p-value (EPV) as a measure of the performance of decision-making rules. We extend the EPV concept to the ROC curve technique, which provides expressive evaluations and visualizations of a wide spectrum of properties of testing mechanisms. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-value-based applications.
Cognitive Systems Modeling and Analysis of Command and Control Systems
NASA Technical Reports Server (NTRS)
Norlander, Arne
2012-01-01
Military operations, counter-terrorism operations and emergency response often oblige operators and commanders to operate within distributed organizations and systems for safe and effective mission accomplishment. Tactical commanders and operators frequently encounter violent threats and critical demands on cognitive capacity and reaction time. In the future they will make decisions in situations where operational and system characteristics are highly dynamic and non-linear, i.e. minor events, decisions or actions may have serious and irreversible consequences for the entire mission. Commanders and other decision makers must manage true real time properties at all levels; individual operators, stand-alone technical systems, higher-order integrated human-machine systems and joint operations forces alike. Coping with these conditions in performance assessment, system development and operational testing is a challenge for both practitioners and researchers. This paper reports on research whose results led to a breakthrough: an integrated approach to information-centered systems analysis to support future command and control systems research and development. This approach integrates several areas of research into a coherent framework, Action Control Theory (ACT). It comprises measurement techniques and methodological advances that facilitate a more accurate and deeper understanding of the operational environment, its agents, actors and effectors, generating new and updated models. This in turn generates theoretical advances. Some good examples of successful approaches are found in the research areas of cognitive systems engineering, systems theory, and psychophysiology, and in the fields of dynamic, distributed decision making and naturalistic decision making.
Dynamic fMRI of a decision-making task
NASA Astrophysics Data System (ADS)
Singh, Manbir; Sungkarat, Witaya
2008-03-01
A novel fMRI technique has been developed to capture the dynamics of the evolution of brain activity during complex tasks such as those designed to evaluate the neural basis of decision-making under different situations. A task called the Iowa Gambling Task was used as an example. Six normal human volunteers were studied. The task was presented inside a 3T MRI scanner, and a dynamic fMRI study of the approximately 2 s period between the beginning and end of the decision-making period was conducted by employing a series of reference functions, separated by 200 ms, designed to capture activation at different time-points within this period. As decision-making culminates with a button-press, the timing of the button press was chosen as the reference (t=0) and the corresponding reference functions were shifted backward in steps of 200 ms from this point up to the time when motor activity from the previous button press became predominant. SPM was used for realignment, high-pass filtering (cutoff 200 s), normalization to the Montreal Neurological Institute (MNI) template using a 12-parameter affine/non-linear transformation, 8 mm Gaussian smoothing, and event-related General Linear Model analysis for each of the shifted reference functions. The t-score of each activated voxel was then examined to find its peaking time. A random effect analysis (p<0.05) showed prefrontal, parietal and bi-lateral hippocampal activation peaking at different times during the decision-making period in the n=6 group study.
Identifying environmental features for land management decisions
NASA Technical Reports Server (NTRS)
1981-01-01
The benefits of changes in management organization and facilities for the Center for Remote Sensing and Cartography in Utah are reported as well as interactions with and outreach to state and local agencies. Completed projects are described which studied (1) Uinta Basin wetland/land use; (2) Davis County foothill development; (3) Farmington Bay shoreline fluctuation; (4) irrigation detection; and (5) satellite investigation of snow cover/mule deer relationships. Techniques developed for composite computer mapping, contrast enhancement, U-2 CIR/LANDSAT digital interface, factor analysis, and multivariate statistical analysis are described.
Syntactic/semantic techniques for feature description and character recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez, R.C.
1983-01-01
The Pattern Analysis Branch, Mapping, Charting and Geodesy (MC/G) Division, of the Naval Ocean Research and Development Activity (NORDA) has been involved over the past several years in the development of algorithms and techniques for computer recognition of free-form handprinted symbols as they appear on the Defense Mapping Agency (DMA) maps and charts. NORDA has made significant contributions to the automation of MC/G through advancing the state of the art in such information extraction techniques. In particular, new concepts in character (symbol) skeletonization, rugged feature measurements, and expert system-oriented decision logic have allowed the development of a very high performance Handprinted Symbol Recognition (HSR) system for identifying depth soundings from naval smooth sheets (accuracies greater than 99.5%). The study reported in this technical note is part of NORDA's continuing research and development in pattern and shape analysis as it applies to Navy and DMA ocean/environment problems. The issue addressed in this technical note deals with emerging areas of syntactic and semantic techniques in pattern recognition as they might apply to the free-form symbol problem.
Decision Tree Approach for Soil Liquefaction Assessment
Gandomi, Amir H.; Fridline, Mark M.; Roke, David A.
2013-01-01
In the current study, the performances of some decision tree (DT) techniques are evaluated for postearthquake soil liquefaction assessment. A database containing 620 records of seismic parameters and soil properties is used in this study. Three decision tree techniques are used here in two different ways, considering statistical and engineering points of view, to develop decision rules. The DT results are compared to the logistic regression (LR) model. The results of this study indicate that the DTs not only successfully predict liquefaction but they can also outperform the LR model. The best DT models are interpreted and evaluated based on an engineering point of view. PMID:24489498
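The split-finding step at the heart of any decision tree technique can be shown in miniature. The following is an illustrative sketch on synthetic data (not the paper's 620-record database): a one-level tree, or stump, that picks the threshold on a single feature minimizing weighted Gini impurity.

```python
# Minimal decision-tree building block: choose the binary threshold split on
# one feature (here a synthetic cyclic stress ratio) that minimizes weighted
# Gini impurity of the two resulting groups.

def gini(labels):
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(xs, ys):
    """Return (threshold, weighted_impurity) of the best split x <= t."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if w < best[1]:
            best = (t, w)
    return best

# Synthetic stress values and liquefaction labels (1 = liquefied).
csr = [0.05, 0.08, 0.12, 0.20, 0.25, 0.30]
liq = [0, 0, 0, 1, 1, 1]

threshold, impurity = best_split(csr, liq)

def predict(x):
    return 1 if x > threshold else 0
```

A full tree applies this split search recursively to each child node; the engineering appeal noted in the abstract is that the resulting threshold rules read directly as design guidance.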
NASA Astrophysics Data System (ADS)
Mayer, J. M.; Stead, D.
2017-04-01
With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram, which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
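The contrast between spatially constant random variables and a variogram-controlled field can be demonstrated compactly. This sketch substitutes direct Cholesky sampling for sequential Gaussian simulation (a simpler route to the same kind of correlated field), with an invented exponential covariance model on a 1-D transect.

```python
# Illustrative sketch: sampling a spatially correlated Gaussian field by
# factoring an exponential covariance matrix, in place of full sequential
# Gaussian simulation. The variogram range controls the heterogeneity scale.

import math
import random

def exp_cov(h, sill=1.0, corr_range=10.0):
    """Exponential covariance model: C(h) = sill * exp(-h / range)."""
    return sill * math.exp(-h / corr_range)

def cholesky(a):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return l

# Points along a transect; covariance depends only on separation distance.
xs = [float(i) for i in range(20)]
cov = [[exp_cov(abs(xi - xj)) for xj in xs] for xi in xs]
L = cholesky(cov)

rng = random.Random(42)
z = [rng.gauss(0.0, 1.0) for _ in xs]
# Correlated field: multiply independent normals by the Cholesky factor.
field = [sum(L[i][k] * z[k] for k in range(len(xs))) for i in range(len(xs))]
```

Feeding such fields (e.g. of GSI or UCS) into a stability model is what distinguishes the geostatistical Monte Carlo workflow from one that assigns each domain a single random value.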
A Holistic Approach to Networked Information Systems Design and Analysis
2016-04-15
attain quite substantial savings. 11. Optimal algorithms for energy harvesting in wireless networks. We use a Markov-decision-process (MDP) based approach to obtain optimal policies for transmissions. The key advantage of our approach is that it holistically considers information and energy in a... Coding technique to minimize delays and the number of transmissions in wireless systems. As we approach an era of ubiquitous computing with information
Productivity improvement and quality enhancement at NASA
NASA Technical Reports Server (NTRS)
Braunstein, D. R.
1985-01-01
NASA's Productivity Improvement and Quality Enhancement (PIQE) effort has as its objectives the encouragement of greater employee participation in management decision-making and the identification of impediments as well as opportunities for high productivity. Attempts are also made to try out novel management practices, and to evolve productivity trend analysis techniques. Every effort is made to note, reward, and diffuse successfully instituted PIQE approaches throughout the NASA-contractor organization.
Echocardiography in mitral stenosis
Omran, A.S.; Arifi, Ahmed A.; Mohamed, A.A.
2010-01-01
Echocardiography plays a major role in the diagnosis of mitral stenosis (MS), in determining its etiology and severity, in the analysis of valve anatomy, and in decision-making for intervention. The technique also has a crucial role in assessing the consequences of MS and in the follow-up of patients after medical or surgical intervention. In this article we review the role of conventional echocardiography in the assessment of mitral stenosis and the future direction of this modality using 3D echocardiography. PMID:23960637
Systematic procedure for designing processes with multiple environmental objectives.
Kim, Ki-Joo; Smith, Raymond L
2005-04-01
Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
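A genetic algorithm such as the one in the abstract ultimately returns a set of non-dominated (Pareto optimal) designs. The dominance test itself is simple, and the hypothetical sketch below shows it for minimized objectives, with profit folded in as a negated objective; the numbers are invented.

```python
# Hedged sketch: filtering design alternatives down to the Pareto front when
# all objectives are to be minimized (environmental impact, negative profit).

def dominates(a, b):
    """True if design a is at least as good as b everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o != d)]

# (environmental impact, -profit) pairs for four hypothetical flowsheets.
designs = [(3.0, -5.0), (2.0, -4.0), (4.0, -4.5), (2.5, -6.0)]
front = pareto_front(designs)
```

Only the trade-off designs survive: each member of `front` is better than every eliminated design on at least one objective, which is the solution set the decision-maker then screens against his or her preferences.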
Selection of Representative Models for Decision Analysis Under Uncertainty
NASA Astrophysics Data System (ADS)
Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.
2016-03-01
The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
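A heavily simplified version of representative-model selection can be written as a greedy search: pick scenarios one at a time so the chosen subset's empirical distribution of a key output stays close to the full ensemble's. The distance measure and data below are invented stand-ins for the paper's representativeness function.

```python
# Simplified sketch: greedily select k representative scenarios whose empirical
# CDF of one output (e.g. NPV) stays closest to the full set's, using a
# Kolmogorov-Smirnov-style distance evaluated at the population values.

def ks_distance(sample, population):
    pop = sorted(population)
    sam = sorted(sample)
    def cdf(data, x):
        return sum(1 for v in data if v <= x) / len(data)
    return max(abs(cdf(sam, x) - cdf(pop, x)) for x in pop)

def greedy_representatives(scenarios, k):
    chosen = []
    remaining = list(scenarios)
    for _ in range(k):
        best = min(remaining, key=lambda s: ks_distance(chosen + [s], scenarios))
        chosen.append(best)
        remaining.remove(best)
    return chosen

npv = [10, 12, 15, 20, 22, 25, 30, 35, 40, 50]  # hypothetical scenario outputs
reps = greedy_representatives(npv, k=3)
```

The paper's full method optimizes over cross-plots, risk curves, and attribute-level distributions jointly; this one-variable greedy version only conveys the shape of the problem.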
Teichgräber, Ulf K; de Bucourt, Maximilian
2012-01-01
OBJECTIVES: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current state VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Out of the NVA processes, 5 were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast-driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System, and greatly assists in successfully implementing a Lean system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Tuberculosis diagnosis support analysis for precarious health information systems.
Orjuela-Cañón, Alvaro David; Camargo Mendoza, Jorge Eliécer; Awad García, Carlos Enrique; Vergara Vela, Erika Paola
2018-04-01
Pulmonary tuberculosis is a world emergency for the World Health Organization. Techniques and new diagnosis tools are important to battle this bacterial infection. There have been many advances in all those fields, but in developing countries such as Colombia, where the resources and infrastructure are limited, new fast and less expensive strategies are increasingly needed. Artificial neural networks are computational intelligence techniques that can be used in this kind of problem and offer additional support in the tuberculosis diagnosis process, providing a tool to medical staff to make decisions about the management of subjects under suspicion of tuberculosis. A database extracted from 105 subjects with precarious information on people under suspicion of pulmonary tuberculosis was used in this study. Data on sex, age, diabetes, homelessness, AIDS status, and a variable encoding clinical knowledge from the medical personnel were used. Models based on artificial neural networks were used, exploring supervised learning to detect the disease. Unsupervised learning was used to create three risk groups based on available information. Obtained results are comparable with traditional techniques for detection of tuberculosis, showing advantages such as speed and low implementation costs. Sensitivity of 97% and specificity of 71% were achieved. The techniques used yielded valuable information that can support physicians who treat the disease in their decision-making processes, especially under limited infrastructure and data. Copyright © 2018 Elsevier B.V. All rights reserved.
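The supervised-learning side of such a study can be miniaturized to a single logistic neuron trained by gradient descent on tabular clinical features. Everything below is synthetic and illustrative; it reproduces neither the study's data nor its actual network architecture.

```python
# Minimal sketch of supervised learning on tabular clinical features: one
# logistic neuron (the simplest "neural network") trained by stochastic
# gradient descent on invented records.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the pre-activation
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5 else 0

# Hypothetical features: [age/100, diabetes, homeless, HIV-positive].
X = [[0.30, 0, 0, 0], [0.45, 1, 0, 0], [0.60, 1, 1, 1],
     [0.25, 0, 0, 1], [0.70, 1, 1, 0], [0.35, 0, 0, 0]]
y = [0, 0, 1, 1, 1, 0]

w, b = train(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

A real study would use held-out data and a larger network, but the training loop, feature encoding, and thresholded output are the same in kind.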
ERIC Educational Resources Information Center
Neal, Joan; Echternacht, Lonnie
1995-01-01
Experimental groups used four decision-making techniques--reverse brainstorming (RS), dialectical inquiry (DI), devil's advocacy (DA), and consensus--in evaluating writing assignments. The control group produced a better quality document. Student reactions to negative features of RS, DI, and DA were not significant. (SK)
Mudali, D; Teune, L K; Renken, R J; Leenders, K L; Roerdink, J B T M
2015-01-01
Medical imaging techniques like fluorodeoxyglucose positron emission tomography (FDG-PET) have been used to aid in the differential diagnosis of neurodegenerative brain diseases. In this study, the objective is to classify FDG-PET brain scans of subjects with Parkinsonian syndromes (Parkinson's disease, multiple system atrophy, and progressive supranuclear palsy) compared to healthy controls. The scaled subprofile model/principal component analysis (SSM/PCA) method was applied to FDG-PET brain image data to obtain covariance patterns and corresponding subject scores. The latter were used as features for supervised classification by the C4.5 decision tree method. Leave-one-out cross validation was applied to determine classifier performance. We carried out a comparison with other types of classifiers. The big advantage of decision tree classification is that the results are easy to understand by humans. A visual representation of decision trees strongly supports the interpretation process, which is very important in the context of medical diagnosis. Further improvements are suggested based on enlarging the number of the training data, enhancing the decision tree method by bagging, and adding additional features based on (f)MRI data.
Canis, Laure; Linkov, Igor; Seager, Thomas P
2010-11-15
The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
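The "completely unknown weights" scenario in the abstract is typically handled by sampling weight vectors uniformly on the simplex and recording how often each alternative wins the linear-weighted sum. The sketch below uses invented scores for the four synthesis routes; it illustrates the mechanism, not the paper's data or its outranking normalization.

```python
# Hedged sketch of probabilistic rank ordering under unknown criteria weights:
# sample weights uniformly on the simplex, score each alternative by a
# linear-weighted sum, and tally win frequencies.

import random

# Invented normalized scores (higher is better) on 3 performance criteria.
scores = {
    "arc":   [0.9, 0.2, 0.4],
    "HiPco": [0.5, 0.8, 0.6],
    "CVD":   [0.4, 0.9, 0.7],
    "laser": [0.3, 0.3, 0.3],
}

def sample_simplex(rng, k):
    """Uniform sample on the k-simplex via normalized exponentials."""
    e = [rng.expovariate(1.0) for _ in range(k)]
    s = sum(e)
    return [x / s for x in e]

def win_probabilities(scores, sims=5000, seed=7):
    rng = random.Random(seed)
    wins = {name: 0 for name in scores}
    for _ in range(sims):
        w = sample_simplex(rng, 3)
        best = max(scores,
                   key=lambda n: sum(wi * si for wi, si in zip(w, scores[n])))
        wins[best] += 1
    return {name: c / sims for name, c in wins.items()}

probs = win_probabilities(scores)
```

As in the paper's finding, no single route need dominate: here the dominated "laser" row never wins, while the others split the weight space between them.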
Higgs, Gary
2006-04-01
Despite recent U.K. Government commitments to encourage public participation in environmental decision making, those exercises conducted to date have been largely confined to 'traditional' modes of participation such as the dissemination of information and in encouraging feedback on proposals through, for example, questionnaires or surveys. It is the premise of this paper that participative approaches that use IT-based methods, based on combined geographical information systems (GIS) and multi-criteria evaluation techniques that could involve the public in the decision-making process, have the potential to build consensus and reduce disputes and conflicts such as those arising from the siting of different types of waste facilities. The potential of these techniques is documented through a review of the existing literature in order to highlight the opportunities and challenges facing decision makers in increasing the involvement of the public at different stages of the waste facility management process. It is concluded that there are important lessons to be learned by researchers, consultants, managers and decision makers if barriers hindering the wider use of such techniques are to be overcome.
Developing and Teaching Ethical Decision Making Skills.
ERIC Educational Resources Information Center
Robinson, John
1991-01-01
Student leaders and campus activities professionals can use a variety of techniques to help college students develop skill in ethical decision making, including teaching about the decision-making process, guiding students through decisions with a series of questions, playing ethics games, exploring assumptions, and best of all, role modeling. (MSE)
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
Application of Fuzzy TOPSIS for evaluating machining techniques using sustainability metrics
NASA Astrophysics Data System (ADS)
Digalwar, Abhijeet K.
2018-04-01
Sustainable processes and techniques are getting increased attention over the last few decades due to rising concerns over the environment, improved focus on productivity and stringency in environmental as well as occupational health and safety norms. The present work analyzes the research on sustainable machining techniques and identifies techniques and parameters on which the sustainability of a process is evaluated. Based on the analysis, these parameters are then adopted as criteria to evaluate different sustainable machining techniques such as Cryogenic Machining, Dry Machining, Minimum Quantity Lubrication (MQL) and High Pressure Jet Assisted Machining (HPJAM) using a fuzzy TOPSIS framework. In order to facilitate easy arithmetic, the linguistic variables represented by fuzzy numbers are transformed into crisp numbers based on graded mean representation. Cryogenic machining was found to be the best alternative sustainable technique as per the fuzzy TOPSIS framework adopted. The paper provides a method to deal with multi-criteria decision-making problems in a complex and linguistic environment.
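The two-stage flow described above, defuzzify triangular fuzzy ratings by the graded mean and then rank by crisp TOPSIS closeness, can be sketched directly. The ratings, weights, and criteria below are invented placeholders, not the paper's data.

```python
# Hedged sketch of the fuzzy-TOPSIS flow: triangular fuzzy numbers (l, m, u)
# are defuzzified by the graded mean (l + 4m + u) / 6, then ranked by crisp
# TOPSIS closeness to the ideal solution (all criteria treated as benefits).

import math

def graded_mean(tfn):
    l, m, u = tfn
    return (l + 4 * m + u) / 6.0

def topsis(matrix, weights):
    """matrix[i][j]: crisp score of alternative i on benefit criterion j."""
    ncrit = len(weights)
    # Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncrit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncrit)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]
    anti = [min(col) for col in zip(*v)]
    closeness = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        closeness.append(d_neg / (d_pos + d_neg))
    return closeness

# Invented triangular fuzzy ratings of three techniques on two criteria.
fuzzy = [
    [(7, 9, 10), (5, 7, 9)],   # cryogenic
    [(3, 5, 7),  (7, 9, 10)],  # dry
    [(5, 7, 9),  (3, 5, 7)],   # MQL
]
crisp = [[graded_mean(t) for t in row] for row in fuzzy]
scores = topsis(crisp, weights=[0.6, 0.4])
```

Alternatives are ranked by descending closeness; the graded-mean step is what lets the linguistic (fuzzy) judgments flow into ordinary crisp TOPSIS arithmetic.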
Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine; Peet, Andrew
2018-05-02
Advances in magnetic resonance imaging and the introduction of clinical decision support systems have underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. The aim of this study was to design and develop a modular medical image region of interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (cohort of 48 children, with 37 malignant and 11 benign tumors). MeVisLab software and Python have been used for the development of MIROR. Regions of interest were drawn around benign and malignant body tumors on different diffusion parametric maps, and extracted information was used to discriminate the malignant tumors from benign tumors. Using MIROR, the various histogram parameters derived for each tumor case, when compared with the information in the repository, provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians' skillset by introducing newer techniques and up-to-date findings to their repertoire and make information from previous cases available to aid decision making. The modular-based format of the tool allows integration of analyses that are not readily available clinically and streamlines future development.
©Niloufar Zarinabad, Emma M Meeus, Karen Manias, Katharine Foster, Andrew Peet. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 02.05.2018.
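The histogram parameters such a ROI tool derives are straightforward to compute. The sketch below, with synthetic values and feature choices that are illustrative rather than MIROR's actual feature set, extracts a few common descriptors from diffusion values inside a ROI.

```python
# Illustrative sketch: histogram parameters a ROI-analysis tool might derive
# from diffusion (ADC) values inside a tumor region of interest.

import math
from statistics import mean, median, pstdev

def percentile(values, q):
    """Nearest-rank percentile of a list (q in 0..100)."""
    s = sorted(values)
    idx = max(0, math.ceil(q / 100 * len(s)) - 1)
    return s[idx]

def skewness(values):
    m = mean(values)
    sd = pstdev(values)
    n = len(values)
    return sum((v - m) ** 3 for v in values) / (n * sd ** 3)

def histogram_features(roi_values):
    return {
        "mean": mean(roi_values),
        "median": median(roi_values),
        "p10": percentile(roi_values, 10),
        "p90": percentile(roi_values, 90),
        "skewness": skewness(roi_values),
    }

# Synthetic ADC values (x10^-6 mm^2/s) inside a hypothetical tumor ROI.
roi = [700, 720, 750, 780, 800, 830, 870, 900, 950, 1100]
features = histogram_features(roi)
```

Comparing such feature vectors against those stored for previous cases is the repository-lookup step the abstract describes.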
DataView: a computational visualisation system for multidisciplinary design and analysis
NASA Astrophysics Data System (ADS)
Wang, Chengen
2016-01-01
Rapidly processing raw data and effectively extracting underlying information from huge volumes of multivariate data become essential to all decision-making processes in sectors like finance, government, medical care, climate analysis, industries, science, etc. Remarkably, visualisation is recognised as a fundamental technology that supports human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing outcomes of multiphysics problem-solvers widely used in engineering fields. The DataView is functionally composed of techniques for table/diagram representation, and graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapts to disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate the effectiveness of the visualisation techniques.
Design of a digital voice data compression technique for orbiter voice channels
NASA Technical Reports Server (NTRS)
1975-01-01
Candidate techniques were investigated for digital voice compression to a transmission rate of 8 kbps. Good voice quality, speaker recognition, and robustness in the presence of error bursts were considered. The technique of delayed-decision adaptive predictive coding is described and compared with conventional adaptive predictive coding. Results include a set of experimental simulations recorded on analog tape. The two FM broadcast segments produced show the delayed-decision technique to be virtually undegraded or minimally degraded at 0.001 and 0.01 Viterbi decoder bit error rates. Preliminary estimates of the hardware complexity of this technique indicate potential for implementation in space shuttle orbiters.
van de Pol, M H J; Fluit, C R M G; Lagro, J; Lagro-Janssen, A L M; Olde Rikkert, M G M
2017-01-01
To develop a model for shared decision-making with frail older patients. Online Delphi forum. We used a three-round Delphi technique to reach consensus on the structure of a model for shared decision-making with older patients. The expert panel consisted of 16 patients (round 1) and 59 professionals (rounds 1-3). In round 1, the panel was asked about important steps in the process of shared decision-making, and the draft model was introduced. Rounds 2 and 3 were used to adapt the model and test it for 'importance' and 'feasibility'. Consensus on the dynamic shared decision-making model as a whole was achieved for both importance (91% panel agreement) and feasibility (76% panel agreement). Shared decision-making with older patients is a dynamic process. It requires a continuous supportive dialogue between health care professional and patient.
Application of SOJA and InforMatrix in practice: interactive web and workshop tools.
Brenninkmeijer, Rob; Janknegt, Robert
2007-10-01
System of Objectified Judgement Analysis (SOJA) and InforMatrix are decision-matrix techniques designed to support a rational selection of drugs. Both SOJA and InforMatrix can be considered as strategic tools in the practical implementation of rational pharmacotherapy. In order to apply the matrix techniques to drug selection, strategic navigation through essential information (with the aim of reaching consensus in pharmacotherapy) is required. The consensus has to be reached in an interactive, communicative, collegial manner, within a professional environment. This environment is realised in the form of interactive applications in workshops and on the internet. Such interactive applications are illustrated and discussed in this article.
NASA Technical Reports Server (NTRS)
1976-01-01
After the 1973 Staten Island disaster, in which 40 people were killed while repairing a liquid natural gas storage tank, the New York Fire Commissioner requested NASA's help in drawing up a comprehensive plan covering the design, construction, and operation of liquid natural gas facilities. Two programs are underway. The first transfers comprehensive risk management techniques and procedures in the form of an instruction document that includes determining liquid-gas risks through engineering analysis and tests, controlling these risks by setting up redundant fail-safe techniques, and establishing criteria for decisions that eliminate or accept certain risks. The second program prepares a liquid gas safety manual (the first of its kind).
Rule-based statistical data mining agents for an e-commerce application
NASA Astrophysics Data System (ADS)
Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar
2003-03-01
Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system that fuses intelligent techniques, statistical data mining, and personal information to enhance the QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, was successfully implemented using Java servlets and an Oracle8i database server. Simulation results show that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.
Postnatal Psychosocial Assessment and Clinical Decision-Making, a Descriptive Study.
Sims, Deborah; Fowler, Cathrine
2018-05-18
The aim of this study is to describe experienced child and family health nurses' clinical decision-making during a postnatal psychosocial assessment. Maternal emotional wellbeing in the postnatal year optimises parenting and promotes infant development. Psychosocial assessment potentially enables early intervention and reduces the risk of a mental disorder occurring during this time of change. Assessment accuracy, and the interventions used are determined by the standard of nursing decision-making. A qualitative methodology was employed to explore decision-making behaviour when conducting a postnatal psychosocial assessment. This study was conducted in an Australian early parenting organisation. Twelve experienced child and family health nurses were interviewed. A detailed description of a postnatal psychosocial assessment process was obtained using a critical incident technique. Template analysis was used to determine the information domains the nurses accessed, and content analysis was used to determine the nurses' thinking strategies, to make clinical decisions from this assessment. The nurses described 24 domains of information and used 17 thinking strategies, in a variety of combinations. The four information domains most commonly used were parenting, assessment tools, women-determined issues and sleep. The seven thinking strategies most commonly used were searching for information, forming relationships between the information, recognising a pattern, drawing a conclusion, setting priorities, providing explanations for the information and judging the value of the information. The variety and complexity of the clinical decision-making involved in postnatal psychosocial assessment confirms that the nurses use information appropriately and within their scope of nursing practice. The standard of clinical decision-making determines the results of the assessment and the optimal access to care. 
Knowledge of the information domains and the decision-making strategies that experienced nurses use for psychosocial assessment potentially improves practice by providing a framework for education and mentoring. This article is protected by copyright. All rights reserved.
NASA Astrophysics Data System (ADS)
Song, Jae Yeol; Chung, Eun-Sung
2017-04-01
This study developed a multi-criteria decision analysis framework to prioritize sites and types of low impact development (LID) practices. This framework was systemized as a web-based system coupled with the Storm Water Management Model (SWMM) from the Environmental Protection Agency (EPA). Using the technique for order of preference by similarity to ideal solution (TOPSIS), which is a type of multi-criteria decision-making (MCDM) method, multiple types and sites of designated LID practices are prioritized. This system is named the Water Management Prioritization Module (WMPM) and is an improved version of the Water Management Analysis Module (WMAM) that automatically generates and simulates multiple scenarios of LID design and planning parameters for a single LID type. WMPM can simultaneously determine the priority of multiple LID types and sites. In this study, an infiltration trench and permeable pavement were considered for multiple sub-catchments in South Korea to demonstrate the WMPM procedures. The TOPSIS method was manually incorporated to select the vulnerable target sub-catchments and to prioritize the LID planning scenarios for multiple types and sites considering socio-economic, hydrologic and physical-geometric factors. In this application, the Delphi method and entropy theory were used to determine the subjective and objective weights, respectively. Comparing the ranks derived by this system, two sub-catchments, S16 and S4, out of 18 were considered to be the most suitable places for installing an infiltration trench and porous pavement to reduce the peak and total flow, respectively, considering both socio-economic factors and hydrological effectiveness. WMPM can help policy-makers to objectively develop urban water plans for sustainable development. Keywords: Low Impact Development, Multi-Criteria Decision Analysis, SWMM, TOPSIS, Water Management Prioritization Module (WMPM)
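The TOPSIS step used in the WMPM can be sketched in a few lines: vector-normalize the decision matrix, weight it, and rank alternatives by relative closeness to the ideal solution. The sub-catchment figures, weights and criteria below are invented for illustration, not taken from the study:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution."""
    cols = list(zip(*matrix))
    # Vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(x * x for x in col)) for col in cols]
    v = [[w * x / n for x, n, w in zip(row, norms, weights)] for row in matrix]
    vcols = list(zip(*v))
    # Ideal / anti-ideal points depend on benefit vs cost criteria
    ideal = [max(c) if b else min(c) for c, b in zip(vcols, benefit)]
    anti = [min(c) if b else max(c) for c, b in zip(vcols, benefit)]
    def dist(row, ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))
    return [dist(r, anti) / (dist(r, ideal) + dist(r, anti)) for r in v]

# Hypothetical sub-catchment criteria: cost (lower is better),
# peak-flow reduction and feasibility (higher is better)
scores = topsis([[3, 9, 7], [5, 6, 8], [2, 4, 5]],
                weights=[0.3, 0.5, 0.2],
                benefit=[False, True, True])
best = max(range(3), key=lambda i: scores[i])  # index of the top-ranked site
```

In the full WMPM workflow the weights would come from the Delphi method (subjective) and entropy theory (objective) rather than being fixed by hand as here.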
Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.
NASA Astrophysics Data System (ADS)
Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan
2017-09-01
Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated to these interventions in the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs with specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.
Wortley, Sally; Tong, Allison; Howard, Kirsten
2016-02-01
To identify characteristics (factors) about health technology assessment (HTA) decisions that are important to the public in determining whether public engagement should be undertaken and the reasons for these choices. Focus groups using a nominal group technique to identify and rank factors relevant to public engagement in HTA decision-making. Thematic analysis was also undertaken to describe reasons underpinning participants' choices and rankings. Members of the Australian general public. 58 people aged 19-71 years participated in 6 focus groups. 24 factors were identified by participants that were considered important in determining whether public engagement should be undertaken. These factors were individually ranked and grouped into 4 themes to interpret preferences for engagement. Members of the public were more likely to think public engagement was needed when trade-offs between benefits and costs were required to determine 'value', uncertainties in the evidence were present, and family members and/or carers were impacted. The role of public engagement was also seen as important if the existent system lacked transparency and did not provide a voice for patients, particularly for conditions less known in the community. Members of the public considered value, impact, uncertainty, equity and transparency in determining when engagement should be undertaken. This indicates that the public's preferences on when to undertake engagement relate to both the content of the HTA itself as well as the processes in place to support HTA decision-making. By understanding these preferences, decision-makers can work towards more effective, meaningful public engagement by involving the public in issues that are important to them and/or improving the processes around decision-making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
[Research Biomedical Ethics and Practical Wisdom].
Vergara, Oscar
2015-01-01
As is well known, several methodological proposals have been put forward in the field of biomedical ethics. They try to provide guidelines for making proper decisions. These methodologies are quite useful insofar as they supply reasons for action, but they are essentially insufficient. In fact, making a good decision requires a special skill that goes beyond sheer technique, a skill traditionally called practical wisdom: not in the usual, more peripheral sense of sheer caution, but in the more central one of phronesis or prudentia. Although it is not a new notion, it usually appears blurred in biomedical decision-making theory, playing the wrong role, or in a marginal or indefinite way. From this postulate, we undertake a double analysis. First, we try to show the need for a proper understanding of the core role that phronesis plays in decision making. Second, we try to recover the original meaning of Aristotelian phronesis. For reasons of space, the second question is only partially addressed in this paper.
Using CART to Identify Thresholds and Hierarchies in the Determinants of Funding Decisions.
Schilling, Chris; Mortimer, Duncan; Dalziel, Kim
2017-02-01
There is much interest in understanding decision-making processes that determine funding outcomes for health interventions. We use classification and regression trees (CART) to identify cost-effectiveness thresholds and hierarchies in the determinants of funding decisions. The hierarchical structure of CART is suited to analyzing complex conditional and nonlinear relationships. Our analysis uncovered hierarchies where interventions were grouped according to their type and objective. Cost-effectiveness thresholds varied markedly depending on which group the intervention belonged to: lifestyle-type interventions with a prevention objective had an incremental cost-effectiveness threshold of $2356, suggesting that such interventions need to be close to cost saving or dominant to be funded. For lifestyle-type interventions with a treatment objective, the threshold was much higher at $37,024. Lower down the tree, intervention attributes such as the level of patient contribution and the eligibility for government reimbursement influenced the likelihood of funding within groups of similar interventions. Comparison between our CART models and previously published results demonstrated concurrence with standard regression techniques while providing additional insights regarding the role of the funding environment and the structure of decision-maker preferences.
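The threshold-finding step at the heart of CART can be illustrated with a single-split search that minimizes Gini impurity; the ICER values and funding outcomes below are hypothetical stand-ins, not the study's data:

```python
def best_threshold(values, labels):
    """One-level CART split: find the cutpoint on a continuous feature
    that minimizes the weighted Gini impurity of the two groups."""
    def gini(group):
        if not group:
            return 0.0
        p = sum(group) / len(group)       # fraction of positive labels
        return 2 * p * (1 - p)
    pairs = sorted(zip(values, labels))
    best_thr, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2   # candidate midpoint
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best_score:
            best_thr, best_score = thr, score
    return best_thr

# Hypothetical cost-effectiveness ratios ($/QALY) and funding outcomes (1 = funded)
icers = [1000, 2000, 3000, 40000, 50000, 60000]
funded = [1, 1, 1, 0, 0, 0]
threshold = best_threshold(icers, funded)
```

A full CART model repeats this search recursively within each branch, which is what produces the hierarchies of intervention type and objective the study reports.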
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
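One alternative criterion the exploratory-modeling literature commonly pairs with ABM is minimax regret: rather than optimizing against a single best-estimate model, evaluate each policy across an ensemble of plausible scenarios and pick the one whose worst-case regret is smallest. This sketch uses invented payoffs and is an illustration of that general idea, not necessarily the paper's specific method:

```python
def minimax_regret(payoffs):
    """Pick the policy whose worst-case regret over a scenario ensemble
    is smallest; payoffs[policy][scenario] is the payoff matrix."""
    n_scen = len(payoffs[0])
    # Best achievable payoff in each scenario
    best = [max(p[s] for p in payoffs) for s in range(n_scen)]
    # Each policy's worst shortfall from that best, across scenarios
    regrets = [max(best[s] - p[s] for s in range(n_scen)) for p in payoffs]
    return min(range(len(payoffs)), key=lambda i: regrets[i])

# Invented payoffs: policy 0 is tuned to scenario 0, policy 1 hedges
payoffs = [
    [10, 2, 8],
    [8, 5, 8],
    [9, 1, 9],
]
robust = minimax_regret(payoffs)   # index of the robust policy
```

Note how the hedging policy wins even though it is best in no single scenario, which is exactly the behaviour a best-estimate analysis would miss.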
A qualitative study of women's decision-making at the end of IVF treatment.
Peddie, V L; van Teijlingen, E; Bhattacharya, S
2005-07-01
The decision not to pursue further in vitro fertilization (IVF) after one or more unsuccessful attempts is an important and often difficult one for couples. Relatively little is known about the woman's perception of this decision-making process. The aim of this study was to examine patients' perspectives of decision-making, including circumstances influencing it and satisfaction with the decision-making process. Semi-structured interviews were conducted with a purposive sample of 25 women who had decided to end treatment after unsuccessful IVF treatment. Interviews were tape-recorded, transcribed, and analysed thematically using the open coding technique. Women experienced difficulty in accepting that their infertility would remain unresolved. Many felt that they had started with unrealistic expectations of treatment success and felt vulnerable to the pressures of both the media and society. Although the decision to end treatment was difficult, it offered many women a way out of the emotional distress caused by IVF; however, the process of decision-making created a sense of 'confrontation' for the women in which they had to address issues they had previously avoided. Adoptive parents perceived less societal pressure than those who remained childless. Efforts to improve the psychological preparation of couples who decide to end IVF treatment should be directed towards examination of the existing system of consultation, which has certain limitations in terms of the quality of communication and the provision of post-treatment support. Further efforts to develop strategies that facilitate the decision-making process should be considered.
Fault trees for decision making in systems analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Howard E.
1975-10-09
The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the most optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
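The quantitative side of FTA described above can be sketched from minimal cut sets: under the rare-event approximation the top-event probability is the sum of cut-set probabilities, and basic events can be ranked by how much they contribute. The event names and probabilities below are invented, and Fussell-Vesely importance stands in here for the IMPORTANCE code's own sensitivity measures:

```python
def cut_set_prob(cut_set, p):
    """Probability that every basic event in a minimal cut set occurs."""
    prob = 1.0
    for event in cut_set:
        prob *= p[event]
    return prob

def top_event_prob(cut_sets, p):
    # Rare-event approximation: sum of minimal-cut-set probabilities
    return sum(cut_set_prob(cs, p) for cs in cut_sets)

def fussell_vesely(event, cut_sets, p):
    # Fraction of top-event probability from cut sets containing the event
    total = top_event_prob(cut_sets, p)
    contrib = sum(cut_set_prob(cs, p) for cs in cut_sets if event in cs)
    return contrib / total

# Invented basic-event probabilities and minimal cut sets
p = {"pump": 0.01, "valve": 0.02, "sensor": 0.05}
cut_sets = [{"pump", "valve"}, {"sensor"}]
top = top_event_prob(cut_sets, p)
```

The ranking this produces (the single-event "sensor" cut set dominates) is the kind of output an analyst would use to decide where redundancy or sensors pay off most.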
NASA Astrophysics Data System (ADS)
Goienetxea Uriarte, A.; Ruiz Zúñiga, E.; Urenda Moris, M.; Ng, A. H. C.
2015-05-01
Discrete Event Simulation (DES) is nowadays widely used to support decision makers in system analysis and improvement. However, the use of simulation for improving stochastic logistic processes is not common among healthcare providers. Improving healthcare systems involves dealing with trade-off optimal solutions that take a large number of variables and objectives into consideration. Complementing DES with Simulation-based Multi-Objective Optimization (SMO) creates a superior base for finding these solutions and, in consequence, facilitates the decision-making process. This paper presents how SMO has been applied for system improvement analysis in a Swedish Emergency Department (ED). A significant number of input variables, constraints and objectives were considered when defining the optimization problem. As a result of the project, the decision makers were provided with a range of optimal solutions that considerably reduce the length of stay and waiting times for ED patients. SMO has proved to be an appropriate technique to support healthcare system design and improvement processes. A key factor for the success of this project has been the involvement and engagement of the stakeholders during the whole process.
NASA Astrophysics Data System (ADS)
Kacprzyk, Janusz; Zadrożny, Sławomir
2010-05-01
We present how the conceptually and numerically simple concept of a fuzzy linguistic database summary can be a very powerful tool for gaining much insight into the very essence of data. The use of linguistic summaries provides tools for the verbalisation of data analysis (mining) results which, in addition to the more commonly used visualisation, e.g. via a graphical user interface, can contribute to an increased human consistency and ease of use, notably for supporting decision makers via the data-driven decision support system paradigm. Two new relevant aspects of the analysis are also outlined which were first initiated by the authors. First, following Kacprzyk and Zadrożny, it is further considered how linguistic data summarisation is closely related to some types of solutions used in natural language generation (NLG). This can make it possible to use more and more effective and efficient tools and techniques developed in NLG. Second, similar remarks are given on relations to systemic functional linguistics. Moreover, following Kacprzyk and Zadrożny, comments are given on an extremely relevant aspect of scalability of linguistic summarisation of data, using a new concept of a conceptual scalability.
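The truth degree of a linguistic summary such as "most patients are tall" can be computed in the classic Yager style: average the summarizer's membership over the data, then pass that proportion through the quantifier's membership function. The membership shapes below are assumed for illustration, not taken from the paper:

```python
def truth_degree(data, summarizer, quantifier):
    """Yager truth of 'Q objects in data are S': the average summarizer
    membership passed through the quantifier's membership function."""
    r = sum(summarizer(x) for x in data) / len(data)
    return quantifier(r)

def tall(height_cm):
    # Assumed summarizer membership: linear ramp from 160 cm to 190 cm
    return min(1.0, max(0.0, (height_cm - 160) / 30))

def most(r):
    # Assumed membership function for the fuzzy quantifier 'most'
    return min(1.0, max(0.0, (r - 0.3) / 0.5))

heights = [150, 165, 170, 180, 185, 190]
truth = truth_degree(heights, tall, most)  # truth of "most patients are tall"
```

A summarisation system would evaluate many candidate (quantifier, summarizer) pairs like this and verbalise only the summaries whose truth degree is high.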
Westreich, Daniel; Lessler, Justin; Funk, Michele Jonsson
2010-08-01
Propensity scores for the analysis of observational data are typically estimated using logistic regression. Our objective in this review was to assess machine learning alternatives to logistic regression, which may accomplish the same goals but with fewer assumptions or greater accuracy. We identified alternative methods for propensity score estimation and/or classification from the public health, biostatistics, discrete mathematics, and computer science literature, and evaluated these algorithms for applicability to the problem of propensity score estimation, potential advantages over logistic regression, and ease of use. We identified four techniques as alternatives to logistic regression: neural networks, support vector machines, decision trees (classification and regression trees [CART]), and meta-classifiers (in particular, boosting). Although the assumptions of logistic regression are well understood, those assumptions are frequently ignored. All four alternatives have advantages and disadvantages compared with logistic regression. Boosting (meta-classifiers) and, to a lesser extent, decision trees (particularly CART), appear to be most promising for use in the context of propensity score analysis, but extensive simulation studies are needed to establish their utility in practice. Copyright (c) 2010 Elsevier Inc. All rights reserved.
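The logistic-regression baseline against which the review weighs machine-learning alternatives can be sketched in a few lines of plain gradient descent; the single-confounder data below are invented:

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Gradient-descent logistic regression: the standard way to
    estimate propensity scores."""
    w = [0.0] * (len(X[0]) + 1)            # intercept + coefficients
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + math.exp(-z))     # predicted treatment probability
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def propensity(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + math.exp(-z))

# Invented single-confounder data: treated units tend to have larger x
X = [[0.1], [0.3], [0.4], [0.6], [0.8], [0.9]]
treated = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, treated)
scores = [propensity(w, xi) for xi in X]
```

The alternatives the review discusses (CART, boosting, neural networks, support vector machines) would replace `fit_logistic` with a more flexible estimator of the same treatment probability.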
Suebnukarn, Siriwan; Chanakarn, Piyawadee; Phisutphatthana, Sirada; Pongpatarat, Kanchala; Wongwaithongdee, Udom; Oupadissakoon, Chanekrid
2015-12-01
An understanding of the processes of clinical decision-making is essential for the development of health information technology. In this study we have analysed the acquisition of information during decision-making in oral surgery, and analysed cognitive tasks using a "think-aloud" protocol. We studied the techniques of processing information that were used by novices and experts as they completed 4 oral surgical cases modelled from data obtained from electronic hospital records. We studied 2 phases of an oral surgeon's preoperative practice: the "diagnosis and planning of treatment" and "preparing for a procedure". A framework analysis approach was used to analyse the qualitative data, and a descriptive statistical analysis was made of the quantitative data. The results showed that novice surgeons used hypothetico-deductive reasoning, whereas experts recognised patterns to diagnose and manage patients. Novices provided less detail when they prepared for a procedure. Concepts regarding "signs", "importance", "decisions", and "process" occurred most often during acquisition of information by both novices and experts. Based on these results, we formulated recommendations for the design of clinical information technology that would help to improve the acquisition of clinical information required by oral surgeons at all levels of expertise in their clinical decision-making. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Benute, Gláucia R G; Nomura, Roseli M Y; Liao, Adolfo W; Brizot, Maria de Lourdes; de Lucia, Mara C S; Zugaib, M
2012-08-01
This study investigated the feelings of women regarding end-of-life decision making after ultrasound diagnosis of a lethal fetal malformation. The aim was to present the decision-making process of women who chose pregnancy termination and to present selected excerpts of women's accounts of their feelings. Open psychological interviews were conducted by a psychologist immediately after the diagnosis of fetal malformation by ultrasound. The results were analysed using a content analysis technique. The study was carried out at a public university hospital in Brazil with 249 pregnant women who had received the diagnosis of a severe lethal fetal malformation. Fetal anencephaly was the most frequent anomaly, detected in 135 cases (54.3%). Termination of pregnancy was decided on by 172 (69.1%) patients and legally authorised by the judiciary (66%). In all of these cases the reason for requesting termination was to reduce suffering. Among the 77 women who chose not to terminate the pregnancy (30.9%), the reasons were related to feelings of guilt (74%). The results support the importance of psychological counselling for couples when a lethal fetal malformation is diagnosed. The act of reviewing moral and cultural values and elements of the unconscious provides assurance in the decision-making process and mitigates the risk of emotional trauma and guilt that can continue long after the pregnancy is terminated. Copyright © 2011 Elsevier Ltd. All rights reserved.
Interactive decision support in hepatic surgery
Dugas, Martin; Schauer, Rolf; Volk, Andreas; Rau, Horst
2002-01-01
Background Hepatic surgery is characterized by complicated operations with a significant peri- and postoperative risk for the patient. We developed a web-based, high-granular research database for comprehensive documentation of all relevant variables to evaluate new surgical techniques. Methods To integrate this research system into the clinical setting, we designed an interactive decision support component. The objective is to provide relevant information for the surgeon and the patient to assess preoperatively the risk of a specific surgical procedure. Based on five established predictors of patient outcomes, the risk assessment tool searches for similar cases in the database and aggregates the information to estimate the risk for an individual patient. Results The physician can verify the analysis and exclude manually non-matching cases according to his expertise. The analysis is visualized by means of a Kaplan-Meier plot. To evaluate the decision support component we analyzed data on 165 patients diagnosed with hepatocellular carcinoma (period 1996–2000). The similarity search provides a two-peak distribution indicating there are groups of similar patients and singular cases which are quite different to the average. The results of the risk estimation are consistent with the observed survival data, but must be interpreted with caution because of the limited number of matching reference cases. Conclusion Critical issues for the decision support system are clinical integration, a transparent and reliable knowledge base and user feedback. PMID:12003639
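The Kaplan-Meier curve used to visualise the survival of similar cases can be sketched as a product-limit computation; the survival times and censoring flags below are invented:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. events[i] = 1 means the event
    (death) was observed at times[i]; 0 means the case was censored."""
    data = sorted(zip(times, events))
    s, curve = 1.0, []
    for t in sorted(set(times)):
        n_t = sum(1 for tt, _ in data if tt >= t)          # at risk at t
        d_t = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        if d_t:
            s *= 1 - d_t / n_t
            curve.append((t, s))   # step down only where events occur
    return curve

# Invented follow-up data (months); censored cases lower n without a step
times = [6, 12, 12, 18, 24, 30]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
```

In the decision-support setting described above, the curve would be computed over the manually verified set of similar reference cases rather than over a fixed list.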
Stock and option portfolio using fuzzy logic approach
NASA Astrophysics Data System (ADS)
Sumarti, Novriana; Wahyudi, Nanang
2014-03-01
Fuzzy logic has been widely implemented in decision-making processes across industry. It is a theory of imprecision and uncertainty that is not based on probability theory. Fuzzy logic adds degrees of truth between absolute true and absolute false. It starts with, and builds on, a set of human-language rules supplied by the user, which the fuzzy system converts into their mathematical equivalents. This simplifies the job of the system designer and the computer, and results in much more accurate representations of the way systems behave in the real world. In this paper we examine the decision-making process of stock and option trading using MACD (Moving Average Convergence Divergence) technical analysis and option pricing with a fuzzy logic approach. MACD technical analysis predicts the trend of the underlying stock price: bearish (going downward), bullish (going upward), or sideways. Using the Fuzzy C-Means technique and a Mamdani fuzzy inference system, we define the decision output so that when the MACD value is high the decision is "Strong Sell", and when the MACD value is low the decision is "Strong Buy". We also implement a fuzzification of the Black-Scholes option-pricing formula. The stock and option methods are applied to a portfolio of one stock and its options. Even though input values such as interest rates, the stock price and its volatility cannot be obtained accurately, these fuzzy methods attach a belief degree to the calculated Black-Scholes value, supporting the decision on option trading. The results show the good capability of the methods in predicting stock price trends. The performance of the simulated portfolio over a particular period also shows a good return.
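The MACD signal and a fuzzy membership over it can be sketched as follows. The membership bounds are assumed for illustration only and stand in for the calibrated Fuzzy C-Means/Mamdani machinery the paper actually uses:

```python
def ema(prices, span):
    """Exponential moving average with the usual 2/(span+1) smoothing."""
    alpha = 2 / (span + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26):
    # MACD line: fast EMA minus slow EMA of the closing prices
    return [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]

def buy_membership(x, low=-1.0, high=0.0):
    # Assumed linear membership: very low MACD -> "Strong Buy" (degree 1)
    if x <= low:
        return 1.0
    if x >= high:
        return 0.0
    return (high - x) / (high - low)

flat = macd([100.0] * 30)       # constant prices: MACD stays at zero
signal = buy_membership(-0.5)   # partial membership in "buy"
```

A Mamdani system would combine several such memberships ("Strong Buy", "Hold", "Strong Sell") through rules and defuzzify the result, rather than reading a single membership directly.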
Determining flexor-tendon repair techniques via soft computing
NASA Technical Reports Server (NTRS)
Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.
2001-01-01
An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and with variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to perform ad-hoc decisions when the outcomes for individual objectives are contradictory to a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous straightforward computational process in which changing preferences and importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.
Decision rules for unbiased inventory estimates
NASA Technical Reports Server (NTRS)
Argentiero, P. D.; Koch, D.
1979-01-01
An efficient and accurate procedure for estimating inventories from remote sensing scenes is presented. In place of the conventional and expensive full-dimensional Bayes decision rule, a one-dimensional feature extraction and classification technique was employed. It is shown that this efficient decision rule can be used to develop unbiased inventory estimates and that, for the large sample sizes typical of satellite-derived remote sensing scenes, the resulting accuracies are comparable or superior to those of more expensive alternative procedures. Mathematical details of the procedure are provided in the body of the report and in the appendix. Results of a numerical simulation of the technique using statistics obtained from an observed LANDSAT scene are included. The simulation demonstrates the effectiveness of the technique in computing accurate inventory estimates.
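The bias-correction idea behind unbiased inventory estimation can be sketched as follows. This is an illustrative reconstruction, not the report's exact procedure: if the classifier's confusion probabilities are known, raw label proportions can be corrected by inverting the confusion matrix. The matrix values are invented for the example.

```python
# Sketch (assumed, not the report's exact method): correcting a raw
# classification-based inventory for known misclassification rates.
# If P[i][j] is the probability that a pixel of true class i receives
# label j, the observed label proportions q satisfy q = P^T p, so an
# unbiased estimate of the true proportions p solves (P^T) p = q.

def unbiased_inventory(q, P):
    """Solve (P^T) p = q for a 2-class problem by direct 2x2 inversion."""
    a, b = P[0][0], P[1][0]   # row 0 of P^T
    c, d = P[0][1], P[1][1]   # row 1 of P^T
    det = a * d - b * c
    p0 = (d * q[0] - b * q[1]) / det
    p1 = (-c * q[0] + a * q[1]) / det
    return [p0, p1]

# Classifier confuses 10% of class-0 pixels and 20% of class-1 pixels.
P = [[0.9, 0.1],
     [0.2, 0.8]]
q = [0.55, 0.45]              # observed label proportions in the scene
p = unbiased_inventory(q, P)  # true proportions, bias removed
```

The corrected estimate here is [0.5, 0.5]: the raw 55/45 split was an artifact of asymmetric misclassification rates.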
Techniques for optimal crop selection in a controlled ecological life support system
NASA Technical Reports Server (NTRS)
Mccormack, Ann; Finn, Cory; Dunsky, Betsy
1993-01-01
A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.
Techniques for optimal crop selection in a controlled ecological life support system
NASA Technical Reports Server (NTRS)
Mccormack, Ann; Finn, Cory; Dunsky, Betsy
1992-01-01
A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed as impact summaries.
The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches, [10, 12], as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the KLEE Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
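The core claim, that checking only impacted behaviors suffices for bounded equivalence, can be illustrated with a toy sketch. The paper uses symbolic execution and an SMT solver; here, bounded exhaustive testing stands in for the depth bound, and the functions and partition are invented for illustration.

```python
# Illustrative sketch only: two versions of a routine differ only in the
# branch taken for negative inputs, so only those "impacted" behaviors
# need to be checked for equivalence; unimpacted behaviors are identical
# by construction. Bounded exhaustive checking emulates the depth bound.

def v1(x):
    return -x if x < 0 else x * 2

def v2(x):
    return abs(x) if x < 0 else x * 2   # refactored negative branch

def equivalent_on(inputs, f, g):
    return all(f(x) == g(x) for x in inputs)

# Partition behaviors: only the negative branch changed between versions.
impacted = [x for x in range(-100, 101) if x < 0]

ok_impacted = equivalent_on(impacted, v1, v2)          # changed branch agrees
ok_full = equivalent_on(range(-100, 101), v1, v2)      # hence full equivalence
```

Because the positive branch is syntactically unchanged, agreement on the impacted inputs implies agreement everywhere within the bound, which is the intuition behind the soundness-and-completeness proof.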
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human / software agent teams.
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure, and their performance has been adjudged through post-selection classification accuracy computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that, under the multiple-criteria decision-making setup, the proposed technique is the best available alternative for informative gene selection. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR.
This study provides a practical guide for selecting statistical techniques to identify informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
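The maximum-relevance/minimum-redundancy idea underlying Boot-MRMR can be sketched greedily. This is a hedged illustration only: absolute Pearson correlation stands in for the information measures, the bootstrap step is omitted, and the data are invented.

```python
# Greedy mRMR sketch (assumed simplification of Boot-MRMR): at each step
# pick the gene maximizing relevance to the class labels minus its mean
# redundancy with already-selected genes.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def mrmr(genes, labels, k):
    """genes: dict name -> expression vector; labels: class vector."""
    selected = []
    candidates = dict(genes)
    while len(selected) < k and candidates:
        def score(name):
            relevance = abs(pearson(candidates[name], labels))
            redundancy = (sum(abs(pearson(candidates[name], genes[s]))
                              for s in selected) / len(selected)) if selected else 0.0
            return relevance - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        del candidates[best]
    return selected

genes = {"g1": [1, 2, 3, 4], "g2": [1, 2, 3, 5], "g3": [4, 1, 3, 2]}
labels = [0, 0, 1, 1]
picked = mrmr(genes, labels, 2)
```

Here g1 is selected first (highest relevance), and g2 edges out the irrelevant but less redundant g3 on the composite score.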
Schmidtke, Daniel; Matsuki, Kazunaga; Kuperman, Victor
2017-11-01
The current study addresses a discrepancy in the psycholinguistic literature about the chronology of information processing during the visual recognition of morphologically complex words. Form-then-meaning accounts of complex word recognition claim that morphemes are processed as units of form prior to any influence of their meanings, whereas form-and-meaning models posit that recognition of complex word forms involves the simultaneous access of morphological and semantic information. The study reported here addresses this theoretical discrepancy by applying a nonparametric distributional technique of survival analysis (Reingold & Sheridan, 2014) to 2 behavioral measures of complex word processing. Across 7 experiments reported here, this technique is employed to estimate the point in time at which orthographic, morphological, and semantic variables exert their earliest discernible influence on lexical decision RTs and eye movement fixation durations. Contrary to form-then-meaning predictions, Experiments 1-4 reveal that surface frequency is the earliest lexical variable to exert a demonstrable influence on lexical decision RTs for English and Dutch derived words (e.g., badness ; bad + ness ), English pseudoderived words (e.g., wander ; wand + er ) and morphologically simple control words (e.g., ballad ; ball + ad ). Furthermore, for derived word processing across lexical decision and eye-tracking paradigms (Experiments 1-2; 5-7), semantic effects emerge early in the time-course of word recognition, and their effects either precede or emerge simultaneously with morphological effects. These results are not consistent with the premises of the form-then-meaning view of complex word recognition, but are convergent with a form-and-meaning account of complex word recognition. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Uncovering the requirements of cognitive work.
Roth, Emilie M
2008-06-01
In this article, the author provides an overview of cognitive analysis methods and how they can be used to inform system analysis and design. Human factors has seen a shift toward modeling and support of cognitively intensive work (e.g., military command and control, medical planning and decision making, supervisory control of automated systems). Cognitive task analysis and cognitive work analysis methods extend traditional task analysis techniques to uncover the knowledge and thought processes that underlie performance in cognitively complex settings. The author reviews the multidisciplinary roots of cognitive analysis and the variety of cognitive task analysis and cognitive work analysis methods that have emerged. Cognitive analysis methods have been used successfully to guide system design, as well as development of function allocation, team structure, and training, so as to enhance performance and reduce the potential for error. A comprehensive characterization of cognitive work requires two mutually informing analyses: (a) examination of domain characteristics and constraints that define cognitive requirements and challenges and (b) examination of practitioner knowledge and strategies that underlie both expert and error-vulnerable performance. A variety of specific methods can be adapted to achieve these aims within the pragmatic constraints of particular projects. Cognitive analysis methods can be used effectively to anticipate cognitive performance problems and specify ways to improve individual and team cognitive performance (be it through new forms of training, user interfaces, or decision aids).
Multi-criteria decision analysis for waste management in Saharawi refugee camps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garfi, M.; Tondelli, S.; Bonoli, A.
2009-10-15
The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi criteria analysis, and in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on participatory approach, focusing on local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, disposal and burning in an open area; waste collection by using seven dumpers and disposal in a landfill; waste collection by using seven dumpers and three tipper trucks and disposal in a landfill; waste collection by using three tipper trucks and disposal in a landfill. The results show that the second and the third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach, and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.
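The AHP machinery the paper relies on can be sketched compactly. This is a generic illustration with an invented pairwise-comparison matrix, not the paper's survey data: priorities come from the geometric-mean approximation, and Saaty's consistency ratio flags incoherent judgments.

```python
import math

# AHP sketch: derive criterion weights from a pairwise-comparison matrix
# via the geometric-mean approximation, then check Saaty's consistency
# ratio (values below 0.10 are conventionally acceptable).

def ahp_priorities(A):
    gm = [math.prod(row) ** (1 / len(row)) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A, w):
    n = len(A)
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n   # estimate of lambda_max
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random indices
    return ci / ri

# Illustrative judgments: criterion 1 moderately-to-strongly dominates.
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w = ahp_priorities(A)
cr = consistency_ratio(A, w)
```

With these judgments the weights come out roughly (0.65, 0.23, 0.12) with a consistency ratio well under 0.10, so the comparisons would be accepted.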
NASA Astrophysics Data System (ADS)
Gumilar, I.; Rizal, A.; Sriati; Setiawan Putra, R.
2018-04-01
This research aimed to analyze the process of decision making when purchasing ornamental freshwater fish on Peta Street, Bandung City, and to identify the factors driving consumers to buy freshwater fish there. The method used in this research is a case study with rating-scale and Spearman rank analysis. The sampling technique is accidental random sampling, with 30 respondents. The consumer decision-making process consists of five stages: need recognition, information search, alternative evaluation, purchase, and post-purchase evaluation. The results showed that at the need-recognition stage, the motivation for purchasing freshwater fish is that respondents are very fond of ornamental freshwater fish; at the information-search stage, the sources of information are print media and friends or neighbors. At the alternative-evaluation stage, consumers buy ornamental freshwater fish because of good product quality. At the purchase stage, consumers bought 1-5 fish, with a purchase frequency of once per month. In the post-purchase evaluation, consumers felt very satisfied with the fish products and found the price very affordable. The Spearman rank test was used to examine the factors that influence consumers' purchasing motivation. The results showed that product quality and price are the factors that most influence the purchase decision of ornamental freshwater fish, with Student's t values of 3.968 and 2.107.
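The study's Spearman rank test can be sketched as follows, with invented ratings rather than the survey's data: the coefficient measures monotone association between quality ratings and purchase-decision scores, handling ties by average ranks.

```python
# Spearman rank correlation sketch (illustrative data, not the survey's).

def rank(values):
    """Ranks starting at 1, with average ranks for ties."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

quality = [4, 5, 3, 5, 2, 4]     # product-quality ratings per respondent
decision = [3, 5, 2, 4, 1, 4]    # purchase-decision scores
rho = spearman(quality, decision)
```

Note the classic 1 - 6*sum(d^2)/(n(n^2-1)) formula is only approximate under ties; it is used here for brevity.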
Local Management of Schools: Rationality and Decision-Making in the Employment of Teachers.
ERIC Educational Resources Information Center
Huckman, Lynda; Hill, Tim
1994-01-01
Examines the use of rational planning techniques in five English elementary schools. Discusses the decision-making processes used to determine the employment and remuneration of teachers. Finds that the degree of control over decision making was related closely to the extent to which decisions would contribute to solutions of other school…
Denys Yemshanov; Frank H. Koch; Mark Ducey; Klaus Koehler
2013-01-01
Geographic mapping of risks is a useful analytical step in ecological risk assessments, particularly in analyses aimed at estimating the risks associated with introductions of invasive organisms. In this paper, we approach invasive species risk mapping as a portfolio allocation problem and apply techniques from decision theory to build an invasion risk map that combines...
NASA Astrophysics Data System (ADS)
Vatutin, Eduard
2017-12-01
The article analyses the effectiveness of heuristic methods that use limited depth-first search techniques to obtain decisions in the test problem of finding the shortest path in a graph. It briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree. The methodology for comparing experimental data to estimate solution quality is based on computational experiments with samples of pseudo-random graphs of selected vertex and arc counts, using the BOINC platform. The experimental results identify the areas in which the selected subset of heuristic methods is preferable, depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective on this problem and significantly inferior, in solution quality, to the ant colony optimization method and its modification with combinatorial returns.
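A branch- and depth-limited DFS of the kind the article studies can be sketched as follows. This is an assumed illustration, not the article's exact algorithm: at each node only the `branch` cheapest outgoing edges are explored, and paths visiting more than `depth` nodes are pruned.

```python
# Heuristic shortest-path search sketch: depth-first search with a limit
# on subtree depth and on the number of branches expanded per node.

def limited_dfs(graph, start, goal, depth, branch):
    best = [float("inf")]

    def walk(node, cost, seen):
        if cost >= best[0] or len(seen) > depth:
            return                      # prune: dominated or too deep
        if node == goal:
            best[0] = cost
            return
        # expand only the `branch` cheapest outgoing edges
        edges = sorted(graph.get(node, []), key=lambda e: e[1])[:branch]
        for nxt, w in edges:
            if nxt not in seen:
                walk(nxt, cost + w, seen | {nxt})

    walk(start, 0, {start})
    return best[0]

graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 1), ("D", 7)],
    "C": [("D", 2)],
}
cost = limited_dfs(graph, "A", "D", depth=4, branch=2)
```

With generous limits the search finds the true optimum (A-B-C-D, cost 4); tightening `depth` or `branch` trades solution quality for search effort, which is exactly the trade-off the article measures.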
Prospect theory in the valuation of health.
Moffett, Maurice L; Suarez-Almazor, Maria E
2005-08-01
Prospect theory is the prominent nonexpected utility theory in the estimation of health state preference scores for quality-adjusted life year calculation. Until recently, the theory was not considered to be developed to the point of implementation in economic analysis. This review focuses on the research and evidence that tests the implementation of prospect theory in health state valuation. The typical application of expected utility theory assumes that a decision maker has stable preferences under conditions of risk and uncertainty. Under prospect theory, preferences depend on whether the decision maker regards the outcome of a choice as a gain or a loss, relative to a reference point. The conceptual preference for standard gamble utilities in the valuation of health states has led to the development of elicitation techniques. Empirical evidence using these techniques indicates that when individual preferences are elicited, a prospect theory consistent framework appears to be necessary for adequate representation of individual health utilities. The relevance of prospect theory to policy making and resource allocation remains to be established. Societal preferences need not reflect the same risk attitudes as individual preferences and may remain largely risk neutral.
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
Balanced Scorecard (BSC) is a strategic evaluation tool using both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the Balanced Scorecard (BSC) and multi-criteria decision-making (MCDM) methods is proposed to evaluate the performance of research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods including Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) for ranking of alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
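The final fusion step, combining rankings produced by several MCDM methods, can be illustrated with a simpler aggregator than the paper's weighted utility intervals. The sketch below uses an unweighted Borda count over invented method-specific rankings, purely to show the idea of rank fusion.

```python
# Borda-count sketch (assumed stand-in for the paper's utility-interval
# fusion): each method's ranking awards n-1 points to its top choice,
# n-2 to the next, and so on; totals give the consensus ranking.

def borda(rankings):
    """rankings: list of lists, each ordered best-to-worst."""
    n = len(rankings[0])
    scores = {}
    for ranking in rankings:
        for position, alt in enumerate(ranking):
            scores[alt] = scores.get(alt, 0) + (n - 1 - position)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical rankings of four research centers by ARAS, COPRAS,
# MOORA, and TOPSIS respectively.
rankings = [
    ["C1", "C3", "C2", "C4"],
    ["C1", "C2", "C3", "C4"],
    ["C3", "C1", "C2", "C4"],
    ["C1", "C3", "C4", "C2"],
]
consensus = borda(rankings)
```

The consensus here is C1 first and C4 last even though no two methods agree exactly, which is the point of aggregating across methods.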
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation endure limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models shall be monitored in context all along the design phases to build confidence in achievement of the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in the aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
Knapsack--TOPSIS Technique for Vertical Handover in Heterogeneous Wireless Network.
Malathy, E M; Muthuswamy, Vijayalakshmi
2015-01-01
In a heterogeneous wireless network, handover techniques are designed to facilitate anywhere/anytime service continuity for mobile users. Consistent best-possible access to a network with widely varying network characteristics requires seamless mobility management techniques. Hence, the vertical handover process imposes important technical challenges. Handover decisions are triggered for continuous connectivity of mobile terminals. However, bad network selection and overload conditions in the chosen network can cause fallout in the form of handover failure. In order to maintain the required Quality of Service during the handover process, decision algorithms should incorporate intelligent techniques. In this paper, a new and efficient vertical handover mechanism is implemented using a dynamic programming method from the operation research discipline. This dynamic programming approach, which is integrated with the Technique to Order Preference by Similarity to Ideal Solution (TOPSIS) method, provides the mobile user with the best handover decisions. Moreover, in this proposed handover algorithm a deterministic approach which divides the network into zones is incorporated into the network server in order to derive an optimal solution. The study revealed that this method is found to achieve better performance and QoS support to users and greatly reduce the handover failures when compared to the traditional TOPSIS method. The decision arrived at the zone gateway using this operational research analytical method (known as the dynamic programming knapsack approach together with Technique to Order Preference by Similarity to Ideal Solution) yields remarkably better results in terms of the network performance measures such as throughput and delay.
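The TOPSIS half of the proposed mechanism can be sketched generically. This is a minimal illustration with invented network data, not the paper's zone-based knapsack formulation: alternatives are candidate networks scored on bandwidth (a benefit criterion) and delay (a cost criterion), and the closeness coefficient ranks them.

```python
# Minimal TOPSIS sketch: vector-normalize, weight, measure distance to
# the ideal and anti-ideal solutions, and rank by relative closeness.

def topsis(matrix, weights, benefit):
    m = len(weights)
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(m)]
    V = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*V))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*V))]

    def dist(row, ref):
        return sum((a - b) ** 2 for a, b in zip(row, ref)) ** 0.5

    return [dist(r, anti) / (dist(r, anti) + dist(r, ideal)) for r in V]

matrix = [[100, 30],   # network A: bandwidth (Mbps), delay (ms)
          [60, 10],    # network B
          [80, 20]]    # network C
closeness = topsis(matrix, weights=[0.5, 0.5], benefit=[True, False])
best = max(range(len(matrix)), key=closeness.__getitem__)
```

With equal weights the low-delay network B wins despite its lower bandwidth; shifting weight toward bandwidth would move the handover decision toward network A.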
Jank, Louise; Martins, Magda Targa; Arsand, Juliana Bazzan; Campos Motta, Tanara Magalhães; Hoff, Rodrigo Barcellos; Barreto, Fabiano; Pizzolato, Tânia Mara
2015-11-01
A fast and simple method for residue analysis of the antibiotic classes of macrolides (erythromycin, azithromycin, tylosin, tilmicosin and spiramycin) and lincosamides (lincomycin and clindamycin) was developed and validated for cattle, swine and chicken muscle and for bovine milk. Sample preparation consists of a liquid-liquid extraction (LLE) with acetonitrile, followed by liquid chromatography-electrospray-tandem mass spectrometry analysis (LC-ESI-MS/MS), without the need for any additional clean-up steps. Chromatographic separation was achieved using a C18 column and a mobile phase composed of acidified acetonitrile and water. The method was fully validated according to the criteria of Commission Decision 2002/657/EC. Validation parameters such as limit of detection, limit of quantification, linearity, accuracy, repeatability, specificity, reproducibility, decision limit (CCα) and detection capability (CCβ) were evaluated. All calculated values met the established criteria. Reproducibility values, expressed as coefficient of variation, were all lower than 19.1%. Recoveries ranged from 60% to 107%. Limits of detection were from 5 to 25 µg kg(-1). The present method is suitable for routine analysis, with adequate analysis time, low cost and a simple sample preparation protocol. Copyright © 2015. Published by Elsevier B.V.
Integrating climate change criteria in reforestation projects using a hybrid decision-support system
NASA Astrophysics Data System (ADS)
Curiel-Esparza, Jorge; Gonzalez-Utrillas, Nuria; Canto-Perello, Julian; Martin-Utrillas, Manuel
2015-09-01
The selection of appropriate species in a reforestation project has always been a complex decision-making problem in which, due mostly to government policies and other stakeholders, not only economic criteria but also other environmental issues interact. Climate change has not usually been taken into account in traditional reforestation decision-making strategies and management procedures. Moreover, there is a lack of agreement on the percentage of each of the species in reforestation planning, which is usually calculated in a discretionary way. In this context, an effective multicriteria technique has been developed in order to improve the process of selecting species for reforestation in the Mediterranean region of Spain. A hybrid Delphi-AHP methodology is proposed, which includes a consistency analysis in order to reduce random choices. As a result, this technique provides an optimal percentage distribution of the appropriate species to be used in reforestation planning. The highest weights among the subcriteria corresponded to FR (forest fire response) and PR (pests and diseases risk), because of the increasing importance of the impact of climate change on the forest. However, CB (conservation of biodiversity) was in third position, in line with the aim of reforestation. The most suitable species were therefore Quercus faginea (19.75%) and Quercus ilex (19.35%), which offer a good balance between all the factors affecting the success and viability of reforestation.
Multicriteria Decision Framework for Cybersecurity Risk Assessment and Management.
Ganin, Alexander A; Quach, Phuoc; Panwar, Mahesh; Collier, Zachary A; Keisler, Jeffrey M; Marchese, Dayton; Linkov, Igor
2017-09-05
Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and their complex network structures, often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate the biases and subjectivity involved in selecting countermeasures, but provides justifiable methods for selecting risk management actions consistent with stakeholder and decision-maker values and technical data. Published 2017. This article is a U.S. Government work and is in the public domain in the U.S.A.
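A multicriteria ranking of this kind can be sketched as simple additive weighting over threat, vulnerability, and consequence criteria. All weights and scores below are invented for illustration and are not taken from the study:

```python
import numpy as np

# Illustrative 0-1 utility scores for five hypothetical cybersecurity
# enhancement strategies against three criteria groups:
# [threat reduction, vulnerability reduction, consequence mitigation].
weights = np.array([0.4, 0.35, 0.25])      # stakeholder-elicited, sum to 1
scores = np.array([
    [0.6, 0.7, 0.50],   # strategy A
    [0.8, 0.4, 0.65],   # strategy B
    [0.5, 0.9, 0.40],   # strategy C
    [0.7, 0.6, 0.70],   # strategy D
    [0.3, 0.5, 0.90],   # strategy E
])

utility = scores @ weights                 # overall utility per strategy
ranking = np.argsort(-utility)             # best strategy first
print(utility.round(4), "best:", int(ranking[0]))
```

Real applications would add criterion normalization and sensitivity analysis of the weights, but the roll-up to a single utility per alternative is the core of the ranking step.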
On Developing a Taxonomy for Multidisciplinary Design Optimization: A Decision-Based Perspective
NASA Technical Reports Server (NTRS)
Lewis, Kemper; Mistree, Farrokh
1995-01-01
In this paper, we approach MDO from a Decision-Based Design (DBD) perspective and explore classification schemes for designing complex systems and processes. Specifically, we focus on decisions, which are only a small portion of the Decision Support Problem (DSP) Technique, our implementation of DBD. We map coupled nonhierarchical and hierarchical representations from the DSP Technique into the Balling-Sobieski (B-S) framework (Balling and Sobieszczanski-Sobieski, 1994), and integrate domain-independent linguistic terms to complete our taxonomy. Applications of DSPs to the design of complex, multidisciplinary systems include passenger aircraft, ships, damage-tolerant structural and mechanical systems, and thermal energy systems. In this paper we show that the Balling-Sobieski framework is consistent with that of the Decision Support Problem Technique through the use of linguistic entities to describe the same types of formulations. We show that the underlying linguistics of the solution approaches are the same and can be coalesced into a homogeneous framework on which to base MDO research, application, and technology. We introduce, within the Balling-Sobieski framework, examples of multidisciplinary design, namely, aircraft, damage-tolerant structural and mechanical systems, and thermal energy systems.
NASA Astrophysics Data System (ADS)
Uhde, Britta; Andreas Hahn, W.; Griess, Verena C.; Knoke, Thomas
2015-08-01
Multi-criteria decision analysis (MCDA) is a decision aid frequently used in forest management planning. It includes the evaluation of multiple criteria such as the production of timber and non-timber forest products and the tangible as well as intangible values of ecosystem services (ES). It is therefore beneficial compared with methods that take a purely financial perspective. Accordingly, MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches combine MCDA with other decision-making techniques to exploit their individual benefits, leading to a more holistic view of the actual consequences of certain decisions. This review provides a comprehensive overview of hybrid approaches used in forest management planning. Today, the scientific world faces increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improving the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming highly relevant in future decision making. Based on the review presented here, the development of models for use in planning processes should focus on participatory modeling and the consideration of uncertainty in the available information.
Clinical decisions for anterior restorations: the concept of restorative volume.
Cardoso, Jorge André; Almeida, Paulo Júlio; Fischer, Alex; Phaxay, Somano Luang
2012-12-01
The choice of the most appropriate restoration for anterior teeth is often a difficult decision. Numerous clinical and technical factors play an important role in selecting the treatment option that best suits the patient and the restorative team. Experienced clinicians have developed decision processes that are often more complex than they may seem. Less experienced professionals may find it difficult to make treatment decisions because of the wide variety of restorative materials available and the often numerous similar products offered by different manufacturers. The authors reviewed available evidence and integrated their clinical experience to select relevant factors that could provide a logical and practical guideline for restorative decisions in anterior teeth. The presented concept of restorative volume is based on structural, optical, and periodontal factors. Each of these factors influences the short- and long-term behavior of restorations in terms of esthetics, biology, and function. Despite the marked evolution of esthetic restorative techniques and materials, significant limitations still exist, which should be addressed by researchers. The presented guidelines must be regarded as a mere orientation for risk analysis. A comprehensive individual approach should always be the core of restorative esthetic treatments. The complex decision process for anterior esthetic restorations can be clarified by a systematized examination of structural, optical, and periodontal factors. The basis for the proposed thought process is the concept of restorative volume, a contemporary interpretation of restoration categories and their application. © 2012 Wiley Periodicals, Inc.
A Decision Theory Approach to College Resource Allocation.
ERIC Educational Resources Information Center
Baldwin, Charles W.
Current budgeting techniques are reviewed in relation to their application to higher education, including (1) incremental budgeting, where decisions are based primarily upon former levels of expenditures, (2) zero-based budgeting, involving the establishment and ranking of "decision packages", (3) Planning and Programming Budgeting…
Urban Rain Gauge Siting Selection Based on GIS-Multicriteria Analysis
NASA Astrophysics Data System (ADS)
Fu, Yanli; Jing, Changfeng; Du, Mingyi
2016-06-01
With the increasingly rapid growth of urbanization and climate change, urban rainfall monitoring and urban waterlogging have received wide attention. Because conventional siting selection methods do not take geographic surroundings and the spatial-temporal scale into consideration for urban rain gauge site selection, this paper primarily aims at finding appropriate siting selection rules and methods for rain gauges in urban areas. Additionally, a spatial decision support system (DSS) aided by a geographical information system (GIS) has been developed to optimize gauge locations. Given a series of criteria, the rain gauge optimal site-search problem can be addressed by multicriteria decision analysis (MCDA). A series of spatial analytical techniques are required for MCDA to identify the prospective sites. On the GIS platform, spatial kernel density analysis is used to reflect population density, and GIS buffer analysis is used to optimize locations with respect to the rain gauge signal transmission characteristics. Experiment results show that the rules and the proposed method are suitable for rain gauge site selection in urban areas, which is significant for the siting of urban hydrological facilities and infrastructure, such as water gauges.
The role of pharmacoeconomics in current Indian healthcare system.
Ahmad, Akram; Patel, Isha; Parimilakrishnan, Sundararajan; Mohanta, Guru Prasad; Chung, HaeChung; Chang, Jongwha
2013-01-01
Pharmacoeconomics can aid policy makers and healthcare providers in decision making by evaluating the affordability of and access to rational drug use. Efficiency is a key concept of pharmacoeconomics, and various strategies are suggested for buying the greatest amount of benefit for a given resource use. Pharmacoeconomic evaluation techniques such as cost-minimization analysis, cost-effectiveness analysis, cost-benefit analysis, and cost-utility analysis, which support identification and quantification of the costs of drugs, are conducted in a similar way but vary in the measurement of the value of health benefits and outcomes. This article provides a brief overview of pharmacoeconomics, its utility with respect to the Indian pharmaceutical industry, and the expanding insurance system in India. Pharmacoeconomic evidence can be utilized to support decisions on licensing, pricing, reimbursement, and maintenance of the formulary procedure for pharmaceuticals. For insurance companies to provide better coverage at minimum cost, India must develop a platform for pharmacoeconomics with a validated methodology and appropriate training. The role of clinical pharmacists, including PharmD graduates, is expected to be more beneficial than that of conventional pharmacists, as they will be able to apply the principles of economics in daily practice in community and hospital pharmacy.
Using cluster analysis for medical resource decision making.
Dilts, D; Khamalah, J; Plotkin, A
1995-01-01
Escalating costs of health care delivery have in recent years led the health care industry to investigate, adapt, and apply management techniques for budgeting, resource control, and forecasting that have long been used in the manufacturing sector. A strategy that has contributed much in this direction is the definition and classification of a hospital's output into "products", or groups of patients that impose similar resource or cost demands on the hospital. Existing classification schemes have frequently employed cluster analysis in generating these groupings. Unfortunately, the myriad articles and books on clustering and classification contain few formalized methodologies for selecting a technique to solve a particular problem, and hence often leave the novice investigator at a loss. This paper reviews the literature on clustering, particularly as it has been applied in the medical resource-utilization domain, addresses the critical choices facing an investigator in the medical field using cluster analysis, and offers suggestions (using the example of clustering low-vision patients) for how such choices can be made.
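As a toy illustration of the clustering step discussed above, the following sketch groups synthetic patient records into resource-use clusters with plain k-means (Lloyd's algorithm). The two features and the simulated patient groups are hypothetical, not drawn from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic patient records: [length of stay (days), cost (k$)].
# Two resource-use groups are simulated for illustration.
low  = rng.normal([3.0, 5.0],   [1.0, 1.5], size=(30, 2))
high = rng.normal([12.0, 25.0], [2.0, 4.0], size=(30, 2))
X = np.vstack([low, high])

# Plain Lloyd's k-means with k = 2 clusters.
k = 2
centers = X[rng.choice(len(X), size=k, replace=False)]
for _ in range(20):
    # Assign each patient to the nearest center.
    labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    # Move each center to the mean of its members (keep it if empty).
    centers = np.array([X[labels == j].mean(0) if (labels == j).any() else centers[j]
                        for j in range(k)])

print(centers.round(1))  # two centers near the simulated group means
```

Real patient-classification schemes add the choices the paper discusses: which similarity measure, which clustering algorithm, and how many clusters to retain.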
From guideline modeling to guideline execution: defining guideline-based decision-support services.
Tu, S. W.; Musen, M. A.
2000-01-01
We describe our task-based approach to defining the guideline-based decision-support services that the EON system provides. We categorize uses of guidelines in patient-specific decision support into a set of generic tasks--making decisions, specifying work to be performed, interpreting data, setting goals, and issuing alerts and reminders--that can be solved using various techniques. Our model includes constructs required for representing the knowledge used by these techniques. These constructs form a toolkit from which developers can select modeling solutions for guideline tasks. Based on the tasks and the guideline model, we define a guideline-execution architecture and a model of interactions between a decision-support server and clients that invoke services provided by the server. These services use generic interfaces derived from guideline tasks and their associated modeling constructs. We describe two implementations of these decision-support services and discuss how this work can be generalized. We argue that a well-defined specification of guideline-based decision-support services will facilitate sharing of tools that implement computable clinical guidelines. PMID:11080007
Anterior surgical management of single-level cervical disc disease: a cost-effectiveness analysis.
Lewis, Daniel J; Attiah, Mark A; Malhotra, Neil R; Burnett, Mark G; Stein, Sherman C
2014-12-01
Cost-effectiveness analysis with decision analysis and meta-analysis. To determine the relative cost-effectiveness of anterior cervical discectomy with fusion (with autograft, allograft, or spacers), anterior cervical discectomy without fusion (ACD), and cervical disc replacement (CDR) for the treatment of 1-level cervical disc disease. There is debate as to the optimal anterior surgical strategy to treat single-level cervical disc disease. Surgical strategies include 3 techniques of anterior cervical discectomy with fusion (autograft, allograft, or spacer-assisted fusion), ACD, and CDR. Several controlled trials have compared these treatments but have yielded mixed results. Decision analysis provides a structure for making a quantitative comparison of the costs and outcomes of each treatment. A literature search was performed and yielded 156 case series that fulfilled our search criteria describing nearly 17,000 cases. Data were abstracted from these publications and pooled meta-analytically to estimate the incidence of various outcomes, including index-level and adjacent-level reoperation. A decision analytic model calculated the expected costs in US dollars and outcomes in quality-adjusted life years for a typical adult patient with 1-level cervical radiculopathy subjected to each of the 5 approaches. At 5 years postoperatively, patients who had undergone ACD alone had significantly (P < 0.001) more quality-adjusted life years (4.885 ± 0.041) than those receiving other treatments. Patients with ACD also exhibited highly significant (P < 0.001) differences in costs, incurring the lowest societal costs ($16,558 ± $539). Follow-up data were inadequate for comparison beyond 5 years. The results of our decision analytic model indicate advantages for ACD, both in effectiveness and costs, over other strategies. Thus, ACD is a cost-effective alternative to anterior cervical discectomy with fusion and CDR in patients with single-level cervical disc disease. 
Definitive conclusions about degenerative changes after ACD and adjacent-level disease after CDR await longer follow-up. Level of Evidence: 4.
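The decision-analytic roll-back at the heart of such a model reduces to expected-value calculations over chance nodes. A minimal sketch with hypothetical probabilities, costs, and QALYs (not the pooled values from the study):

```python
# Minimal expected-value roll-back for a single two-branch chance node
# (reoperation vs. no reoperation) per strategy. All numbers are invented
# for illustration; the study pooled real data meta-analytically.
strategies = {
    # name: (p_reoperation, cost_no_reop, cost_reop, qaly_no_reop, qaly_reop)
    "ACDF": (0.08, 20_000, 45_000, 4.70, 4.10),
    "ACD":  (0.10, 15_000, 40_000, 4.90, 4.20),
    "CDR":  (0.06, 28_000, 55_000, 4.75, 4.15),
}

def expected(strategy):
    """Roll back the chance node to an expected (cost, QALY) pair."""
    p, c0, c1, q0, q1 = strategy
    cost = (1 - p) * c0 + p * c1
    qaly = (1 - p) * q0 + p * q1
    return cost, qaly

for name, s in strategies.items():
    cost, qaly = expected(s)
    print(f"{name}: expected cost ${cost:,.0f}, expected QALYs {qaly:.3f}")
```

With these illustrative inputs, ACD dominates (lowest expected cost and highest expected QALYs), mirroring the qualitative pattern the abstract reports; in practice the comparison would also include incremental cost-effectiveness ratios and sensitivity analysis.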
Hirschman, Karen B; Kapo, Jennifer M; Karlawish, Jason H T
2006-08-01
The objective of this study was to identify what standard of decision making a family member uses when making medical decisions for their relative with advanced dementia. Thirty family members of patients with advanced dementia from an Alzheimer disease center and a suburban long-term care facility were interviewed using a semistructured interview. All interviews were audiotaped, transcribed, and analyzed using qualitative data analysis techniques. Family members were split almost evenly in the standard they used when making medical decisions for their relative: substituted judgment (43%) or best interests (57%). However, few who used the substituted judgment standard viewed it as distinct from best interests. Instead, both standards were taken into consideration when making medical decisions. In addition to not having discussions about healthcare preferences, the reasons for not using a substituted judgment included: the need for family consensus, unrealistic expectations of the patient, the need to incorporate their relative's quality of life into the decision, and the influence of healthcare professionals. Family members who did not have discussions about healthcare preferences identified various barriers to the discussion, including waiting too long, avoiding the topic, and the patient's denial of dementia. These data suggest several reasons why surrogate decision-makers for persons with advanced dementia do not use the substituted judgment standard and the potential value of interventions that would allow patients with early-stage dementia and their family members to discuss healthcare preferences.
Balancing emotion and cognition: a case for decision aiding in conservation efforts.
Wilson, Robyn S
2008-12-01
Despite advances in the quality of participatory decision making for conservation, many current efforts still suffer from an inability to bridge the gap between science and policy. Judgment and decision-making research suggests this gap may result from a person's reliance on affect-based shortcuts in complex decision contexts. I examined the results from 3 experiments that demonstrate how affect (i.e., the instantaneous reaction one has to a stimulus) influences individual judgments in these contexts and identified techniques from the decision-aiding literature that help encourage a balance between affect-based emotion and cognition in complex decision processes. In the first study, subjects displayed a lack of focus on their stated conservation objectives and made decisions that reflected their initial affective impressions. Value-focused approaches may help individuals incorporate all the decision-relevant objectives by making the technical and value-based objectives more salient. In the second study, subjects displayed a lack of focus on statistical risk and again made affect-based decisions. Trade-off techniques may help individuals incorporate relevant technical data, even when it conflicts with their initial affective impressions or other value-based objectives. In the third study, subjects displayed a lack of trust in decision-making authorities when the decision involved a negatively affect-rich outcome (i.e., a loss). Identifying shared salient values and increasing procedural fairness may help build social trust in both decision-making authorities and the decision process.
Kernel and divergence techniques in high energy physics separations
NASA Astrophysics Data System (ADS)
Bouř, Petr; Kůs, Václav; Franc, Jiří
2017-10-01
Binary decision trees under the Bayesian decision technique are used for supervised classification of high-dimensional data. We present the great potential of adaptive kernel density estimation as the nested separation method of the supervised binary divergence decision tree. We also provide a proof of an alternative computing approach for kernel estimates utilizing the Fourier transform. Further, we apply our method to a Monte Carlo data set from the DØ experiment at the Tevatron particle accelerator at Fermilab and provide final top-antitop signal separation results. We achieved up to 82% AUC while using the restricted feature selection entering the signal separation procedure.
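The kernel-density separation step can be illustrated with a one-dimensional Parzen-window classifier under the Bayesian decision rule. The "signal" and "background" samples below are synthetic stand-ins, not the DØ data, and the fixed bandwidth is a simplification of the adaptive estimate the paper uses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D stand-ins for signal and background feature distributions.
signal = rng.normal(2.0, 1.0, 500)
background = rng.normal(-2.0, 1.0, 500)

def kde(x, sample, h=0.5):
    """Gaussian kernel density estimate at points x (Parzen window)."""
    u = (x[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

# Bayes decision with equal priors: classify as signal where
# p(x | signal) > p(x | background).
x = np.array([-3.0, 0.0, 3.0])
is_signal = kde(x, signal) > kde(x, background)
print(is_signal)
```

Points deep in the background region are rejected and points deep in the signal region are accepted; the decision boundary falls near the density crossover, as the Bayesian rule requires.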
A stochastic conflict resolution model for trading pollutant discharge permits in river systems.
Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram
2009-07-01
This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems, considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm known as the Non-dominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is identified using the Young conflict resolution theory, which considers the utility functions of the decision-makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the discharge permit trading policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in northern Iran.
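The trade-off curve that NSGA-II approximates is the set of non-dominated (Pareto-optimal) solutions. A minimal brute-force sketch of non-dominated filtering for hypothetical (cost, risk) pairs, with both objectives minimized as in the treatment-cost/violation-risk trade-off:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical (treatment cost, violation risk) pairs for 50 candidate
# discharge-permit allocations; both objectives are to be minimized.
points = rng.uniform(0, 1, size=(50, 2))

def pareto_front(pts):
    """Indices of non-dominated points (minimization in every objective)."""
    front = []
    for i, p in enumerate(pts):
        # p is dominated if some point is <= in all objectives and < in one.
        dominated = ((pts <= p).all(axis=1) & (pts < p).any(axis=1)).any()
        if not dominated:
            front.append(i)
    return front

front = pareto_front(points)
print(len(front), "non-dominated solutions out of", len(points))
```

NSGA-II reaches the same set without exhaustive comparison by combining this dominance test with crowding-distance selection inside a genetic algorithm, which scales to the large decision spaces of river-system models.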
Introduction to SIMRAND: Simulation of research and development projects
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1982-01-01
SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration whose total cost exceeds the allocated budget. Other factors, such as personnel and facilities, may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation to select this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing information for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
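A SIMRAND-style selection can be sketched as enumerating the budget-feasible subsets of tasks and ranking them by simulated expected payoff. The task costs, payoff distributions, and budget below are all hypothetical, and mean payoff stands in for the decision-makers' utility:

```python
import itertools

import numpy as np

rng = np.random.default_rng(3)

# Hypothetical tasks: (cost in $M, mean payoff, payoff std). The budget
# caps total cost; payoffs are uncertain, so subsets are ranked by the
# mean of a Monte Carlo simulation of total payoff.
tasks = {"A": (4, 10, 3), "B": (3, 6, 1), "C": (5, 9, 4), "D": (2, 4, 1)}
budget = 9

best, best_value = None, -np.inf
for r in range(1, len(tasks) + 1):
    for subset in itertools.combinations(tasks, r):
        if sum(tasks[t][0] for t in subset) > budget:
            continue  # violates the budget constraint
        # Monte Carlo: 10,000 draws of the subset's total payoff.
        total = sum(rng.normal(tasks[t][1], tasks[t][2], 10_000) for t in subset)
        value = total.mean()
        if value > best_value:
            best, best_value = subset, value

print(best, round(best_value, 1))
```

Enumeration works for a handful of tasks; SIMRAND's contribution was combining this kind of simulation with decision-analytic preference ranking at project scale.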
Using machine learning techniques to automate sky survey catalog generation
NASA Technical Reports Server (NTRS)
Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.
1993-01-01
We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
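GID3* and O-BTree are not generally available, but the information-gain criterion behind such inductive tree learners can be sketched with a one-level decision tree (a stump) on synthetic star/galaxy features. The features and class distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic sky objects: stars are compact (small "extent"), galaxies
# extended. Features: [magnitude, extent]; label 0 = star, 1 = galaxy.
stars    = np.c_[rng.normal(18, 2, 200), rng.normal(1.0, 0.2, 200)]
galaxies = np.c_[rng.normal(17, 2, 200), rng.normal(2.5, 0.5, 200)]
X = np.vstack([stars, galaxies])
y = np.r_[np.zeros(200), np.ones(200)]

def entropy(labels):
    p = np.bincount(labels.astype(int), minlength=2) / len(labels)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# One-level tree: pick the (feature, threshold) split with the highest
# information gain -- the criterion family ID3-style learners use.
best = None
for f in range(X.shape[1]):
    for t in np.unique(X[:, f]):
        left, right = y[X[:, f] <= t], y[X[:, f] > t]
        if len(left) == 0 or len(right) == 0:
            continue
        gain = entropy(y) - (len(left) * entropy(left)
                             + len(right) * entropy(right)) / len(y)
        if best is None or gain > best[0]:
            best = (gain, f, t)

gain, feature, threshold = best
pred = (X[:, feature] > threshold).astype(float)
accuracy = max((pred == y).mean(), ((1 - pred) == y).mean())
print(feature, round(threshold, 2), round(accuracy, 3))
```

The learner picks the discriminative "extent" feature and nearly separates the classes; full tree induction recurses this split selection on each branch.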
Data Mining Techniques for Customer Relationship Management
NASA Astrophysics Data System (ADS)
Guo, Feng; Qin, Huilin
2017-10-01
Data mining has made customer relationship management (CRM) a new area where firms can gain a competitive advantage, and it plays a key role in firms' management decisions. In this paper, we first analyze the value and application fields of data mining techniques for CRM, and further explore how data mining is applied to customer churn analysis. A new business culture is developing today. The conventional production-centered, sales-oriented market strategy is gradually shifting to one that is customer-centered and service-oriented. Customers' value orientation increasingly affects that of firms, and customer resources have become one of the most important strategic resources. Therefore, understanding customers' needs and identifying the most valuable customers has become the driving force of most modern businesses.
Cost-Effectiveness Research in Neurosurgery: We Can and We Must.
Stein, Sherman C
2018-01-05
Rapid advancement of medical and surgical therapies, coupled with the recent preoccupation with limiting healthcare costs, makes a collision of the 2 objectives imminent. This article explains the value of cost-effectiveness analysis (CEA) in reconciling the 2 competing goals, and provides a brief introduction to evidence-based CEA techniques. The historical role of CEA in determining whether new neurosurgical strategies provide value for cost is summarized briefly, as are the limitations of the technique. Finally, the unique ability of the neurosurgical community to provide input to the CEA process is emphasized, as are the potential risks of leaving these important decisions in the hands of others. Copyright © 2018 by the Congress of Neurological Surgeons.
Zagonari, Fabio
2016-04-01
In this paper, I propose a general, consistent, and operational approach that accounts for ecosystem services in a decision-making context: I link ecosystem services to sustainable development criteria; adopt multi-criteria analysis to measure ecosystem services, with weights provided by stakeholders used to account for equity issues; apply both temporal and spatial discount rates; and adopt a technique for ordering the performance of possible solutions by their similarity to an ideal solution (TOPSIS) to account for uncertainty about the parameters and functions. Applying this approach in a case study of an offshore research platform in Italy (CNR Acqua Alta) revealed that decisions depend non-linearly on the degree of loss aversion, to a smaller extent on a global focus (as opposed to a local focus), and to the smallest extent on social concerns (as opposed to economic or environmental concerns). Application of the general model to the case study leads to the conclusion that the ecosystem services framework is likely to be less useful in supporting decisions than in identifying the crucial features on which decisions depend, unless experts from different disciplines are involved, stakeholders are represented, and experts and stakeholders achieve mutual understanding. Copyright © 2016 Elsevier B.V. All rights reserved.
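The TOPSIS step used above ranks alternatives by closeness to an ideal solution. A minimal sketch with an invented decision matrix and weights, treating all criteria as benefits (higher is better):

```python
import numpy as np

# Hypothetical decision matrix: 4 management options scored on 3 criteria
# (all benefit criteria in this illustration), with stakeholder weights.
X = np.array([
    [7.0, 9.0, 9.0],
    [8.0, 7.0, 8.0],
    [9.0, 6.0, 8.0],
    [6.0, 7.0, 8.0],
])
w = np.array([0.5, 0.3, 0.2])   # weights sum to 1

# 1. Vector-normalize each column, then apply the weights.
V = w * X / np.linalg.norm(X, axis=0)
# 2. Ideal and anti-ideal solutions (benefit criteria: column max is ideal).
ideal, anti = V.max(axis=0), V.min(axis=0)
# 3. Closeness coefficient: distance to anti-ideal over total distance.
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print(closeness.round(3), "best option:", int(np.argmax(closeness)))
```

Cost criteria would flip the ideal/anti-ideal direction for their columns; otherwise the ranking is exactly this closeness coefficient, largest first.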
Multi-criteria decision-making for flood risk management: a survey of the current state of the art
NASA Astrophysics Data System (ADS)
Madruga de Brito, Mariana; Evers, Mariele
2016-04-01
This paper provides a review of multi-criteria decision-making (MCDM) applications to flood risk management, seeking to highlight trends and identify research gaps. A total of 128 peer-reviewed papers published from 1995 to June 2015 were systematically analysed. Results showed that the number of flood MCDM publications has grown exponentially during this period, with over 82% of all papers published since 2009. A wide range of applications were identified, with most papers focusing on ranking alternatives for flood mitigation, followed by risk, hazard, and vulnerability assessment. The analytic hierarchy process (AHP) was the most popular method, followed by the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW). Although there is growing interest in MCDM, uncertainty analysis remains an issue and was seldom applied in flood-related studies. In addition, the participation of multiple stakeholders has generally been fragmented, focusing on particular stages of the decision-making process, especially the definition of criteria weights. Therefore, addressing the uncertainties around stakeholders' judgments and encouraging active participation in all steps of the decision-making process should be explored in future applications. This could help increase the quality of decisions and the implementation of the chosen measures.
Constraint reasoning in deep biomedical models.
Cruz, Jorge; Barahona, Pedro
2005-05-01
Deep biomedical models are often expressed by means of differential equations. Despite their expressive power, such models are difficult to reason about and to base decisions on, given their non-linearity and the important effects that uncertainty in the data may cause. The objective of this work is to propose a constraint reasoning framework to support safe decisions based on deep biomedical models. The methods used in our approach include generic constraint propagation techniques for reducing the bounds of uncertainty of the numerical variables, complemented with new constraint reasoning techniques that we developed to handle differential equations. The results of our approach are illustrated in biomedical models for the diagnosis of diabetes, the tuning of drug design, and epidemiology, where it was a valuable decision-support tool notwithstanding the uncertainty in the data. The main conclusion that follows from the results is that, in biomedical decision support, constraint reasoning may be a worthwhile alternative to traditional simulation methods, especially when safe decisions are required.
NASA Technical Reports Server (NTRS)
Ni, Jianjun David
2011-01-01
This presentation briefly discusses a research effort on mitigation techniques for pulsed radio frequency interference (RFI) on a Low-Density Parity-Check (LDPC) code. This problem is of considerable interest in the context of providing reliable communications to a space vehicle that might suffer severe degradation due to pulsed RFI sources such as large radars. The LDPC code is one of the modern forward-error-correction (FEC) codes whose decoding performance approaches the Shannon limit. The LDPC code studied here is the AR4JA (2048, 1024) code recommended by the Consultative Committee for Space Data Systems (CCSDS), and it has been chosen for some spacecraft designs. Even though this code is designed as a powerful FEC code in the additive white Gaussian noise channel, simulation data and test results show that the performance of this LDPC decoder is severely degraded when exposed to the pulsed RFI specified in the spacecraft's transponder specifications. An analysis (through modeling and simulation) has been conducted to evaluate the impact of the pulsed RFI, and a few implementation techniques have been investigated to mitigate the pulsed RFI impact by reshuffling the soft-decision data available at the input of the LDPC decoder. The simulation results show that the LDPC decoding performance in codeword error rate (CWER) under pulsed RFI can be improved by up to four orders of magnitude through a simple soft-decision data reshuffle scheme. This study reveals that an error floor in LDPC decoding performance appears around CWER = 1E-4 when the proposed technique is applied to mitigate the pulsed RFI impact; the mechanism causing this error floor remains unknown, and further investigation is necessary.
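The reshuffle scheme itself is not detailed in the abstract, but the general idea of demoting soft-decision data affected by a pulse can be sketched. The attenuation approach, function name, and values below are hypothetical illustrations of the concept, not the scheme actually evaluated in the study:

```python
# Hypothetical sketch of soft-decision-data conditioning: log-likelihood
# ratios (LLRs) received while an RFI pulse was active are unreliable, so
# they are attenuated toward zero (effectively "erased") before the LDPC
# decoder sees them.
def reshuffle_llrs(llrs, pulse_mask, attenuation=0.0):
    """Scale down LLRs flagged by pulse_mask (True = hit by RFI)."""
    return [llr * attenuation if hit else llr
            for llr, hit in zip(llrs, pulse_mask)]

llrs = [4.2, -3.1, 0.8, -5.0]          # decoder-input LLRs (made up)
mask = [False, True, False, True]      # symbols that overlapped a pulse
cleaned = reshuffle_llrs(llrs, mask)   # corrupted LLRs driven to zero
```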
NASA Astrophysics Data System (ADS)
Guttikunda, S. K.; Johnson, T. M.; Procee, P.
2004-12-01
Fossil fuel combustion for domestic cooking and heating, power generation, industrial processes, and motor vehicles is the primary source of air pollution in developing-country cities. Over the past twenty years, major advances have been made in understanding the social and economic consequences of air pollution. In both industrialized and developing countries, it has been shown that air pollution from energy combustion has detrimental impacts on human health and the environment. Lack of information on the sectoral contributions to air pollution, especially fine particulates, is one of the typical constraints on an effective integrated urban air quality management program. Without such information, it is difficult, if not impossible, for decision makers to provide policy advice and make informed investment decisions related to air quality improvements in developing countries. This also raises the need for low-cost ways of determining the principal sources of fine PM for proper planning and decision making. The project objective is to develop and verify a methodology to assess and monitor the sources of PM, using a combination of ground-based monitoring and source apportionment techniques. This presentation will focus on four general tasks: (1) review the science and current activities in the combined use of monitoring data and modeling for better understanding of PM pollution; (2) review recent advances in atmospheric source apportionment techniques (e.g., principal component analysis, organic markers, source-receptor modeling techniques); (3) develop a general methodology to use integrated top-down and bottom-up datasets; and (4) review a series of current case studies from Africa, Asia and Latin America and the methodologies applied to assess air pollution and its sources.
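Of the source apportionment techniques listed in task (2), principal component analysis is the simplest to demonstrate. The sketch below runs PCA on synthetic two-source receptor data (the source labels and mixing factors are invented for illustration) and shows how a handful of components captures the variance generated by a few underlying sources:

```python
import numpy as np

# PCA on a (samples x species) receptor data matrix: the dominant
# components reveal co-varying groups of species, hinting at common
# emission sources. Synthetic two-source data, for illustration only.
rng = np.random.default_rng(0)
traffic = rng.uniform(0, 1, 200)          # hidden source strengths
dust = rng.uniform(0, 1, 200)

# Four measured "species", each a mix of the two sources plus noise.
X = np.column_stack([
    2.0 * traffic, 1.5 * traffic, 3.0 * dust, 2.5 * dust,
]) + rng.normal(0, 0.05, (200, 4))

Xc = X - X.mean(axis=0)                   # center each species
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigh returns ascending order
explained = eigvals[::-1] / eigvals.sum() # fraction of variance, descending
# Two components should capture nearly all variance: two real sources.
```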
Synchronous in-field application of life-detection techniques in planetary analog missions
NASA Astrophysics Data System (ADS)
Amador, Elena S.; Cable, Morgan L.; Chaudry, Nosheen; Cullen, Thomas; Gentry, Diana; Jacobsen, Malene B.; Murukesan, Gayathri; Schwieterman, Edward W.; Stevens, Adam H.; Stockton, Amanda; Yin, Chang; Cullen, David C.; Geppert, Wolf
2015-02-01
Field expeditions that simulate the operations of robotic planetary exploration missions at analog sites on Earth can help establish best practices and are therefore a positive contribution to the planetary exploration community. There are many sites in Iceland that possess heritage as planetary exploration analog locations and whose environmental extremes make them suitable for simulating scientific sampling and robotic operations. We conducted a planetary exploration analog mission at two recent lava fields in Iceland, Fimmvörðuháls (2010) and Eldfell (1973), using a specially developed field laboratory. We tested the utility of in-field site sampling down-selection and tiered analysis operational capabilities with three life detection and characterization techniques: fluorescence microscopy (FM), adenosine triphosphate (ATP) bioluminescence assay, and quantitative polymerase chain reaction (qPCR) assay. The study made use of multiple cycles of sample collection at multiple distance scales and field laboratory analysis using the synchronous life-detection techniques to heuristically develop the continuing sampling and analysis strategy during the expedition. Here we report the operational lessons learned and provide brief summaries of scientific data; the full scientific data report will follow separately. We found that rapid in-field analysis to determine subsequent sampling decisions is operationally feasible, and that the chosen life detection and characterization techniques are suitable for a terrestrial life-detection field mission. In-field analysis enables the rapid acquisition of scientific data and thus facilitates the collection of the most scientifically relevant samples within a single field expedition, without the need for sample relocation to external laboratories.
The operational lessons learned in this study could be applied to future terrestrial field expeditions employing other analytical techniques and to future robotic planetary exploration missions.
Mehrotra, Sanjay; Kim, Kibaek
2011-12-01
We consider the problem of outcomes-based budget allocation to chronic disease prevention programs across the United States (US) to achieve greater geographical healthcare equity. We use the Diabetes Prevention and Control Programs (DPCP) by the Centers for Disease Control and Prevention (CDC) as an example. We present a multi-criteria robust weighted sum model for such multi-criteria decision making in a group decision setting. Principal component analysis and an inverse linear programming technique are presented and used to study the actual 2009 budget allocation by the CDC. Our results show that the CDC budget allocation process for the DPCPs is not likely model-based. In our empirical study, the relative weights for different prevalence and comorbidity factors and the corresponding budgets obtained under different weight regions are discussed. Parametric analysis suggests that money should be allocated to states to promote diabetes education and to increase patient-healthcare provider interactions to reduce disparity across the US.
Hummel, J M Marjan; Snoek, Govert J; van Til, Janine A; van Rossum, Wouter; Ijzerman, Maarten J
2005-01-01
This study supported the evaluation by a rehabilitation team of the performance of two treatment options that improve the arm-hand function in subjects with sixth cervical vertebra (C6) level Motor Group 2 tetraplegia. The analytic hierarchy process, a technique for multicriteria decision analysis, was used by a rehabilitation team and potential recipients to quantitatively compare a new technology, Functional Electrical Stimulation (FES), with conventional surgery. Performance was measured by functional improvement, treatment load, risks, user-friendliness, and social outcomes. Functional improvement after FES was considered better than that after conventional surgery. However, the rehabilitation team's overall rating for conventional surgery was slightly higher than that for FES (57% vs 44%). Compared with the rehabilitation team, potential recipients gave greater weight to burden of treatment and less weight to functional improvement. This study shows that evaluation of new technology must be more comprehensive than the evaluation of functional improvement alone, and that patient preferences may differ from those of the rehabilitation team.
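The analytic hierarchy process derives criterion weights from pairwise-comparison judgments. A minimal sketch, using the common geometric-mean approximation of the principal eigenvector; the comparison matrix below is invented, not the study's actual judgments:

```python
from math import prod

# AHP priority weights from a pairwise-comparison matrix, via the
# geometric-mean (row) approximation of the principal eigenvector.
def ahp_weights(M):
    n = len(M)
    gmeans = [prod(row) ** (1.0 / n) for row in M]  # row geometric means
    total = sum(gmeans)
    return [g / total for g in gmeans]              # normalize to sum 1

# Hypothetical criteria: functional improvement, treatment load, risk.
# M[i][j] = how much more important criterion i is than j (Saaty scale).
M = [[1.0,   3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_weights(M)
```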
ERIC Educational Resources Information Center
Wholeben, Brent Edward
This report describes the use of operations research techniques to determine which courseware packages or microcomputer systems best address varied instructional objectives, focusing on the MICROPIK model, a highly structured evaluation technique for making such complex instructional decisions. MICROPIK is a multiple alternatives model (MAA)…
ERIC Educational Resources Information Center
Dewell, Reneé; Hanthorn, Christy; Danielson, Jared; Burzette, Rebecca; Coetzee, Johann; Griffin, D. Dee; Ramirez, Alejandro; Dewell, Grant
2015-01-01
The purpose of the project was to evaluate the use of an interactive workshop designed to teach novel practical welfare techniques to beef cattle caretakers and decision makers. Following training, respondents reported being more likely to use or recommend use of local anesthesia for dehorning and castration and were more inclined to use meloxicam…
A Markovian state-space framework for integrating flexibility into space system design decisions
NASA Astrophysics Data System (ADS)
Lafleur, Jarret M.
The past decades have seen the state of the art in aerospace system design progress from a scope of simple optimization to one including robustness, with the objective of permitting a single system to perform well even in off-nominal future environments. Integrating flexibility, or the capability to easily modify a system after it has been fielded in response to changing environments, into system design represents a further step forward. One challenge in accomplishing this rests in that the decision-maker must consider not only the present system design decision, but also sequential future design and operation decisions. Despite extensive interest in the topic, the state of the art in designing flexibility into aerospace systems, and particularly space systems, tends to be limited to analyses that are qualitative, deterministic, single-objective, and/or limited to consider a single future time period. To address these gaps, this thesis develops a stochastic, multi-objective, and multi-period framework for integrating flexibility into space system design decisions. Central to the framework are five steps. First, system configuration options are identified and costs of switching from one configuration to another are compiled into a cost transition matrix. Second, probabilities that demand on the system will transition from one mission to another are compiled into a mission demand Markov chain. Third, one performance matrix for each design objective is populated to describe how well the identified system configurations perform in each of the identified mission demand environments. The fourth step employs multi-period decision analysis techniques, including Markov decision processes from the field of operations research, to find efficient paths and policies a decision-maker may follow. The final step examines the implications of these paths and policies for the primary goal of informing initial system selection. 
Overall, this thesis unifies state-centric concepts of flexibility from economics and engineering literature with sequential decision-making techniques from operations research. The end objective of this thesis’ framework and its supporting tools is to enable selection of the next-generation space systems today, tailored to decision-maker budget and performance preferences, that will be best able to adapt and perform in a future of changing environments and requirements. Following extensive theoretical development, the framework and its steps are applied to space system planning problems of (1) DARPA-motivated multiple- or distributed-payload satellite selection and (2) NASA human space exploration architecture selection.
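The fourth step's use of Markov decision processes can be sketched with value iteration over a toy two-configuration system. All costs, the deterministic transition structure, and the discount factor below are hypothetical stand-ins for the framework's cost transition matrix and demand chain:

```python
# Value iteration over system configurations: the state is the current
# configuration, actions switch (or keep) it, and step costs combine
# switching cost with per-period operating cost. Numbers are made up.
def value_iteration(n_states, actions, step_cost, P, gamma=0.9, tol=1e-8):
    """Minimize discounted cost. step_cost[s][a]: cost of action a in
    state s; P[s][a][s2]: transition probability to state s2."""
    V = [0.0] * n_states
    while True:
        newV = [min(step_cost[s][a] +
                    gamma * sum(P[s][a][s2] * V[s2]
                                for s2 in range(n_states))
                    for a in actions)
                for s in range(n_states)]
        if max(abs(u - v) for u, v in zip(newV, V)) < tol:
            return newV
        V = newV

ops = [1.0, 2.0]                      # operating cost of each configuration
switch = [[0.0, 4.0], [4.0, 0.0]]     # cost of switching configurations
actions = [0, 1]                      # action = configuration to use next
step_cost = [[switch[s][a] + ops[a] for a in actions] for s in range(2)]
P = [[[1.0 if s2 == a else 0.0 for s2 in range(2)] for a in actions]
     for s in range(2)]               # deterministic switch, for simplicity
V = value_iteration(2, actions, step_cost, P)
# From config 0 it is cheapest to stay; from config 1, to pay and switch.
```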
A Framework for Assessment of Aviation Safety Technology Portfolios
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.
2014-01-01
The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.
2012-03-01
Figure 2 shows the advantages of Value-Focused Thinking, and Figure 3 shows the ten-step approach for executing VFT. Measurement scales may be natural (e.g., dollars), proxy (e.g., gross national product), or constructed (e.g., judges' points in figure skating, a letter grade in school). Step 4 of the approach is to create a value function.
Tools and Techniques for Basin-Scale Climate Change Assessment
NASA Astrophysics Data System (ADS)
Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.
2012-12-01
The Department of Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies to explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies could be most beneficial with application of recent scientific advances in climate projections, stochastic simulation, operational modeling and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components: Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov chain techniques. Resampling can also be conditioned on climate change projections, e.g. downscaled GCM projections, to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows modifying future demands by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or changes ramped over specified time periods. Resulting data is imported directly into the decision model.
Different model files can represent infrastructure alternatives, and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executed changes in operations or other options. The over-arching Study Manager provides a graphical tool to create combinations of future supply scenarios, demand scenarios, infrastructure and operating policy alternatives; each scenario is executed as an ensemble of RiverWare runs, driven by the hydrologic supply. The Study Manager sets up and manages multiple executions on multi-core hardware. The sizeable result sets are typically direct model outputs, or post-processed indicators of performance based on model outputs. Post-processing statistical analysis of the outputs is possible using the Graphical Policy Analysis Tool or other statistical packages. Several Basin Studies undertaken have used RiverWare to evaluate future scenarios. The Colorado River Basin Study, the most complex and extensive to date, has taken advantage of these tools and techniques to generate supply scenarios, produce alternative demand scenarios and to set up and execute the many combinations of supplies, demands, policies, and infrastructure alternatives. The tools and techniques will be described with example applications.
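The non-parametric K-nearest-neighbor resampling used by the Hydrology Simulator can be sketched in a toy form: find the k historical values nearest the current flow and sample the successor of one of them, weighted toward closer neighbors. The series, kernel, and function name below are illustrative only; the actual simulator is far richer.

```python
import random

# Toy KNN bootstrap for a flow series: each new value is the historical
# successor of one of the k nearest neighbors of the current value,
# picked with the standard 1/rank weighting.
def knn_next(history, current, k=3, rng=random):
    # Rank historical values (except the last, which has no successor)
    # by distance to the current value; keep the k nearest.
    idx = sorted(range(len(history) - 1),
                 key=lambda i: abs(history[i] - current))[:k]
    weights = [1.0 / (r + 1) for r in range(k)]   # 1, 1/2, 1/3, ...
    pick = rng.choices(idx, weights=weights)[0]
    return history[pick + 1]                      # neighbor's successor

random.seed(42)
history = [12.0, 15.0, 11.0, 18.0, 14.0, 13.0]   # made-up annual flows
trace = [14.0]                                   # starting condition
for _ in range(5):
    trace.append(knn_next(history, trace[-1]))   # extend the ensemble trace
```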
The Mechanisms of Involuntary Attention
ERIC Educational Resources Information Center
Prinzmetal, William; Ha, Ruby; Khani, Aniss
2010-01-01
We tested 3 mechanisms of involuntary attention: (1) a perceptual enhancement mechanism, (2) a response-decision mechanism, and (3) a serial-search mechanism. Experiment 1 used a response deadline technique to compare the perceptual enhancement and the decision mechanisms and found evidence consistent with the decision mechanism. Experiment 2 used…
NASA Astrophysics Data System (ADS)
Yang, Rui; Li, Xiangyang; Zhang, Tong
2014-10-01
This paper uses two physics-derived techniques, the minimum spanning tree and the hierarchical tree, to investigate the networks formed by CITIC (China International Trust and Investment Corporation) industry indices in three periods from 2006 to 2013. The study demonstrates that obvious industry clustering effects exist in the networks, and Durable Consumer Goods, Industrial Products, Information Technology, Frequently Consumption and Financial Industry are the core nodes in the networks. We also use the rolling window technique to investigate the dynamic evolution of the networks' stability, by calculating the mean correlations and mean distances, as well as the variance of correlations and the distances of these indices. China's stock market is still immature and subject to administrative interventions. Therefore, through this analysis, regulators can focus on monitoring the core nodes to ensure the overall stability of the entire market, while investors can enhance their portfolio allocations or investment decision-making.
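A minimal sketch of the minimum spanning tree construction: correlations between indices are mapped to the standard distance d = sqrt(2(1 - rho)), and Prim's algorithm keeps the n - 1 shortest links. The correlation matrix and index labels below are invented for illustration, not CITIC data:

```python
from math import sqrt

# Build the MST of an index network from a correlation matrix, using
# the standard correlation-to-distance map and Prim's algorithm.
def mst_edges(corr):
    n = len(corr)
    dist = [[sqrt(2.0 * (1.0 - corr[i][j])) for j in range(n)]
            for i in range(n)]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        # Cheapest edge from the tree to a node not yet in it.
        i, j = min(((i, j) for i in in_tree for j in range(n)
                    if j not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        in_tree.add(j)
        edges.append((i, j))
    return edges

# Hypothetical indices: 0 Financials, 1 Industrials, 2 IT, 3 Consumer.
corr = [[1.0, 0.8, 0.3, 0.4],
        [0.8, 1.0, 0.5, 0.6],
        [0.3, 0.5, 1.0, 0.7],
        [0.4, 0.6, 0.7, 1.0]]
edges = mst_edges(corr)   # n - 1 = 3 strongest links survive
```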
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pugmire, David; Kress, James; Choi, Jong
Data-driven science is becoming increasingly common and complex, and is placing tremendous stresses on visualization and analysis frameworks. Data sources producing 10 GB per second (and more) are becoming increasingly commonplace in simulation, sensor, and experimental sciences. These data sources, which are often distributed around the world, must be analyzed by teams of scientists that are also distributed. Enabling scientists to view, query and interact with such large volumes of data in near-real-time requires a rich fusion of visualization and analysis techniques, middleware and workflow systems. This paper discusses initial research into visualization and analysis of distributed data workflows that enables scientists to make near-real-time decisions over large volumes of time-varying data.
Research of Simple Multi-Attribute Rating Technique for Decision Support
NASA Astrophysics Data System (ADS)
Siregar, Dodi; Arisandi, Diki; Usman, Ari; Irwan, Dedy; Rahim, Robbi
2017-12-01
One role of a decision support system is to assist the decision maker in finding the alternative that best fits the desired criteria; one method the decision maker can apply is the SMART method for multi-criteria decision making. In this multi-criteria decision-making theory, every alternative is assessed on criteria that each carry a value and a weight, and the authors use this approach to facilitate decision making in a concrete case. The problems discussed in this paper are classified as multiobjective (multiple goals to be accomplished) and multicriteria (many criteria are decisive in reaching such decisions).
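The SMART procedure reduces to rating each alternative on a common utility scale and ranking by weighted average. A minimal sketch with hypothetical ratings and raw importance weights (normalized inside the function):

```python
# Simple Multi-Attribute Rating Technique (SMART): rate alternatives on
# each criterion on a common 0-100 utility scale, then rank by the
# weighted average of ratings.
def smart_rank(utilities, weights):
    """utilities[i][j]: 0-100 rating of alternative i on criterion j;
    weights: raw importance weights, normalized here."""
    wsum = sum(weights)
    totals = [sum(w * u for w, u in zip(weights, row)) / wsum
              for row in utilities]
    best = max(range(len(totals)), key=totals.__getitem__)
    return best, totals

best, totals = smart_rank(
    utilities=[[80, 60, 70],    # alternative A (hypothetical ratings)
               [60, 90, 50],    # alternative B
               [70, 70, 90]],   # alternative C
    weights=[4, 2, 3],          # raw importance of the three criteria
)
```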
Modification of Agility Running Technique in Reaction to a Defender in Rugby Union
Wheeler, Keane W.; Sayers, Mark G.L.
2010-01-01
Three-dimensional kinematic analysis examined agility running technique during pre-planned and reactive performance conditions specific to attacking ball carries in rugby union. The variation to running technique of 8 highly trained rugby union players was compared between agility conditions (pre-planned and reactive) and also agility performance speeds (fast, moderate and slow). Kinematic measures were used to determine the velocity of the centre of mass (COM) in the anteroposterior (running speed) and mediolateral (lateral movement speed) planes. The position of foot-strike and toe-off was also examined for the step prior to the agility side-step (pre-change of direction phase) and then the side-step (change of direction phase). This study demonstrated that less lateral movement speed towards the intended direction change occurred during reactive compared to pre-planned conditions at pre-change of direction (0.08 ± 0.28 m·s-1 and 0.42 ± 0.25 m·s-1, respectively) and change of direction foot-strikes (0.25 ± 0.42 m·s-1 and 0.69 ± 0.43 m·s-1, respectively). Less lateral movement speed during reactive conditions was associated with greater lateral foot displacement (44.52 ± 6.10% leg length) at the change of direction step compared to pre-planned conditions (41.35 ± 5.85%). Importantly, the anticipation abilities during reactive conditions provided a means to differentiate between speeds of agility performance, with faster performances displaying greater lateral movement speed at the change of direction foot-strike (0.52 ± 0.34 m·s-1) compared to moderate (0.20 ± 0.37 m·s-1) and slow (-0.08 ± 0.31 m·s-1). The changes to running technique during reactive conditions highlight the need to incorporate decision-making in rugby union agility programs. Key points Changes to running technique occur when required to make a decision. Fast agility performers use different stepping strategies in reactive conditions.
Decision-making must be incorporated in agility training programs. PMID:24149639
NASA Astrophysics Data System (ADS)
Müller-Hansen, Finn; Schlüter, Maja; Mäs, Michael; Donges, Jonathan F.; Kolb, Jakob J.; Thonicke, Kirsten; Heitzig, Jobst
2017-11-01
Today, humans have a critical impact on the Earth system and vice versa, which can generate complex feedback processes between social and ecological dynamics. Integrating human behavior into formal Earth system models (ESMs), however, requires crucial modeling assumptions about actors and their goals, behavioral options, and decision rules, as well as modeling decisions regarding human social interactions and the aggregation of individuals' behavior. Here, we review existing modeling approaches and techniques from various disciplines and schools of thought dealing with human behavior at different levels of decision making. We demonstrate modelers' often vast degrees of freedom but also seek to make modelers aware of the often crucial consequences of seemingly innocent modeling assumptions. After discussing which socioeconomic units are potentially important for ESMs, we compare models of individual decision making that correspond to alternative behavioral theories and that make diverse modeling assumptions about individuals' preferences, beliefs, decision rules, and foresight. We review approaches to model social interaction, covering game theoretic frameworks, models of social influence, and network models. Finally, we discuss approaches to studying how the behavior of individuals, groups, and organizations can aggregate to complex collective phenomena, discussing agent-based, statistical, and representative-agent modeling and economic macro-dynamics. We illustrate the main ingredients of modeling techniques with examples from land-use dynamics as one of the main drivers of environmental change bridging local to global scales.
Özkan, Aysun
2013-02-01
Healthcare waste should be managed carefully because of its infectious, pathological, and other hazardous content, especially in developing countries. Applied management systems must be the most appropriate solution from technical, environmental, economic and social points of view. The main objective of this study was to analyse the current status of healthcare waste management in Turkey, and to investigate the most appropriate treatment/disposal option by using different decision-making techniques. For this purpose, five different healthcare waste treatment/disposal alternatives including incineration, microwaving, on-site sterilization, off-site sterilization and landfill were evaluated according to two multi-criteria decision-making techniques: analytic network process (ANP) and ELECTRE. In this context, benefits, costs and risks for the alternatives were taken into consideration. Furthermore, the prioritization and ranking of the alternatives were determined and compared for both methods. According to the comparisons, the off-site sterilization technique was found to be the most appropriate solution in both cases.
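ELECTRE methods rest on pairwise concordance: the weighted share of criteria on which one alternative is at least as good as another. A sketch with invented benefit, cost, and risk figures (not the study's data) for three of the treatment alternatives:

```python
# ELECTRE-style concordance index: c(a, b) is the total weight of the
# criteria on which alternative a performs at least as well as b.
def concordance(perf, weights, better):
    """perf[i][j]: alternative i on criterion j; better[j](x, y) tells
    whether value x is at least as good as y on criterion j."""
    n, wsum = len(perf), sum(weights)
    C = [[0.0] * n for _ in range(n)]
    for a in range(n):
        for b in range(n):
            if a != b:
                C[a][b] = sum(w for j, w in enumerate(weights)
                              if better[j](perf[a][j], perf[b][j])) / wsum
    return C

ge = lambda x, y: x >= y          # benefit criterion: larger is better
le = lambda x, y: x <= y          # cost/risk criterion: smaller is better
# Rows: incineration, microwaving, off-site sterilization (hypothetical).
perf = [[9, 7, 6],                # columns: benefit, cost, risk
        [6, 4, 4],
        [7, 3, 2]]
C = concordance(perf, [0.5, 0.3, 0.2], [ge, le, le])
# Off-site sterilization dominates microwaving on every criterion here.
```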
NASA Astrophysics Data System (ADS)
Kaur, Jagreet; Singh Mann, Kulwinder, Dr.
2018-01-01
AI in healthcare needs to deliver real, actionable, and individualized insights in real time for patients and doctors to support treatment decisions. A patient-centred platform is needed for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for an AI-based healthcare analytics platform built on open-source technologies: Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and the NoSQL stores Elasticsearch and Cassandra. The paper shows the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps decision making and medical monitoring in real time through intelligent process analysis and big data processing.
BMP analysis system for watershed-based stormwater management.
Zhen, Jenny; Shoemaker, Leslie; Riverson, John; Alvi, Khalid; Cheng, Mow-Soung
2006-01-01
Best Management Practices (BMPs) are measures for mitigating nonpoint source (NPS) pollution caused mainly by stormwater runoff. Established urban and newly developing areas must develop cost-effective means for restoring or minimizing impacts, and for planning future growth. Prince George's County in Maryland, USA, a fast-growing region in the Washington, DC metropolitan area, has developed a number of tools to support analysis and decision making for stormwater management planning and design at the watershed level. These tools support watershed analysis, innovative BMPs, and optimization. Application of these tools can help achieve environmental goals and lead to significant cost savings. This project includes software development that utilizes GIS information and technology, integrates BMP process simulation models, and applies system optimization techniques for BMP planning and selection. The system employs ESRI ArcGIS as its platform, and provides GIS-based visualization and support for developing networks including sequences of land uses, BMPs, and stream reaches. The system also provides interfaces for BMP placement, BMP attribute data input, and decision optimization management. The system includes a stand-alone BMP simulation and evaluation module, which complements both research and regulatory nonpoint source control assessment efforts, and allows flexibility in examining various BMP design alternatives. Process-based simulation of BMPs provides a technique that is sensitive to local climate and rainfall patterns. The system incorporates a meta-heuristic optimization technique to find the most cost-effective BMP placement and implementation plan given a control target or a fixed cost. A case study is presented to demonstrate the application of the Prince George's County system. The case study involves a highly urbanized area in the Anacostia River (a tributary to the Potomac River) watershed southeast of Washington, DC.
An innovative system of management practices is proposed to minimize runoff, improve water quality, and provide water reuse opportunities. Proposed management techniques include bioretention, green roof, and rooftop runoff collection (rain barrel) systems. The modeling system was used to identify the most cost-effective combinations of management practices to help minimize frequency and size of runoff events and resulting combined sewer overflows to the Anacostia River.
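The meta-heuristic optimizer itself is not described in the abstract; a greedy stand-in conveys the underlying cost-effectiveness idea. The BMP names echo the practices mentioned above, but all reduction and cost numbers, and the selection rule, are hypothetical:

```python
# Greedy sketch of cost-effective BMP selection: repeatedly pick the
# practice with the lowest cost per unit of load reduction until the
# control target is met. A simplified stand-in for the meta-heuristic
# optimizer; all numbers are made up.
def select_bmps(bmps, target_reduction):
    chosen, achieved, cost = [], 0.0, 0.0
    # Sort by cost-effectiveness: dollars per unit of reduction.
    for b in sorted(bmps, key=lambda b: b["cost"] / b["reduction"]):
        if achieved >= target_reduction:
            break
        chosen.append(b["name"])
        achieved += b["reduction"]
        cost += b["cost"]
    return chosen, achieved, cost

bmps = [
    {"name": "bioretention", "reduction": 30.0, "cost": 120_000.0},
    {"name": "green roof",   "reduction": 15.0, "cost": 90_000.0},
    {"name": "rain barrels", "reduction": 10.0, "cost": 20_000.0},
]
chosen, achieved, cost = select_bmps(bmps, target_reduction=38.0)
```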
Bayesian approaches for Integrated Water Resources Management. A Mediterranean case study.
NASA Astrophysics Data System (ADS)
Gulliver, Zacarías; Herrero, Javier; José Polo, María
2013-04-01
This study presents the first steps of a short-term/mid-term analysis of the water resources in the Guadalfeo Basin, Spain. Within the basin, the recent construction of the Rules dam has required the development of specific management tools and structures for this water system. The climate variability and the high water demand for agricultural irrigation and tourism in this region may cause different controversies in the water management planning process. During the first stages of the study, a rigorous analysis of the Water Framework Directive results was carried out in order to implement the legal requirements and address the gaps identified by the water authorities. In addition, the stakeholders and water experts identified the variables and geophysical processes for our specific water system; these particularities need to be taken into account and must be reflected in the final computational tool. For decision-making purposes on a mid-term scale, a Bayesian network has been used to quantify uncertainty; it also provides a structured representation of probabilities, actions/decisions, and utilities. On the one hand, applying these techniques makes it possible to include decision rules, generating influence diagrams that provide clear and coherent semantics for the value of making an observation. On the other hand, the utility nodes encode the stakeholders' preferences, which are measured on a numerical scale, and the action that maximizes the expected utility (MEU) is chosen. This graphical model also allows us to identify gaps and plan corrective measures, for example by formulating scenarios associated with different event hypotheses. In this sense, conditional probability distributions of the seasonal water demand and wastewater have been obtained for the established intervals. This will give regional water managers useful information for future decision-making processes.
The final display is very visual and allows the user to quickly understand the model and the causal relationships between the existing nodes and variables. The input data were collected from the local monitoring networks, and the unmonitored data have been generated with WiMMed, a physically based, spatially distributed hydrological model that has been calibrated and validated. For short-term purposes, pattern analysis has been applied to the management of extreme-event scenarios, with techniques such as Bayesian neural networks (BNN) and Gaussian processes (GP) providing accurate predictions.
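The maximum-expected-utility rule encoded by the utility nodes can be sketched directly. The states, probabilities, and utilities below are illustrative placeholders, not the basin's calibrated values:

```python
# Maximum expected utility (MEU): for each action, average its utility
# over the possible states weighted by their probabilities, then pick
# the action with the highest expectation. All numbers are made up.
def meu(actions, p_states, utility):
    """utility[a][s]: stakeholder utility of action a if state s occurs."""
    eu = {a: sum(p * u for p, u in zip(p_states, utility[a]))
          for a in actions}
    best = max(eu, key=eu.get)
    return best, eu

# Hypothetical states: dry year vs. wet year;
# hypothetical actions: restrict demand vs. allow full demand.
best, eu = meu(
    actions=["restrict", "full"],
    p_states=[0.3, 0.7],
    utility={"restrict": [60.0, 70.0], "full": [20.0, 90.0]},
)
```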
Kimber, Melissa; Couturier, Jennifer; Jack, Susan; Niccols, Alison; Van Blyderveen, Sherry; McVey, Gail
2014-01-01
To explore the decision-making processes involved in the uptake and implementation of evidence-based treatments (EBTs), namely, family-based treatment (FBT), among therapists and their administrators within publically funded eating disorder treatment programs in Ontario, Canada. Fundamental qualitative description guided sampling, data collection, and analytic decisions. Forty therapists and 11 administrators belonging to a network of clinicians treating eating disorders completed an in-depth interview regarding the decision-making processes involved in EBT uptake and implementation within their organizations. Content analysis and the constant comparative technique were used to analyze interview transcripts, with 20% of the data independently double-coded by a second coder. Therapists and their administrators identified the importance of an inclusive change culture in evidence-based practice (EBP) decision-making. Each group indicated reluctance to make EBP decisions in isolation from the other. Additionally, participants identified seven stages of decision-making involved in EBT adoption, beginning with exposure to the EBT model and ending with evaluating the impact of the EBT on patient outcomes. Support for a stage-based decision-making process was in participants' indication that the stages were needed to demonstrate that they considered the costs and benefits of making a practice change. Participants indicated that EBTs endorsed by the Provincial Network for Eating Disorders or the Academy for Eating Disorders would more likely be adopted. Future work should focus on integrating the important decision-making processes identified in this study with known implementation models to increase the use of low-cost and effective treatments, such as FBT, within eating disorder treatment programs. Copyright © 2013 Wiley Periodicals, Inc.
Panel Discussion on Multi-Disciplinary Analysis
NASA Technical Reports Server (NTRS)
Garcia, Robert
2002-01-01
The Marshall Space Flight Center (MSFC) is hosting the Thermal and Fluids Analysis Workshop (TFAWS) during the week of September 10, 2001. Included in this year's TFAWS is a panel session on multidisciplinary analysis techniques. The intent is to give users information on which product may be best suited to their application environment and to give developers feedback on desired future developments. Potential users of multidisciplinary analysis (MDA) techniques are often overwhelmed by the number of commercial products available to them and by the pace of new developments in this area. The purpose of this panel session is to provide a forum in which available and in-development MDA tools can be discussed, compared, and contrasted. The intent is to give end-users the information necessary to make educated decisions about selecting an MDA tool. It is anticipated that this year's discussions will focus on MDA techniques that couple discipline codes or algorithms (as opposed to monolithic, unified MDA approaches). The MDA developers will be asked to prepare a product overview presentation addressing specific questions provided by the panel organizers. These questions are intended to establish the method each MDA technique employs for communication between discipline codes, the similarities and differences among the various approaches, and the range of experience and applications for each.
DECISION MAKING, GROUP DYNAMICS, NAVAL TRAINING, TRANSFER OF TRAINING, SCIENTIFIC RESEARCH, CLASSIFICATION, PROBLEM SOLVING, MATHEMATICAL MODELS, SUBMARINES, SIMULATORS, PERFORMANCE (HUMAN), UNDERSEA WARFARE.
Kaner, Eileen; Heaven, Ben; Rapley, Tim; Murtagh, Madeleine; Graham, Ruth; Thomson, Richard; May, Carl
2007-01-10
Much of the research on decision-making in health care has focused on consultation outcomes. Less is known about the process by which clinicians and patients come to a treatment decision. This study aimed to quantitatively describe the behaviour shown by doctors and patients during primary care consultations when three types of decision aids were used to promote treatment decision-making in a randomised controlled trial. A video-based study set in an efficacy trial which compared the use of paper-based guidelines (control) with two forms of computer-based decision aids (implicit and explicit versions of DARTS II). Treatment decision concerned warfarin anti-coagulation to reduce the risk of stroke in older patients with atrial fibrillation. Twenty nine consultations were video-recorded. A ten-minute 'slice' of the consultation was sampled for detailed content analysis using existing interaction analysis protocols for verbal behaviour and ethological techniques for non-verbal behaviour. Median consultation times (quartiles) differed significantly depending on the technology used. Paper-based guidelines took 21 (19-26) minutes to work through compared to 31 (16-41) minutes for the implicit tool; and 44 (39-55) minutes for the explicit tool. In the ten minutes immediately preceding the decision point, GPs dominated the conversation, accounting for 64% (58-66%) of all utterances and this trend was similar across all three arms of the trial. Information-giving was the most frequent activity for both GPs and patients, although GPs did this at twice the rate compared to patients and at higher rates in consultations involving computerised decision aids. GPs' language was highly technically focused and just 7% of their conversation was socio-emotional in content; this was half the socio-emotional content shown by patients (15%). 
However, frequent head nodding and close mirroring of eye-gaze direction suggested that both parties were active participants in the conversation. Irrespective of the arm of the trial, both patients' and GPs' behaviour showed that they were reciprocally engaged in these consultations. However, even in consultations aimed at promoting shared decision-making, GPs were verbally dominant and worked primarily as information providers for patients. In addition, computer-based decision aids significantly prolonged the consultations, particularly their later phases. These data suggest that decision aids may not lead to more 'sharing' in treatment decision-making and that, in their current form, they may take too long to negotiate for use in routine primary care.
NASA Astrophysics Data System (ADS)
Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.
2016-12-01
Multi-Criteria Decision Making (MCDM) is an advanced analytical method for deriving an appropriate decision from multiple, often conflicting, criteria. Geospatial approaches such as remote sensing and GIS are likewise advanced technical means of collecting, processing, and analyzing spatial data. Combined with MCDM techniques, GIS and remote sensing provide a strong platform for solving complex decision-making problems, and the combination has been used effectively for site selection in urban solid waste management. Among MCDM techniques, Weighted Linear Combination (WLC) is the most popular, and the Analytic Hierarchy Process (AHP) is another widely used and consistent approach to dependable decision making. The main objective of this study is therefore to develop an AHP model as an MCDM technique, coupled with a Geographic Information System (GIS), to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste requires an appropriate landfill site chosen with the environmental, geological, social, and technical aspects of the region in mind. An MCDM model was generated from five criterion classes related to these environmental, geological, social, and technical factors using the AHP method, and the resulting weights were input into GIS to produce the final suitability map. The final suitable locations cover 12.2% of the study area, corresponding to 22.89 km2.
The study area is the Keraniganj sub-district of Dhaka district, Bangladesh, a densely populated area that currently has an unmanaged waste-disposal system and, in particular, lacks suitable landfill sites for waste dumping.
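The AHP weighting step described above can be sketched as follows; the pairwise-comparison matrix and the four criteria named in the comments are invented for illustration and are not the study's actual judgments:

```python
import numpy as np

# Illustrative Saaty-scale pairwise comparison matrix for four
# hypothetical criteria (e.g. distance to water, slope, land use,
# distance to roads); entry A[i, j] says how much more important
# criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

# The principal eigenvector gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency ratio CR = CI / RI, with Saaty's random index 0.90 for n = 4;
# judgments are conventionally acceptable when CR < 0.1.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.90
```

In a GIS workflow these weights would then multiply the normalized raster layers of each criterion to produce the suitability surface.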
Demographics of reintroduced populations: estimation, modeling, and decision analysis
Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.
2013-01-01
Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.
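The decision-analytic decomposition described above (objectives, alternative actions, predictive models, optimization) can be sketched with a toy expected-value calculation; every number, probability, and action name below is hypothetical, not drawn from the special section:

```python
# Hypothetical reintroduction decision: choose a release size that
# maximizes expected net benefit under two uncertain survival scenarios.
actions = {"release_20": 20, "release_50": 50, "release_100": 100}

# Predictive model: scenario probability and per-individual survival rate.
scenarios = {"good_year": (0.6, 0.5), "poor_year": (0.4, 0.2)}

cost_per_bird = 1.0  # arbitrary cost units per released individual

def expected_net_benefit(n_released):
    """Objective: expected survivors minus a (scaled) release cost."""
    survivors = sum(p * s * n_released for p, s in scenarios.values())
    return survivors - 0.1 * cost_per_bird * n_released

# Optimization step: pick the alternative with the highest expected value.
best = max(actions, key=lambda a: expected_net_benefit(actions[a]))
```

Because the toy objective is linear in release size, the largest release wins; a real analysis would use a population model with density dependence and demographic uncertainty, which is exactly where the special section's models come in.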
Breaking up is hard to do: why disinvestment in medical technology is harder than investment.
Haas, Marion; Hall, Jane; Viney, Rosalie; Gallego, Gisselle
2012-05-01
Healthcare technology is a two-edged sword: it offers new and better treatment to a wider range of people and, at the same time, is a major driver of increasing costs in health systems. Many countries have developed sophisticated systems of health technology assessment (HTA) to inform decisions about investment in new healthcare interventions. In this paper, we question whether HTA is also the appropriate framework for guiding or informing disinvestment decisions. In exploring the issues related to disinvestment, we first discuss the various HTA frameworks which have been suggested as a means of encouraging or facilitating disinvestment. We then describe available means of identifying candidates for disinvestment (comparative effectiveness research, clinical practice variations, clinical practice guidelines) and of implementing the disinvestment process (program budgeting and marginal analysis (PBMA) and related techniques). In considering the possible reasons for the lack of progress in active disinvestment, we suggest that HTA is not the right framework because disinvestment involves a different decision-making context. The key to disinvestment is not just what to stop doing but how to make it happen; that is, decision makers need to be aware of funding disincentives.
Ramezankhani, Azra; Pournik, Omid; Shahrabi, Jamal; Khalili, Davood; Azizi, Fereidoun; Hadaegh, Farzad
2014-09-01
The aim of this study was to create a prediction model, using a data mining approach, to identify individuals at low risk for incident type 2 diabetes, using the Tehran Lipid and Glucose Study (TLGS) database. For a population of 6647 people without diabetes, aged ≥20 years and followed for 12 years, a prediction model was developed using decision-tree classification. Seven hundred and twenty-nine (11%) diabetes cases occurred during follow-up. Predictor variables were selected from demographic characteristics, smoking status, medical and drug history, and laboratory measures. We developed the predictive models by decision tree using 60 input variables and one output variable. The overall classification accuracy was 90.5%, with 31.1% sensitivity and 97.9% specificity; for the subjects without diabetes, precision and F-measure were 92% and 0.95, respectively. The identified variables included fasting plasma glucose, body mass index, triglycerides, mean arterial blood pressure, family history of diabetes, educational level and job status. In conclusion, decision tree analysis, using routine demographic, clinical, anthropometric and laboratory measurements, created a simple tool to predict individuals at low risk for type 2 diabetes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
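The core of decision-tree learning, as used in this study, is a repeated search for the split that best reduces class impurity. The sketch below shows one Gini-based split on a synthetic stand-in for fasting plasma glucose; all values are invented for illustration:

```python
def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(xs, ys):
    """Find the threshold on one feature that minimizes weighted Gini."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Synthetic "fasting glucose" values (mmol/L) with diabetes outcome 0/1.
glucose = [4.8, 5.1, 5.4, 6.9, 7.2, 7.8]
outcome = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(glucose, outcome)
```

A full tree applies this search recursively over all 60 input variables, which is how variables such as fasting plasma glucose and body mass index end up at the top of the tree.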
Intelligent Diagnostic Assistant for Complicated Skin Diseases through C5's Algorithm.
Jeddi, Fatemeh Rangraz; Arabfard, Masoud; Kermany, Zahra Arab
2017-09-01
Intelligent diagnostic assistants can be used for the complicated diagnosis of skin diseases, which are among the most common causes of disability. The aim of this study was to design and implement a computerized intelligent diagnostic assistant for complicated skin diseases using the C5 algorithm. An applied-developmental study was done in 2015. The knowledge base was developed from interviews with dermatologists using questionnaires and checklists. Knowledge representation was obtained from the training data in the database using Microsoft Office Excel. Clementine software and the C5 algorithm were applied to draw the decision tree. Analysis of test accuracy was performed based on rules extracted using inference chains. The rules extracted from the decision tree were entered into the CLIPS programming environment, and the intelligent diagnostic assistant was then designed. The rules were defined using the forward-chaining inference technique and entered into the CLIPS programming environment as RULEs. The accuracy and error rates obtained from the decision tree in the training phase were 99.56% and 0.44%, respectively. In the test phase, the accuracy of the decision tree was 98% and the error was 2%. The intelligent diagnostic assistant can be used as a reliable system with high accuracy, sensitivity, specificity, and agreement.
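The forward-chaining inference the abstract describes can be sketched in a few lines; the rules and symptom names below are invented examples, not the study's actual CLIPS knowledge base:

```python
# Each rule is (set of required facts, derived fact); hypothetical examples.
rules = [
    ({"itching", "scaly_patches"}, "suspect_psoriasis"),
    ({"suspect_psoriasis", "nail_pitting"}, "psoriasis_likely"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions hold until nothing new."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"itching", "scaly_patches", "nail_pitting"}, rules)
```

This data-driven chaining from symptoms toward diagnoses is what CLIPS automates at scale with its Rete-based rule engine.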
Ideal AFROC and FROC observers.
Khurd, Parmeshwar; Liu, Bin; Gindi, Gene
2010-02-01
Detection of multiple lesions in images is a medically important task, and free-response receiver operating characteristic (FROC) analysis and its variants, such as alternative FROC (AFROC) analysis, are commonly used to quantify performance in such tasks. However, ideal observers that optimize FROC or AFROC performance metrics have not yet been formulated in the general case. If available, such ideal observers may prove valuable for imaging-system optimization and for the design of computer-aided diagnosis techniques for lesion detection in medical images. In this paper, we derive ideal AFROC and FROC observers. They are ideal in that they maximize, among all decision strategies, the area, or any partial area, under the associated AFROC or FROC curve. Calculating observer performance for these ideal observers is computationally quite complex. We can reduce this complexity by considering forms of these observers that use false-positive reports derived from signal-absent images only. We also consider a Bayes risk analysis for the multiple-signal detection task with an appropriate definition of costs. A general decision strategy that minimizes Bayes risk is derived. With particular cost constraints, this general decision strategy reduces to the decision strategy associated with the ideal AFROC or FROC observer.
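An empirical FROC curve itself (though not the ideal-observer decision strategy derived in the paper) can be computed by sweeping a threshold over per-candidate detection scores; the scores and image count below are synthetic:

```python
def froc_points(lesion_scores, fp_scores, n_images):
    """For each threshold, return (mean false positives per image,
    fraction of true lesions detected)."""
    thresholds = sorted(set(lesion_scores + fp_scores), reverse=True)
    points = []
    for t in thresholds:
        tp = sum(s >= t for s in lesion_scores)
        fp = sum(s >= t for s in fp_scores)
        points.append((fp / n_images, tp / len(lesion_scores)))
    return points

# Scores of true-lesion candidates and false-positive candidates,
# pooled over 4 hypothetical images.
lesions = [0.9, 0.8, 0.6, 0.4]
false_pos = [0.7, 0.5, 0.3]
curve = froc_points(lesions, false_pos, n_images=4)
```

The AFROC variant replaces the unbounded false-positives-per-image axis with a false-positive fraction, which is what makes its area a proper figure of merit for the observers the paper derives.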