Sample records for multi-attribute utility analysis

  1. Assessing the Reliability and Validity of Multi-Attribute Utility Procedures: An Application of the Theory of Generalizability

    DTIC Science & Technology

    1975-07-01

    AD-A016 282 ASSESSING THE RELIABILITY AND VALIDITY OF MULTI-ATTRIBUTE UTILITY PROCEDURES: AN...more complicated and use data from actual experiments. Example 1: Analysis of raters making importance judgments about attributes. In MAU studies...generalizability of JUDGE as contrasted to ÜASC. To do this, we will reanalyze the data for each system separately. This is valid since the initial

  2. Schizophrenia: multi-attribute utility theory approach to selection of atypical antipsychotics.

    PubMed

    Bettinger, Tawny L; Shuler, Garyn; Jones, Donnamaria R; Wilson, James P

    2007-02-01

    Current guidelines/algorithms recommend atypical antipsychotics as first-line agents for the treatment of schizophrenia. Because there are extensive healthcare costs associated with the treatment of schizophrenia, many institutions and health systems are faced with making restrictive formulary decisions regarding the use of atypical antipsychotics. Often, medication acquisition costs are the driving force behind formulary decisions, while other treatment factors are not considered. To apply a multi-attribute utility theory (MAUT) analysis to aid in the selection of a preferred agent among the atypical antipsychotics for the treatment of schizophrenia. Five atypical antipsychotics (risperidone, olanzapine, quetiapine, ziprasidone, aripiprazole) were selected as the alternative agents to be included in the MAUT analysis. The attributes identified for inclusion in the analysis were efficacy, adverse effects, cost, and adherence, with relative weights of 35%, 35%, 20%, and 10%, respectively. For each agent, attribute scores were calculated, weighted, and then summed to generate a total utility score. The agent with the highest total utility score was considered the preferred agent. Aripiprazole, with a total utility score of 75.8, was the alternative agent with the highest total utility score in this model. This was followed by ziprasidone, risperidone, and quetiapine, with total utility scores of 71.8, 69.0, and 65.9, respectively. Olanzapine received the lowest total utility score. A sensitivity analysis was performed and failed to displace aripiprazole as the agent with the highest total utility score. This model suggests that aripiprazole should be considered a preferred agent for the treatment of schizophrenia unless found to be otherwise inappropriate.
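
    The weighted-sum scoring described in this abstract is straightforward to sketch in code. The attribute weights below (efficacy 35%, adverse effects 35%, cost 20%, adherence 10%) are taken from the abstract; the per-agent attribute scores are hypothetical placeholders, not the study's data.

    ```python
    # Minimal MAUT weighted-sum sketch. Weights are from the abstract; the
    # 0-100 attribute scores below are invented for illustration only.
    weights = {"efficacy": 0.35, "adverse_effects": 0.35, "cost": 0.20, "adherence": 0.10}

    scores = {
        "aripiprazole": {"efficacy": 78, "adverse_effects": 80, "cost": 65, "adherence": 72},
        "ziprasidone":  {"efficacy": 74, "adverse_effects": 75, "cost": 70, "adherence": 60},
    }

    def total_utility(attribute_scores, weights):
        """Weighted sum of attribute scores -> total utility score."""
        return sum(weights[a] * s for a, s in attribute_scores.items())

    ranked = sorted(scores, key=lambda agent: total_utility(scores[agent], weights), reverse=True)
    for agent in ranked:
        print(f"{agent}: {total_utility(scores[agent], weights):.1f}")
    ```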

  3. Decision Making Methods in Space Economics and Systems Engineering

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

    This viewgraph presentation reviews various methods of decision making and the impact that they have on space economics and systems engineering. Some of the methods discussed are: Present Value and Internal Rate of Return (IRR); Cost-Benefit Analysis; Real Options; Cost-Effectiveness Analysis; Cost-Utility Analysis; Multi-Attribute Utility Theory (MAUT); and Analytic Hierarchy Process (AHP).

  4. EXPERIMENTING WITH MULTI-ATTRIBUTE UTILITY SURVEY METHODS IN A MULTI-DIMENSIONAL VALUATION PROBLEM. (R824699)

    EPA Science Inventory

    Abstract

    The use of willingness-to-pay (WTP) survey techniques based on multi-attribute utility (MAU) approaches has been recommended by some authors as a way to deal simultaneously with two difficulties that increasingly plague environmental valuation. The first of th...

  5. Why do multi-attribute utility instruments produce different utilities: the relative importance of the descriptive systems, scale and 'micro-utility' effects.

    PubMed

    Richardson, Jeff; Iezzi, Angelo; Khan, Munir A

    2015-08-01

    Health state utilities measured by the major multi-attribute utility instruments differ. Understanding the reasons for this is important for the choice of instrument and for research designed to reconcile these differences. This paper investigates these reasons by explaining pairwise differences between utilities derived from six multi-attribute utility instruments in terms of (1) their implicit measurement scales; (2) the structure of their descriptive systems; and (3) 'micro-utility effects', scale-adjusted differences attributable to their utility formulae. The EQ-5D-5L, SF-6D, HUI 3, 15D and AQoL-8D were administered to 8,019 individuals. Utilities and unweighted values were calculated using each instrument. Scale effects were determined by the linear relationship between utilities, the effect of the descriptive system by comparison of scale-adjusted values and 'micro-utility effects' by the unexplained difference between utilities and values. Overall, 66 % of the differences between utilities were attributable to the descriptive systems, 30.3 % to scale effects and 3.7 % to micro-utility effects. Results imply that the revision of utility algorithms will not reconcile differences between instruments. The dominating importance of the descriptive system highlights the need for researchers to select the instrument most capable of describing the health states relevant for a study. Reconciliation of inconsistent utilities produced by different instruments must focus primarily upon the content of the descriptive system. Utility weights primarily determine the measurement scale. Other differences, attributable to the utility formulae, are comparatively unimportant.

  6. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    PubMed

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.

  7. Operation Exodus: The Massacre of 44 Philippine Police Commandos In Mamasapano Clash

    DTIC Science & Technology

    2016-09-01

    strategic thinking, utilizing Game Theory and Multi-Attribute Decision Making; the combination of these two dynamic tools is used to evaluate their potential...A. GAME THEORETIC APPROACH...B. APPLYING GAME THEORY TO OPLAN: EXODUS

  8. Screening and Evaluation Tool (SET) Users Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pincock, Layne

    This document is the user's guide for the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics. It does this using standard multi-attribute utility decision analysis methods.

  9. Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis

    PubMed Central

    Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748

  10. Multi-Task Learning with Low Rank Attribute Embedding for Multi-Camera Person Re-Identification.

    PubMed

    Su, Chi; Yang, Fan; Zhang, Shiliang; Tian, Qi; Davis, Larry Steven; Gao, Wen

    2018-05-01

    We propose Multi-Task Learning with Low Rank Attribute Embedding (MTL-LORAE) to address the problem of person re-identification across multiple cameras. Re-identifications on different cameras are considered related tasks, which allows the shared information among different tasks to be explored to improve the re-identification accuracy. The MTL-LORAE framework integrates low-level features with mid-level attributes as the descriptions for persons. To improve the accuracy of such descriptions, we introduce the low-rank attribute embedding, which maps original binary attributes into a continuous space utilizing the correlative relationship between each pair of attributes. In this way, inaccurate attributes are rectified and missing attributes are recovered. The resulting objective function is constructed with an attribute embedding error and a quadratic loss concerning class labels. It is solved by an alternating optimization strategy. The proposed MTL-LORAE is tested on four datasets and is validated to outperform the existing methods with significant margins.

  11. Evaluation of infectious diseases and clinical microbiology specialists' preferences for hand hygiene: analysis using the multi-attribute utility theory and the analytic hierarchy process methods.

    PubMed

    Suner, Aslı; Oruc, Ozlem Ege; Buke, Cagri; Ozkaya, Hacer Deniz; Kitapcioglu, Gul

    2017-08-31

    Hand hygiene is one of the most effective attempts to control nosocomial infections, and it is an important measure to avoid the transmission of pathogens. However, the compliance of healthcare workers (HCWs) with hand washing is still poor worldwide. Herein, we aimed to determine the best hand hygiene preference of infectious diseases and clinical microbiology (IDCM) specialists to prevent transmission of microorganisms from one patient to another. Expert opinions regarding the criteria that influence the best hand hygiene preference were collected through a questionnaire via face-to-face interviews. Afterwards, these opinions were examined with two widely used multi-criteria decision analysis (MCDA) methods, the Multi-Attribute Utility Theory (MAUT) and the Analytic Hierarchy Process (AHP). A total of 15 IDCM specialist opinions were collected from diverse private and public hospitals located in İzmir, Turkey. The mean age of the participants was 49.73 ± 8.46 years, and their mean length of experience in their fields was 17.67 ± 11.98 years. The findings that we obtained through two distinct decision-making methods, the MAUT and the AHP, suggest that alcohol-based antiseptic solution (ABAS) has the highest utility (0.86) and priority (0.69) among the experts' choices. In conclusion, the MAUT and AHP decision models developed here indicate that rubbing the hands with ABAS is the most favorable choice for IDCM specialists to prevent nosocomial infection.
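
    To illustrate the AHP side of this record, the sketch below derives priorities from a pairwise comparison matrix via the principal eigenvector and checks consistency. The 3x3 matrix and the three hypothetical hand-hygiene options are invented for illustration; they are not the study's judgments.

    ```python
    # AHP priority sketch: priorities from a pairwise comparison matrix via the
    # principal eigenvector. Matrix entries are hypothetical expert judgments.
    import numpy as np

    A = np.array([
        [1.0, 5.0, 7.0],   # e.g. ABAS rub vs. the two other options (invented)
        [1/5, 1.0, 3.0],
        [1/7, 1/3, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)
    priorities = np.abs(eigvecs[:, principal].real)
    priorities /= priorities.sum()            # normalize priorities to sum to 1

    # Consistency ratio (random index RI = 0.58 for a 3x3 matrix)
    n = A.shape[0]
    ci = (eigvals.real[principal] - n) / (n - 1)
    cr = ci / 0.58
    print("priorities:", np.round(priorities, 3), "CR:", round(cr, 3))
    ```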

  12. Patient perspectives of telemedicine quality

    PubMed Central

    LeRouge, Cynthia M; Garfield, Monica J; Hevner, Alan R

    2015-01-01

    Background The purpose of this study was to explore the quality attributes required for effective telemedicine encounters from the perspective of the patient. Methods We used a multi-method (direct observation, focus groups, survey) field study to collect data from patients who had experienced telemedicine encounters. Multi-perspectives (researcher and provider) were used to interpret a rich set of data from both a research and practice perspective. Results The result of this field study is a taxonomy of quality attributes for telemedicine service encounters that prioritizes the attributes from the patient perspective. We identify opportunities to control the level of quality for each attribute (ie, who is responsible for control of each attribute and when control can be exerted in relation to the encounter process). This analysis reveals that many quality attributes are in the hands of various stakeholders, and all attributes can be addressed proactively to some degree before the encounter begins. Conclusion Identification of the quality attributes important to a telemedicine encounter from a patient perspective enables one to better design telemedicine encounters. This preliminary work not only identifies such attributes, but also ascertains who is best able to address quality issues prior to an encounter. For practitioners, explicit representation of the quality attributes of technology-based systems and processes and insight on controlling key attributes are essential to implementation, utilization, management, and common understanding. PMID:25565781

  13. Economics of human performance and systems total ownership cost.

    PubMed

    Onkham, Wilawan; Karwowski, Waldemar; Ahram, Tareq Z

    2012-01-01

    Financial costs of investing in people, associated with training, acquisition, recruiting, and resolving human errors, have a significant impact on total ownership costs. These costs can also contribute to exaggerated budgets and delayed schedules. The study of economic assessment of human performance in the system acquisition process enhances the visibility of hidden cost drivers, which supports informed program management decisions. This paper presents a literature review of human total ownership cost (HTOC) and cost impacts on overall system performance. Economic value assessment models such as cost-benefit analysis, risk-cost tradeoff analysis, expected value of utility function analysis (EV), the growth readiness matrix, the multi-attribute utility technique, and multi-regression models are introduced to reflect the HTOC and human performance-technology tradeoffs in terms of dollar value. A human total ownership regression model is introduced to address measurement of the influential human performance cost components. Results from this study will increase understanding of relevant cost drivers in the system acquisition process over the long term.

  14. The Gamma-Ray Burst ToolSHED is Open for Business

    NASA Astrophysics Data System (ADS)

    Giblin, Timothy W.; Hakkila, Jon; Haglin, David J.; Roiger, Richard J.

    2004-09-01

    The GRB ToolSHED, a Gamma-Ray Burst SHell for Expeditions in Data-Mining, is now online and available via a web browser to all in the scientific community. The ToolSHED is an online web utility that contains pre-processed burst attributes of the BATSE catalog and a suite of induction-based machine learning and statistical tools for classification and cluster analysis. Users create their own login account and study burst properties within user-defined multi-dimensional parameter spaces. Although new GRB attributes are periodically added to the database for user selection, the ToolSHED has a feature that allows users to upload their own burst attributes (e.g. spectral parameters, etc.) so that additional parameter spaces can be explored. A data visualization feature using GNUplot and web-based IDL has also been implemented to provide interactive plotting of user-selected session output. In an era in which GRB observations and attributes are becoming increasingly more complex, a utility such as the GRB ToolSHED may play an important role in deciphering GRB classes and understanding intrinsic burst properties.

  15. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During the manufacturing and storage process, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability or the pharmacokinetic and pharmacodynamic profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which could pose difficulty when implementing in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides results comparable to the traditional assays. To ensure future application in the QC environment, this method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates facile understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.

  16. The Valuation of Scientific and Technical Experiments

    NASA Technical Reports Server (NTRS)

    Williams, F. E.

    1972-01-01

    Rational selection of scientific and technical experiments for space missions is studied. Particular emphasis is placed on the assessment of value or worth of an experiment. A specification procedure is outlined and discussed for the case of one decision maker. Experiments are viewed as multi-attributed entities, and a relevant set of attributes is proposed. Alternative methods of describing levels of the attributes are proposed and discussed. The reasonableness of certain simplifying assumptions such as preferential and utility independence is explored, and it is tentatively concluded that preferential independence applies and utility independence appears to be appropriate.

  17. Diverse Expected Gradient Active Learning for Relative Attributes.

    PubMed

    You, Xinge; Wang, Ruxin; Tao, Dacheng

    2014-06-02

    The use of relative attributes for semantic understanding of images and videos is a promising way to improve communication between humans and machines. However, it is extremely labor- and time-consuming to define multiple attributes for each instance in large amounts of data. One option is to incorporate active learning, so that informative samples can be actively discovered and then labeled. However, most existing active-learning methods select samples one at a time (serial mode), and may therefore lose efficiency when learning multiple attributes. In this paper, we propose a batch-mode active-learning method, called Diverse Expected Gradient Active Learning (DEGAL). This method integrates an informativeness analysis and a diversity analysis to form a diverse batch of queries. Specifically, the informativeness analysis employs the expected pairwise gradient length as a measure of informativeness, while the diversity analysis enforces a constraint on the proposed diverse gradient angle. Since simultaneous optimization of these two parts is intractable, we utilize a two-step procedure to obtain the diverse batch of queries. A heuristic method is also introduced to suppress imbalanced multi-class distributions. Empirical evaluations on three different databases demonstrate the effectiveness and efficiency of the proposed approach.

  18. Construction of social value or utility-based health indices: the usefulness of factorial experimental design plans.

    PubMed

    Cadman, D; Goldsmith, C

    1986-01-01

    Global indices, which aggregate multiple health or function attributes into a single summary indicator, are useful measures in health research. Two key issues must be addressed in the initial stages of index construction: from the universe of possible health and function attributes, which ones should be included in a new index? And how simple can the statistical model be that combines attributes into a single numeric index value? Factorial experimental designs were used in the initial stages of developing a function index for evaluating a program for the care of young handicapped children. Beginning with eight attributes judged important to the goals of the program by clinicians, social preference values for different function states were obtained from 32 parents of handicapped children and 32 members of the community. Using category rating methods, each rater scored 16 written multi-attribute case descriptions which contained information about a child's status for all eight attributes. Either a good or poor level of each function attribute and age 3 or 5 years were described in each case. Thus, 2^8 = 256 different cases were rated. Two factorial design plans were selected and used to allocate case descriptions to raters. Analysis of variance determined that seven of the eight clinician-selected attributes were required in a social-value-based index for handicapped children. Most importantly, the subsequent steps of index construction could be greatly simplified by the finding that a simple additive statistical model without complex attribute interaction terms was adequate for the index. We conclude that factorial experimental designs are an efficient, feasible and powerful tool for the initial stages of constructing a multi-attribute health index.
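
    A small sketch of the underlying idea follows: enumerate the 2^8 = 256 good/poor combinations of eight two-level attributes and fit an additive (main-effects-only) model to the ratings by least squares. The attribute coding, weights, and ratings below are synthetic, not the study's data.

    ```python
    # Fit an additive main-effects model to ratings of 2^8 = 256 case descriptions.
    # Synthetic "true" importance weights and noisy ratings stand in for real data.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n_attributes = 8
    # Full factorial: every combination of good (+1) / poor (-1) attribute levels.
    X = np.array(list(itertools.product([-1, 1], repeat=n_attributes)), dtype=float)

    true_weights = rng.uniform(2, 10, n_attributes)              # synthetic importance weights
    ratings = 50 + X @ true_weights + rng.normal(0, 3, len(X))   # synthetic category ratings

    # Additive model: rating ~ intercept + sum of main effects (no interactions).
    design = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(design, ratings, rcond=None)
    print("estimated main effects:", np.round(coef[1:], 2))
    ```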

  19. Multi-Attribute Tradespace Exploration in Space System Design

    NASA Astrophysics Data System (ADS)

    Ross, A. M.; Hastings, D. E.

    2002-01-01

    The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher-fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g. scientists, engineers, managers, etc.). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.

  20. Advanced GPR imaging of sedimentary features: integrated attribute analysis applied to sand dunes

    NASA Astrophysics Data System (ADS)

    Zhao, Wenke; Forte, Emanuele; Fontolan, Giorgio; Pipan, Michele

    2018-04-01

    We evaluate the applicability and the effectiveness of integrated GPR attribute analysis to image the internal sedimentary features of the Piscinas Dunes, SW Sardinia, Italy. The main objective is to explore the limits of GPR techniques to study sediment-bodies geometry and to provide a non-invasive high-resolution characterization of the different subsurface domains of dune architecture. On such purpose, we exploit the high-quality Piscinas data-set to extract and test different attributes of the GPR trace. Composite displays of multi-attributes related to amplitude, frequency, similarity and textural features are displayed with overlays and RGB mixed models. A multi-attribute comparative analysis is used to characterize different radar facies to better understand the characteristics of internal reflection patterns. The results demonstrate that the proposed integrated GPR attribute analysis can provide enhanced information about the spatial distribution of sediment bodies, allowing an enhanced and more constrained data interpretation.

  1. Applying an efficient K-nearest neighbor search to forest attribute imputation

    Treesearch

    Andrew O. Finley; Ronald E. McRoberts; Alan R. Ek

    2006-01-01

    This paper explores the utility of an efficient nearest neighbor (NN) search algorithm for applications in multi-source kNN forest attribute imputation. The search algorithm reduces the number of distance calculations between a given target vector and each reference vector, thereby, decreasing the time needed to discover the NN subset. Results of five trials show gains...
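
    For orientation, the sketch below shows generic kNN attribute imputation with a brute-force nearest-neighbor search; it is not the paper's accelerated search algorithm, and the reference features and forest attribute are synthetic.

    ```python
    # Generic kNN attribute imputation sketch (brute-force search). Reference plots
    # carry features (e.g. spectral bands) and a forest attribute (e.g. basal area);
    # a target gets its attribute imputed from its k nearest reference vectors.
    import numpy as np

    def knn_impute(target_features, ref_features, ref_attribute, k=5):
        d = np.linalg.norm(ref_features - target_features, axis=1)
        nn = np.argsort(d)[:k]                       # indices of the k nearest references
        w = 1.0 / (d[nn] + 1e-9)                     # inverse-distance weights
        return np.sum(w * ref_attribute[nn]) / np.sum(w)

    rng = np.random.default_rng(1)
    ref_X = rng.normal(size=(200, 6))                # synthetic reference feature vectors
    ref_y = ref_X[:, 0] * 3 + rng.normal(size=200)   # synthetic forest attribute
    target = rng.normal(size=6)
    print("imputed attribute:", round(knn_impute(target, ref_X, ref_y), 2))
    ```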

  2. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method used is stochastic inversion, integrated with seismic multi-attribute analysis by applying a Probabilistic Neural Network (PNN). The stochastic method is used to predict the probability of sandstone by mapping the impedance over 50 realizations, which yields a good probability estimate. Analysis of the stochastic seismic inversion is more interpretive because it directly gives the value of the property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion provides more diverse uncertainty, so the probability values will be close to the actual values. The produced AI is then used as an input to the multi-attribute analysis, which is applied to predict the gamma ray, density and porosity logs. To select the number of attributes that are used, a stepwise regression algorithm is applied. The resulting attributes are used in the PNN process. The PNN method is chosen because it has the best correlation among the neural network methods considered. Finally, we interpret the products of the multi-attribute analysis, in the form of a pseudo-gamma-ray volume, a density volume and a pseudo-porosity volume, to delineate the reservoir distribution. Our interpretation shows that the structural trap is identified in the southeastern part of the study area, along the anticline.

  3. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinina, Elena Arkadievna; Samsa, Michael

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, a website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the previous studies.

  4. New Tools and Methods for Assessing Risk-Management Strategies

    DTIC Science & Technology

    2004-03-01

    Theories to evaluate the risks and benefits of various acquisition alternatives and allowed researchers to monitor the process students used to make a...revealed distinct risk-management strategies. SUBJECT TERMS: risk management, acquisition process, expected value theory, multi-attribute utility theory...Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed us to monitor the process subjects used to arrive at

  5. Comparison of potential method in analytic hierarchy process for multi-attribute of catering service companies

    NASA Astrophysics Data System (ADS)

    Mamat, Siti Salwana; Ahmad, Tahir; Awang, Siti Rahmah

    2017-08-01

    Analytic Hierarchy Process (AHP) is a method used in structuring, measuring and synthesizing criteria, in particular for ranking multiple criteria in decision-making problems. The Potential Method, on the other hand, is a ranking procedure that utilizes a preference graph ς(V, A). Two nodes are adjacent if they are compared in a pairwise comparison, whereby the assigned arc is oriented towards the more preferred node. In this paper, the Potential Method is used to solve a catering service selection problem. The result obtained with the Potential Method is compared with that of Extent Analysis. The Potential Method is found to produce the same ranking as Extent Analysis in AHP.

  6. Decision analysis for a data collection system of patient-controlled analgesia with a multi-attribute utility model.

    PubMed

    Lee, I-Jung; Huang, Shih-Yu; Tsou, Mei-Yung; Chan, Kwok-Hon; Chang, Kuang-Yi

    2010-10-01

    Data collection systems are very important for the practice of patient-controlled analgesia (PCA). This study aimed to evaluate 3 PCA data collection systems and select the most favorable system with the aid of multiattribute utility (MAU) theory. We developed a questionnaire with 10 items to evaluate the PCA data collection system and 1 item for overall satisfaction based on MAU theory. Three systems were compared in the questionnaire, including a paper record, an optic card reader and a personal digital assistant (PDA). A pilot study demonstrated good internal and test-retest reliability of the questionnaire. A weighted utility score combining the relative importance of individual items assigned by each participant and their responses to each question was calculated for each system. Sensitivity analyses with distinct weighting protocols were conducted to evaluate the stability of the final results. Thirty potential users of a PCA data collection system were recruited in the study. The item "easy to use" had the highest median rank and received the heaviest mean weight among all items. MAU analysis showed that the PDA system had a higher utility score than the other 2 systems. Sensitivity analyses revealed that both inverse and reciprocal weighting processes favored the PDA system. High correlations between overall satisfaction and MAU scores from miscellaneous weighting protocols suggested good predictive validity of our MAU-based questionnaire. The PDA system was selected as the most favorable PCA data collection system by the MAU analysis. The item "easy to use" was the most important attribute of the PCA data collection system. MAU theory can evaluate alternatives by taking into account the individual preferences of stakeholders and aid in better decision-making. Copyright © 2010 Elsevier. Published by Elsevier B.V. All rights reserved.
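
    The MAU scoring plus a rank-based weighting sensitivity check can be sketched as below. The item names, importance ranks, per-system response scores, and the two weighting schemes (inverse rank and reciprocal rank) are assumptions for illustration, not the study's questionnaire data.

    ```python
    # MAU scoring with a weighting sensitivity check. All item names, ranks and
    # response scores are hypothetical placeholders.
    import numpy as np

    items = ["easy to use", "data completeness", "speed", "cost"]
    ranks = np.array([1, 2, 3, 4])                    # 1 = most important item
    responses = {                                     # hypothetical 1-5 ratings per item
        "paper":      np.array([3.0, 3.5, 2.5, 4.5]),
        "optic card": np.array([3.5, 4.0, 3.0, 3.0]),
        "PDA":        np.array([4.5, 4.0, 4.0, 3.0]),
    }

    def mau(weights, resp):
        w = weights / weights.sum()                   # normalize weights
        return float(w @ resp)                        # weighted utility score

    schemes = {
        "inverse rank":    (len(ranks) - ranks + 1).astype(float),   # 4, 3, 2, 1
        "reciprocal rank": 1.0 / ranks,                              # 1, 1/2, 1/3, 1/4
    }
    for name, w in schemes.items():
        scores = {system: round(mau(w, r), 2) for system, r in responses.items()}
        print(name, scores)
    ```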

  7. Multi-attribute Regret-Based Dynamic Pricing

    NASA Astrophysics Data System (ADS)

    Jumadinova, Janyl; Dasgupta, Prithviraj

    In this paper, we consider the problem of dynamic pricing by a set of competing sellers in an information economy where buyers differentiate products along multiple attributes, and buyer preferences can change temporally. Previous research in this area has either focused on dynamic pricing along a limited number of (e.g. binary) attributes, or, assumes that each seller has access to private information such as preference distribution of buyers, and profit/price information of other sellers. However, in real information markets, private information about buyers and sellers cannot be assumed to be available a priori. Moreover, due to the competition between sellers, each seller faces a tradeoff between accuracy and rapidity of the pricing mechanism. In this paper, we describe a multi-attribute dynamic pricing algorithm based on minimax regret that can be used by a seller's agent called a pricebot, to maximize the seller's utility. Our simulation results show that the minimax regret based dynamic pricing algorithm performs significantly better than other algorithms for rapidly and dynamically tracking consumer attributes without using any private information from either buyers or sellers.
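
    The minimax-regret selection step described here can be illustrated with a small payoff table: for each candidate price, compute the regret against the best achievable payoff in each possible buyer-preference state, then choose the price with the smallest worst-case regret. The payoff numbers below are synthetic.

    ```python
    # Minimax-regret price selection sketch. Rows: candidate prices; columns:
    # possible buyer-preference states; entries: estimated payoffs (synthetic).
    import numpy as np

    payoff = np.array([
        [8.0, 3.0, 1.0],   # low price
        [6.0, 6.0, 2.0],   # medium price
        [2.0, 5.0, 7.0],   # high price
    ])

    best_per_state = payoff.max(axis=0)       # best achievable payoff in each state
    regret = best_per_state - payoff          # regret matrix
    worst_regret = regret.max(axis=1)         # worst-case regret for each price
    choice = int(np.argmin(worst_regret))     # minimax-regret price
    print("regret matrix:\n", regret)
    print("chosen price index:", choice)
    ```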

  8. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    NASA Astrophysics Data System (ADS)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.

  9. CPTAC Develops LinkedOmics – Public Web Portal to Analyze Multi-Omics Data Within and Across Cancer Types | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Multi-omics analysis has grown in popularity among biomedical researchers given the comprehensive characterization of thousands of molecular attributes in addition to clinical attributes. Several data portals have been devised to make these datasets directly available to the cancer research community. However, none of the existing data portals allow systematic exploration and interpretation of the complex relationships between the vast amount of clinical and molecular attributes. CPTAC investigator Dr.

  10. The Influence of Game Design on the Collaborative Problem Solving Process: A Cross-Case Study of Multi-Player Collaborative Gameplay Analysis

    ERIC Educational Resources Information Center

    Yildirim, Nilay

    2013-01-01

    This cross-case study examines the relationships between game design attributes and collaborative problem solving process in the context of multi-player video games. The following game design attributes: sensory stimuli elements, level of challenge, and presentation of game goals and rules were examined to determine their influence on game…

  11. LinkedOmics: analyzing multi-omics data within and across 32 cancer types.

    PubMed

    Vasaikar, Suhas V; Straub, Peter; Wang, Jing; Zhang, Bing

    2018-01-04

    The LinkedOmics database contains multi-omics data and clinical data for 32 cancer types and a total of 11 158 patients from The Cancer Genome Atlas (TCGA) project. It is also the first multi-omics database that integrates mass spectrometry (MS)-based global proteomics data generated by the Clinical Proteomic Tumor Analysis Consortium (CPTAC) on selected TCGA tumor samples. In total, LinkedOmics has more than a billion data points. To allow comprehensive analysis of these data, we developed three analysis modules in the LinkedOmics web application. The LinkFinder module allows flexible exploration of associations between a molecular or clinical attribute of interest and all other attributes, providing the opportunity to analyze and visualize associations between billions of attribute pairs for each cancer cohort. The LinkCompare module enables easy comparison of the associations identified by LinkFinder, which is particularly useful in multi-omics and pan-cancer analyses. The LinkInterpreter module transforms identified associations into biological understanding through pathway and network analysis. Using five case studies, we demonstrate that LinkedOmics provides a unique platform for biologists and clinicians to access, analyze and compare cancer multi-omics data within and across tumor types. LinkedOmics is freely available at http://www.linkedomics.org. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Evaluating alcoholics anonymous sponsor attributes using conjoint analysis.

    PubMed

    Stevens, Edward B; Jason, Leonard A

    2015-12-01

    Alcoholics Anonymous (AA) considers sponsorship an important element of the AA program, especially in early recovery. 225 adult individuals who had experience as either a sponsor, sponsee, or both, participated in a hypothetical sponsor ranking exercise where five attributes were varied across three levels. Conjoint analysis was used to compute part-worth utility of the attributes and their levels for experience, knowledge, availability, confidentiality, and goal-setting. Differences in utilities by attribute were found where confidentiality had the greatest overall possible impact on utility and sponsor knowledge had the least. These findings suggest qualitative differences in sponsors may impact their effectiveness. Future research on AA should continue to investigate sponsor influence on an individual's overall recovery trajectory. Copyright © 2015 Elsevier Ltd. All rights reserved.
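
    As a rough illustration of how conjoint part-worth utilities are recovered, the sketch below regresses preference scores on dummy-coded attribute levels. The two attributes, their three levels, the nine profiles, and the scores are all synthetic stand-ins, not the study's five-attribute design.

    ```python
    # Part-worth estimation sketch: regress preference scores on dummy-coded
    # attribute levels. Attributes, profiles, and scores are synthetic.
    import numpy as np

    profiles = [(c, a) for c in range(3) for a in range(3)]     # 9 hypothetical profiles
    rng = np.random.default_rng(2)
    true_pw = {"conf": np.array([0.0, 1.0, 2.5]), "avail": np.array([0.0, 0.5, 1.0])}
    scores = np.array([true_pw["conf"][c] + true_pw["avail"][a] for c, a in profiles])
    scores += rng.normal(0, 0.2, len(scores))                    # noisy preference scores

    def dummy(level, n_levels):                                  # level 0 is the reference
        row = np.zeros(n_levels - 1)
        if level > 0:
            row[level - 1] = 1.0
        return row

    X = np.array([np.concatenate(([1.0], dummy(c, 3), dummy(a, 3))) for c, a in profiles])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    print("estimated part-worths (vs. reference level):", np.round(beta[1:], 2))
    ```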

  13. Attaining insight into interactions between hydrologic model parameters and geophysical attributes for national-scale model parameter estimation

    NASA Astrophysics Data System (ADS)

    Mizukami, N.; Clark, M. P.; Newman, A. J.; Wood, A.; Gutmann, E. D.

    2017-12-01

    Estimating spatially distributed model parameters is a grand challenge for large-domain hydrologic modeling, especially in the context of hydrologic model applications such as streamflow forecasting. Multi-scale Parameter Regionalization (MPR) is a promising technique that accounts for the effects of fine-scale geophysical attributes (e.g., soil texture, land cover, topography, climate) on model parameters and for nonlinear scaling effects on model parameters. MPR computes model parameters with transfer functions (TFs) that relate geophysical attributes to model parameters at the native input data resolution and then scales them using scaling functions to the spatial resolution of the model implementation. One of the biggest challenges in the use of MPR is identification of TFs for each model parameter: both functional forms and geophysical predictors. TFs used to estimate the parameters of hydrologic models typically rely on previous studies or are derived in an ad-hoc, heuristic manner, potentially not utilizing the maximum information content contained in the geophysical attributes for optimal parameter identification. Thus, it is necessary to first uncover relationships among geophysical attributes, model parameters, and hydrologic processes (i.e., hydrologic signatures) to obtain insight into which, and to what extent, geophysical attributes are related to model parameters. We perform multivariate statistical analysis on a large-sample catchment data set including various geophysical attributes as well as constrained VIC model parameters at 671 unimpaired basins over the CONUS. We first calibrate the VIC model at each catchment to obtain constrained parameter sets. Additionally, parameter sets sampled during the calibration process are used for sensitivity analysis using various hydrologic signatures as objectives to understand the relationships among geophysical attributes, parameters, and hydrologic processes.
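
    The MPR idea of "transfer function at native resolution, then upscaling" can be sketched as follows. The attribute (sand fraction), the linear transfer-function form, its coefficients, and the arithmetic-mean scaling operator are illustrative assumptions, not calibrated choices from the abstract.

    ```python
    # MPR sketch: apply a transfer function to a fine-scale geophysical attribute,
    # then upscale the resulting parameter field to the model grid. The functional
    # form, coefficients, and scaling operator are illustrative only.
    import numpy as np

    rng = np.random.default_rng(3)
    sand_fraction = rng.uniform(0.1, 0.9, size=(100, 100))   # fine-scale attribute field

    def transfer_function(sand, a=0.2, b=0.6):
        """Map sand fraction to a hypothetical infiltration parameter."""
        return a + b * sand

    fine_param = transfer_function(sand_fraction)

    def upscale(field, block=10):
        """Arithmetic-mean scaling of a fine-scale field to coarse model cells."""
        n = field.shape[0] // block
        return field[:n*block, :n*block].reshape(n, block, n, block).mean(axis=(1, 3))

    coarse_param = upscale(fine_param)
    print("coarse parameter grid:", coarse_param.shape)
    ```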

  14. Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.

    PubMed

    Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor

    2011-09-01

    Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology to face available technical information and stakeholder values to support decisions in many fields and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area, decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there is a significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of the specific methods and tools varies in different application areas and geographic regions, our review of a few papers where several methods were used in parallel with the same problem indicates that recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.

  15. Application of Molecular Typing Results in Source Attribution Models: The Case of Multiple Locus Variable Number Tandem Repeat Analysis (MLVA) of Salmonella Isolates Obtained from Integrated Surveillance in Denmark.

    PubMed

    de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine

    2016-03-01

    Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for the Salmonella source attribution, and assess the utility of the results for the food safety decisionmakers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied often will be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.

  16. [A multi-measure analysis of the similarity, attraction, and compromise effects in multi-attribute decision making].

    PubMed

    Tsuzuki, Takashi; Matsui, Hiroshi; Kikuchi, Manabu

    2012-12-01

    In multi-attribute decision making, the similarity, attraction, and compromise effects warrant specific investigation as they cause violations of principles in rational choice. In order to investigate these three effects simultaneously, we assigned 145 undergraduates to three context effect conditions. We requested them to solve the same 20 hypothetical purchase problems, each of which had three alternatives described along two attributes. We measured their choices, confidence ratings, and response times. We found that manipulating the third alternative had significant context effects for choice proportions and confidence ratings in all three conditions. Furthermore, the attraction effect was the most prominent with regard to choice proportions. In the compromise effect condition, although the choice proportion of the third alternative was high, the confidence rating was low and the response time was long. These results indicate that the relationship between choice proportions and confidence ratings requires further theoretical investigation. They also suggest that a combination of experimental and modeling studies is imperative to reveal the mechanisms underlying the context effects in multi-attribute, multi-alternative decision making.

  17. Latin American Clinical Epidemiology Network Series - Paper 4: Economic evaluation of Kangaroo Mother Care: cost utility analysis of results from a randomized controlled trial conducted in Bogotá.

    PubMed

    Ruiz, Juan Gabriel; Charpak, Nathalie; Castillo, Mario; Bernal, Astrid; Ríos, John; Trujillo, Tammy; Córdoba, María Adelaida

    2017-06-01

    Although kangaroo mother care (KMC) has been shown to be safe and effective in randomized controlled trials (RCTs), there are no published complete economic evaluations including the three components of the full intervention. A cost-utility analysis was performed on the results of an RCT conducted in Bogotá, Colombia, between 1993 and 1996. Hospital and ambulatory costs were estimated by microcosting in a sample of preterm infants from a University Hospital in Bogotá in 2011 and at a KMC clinic in the same period. Utility scores were assigned by experts by means of (1) directly ordering and scoring discrete health states and (2) constructing a multi-attribute utility function. Ninety-five percent confidence intervals (CIs) for the incremental cost-utility ratios (ICURs) were computed by Fieller's theorem method. One-way sensitivity analysis on price estimates for valuing costs was performed. The ICUR at 1 year of corrected age was $ -1,546 per extra quality-adjusted life year gained using the KMC method (95% CI $ -7,963 to $ 4,910). In Bogotá, the use of KMC is dominant: more effective and cost-saving. Although results from an economic analysis should not be extrapolated to different systems and communities, this dominant result suggests that KMC could be cost-effective in similar low- and middle-income country settings. Copyright © 2016 Elsevier Inc. All rights reserved.
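
    The incremental cost-utility ratio arithmetic behind a result like this is simple to show. The cost and QALY figures below are invented placeholders chosen only so that the intervention is cheaper and more effective (a "dominant" result), mirroring the qualitative finding; they are not the trial's numbers.

    ```python
    # ICUR arithmetic sketch with invented cost and QALY figures.
    cost_kmc, cost_control = 4_200.0, 4_700.0      # hypothetical mean costs (USD)
    qaly_kmc, qaly_control = 0.820, 0.795          # hypothetical mean QALYs at 1 year

    delta_cost = cost_kmc - cost_control           # negative -> cost saving
    delta_qaly = qaly_kmc - qaly_control           # positive -> more effective
    icur = delta_cost / delta_qaly                 # incremental cost-utility ratio
    print(f"ICUR: {icur:,.0f} USD per extra QALY (cheaper and more QALYs = dominant)")
    ```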

  18. Delineating chalk sand distribution of Ekofisk formation using probabilistic neural network (PNN) and stepwise regression (SWR): Case study Danish North Sea field

    NASA Astrophysics Data System (ADS)

    Haris, A.; Nafian, M.; Riyanto, A.

    2017-07-01

    The Danish North Sea fields consist of several formations (Ekofisk, Tor, and Cromer Knoll) that were deposited from the Paleocene to the Miocene. In this study, the integration of seismic and well log data sets is carried out to determine the chalk sand distribution in the Danish North Sea field. The integration of seismic and well log data sets is performed by using seismic inversion analysis and seismic multi-attribute analysis. The seismic inversion algorithm used to derive acoustic impedance (AI) is a model-based technique. The derived AI is then used as an external attribute for the input of the multi-attribute analysis. Moreover, the multi-attribute analysis is used to generate linear and non-linear transformations among well log properties. In the case of the linear model, the transformation is selected by weighted step-wise linear regression (SWR), while the non-linear model is obtained by using probabilistic neural networks (PNN). The porosity estimated by the PNN is better suited to the well log data than the SWR results. This result can be understood since the PNN performs non-linear regression, so that the relationship between the attribute data and the predicted log data can be optimized. The distribution of chalk sand has been successfully identified and characterized by porosity values ranging from 23% up to 30%.
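
    The PNN used here for porosity prediction from seismic attributes is closely related to kernel (Nadaraya-Watson / GRNN-style) regression; the sketch below shows that simpler relative, not the exact PNN formulation of the study. The training attributes, porosity values, and smoothing width sigma are synthetic.

    ```python
    # GRNN-style kernel regression sketch, a close relative of the PNN used for
    # porosity prediction from seismic attributes. All data and sigma are synthetic.
    import numpy as np

    rng = np.random.default_rng(4)
    train_attrs = rng.normal(size=(300, 4))              # synthetic seismic attributes at wells
    train_poro = 0.25 + 0.03 * train_attrs[:, 0] + rng.normal(0, 0.005, 300)

    def grnn_predict(x, X, y, sigma=0.5):
        d2 = np.sum((X - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))             # Gaussian kernel weights
        return float(np.sum(w * y) / np.sum(w))          # kernel-weighted average

    new_trace_attrs = rng.normal(size=4)
    print("predicted porosity:", round(grnn_predict(new_trace_attrs, train_attrs, train_poro), 3))
    ```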

  19. Decision Making for Pap Testing among Pacific Islander Women

    ERIC Educational Resources Information Center

    Weiss, Jie W.; Mouttapa, Michele; Sablan-Santos, Lola; DeGuzman Lacsamana, Jasmine; Quitugua, Lourdes; Park Tanjasiri, Sora

    2016-01-01

    This study employed a Multi-Attribute Utility (MAU) model to examine the Pap test decision-making process among Pacific Islanders (PI) residing in Southern California. A total of 585 PI women were recruited through social networks from Samoan and Tongan churches, and Chamorro family clans. A questionnaire assessed Pap test knowledge, beliefs and…

  20. Temporal Drivers of Liking Based on Functional Data Analysis and Non-Additive Models for Multi-Attribute Time-Intensity Data of Fruit Chews.

    PubMed

    Kuesten, Carla; Bi, Jian

    2018-06-03

    Conventional drivers-of-liking analysis was extended with a time dimension into temporal drivers of liking (TDOL), based on functional data analysis methodology and non-additive models for multiple-attribute time-intensity (MATI) data. The non-additive models, which consider both direct effects and interaction effects of attributes on consumer overall liking, include the Choquet integral with a fuzzy measure from multi-criteria decision-making, and linear regression based on variance decomposition. Dynamics of TDOL, i.e., the derivatives of the relative-importance functional curves, were also explored. Well-established R packages 'fda', 'kappalab' and 'relaimpo' were used in the paper for developing TDOL. Applied use of these methods shows that the relative importance of MATI curves offers insights for understanding the temporal aspects of consumer liking for fruit chews.
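
    A minimal sketch of the discrete Choquet integral mentioned above is given below. The attribute names, scores, and fuzzy-measure values are invented for illustration; the study itself used the R package 'kappalab' for this step.

    ```python
    # Discrete Choquet integral of attribute scores with respect to a fuzzy measure.
    # Attribute names, scores, and measure values are illustrative only.
    attrs = ["sweet", "sour", "chewy"]
    scores = {"sweet": 0.8, "sour": 0.4, "chewy": 0.6}

    # Fuzzy measure mu on subsets (monotone, mu(empty) = 0, mu(full set) = 1).
    mu = {frozenset(): 0.0, frozenset(attrs): 1.0,
          frozenset(["sweet"]): 0.5, frozenset(["sour"]): 0.2, frozenset(["chewy"]): 0.3,
          frozenset(["sweet", "sour"]): 0.6, frozenset(["sweet", "chewy"]): 0.9,
          frozenset(["sour", "chewy"]): 0.4}

    def choquet(scores, mu):
        # Standard discrete Choquet formula: sort attributes by increasing score,
        # weight each increment by the measure of the "at least this intense" set.
        order = sorted(scores, key=scores.get)
        total, prev = 0.0, 0.0
        for i, a in enumerate(order):
            coalition = frozenset(order[i:])          # attributes with score >= current
            total += (scores[a] - prev) * mu[coalition]
            prev = scores[a]
        return total

    print("Choquet overall liking:", round(choquet(scores, mu), 3))
    ```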

  1. m2-ABKS: Attribute-Based Multi-Keyword Search over Encrypted Personal Health Records in Multi-Owner Setting.

    PubMed

    Miao, Yinbin; Ma, Jianfeng; Liu, Ximeng; Wei, Fushan; Liu, Zhiquan; Wang, Xu An

    2016-11-01

    Online personal health record (PHR) systems are increasingly inclined to shift data storage and search operations to the cloud server so as to enjoy elastic resources and lessen the computational burden. As multiple patients' data is always stored in the cloud server simultaneously, it is a challenge to guarantee the confidentiality of PHR data and allow data users to search encrypted data in an efficient and privacy-preserving way. To this end, we design a secure cryptographic primitive called attribute-based multi-keyword search over encrypted personal health records in a multi-owner setting, to support both fine-grained access control and multi-keyword search via Ciphertext-Policy Attribute-Based Encryption. Formal security analysis proves our scheme is selectively secure against chosen-keyword attack. As a further contribution, we conduct empirical experiments over a real-world dataset to show its feasibility and practicality in a broad range of actual scenarios without incurring additional computational burden.

  2. The Multi-Attribute Group Decision-Making Method Based on Interval Grey Trapezoid Fuzzy Linguistic Variables.

    PubMed

    Yin, Kedong; Wang, Pengyu; Li, Xuemei

    2017-12-13

    With respect to multi-attribute group decision-making (MAGDM) problems, where attribute values take the form of interval grey trapezoid fuzzy linguistic variables (IGTFLVs) and the weights (including expert and attribute weight) are unknown, improved grey relational MAGDM methods are proposed. First, the concept of IGTFLV, the operational rules, the distance between IGTFLVs, and the projection formula between the two IGTFLV vectors are defined. Second, the expert weights are determined by using the maximum proximity method based on the projection values between the IGTFLV vectors. The attribute weights are determined by the maximum deviation method and the priorities of alternatives are determined by improved grey relational analysis. Finally, an example is given to prove the effectiveness of the proposed method and the flexibility of IGTFLV.

  3. Multi-Response Optimization of WEDM Process Parameters Using Taguchi Based Desirability Function Analysis

    NASA Astrophysics Data System (ADS)

    Majumder, Himadri; Maity, Kalipada

    2018-03-01

    Shape memory alloys have a unique capability to return to their original shape after physical deformation on application of heat, thermo-mechanical load, or magnetic load. In this experimental investigation, desirability function analysis (DFA), a multi-attribute decision-making technique, was utilized to find the optimum input parameter setting during wire electrical discharge machining (WEDM) of Ni-Ti shape memory alloy. Four critical machining parameters, namely pulse on time (TON), pulse off time (TOFF), wire feed (WF) and wire tension (WT), were taken as machining inputs for the experiments to optimize three interconnected responses: cutting speed, kerf width, and surface roughness. The input parameter combination TON = 120 μs, TOFF = 55 μs, WF = 3 m/min and WT = 8 kg-f was found to produce the optimum results. The optimum process parameters for each desired response were also attained using Taguchi’s signal-to-noise ratio. A confirmation test was performed to validate the optimum machining parameter combination, affirming that DFA is a competent approach for selecting optimum input parameters for the ideal response quality in WEDM of Ni-Ti shape memory alloy.
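
    A minimal sketch of the desirability-function idea (Derringer-Suich style, a generic stand-in for the Taguchi-based DFA used here): each response is mapped to a 0-1 desirability and the settings are ranked by the geometric-mean composite; the response values and bounds below are hypothetical.

```python
import numpy as np

def d_larger_is_better(y, lo, hi, r=1.0):
    """Desirability for a response to maximize (e.g. cutting speed)."""
    return np.clip((y - lo) / (hi - lo), 0, 1) ** r

def d_smaller_is_better(y, lo, hi, r=1.0):
    """Desirability for a response to minimize (e.g. kerf width, roughness)."""
    return np.clip((hi - y) / (hi - lo), 0, 1) ** r

# Hypothetical responses for three parameter settings: [cutting speed, kerf, Ra]
responses = np.array([[2.1, 0.30, 1.9],
                      [2.6, 0.34, 2.2],
                      [2.4, 0.28, 1.7]])

d1 = d_larger_is_better(responses[:, 0], lo=1.5, hi=3.0)
d2 = d_smaller_is_better(responses[:, 1], lo=0.25, hi=0.40)
d3 = d_smaller_is_better(responses[:, 2], lo=1.5, hi=2.5)

composite = (d1 * d2 * d3) ** (1 / 3)      # geometric mean of individual desirabilities
print(composite.round(3), composite.argmax())   # the setting with the highest composite wins
```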

  4. METHODS FOR MULTI-SPATIAL SCALE CHARACTERIZATION OF RIPARIAN CORRIDORS

    EPA Science Inventory

    This paper describes the application of aerial photography and GIS technology to develop flexible and transferable methods for multi-spatial scale characterization and analysis of riparian corridors. Relationships between structural attributes of riparian corridors and indicator...

  5. Zoning of an agricultural field using a fuzzy indicator model in combination with tool for multi-attributed decision-making

    USDA-ARS?s Scientific Manuscript database

    Zoning of agricultural fields is an important task for utilization of precision farming technology. This paper extends previously published work entitled “Zoning of an agricultural field using a fuzzy indicator model” to a general case where there is disagreement between groups of managers or expert...

  6. Assortativity Patterns in Multi-dimensional Inter-organizational Networks: A Case Study of the Humanitarian Relief Sector

    NASA Astrophysics Data System (ADS)

    Zhao, Kang; Ngamassi, Louis-Marie; Yen, John; Maitland, Carleen; Tapia, Andrea

    We use computational tools to study assortativity patterns in multi-dimensional inter-organizational networks on the basis of different node attributes. In a case study of an inter-organizational network in the humanitarian relief sector, we consider not only macro-level topological patterns but also assortativity on the basis of micro-level organizational attributes. Unlike assortative social networks, this inter-organizational network exhibits disassortative or random patterns on three node attributes. We believe organizations' pursuit of complementarity is one of the main reasons for these special patterns. Our analysis also provides insights into how to promote collaboration among humanitarian relief organizations.
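
    For readers who want to reproduce the basic measurement, attribute assortativity can be computed directly with networkx; the toy graph and the "scope" attribute below are hypothetical, not the study's data.

```python
import networkx as nx

# Toy inter-organizational collaboration network with a categorical node attribute
G = nx.Graph()
G.add_nodes_from([
    (0, {"scope": "international"}), (1, {"scope": "national"}),
    (2, {"scope": "international"}), (3, {"scope": "local"}),
    (4, {"scope": "national"}),
])
G.add_edges_from([(0, 1), (0, 3), (1, 2), (2, 4), (3, 4)])

# Positive values indicate assortative mixing on the attribute,
# negative values disassortative mixing, and values near zero random mixing.
r = nx.attribute_assortativity_coefficient(G, "scope")
print(round(r, 3))
```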

  7. Psychometric evaluation of a multi-dimensional measure of satisfaction with behavioral interventions.

    PubMed

    Sidani, Souraya; Epstein, Dana R; Fox, Mary

    2017-10-01

    Treatment satisfaction is recognized as an essential aspect in the evaluation of an intervention's effectiveness, but there is no measure that provides for its comprehensive assessment with regard to behavioral interventions. Informed by a conceptualization generated from a literature review, we developed a measure that covers several domains of satisfaction with behavioral interventions. In this paper, we briefly review its conceptualization and describe the Multi-Dimensional Treatment Satisfaction Measure (MDTSM) subscales. Satisfaction refers to the appraisal of the treatment's process and outcome attributes. The MDTSM has 11 subscales assessing treatment process and outcome attributes: treatment components' suitability and utility, attitude toward treatment, desire for continued treatment use, therapist competence and interpersonal style, format and dose, perceived benefits regarding the health problem and everyday functioning, discomfort, and attribution of outcomes to treatment. The MDTSM was completed within 1 week following treatment completion by persons (N = 213) in the intervention group of a large trial of a multi-component behavioral intervention for insomnia. The MDTSM's subscales demonstrated internal consistency reliability (α: .65 - .93) and validity (correlated with self-reported adherence and perceived insomnia severity at post-test). The MDTSM subscales can be used to assess satisfaction with behavioral interventions and point to aspects of treatments that are viewed favorably or unfavorably. © 2017 Wiley Periodicals, Inc.

  8. Use of multiattribute utility theory for formulary management in a health system.

    PubMed

    Chung, Seonyoung; Kim, Sooyon; Kim, Jeongmee; Sohn, Kieho

    2010-01-15

    The application, utility, and flexibility of the multiattribute utility theory (MAUT) when used as a formulary decision methodology in a Korean medical center were evaluated. A drug analysis model using MAUT, consisting of 10 steps, was designed for two drug classes: dihydropyridine calcium channel blockers (CCBs) and angiotensin II receptor blockers (ARBs). These two drug classes contain the most diverse agents among cardiovascular drugs on Samsung Medical Center's drug formulary. The attributes identified for inclusion in the drug analysis model were effectiveness, safety, patient convenience, and cost, with relative weights of 50%, 30%, 10%, and 10%, respectively. Factors were incorporated into the model to quantify the contribution of each attribute. For each factor, a utility scale of 0-100 was established, and the total utility score for each alternative was calculated. An attempt was made to make the model adaptable to changing health care and regulatory circumstances. The analysis revealed amlodipine besylate to be the alternative with the highest total utility score among the dihydropyridine CCBs, while barnidipine hydrochloride had the lowest score. For ARBs, losartan potassium had the greatest total utility score, while olmesartan medoxomil had the lowest. A drug analysis model based on the MAUT was successfully developed and used in making formulary decisions on dihydropyridine CCBs and ARBs for a Korean health system. The model offers sufficient utility and flexibility in weighing a drug's attributes and can be used as an alternative decision-making tool for formulary management in health systems.
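
    The additive scoring at the core of this kind of MAUT model is simple enough to sketch directly: each alternative gets a 0-100 utility on each attribute, scores are multiplied by the attribute weights quoted in the record (50/30/10/10) and summed, and the highest total wins. The drug names and scores below are hypothetical.

```python
# Hypothetical 0-100 utility scores per attribute for each alternative drug
weights = {"effectiveness": 0.50, "safety": 0.30, "convenience": 0.10, "cost": 0.10}
alternatives = {
    "drug_A": {"effectiveness": 85, "safety": 70, "convenience": 90, "cost": 40},
    "drug_B": {"effectiveness": 75, "safety": 80, "convenience": 60, "cost": 90},
    "drug_C": {"effectiveness": 65, "safety": 75, "convenience": 70, "cost": 95},
}

def total_utility(scores, weights):
    """Weighted additive multi-attribute utility."""
    return sum(weights[a] * scores[a] for a in weights)

for drug in sorted(alternatives, key=lambda d: total_utility(alternatives[d], weights), reverse=True):
    print(drug, round(total_utility(alternatives[drug], weights), 1))
```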

  9. Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.

    2016-12-01

    Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing the associated uncertainties. Spatial analyses of subsurface natural and engineered systems, big data and otherwise, are based on variable-resolution, discontinuous, and often point-driven data used to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and analyze large geospatial data within these custom analytical algorithms through the development of Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom "Big Data" geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementing data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.

  10. Multi-Band Received Signal Strength Fingerprinting Based Indoor Location System

    NASA Astrophysics Data System (ADS)

    Sertthin, Chinnapat; Fujii, Takeo; Ohtsuki, Tomoaki; Nakagawa, Masao

    This paper proposes a new multi-band received signal strength (MRSS) fingerprinting based indoor location system, which adds frequency diversity to the conventional single-band received signal strength (RSS) fingerprinting based indoor location system. The impact of frequency diversity on positioning accuracy is analyzed. The effectiveness of the proposed system is demonstrated experimentally in a non-line-of-sight (NLOS) environment over an area of 103 m2 at Yagami Campus, Keio University. WLAN access points that simultaneously transmit dual-band signals at 2.4 and 5.2 GHz are utilized as transmitters, and a dual-band WLAN receiver is utilized as the receiver. Signal distances calculated with both Manhattan and Euclidean metrics were classified by a K-Nearest Neighbor (KNN) classifier to illustrate the performance of the proposed system. The results confirm that the frequency-diversity attributes of the multi-band signal provide an accuracy improvement of over 50% relative to the conventional single-band system.
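
    A minimal sketch of RSS fingerprinting with a nearest-neighbor estimator, assuming a hypothetical offline database of dual-band fingerprints (two bands times four access points concatenated into one vector); position is estimated as the centroid of the k nearest fingerprints under either Manhattan or Euclidean distance, which is only a simplified stand-in for the system described.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical offline fingerprint database: RSS (dBm) from 4 APs x 2 bands = 8 values
# per reference point, plus the (x, y) position of that reference point.
n_ref = 100
fingerprints = rng.uniform(-90, -40, size=(n_ref, 8))
positions = rng.uniform(0, 10, size=(n_ref, 2))

def locate(rss, k=3, metric="euclidean"):
    """Estimate position as the centroid of the k nearest fingerprints."""
    diff = fingerprints - rss
    if metric == "manhattan":
        dist = np.abs(diff).sum(axis=1)
    else:
        dist = np.sqrt((diff ** 2).sum(axis=1))
    nearest = np.argsort(dist)[:k]
    return positions[nearest].mean(axis=0)

online_rss = rng.uniform(-90, -40, size=8)     # measurement at an unknown location
print(locate(online_rss, k=3, metric="manhattan"))
```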

  11. Australian health-related quality of life population norms derived from the SF-6D.

    PubMed

    Norman, Richard; Church, Jody; van den Berg, Bernard; Goodall, Stephen

    2013-02-01

    To investigate population health-related quality of life norms in an Australian general sample by age, gender, BMI, education and socioeconomic status. The SF-36 was included in the 2009/10 wave of the Household, Income and Labour Dynamics in Australia (HILDA) survey (n=17,630 individuals across 7,234 households), and converted into SF-6D utility scores. Trends across the various population subgroups were investigated employing population weights to ensure a balanced panel, and were all sub-stratified by gender. SF-6D scores decline with age beyond 40 years, with decreasing education and with higher levels of socioeconomic disadvantage. Scores were also lower at very low and very high BMI levels. Males reported higher SF-6D scores than females across most analyses. This study reports Australian population utility data measured using the SF-6D, based on a nationally representative sample. These results can be used in a range of policy settings such as cost-utility analysis or exploration of health-related inequality. In general, the patterns are similar to those reported using other multi-attribute utility instruments and in different countries. © 2013 The Authors. ANZJPH © 2013 Public Health Association of Australia.

  12. A Survey Version of Full-Profile Conjoint Analysis.

    ERIC Educational Resources Information Center

    Chrzan, Keith

    Two studies were conducted to test the viability of a survey version of full-profile conjoint analysis. Conjoint analysis describes a variety of analytic techniques for measuring subjects' "utilities," or preferences for the individual attributes or levels of attributes that constitute objects under study. The first study compared the…

  13. Decision-making in irrigation networks: Selecting appropriate canal structures using multi-attribute decision analysis.

    PubMed

    Hosseinzade, Zeinab; Pagsuyoin, Sheree A; Ponnambalam, Kumaraswamy; Monem, Mohammad J

    2017-12-01

    The stiff competition for water between agricultural and non-agricultural production sectors makes effective management of irrigation networks in farms a necessity. However, the process of selecting flow control structures in irrigation networks is highly complex and involves different levels of decision makers. In this paper, we apply multi-attribute decision making (MADM) methodology to develop a decision analysis (DA) framework for evaluating, ranking and selecting check and intake structures for irrigation canals. The DA framework consists of identifying relevant attributes for canal structures, developing a robust scoring system for alternatives, identifying a procedure for data quality control, and identifying a MADM model for the decision analysis. An application is illustrated through an analysis, for automation purposes, of the Qazvin irrigation network, one of the oldest and most complex irrigation networks in Iran. A survey questionnaire designed on the basis of the decision framework was distributed to experts, managers, and operators of the Qazvin network and to experts from the Ministry of Power in Iran. Five check structures and four intake structures were evaluated. A decision matrix was generated from the average scores collected in the survey and was subsequently solved using the TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method. To identify the most critical structure attributes for the selection process, optimal attribute weights were calculated using the entropy method. For check structures, the results show that the duckbill weir is the preferred structure while the pivot weir is the least preferred. Use of the duckbill weir can potentially address the problem with existing Amil gates, where manual intervention is required to regulate water levels during periods of flow extremes. For intake structures, the Neyrpic® gate and the constant head orifice are the most and least preferred alternatives, respectively. Advantages of the Neyrpic® gate include ease of operation and the capacity to measure discharge flows. Overall, the application to the Qazvin irrigation network demonstrates the utility of the proposed DA framework in selecting appropriate structures for regulating water flows in irrigation canals. The framework systematically aids the decision process by capturing decisions made at various levels (from individual farmers to high-level management). It can be applied to other cases where a new irrigation network is being designed, or where changes in irrigation structures need to be identified to improve flow control in existing networks. Copyright © 2017 Elsevier B.V. All rights reserved.
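
    A compact sketch of the two numerical steps named in the abstract, entropy weighting followed by TOPSIS ranking, on a made-up decision matrix (rows are candidate structures, columns are benefit-type attributes); the actual study's attributes, scores, and cost/benefit directions differ.

```python
import numpy as np

# Hypothetical decision matrix: rows = candidate structures, columns = benefit attributes
X = np.array([[7.0, 6.5, 8.0, 5.5],
              [8.5, 6.0, 6.5, 7.0],
              [6.0, 8.0, 7.0, 6.5],
              [7.5, 7.0, 6.0, 8.0]])

# Entropy weights: attributes with more dispersion across alternatives get more weight
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
w = (1 - E) / (1 - E).sum()

# TOPSIS: vector-normalize, weight, and measure closeness to the ideal solution
V = w * X / np.sqrt((X ** 2).sum(axis=0))
ideal, anti = V.max(axis=0), V.min(axis=0)     # all attributes treated as benefits here
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)
print(np.argsort(-closeness))                  # best-ranked alternative first
```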

  14. DETERMINATION OF RELATIVE IMPORTANCE OF NONPROLIFERATION FACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Metcalf

    2009-07-01

    Methodologies to determine the proliferation resistance (PR) of nuclear facilities often rely on either expert elicitation, a resource-intensive approach without easily reproducible results, or numeric evaluations, which can fail to take into account the institutional knowledge and expert experience of the nonproliferation community. In an attempt to bridge the gap and bring institutional knowledge into numeric evaluations of PR, a survey was conducted of 33 individuals to find the relative importance of a set of 62 nonproliferation factors, subsectioned into groups under the headings of Diversion, Transportation, Transformation, and Weaponization. One third of the respondents were self-described nonproliferation professionals, and the remaining two thirds were from secondary professions related to nonproliferation, such as industrial engineers or policy analysts. The factors were taken from previous work that used multi-attribute utility analysis with uniform weighting of attributes and did not include institutional knowledge. In both the expert and non-expert groups, all four headings and the majority of factors had different relative importance at a confidence of 95% (p=0.05). This analysis and survey demonstrate that institutional knowledge can be brought into numeric evaluations of PR, provided there is a sufficient investment of resources prior to the evaluation.

  15. Construction of the descriptive system for the assessment of quality of life AQoL-6D utility instrument

    PubMed Central

    2012-01-01

    Background Multi attribute utility (MAU) instruments are used to include the health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D, MAU instrument. Methods The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. Results and Discussion The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. Conclusions The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs. PMID:22507254

  16. Construction of the descriptive system for the Assessment of Quality of Life AQoL-6D utility instrument.

    PubMed

    Richardson, Jeffrey R J; Peacock, Stuart J; Hawthorne, Graeme; Iezzi, Angelo; Elsworth, Gerald; Day, Neil A

    2012-04-17

    Multi attribute utility (MAU) instruments are used to include the health related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D, MAU instrument. The AQoL program introduced the use of psychometric methods in the construction of health related MAU instruments. To develop the AQoL-6D we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling. We employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. The AQoL-6D descriptive system has good psychometric properties. They imply that the instrument has achieved construct validity and provides a sensitive description of HRQoL. This means that it may be used with confidence for measuring health related quality of life and that it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs.

  17. Body Image of Women Submitted to Breast Cancer Treatment

    PubMed

    Guedes, Thais Sousa Rodrigues; Dantas de Oliveira, Nayara Priscila; Holanda, Ayrton Martins; Reis, Mariane Albuquerque; Silva, Clécia Patrocínio da; Rocha e Silva, Bárbara Layse; Cancela, Marianna de Camargo; de Souza, Dyego Leandro Bezerra

    2018-06-25

    Background: The study of body image includes women's perception of the physical appearance of their own bodies. The objective of the present study was to determine the prevalence of body image dissatisfaction and its associated factors in women submitted to breast cancer treatment. Methods: A cross-sectional study was carried out with 103 female residents of the municipality of Natal (Northeast Brazil) who were diagnosed with breast cancer, had undergone cancer treatment for at least 12 months prior to the study, and remained under clinical monitoring. Body image was measured with the validated Body Image Scale (BIS). Socioeconomic variables and clinical history were also collected through an individual interview with each participant. Pearson's chi-squared test (or Fisher's exact test) was used for bivariate analysis, calculating prevalence ratios with 95% confidence intervals. Poisson regression with robust variance was used for multivariate analysis. The significance level was set at 0.05. Results: The prevalence of body image dissatisfaction was 74.8% (CI 65%-82%). Statistically significant associations were observed between body image and multi-professional follow-up (p=0.009) and return to employment after treatment (p=0.022). Conclusion: Women who returned to employment after cancer treatment presented more alterations in self-perception concerning their appearance. Patients who did not receive multi-professional follow-up reported a negative body image, evidencing the need for strategies that increase and improve healthcare to meet the demands of this population.

  18. Multiattribute selection of acute stroke imaging software platform for Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial.

    PubMed

    Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A

    2013-04-01

    The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical both for the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision-making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform, followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded, generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high-quality decision outcome and a rational and transparent decision process. This development contributes to the stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple-criteria decision analysis and can be used for choosing the most appropriate imaging software while ensuring both a robust decision process and robust outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.

  19. Multi-issue Agent Negotiation Based on Fairness

    NASA Astrophysics Data System (ADS)

    Zuo, Baohe; Zheng, Sue; Wu, Hong

    Agent-based e-commerce services have become a research hotspot. How to make the agent negotiation process quick and efficient is the main research direction in this area. In multi-issue models, MAUT (Multi-Attribute Utility Theory) and its derived theories usually give little consideration to the fairness of both negotiators. This work presents a general model of agent negotiation that considers the satisfaction of both negotiators via autonomous learning. The model can evaluate offers from the opponent agent based on satisfaction degree, learn online to acquire knowledge of the opponent from historical interaction instances and the current negotiation, and make concessions dynamically based on a fairness objective. By building this optimal negotiation model, the bilateral negotiation achieves higher efficiency and a fairer deal.

  20. k-RP*s: A scalable distributed data structure for high-performance multi-attribute access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Litwin, W.; Neimat, M.A.

    k-RP*s is a new data structure for scalable multicomputer files with multi-attribute (k-d) keys. We discuss the k-RP*s file evolution and search algorithms. Performance analysis shows that a k-RP*s file can be much larger and orders of magnitude faster than a traditional k-d file. The speed-up is especially important for range and partial-match searches that are often impractical with traditional k-d files. This opens up a new perspective for many applications.

  1. A Multi-Attribute-Utility-Theory Model that Minimizes Interview-Data Requirements: A Consolidation of Space Launch Decisions.

    DTIC Science & Technology

    1994-12-01

    … satellites on the ground and on orbit affects the priority given to a new launch. Table 3.9 (Launch Priorities) rates the value of a satellite's mission(s) relative to the mission(s) of other satellites; as such, the rating given may reflect an entire class of satellites … Expected Remaining Lifetime (years): assign a number between 0 and 1 that best describes the utility of a satellite …

  2. A Decision-Making Method with Grey Multi-Source Heterogeneous Data and Its Application in Green Supplier Selection

    PubMed Central

    Dang, Yaoguo; Mao, Wenxin

    2018-01-01

    In view of multi-attribute decision-making problems in which the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of the kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, and a grey relational bi-directional projection ranking method is then presented. Considering the multi-attribute multi-level decision structure and the causalities between attributes in the decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method. PMID:29510521

  3. A Decision-Making Method with Grey Multi-Source Heterogeneous Data and Its Application in Green Supplier Selection.

    PubMed

    Sun, Huifang; Dang, Yaoguo; Mao, Wenxin

    2018-03-03

    In view of multi-attribute decision-making problems in which the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of the kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, and a grey relational bi-directional projection ranking method is then presented. Considering the multi-attribute multi-level decision structure and the causalities between attributes in the decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method.
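
    A small sketch of the two whitening quantities the abstract names, under common textbook definitions (the paper's extended-grey-number versions are more general): the kernel of an interval grey number is taken here as its midpoint and the greyness degree as its width relative to the background domain; the intervals and the combined score are illustrative only.

```python
import numpy as np

# Hypothetical attribute values given as interval grey numbers [lower, upper]
intervals = np.array([[3.0, 5.0],
                      [4.0, 4.5],
                      [2.5, 6.0]])

domain = intervals.max() - intervals.min()               # background domain measure

kernel = intervals.mean(axis=1)                          # midpoint of each interval
greyness = (intervals[:, 1] - intervals[:, 0]) / domain  # width relative to the domain

# One simple whitening heuristic: prefer a larger kernel and a smaller greyness degree
score = kernel * (1 - greyness)
print(kernel, greyness.round(3), np.argsort(-score))
```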

  4. Optimizing a machine learning based glioma grading system using multi-parametric MRI histogram and texture features

    PubMed Central

    Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin

    2017-01-01

    Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of various machine learning methods in differentiating low-grade gliomas (LGGs) from high-grade gliomas (HGGs), as well as WHO grade II, III and IV gliomas, based on multi-parametric MRI images was performed in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. The influence of parameter selection on classification performance was also investigated. We found that the support vector machine (SVM) exhibited superior performance to other classifiers. By combining all tumor attributes with the synthetic minority over-sampling technique (SMOTE), the highest classification accuracy of 0.945 or 0.961 for LGG vs. HGG or grade II, III and IV gliomas, respectively, was achieved. Application of a Recursive Feature Elimination (RFE) attribute selection strategy further improved the classification accuracies. In addition, the performances of the LibSVM, SMO and IBk classifiers were influenced by key parameters such as kernel type, C, gamma and K. SVM is a promising tool for developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization. PMID:28599282

  5. Optimizing a machine learning based glioma grading system using multi-parametric MRI histogram and texture features.

    PubMed

    Zhang, Xin; Yan, Lin-Feng; Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin

    2017-07-18

    Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of various machine learning methods in differentiating low-grade gliomas (LGGs) from high-grade gliomas (HGGs), as well as WHO grade II, III and IV gliomas, based on multi-parametric MRI images was performed in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. The influence of parameter selection on classification performance was also investigated. We found that the support vector machine (SVM) exhibited superior performance to other classifiers. By combining all tumor attributes with the synthetic minority over-sampling technique (SMOTE), the highest classification accuracy of 0.945 or 0.961 for LGG vs. HGG or grade II, III and IV gliomas, respectively, was achieved. Application of a Recursive Feature Elimination (RFE) attribute selection strategy further improved the classification accuracies. In addition, the performances of the LibSVM, SMO and IBk classifiers were influenced by key parameters such as kernel type, C, gamma and K. SVM is a promising tool for developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization.
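
    A minimal sketch of the kind of pipeline described (SMOTE oversampling, RFE feature selection, an SVM classifier, leave-one-out cross-validation), assuming scikit-learn and imbalanced-learn are available; the feature matrix is synthetic and nothing here approaches the study's 25 classifiers and 8 selection methods.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.feature_selection import RFE
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic histogram/texture features for 120 patients with imbalanced LGG/HGG labels
X = rng.normal(size=(120, 40))
y = np.r_[np.zeros(30, dtype=int), np.ones(90, dtype=int)]
X[y == 1, :5] += 1.0                       # make a few features informative

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("smote", SMOTE(random_state=0)),      # oversample the minority grade (training folds only)
    ("rfe", RFE(SVC(kernel="linear"), n_features_to_select=10)),
    ("svm", SVC(kernel="rbf", C=1.0, gamma="scale")),
])

acc = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
print(f"LOOCV accuracy: {acc:.3f}")
```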

  6. Development of a multi-criteria evaluation system to assess growing pig welfare.

    PubMed

    Martín, P; Traulsen, I; Buxadé, C; Krieter, J

    2017-03-01

    The aim of this paper was to present an alternative multi-criteria evaluation model to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using the example of welfare assessment for growing pigs. The WQ assessment protocol follows a three-step aggregation process: measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first step of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a welfare value for each criterion. The utility functions and the aggregation function were constructed in two separate steps. The Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) method was used for utility function determination and the Choquet Integral (CI) was used as the aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The methods were tested with generated data sets for farms of growing pigs. Using the MAUT, results similar to those obtained with the WQ protocol aggregation methods were achieved. It can be concluded that, owing to the use of an interactive approach such as MACBETH, this alternative methodology is more transparent and more flexible than the methodology proposed by WQ, since it allows the model to be modified in the light of, for instance, new scientific knowledge.
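
    Since the Choquet integral is the aggregation operator named here, a small self-contained sketch may help; the criteria names, scores, and capacity values are invented, and the capacity is written out explicitly rather than identified from decision-makers' preferences as in the paper.

```python
def choquet_integral(x, mu):
    """Choquet integral of scores x (criterion -> value in [0, 1]) with respect to a
    capacity mu (frozenset of criteria -> value; monotone, mu[empty]=0, mu[all]=1)."""
    order = sorted(x, key=x.get)                   # criteria by increasing score
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])           # criteria scoring at least x[c]
        total += (x[c] - prev) * mu[coalition]
        prev = x[c]
    return total

# Hypothetical non-additive capacity: housing and health together are worth more
# than the sum of their individual weights (positive interaction).
mu = {
    frozenset(): 0.0,
    frozenset({"feeding"}): 0.3, frozenset({"housing"}): 0.3, frozenset({"health"}): 0.4,
    frozenset({"feeding", "housing"}): 0.55, frozenset({"feeding", "health"}): 0.7,
    frozenset({"housing", "health"}): 0.8,
    frozenset({"feeding", "housing", "health"}): 1.0,
}

scores = {"feeding": 0.6, "housing": 0.8, "health": 0.5}
print(round(choquet_integral(scores, mu), 3))      # 0.615 for these numbers
```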

  7. On the Utility of Content Analysis in Author Attribution: "The Federalist."

    ERIC Educational Resources Information Center

    Martindale, Colin; McKenzie, Dean

    1995-01-01

    Compares the success of lexical statistics, content analysis, and function words in determining the true author of "The Federalist." The function word approach proved most successful in attributing the papers to James Madison. Lexical statistics contributed nothing, while content analytic measures resulted in some success. (MJP)

  8. Information networks in the stock market based on the distance of the multi-attribute dimensions between listed companies

    NASA Astrophysics Data System (ADS)

    Liu, Qian; Li, Huajiao; Liu, Xueyong; Jiang, Meihui

    2018-04-01

    In the stock market, there are widespread information connections between economic agents. Listed companies can obtain mutual information about investment decisions from common shareholders, and the extent of this information sharing often shapes the relationships between listed companies. Because different shareholder compositions and investment shares lead to different forms of corporate governance mechanisms, we map shareholders' investment relationships onto a multi-attribute dimensional space for the listed companies (each shareholder's investment in a company is one dimension of that company). We then construct an information network of listed companies based on common-shareholder relationships. The weights of the edges in the information network are measured by the Euclidean distance between listed companies in this multi-attribute dimensional space. We define two indices to analyze the features of the information network and conduct an empirical study of Chinese listed companies' information networks. The results show that, with the diversification and decentralization of shareholder investments, almost all Chinese listed companies exchange information through common-shareholder relationships, and there is a gradual reduction in information-sharing capacity between listed companies that have common shareholders. This network analysis has benefits for risk management and portfolio investment.

  9. Validation of a multi-criteria evaluation model for animal welfare.

    PubMed

    Martín, P; Czycholl, I; Buxadé, C; Krieter, J

    2017-04-01

    The aim of this paper was to validate an alternative multi-criteria evaluation system to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using the example of welfare assessment for growing pigs. This alternative methodology aims to be more transparent for stakeholders and more flexible than the methodology proposed by WQ. The WQ assessment protocol for growing pigs was implemented to collect data on different farms in Schleswig-Holstein, Germany; in total, 44 observations were carried out. The aggregation system proposed in the WQ protocol follows a three-step aggregation process: measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first two steps of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a welfare value for each criterion and principle. The utility functions and the aggregation function were constructed in two separate steps. The MACBETH (Measuring Attractiveness by a Categorical-Based Evaluation Technique) method was used for utility function determination and the Choquet integral (CI) was used as the aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The validation of the MAUT model was divided into two steps: first, the results of the model were compared with the results of the WQ project at criteria and principle level, and second, a sensitivity analysis of the model was carried out to demonstrate the relative importance of welfare measures in the different steps of the multi-criteria aggregation process. Using the MAUT, results similar to those obtained when applying the WQ protocol aggregation methods were achieved, both at criteria and principle level. Thus, this model could be implemented to produce an overall assessment of animal welfare in the context of the WQ protocol for growing pigs, and the methodology could also serve as a framework for producing an overall welfare assessment for other livestock species. Two main findings emerged from the sensitivity analysis: first, a limited number of measures had a strong influence on improving or worsening the level of welfare at criteria level, and second, the MAUT model was not very sensitive to an improvement or worsening of single welfare measures at principle level. The use of weighted sums and the conversion of disease measures into ordinal scores should be reconsidered.

  10. A View on the Importance of "Multi-Attribute Method" for Measuring Purity of Biopharmaceuticals and Improving Overall Control Strategy.

    PubMed

    Rogers, Richard S; Abernathy, Michael; Richardson, Douglas D; Rouse, Jason C; Sperry, Justin B; Swann, Patrick; Wypych, Jette; Yu, Christopher; Zang, Li; Deshpande, Rohini

    2017-11-30

    Today, we are experiencing unprecedented growth and innovation within the pharmaceutical industry. Established protein therapeutic modalities, such as recombinant human proteins, monoclonal antibodies (mAbs), and fusion proteins, are being used to treat previously unmet medical needs. Novel therapies such as bispecific T cell engagers (BiTEs), chimeric antigen T cell receptors (CARTs), siRNA, and gene therapies are paving the path towards increasingly personalized medicine. This advancement of new indications and therapeutic modalities is paralleled by development of new analytical technologies and methods that provide enhanced information content in a more efficient manner. Recently, a liquid chromatography-mass spectrometry (LC-MS) multi-attribute method (MAM) has been developed and designed for improved simultaneous detection, identification, quantitation, and quality control (monitoring) of molecular attributes (Rogers et al. MAbs 7(5):881-90, 2015). Based on peptide mapping principles, this powerful tool represents a true advancement in testing methodology that can be utilized not only during product characterization, formulation development, stability testing, and development of the manufacturing process, but also as a platform quality control method in dispositioning clinical materials for both innovative biotherapeutics and biosimilars.

  11. Can Carbon Nanomaterials Improve CZTS Photovoltaic Devices? Evaluation of Performance and Impacts Using Integrated Life-Cycle Assessment and Decision Analysis.

    PubMed

    Scott, Ryan P; Cullen, Alison C; Fox-Lent, Cate; Linkov, Igor

    2016-10-01

    In emergent photovoltaics, nanoscale materials hold promise for optimizing device characteristics; however, the related impacts remain uncertain, resulting in challenges to decisions on strategic investment in technology innovation. We integrate multi-criteria decision analysis (MCDA) and life-cycle assessment (LCA) results (LCA-MCDA) as a method of incorporating values of a hypothetical federal acquisition manager into the assessment of risks and benefits of emerging photovoltaic materials. Specifically, we compare adoption of copper zinc tin sulfide (CZTS) devices with molybdenum back contacts to alternative devices employing graphite or graphene instead of molybdenum. LCA impact results are interpreted alongside benefits of substitution including cost reductions and performance improvements through application of multi-attribute utility theory. To assess the role of uncertainty we apply Monte Carlo simulation and sensitivity analysis. We find that graphene or graphite back contacts outperform molybdenum under most scenarios and assumptions. The use of decision analysis clarifies potential advantages of adopting graphite as a back contact while emphasizing the importance of mitigating conventional impacts of graphene production processes if graphene is used in emerging CZTS devices. Our research further demonstrates that a combination of LCA and MCDA increases the usability of LCA in assessing product sustainability. In particular, this approach identifies the most influential assumptions and data gaps in the analysis and the areas in which either engineering controls or further data collection may be necessary. © 2016 Society for Risk Analysis.
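
    To make the integration concrete, here is a rough sketch of one way LCA-derived impacts and performance/cost benefits can be folded into a multi-attribute utility comparison under uncertainty via Monte Carlo simulation; all scores, spreads, and weights are hypothetical, and the real study's attribute set and utility functions differ.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 10_000

# Hypothetical (mean, sd) of normalized 0-1 scores on three attributes:
# cost, performance, and life-cycle environmental impact (higher = better on all)
alternatives = {
    "molybdenum": [(0.50, 0.05), (0.60, 0.05), (0.55, 0.10)],
    "graphite":   [(0.70, 0.10), (0.60, 0.08), (0.65, 0.10)],
    "graphene":   [(0.60, 0.15), (0.70, 0.10), (0.45, 0.15)],
}
weights = np.array([0.40, 0.35, 0.25])

utilities = {}
for name, specs in alternatives.items():
    draws = np.column_stack([rng.normal(m, s, n_draws) for m, s in specs]).clip(0, 1)
    utilities[name] = draws @ weights          # additive multi-attribute utility per draw

stacked = np.column_stack(list(utilities.values()))
best = np.array(list(utilities)).take(stacked.argmax(axis=1))
for name in utilities:
    print(name, f"P(ranked best) = {(best == name).mean():.2f}")
```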

  12. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single-attribute analysis, multi-attribute analysis and a probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, an oil field in Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find a suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with well-log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher-resolution porosity sections. A low-impedance (6000-8000 m/s·g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network combined with model-based inversion is the most efficient for predicting porosity in the inter-well region.
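
    A rough sketch of the multi-attribute regression step (the simplest of the three methods named): least-squares weights are fit between seismic attributes and log porosity at well locations, then applied to attributes elsewhere in the volume; the attribute values here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training data at well locations: seismic attributes vs. log porosity
n_samples, n_attrs = 200, 4
attrs = rng.normal(size=(n_samples, n_attrs))       # e.g. impedance, amplitude, frequency, ...
true_w = np.array([-0.04, 0.02, 0.01, 0.00])
porosity = 0.15 + attrs @ true_w + rng.normal(0, 0.01, n_samples)

# Multi-attribute regression: least-squares weights mapping attributes to porosity
A = np.column_stack([np.ones(n_samples), attrs])
coef, *_ = np.linalg.lstsq(A, porosity, rcond=None)

# Apply the fitted transform to attributes extracted elsewhere in the seismic volume
new_attrs = rng.normal(size=(5, n_attrs))
pred = np.column_stack([np.ones(5), new_attrs]) @ coef
print(coef.round(3), pred.round(3))
```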

  13. Multi-Modal Intelligent Traffic Signal Systems (MMITSS) impacts assessment.

    DOT National Transportation Integrated Search

    2015-08-01

    The study evaluates the potential network-wide impacts of the Multi-Modal Intelligent Transportation Signal System (MMITSS) based on a field data analysis utilizing data collected from a MMITSS prototype and a simulation analysis. The Intelligent Tra...

  14. Characterising volcanic cycles at Soufriere Hills Volcano, Montserrat: Time series analysis of multi-parameter satellite data

    NASA Astrophysics Data System (ADS)

    Flower, Verity J. B.; Carn, Simon A.

    2015-10-01

    The identification of cyclic volcanic activity can elucidate underlying eruption dynamics and aid volcanic hazard mitigation. Whilst satellite datasets are often analysed individually, here we exploit the multi-platform NASA A-Train satellite constellation to cross-correlate cyclical signals identified using complementary measurement techniques at Soufriere Hills Volcano (SHV), Montserrat. In this paper we present a multi-taper (MTM) Fast Fourier Transform (FFT) analysis of coincident SO2 and thermal infrared (TIR) satellite measurements at SHV, facilitating the identification of cyclical volcanic behaviour. These measurements were collected by the Ozone Monitoring Instrument (OMI) and the Moderate Resolution Imaging Spectroradiometer (MODIS), respectively, in the A-Train. We identify a correlating cycle in both the OMI and MODIS data (54-58 days), with this multi-week feature attributable to episodes of dome growth. Cycles of ~50 days were also identified in ground-based SO2 data at SHV, confirming the validity of our analysis and further corroborating the presence of this cycle at the volcano. In addition, a 12-day cycle was identified in the OMI data, previously attributed to variable lava effusion rates on shorter timescales. OMI data also display a one-week (7-8 days) cycle attributable to cyclical variations in viewing angle resulting from the orbital characteristics of the Aura satellite. Longer-period cycles possibly relating to magma intrusion were identified in the OMI record (102, 121, and 159 days), in addition to a 238-day cycle identified in the MODIS data corresponding to periodic destabilisation of the lava dome. Through the analysis of reconstructions generated from the cycles identified in the OMI and MODIS data, periods of unrest were identified, including the major dome collapse of 20 May 2006 and the significant explosive event of 3 January 2009. Our analysis confirms the potential for identifying cyclical volcanic activity through combined analysis of satellite data, which would be of particular value at poorly monitored volcanic systems.
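
    A compact sketch of a multitaper (MTM) spectral estimate using DPSS tapers from SciPy, applied to a synthetic daily series containing a roughly 55-day cycle; the real analysis works with irregularly sampled satellite retrievals and more careful significance testing.

```python
import numpy as np
from scipy.signal.windows import dpss

rng = np.random.default_rng(7)

# Synthetic daily SO2 time series containing a ~55-day cycle plus noise
n_days = 1024
t = np.arange(n_days)
x = np.sin(2 * np.pi * t / 55) + 0.5 * rng.standard_normal(n_days)

# Multitaper spectrum: average periodograms computed with orthogonal DPSS tapers
NW, K = 4, 7                                   # time-bandwidth product, number of tapers
tapers = dpss(n_days, NW, K)                   # shape (K, n_days)
spectra = np.abs(np.fft.rfft(tapers * (x - x.mean()), axis=1)) ** 2
psd = spectra.mean(axis=0)

freqs = np.fft.rfftfreq(n_days, d=1.0)         # cycles per day
peak = freqs[1:][psd[1:].argmax()]             # skip the zero-frequency bin
print(f"dominant period ~ {1 / peak:.1f} days")
```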

  15. Multi-Attribute Sequential Search

    ERIC Educational Resources Information Center

    Bearden, J. Neil; Connolly, Terry

    2007-01-01

    This article describes empirical and theoretical results from two multi-attribute sequential search tasks. In both tasks, the DM sequentially encounters options described by two attributes and must pay to learn the values of the attributes. In the "continuous" version of the task the DM learns the precise numerical value of an attribute when she…

  16. A Generalized Measurement Model to Quantify Health: The Multi-Attribute Preference Response Model

    PubMed Central

    Krabbe, Paul F. M.

    2013-01-01

    After 40 years of deriving metric values for health status or health-related quality of life, the effective quantification of subjective health outcomes is still a challenge. Here, two of the best measurement tools, the discrete choice model and the Rasch model, are combined to create a new model for deriving health values. First, existing techniques to value health states are briefly discussed, followed by a reflection on the recent revival of interest in patients' experiences and their possible role in health measurement. Subsequently, three basic principles for valid health measurement are reviewed, namely unidimensionality, interval level, and invariance. In the main section, the basic operation of measurement is then discussed in the framework of probabilistic discrete choice analysis (random utility model) and the psychometric Rasch model. It is then shown how combining the main features of these two models yields an integrated measurement model, called the multi-attribute preference response (MAPR) model, which is introduced here. This new model transforms subjective individual rank data into a metric scale using responses from patients who have experienced certain health states. Its measurement mechanism largely prevents biases such as adaptation and coping. Several extensions of the MAPR model are presented. The MAPR model can be applied to a wide range of research problems. If extended with the self-selection of relevant health domains for the individual patient, this model will be more valid than existing valuation techniques. PMID:24278141

  17. Importance of multi-modal approaches to effectively identify cataract cases from electronic health records.

    PubMed

    Peissig, Peggy L; Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B

    2012-01-01

    There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries.

  18. Thermal power systems small power systems applications project. Decision analysis for evaluating and ranking small solar thermal power system technologies. Volume 1: A brief introduction to multiattribute decision analysis. [explanation of multiattribute decision analysis methods used in evaluating alternatives for small powered systems

    NASA Technical Reports Server (NTRS)

    Feinberg, A.; Miles, R. F., Jr.

    1978-01-01

    The principal concepts of the Keeney and Raiffa approach to multiattribute decision analysis are described. Topics discussed include the concepts of decision alternatives, outcomes, objectives, attributes and their states, attribute utility functions, and the necessary independence properties for the attribute states to be aggregated into a numerical representation of the preferences of the decision maker for the outcomes and decision alternatives.

  19. Fourier crosstalk analysis of multislice and cone-beam helical CT

    NASA Astrophysics Data System (ADS)

    La Riviere, Patrick J.

    2004-05-01

    Multi-slice helical CT scanners allow for much faster scanning and better x-ray utilization than their single-slice predecessors, but they engender considerably more complicated data sampling patterns due to the interlacing of the samples from different rows as the patient is translated. Characterizing and optimizing this sampling is challenging because the conebeam geometry of such scanners means that the projections measured by each detector row are at least slightly oblique, making it difficult to apply standard multidimensional sampling analyses. In this study, we apply a more general framework for analyzing sampled imaging systems known as Fourier crosstalk analysis. Our purpose in this preliminary work is to compare the information content of the data acquired in three different scanner geometries and operating conditions with ostensibly equivalent volume coverage and average longitudinal sampling interval: a single-slice scanner operating at pitch 1, a four-slice scanner operating at pitch 3 and a 15-slice scanner operating at pitch 15. We find that moving from a single-slice to a multi-slice geometry introduces longitudinal crosstalk characteristic of the longitudinal sampling interval of each individual detector row, and not of the overall interlaced sampling pattern. This is attributed to data inconsistencies caused by the obliqueness of the projections in a multi-slice/conebeam configuration. However, these preliminary results suggest that the significance of this additional crosstalk decreases as the number of detector rows increases.

  20. Obesogenic environment: a concept analysis and pediatric perspective.

    PubMed

    Gauthier, Kristine I; Krajicek, Marilyn J

    2013-07-01

    A concept analysis was undertaken to examine the attributes, characteristics, and uses of the concept of obesogenic environment within a pediatric context. Utilizing a modified version of Walker and Avant's method, the attributes and characteristics of obesogenic environment were identified as it pertains to children. Based on the review of the literature and previous definitions applied to adults, a definition of the concept of obesogenic environment within a pediatric context was developed; examples of sample cases illustrate the concept further. Defining the concept of obesogenic environment has utility for nursing theory development, practice, research, and education. © 2013, Wiley Periodicals, Inc.

  1. Enhanced multi-protocol analysis via intelligent supervised embedding (EMPrAvISE): detecting prostate cancer on multi-parametric MRI

    NASA Astrophysics Data System (ADS)

    Viswanath, Satish; Bloch, B. Nicholas; Chappelow, Jonathan; Patel, Pratik; Rofsky, Neil; Lenkinski, Robert; Genega, Elizabeth; Madabhushi, Anant

    2011-03-01

    Currently, there is significant interest in developing methods for quantitative integration of multi-parametric (structural, functional) imaging data with the objective of building automated meta-classifiers to improve disease detection, diagnosis, and prognosis. Such techniques are required to address the differences in dimensionalities and scales of individual protocols, while deriving an integrated multi-parametric data representation which best captures all disease-pertinent information available. In this paper, we present a scheme called Enhanced Multi-Protocol Analysis via Intelligent Supervised Embedding (EMPrAvISE): a powerful, generalizable framework applicable to a variety of domains for multi-parametric data representation and fusion. Our scheme utilizes an ensemble of embeddings (via dimensionality reduction, DR), thereby exploiting the variance amongst multiple uncorrelated embeddings in a manner similar to ensemble classifier schemes (e.g. Bagging, Boosting). We apply this framework to the problem of prostate cancer (CaP) detection on twelve 3-Tesla pre-operative in vivo multi-parametric (T2-weighted, Dynamic Contrast Enhanced, and Diffusion-weighted) magnetic resonance imaging (MRI) studies, in turn comprising a total of 39 2D planar MR images. We first align the different imaging protocols via automated image registration, followed by quantification of image attributes from individual protocols. Multiple embeddings are generated from the resultant high-dimensional feature space which are then combined intelligently to yield a single stable solution. Our scheme is employed in conjunction with graph embedding (for DR) and probabilistic boosting trees (PBTs) to detect CaP on multi-parametric MRI. Finally, a probabilistic pairwise Markov Random Field algorithm is used to apply spatial constraints to the result of the PBT classifier, yielding a per-voxel classification of CaP presence. Per-voxel evaluation of detection results against ground truth for CaP extent on MRI (obtained by spatially registering pre-operative MRI with available whole-mount histological specimens) reveals that EMPrAvISE yields a statistically significant improvement (AUC=0.77) over classifiers constructed from individual protocols (AUC=0.62, 0.62, 0.65, for T2w, DCE, DWI respectively) as well as one trained using multi-parametric feature concatenation (AUC=0.67).

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshmukh, Ranjit; Wu, Grace

    The MapRE (Multi-criteria Analysis for Planning Renewable Energy) GIS (Geographic Information Systems) Tools are a set of ArcGIS tools to a) conduct site suitability analysis for wind and solar resources using inclusion and exclusion criteria, and create resource maps; b) create project opportunity areas and compute various attributes such as cost, distances to existing and planned infrastructure, and environmental impact factors; and c) calculate and update various attributes for already processed renewable energy zones. In addition, the MapRE data sets are geospatial data of renewable energy project opportunity areas and zones with pre-calculated attributes for several countries. These tools and data are available at mapre.lbl.gov.
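
    As a rough illustration of the inclusion/exclusion screening in step a), a raster-style mask can be built with generic array operations; the layer names and thresholds below are hypothetical and do not come from the MapRE toolset, which is implemented in ArcGIS.

```python
# Illustrative raster-style suitability screen (not the ArcGIS MapRE tools);
# all thresholds and layer names are hypothetical.
import numpy as np

def suitability_mask(wind_speed, slope_deg, protected_area, min_wind=6.5, max_slope=20.0):
    """Cells are suitable if the wind resource is adequate, the terrain is gentle,
    and the cell lies outside protected areas (the exclusion layer)."""
    include = (wind_speed >= min_wind) & (slope_deg <= max_slope)
    exclude = protected_area.astype(bool)
    return include & ~exclude

# Toy 3x3 rasters
wind = np.array([[7.0, 5.0, 8.1], [6.6, 6.9, 4.2], [9.0, 7.5, 6.4]])
slope = np.array([[5, 12, 30], [8, 15, 3], [2, 25, 10]])
protected = np.array([[0, 0, 0], [1, 0, 0], [0, 0, 0]])
print(suitability_mask(wind, slope, protected))
```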

  3. Qualitative analysis of MTEM response using instantaneous attributes

    NASA Astrophysics Data System (ADS)

    Fayemi, Olalekan; Di, Qingyun

    2017-11-01

    This paper introduces a new technique for qualitative analysis of the multi-transient electromagnetic (MTEM) earth impulse response over complex geological structures. Instantaneous phase and frequency attributes were used in place of the conventional common offset section for improved qualitative interpretation of MTEM data, extracting more detailed information from the earth impulse response. The instantaneous attributes were used to describe the lateral variation in subsurface resistivity and the visible geological structure with respect to given offsets. The instantaneous phase attribute was obtained by converting the impulse response into a complex form using the Hilbert transform. For the instantaneous frequency attribute, the polynomial phase difference (PPD) estimator was favored over the center finite difference (CFD) approximation because it is computationally efficient and gives a smooth variation of the instantaneous frequency over a common offset section. The observed results from the instantaneous attributes were in good agreement with both the subsurface model used and the apparent resistivity section obtained from the MTEM earth impulse response. Hence, this study confirms the capability of both instantaneous phase and frequency attributes as highly effective tools for MTEM qualitative analysis.
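
    A minimal sketch of the attribute computation described here: the analytic signal from a Hilbert transform gives the instantaneous phase, and the phase derivative gives the instantaneous frequency. The synthetic trace and the simple finite-difference derivative are illustrative; the paper favors a PPD estimator for smoother frequency estimates.

```python
# Instantaneous phase and frequency attributes from a sampled trace via the analytic signal.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                            # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
trace = np.exp(-3 * t) * np.cos(2 * np.pi * 15 * t)    # stand-in for an earth impulse response

analytic = hilbert(trace)                              # complex analytic signal
inst_phase = np.unwrap(np.angle(analytic))             # instantaneous phase attribute
# Instantaneous frequency from the phase derivative; a plain finite difference is used here,
# whereas the paper prefers the polynomial phase difference estimator for smoothness.
inst_freq = np.diff(inst_phase) / (2.0 * np.pi) * fs
```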

  4. BagMOOV: A novel ensemble for heart disease prediction bootstrap aggregation with multi-objective optimized voting.

    PubMed

    Bashir, Saba; Qamar, Usman; Khan, Farhan Hassan

    2015-06-01

    Conventional clinical decision support systems are based on individual classifiers or simple combinations of these classifiers, which tend to show moderate performance. This research paper presents a novel classifier ensemble framework based on an enhanced bagging approach with a multi-objective weighted voting scheme for prediction and analysis of heart disease. The proposed model overcomes the limitations of conventional approaches by utilizing an ensemble of five heterogeneous classifiers: Naïve Bayes, linear regression, quadratic discriminant analysis, instance-based learner and support vector machines. Five different datasets are used for experimentation, evaluation and validation. The datasets are obtained from publicly available data repositories. Effectiveness of the proposed ensemble is investigated by comparison of results with several classifiers. Prediction results of the proposed ensemble model are assessed by ten-fold cross validation and ANOVA statistics. The experimental evaluation shows that the proposed framework deals with all types of attributes and achieved a high diagnostic accuracy of 84.16%, with 93.29% sensitivity, 96.70% specificity, and an 82.15% f-measure. An f-ratio higher than the f-critical value and a p value less than 0.05 at the 95% confidence level indicate that the results are statistically significant for most of the datasets.
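
    The weighted heterogeneous voting idea can be approximated with off-the-shelf components, as in the sketch below; the dataset is a stand-in, logistic regression replaces the paper's linear regression learner, and the voting weights are placeholders rather than the multi-objective optimized weights of BagMOOV.

```python
# Minimal stand-in for a weighted heterogeneous voting ensemble (not BagMOOV itself).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)   # proxy dataset; the paper uses heart-disease data
ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("lr", LogisticRegression(max_iter=1000)),
        ("qda", QuadraticDiscriminantAnalysis()),
        ("knn", KNeighborsClassifier()),
        ("svm", SVC(probability=True)),
    ],
    voting="soft",
    weights=[1.0, 1.2, 0.8, 1.0, 1.5],        # placeholders for the optimized weights
)
print(cross_val_score(ensemble, X, y, cv=10).mean())   # ten-fold cross-validated accuracy
```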

  5. Model based inversion of ultrasound data in composites

    NASA Astrophysics Data System (ADS)

    Roberts, R. A.

    2018-04-01

    Work is reported on model-based defect characterization in CFRP composites. The work utilizes computational models of ultrasound interaction with defects in composites, to determine 1) the measured signal dependence on material and defect properties (forward problem), and 2) an assessment of defect properties from analysis of measured ultrasound signals (inverse problem). Work is reported on model implementation for inspection of CFRP laminates containing multi-ply impact-induced delamination, in laminates displaying irregular surface geometry (roughness), as well as internal elastic heterogeneity (varying fiber density, porosity). Inversion of ultrasound data is demonstrated showing the quantitative extraction of delamination geometry and surface transmissivity. Additionally, data inversion is demonstrated for determination of surface roughness and internal heterogeneity, and the influence of these features on delamination characterization is examined. Estimation of porosity volume fraction is demonstrated when internal heterogeneity is attributed to porosity.

  6. Pros and cons of conjoint analysis of discrete choice experiments to define classification and response criteria in rheumatology.

    PubMed

    Taylor, William J

    2016-03-01

    Conjoint analysis of choice or preference data has been used in marketing for over 40 years but has appeared in healthcare settings much more recently. It may be a useful technique for applications within the rheumatology field. Conjoint analysis in rheumatology contexts has mainly used the approaches implemented in software from 1000Minds Ltd (Dunedin, New Zealand) and Sawtooth Software (Orem, UT, USA). Examples include classification criteria, composite response criteria, service prioritization tools and utilities assessment. Limitations imposed by very many attributes can be managed using new techniques. Conjoint analysis studies of classification and response criteria suggest that the assumption of equal weighting of attributes cannot be met, which challenges traditional approaches to composite criteria construction. Weights elicited through choice experiments with experts can yield more accurate classification criteria than unweighted criteria. Studies that find significant variation in attribute weights for composite response criteria for gout make construction of such criteria problematic. Better understanding of various multiattribute phenomena is likely to increase with increased use of conjoint analysis, especially when the attributes concern individual perceptions or opinions. In addition to classification criteria, some applications for conjoint analysis that are emerging in rheumatology include prioritization tools, remission criteria, and utilities for life areas.

  7. Perceived Insider Status and Feedback Reactions: A Dual Path of Feedback Motivation Attribution.

    PubMed

    Chen, Xiao; Liao, JianQiao; Wu, Weijiong; Zhang, Wei

    2017-01-01

    Many studies have evaluated how the characteristics of feedback receiver, feedback deliverer and feedback information influence psychological feedback reactions of the feedback receiver while largely neglecting that feedback intervention is a kind of social interaction process. To address this issue, this study proposes that employees' perceived insider status (PIS), as a kind of employee-organization relationship, could also influence employees' reactions to supervisory feedback. In particular, this study investigates the influence of PIS focusing on affective and cognitive feedback reactions, namely feedback satisfaction and feedback utility. Surveys were conducted in a machinery manufacturing company in the Guangdong province of China. Samples were collected from 192 employees. Data analysis demonstrated that PIS and feedback utility possessed a U-shaped relationship, whereas PIS and feedback satisfaction exhibited positively linear relationships. The analysis identified two kinds of mediating mechanisms related to feedback satisfaction and feedback utility. Internal feedback motivation attribution partially mediated the relationship between PIS and feedback satisfaction but failed to do the same with respect to the relationship between PIS and feedback utility. In contrast, external feedback motivation attribution partially mediated the relationship between PIS and feedback utility while failing to mediate the relationship between PIS and feedback satisfaction. Theoretical contributions and practical implications of the findings are discussed at the end of the paper.

  8. Perceived Insider Status and Feedback Reactions: A Dual Path of Feedback Motivation Attribution

    PubMed Central

    Chen, Xiao; Liao, JianQiao; Wu, Weijiong; Zhang, Wei

    2017-01-01

    Many studies have evaluated how the characteristics of feedback receiver, feedback deliverer and feedback information influence psychological feedback reactions of the feedback receiver while largely neglecting that feedback intervention is a kind of social interaction process. To address this issue, this study proposes that employees’ perceived insider status (PIS), as a kind of employee-organization relationship, could also influence employees’ reactions to supervisory feedback. In particular, this study investigates the influence of PIS focusing on affective and cognitive feedback reactions, namely feedback satisfaction and feedback utility. Surveys were conducted in a machinery manufacturing company in the Guangdong province of China. Samples were collected from 192 employees. Data analysis demonstrated that PIS and feedback utility possessed a U-shaped relationship, whereas PIS and feedback satisfaction exhibited positively linear relationships. The analysis identified two kinds of mediating mechanisms related to feedback satisfaction and feedback utility. Internal feedback motivation attribution partially mediated the relationship between PIS and feedback satisfaction but failed to do the same with respect to the relationship between PIS and feedback utility. In contrast, external feedback motivation attribution partially mediated the relationship between PIS and feedback utility while failing to mediate the relationship between PIS and feedback satisfaction. Theoretical contributions and practical implications of the findings are discussed at the end of the paper. PMID:28507527

  9. The conceptual foundation of environmental decision support.

    PubMed

    Reichert, Peter; Langhans, Simone D; Lienert, Judit; Schuwirth, Nele

    2015-05-01

    Environmental decision support intends to use the best available scientific knowledge to help decision makers find and evaluate management alternatives. The goal of this process is to achieve the best fulfillment of societal objectives. This requires a careful analysis of (i) how scientific knowledge can be represented and quantified, (ii) how societal preferences can be described and elicited, and (iii) how these concepts can best be used to support communication with authorities, politicians, and the public in environmental management. The goal of this paper is to discuss key requirements for a conceptual framework to address these issues and to suggest how these can best be met. We argue that a combination of probability theory and scenario planning with multi-attribute utility theory fulfills these requirements, and discuss adaptations and extensions of these theories to improve their application for supporting environmental decision making. With respect to (i) we suggest the use of intersubjective probabilities, if required extended to imprecise probabilities, to describe the current state of scientific knowledge. To address (ii), we emphasize the importance of value functions, in addition to utilities, to support decisions under risk. We discuss the need for testing "non-standard" value aggregation techniques, the usefulness of flexibility of value functions regarding attribute data availability, the elicitation of value functions for sub-objectives from experts, and the consideration of uncertainty in value and utility elicitation. With respect to (iii), we outline a well-structured procedure for transparent environmental decision support that is based on a clear separation of scientific prediction and societal valuation. We illustrate aspects of the suggested methodology by its application to river management in general and with a small, didactical case study on spatial river rehabilitation prioritization. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
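
    The additive aggregation that underlies most multi-attribute value analysis can be illustrated in a few lines; the alternatives, attribute ranges, value functions, and weights below are invented for illustration and are not taken from the river-rehabilitation case study.

```python
# Toy additive value aggregation in the spirit of multi-attribute value/utility theory.
def linear_value(x, worst, best):
    """Map an attribute level onto [0, 1] (1 = best), clamped to the range."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

alternatives = {
    "rehab_option_A": {"habitat_quality": 0.7, "cost_MEUR": 3.0},
    "rehab_option_B": {"habitat_quality": 0.5, "cost_MEUR": 1.2},
}
weights = {"habitat_quality": 0.6, "cost_MEUR": 0.4}          # hypothetical weights
ranges = {"habitat_quality": (0.0, 1.0), "cost_MEUR": (5.0, 0.5)}  # (worst, best) per attribute

for name, levels in alternatives.items():
    v = sum(weights[a] * linear_value(levels[a], *ranges[a]) for a in weights)
    print(name, round(v, 3))
```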

  10. Deriving preference order of post-mining land-uses through MLSA framework: application of an outranking technique

    NASA Astrophysics Data System (ADS)

    Soltanmohammadi, Hossein; Osanloo, Morteza; Aghajani Bazzazi, Abbas

    2009-08-01

    This study intends to take advantage of a previously developed framework for mined land suitability analysis (MLSA), consisting of economic, social, technical and mine-site factors, to achieve a partial and also a complete pre-order of feasible post-mining land-uses. Analysis by an outranking multi-attribute decision-making (MADM) technique called PROMETHEE (preference ranking organization method for enrichment evaluation) was chosen because of its clear advantages in the field of MLSA compared with MADM ranking techniques. Application of the proposed approach to a mined land proceeds through several successive steps. First, performance of the MLSA attributes is scored locally by each individual decision maker (DM). Then the assigned performance scores are normalized and the deviation amplitudes of non-dominated alternatives are calculated. Weights of the attributes are calculated by another MADM technique, namely the analytical hierarchy process (AHP), in a separate procedure. Using the Gaussian preference function together with the weights, the preference indices of the land-use alternatives are obtained. Calculating the outgoing and entering flows of the alternatives and comparing these values pairwise leads to a partial pre-order of the alternatives, while calculating the net flows leads to a ranked preference for each land-use. At the final step, utilizing the PROMETHEE group decision support system, which incorporates the judgments of all the DMs, a consensual ranking can be derived. In this paper, the preference order of post-mining land-uses for a hypothetical mined land has been derived according to the judgments of one DM to reveal the applicability of the proposed approach.
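
    A compact sketch of the PROMETHEE II calculation described above (Gaussian preference function, outgoing/entering/net flows); the score matrix, weights, and preference parameter are illustrative, not the study's MLSA data.

```python
# PROMETHEE II sketch with a Gaussian preference function (benefit-type attributes).
import numpy as np

scores = np.array([          # rows: land-use alternatives, cols: normalized attribute scores
    [0.8, 0.4, 0.6],
    [0.5, 0.9, 0.3],
    [0.6, 0.6, 0.7],
])
weights = np.array([0.5, 0.3, 0.2])        # hypothetical AHP-derived weights
s = 0.25                                   # Gaussian preference parameter (assumed)

n = scores.shape[0]
pi = np.zeros((n, n))                      # aggregated preference indices
for i in range(n):
    for j in range(n):
        d = scores[i] - scores[j]
        pref = np.where(d > 0, 1.0 - np.exp(-(d ** 2) / (2 * s ** 2)), 0.0)
        pi[i, j] = np.dot(weights, pref)

phi_plus = pi.sum(axis=1) / (n - 1)        # outgoing flows
phi_minus = pi.sum(axis=0) / (n - 1)       # entering flows
net_flow = phi_plus - phi_minus            # PROMETHEE II complete ranking
print(np.argsort(-net_flow))               # alternatives from best to worst
```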

  11. [Real-time detection of quality of Chinese materia medica: strategy of NIR model evaluation].

    PubMed

    Wu, Zhi-sheng; Shi, Xin-yuan; Xu, Bing; Dai, Xing-xing; Qiao, Yan-jiang

    2015-07-01

    The definition of critical quality attributes of Chinese materia medica (CMM) was put forward based on a top-level design concept. With the development of rapid analytical science, rapid assessment of the critical quality attributes of CMM, a secondary discipline branch of CMM, was first carried out. Taking near-infrared (NIR) spectroscopy as an example, a rapid analytical technology applied in pharmaceutical processes over the past decade, the chemometric parameters used in NIR model evaluation are systematically reviewed. According to the characteristics of the complexity of CMM and trace-component analysis, a multi-source information fusion strategy for NIR models was developed to assess the critical quality attributes of CMM. The strategy provides guidance for reliable NIR analysis of the critical quality attributes of CMM.

  12. "Fibromyalgia and quality of life: mapping the revised fibromyalgia impact questionnaire to the preference-based instruments".

    PubMed

    Collado-Mateo, Daniel; Chen, Gang; Garcia-Gordillo, Miguel A; Iezzi, Angelo; Adsuar, José C; Olivares, Pedro R; Gusi, Narcis

    2017-05-30

    The revised version of the Fibromyalgia Impact Questionnaire (FIQR) is one of the most widely used disease-specific questionnaires in FM studies. However, this questionnaire does not allow calculation of QALYs as it is not a preference-based measure. The aim of this study was to develop mapping algorithms that enable FIQR scores to be transformed into utility scores that can be used in cost-utility analyses. A cross-sectional survey was conducted. One hundred and ninety-two Spanish women with fibromyalgia were asked to complete four generic quality-of-life questionnaires (EQ-5D-5L, 15D, AQoL-8D and SF-12) and one disease-specific instrument, the FIQR. A direct mapping approach was adopted to derive mapping algorithms between the FIQR and each of the four multi-attribute utility (MAU) instruments. Health state utility was treated as the dependent variable in the regression analysis, whilst the FIQR score and age were predictors. The mean utility scores ranged from 0.47 (AQoL-8D) to 0.69 (15D). All correlations between the FIQR total score and the MAU instrument utility scores were highly significant (p < 0.0001), with magnitudes larger than 0.5. Although only very slight differences in mean absolute error were found between the ordinary least squares (OLS) estimator and the generalized linear model (GLM), GLM-based models were better for EQ-5D-5L, AQoL-8D and 15D. The mapping algorithms developed in this study enable the estimation of utility values from scores on a fibromyalgia-specific questionnaire.
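
    A direct mapping of this kind reduces to regressing utility on the disease-specific score (and age); the sketch below uses synthetic data and plain OLS as a stand-in for the published OLS/GLM algorithms.

```python
# Sketch of a direct mapping (regression) from a disease-specific score to a utility index;
# coefficients and data are synthetic, not the published mapping algorithms.
import numpy as np

rng = np.random.default_rng(1)
fiqr = rng.uniform(0, 100, size=200)                # FIQR total scores
age = rng.uniform(30, 70, size=200)
utility = 0.9 - 0.004 * fiqr - 0.001 * age + rng.normal(0, 0.05, 200)  # synthetic target

X = np.column_stack([np.ones_like(fiqr), fiqr, age])
beta, *_ = np.linalg.lstsq(X, utility, rcond=None)  # OLS fit (a GLM could be swapped in)
predicted = X @ beta
mae = np.mean(np.abs(predicted - utility))          # mean absolute error, the comparison metric used
print(beta, mae)
```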

  13. Selection of an Optimum Air Defense Weapon Package Using MAUM (Multi-Attribute Utility Measurement).

    DTIC Science & Technology

    1983-06-01

    SELECTION OF AN OPTIMUM AIR DEFENSE WEAPON PACKAGE USING MAUM, by Wilton L. Ham, June 1983. Thesis Advisor: R. G. Nickerson. Approved for public release; distribution unlimited. ..."hold": do not fire except in self defense. Firing commands are commands issued regardless of the weapons control in effect.

  14. Development of the Attributed Dignity Scale.

    PubMed

    Jacelon, Cynthia S; Dixon, Jane; Knafl, Kathleen A

    2009-07-01

    A sequential, multi-method approach to instrument development beginning with concept analysis, followed by (a) item generation from qualitative data, (b) review of items by expert and lay person panels, (c) cognitive appraisal interviews, (d) pilot testing, and (e) evaluating construct validity was used to develop a measure of attributed dignity in older adults. The resulting positively scored, 23-item scale has three dimensions: Self-Value, Behavioral Respect-Self, and Behavioral Respect-Others. Item-total correlations in the pilot study ranged from 0.39 to 0.85. Correlations between the Attributed Dignity Scale (ADS) and both Rosenberg's Self-Esteem Scale (0.17) and Crowne and Marlowe's Social Desirability Scale (0.36) were modest and in the expected direction, indicating attributed dignity is a related but independent concept. Next steps include testing the ADS with a larger sample to complete factor analysis, test-retest stability, and further study of the relationships between attributed dignity and other concepts.

  15. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    PubMed

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital IS (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
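
    Importance-performance analysis can be sketched as follows: performance is the mean attribute rating, importance is derived here from the correlation of each attribute with an overall outcome, and attributes that are important but under-performing are flagged. The ratings and the correlation-based importance measure are illustrative assumptions, not the study's PLS-based estimates.

```python
# Minimal importance-performance analysis (IPA) sketch with synthetic ratings.
import numpy as np

attributes = ["responsiveness", "ease_of_learning", "detail", "support"]
ratings = np.array([                    # rows: respondents, cols: attribute performance (1-7)
    [3, 4, 5, 2],
    [2, 3, 4, 3],
    [4, 4, 5, 2],
    [3, 5, 6, 3],
])
satisfaction = np.array([3, 2, 4, 4])   # overall outcome per respondent

performance = ratings.mean(axis=0)
importance = np.array([np.corrcoef(ratings[:, k], satisfaction)[0, 1]
                       for k in range(ratings.shape[1])])

# High importance but low performance = priority for managerial intervention.
for name, imp, perf in zip(attributes, importance, performance):
    flag = "PRIORITY" if imp > np.median(importance) and perf < np.median(performance) else ""
    print(f"{name:18s} importance={imp:+.2f} performance={perf:.2f} {flag}")
```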

  16. Advanced development of atmospheric models. [SEASAT Program support

    NASA Technical Reports Server (NTRS)

    Kesel, P. G.; Langland, R. A.; Stephens, P. L.; Welleck, R. E.; Wolff, P. M.

    1979-01-01

    A set of atmospheric analysis and prediction models was developed in support of the SEASAT Program. Existing objective analysis models, which utilize a 125x125 polar stereographic grid of the Northern Hemisphere, were modified in order to incorporate and assess the impact of (real or simulated) satellite data in the analysis of a two-day meteorological scenario in January 1979. Program/procedural changes included: (1) a provision to utilize winds in the sea level pressure and multi-level height analyses (1000-100 MBS); (2) the capability to perform a pre-analysis at two control levels (1000 MBS and 250 MBS); (3) a greater degree of wind- and mass-field coupling, especially at these control levels; (4) an improved facility to bogus the analyses based on results of the pre-analysis; and (5) a provision to utilize (SIRS) satellite thickness values and cloud motion vectors in the multi-level height analysis.

  17. Importance of multi-modal approaches to effectively identify cataract cases from electronic health records

    PubMed Central

    Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B

    2012-01-01

    Objective There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. Materials and methods We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. Results An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. Discussion A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. Conclusion We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries. PMID:22319176

  18. Informing vaccine decision-making: A strategic multi-attribute ranking tool for vaccines-SMART Vaccines 2.0.

    PubMed

    Knobler, Stacey; Bok, Karin; Gellin, Bruce

    2017-01-20

    SMART Vaccines 2.0 software is being developed to support decision-making among multiple stakeholders in the process of prioritizing investments to optimize the outcomes of vaccine development and deployment. Vaccines and associated vaccination programs are among the most successful and effective public health interventions to prevent communicable diseases, and vaccine researchers are continually working towards expanding targets for communicable and non-communicable diseases through preventive and therapeutic modes. A growing body of evidence on emerging vaccine technologies, trends in disease burden, costs associated with vaccine development and deployment, benefits derived from disease prevention through vaccination, and a range of other factors can inform decision-making and investment in new and improved vaccines and targeted utilization of already existing vaccines. Recognizing that an array of inputs influences these decisions, the strategic multi-attribute ranking method for vaccines (SMART Vaccines 2.0) is in development as a web-based tool, modified from a U.S. Institute of Medicine committee effort (IOM, 2015), to highlight data needs and create transparency to facilitate dialogue and information-sharing among decision-makers and to optimize the investment of resources leading to improved health outcomes. Current development efforts of the SMART Vaccines 2.0 framework seek to generate a weighted recommendation on vaccine development or vaccination priorities based on population, disease, economic, and vaccine-specific data in combination with individual preferences and weights of user-selected attributes incorporating valuations of health, economics, demographics, public concern, scientific and business, programmatic, and political considerations. Further development of the design and utility of the tool is being carried out by the National Vaccine Program Office of the Department of Health and Human Services and the Fogarty International Center of the National Institutes of Health. We aim to demonstrate the utility of SMART Vaccines 2.0 through the engagement of a community of relevant stakeholders and to identify a limited number of pilot projects to determine explicitly defined attribute preferences and the related data and model requirements that are responsive to user needs and able to improve the use of evidence for vaccine-related decision-making and consequential priorities of vaccination options. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Linguistic Multi-Attribute Group Decision Making with Risk Preferences and Its Use in Low-Carbon Tourism Destination Selection.

    PubMed

    Lin, Hui; Wang, Zhou-Jing

    2017-09-17

    Low-carbon tourism plays an important role in carbon emission reduction and environmental protection. Low-carbon tourism destination selection often involves multiple conflicting and incommensurate attributes or criteria and can be modelled as a multi-attribute decision-making problem. This paper develops a framework to solve multi-attribute group decision-making problems, where attribute evaluation values are provided as linguistic terms and the attribute weight information is incomplete. In order to obtain a group risk preference captured by a linguistic term set with triangular fuzzy semantic information, a nonlinear programming model is established on the basis of individual risk preferences. We first convert individual linguistic-term-based decision matrices to their respective triangular fuzzy decision matrices, which are then aggregated into a group triangular fuzzy decision matrix. Based on this group decision matrix and the incomplete attribute weight information, a linear program is developed to find an optimal attribute weight vector. A detailed procedure is devised for tackling linguistic multi-attribute group decision making problems. A low-carbon tourism destination selection case study is offered to illustrate how to use the developed group decision-making model in practice.

  20. Linguistic Multi-Attribute Group Decision Making with Risk Preferences and Its Use in Low-Carbon Tourism Destination Selection

    PubMed Central

    Lin, Hui; Wang, Zhou-Jing

    2017-01-01

    Low-carbon tourism plays an important role in carbon emission reduction and environmental protection. Low-carbon tourism destination selection often involves multiple conflicting and incommensurate attributes or criteria and can be modelled as a multi-attribute decision-making problem. This paper develops a framework to solve multi-attribute group decision-making problems, where attribute evaluation values are provided as linguistic terms and the attribute weight information is incomplete. In order to obtain a group risk preference captured by a linguistic term set with triangular fuzzy semantic information, a nonlinear programming model is established on the basis of individual risk preferences. We first convert individual linguistic-term-based decision matrices to their respective triangular fuzzy decision matrices, which are then aggregated into a group triangular fuzzy decision matrix. Based on this group decision matrix and the incomplete attribute weight information, a linear program is developed to find an optimal attribute weight vector. A detailed procedure is devised for tackling linguistic multi-attribute group decision making problems. A low-carbon tourism destination selection case study is offered to illustrate how to use the developed group decision-making model in practice. PMID:28926985
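
    The core mechanics, converting linguistic terms to triangular fuzzy numbers, aggregating across decision makers, and deriving weights from incomplete information by linear programming, can be sketched as below; the linguistic scale, defuzzification rule, weight constraints, and data are simplified illustrations rather than the paper's models.

```python
# Sketch: linguistic terms -> triangular fuzzy numbers (TFNs), group aggregation,
# centroid defuzzification, and a toy LP for attribute weights under incomplete information.
import numpy as np
from scipy.optimize import linprog

SCALE = {  # linguistic term -> TFN (l, m, u), an assumed 5-level scale
    "VP": (0.0, 0.0, 0.25), "P": (0.0, 0.25, 0.5), "M": (0.25, 0.5, 0.75),
    "G": (0.5, 0.75, 1.0), "VG": (0.75, 1.0, 1.0),
}

def centroid(tfn):
    return sum(tfn) / 3.0

# Two decision makers rate 3 destinations on 2 attributes (illustrative data).
dm1 = [["G", "M"], ["VG", "P"], ["M", "G"]]
dm2 = [["M", "G"], ["G", "M"], ["G", "VG"]]

group = np.zeros((3, 2))
for i in range(3):
    for j in range(2):
        tfns = [SCALE[dm[i][j]] for dm in (dm1, dm2)]
        group[i, j] = np.mean([centroid(t) for t in tfns])   # aggregated crisp score

# Toy LP: maximize the summed weighted score subject to w1 >= w2 and simple bounds.
c = -group.sum(axis=0)                 # linprog minimizes, so negate the objective
A_ub = np.array([[-1.0, 1.0]])         # w2 - w1 <= 0  (i.e. w1 >= w2)
res = linprog(c, A_ub=A_ub, b_ub=[0.0], A_eq=[[1.0, 1.0]], b_eq=[1.0],
              bounds=[(0.2, 0.8), (0.2, 0.8)])
weights = res.x
print("weights:", weights, "ranking:", np.argsort(-(group @ weights)))
```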

  1. ConTour: Data-Driven Exploration of Multi-Relational Datasets for Drug Discovery.

    PubMed

    Partl, Christian; Lex, Alexander; Streit, Marc; Strobelt, Hendrik; Wassermann, Anne-Mai; Pfister, Hanspeter; Schmalstieg, Dieter

    2014-12-01

    Large scale data analysis is nowadays a crucial part of drug discovery. Biologists and chemists need to quickly explore and evaluate potentially effective yet safe compounds based on many datasets that are in relationship with each other. However, there is a lack of tools that support them in these processes. To remedy this, we developed ConTour, an interactive visual analytics technique that enables the exploration of these complex, multi-relational datasets. At its core ConTour lists all items of each dataset in a column. Relationships between the columns are revealed through interaction: selecting one or multiple items in one column highlights and re-sorts the items in other columns. Filters based on relationships enable drilling down into the large data space. To identify interesting items in the first place, ConTour employs advanced sorting strategies, including strategies based on connectivity strength and uniqueness, as well as sorting based on item attributes. ConTour also introduces interactive nesting of columns, a powerful method to show the related items of a child column for each item in the parent column. Within the columns, ConTour shows rich attribute data about the items as well as information about the connection strengths to other datasets. Finally, ConTour provides a number of detail views, which can show items from multiple datasets and their associated data at the same time. We demonstrate the utility of our system in case studies conducted with a team of chemical biologists, who investigate the effects of chemical compounds on cells and need to understand the underlying mechanisms.

  2. A UML-based meta-framework for system design in public health informatics.

    PubMed

    Orlova, Anna O; Lehmann, Harold

    2002-01-01

    The National Agenda for Public Health Informatics calls for standards in data and knowledge representation within public health, which requires a multi-level framework that links all aspects of public health. The literature of public health informatics and public health informatics applications was reviewed. A UML-based systems analysis was performed. Face validity of results was evaluated in analyzing the public health domain of lead poisoning. The core class of the UML-based system of public health is the Public Health Domain, which is associated with multiple Problems, for which Actors provide Perspectives. Actors take Actions that define, generate, utilize and/or evaluate Data Sources. The life cycle of the domain is a sequence of activities attributed to its problems that spirals through multiple iterations and realizations within a domain. The proposed Public Health Informatics Meta-Framework broadens efforts in applying informatics principles to the field of public health.

  3. Compositional profiling and sensorial analysis of multi-wholegrain extruded puffs as affected by fructan inclusion.

    PubMed

    Handa, C; Goomer, S

    2015-09-01

    Rice grits, corn grits, pulse, and the wholegrains finger millet and sorghum were utilized in the production of multigrain extruded puffs using a single screw extruder. The effect of including the fructan fructooligosaccharide in multi-wholegrain (MWG) extruded puffs was examined. MWG fructan-enriched puffs had 450% higher dietary fiber content than the control puff (CP). These puffs can be categorized as a 'Good Source' of fiber, as they supply 17.2% of the DV for fiber. Puffs were rated 8.1 ± 0.6, 8.3 ± 0.7, 8.1 ± 0.6, 7.5 ± 0.5 and 8.2 ± 0.6 for color, flavor, texture, appearance and overall acceptability respectively. The scores for all the attributes were found to be not significantly different (p < 0.05) from CP. The MWG fructan puffs were rated higher on flavor than the CP, having a score of 8.3 ± 0.7 as opposed to 8.2 ± 0.4 for CP. This indicates that the nutritional quality and acceptability of MWG extruded puffs could be improved by the inclusion of fructans.

  4. BURRITO: An Interactive Multi-Omic Tool for Visualizing Taxa–Function Relationships in Microbiome Data

    PubMed Central

    McNally, Colin P.; Eng, Alexander; Noecker, Cecilia; Gagne-Maynard, William C.; Borenstein, Elhanan

    2018-01-01

    The abundance of both taxonomic groups and gene categories in microbiome samples can now be easily assayed via various sequencing technologies, and visualized using a variety of software tools. However, the assemblage of taxa in the microbiome and its gene content are clearly linked, and tools for visualizing the relationship between these two facets of microbiome composition and for facilitating exploratory analysis of their co-variation are lacking. Here we introduce BURRITO, a web tool for interactive visualization of microbiome multi-omic data with paired taxonomic and functional information. BURRITO simultaneously visualizes the taxonomic and functional compositions of multiple samples and dynamically highlights relationships between taxa and functions to capture the underlying structure of these data. Users can browse for taxa and functions of interest and interactively explore the share of each function attributed to each taxon across samples. BURRITO supports multiple input formats for taxonomic and metagenomic data, allows adjustment of data granularity, and can export generated visualizations as static publication-ready formatted figures. In this paper, we describe the functionality of BURRITO, and provide illustrative examples of its utility for visualizing various trends in the relationship between the composition of taxa and functions in complex microbiomes. PMID:29545787

  5. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.

    Background In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family ("Candidatus MH11") composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named "Candidatus Paraporphyromonas polyenzymogenes", illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human-gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion We propose that Ca. P. polyenzymogenes genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep, and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  6. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.

    Abstract. Background In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here in this paper, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family (“Candidatus MH11”) composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results. The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named “Candidatus Paraporphyromonas polyenzymogenes”, illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion. We propose that Ca. P. polyenzymogene genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  7. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE PAGES

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.; ...

    2018-03-01

    Abstract. Background In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here in this paper, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family (“Candidatus MH11”) composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results. The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named “Candidatus Paraporphyromonas polyenzymogenes”, illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion. We propose that Ca. P. polyenzymogene genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  8. "Candidatus Paraporphyromonas polyenzymogenes" encodes multi-modular cellulases linked to the type IX secretion system.

    PubMed

    Naas, A E; Solden, L M; Norbeck, A D; Brewer, H; Hagen, L H; Heggenes, I M; McHardy, A C; Mackie, R I; Paša-Tolić, L; Arntzen, M Ø; Eijsink, V G H; Koropatkin, N M; Hess, M; Wrighton, K C; Pope, P B

    2018-03-01

    In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family ("Candidatus MH11") composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named "Candidatus Paraporphyromonas polyenzymogenes", illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. We propose that Ca. P. polyenzymogene genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  9. Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models

    DTIC Science & Technology

    2002-03-01

    such as the weighted sum method, weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only the weighted sum...different groups. They can be termed deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the...weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most

  10. Identifying the needs of elderly, hearing-impaired persons: the importance and utility of hearing aid attributes.

    PubMed

    Meister, Hartmut; Lausberg, Isabel; Kiessling, Juergen; von Wedel, Hasso; Walger, Martin

    2002-11-01

    Older patients represent the majority of hearing-aid users. The needs of elderly, hearing-impaired subjects are not entirely identified. The present study aims to determine the importance of fundamental hearing-aid attributes and to elicit the utility of associated hypothetical hearing aids for older patients. This was achieved using a questionnaire-based conjoint analysis--a decompositional approach to preference measurement offering a realistic study design. A random sample of 200 experienced hearing-aid users participated in the study. Though three out of the six examined attributes revealed age-related dependencies, the only significant effect was found for the attribute "handling", which was considerably more important for older than younger hearing-aid users. A trend of decreasing importance of speech intelligibility in noise and increasing significance of speech in quiet was observed for subjects older than 70 years. In general, the utility of various hypothetical hearing aids was similar for older and younger subjects. Apart from the attribute "handling", older and younger subjects have comparable needs regarding hearing-aid features. On the basis of the examined attributes, there is no requirement for hearing aids designed specifically for elderly hearing-aid users, provided that ergonomic features are considered and the benefits of modern technology are made fully available for older patients.

  11. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    PubMed

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The result from PCA was statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
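
    The PCA-plus-regression workflow described here can be sketched with synthetic panel data; the attribute matrix and quality scores below are simulated, not the study's sensory data.

```python
# Sketch of PCA on sensory attribute ratings followed by regressing overall quality
# on the principal components; all data are synthetic stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
attributes = rng.normal(size=(30, 6))                    # 30 samples x 6 sensory attributes
overall_quality = attributes @ rng.normal(size=6) + rng.normal(0, 0.3, 30)

pca = PCA(n_components=6).fit(attributes)
scores = pca.transform(attributes)
print("cumulative explained variance:", pca.explained_variance_ratio_.cumsum())

model = LinearRegression().fit(scores, overall_quality)  # quality as a function of components
print("R^2 of quality on principal components:", model.score(scores, overall_quality))
```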

  12. Close Combat Missile Methodology Study

    DTIC Science & Technology

    2010-10-14

    Modeling: Industrial Applications of DEX.” Informatica 23 (1999): 487-491. Bohanec, Marko, Blaz Zupan, and Vladislav Rajkovic. “Applications of...Lisec. “Multi-attribute Decision Analysis in GIS: Weighted Linear Combination and Ordered Weighted Averaging.” Informatica 33 (1999): 459-474

  13. Satisfaction of active duty soldiers with family dental care.

    PubMed

    Chisick, M C

    1997-02-01

    In the fall of 1992, a random, worldwide sample of 6,442 married and single parent soldiers completed a self-administered survey on satisfaction with 22 attributes of family dental care. Simple descriptive statistics for each attribute were derived, as was a composite overall satisfaction score using factor analysis. Composite scores were regressed on demographics, annual dental utilization, and access barriers to identify those factors having an impact on a soldier's overall satisfaction with family dental care. Separate regression models were constructed for single parents, childless couples, and couples with children. Results show below-average satisfaction with nearly all attributes of family dental care, with access attributes having the lowest average satisfaction scores. Factors influencing satisfaction with family dental care varied by family type with one exception: dependent dental utilization within the past year contributed positively to satisfaction across all family types.

  14. [Comparative study on promoting blood effects of Danshen-Honghua herb pair with different preparations based on chemometrics and multi-attribute comprehensive index methods].

    PubMed

    Qu, Cheng; Tang, Yu-Ping; Shi, Xu-Qin; Zhou, Gui-Sheng; Shang, Er-Xin; Shang, Li-Li; Guo, Jian-Ming; Liu, Pei; Zhao, Jing; Zhao, Bu-Chang; Duan, Jin-Ao

    2017-08-01

    To evaluate the promoting blood circulation and removing blood stasis effects of Danshen-Honghua(DH) herb pair with different preparations (alcohol, 50% alcohol and water) on blood rheology and coagulation functions in acute blood stasis rats, and optimize the best preparation method of DH based on principal component analysis(PCA), hierarchical cluster heatmap analysis and multi-attribute comprehensive index methods. Ice water bath and subcutaneous injection of adrenaline were both used to establish the acute blood stasis rat model. Then the blood stasis rats were administrated intragastrically with DH (alcohol, 50% alcohol and water) extracts. The whole blood viscosity(WBV), plasma viscosity(PV), erythrocyte sedimentation rate(ESR) and haematocrit(HCT) were tested to observe the effects of DH herb pair with different preparations and doses on hemorheology of blood stasis rats; the activated partial thromboplastin time(APTT), thrombin time(TT), prothrombin time(PT), and plasma fibrinogen(FIB) were tested to observe the effects of DH herb pair with different preparations on blood coagulation function and platelet aggregation of blood stasis rats. Then PCA, hierarchical cluster heatmap analysis and multi-attribute comprehensive index methods were all used to comprehensively evaluate the total promoting blood circulation and removing blood stasis effects of DH herb pair with different preparations. The hemorheological indexes and coagulation parameters of model group had significant differences with normal blank group. As compared with the model group, the DH herb pair with different preparations at low, middle and high doses could improve the blood hemorheology indexes and coagulation parameters in acute blood stasis rats with dose-effect relation. Based on the PCA, hierarchical cluster heatmap analysis and multi-attribute comprehensive index methods, the high dose group of 50% alcohol extract had the best effect of promoting blood circulation and removing blood stasis. Under the same dose but different preparations, 50% alcohol DH could obviously improve the hemorheology and blood coagulation function in acute blood stasis rats. These results suggested that DH herb pair with different preparations could obviously ameliorate the abnormality of hemorheology and blood coagulation function in acute blood stasis rats, and the optimized preparation of DH herb pair on promoting blood effects was 50% alcohol extract, providing scientific basis for more effective application of the DH herb pair in modern clinic medicine. Copyright© by the Chinese Pharmaceutical Association.

  15. A Preliminary Analysis of Sex Differences in Attributional Patterns and Self-Esteem Levels.

    ERIC Educational Resources Information Center

    Strohkirch, Carolyn Sue; Hargett, Jennifer G.

    A study examined whether there were differences in the ways that undergraduate college students viewed their academic performance. Relationships between sex of student, motivation, self esteem, achievement, and attributional pattern utilized were examined. Subjects (132 female, 104 male) were chosen on a voluntary basis; most were enrolled in a…

  16. Four Vantage Points to the Language Performance and Capacity of Human Beings: Response to Saloviita and Sariola.

    ERIC Educational Resources Information Center

    Niemi, Jussi; Karna-Lin, Eija

    2003-01-01

    This response to EC 633 617, an analysis of a purported case of facilitated communication, stresses the role of linguistic and grammatical analysis of texts attributed to a Finnish man diagnosed with mental retardation and cerebral palsy. It identifies weaknesses in the analysis, urges use of multi-theoretical approaches, and notes the benefits…

  17. [Spatial-temporal evolution characterization of land subsidence by multi-temporal InSAR method and GIS technology].

    PubMed

    Chen, Bei-Bei; Gong, Hui-Li; Li, Xiao-Juan; Lei, Kun-Chao; Duan, Guang-Yao; Xie, Jin-Rong

    2014-04-01

    Long-term over-exploitation of underground resources, together with static and dynamic loads that increase year by year, influences the occurrence and development of regional land subsidence to a certain extent. Using 29 scenes of Envisat ASAR imagery covering the plain area of Beijing, China, the present paper applied a multi-temporal InSAR method incorporating both persistent scatterer and small baseline approaches, and obtained monitoring information on regional land subsidence. Under different situations of space development and utilization, the authors chose five typical settlement areas; with classified land-use information, multi-spectral remote sensing imagery, and geological data, and adopting GIS spatial analysis methods, they analyzed the time-series evolution characteristics of uneven settlement. The comprehensive analysis results suggest that complex situations of space development and utilization affect the trend of uneven settlement: the simpler the situation of space development and utilization, the smaller the settlement gradient and the weaker the uneven settlement trend.

  18. Enriching the national map database for multi-scale use: Introducing the visibilityfilter attribution

    USGS Publications Warehouse

    Stauffer, Andrew J.; Webinger, Seth; Roche, Brittany

    2016-01-01

    The US Geological Survey’s (USGS) National Geospatial Technical Operations Center is prototyping and evaluating the ability to filter data through a range of scales using 1:24,000-scale The National Map (TNM) datasets as the source. A “VisibilityFilter” attribute is under evaluation that can be added to all TNM vector data themes and will permit filtering of data to eight target scales between 1:24,000 and 1:5,000,000, thus defining each feature’s smallest applicable scale-of-use. For a prototype implementation, map specifications for 1:100,000- and 1:250,000-scale USGS Topographic Map Series are being utilized to define feature content appropriate at fixed mapping scales to guide generalization decisions that are documented in a ScaleMaster diagram. This paper defines the VisibilityFilter attribute, the generalization decisions made for each TNM data theme, and how these decisions are embedded into the data to support efficient data filtering.

  19. A Multi Agent-Based Framework for Simulating Household PHEV Distribution and Electric Distribution Network Impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Xiaohui; Liu, Cheng; Kim, Hoe Kyoung

    2011-01-01

    The variation of household attributes such as income, travel distance, age, household member, and education for different residential areas may generate different market penetration rates for plug-in hybrid electric vehicle (PHEV). Residential areas with higher PHEV ownership could increase peak electric demand locally and require utilities to upgrade the electric distribution infrastructure even though the capacity of the regional power grid is under-utilized. Estimating the future PHEV ownership distribution at the residential household level can help us understand the impact of PHEV fleet on power line congestion, transformer overload and other unforeseen problems at the local residential distribution network level. It can also help utilities manage the timing of recharging demand to maximize load factors and utilization of existing distribution resources. This paper presents a multi agent-based simulation framework for 1) modeling spatial distribution of PHEV ownership at local residential household level, 2) discovering PHEV hot zones where PHEV ownership may quickly increase in the near future, and 3) estimating the impacts of the increasing PHEV ownership on the local electric distribution network with different charging strategies. In this paper, we use Knox County, TN as a case study to show the simulation results of the agent-based model (ABM) framework. However, the framework can be easily applied to other local areas in the US.

  20. Consumer preferences and brand equity measurement of Spanish national daily newspapers: a conjoint analysis approach.

    PubMed

    Varela Mallou, J; Rial Boubeta, A; Braña Tobío, T

    2001-05-01

    Brand is a product attribute that, for many types of goods or services, makes a major contribution to consumer preferences. Conjoint analysis is a useful technique for the assessment of brand values for a given consumer or group of consumers. In this paper, an application of conjoint analysis to the estimation of brand values in the Spanish daily newspaper market is reported. Four newspaper attributes were considered: brand (i.e., newspaper name), price (0.60, 1.05, or 1.50 euros), Sunday supplement (yes/no), and daily pullout (yes/no). A total of 510 regular readers of the national press, stratified by age and sex, were asked to rank 16 profiles representing an orthogonal fraction of the possible attribute-level combinations. Brand was by far the most important attribute, whereas price had negligible effect. More generally, the results confirm the utility of conjoint analysis for assessing brand equity in the newspaper market and for estimating the relative importance of the various attributes to different subgroups of consumers.

  1. Graduate Attribute Attainment in a Multi-Level Undergraduate Geography Course

    ERIC Educational Resources Information Center

    Mager, Sarah; Spronken-Smith, Rachel

    2014-01-01

    We investigated students' perceptions of graduate attributes in a multi-level (second and third year) geography course. A case study with mixed methodology was employed, with data collected through focus groups and a survey. We found that undergraduate geography students can identify the skills, knowledge and attributes that are developed through…

  2. Using Technical Performance Measures

    NASA Technical Reports Server (NTRS)

    Garrett, Christopher J.; Levack, Daniel J. H.; Rhodes, Russel E.

    2011-01-01

    All programs have requirements. For these requirements to be met, there must be a means of measurement. A Technical Performance Measure (TPM) is defined to produce a measured quantity that can be compared to the requirement. In practice, the TPM is often expressed as a maximum or minimum and a goal. Example TPMs for a rocket program are: vacuum or sea level specific impulse (Isp), weight, reliability (often expressed as a failure rate), schedule, operability (turn-around time), design and development cost, production cost, and operating cost. Program status is evaluated by comparing the TPMs against specified values of the requirements. During the program many design decisions are made and most of them affect some or all of the TPMs. Often, the same design decision changes some TPMs favorably while affecting other TPMs unfavorably. The problem then becomes how to compare the effects of a design decision on different TPMs. How much failure rate is one second of specific impulse worth? How many days of schedule is one pound of weight worth? In other words, how can dissimilar quantities be compared in order to trade and manage the TPMs to meet all requirements? One method that has been used successfully and has a mathematical basis is Utility Analysis. Utility Analysis enables quantitative comparison among dissimilar attributes. It uses a mathematical model that maps decision maker preferences over the tradeable range of each attribute. It is capable of modeling both independent and dependent attributes. Utility Analysis is well supported in the literature on Decision Theory. It has been used at Pratt & Whitney Rocketdyne for internal programs and for contracted work such as the J-2X rocket engine program. This paper describes the construction of TPMs and describes Utility Analysis. It then discusses the use of TPMs in design trades and in managing margin during a program using Utility Analysis.
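
    The record describes Utility Analysis as mapping decision-maker preferences over the tradeable range of each attribute and combining them so that dissimilar TPMs can be traded against one another. A minimal sketch of an additive multi-attribute utility aggregation in that spirit follows, assuming independent attributes and linear single-attribute utilities; the TPM names, ranges, weights, and candidate designs are illustrative assumptions, not values from the paper.

        # Additive multi-attribute utility sketch for comparing TPM trade-offs.
        # Utility curves, weights, and candidate designs are illustrative assumptions.

        def linear_utility(value, worst, best):
            """Map a TPM value onto [0, 1]; works whether higher or lower is better."""
            u = (value - worst) / (best - worst)
            return max(0.0, min(1.0, u))

        # Tradeable ranges as (worst acceptable, goal); lower is better for weight and failure rate.
        ranges = {"isp_s": (440.0, 452.0), "weight_kg": (2600.0, 2400.0), "failures_per_1000": (5.0, 1.0)}
        weights = {"isp_s": 0.4, "weight_kg": 0.3, "failures_per_1000": 0.3}  # sum to 1

        designs = {
            "design_A": {"isp_s": 448.0, "weight_kg": 2550.0, "failures_per_1000": 3.0},
            "design_B": {"isp_s": 445.0, "weight_kg": 2460.0, "failures_per_1000": 2.0},
        }

        for name, tpms in designs.items():
            total = sum(weights[a] * linear_utility(tpms[a], *ranges[a]) for a in weights)
            print(name, round(total, 3))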

  3. An assessment of advanced technology for industrial cogeneration

    NASA Technical Reports Server (NTRS)

    Moore, N.

    1983-01-01

    The potential of advanced fuel utilization and energy conversion technologies to enhance the outlook for the increased use of industrial cogeneration was assessed. The attributes of advanced cogeneration systems that served as the basis for the assessment included their fuel flexibility and potential for low emissions, efficiency of fuel or energy utilization, capital equipment and operating costs, and state of technological development. Over thirty advanced cogeneration systems were evaluated. These cogeneration system options were based on Rankine cycle, gas turbine engine, reciprocating engine, Stirling engine, and fuel cell energy conversion systems. The alternatives for fuel utilization included atmospheric and pressurized fluidized bed combustors, gasifiers, conventional combustion systems, alternative energy sources, and waste heat recovery. Two advanced cogeneration systems with mid-term (3 to 5 year) potential were found to offer low emissions, multi-fuel capability, and a low cost of producing electricity. Both advanced cogeneration systems are based on conventional gas turbine engine/exhaust heat recovery technology; however, they incorporate advanced fuel utilization systems.

  4. Visible Korean human images on MIOS system

    NASA Astrophysics Data System (ADS)

    Har, Donghwan; Son, Young-Ho; Lee, Sung-Won; Lee, Jung Beom

    2004-05-01

    Basically, photography combines the attributes of reason, which encompasses the scientific knowledge of optics, physics and chemistry, with the delicate sensibility of individuals. Ultimately, the photograph pursues "effective communication." Communication is "mental and psychosocial exchange mediated by material symbols, such as language, gesture and picture," and it has four components: sender, receiver, message and channel. Recently, a change in the communication method is on the rise in the field of art and culture, including photography. Until now, communication was mainly achieved in the form of messages unilaterally transferred from senders to receivers, but nowadays an interactive method, in which the boundary between sender and receiver is blurred, is on the increase. This new communication method may be said to have arisen from the desire of art and culture communities to pursue something new and creative against the background of a wide variety of information media. The multi-view screen we developed is also a communication tool capable of effective interaction using photos or motion pictures. The viewer can see different images at different locations. It utilizes the basic lenticular characteristics that have long been used in printing. Each motion picture is displayed on the screen without crosstalk. The multi-view screen differs in many aspects from other display media and is expected to be utilized in many fields, including advertisement, display and education.

  5. Application of Visual Attention in Seismic Attribute Analysis

    NASA Astrophysics Data System (ADS)

    He, M.; Gu, H.; Wang, F.

    2016-12-01

    It has been proved that seismic attributes can be used to predict reservoir properties. The combination of multi-attribute analysis with geostatistics, data mining and artificial intelligence has further promoted the development of seismic attribute analysis. However, the existing methods tend to have multiple solutions and insufficient generalization ability, which is mainly due to the complex relationship between seismic data and geological information, and partly due to the methods applied. Visual attention is a model of the mechanism by which the human visual system concentrates rapidly on a few significant visual objects, even in a cluttered scene; the model therefore has good target detection and recognition ability. In our study, the targets to be predicted are treated as visual objects, and an object representation based on well data is built in the attribute dimensions. In the same attribute space, this representation then serves as a criterion to search for potential targets away from the wells. The method does not predict properties by building a complicated relation between attributes and reservoir properties, but by reference to a standard determined beforehand, so it has good generalization ability, and the problem of multiple solutions can be weakened by defining a similarity threshold.
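
    The search step can be sketched directly from the description above: a target signature is formed from attribute vectors at wells and then compared, in the same attribute space, against candidate locations using a similarity threshold. The choice of cosine similarity, the attribute values, and the threshold are illustrative assumptions rather than the authors' exact procedure.

        import numpy as np

        # Reference signature: mean seismic-attribute vector at wells that penetrate the target.
        well_attributes = np.array([[0.82, 0.10, 1.30],
                                    [0.78, 0.12, 1.10],
                                    [0.85, 0.09, 1.40]])
        reference = well_attributes.mean(axis=0)

        # Candidate attribute vectors away from the wells (illustrative values).
        candidates = np.array([[0.80, 0.11, 1.20],
                               [0.20, 0.55, 0.40],
                               [0.83, 0.10, 1.35]])

        def cosine_similarity(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        threshold = 0.99  # only locations close to the well-based signature are flagged as targets
        flags = [cosine_similarity(reference, c) >= threshold for c in candidates]
        print(flags)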

  6. DLA Systems Modernization Methodology: Logical Analysis and Design Procedures

    DTIC Science & Technology

    1990-07-01

    …an Information Requirement would have little meaning and thus would lose its value. … 1.1.3 INPUT PRODUCTS: 1.1.3.1 Enterprise Model Objective List … at the same time, the attribute is said to be multi-valued. For example, an E-R model may contain information on the languages an employee speaks… Relationship model is examined in detail to ensure that each data group contains attributes whose values are absolutely determined by their respective…

  7. Multi-level analysis in information systems research: the case of enterprise resource planning system usage in China

    NASA Astrophysics Data System (ADS)

    Sun, Yuan; Bhattacherjee, Anol

    2011-11-01

    Information technology (IT) usage within organisations is a multi-level phenomenon that is influenced by individual-level and organisational-level variables. Yet, current theories, such as the unified theory of acceptance and use of technology, describe IT usage as solely an individual-level phenomenon. This article postulates a model of organisational IT usage that integrates salient organisational-level variables such as user training, top management support and technical support within an individual-level model to postulate a multi-level model of IT usage. The multi-level model was then empirically validated using multi-level data collected from 128 end users and 26 managers in 26 firms in China regarding their use of enterprise resource planning systems and analysed using the multi-level structural equation modelling (MSEM) technique. We demonstrate the utility of MSEM analysis of multi-level data relative to the more common structural equation modelling analysis of single-level data and show how single-level data can be aggregated to approximate multi-level analysis when multi-level data collection is not possible. We hope that this article will motivate future scholars to employ multi-level data and multi-level analysis for understanding organisational phenomena that are truly multi-level in nature.

  8. Coupling Meteorology, Metal Concentrations, and Pb Isotopes for Source Attribution in Archived Precipitation Samples

    EPA Science Inventory

    A technique that couples lead (Pb) isotopes and multi-element concentrations with meteorological analysis was used to assess source contributions to precipitation samples at the Bondville, Illinois USA National Trends Network (NTN) site. Precipitation samples collected over a 16 ...

  9. Inhalation Exposure and Lung Dose Analysis of Multi-mode Complex Ambient Aerosols

    EPA Science Inventory

    Rationale: Ambient aerosols are complex mixture of particles with different size, shape and chemical composition. Although they are known to cause health hazard, it is not fully understood about causal mechanisms and specific attributes of particles causing the effects. Internal ...

  10. Evaluate the use of tanning agent in leather industry using material flow analysis, life cycle assessment and fuzzy multi-attribute decision making (FMADM)

    NASA Astrophysics Data System (ADS)

    Alfarisi, Salman; Sutono, Sugoro Bhakti; Sutopo, Wahyudi

    2017-11-01

    The tanning industry is one of the industries that produce many pollutants and have a negative impact on the environment. In the leather tanning production process, the use of input materials needs to be evaluated, since the resulting waste harms not only the environment but also human health. In this study, the impact of mimosa as a vegetable tanning agent is evaluated, and alternative solutions for improving the use of vegetable tanning agents are provided; the alternatives replace mimosa with indusol, gambier or dulcotan. Several aspects of vegetable tanning are evaluated using material flow analysis and a life cycle assessment approach. Life cycle assessment (LCA) is used to evaluate the environmental impact of the vegetable tanning agents, and the alternative solution is selected using a fuzzy multi-attribute decision making (FMADM) approach. Considering the environment, human toxicity, climate change and marine aquatic ecotoxicity, the result obtained is to use dulcotan.

  11. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    PubMed

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has been gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required of end users for signcryption and designcryption grows linearly with the complexity of the signing and encryption access policy. Moreover, only a single authority responsible for attribute management and key generation exists in previously proposed ABSC schemes, whereas in reality different authorities usually monitor different attributes of a user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.

  12. Discovering transcription factor binding sites in highly repetitive regions of genomes with multi-read analysis of ChIP-Seq data.

    PubMed

    Chung, Dongjun; Kuan, Pei Fen; Li, Bo; Sanalkumar, Rajendran; Liang, Kun; Bresnick, Emery H; Dewey, Colin; Keleş, Sündüz

    2011-07-01

    Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) is rapidly replacing chromatin immunoprecipitation combined with genome-wide tiling array analysis (ChIP-chip) as the preferred approach for mapping transcription-factor binding sites and chromatin modifications. The state of the art for analyzing ChIP-seq data relies on using only reads that map uniquely to a relevant reference genome (uni-reads). This can lead to the omission of up to 30% of alignable reads. We describe a general approach for utilizing reads that map to multiple locations on the reference genome (multi-reads). Our approach is based on allocating multi-reads as fractional counts using a weighted alignment scheme. Using human STAT1 and mouse GATA1 ChIP-seq datasets, we illustrate that incorporation of multi-reads significantly increases sequencing depths, leads to detection of novel peaks that are not otherwise identifiable with uni-reads, and improves detection of peaks in mappable regions. We investigate various genome-wide characteristics of peaks detected only by utilization of multi-reads via computational experiments. Overall, peaks from multi-read analysis have similar characteristics to peaks that are identified by uni-reads except that the majority of them reside in segmental duplications. We further validate a number of GATA1 multi-read only peaks by independent quantitative real-time ChIP analysis and identify novel target genes of GATA1. These computational and experimental results establish that multi-reads can be of critical importance for studying transcription factor binding in highly repetitive regions of genomes with ChIP-seq experiments.
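
    The core idea of allocating multi-reads as fractional counts can be sketched very simply: each alignment of a read receives a weight, and the read contributes to each location in proportion to that weight. The sketch below normalises per-read alignment scores as the weights, which is a simplification of the authors' weighted alignment scheme; the read names, loci, and scores are invented.

        from collections import defaultdict

        # Each read maps to one or more genomic locations with an alignment score (illustrative).
        alignments = {
            "read1": [("chr1:1000", 60.0)],                       # uni-read
            "read2": [("chr1:1000", 40.0), ("chr5:7000", 40.0)],  # multi-read, two equal hits
            "read3": [("chr5:7000", 50.0), ("chr9:300", 25.0)],   # multi-read, unequal hits
        }

        coverage = defaultdict(float)
        for read, hits in alignments.items():
            total = sum(score for _, score in hits)
            for locus, score in hits:
                coverage[locus] += score / total  # fractional count proportional to alignment weight

        for locus, count in sorted(coverage.items()):
            print(locus, round(count, 2))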

  13. Seismic Reservoir Characterization for Assessment of CO2 EOR at the Mississippian Reservoir in South-Central Kansas

    NASA Astrophysics Data System (ADS)

    Tsoflias, G. P.; Graham, B.; Haga, L.; Watney, L.

    2017-12-01

    The Mississippian in Kansas and Oklahoma is a highly heterogeneous, fractured, oil producing reservoir with thickness typically below seismic resolution. At Wellington field in south-central Kansas CO2 was injected in the Mississippian reservoir for enhanced oil recovery. This study examines the utility of active source surface seismic for characterization of Mississippian reservoir properties and monitoring CO2. Analysis of post-stack 3D seismic data showed the expected response of a gradational transition (ramp velocity) where thicker reservoir units corresponded with lower reflection amplitudes, lower frequency and a 90° phase change. Reflection amplitude could be correlated to reservoir thickness. Pre-stack gather analysis showed that porosity zones of the Mississippian reservoir exhibit characteristic AVO response. Simultaneous AVO inversion estimated P- and S-Impedances, which along with formation porosity logs and post-stack seismic data attributes were incorporated in multi-attribute linear-regression analysis and predicted reservoir porosity with an overall correlation of 0.90 to well data. The 3D survey gather azimuthal anisotropy analysis (AVAZ) provided information on the fault and fracture network and showed good agreement to the regional stress field and well data. Mississippian reservoir porosity and fracture predictions agreed well with the observed mobility of the CO2 in monitoring wells. Fluid substitution modeling predicted acoustic impedance reduction in the Mississippian carbonate reservoir introduced by the presence of CO2. Future work includes the assessment of time-lapse seismic, acquired after the injection of CO2. This work demonstrates that advanced seismic interpretation methods can be used successfully for characterization of the Mississippian reservoir and monitoring of CO2.
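
    The multi-attribute porosity prediction step described above amounts to fitting a regression between attributes extracted at well locations and the porosity logs, then applying it across the seismic volume. A minimal sketch under that assumption is given below using scikit-learn's linear regression; the attribute matrix and porosity values are synthetic stand-ins, not data from the Wellington field study.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)

        # Rows: samples at well locations; columns: P-impedance, S-impedance, amplitude (illustrative).
        X_wells = rng.normal(size=(50, 3))
        porosity = 0.15 + 0.03 * X_wells[:, 0] - 0.02 * X_wells[:, 1] + rng.normal(scale=0.005, size=50)

        model = LinearRegression().fit(X_wells, porosity)
        print("correlation (R^2) at wells:", round(model.score(X_wells, porosity), 2))

        # Predict porosity for attribute vectors extracted away from the wells.
        X_volume = rng.normal(size=(4, 3))
        print(model.predict(X_volume))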

  14. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.

  15. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2010-10-01

    …large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer… research, we designed a pilot study utilizing large-scale parallel Grid computing harnessing nationwide infrastructure for medical image analysis. Also…

  16. Multi-model attribution of upper-ocean temperature changes using an isothermal approach.

    PubMed

    Weller, Evan; Min, Seung-Ki; Palmer, Matthew D; Lee, Donghyun; Yim, Bo Young; Yeh, Sang-Wook

    2016-06-01

    Both air-sea heat exchanges and changes in ocean advection have contributed to observed upper-ocean warming most evident in the late-twentieth century. However, it is predominantly via changes in air-sea heat fluxes that human-induced climate forcings, such as increasing greenhouse gases, and other natural factors such as volcanic aerosols, have influenced global ocean heat content. The present study builds on previous work using two different indicators of upper-ocean temperature changes for the detection of both anthropogenic and natural external climate forcings. Using simulations from phase 5 of the Coupled Model Intercomparison Project, we compare mean temperatures above a fixed isotherm with the more widely adopted approach of using a fixed depth. We present the first multi-model ensemble detection and attribution analysis using the fixed isotherm approach to robustly detect both anthropogenic and natural external influences on upper-ocean temperatures. Although contributions from multidecadal natural variability cannot be fully removed, both the large multi-model ensemble size and properties of the isotherm analysis reduce internal variability of the ocean, resulting in better observation-model comparison of temperature changes since the 1950s. We further show that the high temporal resolution afforded by the isotherm analysis is required to detect natural external influences such as volcanic cooling events in the upper-ocean because the radiative effect of volcanic forcings is short-lived.

  17. Multi-model attribution of upper-ocean temperature changes using an isothermal approach

    NASA Astrophysics Data System (ADS)

    Weller, Evan; Min, Seung-Ki; Palmer, Matthew D.; Lee, Donghyun; Yim, Bo Young; Yeh, Sang-Wook

    2016-06-01

    Both air-sea heat exchanges and changes in ocean advection have contributed to observed upper-ocean warming most evident in the late-twentieth century. However, it is predominantly via changes in air-sea heat fluxes that human-induced climate forcings, such as increasing greenhouse gases, and other natural factors such as volcanic aerosols, have influenced global ocean heat content. The present study builds on previous work using two different indicators of upper-ocean temperature changes for the detection of both anthropogenic and natural external climate forcings. Using simulations from phase 5 of the Coupled Model Intercomparison Project, we compare mean temperatures above a fixed isotherm with the more widely adopted approach of using a fixed depth. We present the first multi-model ensemble detection and attribution analysis using the fixed isotherm approach to robustly detect both anthropogenic and natural external influences on upper-ocean temperatures. Although contributions from multidecadal natural variability cannot be fully removed, both the large multi-model ensemble size and properties of the isotherm analysis reduce internal variability of the ocean, resulting in better observation-model comparison of temperature changes since the 1950s. We further show that the high temporal resolution afforded by the isotherm analysis is required to detect natural external influences such as volcanic cooling events in the upper-ocean because the radiative effect of volcanic forcings is short-lived.

  18. Technology selection for ballast water treatment by multi-stakeholders: A multi-attribute decision analysis approach based on the combined weights and extension theory.

    PubMed

    Ren, Jingzheng

    2018-01-01

    The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, a subjective weighting method, and the Criteria Importance Through Inter-criteria Correlation (CRITIC) method, an objective weighting method, were combined to determine the weights of the evaluation criteria. Extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied with the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combination coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The weighted sum method and TOPSIS were also employed to rank the four technologies, and the results determined by these two methods are consistent with those determined by the proposed method in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
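
    A minimal sketch of the weight-combination and ranking idea follows: a subjective weight vector and an objective weight vector are blended with a combination coefficient and then applied in a weighted-sum ranking. The extension-theory grading step is omitted, the CRITIC-style objective weights are computed in a simplified way, and the decision matrix and coefficient are illustrative values, not the paper's data.

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(4, 8))  # 4 treatment technologies x 8 criteria, normalised to [0, 1]

        w_subjective = np.full(8, 1 / 8)          # stand-in for Best-Worst-method weights
        std = X.std(axis=0)
        corr = np.corrcoef(X, rowvar=False)
        critic = std * (1.0 - corr).sum(axis=0)   # CRITIC idea: contrast intensity x conflict
        w_objective = critic / critic.sum()

        alpha = 0.5                               # combination coefficient between the two weightings
        w = alpha * w_subjective + (1 - alpha) * w_objective

        scores = X @ w
        ranking = np.argsort(-scores)
        print("combined weights:", np.round(w, 3))
        print("ranking (best first):", ["T%d" % (i + 1) for i in ranking])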

  19. Comparative Evaluation of Financing Programs: Insights From California’s Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deason, Jeff

    Berkeley Lab examines criteria for a comparative assessment of multiple financing programs for energy efficiency, developed through a statewide public process in California. The state legislature directed the California Alternative Energy and Advanced Transportation Financing Authority (CAEATFA) to develop these criteria. CAEATFA's report to the legislature, an invaluable reference for other jurisdictions considering these topics, discusses the proposed criteria and the rationales behind them in detail. Berkeley Lab's brief focuses on several salient issues that emerged during the criteria development and discussion process. Many of these issues are likely to arise in other states that plan to evaluate the impacts of energy efficiency financing programs, whether for a single program or multiple programs. Issues discussed in the brief include: the stakeholder process used to develop the proposed assessment criteria; attribution of outcomes, such as energy savings, to financing programs versus other drivers; choosing the outcome metric of primary interest (program take-up levels versus savings); the use of net benefits versus benefit-cost ratios for cost-effectiveness evaluation; non-energy factors; consumer protection factors; market transformation impacts; accommodating varying program goals in a multi-program evaluation; accounting for costs and risks borne by various parties, including taxpayers and utility customers, in cost-effectiveness analysis; and how to account for potential synergies among programs in a multi-program evaluation.

  20. Multi-MW Closed Cycle MHD Nuclear Space Power Via Nonequilibrium He/Xe Working Plasma

    NASA Technical Reports Server (NTRS)

    Litchford, Ron J.; Harada, Nobuhiro

    2011-01-01

    Prospects for a low specific mass multi-megawatt nuclear space power plant were examined assuming closed cycle coupling of a high-temperature fission reactor with magnetohydrodynamic (MHD) energy conversion and utilization of a nonequilibrium helium/xenon frozen inert plasma (FIP). Critical evaluation of performance attributes and specific mass characteristics was based on a comprehensive systems analysis assuming a reactor operating temperature of 1800 K for a range of subsystem mass properties. Total plant efficiency was expected to be 55.2% including plasma pre-ionization power, and the effects of compressor stage number, regenerator efficiency and radiation cooler temperature on plant efficiency were assessed. Optimal specific mass characteristics were found to be dependent on overall power plant scale with 3 kg/kWe being potentially achievable at a net electrical power output of 1-MWe. This figure drops to less than 2 kg/kWe when power output exceeds 3 MWe. Key technical issues include identification of effective methods for non-equilibrium pre-ionization and achievement of frozen inert plasma conditions within the MHD generator channel. A three-phase research and development strategy is proposed encompassing Phase-I Proof of Principle Experiments, a Phase-II Subscale Power Generation Experiment, and a Phase-III Closed-Loop Prototypical Laboratory Demonstration Test.

  1. Stochastic and Statistical Analysis of Utility Revenues and Weather Data Analysis for Consumer Demand Estimation in Smart Grids

    PubMed Central

    Ali, S. M.; Mehmood, C. A; Khan, B.; Jawad, M.; Farid, U; Jadoon, J. K.; Ali, M.; Tareen, N. K.; Usman, S.; Majid, M.; Anwar, S. M.

    2016-01-01

    In the smart grid paradigm, consumer demands are random and time-dependent, exhibiting stochastic behaviour. These stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the deterministic and stochastic consumer demand models. Sudden drifts in weather parameters affect the living standards of consumers, which in turn influence power demands. Considering the above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of electrical utilities. Our work presents a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of utility revenues with time-dependent random consumer demands, in which the Gaussian probability outcomes of the utility revenues are based on the varying consumer demand data patterns. Furthermore, Standard Monte Carlo (SMC) simulations were performed to validate the accuracy of the probabilistic demand-revenue model. We also critically analyzed the effect of weather parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provides a relationship between the dependent variable (demand) and the independent variables (weather data) that can be used for utility load management, generation control, and network expansion. PMID:27314229

  2. Stochastic and Statistical Analysis of Utility Revenues and Weather Data Analysis for Consumer Demand Estimation in Smart Grids.

    PubMed

    Ali, S M; Mehmood, C A; Khan, B; Jawad, M; Farid, U; Jadoon, J K; Ali, M; Tareen, N K; Usman, S; Majid, M; Anwar, S M

    2016-01-01

    In the smart grid paradigm, consumer demands are random and time-dependent, exhibiting stochastic behaviour. These stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the deterministic and stochastic consumer demand models. Sudden drifts in weather parameters affect the living standards of consumers, which in turn influence power demands. Considering the above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of electrical utilities. Our work presents a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of utility revenues with time-dependent random consumer demands, in which the Gaussian probability outcomes of the utility revenues are based on the varying consumer demand data patterns. Furthermore, Standard Monte Carlo (SMC) simulations were performed to validate the accuracy of the probabilistic demand-revenue model. We also critically analyzed the effect of weather parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provides a relationship between the dependent variable (demand) and the independent variables (weather data) that can be used for utility load management, generation control, and network expansion.
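
    The revenue model described above can be sketched as a Monte Carlo exercise: correlated class-level demands are drawn from a multivariate Gaussian and pushed through a simple fixed-plus-variable revenue function. The demand means, covariance, and tariff below are illustrative assumptions, not the parameters used in the paper.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hourly demand (MWh) for three consumer classes as a correlated multivariate Gaussian.
        mean = np.array([120.0, 80.0, 40.0])
        cov = np.array([[90.0, 30.0, 10.0],
                        [30.0, 60.0,  8.0],
                        [10.0,  8.0, 25.0]])

        fixed_revenue = 5000.0                  # fixed charges per hour, independent of demand
        tariff = np.array([45.0, 50.0, 55.0])   # price per MWh for each consumer class

        samples = rng.multivariate_normal(mean, cov, size=100_000)
        revenue = fixed_revenue + samples @ tariff  # variable revenue follows the random demand

        print("expected revenue:", round(revenue.mean(), 1))
        print("5th-95th percentile:", np.percentile(revenue, [5, 95]).round(1))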

  3. The importance of accurate muscle modelling for biomechanical analyses: a case study with a lizard skull

    PubMed Central

    Gröning, Flora; Jones, Marc E. H.; Curtis, Neil; Herrel, Anthony; O'Higgins, Paul; Evans, Susan E.; Fagan, Michael J.

    2013-01-01

    Computer-based simulation techniques such as multi-body dynamics analysis are becoming increasingly popular in the field of skull mechanics. Multi-body models can be used for studying the relationships between skull architecture, muscle morphology and feeding performance. However, to be confident in the modelling results, models need to be validated against experimental data, and the effects of uncertainties or inaccuracies in the chosen model attributes need to be assessed with sensitivity analyses. Here, we compare the bite forces predicted by a multi-body model of a lizard (Tupinambis merianae) with in vivo measurements, using anatomical data collected from the same specimen. This subject-specific model predicts bite forces that are very close to the in vivo measurements and also shows a consistent increase in bite force as the bite position is moved posteriorly on the jaw. However, the model is very sensitive to changes in muscle attributes such as fibre length, intrinsic muscle strength and force orientation, with bite force predictions varying considerably when these three variables are altered. We conclude that accurate muscle measurements are crucial to building realistic multi-body models and that subject-specific data should be used whenever possible. PMID:23614944

  4. Efficiently Multi-User Searchable Encryption Scheme with Attribute Revocation and Grant for Cloud Storage

    PubMed Central

    Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling

    2016-01-01

    Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption schemes focus on the problem of quickly finding the files a user is interested in within cloud storage. Designing a scheme that is both searchable and attribute-based is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme the attribute revocation and grant processes are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, a keyword search function is achieved in our proposed scheme. The security of our proposed scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven to be secure under the security model of indistinguishability against selective ciphertext-policy and chosen plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen keyword attack (IND-CKA) in the random oracle model. PMID:27898703

  5. Efficiently Multi-User Searchable Encryption Scheme with Attribute Revocation and Grant for Cloud Storage.

    PubMed

    Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling

    2016-01-01

    Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption schemes focus on the problem of quickly finding the files a user is interested in within cloud storage. Designing a scheme that is both searchable and attribute-based is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme the attribute revocation and grant processes are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, a keyword search function is achieved in our proposed scheme. The security of our proposed scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven to be secure under the security model of indistinguishability against selective ciphertext-policy and chosen plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen keyword attack (IND-CKA) in the random oracle model.

  6. COSTS OF CHILDHOOD ASTHMA DUE TO TRAFFIC-RELATED POLLUTION IN TWO CALIFORNIA COMMUNITIES

    PubMed Central

    Brandt, Sylvia J.; Perez, Laura; Künzli, Nino; Lurmann, Fred; McConnell, Rob

    2015-01-01

    Recent research suggests the burden of childhood asthma attributable to air pollution has been underestimated in traditional risk assessments, and there are no estimates of these associated costs. We estimated the yearly childhood asthma-related costs attributable to air pollution for Riverside and Long Beach, California, including: 1) the indirect and direct costs of health care utilization due to asthma exacerbations linked to traffic-related pollution (TRP); and 2) the costs of health care for asthma cases attributable to local TRP exposure. We estimated these costs using estimates from peer-reviewed literature and the authors' analysis of surveys (Medical Expenditure Panel Survey, California Health Interview Survey, National Household Travel Survey, and Health Care Utilization Project). A lower-bound estimate of the asthma burden attributable to air pollution was $18 million yearly. Asthma cases attributable to TRP exposure accounted for almost half of this cost. The cost of bronchitic episodes was a major proportion of both the annual cost of asthma cases attributable to TRP and of pollution-linked exacerbations. Traditional risk assessment methods underestimate both the burden of disease and cost of asthma associated with air pollution, and these costs are borne disproportionately by communities with higher than average TRP. PMID:22267764

  7. Hierarchical competitions subserving multi-attribute choice

    PubMed Central

    Hunt, Laurence T; Dolan, Raymond J; Behrens, Timothy EJ

    2015-01-01

    Valuation is a key tenet of decision neuroscience, where it is generally assumed that different attributes of competing options are assimilated into unitary values. Such values are central to current neural models of choice. By contrast, psychological studies emphasize complex interactions between choice and valuation. Principles of neuronal selection also suggest competitive inhibition may occur in early valuation stages, before option selection. Here, we show behavior in multi-attribute choice is best explained by a model involving competition at multiple levels of representation. This hierarchical model also explains neural signals in human brain regions previously linked to valuation, including striatum, parietal and prefrontal cortex, where activity represents competition within-attribute, competition between attributes, and option selection. This multi-layered inhibition framework challenges the assumption that option values are computed before choice. Instead our results indicate a canonical competition mechanism throughout all stages of a processing hierarchy, not simply at a final choice stage. PMID:25306549

  8. Illness Perception of Patients with Functional Gastrointestinal Disorders.

    PubMed

    Xiong, Na-Na; Wei, Jing; Ke, Mei-Yun; Hong, Xia; Li, Tao; Zhu, Li-Ming; Sha, Yue; Jiang, Jing; Fischer, Felix

    2018-01-01

    To investigate the illness perception characteristics of Chinese patients with functional gastrointestinal disorders (FGID), and the mediating role of illness perception between symptoms, psychopathology, and clinical outcomes. Six illness groups from four outpatient departments of a general hospital in China were recruited, including the FGID patient group. The modified and validated Chinese version of the illness perception questionnaire-revised was utilized, which contained three sections: symptom identity, illness representation, and causes. The 12-item short-form health survey was utilized to reflect physical and mental health-related quality of life (HRQoL). The Toronto alexithymia scale was used to measure the severity of alexithymia. An additional behavioral outcome, the frequency of doctor visits in the past 12 months, was measured. Pathway analyses with multiple-group comparisons were conducted to test the mediating role of illness perception. Overall, 600 patients were recruited. The illness perceptions of FGID patients were characterized by broad non-gastrointestinal symptoms (6.8 ± 4.2), a negative illness representation (more chronic course, worse consequences, lower personal and treatment control, lower illness coherence, and heavier emotional distress), and high numbers of psychological and culture-specific attributions. Fit indices of the three hypothesized path models (for physical and mental HRQoL and doctor-visit frequency, respectively) supported the mediating role of illness perceptions. For example, the severity of alexithymia and non-gastrointestinal symptoms had significant negative effects on mental quality of life through both direct (standardized effects: -0.085 and -0.233) and indirect (standardized effects: -0.045 and -0.231) influence via the subscales of consequences, emotional representation, and psychological and risk factor attributions. Multi-group confirmatory factor analysis showed similar psychometric properties for FGID patients and the other disease group. The management of FGID patients should take into consideration dysfunctional illness perceptions, non-gastrointestinal symptoms, and emotion regulation.

  9. A Structured Decision Approach for Integrating and Analyzing Community Perspectives in Re-Use Planning of Vacant Properties in Cleveland, Ohio

    EPA Science Inventory

    An integrated GIS-based, multi-attribute decision model deployed in a web-based platform is presented enabling an iterative, spatially explicit and collaborative analysis of relevant and available information for repurposing vacant land. The process incorporated traditional and ...

  10. Averaging Models: Parameters Estimation with the R-Average Procedure

    ERIC Educational Resources Information Center

    Vidotto, G.; Massidda, D.; Noventa, S.

    2010-01-01

    The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…

  11. A consensus reaching model for 2-tuple linguistic multiple attribute group decision making with incomplete weight information

    NASA Astrophysics Data System (ADS)

    Zhang, Wancheng; Xu, Yejun; Wang, Huimin

    2016-01-01

    The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of the 2-tuple linguistic label are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes, respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion that occurs regularly in linguistic information processing. Finally, an illustrative example is given to illustrate the application of the proposed method, and a comparative analysis with existing methods is offered to show the advantages of the proposed method.
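
    The maximising deviation idea used for the unknown attribute weights can be illustrated on a plain numeric decision matrix: attributes on which the alternatives differ more receive larger weights. The sketch below drops the 2-tuple linguistic machinery and uses invented, already-normalised values.

        import numpy as np

        # 5 alternatives x 4 attributes, already normalised (illustrative values).
        X = np.array([[0.7, 0.4, 0.9, 0.5],
                      [0.6, 0.4, 0.8, 0.9],
                      [0.9, 0.5, 0.2, 0.6],
                      [0.5, 0.4, 0.7, 0.7],
                      [0.8, 0.5, 0.6, 0.4]])

        # Total absolute deviation of each attribute over all pairs of alternatives.
        deviation = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=(0, 1))

        weights = deviation / deviation.sum()  # attributes that discriminate more get larger weights
        print(np.round(weights, 3))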

  12. Formal concept analysis with background knowledge: a case study in paleobiological taxonomy of belemnites

    NASA Astrophysics Data System (ADS)

    Belohlavek, Radim; Kostak, Martin; Osicka, Petr

    2013-05-01

    We present a case study in the identification of taxa in paleobiological data. Our approach utilizes formal concept analysis and is based on conceiving a taxon as a group of individuals sharing a collection of attributes. In addition to the incidence relation between individuals and their attributes, the method uses expert background knowledge regarding the importance of attributes, which helps to filter out correctly formed but paleobiologically irrelevant taxa. We present results of experiments carried out with belemnites, a group of extinct cephalopods which seems particularly suitable for such a purpose. We demonstrate that the methods are capable of revealing taxa and relationships among them that are relevant from a paleobiological point of view.

  13. Allometric constraints to inversion of canopy structure from remote sensing

    NASA Astrophysics Data System (ADS)

    Wolf, A.; Berry, J. A.; Asner, G. P.

    2008-12-01

    Canopy radiative transfer models employ a large number of vegetation architectural and leaf biochemical attributes. Studies of leaf biochemistry show a wide array of chemical and spectral diversity, suggesting that several leaf biochemical constituents can be independently retrieved from multi-spectral remotely sensed imagery. In contrast, attempts to exploit multi-angle imagery to retrieve canopy structure only succeed in finding two or three of the many unknown canopy architectural attributes. We examine a database of over 5000 destructive tree harvests from Eurasia to show that allometry (the covariation of plant form across a broad range of plant size and canopy density) restricts the architectural diversity of plant canopies to a single composite variable, ranging from young canopies with many short trees with small crowns to older canopies with fewer trees and larger crowns. Moreover, these architectural attributes are closely linked to biomass via allometric constraints such as the "self-thinning law". We use the measured variance and covariance of plant canopy architecture in these stands to drive the radiative transfer model DISORD, which employs the Li-Strahler geometric optics model. The correlations introduced in this Monte Carlo study are used to determine which attributes of canopy architecture lead to important variation that can be observed by multi-angle or multi-spectral satellite observations, using the sun-view geometry characteristic of MODIS observations in different biomes located at different latitude bands. We conclude that although multi-angle/multi-spectral remote sensing is only sensitive to some of the many unknown canopy attributes that ecologists would wish to know, the strong allometric covariation between these attributes and others permits a large number of inferences, such as forest biomass, that will form meaningful next-generation vegetation products useful for data assimilation.

  14. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation

    PubMed Central

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-01-01

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional “encrypt-then-sign” or “sign-then-encrypt” strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has been gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required of end users for signcryption and designcryption grows linearly with the complexity of the signing and encryption access policy. Moreover, only a single authority responsible for attribute management and key generation exists in previously proposed ABSC schemes, whereas in reality different authorities usually monitor different attributes of a user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation. PMID:29772840

  15. Analysis of student attitudes towards e-learning using Fishbein Multiattribute approach

    NASA Astrophysics Data System (ADS)

    Jasuli

    2018-01-01

    This research aimed to determine students' attitudes toward e-learning and which attributes students considered dominant in the use of e-learning. The research population comprised all postgraduate students in the 2016 academic year at Universitas Negeri Surabaya. The sampling technique was nonprobability purposive sampling, with a sample of 100 respondents. The research instrument was a questionnaire with a semantic differential scale, and the data were analyzed with the multi-attribute Fishbein model. The findings indicate that student attitudes toward e-learning are positive and that easy accessibility is considered the most important attribute in the use of e-learning.
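
    The Fishbein multi-attribute model scores an object as the sum over attributes of belief strength times attribute evaluation, which is also what identifies the dominant attribute. A minimal sketch follows; the attribute list and the belief/evaluation scores are invented for illustration and are not the study's data.

        # Fishbein multi-attribute attitude: A = sum_i (b_i * e_i)
        # b_i: belief that e-learning possesses attribute i (semantic differential, e.g. -3..+3)
        # e_i: evaluation of how desirable attribute i is (same scale). Values are illustrative.

        attributes = {
            "easy_accessibility": {"belief": 2.6, "evaluation": 2.8},
            "content_quality":    {"belief": 1.9, "evaluation": 2.5},
            "interactivity":      {"belief": 1.2, "evaluation": 2.0},
            "cost":               {"belief": 2.1, "evaluation": 1.7},
        }

        attitude = sum(a["belief"] * a["evaluation"] for a in attributes.values())
        print("overall attitude score:", round(attitude, 2))

        # The attribute with the largest belief x evaluation product dominates the attitude.
        dominant = max(attributes, key=lambda k: attributes[k]["belief"] * attributes[k]["evaluation"])
        print("dominant attribute:", dominant)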

  16. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    PubMed

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst", choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.

  17. PaintOmics 3: a web resource for the pathway analysis and visualization of multi-omics data.

    PubMed

    Hernández-de-Diego, Rafael; Tarazona, Sonia; Martínez-Mira, Carlos; Balzano-Nogueira, Leandro; Furió-Tarí, Pedro; Pappas, Georgios J; Conesa, Ana

    2018-05-25

    The increasing availability of multi-omic platforms poses new challenges to data analysis. Joint visualization of multi-omics data is instrumental in better understanding interconnections across molecular layers and in fully utilizing the multi-omic resources available to make biological discoveries. We present here PaintOmics 3, a web-based resource for the integrated visualization of multiple omic data types onto KEGG pathway diagrams. PaintOmics 3 combines server-end capabilities for data analysis with the potential of modern web resources for data visualization, providing researchers with a powerful framework for interactive exploration of their multi-omics information. Unlike other visualization tools, PaintOmics 3 covers a comprehensive pathway analysis workflow, including automatic feature name/identifier conversion, multi-layered feature matching, pathway enrichment, network analysis, interactive heatmaps, trend charts, and more. It accepts a wide variety of omic types, including transcriptomics, proteomics and metabolomics, as well as region-based approaches such as ATAC-seq or ChIP-seq data. The tool is freely available at www.paintomics.org.

  18. "Slit Mask Design for the Giant Magellan Telescope Multi-object Astronomical and Cosmological Spectrograph"

    NASA Astrophysics Data System (ADS)

    Williams, Darius; Marshall, Jennifer L.; Schmidt, Luke M.; Prochaska, Travis; DePoy, Darren L.

    2018-01-01

    The Giant Magellan Telescope Multi-object Astronomical and Cosmological Spectrograph (GMACS) is currently in development for the Giant Magellan Telescope (GMT). GMACS will employ slit masks with a usable diameter of approximately 0.450 m for the purpose of multi-slit spectroscopy. Of significant importance are the design constraints and parameters of the multi-object slit masks themselves as well as the means for mapping astronomical targets to physical mask locations. Analytical methods are utilized to quantify deformation effects on a potential slit mask due to thermal expansion and vignetting of target light cones. Finite element analysis (FEA) is utilized to simulate mask flexure in changing gravity vectors. The alpha version of the mask creation program for GMACS, GMACS Mask Simulator (GMS), a derivative of the OSMOS Mask Simulator (OMS), is introduced.

  19. Assessment of Trading Partners for China's Rare Earth Exports Using a Decision Analytic Approach

    PubMed Central

    He, Chunyan; Lei, Yalin; Ge, Jianping

    2014-01-01

    China's current rare earth export policies are accelerating the depletion of its rare earth resources. Adopting an optimal export trade selection strategy is therefore crucial to determining and ultimately identifying the ideal trading partners. This paper introduces a multi-attribute decision-making methodology which is then used to select the optimal trading partner. In the method, an evaluation criteria system is established to assess the seven top trading partners based on three dimensions: political relationships, economic benefits, and industrial security. Specifically, a simple additive weighting model derived from an additive utility function is utilized to calculate, rank, and select alternatives. Results show that Japan would be the optimal trading partner for Chinese rare earths. The criteria evaluation method of trading partners for China's rare earth exports provides the Chinese government with a tool to enhance rare earth industrial policies. PMID:25051534
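    The additive scoring step described above can be illustrated with a short sketch. The partners, criterion scores, and weights below are hypothetical placeholders rather than the paper's data; only the simple additive weighting idea is taken from the abstract.

```python
import numpy as np

# Hypothetical decision matrix: rows = candidate trading partners,
# columns = normalized criterion scores (political relationship,
# economic benefit, industrial security).
partners = ["Partner A", "Partner B", "Partner C"]
scores = np.array([
    [0.8, 0.9, 0.7],
    [0.6, 0.8, 0.5],
    [0.7, 0.6, 0.6],
])
weights = np.array([0.4, 0.35, 0.25])  # assumed weights; must sum to 1

# Simple additive weighting: overall utility = weighted sum of criterion scores.
utility = scores @ weights
for name, u in sorted(zip(partners, utility), key=lambda p: -p[1]):
    print(f"{name}: {u:.3f}")
```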

  20. Assessment of trading partners for China's rare earth exports using a decision analytic approach.

    PubMed

    He, Chunyan; Lei, Yalin; Ge, Jianping

    2014-01-01

    China's current rare earth export policies are accelerating the depletion of its rare earth resources. Adopting an optimal export trade selection strategy is therefore crucial to determining and ultimately identifying the ideal trading partners. This paper introduces a multi-attribute decision-making methodology which is then used to select the optimal trading partner. In the method, an evaluation criteria system is established to assess the seven top trading partners based on three dimensions: political relationships, economic benefits, and industrial security. Specifically, a simple additive weighting model derived from an additive utility function is utilized to calculate, rank, and select alternatives. Results show that Japan would be the optimal trading partner for Chinese rare earths. The criteria evaluation method of trading partners for China's rare earth exports provides the Chinese government with a tool to enhance rare earth industrial policies.

  1. A visual analysis of multi-attribute data using pixel matrix displays

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel; Schreck, Tobias

    2007-01-01

    Charts and tables are commonly used to visually analyze data. These graphics are simple and easy to understand, but charts show only highly aggregated data and present only a limited number of data values, while tables often show too many data values. As a consequence, these graphics may either lose or obscure important information, so different techniques are required to monitor complex datasets. Users need more powerful visualization techniques to digest and compare detailed multi-attribute data to analyze the health of their business. This paper proposes an innovative solution based on the use of pixel-matrix displays to represent transaction-level information. With pixel matrices, users can visualize areas of importance at a glance, a capability not provided by common charting techniques. We present our solutions to use colored pixel matrices in (1) charts for visualizing data patterns and discovering exceptions, (2) tables for visualizing correlations and finding root causes, and (3) time series for visualizing the evolution of long-running transactions. The solutions have been applied with success to product sales, Internet network performance analysis, and service contract applications, demonstrating the benefits of our method over conventional graphics. The method is especially useful when detailed information is a key part of the analysis.

  2. Propulsion Airframe Aeroacoustics Technology Evaluation and Selection Using a Multi-Attribute Decision Making Process and Non-Deterministic Design

    NASA Technical Reports Server (NTRS)

    Burg, Cecile M.; Hill, Geoffrey A.; Brown, Sherilyn A.; Geiselhart, Karl A.

    2004-01-01

    The Systems Analysis Branch at NASA Langley Research Center has investigated revolutionary Propulsion Airframe Aeroacoustics (PAA) technologies and configurations for a Blended-Wing-Body (BWB) type aircraft as part of its research for NASA's Quiet Aircraft Technology (QAT) Project. Within the context of the long-term NASA goal of reducing the perceived aircraft noise level by a factor of 4 relative to 1997 state of the art, major configuration changes in the propulsion airframe integration system were explored with noise as a primary design consideration. An initial down-select and assessment of candidate PAA technologies for the BWB was performed using a Multi-Attribute Decision Making (MADM) process consisting of organized brainstorming and decision-making tools. The assessments focused on the effect the PAA technologies had on the overall noise level of the BWB and on other major design considerations such as weight, performance, and cost. A probabilistic systems analysis of the PAA configurations that presented the best noise reductions with the least negative impact on the system was then performed. Detailed results from the MADM study and the probabilistic systems analysis will be published in the near future.

  3. Urban photogrammetric data base for multi-purpose cadastral-based information systems: the Riyadh city case

    NASA Astrophysics Data System (ADS)

    Al-garni, Abdullah M.

    Urban information systems are economic resources that can benefit decision makers in the planning, development, and management of urban projects and resources. In this research, a conceptual model-based prototype Urban Geographic Information System (UGIS) is developed. The base maps used in developing the system and acquiring visual attributes are obtained from aerial photographs. The system is a multi-purpose parcel-based one that can serve many urban applications such as public utilities, health centres, schools, population estimation, road engineering and maintenance, and many others. A modern region in the capital city of Saudi Arabia is used for the study. The developed model is operational for one urban application (population estimation) and is tested for that particular application. The results showed that the system has a satisfactory accuracy and that it may well be promising for other similar urban applications in countries with similar demographic and social characteristics.

  4. Ordinal preference elicitation methods in health economics and health services research: using discrete choice experiments and ranking methods.

    PubMed

    Ali, Shehzad; Ronaldson, Sarah

    2012-09-01

    The predominant method of economic evaluation is cost-utility analysis, which uses cardinal preference elicitation methods, including the standard gamble and time trade-off. However, such an approach is not suitable for understanding trade-offs between process attributes, non-health outcomes and health outcomes in order to evaluate current practices, develop new programmes and predict demand for services and products. Ordinal preference elicitation methods, including discrete choice experiments and ranking methods, are therefore commonly used in health economics and health services research. Cardinal methods have been criticized on the grounds of cognitive complexity, difficulty of administration, contamination by risk and preference attitudes, and potential violation of underlying assumptions. Ordinal methods have gained popularity because of reduced cognitive burden, lower degree of abstract reasoning, reduced measurement error, ease of administration and ability to use both health and non-health outcomes. The underlying assumptions of ordinal methods may be violated when respondents use cognitive shortcuts, cannot comprehend the ordinal task or interpret attributes and levels, use 'irrational' choice behaviour or refuse to trade off certain attributes. CURRENT USE AND GROWING AREAS: Ordinal methods are commonly used to evaluate preference for attributes of health services, products, practices, interventions, policies and, more recently, to estimate utility weights. AREAS FOR ON-GOING RESEARCH: There is growing research on developing optimal designs, evaluating the rationalization process, using qualitative tools for developing ordinal methods, evaluating consistency with utility theory, appropriate statistical methods for analysis, generalizability of results and comparing ordinal methods against each other and with cardinal measures.

  5. Air Distribution Retrofit Strategies for Affordable Housing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dentz, Jordan; Conlin, Francis; Holloway, Parker

    2014-03-01

    In multifamily and attached buildings, traditional duct sealing methods are often impractical or costly and disruptive because of the difficulty in accessing leakage sites. In this project, two retrofit duct sealing techniques, manually applied sealants and an injected spray sealant, were implemented in several low-rise multi-unit buildings. An analysis of the cost and performance of the two methods is presented. Each method was used in twenty housing units: approximately half of each group of units are single-story and the remainder two-story. Results show that duct leakage to the outside was reduced by an average of 59% through the use of manual methods, and by 90% in the units where the injected spray sealant was used. It was found that 73% of the leakage reduction in homes that were treated with injected spray sealant was attributable to the manual sealing done at boots, returns and the air handler. The cost of manually applying sealant ranged from $275 to $511 per unit, and the cost of the injected spray sealant was $700 per unit. Modeling suggests a simple payback of 2.2 years for manual sealing and 4.7 years for the injected spray sealant system. Utility bills were collected for one year before and after the retrofits. Utility bill analysis shows heating-season energy savings of 14% for the injected spray sealant system and 16% for the hand sealing procedure, whereas cooling-season energy savings were 16% for both methods.

  6. Combined interpretation of 3D seismic reflection attributes for geothermal exploration in the Polish Basin using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Bauer, Klaus; Pussak, Marcin; Stiller, Manfred; Bujakowski, Wieslaw

    2014-05-01

    Self-organizing maps (SOM) are neural network techniques which can be used for the joint interpretation of multi-disciplinary data sets. In this investigation we apply SOM within a geothermal exploration project using 3D seismic reflection data. The study area is located in the central part of the Polish basin. Several sedimentary target horizons were identified at this location based on fluid flow rate measurements in the geothermal research well Kompina-2. The general objective is a seismic facies analysis and characterization of the major geothermal target reservoir. A 3D seismic reflection experiment with a sparse acquisition geometry was carried out around well Kompina-2. Conventional signal processing (amplitude corrections, filtering, spectral whitening, deconvolution, static corrections, muting) was followed by normal-moveout (NMO) stacking, and, alternatively, by common-reflection-surface (CRS) stacking. Different signal attributes were then derived from the stacked images including root-mean-square (RMS) amplitude, instantaneous frequency and coherency. Furthermore, spectral decomposition attributes were calculated based on the continuous wavelet transform. The resulting attribute maps along major target horizons appear noisy after the NMO stack and clearly structured after the CRS stack. Consequently, the following SOM-based multi-parameter signal attribute analysis was applied only to the CRS images. We applied our SOM work flow, which includes data preparation, unsupervised learning, segmentation of the trained SOM using image processing techniques, and final application of the learned knowledge. For the Lower Jurassic target horizon Ja1 we derived four different clusters with distinct seismic attribute signatures. As the most striking feature, a corridor parallel to a fault system was identified, which is characterized by decreased RMS amplitudes and low frequencies. In our interpretation we assume that this combination of signal properties can be explained by increased fracture porosity and enhanced fluid saturation within this part of the Lower Jurassic sandstone horizon. Hence, we suggest that a future drilling should be carried out within this compartment of the reservoir.
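    As a rough illustration of the SOM work flow summarized above (data preparation, unsupervised learning, and application of the trained map), the sketch below trains a tiny self-organizing map on synthetic attribute vectors. The grid size, learning schedule, and attribute set are arbitrary assumptions, not the parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(6, 6), epochs=2000, lr0=0.5, sigma0=2.0):
    """Train a small self-organizing map on attribute vectors (rows of `data`)."""
    n_nodes = grid[0] * grid[1]
    # One weight vector per map node, initialized from random data samples.
    weights = data[rng.integers(0, len(data), n_nodes)].astype(float)
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for t in range(epochs):
        frac = t / epochs
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 1e-3
        x = data[rng.integers(0, len(data))]
        # Best-matching unit: node whose weight vector is closest to the sample.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Gaussian neighborhood on the 2-D map grid around the BMU.
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        weights += lr * np.exp(-d2 / (2 * sigma ** 2))[:, None] * (x - weights)
    return weights

# Synthetic stand-ins for per-trace seismic attributes (e.g., RMS amplitude,
# instantaneous frequency, coherency), which would be standardized in practice.
data = rng.normal(size=(500, 3))
weights = train_som(data)
bmu = np.argmin(np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2), axis=1)
print("samples mapped onto", len(np.unique(bmu)), "of", len(weights), "SOM nodes")
```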

  7. The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications

    PubMed Central

    Sullivan, Joanne H.; Glisson, Scott R.

    2016-01-01

    Although the scientific peer review process is crucial to distributing research investments, little has been reported about the decision-making processes used by reviewers. One key attribute likely to be important for decision-making is reviewer expertise. Recent data from an experimental blinded review utilizing a direct measure of expertise has found that closer intellectual distances between applicant and reviewer lead to harsher evaluations, possibly suggesting that information is differentially sampled across subject-matter expertise levels and across information type (e.g. strengths or weaknesses). However, social and professional networks have been suggested to play a role in reviewer scoring. In an effort to test whether this result can be replicated in a real-world unblinded study utilizing self-assessed reviewer expertise, we conducted a retrospective multi-level regression analysis of 1,450 individual unblinded evaluations of 725 biomedical research funding applications by 1,044 reviewers. Despite the large variability in the scoring data, the results are largely confirmatory of work from blinded reviews, by which a linear relationship between reviewer expertise and their evaluations was observed—reviewers with higher levels of self-assessed expertise tended to be harsher in their evaluations. However, we also found that reviewer and applicant seniority could influence this relationship, suggesting social networks could have subtle influences on reviewer scoring. Overall, these results highlight the need to explore how reviewers utilize their expertise to gather and weight information from the application in making their evaluations. PMID:27768760

  8. Multi-Attribute Consensus Building Tool

    ERIC Educational Resources Information Center

    Shyyan, Vitaliy; Christensen, Laurene; Thurlow, Martha; Lazarus, Sheryl

    2013-01-01

    The Multi-Attribute Consensus Building (MACB) method is a quantitative approach for determining a group's opinion about the importance of each item (strategy, decision, recommendation, policy, priority, etc.) on a list (Vanderwood & Erickson, 1994). This process enables a small or large group of participants to generate and discuss a set…

  9. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.

  10. Multimodality medical image database for temporal lobe epilepsy

    NASA Astrophysics Data System (ADS)

    Siadat, Mohammad-Reza; Soltanian-Zadeh, Hamid; Fotouhi, Farshad A.; Elisevich, Kost

    2003-05-01

    This paper presents the development of a human brain multi-modality database for surgical candidacy determination in temporal lobe epilepsy. The focus of the paper is on content-based image management, navigation and retrieval. Several medical image-processing methods, including our newly developed segmentation method, are utilized for information extraction/correlation and indexing. The input data includes T1-, T2-weighted and FLAIR MRI and ictal/interictal SPECT modalities with associated clinical data and EEG data analysis. The database can answer queries regarding issues such as the correlation between the attribute X of the entity Y and the outcome of a temporal lobe epilepsy surgery. The entity Y can be a brain anatomical structure such as the hippocampus. The attribute X can be either a functionality feature of the anatomical structure Y, calculated with SPECT modalities, such as signal average, or a volumetric/morphological feature of the entity Y such as volume or average curvature. The outcome of the surgery can be any surgery assessment such as non-verbal Wechsler memory quotient. A determination is made regarding surgical candidacy by analysis of both textual and image data. The current database system suggests a surgical determination for the cases with relatively small hippocampus and high signal intensity average on FLAIR images within the hippocampus. This indication matches the neurosurgeons' expectations/observations. Moreover, as the database gets more populated with patient profiles and individual surgical outcomes, using data mining methods one may discover partially invisible correlations between the contents of different modalities of data and the outcome of the surgery.

  11. Multiattribute evaluation in formulary decision making as applied to calcium-channel blockers.

    PubMed

    Schumacher, G E

    1991-02-01

    The use of multiattribute utility theory (MAUT) to make a formulary decision involving calcium-channel blockers (CCBs) is described. The MAUT method is a procedure for identifying, characterizing, and comparing the many variables that may affect a decision. Although applications in pharmacy have been infrequent, MAUT should be particularly appealing to formulary committees. The steps of the MAUT method are (1) determine the viewpoint of the decision makers, (2) identify the decision alternatives, (3) identify the attributes to be evaluated, (4) identify the factors to be used in evaluating the attributes, (5) establish a utility scale for scoring each factor, (6) transform the values for each factor to its utility scale, (7) determine weights for each attribute and factor, (8) calculate the total utility score for each decision alternative, (9) determine which decision alternative has the greatest total score, and (10) perform a sensitivity analysis. The viewpoint of a formulary committee in a health maintenance organization was simulated to develop a model for using the MAUT method to compare CCBs for single-agent therapy of chronic stable angina in ambulatory patients for one year. The attributes chosen were effectiveness, safety, patient acceptance, and cost and weighted 36%, 29%, 21%, and 14%, respectively, as contributions to the evaluation. The rank order of the decision alternatives was (1) generic verapamil, (2) brand-name verapamil, (3) diltiazem, (4) nicardipine, and (5) nifedipine. The MAUT method provides a standardized yet flexible format for comparing and selecting among formulary alternatives.
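    A minimal numeric sketch of steps 5 through 9 of the procedure listed above. The attribute weights follow the abstract (36%, 29%, 21%, 14%); the 0-100 utility scores assigned to each alternative are hypothetical placeholders rather than the published evaluations.

```python
# Weights taken from the abstract; alternative scores below are invented.
weights = {"effectiveness": 0.36, "safety": 0.29, "acceptance": 0.21, "cost": 0.14}

alternatives = {
    "generic verapamil":    {"effectiveness": 80, "safety": 75, "acceptance": 70, "cost": 95},
    "brand-name verapamil": {"effectiveness": 80, "safety": 75, "acceptance": 70, "cost": 60},
    "diltiazem":            {"effectiveness": 78, "safety": 72, "acceptance": 68, "cost": 55},
}

def total_utility(scores, weights):
    # Step 8: weighted sum of per-attribute utility scores.
    return sum(weights[a] * scores[a] for a in weights)

ranked = sorted(alternatives, key=lambda alt: total_utility(alternatives[alt], weights),
                reverse=True)
for alt in ranked:  # Step 9: pick the alternative with the greatest total score.
    print(f"{alt}: {total_utility(alternatives[alt], weights):.1f}")
```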

  12. Coupling meteorology, metal concentrations, and Pb isotopes for source attribution in archived precipitation samples.

    PubMed

    Graney, Joseph R; Landis, Matthew S

    2013-03-15

    A technique that couples lead (Pb) isotopes and multi-element concentrations with meteorological analysis was used to assess source contributions to precipitation samples at the Bondville, Illinois USA National Trends Network (NTN) site. Precipitation samples collected over a 16-month period (July 1994-October 1995) at Bondville were parsed into six unique meteorological flow regimes using a minimum variance clustering technique on back trajectory endpoints. Pb isotope ratios and multi-element concentrations were measured using high resolution inductively coupled plasma-sector field mass spectrometry (ICP-SFMS) on the archived precipitation samples. Bondville is located in central Illinois, ~250 km downwind from smelters in southeast Missouri. The Mississippi Valley Type ore deposits in Missouri provided a unique multi-element and Pb isotope fingerprint for smelter emissions, which could be contrasted with industrial emissions from the Chicago and Indianapolis urban areas (~125 km north and east of Bondville, respectively) and regional emissions from electric utility facilities. Differences in Pb isotopes and element concentrations in precipitation corresponded to flow regime. Industrial sources from urban areas, and thorogenic Pb from coal use, could be differentiated from smelter emissions from Missouri by coupling Pb isotopes with variations in element ratios and relative mass factors. Using a three-endmember mixing model based on Pb isotope ratio differences, industrial processes in urban airsheds contributed 56±19%, smelters in southeast Missouri 26±13%, and coal combustion 18±7% of the Pb in precipitation collected in Bondville in the mid-1990s. Copyright © 2012 Elsevier B.V. All rights reserved.
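    The three-endmember apportionment can be pictured as a small linear system: two isotope-ratio balances plus a closure condition. The endmember signatures and the sample values below are invented for illustration, and the ratios are assumed to mix approximately linearly, which is a simplification of the source-attribution calculation described above.

```python
import numpy as np

# Hypothetical endmember signatures (206Pb/207Pb and 208Pb/207Pb) for the
# three sources discussed above; the sample values are also made up.
endmembers = {
    "urban/industrial": (1.19, 2.44),
    "Missouri smelter": (1.28, 2.52),
    "coal combustion":  (1.22, 2.48),
}
sample = (1.22, 2.47)

# Three equations: two isotope-ratio balances (assuming the ratios mix
# approximately linearly) plus the closure condition f1 + f2 + f3 = 1.
A = np.array([[r[0] for r in endmembers.values()],
              [r[1] for r in endmembers.values()],
              [1.0, 1.0, 1.0]])
b = np.array([sample[0], sample[1], 1.0])
fractions = np.linalg.solve(A, b)
for name, f in zip(endmembers, fractions):
    print(f"{name}: {100 * f:.1f}%")
```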

  13. Converting Parkinson-Specific Scores into Health State Utilities to Assess Cost-Utility Analysis.

    PubMed

    Chen, Gang; Garcia-Gordillo, Miguel A; Collado-Mateo, Daniel; Del Pozo-Cruz, Borja; Adsuar, José C; Cordero-Ferrera, José Manuel; Abellán-Perpiñán, José María; Sánchez-Martínez, Fernando Ignacio

    2018-06-07

    The aim of this study was to compare the Parkinson's Disease Questionnaire-8 (PDQ-8) with three multi-attribute utility (MAU) instruments (EQ-5D-3L, EQ-5D-5L, and 15D) and to develop mapping algorithms that could be used to transform PDQ-8 scores into MAU scores. A cross-sectional study was conducted. A final sample of 228 evaluable patients was included in the analyses. Sociodemographic and clinical data were also collected. Two EQ-5D questionnaires were scored using Spanish tariffs. Two models and three statistical techniques were used to estimate each model in the direct mapping framework for all three MAU instruments, including the most widely used ordinary least squares (OLS), the robust MM-estimator, and the generalized linear model (GLM). For both EQ-5D-3L and EQ-5D-5L, indirect response mapping based on an ordered logit model was also conducted. Three goodness-of-fit tests were employed to compare the models: the mean absolute error (MAE), the root-mean-square error (RMSE), and the intra-class correlation coefficient (ICC) between the predicted and observed utilities. Health state utility scores ranged from 0.61 (EQ-5D-3L) to 0.74 (15D). The mean PDQ-8 score was 27.51. The correlation between the overall PDQ-8 score and each MAU instrument ranged from -0.729 (EQ-5D-5L) to -0.752 (EQ-5D-3L). A mapping algorithm based on PDQ-8 items had better performance than using the overall score. For the two EQ-5D questionnaires, in general, the indirect mapping approach had comparable or even better performance than direct mapping based on MAE. Mapping algorithms developed in this study enable the estimation of utility values from the PDQ-8. The indirect mapping equations reported for two EQ-5D questionnaires will further facilitate the calculation of EQ-5D utility scores using other country-specific tariffs.
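    The direct-mapping idea (regress observed MAU utilities on PDQ-8 item scores, then judge fit with MAE and RMSE) can be sketched as below. The simulated item scores, coefficients, and noise level are placeholders; the sketch only illustrates the OLS variant of the model comparisons described above.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(1)

# Hypothetical data standing in for the study sample: 8 PDQ-8 item scores
# (0-4) per patient and an observed utility value for each patient.
X = rng.integers(0, 5, size=(228, 8)).astype(float)
true_w = np.array([-0.04, -0.03, -0.02, -0.05, -0.03, -0.02, -0.04, -0.03])
y = np.clip(0.95 + X @ true_w + rng.normal(0, 0.05, 228), -0.2, 1.0)

# Direct mapping with OLS on the item scores (one of the model/estimator
# combinations described above).
model = LinearRegression().fit(X, y)
pred = model.predict(X)

mae = mean_absolute_error(y, pred)
rmse = mean_squared_error(y, pred) ** 0.5
print(f"MAE={mae:.3f}  RMSE={rmse:.3f}")
```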

  14. An ethnographic object-oriented analysis of explorer presence in a volcanic terrain environment: Claims and evidence

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1994-01-01

    An ethnographic field study was conducted to investigate the nature of presence in field geology, and to develop specifications for domain-based planetary exploration systems utilizing virtual presence. Two planetary geologists were accompanied on a multi-day geologic field trip that they had arranged for their own scientific purposes, which centered on an investigation of the extraordinary xenolith/nodule deposits in the Kaupulehu lava flow of Hualalai Volcano, on the island of Hawaii. The geologists were observed during the course of their field investigations and interviewed regarding their activities and ideas. Analysis of the interview resulted in the identification of key domain entities and their attributes, relations among the entities, and explorer interactions with the environment. The results support and extend the author's previously reported continuity theory of presence, indicating that presence in field geology is characterized by persistent engagement with objects associated by metonymic relations. The results also provide design specifications for virtual planetary exploration systems, including an integrating structure for disparate data integration. Finally, the results suggest that unobtrusive participant observation coupled with field interviews is an effective research methodology for engineering ethnography.

  15. Clinical validation and applicability of different tipranavir/ritonavir genotypic scores in HIV-1 protease inhibitor-experienced patients.

    PubMed

    Saracino, Annalisa; Monno, Laura; Tartaglia, Alessandra; Tinelli, Carmine; Seminari, Elena; Maggiolo, Franco; Bonora, Stefano; Rusconi, Stefano; Micheli, Valeria; Lo Caputo, Sergio; Lazzaroni, Laura; Ferrara, Sergio; Ladisa, Nicoletta; Nasta, Paola; Parruti, Giustino; Bellagamba, Rita; Forbici, Federica; Angarano, Gioacchino

    2009-07-01

    Tipranavir, a non-peptidic protease inhibitor which shows in vitro efficacy against some HIV-1-resistant strains, can be used in salvage therapies for multi-experienced HIV patients due to its peculiar resistance profile, which includes 21 mutations at 16 protease positions according to the International AIDS Society (IAS). Other genotypic scores, however, which attribute a different weight to single amino-acid substitutions, have been recently proposed. To validate the clinical utility of four different genotypic scores for selecting tipranavir responders, the baseline resistance pattern of 176 heavily experienced HIV patients was correlated with virological success (HIV-RNA<50 copies/ml) after 24 weeks of a new treatment based on tipranavir/ritonavir. Virological suppression after 24 weeks was reached by 42.5% of patients. With univariate analysis, genotypic scores were all associated with outcome but showed a low accuracy with ROC analysis, with the weighted score (WS) by Scherer et al. demonstrating the best performance with an AUC of 68%. Only 52% of patients classified as susceptible (WS< or =3) responded to the new therapy. The following variables were significantly associated (p<0.05) with failure in multivariate analysis: WS, log peak of HIV-RNA, IAS mutations L33F, I54AMV, Q58E, and the non-IAS mutation N37DES. On the contrary, the use of T20 in T20-naïve patients and the V82AFSI and F53LY non-IAS mutations were associated with virological success. The study suggests that even if the "weighted" scores are able to interpret correctly the antiretroviral resistance profile of multi-experienced patients, it is difficult to identify a cut-off which can be easily applied to this population for discriminating responders.

  16. Estimating structural attributes of Douglas-fir/western hemlock forest stands from Landsat and SPOT imagery

    NASA Technical Reports Server (NTRS)

    Cohen, Warren B.; Spies, Thomas A.

    1992-01-01

    Relationships between spectral and texture variables derived from SPOT HRV 10 m panchromatic and Landsat TM 30 m multispectral data and 16 forest stand structural attributes are evaluated to determine the utility of satellite data for analysis of Douglas-fir/western hemlock forests west of the Cascade Mountains crest in Oregon and Washington, USA. Texture of the HRV data was found to be strongly related to many of the stand attributes evaluated, whereas TM texture was weakly related to all attributes. Data analysis based on regression models indicates that both TM and HRV imagery should yield equally accurate estimates of forest age class and stand structure. It is concluded that the satellite data are a valuable source for estimation of the standard deviation of tree sizes, mean size and density of trees in the upper canopy layers, a structural complexity index, and stand age.

  17. National Drug Formulary review of statin therapeutic group using the multiattribute scoring tool

    PubMed Central

    Ramli, Azuana; Aljunid, Syed Mohamed; Sulong, Saperi; Md Yusof, Faridah Aryani

    2013-01-01

    Purpose: HMG-CoA reductase inhibitors (statins) are extensively used in treating hypercholesterolemia. The statins available in Malaysia include atorvastatin, lovastatin, pravastatin, rosuvastatin, simvastatin, and fluvastatin. Over the years, they have accumulated in the National Drug Formulary; hence, the need for review. Effective selection of the best drugs to remain in the formulary can become complex due to the multiple drug attributes involved, and is made worse by the limited time and resources available. The multiattribute scoring tool (MAST) systematizes the evaluation of the drug attributes to facilitate the drug selection process. In this study, a MAST framework was developed to rank the statins based on their utilities or benefits. Methods: Published literature on multicriteria decision analysis (MCDA) was studied and five sessions of expert group discussions were conducted to build the MAST framework and to review the evidence. The attributes identified and selected for analysis were efficacy (clinical efficacy, clinical endpoints), safety (drug interactions, serious side effects and documentation), drug applicability (drug strength/formulation, indications, dose frequency, side effects, food–drug interactions, and dose adjustments), and cost. The average weights assigned by the members for efficacy, safety, drug applicability and cost were 32.6%, 26.2%, 24.1%, and 17.1%, respectively. The utility values of the attributes were scored based on the published evidence or/and agreements during the group discussions. The attribute scores were added up to provide the total utility score. Results: Using the MAST, the six statins under review were successfully scored and ranked. Atorvastatin scored the highest total utility score (TUS) of 84.48, followed by simvastatin (83.11). Atorvastatin and simvastatin scored consistently high, even before drug costs were included. The low scores on the side effects for atorvastatin were compensated for by the higher scores on the clinical endpoints, resulting in a higher TUS for atorvastatin. Fluvastatin recorded the lowest TUS. Conclusion: The multiattribute scoring tool was successfully applied to organize decision variables in reviewing statins for the formulary. Based on the TUS, atorvastatin is recommended to remain in the formulary and be considered as first-line in the treatment of hypercholesterolemia. PMID:24353428

  18. A proteorhodopsin-based biohybrid light-powering pH sensor.

    PubMed

    Rao, Siyuan; Guo, Zhibin; Liang, Dawei; Chen, Deliang; Wei, Yen; Xiang, Yan

    2013-10-14

    The biohybrid sensor is an emerging technique for multi-functional detection that utilizes the instinctive responses or interactions of biomolecules. We develop a biohybrid pH sensor by taking advantage of the pH-dependent photoelectric characteristics of proteorhodopsin (pR). The transient absorption kinetics study indicates that the photoelectric behavior of pR is attributed to the varying lifetime of the M intermediate at different environmental pH values. This pR-based biohybrid light-powering sensor with microfluidic design can achieve real-time pH detection with quick response and high sensitivity. The results of this work would shed light on pR and its potential applications.

  19. Differential Weighting in Multi-Attribute Utility Measurement: When it Should Not and When it does make a Difference

    DTIC Science & Technology

    1976-08-01

    "The design should balance and optimize characteristics serving environmental, safety, and conservation goals" (McDonald...). [The remaining extracted text consists of fragments of a vehicle attribute table (Datsun 610, Buick Century, Mazda RX4, Volkswagen Rabbit, AMC Matador, Toyota, Ford Maverick, AMC Hornet, AMC Pacer, Audi Fox, Toyota Corona Mk II) that cannot be reconstructed as prose.]

  20. New agrophysics divisions: application of GIS and fuzzy multi attributive comparison of alternatives (review)

    USDA-ARS?s Scientific Manuscript database

    This review paper is devoted to review the new scientific divisions that emerged in agrophysics in the last 10-15 years. Among them are the following: 1) application of Geographic Information Systems, 2) development and application of fuzzy multi attributive comparison of alternatives. In recent yea...

  1. Production Task Queue Optimization Based on Multi-Attribute Evaluation for Complex Product Assembly Workshop.

    PubMed

    Li, Lian-Hui; Mo, Rong

    2015-01-01

    The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization methods perform poorly and are difficult to apply. A production task queue optimization method is therefore proposed based on multi-attribute evaluation. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judge importance degree, and a trapezoid fuzzy scale-rough AHP that accounts for the judge importance degree is then put forward. The balanced weight, which integrates the objective weight and the subjective weight, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing Euclidean distance with relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study is given to illustrate the method's correctness and feasibility.
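    A compact sketch of the objective-weighting and ranking steps named above: CRITIC weights followed by a TOPSIS ranking in which the usual Euclidean distance is replaced by a relative-entropy (Kullback-Leibler) distance. The task-indicator matrix is random placeholder data, all indicators are treated as benefit-type, and the exact form of the entropy distance in the paper may differ from this simplification.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical task-attribute matrix: rows = production tasks,
# columns = benefit-type indicators (larger is better).
X = rng.uniform(1, 10, size=(6, 4))

# --- CRITIC objective weights --------------------------------------------
Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))   # min-max normalization
sigma = Z.std(axis=0, ddof=1)
corr = np.corrcoef(Z, rowvar=False)
info = sigma * (1.0 - corr).sum(axis=0)        # contrast intensity x conflict
w = info / info.sum()

# --- TOPSIS with a relative-entropy (KL) distance -------------------------
V = Z * w                                      # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)

def kl(p, q, eps=1e-9):
    # Relative entropy between two non-negative vectors rescaled to distributions.
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return np.sum(p * np.log(p / q))

d_plus = np.array([kl(ideal, v) for v in V])   # distance to the ideal solution
d_minus = np.array([kl(anti, v) for v in V])   # distance to the anti-ideal solution
closeness = d_minus / (d_plus + d_minus)

order = np.argsort(-closeness)                 # larger closeness = earlier in queue
print("task queue (best first):", order.tolist())
```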

  2. Production Task Queue Optimization Based on Multi-Attribute Evaluation for Complex Product Assembly Workshop

    PubMed Central

    Li, Lian-hui; Mo, Rong

    2015-01-01

    The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization methods perform poorly and are difficult to apply. A production task queue optimization method is therefore proposed based on multi-attribute evaluation. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judge importance degree, and a trapezoid fuzzy scale-rough AHP that accounts for the judge importance degree is then put forward. The balanced weight, which integrates the objective weight and the subjective weight, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing Euclidean distance with relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study is given to illustrate the method's correctness and feasibility. PMID:26414758

  3. Fault zone identification in the eastern part of the Persian Gulf based on combined seismic attributes

    NASA Astrophysics Data System (ADS)

    Mirkamali, M. S.; Keshavarz FK, N.; Bakhtiari, M. R.

    2013-02-01

    Faults, as main pathways for fluids, play a critical role in creating regions of high porosity and permeability, in cutting cap rock, and in the migration of hydrocarbons into the reservoir. Accurate identification of fault zones is therefore very important in maximizing production from petroleum traps. Image processing and modern visualization techniques are provided for better mapping of objects of interest. In this study, the application of fault mapping to the identification of fault zones within the Mishan and Aghajari formations above the Guri base unconformity surface in the eastern part of the Persian Gulf is investigated. Seismic single- and multi-trace attribute analyses are employed separately to determine faults in a vertical section, but different kinds of geological objects cannot be identified using individual attributes only. A mapping model is utilized to improve the identification of the faults, giving more accurate results. This method is based on combinations of all individual relevant attributes using a neural network system to create combined attributes, which gives an optimal view of the object of interest. Firstly, a set of relevant attributes was separately calculated on the vertical section. Then, at interpreted positions, example training locations were manually selected in each fault and non-fault class by an interpreter. A neural network was trained on combinations of the attributes extracted at the example training locations to generate an optimized fault cube. Finally, fault and non-fault probability cubes were estimated by applying the trained neural network to the entire data set. The fault probability cube was obtained with higher mapping accuracy, greater contrast, and fewer disturbances in comparison with individual attributes. The computed results of this study can support better understanding of the data, providing fault zone mapping with reliable results.

  4. Transient segregation behavior in Cd1-xZnxTe with low Zn content-A qualitative and quantitative analysis

    NASA Astrophysics Data System (ADS)

    Neubert, M.; Jurisch, M.

    2015-06-01

    The paper analyzes experimental compositional profiles, reported in the literature, of Vertical Bridgman (VB, VGF) grown (Cd,Zn)Te crystals. The origin of the observed axial ZnTe-distribution profiles is attributed to dendritic growth after initial nucleation from supercooled melts. The analysis was done by utilizing a boundary layer model, which provides a very good approximation of the experimental data. Besides the discussion of the qualitative results, a quantitative analysis of the fitted model parameters is also presented, as far as the utilized model permits.
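    For orientation only: a common boundary-layer description of segregation during directional solidification is the Burton-Prim-Slichter effective distribution coefficient combined with a Scheil-type axial profile. Whether this is exactly the model fitted in the paper is an assumption here, and every parameter value below is illustrative.

```python
import numpy as np

# Sketch assuming a Burton-Prim-Slichter-type boundary-layer model;
# all parameter values are illustrative, not fitted values from the paper.
k0 = 0.35        # equilibrium distribution coefficient of ZnTe (assumed)
V = 2.0e-6       # growth rate, m/s (assumed)
delta = 1.0e-3   # boundary-layer thickness, m (assumed)
D = 5.0e-9       # diffusion coefficient in the melt, m^2/s (assumed)

# Effective distribution coefficient.
k_eff = k0 / (k0 + (1.0 - k0) * np.exp(-V * delta / D))

# Scheil-type axial profile as a function of the solidified fraction g.
g = np.linspace(0.0, 0.95, 6)
C0 = 1.0                                  # normalized initial ZnTe content
Cs = k_eff * C0 * (1.0 - g) ** (k_eff - 1.0)
for gi, ci in zip(g, Cs):
    print(f"g = {gi:.2f}   C_s/C0 = {ci:.3f}")
```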

  5. Visualization of multi-INT fusion data using Java Viewer (JVIEW)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen

    2014-05-01

    Visualization is important for multi-intelligence fusion and we demonstrate issues for presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived (e.g., text) typically involve language processing. Both results can be geographically displayed for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users - be they operators or analysts. Operators require near-real time solutions while analysts have the opportunities of non-real time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept that has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for multi-intelligence fusion application for context-enhanced information fusion.

  6. Heterogeneous Face Attribute Estimation: A Deep Multi-Task Learning Approach.

    PubMed

    Han, Hu; K Jain, Anil; Shan, Shiguang; Chen, Xilin

    2017-08-10

    Face attribute estimation has many potential applications in video surveillance, face retrieval, and social media. While a number of methods have been proposed for face attribute estimation, most of them did not explicitly consider the attribute correlation and heterogeneity (e.g., ordinal vs. nominal and holistic vs. local) during feature representation learning. In this paper, we present a Deep Multi-Task Learning (DMTL) approach to jointly estimate multiple heterogeneous attributes from a single face image. In DMTL, we tackle attribute correlation and heterogeneity with convolutional neural networks (CNNs) consisting of shared feature learning for all the attributes, and category-specific feature learning for heterogeneous attributes. We also introduce an unconstrained face database (LFW+), an extension of public-domain LFW, with heterogeneous demographic attributes (age, gender, and race) obtained via crowdsourcing. Experimental results on benchmarks with multiple face attributes (MORPH II, LFW+, CelebA, LFWA, and FotW) show that the proposed approach has superior performance compared to state of the art. Finally, evaluations on a public-domain face database (LAP) with a single attribute show that the proposed approach has excellent generalization ability.
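    A toy sketch of the shared-plus-specific structure described above, written with PyTorch: one shared convolutional trunk feeds an ordinal (regression) head for age and nominal (classification) heads for gender and race. Layer sizes, the loss combination, and the data are arbitrary assumptions, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class DMTLSketch(nn.Module):
    """Toy shared + category-specific multi-task network (sizes are arbitrary)."""
    def __init__(self):
        super().__init__()
        # Shared feature learning for all attributes.
        self.shared = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Category-specific branches: an ordinal attribute (age) regressed as a
        # scalar, and nominal attributes (gender, race) as classification heads.
        self.age_head = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))
        self.gender_head = nn.Linear(32, 2)
        self.race_head = nn.Linear(32, 4)

    def forward(self, x):
        f = self.shared(x)
        return self.age_head(f).squeeze(1), self.gender_head(f), self.race_head(f)

model = DMTLSketch()
imgs = torch.randn(8, 3, 64, 64)            # random stand-in for face crops
age, gender, race = model(imgs)
# Multi-task loss: regression for the ordinal attribute, cross-entropy for nominal ones.
loss = (nn.functional.mse_loss(age, torch.rand(8) * 60)
        + nn.functional.cross_entropy(gender, torch.randint(0, 2, (8,)))
        + nn.functional.cross_entropy(race, torch.randint(0, 4, (8,))))
print(float(loss))
```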

  7. Angular difference feature extraction for urban scene classification using ZY-3 multi-angle high-resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Huang, Xin; Chen, Huijun; Gong, Jianya

    2018-01-01

    Spaceborne multi-angle images with high resolution are capable of simultaneously providing spatial details and three-dimensional (3D) information to support detailed and accurate classification of complex urban scenes. In recent years, satellite-derived digital surface models (DSMs) have been increasingly utilized to provide height information to complement spectral properties for urban classification. However, in such a way, the multi-angle information is not effectively exploited, mainly because of the errors and difficulties of multi-view image matching and the inaccuracy of the generated DSM over complex and dense urban scenes. Therefore, it is still a challenging task to effectively exploit the available angular information from high-resolution multi-angle images. In this paper, we investigate the potential for classifying urban scenes based on local angular properties characterized from high-resolution ZY-3 multi-view images. Specifically, three categories of angular difference features (ADFs) are proposed to describe the angular information at three levels (i.e., pixel, feature, and label levels): (1) ADF-pixel: the angular information is directly extracted by pixel comparison between the multi-angle images; (2) ADF-feature: the angular differences are described in the feature domain by comparing the differences between the multi-angle spatial features (e.g., morphological attribute profiles (APs)); (3) ADF-label: label-level angular features are proposed based on a group of urban primitives (e.g., buildings and shadows), in order to describe the specific angular information related to the types of primitive classes. In addition, we utilize spatial-contextual information to refine the multi-level ADF features using superpixel segmentation, for the purpose of alleviating the effects of salt-and-pepper noise and representing the main angular characteristics within a local area. The experiments on ZY-3 multi-angle images confirm that the proposed ADF features can effectively improve the accuracy of urban scene classification, with a significant increase in overall accuracy (3.8-11.7%) compared to using the spectral bands alone. Furthermore, the results indicate the superiority of the proposed ADFs in distinguishing between the spectrally similar and complex man-made classes, including roads and various types of buildings (e.g., high buildings, urban villages, and residential apartments).
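    At the pixel level, the simplest reading of ADF-pixel is a per-pixel difference between co-registered views, optionally smoothed over segments. The sketch below uses synthetic nadir/forward patches and fixed-block averaging as a crude stand-in for the superpixel refinement; none of it reflects the paper's actual implementation details.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical co-registered nadir and forward-view panchromatic patches
# standing in for ZY-3 multi-angle imagery.
nadir = rng.uniform(0, 1, size=(128, 128))
forward = np.clip(nadir + rng.normal(0, 0.05, size=(128, 128)), 0, 1)

# ADF-pixel style feature: per-pixel angular difference between views.
adf_pixel = np.abs(nadir - forward)

# A simple superpixel-like refinement: average the feature over fixed blocks
# (a stand-in for the segmentation-based smoothing described above).
block = 8
h, w = adf_pixel.shape
refined = adf_pixel.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
print(adf_pixel.shape, "->", refined.shape)
```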

  8. Conjoint analysis of nature tourism values in Bahia, Brazil

    Treesearch

    Thomas Holmes; Chris Zinkhan; Keith Alger; D. Evan Mercer

    1996-01-01

    This paper uses conjoint analysis to estimate the value of nature tourism attributes in a threatened forest ecosystem in northeastern Brazil. Computerized interviews were conducted using a paired comparison design. An ordinal interpretation of the rating scale was used and marginal utilities were estimated using ordered probit. The empirical results showed that the...
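    Ordered probit treats the rating as an interval-censored observation of a latent utility. The sketch below fits such a model by maximum likelihood on simulated two-attribute data (the attributes, sample size, and coefficients are made up), recovering the attribute coefficients that would be interpreted as marginal utilities; it is a from-scratch illustration, not the authors' estimation code.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(4)

# Hypothetical paired-comparison ratings: respondents rate scenarios described
# by two attributes on a 1-4 scale (illustrative data, not the study's).
n = 300
X = rng.normal(size=(n, 2))
latent = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)
cuts_true = np.array([-1.0, 0.0, 1.0])
y = np.searchsorted(cuts_true, latent)          # rating categories 0..3

def neg_loglik(params):
    beta, raw = params[:2], params[2:]
    # Parameterize cutpoints so they are strictly increasing.
    cuts = np.concatenate(([raw[0]], raw[0] + np.cumsum(np.exp(raw[1:]))))
    xb = X @ beta
    upper = np.concatenate((cuts, [np.inf]))[y]
    lower = np.concatenate(([-np.inf], cuts))[y]
    p = norm.cdf(upper - xb) - norm.cdf(lower - xb)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

res = minimize(neg_loglik, x0=np.zeros(5), method="BFGS")
print("marginal utilities (probit coefficients):", np.round(res.x[:2], 3))
```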

  9. Application of Grey Relational Analysis to Decision-Making during Product Development

    ERIC Educational Resources Information Center

    Hsiao, Shih-Wen; Lin, Hsin-Hung; Ko, Ya-Chuan

    2017-01-01

    A multi-attribute decision-making (MADM) approach was proposed in this study as a prediction method that differs from the conventional production and design methods for a product. When a client has different dimensional requirements, this approach can quickly provide a company with design decisions for each product. The production factors of a…
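    Grey relational analysis ranks alternatives by their closeness to an ideal reference sequence. The sketch below shows the usual normalize / grey-relational-coefficient / grade pipeline on a made-up alternatives-by-attributes matrix; the attribute meanings, the benefit-type assumption, and the distinguishing coefficient of 0.5 are illustrative choices, not taken from the article.

```python
import numpy as np

# Hypothetical alternatives x attributes matrix (all attributes treated as
# benefit-type, i.e., larger is better, in this toy example).
X = np.array([
    [0.7, 120.0, 3.2],
    [0.9,  95.0, 4.1],
    [0.6, 140.0, 2.8],
])

# 1. Normalize each attribute to [0, 1].
Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# 2. Grey relational coefficients against the reference (ideal) sequence.
ref = Z.max(axis=0)
delta = np.abs(ref - Z)
zeta = 0.5                                     # distinguishing coefficient
gamma = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# 3. Grey relational grade = mean coefficient per alternative; rank by grade.
grade = gamma.mean(axis=1)
print("grades:", np.round(grade, 3), "best alternative:", int(np.argmax(grade)))
```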

  10. Postoptimality analysis in the selection of technology portfolios

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Shelton, Kacie; Elfes, Alberto; Weisbin, Charles R.

    2006-01-01

    This paper describes an approach for qualifying optimal technology portfolios obtained with a multi-attribute decision support system. The goal is twofold: to gauge the degree of confidence in the optimal solution and to provide the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy non-technical constraints.

  11. Conjoint Analysis: A Preference-Based Approach for the Accounting of Multiple Benefits in Southern Forest Management

    Treesearch

    F. Christian Zinkhan; Thomas P. Holmes; D. Evan Mercer

    1997-01-01

    Conjoint analysis, which enables a manager to measure the relative importance of a forest's multidimensional attributes, is critically reviewed and assessed. Special attention is given to the feasibility of using conjoint analysis for measuring the utility of market and nonmarket outputs from southern forests. Also, an application to a case of designing a nature...

  12. Beyond diagnoses: family medicine core themes in student reflective writing.

    PubMed

    Bradner, Melissa K; Crossman, Steven H; Gary, Judy; Vanderbilt, Allison A; VanderWielen, Lynn

    2015-03-01

    We share qualitative study results of third-year medical student writings during their family medicine clerkship utilizing a reflective writing exercise from 2005 and 2013. For this paper, 50 student writings were randomly selected from the 2005 cohort in addition to 50 student writings completed by the 2013 cohort. Deductive thematic analysis utilizing Atlas.ti software was completed utilizing the Future of Family Medicine core attributes of family physicians as the a priori coding template. Student writings actively reflect key attributes of family physicians as described by the Future of Family Medicine Report: a deep understanding of the dynamics of the whole person, a generative impact on patients' lives, a talent for humanizing the health care experience, and a natural command of complexity and multidimensional access to care. We discuss how to lead the writing exercise and provide suggestions for facilitating the discussion to bring out these important aspects of family medicine care.

  13. Conflicts in Coalitions: A Stability Analysis of Robust Multi-City Regional Water Supply Portfolios

    NASA Astrophysics Data System (ADS)

    Gold, D.; Trindade, B. C.; Reed, P. M.; Characklis, G. W.

    2017-12-01

    Regional cooperation among water utilities can improve the robustness of urban water supply portfolios to deeply uncertain future conditions such as those caused by climate change or population growth. Coordination mechanisms such as water transfers, coordinated demand management, and shared infrastructure, can improve the efficiency of resource allocation and delay the need for new infrastructure investments. Regionalization does however come at a cost. Regionally coordinated water supply plans may be vulnerable to any emerging instabilities in the regional coalition. If one or more regional actors does not cooperate or follow the required regional actions in a time of crisis, the overall system performance may degrade. Furthermore, when crafting regional water supply portfolios, decision makers must choose a framework for measuring the performance of regional policies based on the evaluation of the objective values for each individual actor. Regional evaluations may inherently favor one actor's interests over those of another. This work focuses on four interconnected water utilities in the Research Triangle region of North Carolina for which robust regional water supply portfolios have previously been designed using multi-objective optimization to maximize the robustness of the worst performing utility across several objectives. This study 1) examines the sensitivity of portfolio performance to deviations from prescribed actions by individual utilities, 2) quantifies the implications of the regional formulation used to evaluate robustness for the portfolio performance of each individual utility and 3) elucidates the inherent regional tensions and conflicts that exist between utilities under this regionalization scheme through visual diagnostics of the system under simulated drought scenarios. Results of this analysis will help inform the creation of future regional water supply portfolios and provide insight into the nature of multi-actor water supply systems.

  14. Microbial Characteristics of Nosocomial Infections and Their Association with the Utilization of Hand Hygiene Products: A Hospital-Wide Analysis of 78,344 Cases.

    PubMed

    Liu, Song; Wang, Meng; Wang, Gefei; Wu, Xiuwen; Guan, Wenxian; Ren, Jianan

    Nosocomial infections are the main adverse events during health care delivery. Hand hygiene is the fundamental strategy for the prevention of nosocomial infections. Microbial characteristics of nosocomial infections in the Asia-Pacific region have not been investigated fully. Correlation between the use of hand hygiene products and the incidence of nosocomial infections is still unknown. This study investigates the microbial characteristics of nosocomial infections in the Asia-Pacific region and analyzes the association between the utilization of hand hygiene products and the incidence of nosocomial infections. A total of 78,344 patients were recruited from a major tertiary hospital in China. Microbial characteristics of major types of nosocomial infections were described. The association between the utilization of hand hygiene products and the incidence of nosocomial infections was analyzed using correlation and regression models. The overall incidence of nosocomial infections was 3.04%, in which the incidence of surgical site infection was 1%. Multi-drug resistance was found in 22.8% of all pathogens, in which multi-drug-resistant Acinetobacter baumannii and methicillin-resistant Staphylococcus aureus were 56.6% and 54.9%, respectively. The utilization of hand hygiene products (including hand sanitizer, soap and paper towel) was associated negatively with the incidence of surgical site infection in surgical departments and the incidence of nosocomial infections in non-intensive care unit (ICU) departments (especially in surgical departments). Regression analysis further identified that higher utilization of hand hygiene products correlated with decreased incidence of major types of nosocomial infections. Multi-drug-resistant organisms are emerging in Asia-Pacific health care facilities. Utilization of hand hygiene products is associated with the incidence of nosocomial infections.

  15. Multi-level optimization of a beam-like space truss utilizing a continuum model

    NASA Technical Reports Server (NTRS)

    Yates, K.; Gurdal, Z.; Thangjitham, S.

    1992-01-01

    A continuous beam model is developed for approximate analysis of a large, slender, beam-like truss. The model is incorporated in a multi-level optimization scheme for the weight minimization of such trusses. This scheme is tested against traditional optimization procedures for savings in computational cost. Results from both optimization methods are presented for comparison.

  16. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government-owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of... referent determined either from theory or SME opinion. 4. CGM Overview: The CGM is a government-owned, open source, data-driven multi-agent social... HSCB, validation, social network analysis. ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  17. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    PubMed

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market of cham-cham, a traditional Indian dairy product is expected in the coming future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production and to find out the attributes that govern much of variation in sensory scores of this product using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) difference in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4 % of the variation in the sensory data. Factor scores of each of the four principal components which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.
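    The PCA step can be sketched as follows: standardize the mean panel scores for each attribute, extract components, then inspect explained variance and factor scores. The panel data below are simulated placeholders, and the choice of four components simply mirrors the number reported above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# Hypothetical QDA panel data: rows = market samples, columns = sensory
# attributes (sweetness, interior dryness, surface appearance, rancidity,
# firmness, ...). Values stand in for mean panel ratings.
scores = rng.uniform(3, 9, size=(20, 8))

# Standardize attributes, then extract principal components of the sensory data.
pcs = PCA(n_components=4)
factor_scores = pcs.fit_transform(StandardScaler().fit_transform(scores))

print("explained variance ratio:", np.round(pcs.explained_variance_ratio_, 2))
print("cumulative:", round(pcs.explained_variance_ratio_.sum(), 2))
print("factor scores shape:", factor_scores.shape)   # samples x components
```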

  18. Variability Extraction and Synthesis via Multi-Resolution Analysis using Distribution Transformer High-Speed Power Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamana, Manohar; Mather, Barry A

    A library of load variability classes is created to produce scalable synthetic data sets using historical high-speed raw data. These data are collected from distribution monitoring units connected at the secondary side of a distribution transformer. Because of the irregular patterns and large volume of historical high-speed data sets, the utilization of current load characterization and modeling techniques is challenging. Multi-resolution analysis techniques are applied to extract the necessary components and eliminate the unnecessary components from the historical high-speed raw data to create the library of classes, which are then utilized to create new synthetic load data sets. A validation is performed to ensure that the synthesized data sets contain the same variability characteristics as the training data sets. The synthesized data sets are intended to be utilized in quasi-static time-series studies for distribution system planning on a granular scale, such as detailed PV interconnection studies.

  19. Model for multi-stand management based on structural attributes of individual stands

    Treesearch

    G.W. Miller; J. Sullivan

    1997-01-01

    A growing interest in managing forest ecosystems calls for decision models that take into account attribute goals for large forest areas while continuing to recognize the individual stand as a basic unit of forest management. A dynamic, nonlinear forest management model is described that schedules silvicultural treatments for individual stands that are linked by multi-...

  20. Evaluation of Savings in Energy-Efficient Public Housing in the Pacific Northwest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, A.; Lubliner, M.; Howard, L.

    2013-10-01

    This report presents the results of an energy performance and cost-effectiveness analysis. The Salishan phase 7 and demonstration homes were compared to Salishan phase 6 homes built to 2006 Washington State Energy Code specifications. Predicted annual energy savings (over Salishan phase 6) were 19% for Salishan phase 7 and between 19% and 24% for the demonstration homes (depending on ventilation strategy). Approximately two-thirds of the savings are attributable to the DHP. Working with the electric utility provider, Tacoma Public Utilities, researchers conducted a billing analysis for Salishan phase 7.

  1. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  2. Factors associated with utilization of long-acting and permanent contraceptive methods among women who have decided not to have more children in Gondar city.

    PubMed

    Zenebe, Chernet Baye; Adefris, Mulat; Yenit, Melaku Kindie; Gelaw, Yalemzewod Assefa

    2017-09-06

    Despite the fact that long-acting family planning methods reduce population growth and improve maternal health, their utilization remains poor. Therefore, this study assessed the prevalence of long-acting and permanent family planning method utilization and associated factors among women in reproductive age groups who have decided not to have more children in Gondar city, northwest Ethiopia. An institution-based cross-sectional study was conducted from August to October, 2015. Three hundred seventeen women who had decided not to have more children were selected consecutively into the study. A structured and pretested questionnaire was used to collect data. Both bivariate and multi-variable logistic regression analyses were used to identify factors associated with utilization of long-acting and permanent family planning methods. The multi-variable logistic regression analysis was used to investigate factors associated with the utilization of long-acting and permanent family planning methods. The Adjusted Odds Ratio (AOR) with the corresponding 95% Confidence Interval (CI) was used to show the strength of associations, and variables with a P-value of <0.05 were considered statistically significant. In this study, the overall prevalence of long-acting and permanent contraceptive method (LAPCM) utilization was 34.7% (95% CI: 29.5-39.9). According to the multi-variable logistic regression analysis, utilization of long-acting and permanent contraceptive methods was significantly associated with secondary school education (AOR: 2.279, 95% CI: 1.17, 4.44), college and above education (AOR: 2.91, 95% CI: 1.36, 6.24), history of previous utilization (AOR: 3.02, 95% CI: 1.69, 5.38), and information about LAPCM (AOR: 8.85, 95% CI: 2.04, 38.41). In this study, the prevalence of long-acting and permanent family planning method utilization among women who had decided not to have more children was high compared with previous studies conducted elsewhere. Advanced educational status, previous utilization of LAPCM, and information on LAPCM were significantly associated with the utilization of LAPCM. As a result, strengthening behavioral change communication channels to make information accessible is highly recommended.

  3. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
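
    For orientation only, the sketch below shows a multi-point normalization with an errors-in-variables (orthogonal distance) regression using scipy.odr. The reference values, uncertainties, and the two-parameter linear calibration are illustrative assumptions, not the models or data of this study.

```python
import numpy as np
from scipy import odr

# Hypothetical measured and assigned delta values (per mil) for three
# international reference materials, with standard uncertainties on both axes.
measured   = np.array([-30.10, -1.20, 2.05])   # values on the instrument scale
assigned   = np.array([-30.03, -1.30, 2.01])   # assigned reference values
u_measured = np.array([0.05, 0.05, 0.05])
u_assigned = np.array([0.04, 0.04, 0.04])

# Linear calibration y = b0*x + b1 fitted with errors in both variables.
model = odr.Model(lambda beta, x: beta[0] * x + beta[1])
data = odr.RealData(measured, assigned, sx=u_measured, sy=u_assigned)
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()
slope, intercept = fit.beta

# Normalize an unknown sample measurement onto the reference scale.
sample_measured = -25.70
print(slope * sample_measured + intercept)
```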

  4. Deriving a preference-based utility measure for cancer patients from the European Organisation for the Research and Treatment of Cancer’s Quality of Life Questionnaire C30: a confirmatory versus exploratory approach

    PubMed Central

    Costa, Daniel SJ; Aaronson, Neil K; Fayers, Peter M; Grimison, Peter S; Janda, Monika; Pallant, Julie F; Rowen, Donna; Velikova, Galina; Viney, Rosalie; Young, Tracey A; King, Madeleine T

    2014-01-01

    Background Multi-attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, first a HSCS must be derived. This typically involves selecting a subset of domains and items because HRQOL questionnaires typically have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA to derive a HSCS from the European Organisation for the Research and Treatment of Cancer's core HRQOL questionnaire, Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and views of both patients and clinicians on which are the most relevant items. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker-Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS for a proposed new cancer-specific MAUI. Our findings suggest that CFA should be recommended generally when deriving a preference-based measure from a HRQOL measure that has an established domain structure. PMID:25395875

  5. Mapping clinical outcomes expectations to treatment decisions: an application to vestibular schwannoma management.

    PubMed

    Cheung, Steven W; Aranda, Derick; Driscoll, Colin L W; Parsa, Andrew T

    2010-02-01

    Complex medical decision making obligates tradeoff assessments among treatment outcomes expectations, but an accessible tool to perform the necessary analysis is conspicuously absent. We aimed to demonstrate the methodology and feasibility of adapting conjoint analysis for mapping clinical outcomes expectations to treatment decisions in vestibular schwannoma (VS) management. Prospective. Tertiary medical center and US-based otologists/neurotologists. Treatment preference profiles among VS stakeholders (61 younger and 74 older prospective patients, 61 observation patients, and 60 surgeons) were assessed for the synthetic VS case scenario of a 10-mm tumor in association with useful hearing and normal facial function. Treatment attribute utility. Conjoint analysis attribute levels were set in accordance with the results of a meta-analysis. Forty-five case series were disaggregated to formulate microsurgery facial nerve and hearing preservation outcomes expectations models. Attribute utilities were computed and mapped to the realistic treatment choices of translabyrinthine craniotomy, middle fossa craniotomy, and gamma knife radiosurgery. Among the treatment attributes (likelihood of causing deafness, temporary facial weakness for 2 months, incurable cancer within 20 years, and recovery time), permanent deafness was less important to tumor surgeons, and temporary facial weakness was more important to tumor surgeons and observation patients (Wilcoxon rank-sum, p < 0.001). Inverse mapping of preference profiles to realistic treatment choices showed all study cohorts were inclined to choose gamma knife radiosurgery. Mapping clinical outcomes expectations to treatment decisions for a synthetic clinical scenario revealed inhomogeneous drivers of choice selection among study cohorts. Medical decision engines that analyze personal preferences of outcomes expectations for VS and many other diseases may be developed to promote shared decision making among health care stakeholders and transparency in the informed consent process.

  6. Multi-class geospatial object detection and geographic image classification based on collection of part detectors

    NASA Astrophysics Data System (ADS)

    Cheng, Gong; Han, Junwei; Zhou, Peicheng; Guo, Lei

    2014-12-01

    The rapid development of remote sensing technology has facilitated the acquisition of remote sensing images with higher and higher spatial resolution, but how to automatically understand the image contents is still a big challenge. In this paper, we develop a practical and rotation-invariant framework for multi-class geospatial object detection and geographic image classification based on a collection of part detectors (COPD). The COPD is composed of a set of representative and discriminative part detectors, where each part detector is a linear support vector machine (SVM) classifier used for the detection of objects or recurring spatial patterns within a certain range of orientation. Specifically, when performing multi-class geospatial object detection, we learn a set of seed-based part detectors where each part detector corresponds to a particular viewpoint of an object class, so the collection of them provides a solution for rotation-invariant detection of multi-class objects. When performing geographic image classification, we utilize a large number of pre-trained part detectors to discover distinctive visual parts from images and use them as attributes to represent the images. Comprehensive evaluations on two remote sensing image databases and comparisons with some state-of-the-art approaches demonstrate the effectiveness and superiority of the developed framework.

  7. Rooftop greenhouses in educational centers: A sustainability assessment of urban agriculture in compact cities.

    PubMed

    Nadal, Ana; Pons, Oriol; Cuerva, Eva; Rieradevall, Joan; Josa, Alejandro

    2018-06-01

    Today, urban agriculture is one of the most widely used sustainability strategies to improve the metabolism of a city. Schools can play an important role in the implementation of sustainability master plans, due to their socio-educational activities and their cohesive links with families, all key elements in the development of urban agriculture. Thus, the main objective of this research is to develop a procedure, in compact cities, to assess the potential installation of rooftop greenhouses (RTGs) in schools. The generation of a dynamic assessment tool capable of identifying and prioritizing schools with a high potential for RTGs and their eventual implementation would also represent a significant factor in the environmental, social, and nutritional education of younger generations. The methodology has four stages (Pre-selection criteria; Selection of necessities; Sustainability analysis; and Sensitivity analysis and selection of the best alternative) in which economic, environmental, social and governance aspects are all considered. It makes use of Multi-Attribute Utility Theory and Multi-Criteria Decision Making, through the Integrated Value Model for Sustainability Assessments and the participation of two panels of multidisciplinary specialists, for the preparation of a unified sustainability index that guarantees the objectivity of the selection process. This methodology has been applied and validated in a case study of 11 schools in Barcelona (Spain). The social perspective of the proposed methodology favored the school in the case study with the most staff and the largest parent-teacher association (social and governance indicators), which obtained the highest sustainability index (S11), at a considerable distance (45%) from the worst case (S3) with fewer school staff and less parental support. Finally, objective decisions may be taken with the assistance of this appropriate, adaptable, and reliable Multi-Criteria Decision-Making tool on the vertical integration and implementation of urban agriculture in schools, in support of the goals of sustainable development and the circular economy. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    PubMed

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    The multi-criteria analysis gives the opportunity to researchers, designers and decision-makers to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called simple multi-attribute rating technique exploiting ranks was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric, for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value regarding the sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results.
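
    As an illustration of how rank-based weighting and a weighted-sum composite index can work in this kind of assessment, the sketch below converts criterion ranks into rank-order-centroid weights (the weighting scheme commonly associated with SMARTER) and aggregates normalized indicator scores; the number of indicators and the score values are hypothetical, not the study's data.

```python
def roc_weights(n):
    """Rank-order centroid weights for n criteria ranked 1 (most important) to n."""
    return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

def composite_index(scores, weights):
    """Weighted-sum aggregation of normalized indicator scores (0-1 scale)."""
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical example: four indicators ranked by importance, and normalized
# scores for one wastewater treatment alternative (illustrative values only).
weights = roc_weights(4)          # approx. [0.521, 0.271, 0.146, 0.062]
scores = [0.8, 0.6, 0.9, 0.5]
print(round(composite_index(scores, weights), 3))
```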

  9. Predicting crash frequency for multi-vehicle collision types using multivariate Poisson-lognormal spatial model: A comparative analysis.

    PubMed

    Hosseinpour, Mehdi; Sahebi, Sina; Zamzuri, Zamira Hasanah; Yahaya, Ahmad Shukri; Ismail, Noriszura

    2018-06-01

    According to crash configuration and pre-crash conditions, traffic crashes are classified into different collision types. Based on the literature, multi-vehicle crashes, such as head-on, rear-end, and angle crashes, are more frequent than single-vehicle crashes, and most often result in serious consequences. From a methodological point of view, the majority of prior studies focused on multi-vehicle collisions have employed univariate count models to estimate crash counts separately by collision type. However, univariate models fail to account for correlations which may exist between different collision types. Among others, the multivariate Poisson lognormal (MVPLN) model with spatial correlation is a promising multivariate specification because it not only allows for unobserved heterogeneity (extra-Poisson variation) and dependencies between collision types, but also spatial correlation between adjacent sites. However, the MVPLN spatial model has rarely been applied in previous research for simultaneously modelling crash counts by collision type. Therefore, this study aims at utilizing a MVPLN spatial model to estimate crash counts for four different multi-vehicle collision types, including head-on, rear-end, angle, and sideswipe collisions. To investigate the performance of the MVPLN spatial model, a two-stage model and a univariate Poisson lognormal (UNPLN) spatial model were also developed in this study. Detailed information on roadway characteristics, traffic volume, and crash history was collected on 407 homogeneous segments from Malaysian federal roads. The results indicate that the MVPLN spatial model outperforms the comparison models in terms of goodness-of-fit measures. The results also show that the inclusion of spatial heterogeneity in the multivariate model significantly improves the model fit, as indicated by the Deviance Information Criterion (DIC). The correlation between crash types is high and positive, implying that the occurrence of a specific collision type is highly associated with the occurrence of other crash types on the same road segment. These results support the utilization of the MVPLN spatial model when predicting crash counts by collision manner. In terms of contributing factors, the results show that distinct crash types are attributed to different subsets of explanatory variables. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Landscape preference assessment of Louisiana river landscapes: a methodological study

    Treesearch

    Michael S. Lee

    1979-01-01

    The study pertains to the development of an assessment system for the analysis of visual preference attributed to Louisiana river landscapes. The assessment system was utilized in the evaluation of 20 Louisiana river scenes. Individuals were tested for their free choice preference for the same scenes. A statistical analysis was conducted to examine the relationship...

  11. The Spectrum Analysis Solution (SAS) System: Theoretical Analysis, Hardware Design and Implementation.

    PubMed

    Narayanan, Ram M; Pooler, Richard K; Martone, Anthony F; Gallagher, Kyle A; Sherbondy, Kelly D

    2018-02-22

    This paper describes a multichannel super-heterodyne signal analyzer, called the Spectrum Analysis Solution (SAS), which performs multi-purpose spectrum sensing to support spectrally adaptive and cognitive radar applications. The SAS operates from ultrahigh frequency (UHF) to the S-band and features a wideband channel with eight narrowband channels. The wideband channel acts as a monitoring channel that can be used to tune the instantaneous band of the narrowband channels to areas of interest in the spectrum. The data collected from the SAS has been utilized to develop spectrum sensing algorithms for the budding field of spectrum sharing (SS) radar. Bandwidth (BW), average total power, percent occupancy (PO), signal-to-interference-plus-noise ratio (SINR), and power spectral entropy (PSE) have been examined as metrics for the characterization of the spectrum. These metrics are utilized to determine a contiguous optimal sub-band (OSB) for a SS radar transmission in a given spectrum for different modalities. Three OSB algorithms are presented and evaluated: the spectrum sensing multi objective (SS-MO), the spectrum sensing with brute force PSE (SS-BFE), and the spectrum sensing multi-objective with brute force PSE (SS-MO-BFE).
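
    A rough sketch of two of the named metrics, percent occupancy and power spectral entropy, computed from a Welch power spectral density is given below; the threshold, window length, and normalization are assumptions for illustration and not the SAS implementation.

```python
import numpy as np
from scipy.signal import welch

def spectrum_metrics(x, fs, threshold_db=-90.0):
    """Illustrative percent-occupancy and power-spectral-entropy metrics."""
    f, psd = welch(x, fs=fs, nperseg=1024)
    psd_db = 10 * np.log10(psd + 1e-20)
    occupancy = 100.0 * np.mean(psd_db > threshold_db)  # percent of bins above threshold
    p = psd / psd.sum()                                 # normalize PSD to a probability mass
    entropy = -np.sum(p * np.log2(p + 1e-20))           # power spectral entropy in bits
    return occupancy, entropy

# Synthetic test signal: a single tone plus weak noise.
fs = 1e6
t = np.arange(0, 0.01, 1 / fs)
x = np.cos(2 * np.pi * 100e3 * t) + 0.01 * np.random.randn(t.size)
print(spectrum_metrics(x, fs))
```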

  12. The Spectrum Analysis Solution (SAS) System: Theoretical Analysis, Hardware Design and Implementation

    PubMed Central

    Pooler, Richard K.; Martone, Anthony F.; Gallagher, Kyle A.; Sherbondy, Kelly D.

    2018-01-01

    This paper describes a multichannel super-heterodyne signal analyzer, called the Spectrum Analysis Solution (SAS), which performs multi-purpose spectrum sensing to support spectrally adaptive and cognitive radar applications. The SAS operates from ultrahigh frequency (UHF) to the S-band and features a wideband channel with eight narrowband channels. The wideband channel acts as a monitoring channel that can be used to tune the instantaneous band of the narrowband channels to areas of interest in the spectrum. The data collected from the SAS has been utilized to develop spectrum sensing algorithms for the budding field of spectrum sharing (SS) radar. Bandwidth (BW), average total power, percent occupancy (PO), signal-to-interference-plus-noise ratio (SINR), and power spectral entropy (PSE) have been examined as metrics for the characterization of the spectrum. These metrics are utilized to determine a contiguous optimal sub-band (OSB) for a SS radar transmission in a given spectrum for different modalities. Three OSB algorithms are presented and evaluated: the spectrum sensing multi objective (SS-MO), the spectrum sensing with brute force PSE (SS-BFE), and the spectrum sensing multi-objective with brute force PSE (SS-MO-BFE). PMID:29470448

  13. Decision making for Pap testing among Pacific Islander women.

    PubMed

    Weiss, Jie W; Mouttapa, Michele; Sablan-Santos, Lola; DeGuzman Lacsamana, Jasmine; Quitugua, Lourdes; Park Tanjasiri, Sora

    2016-12-01

    This study employed a Multi-Attribute Utility (MAU) model to examine the Pap test decision-making process among Pacific Islanders (PI) residing in Southern California. A total of 585 PI women were recruited through social networks from Samoan and Tongan churches, and Chamorro family clans. A questionnaire assessed Pap test knowledge, beliefs and past behaviour. The three MAU parameters of subjective value, subjective probability and momentary salience were measured for eight anticipated consequences of having a Pap test (e.g., feeling embarrassed, spending money). Logistic regression indicated that women who had a Pap test (Pap women) had higher total MAU utility scores compared to women who had not had a Pap test within the past three years (No Pap women) (adjusted Odds Ratio = 1.10). In particular, Pap women had higher utilities for the positive consequences 'Detecting cervical cancer early, Peace of mind, and Protecting my family', compared to No Pap women. It is concluded that the connection between utility and behaviour offers a promising pathway toward a better understanding of the decision to undergo Pap testing. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
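
    A minimal sketch of the kind of multi-attribute utility aggregation described here is shown below, assuming each anticipated consequence carries a subjective value, a subjective probability, and a salience weight that are multiplied and summed; the consequence list and numbers are hypothetical, and the authors' exact MAU parameterization may differ.

```python
# Hypothetical consequence names and parameter values for illustration only.
consequences = {
    # name: (subjective_value, subjective_probability, salience_weight)
    "detect cervical cancer early": (+80, 0.60, 1.0),
    "peace of mind":                (+60, 0.70, 0.8),
    "feeling embarrassed":          (-30, 0.50, 0.5),
    "spending money":               (-20, 0.90, 0.4),
}

def total_utility(params):
    """Salience-weighted expected utility summed across anticipated consequences."""
    return sum(value * prob * salience for value, prob, salience in params.values())

print(total_utility(consequences))
```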

  14. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization within a multi-model approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement, and modeling data in the framework of the multi-model approach is described. The methodology and models for risk assessment within a decision support approach are defined and described. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is presented. Spectral signatures of freshwater bodies and offshore waters are analyzed. Correlations between spectral reflectance, pollutants, and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS, and Landsat sensors acquired in 2002-2014 have been utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support on water quality degradation risk is discussed. The decision on water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements, and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimation of water quality degradation rates and pollution risks. Problems of constructing the spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.

  15. Using Consumer Behavior and Decision Models to Aid Students in Choosing a Major.

    ERIC Educational Resources Information Center

    Kaynama, Shohreh A.; Smith, Louise W.

    1996-01-01

    A study found that using consumer behavior and decision models to guide students to a major can be useful and enjoyable for students. Students consider many of the basic parameters through multi-attribute and decision-analysis models, so time with professors, who were found to be the most influential group, can be used for more individual and…

  16. Application of a fuzzy multi-attribute decision analysis method to prioritize project success criteria

    NASA Astrophysics Data System (ADS)

    Phong, Nguyen Thanh; Quyen, Nguyen Le Hoang Thuy To

    2017-11-01

    Project success is a foundation for the project owner to manage and control not only the current project but also future potential projects in construction companies. However, identifying the key success criteria for evaluating a particular project in real practice is a challenging task. Normally, it depends on many factors, such as the expectations of the project owner and stakeholders, the triple constraints of the project (cost, time, quality), and the company's mission, vision, and objectives. Traditional decision-making methods for measuring project success are usually based on subjective opinions of panel experts, resulting in irrational and inappropriate decisions. Therefore, this paper introduces a multi-attribute decision analysis method (MADAM) for weighting project success criteria by using a fuzzy Analytical Hierarchy Process approach. It is found that this method is useful when dealing with imprecise and uncertain human judgments in evaluating project success criteria. Moreover, this research also suggests that although cost, time, and quality are three key project success criteria, the satisfaction of the project owner and the acceptance of project stakeholders with the completed project are the most important criteria for project success evaluation in Vietnam.
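
    To make the weighting step concrete, the sketch below derives criterion weights from a single crisp pairwise comparison matrix using the geometric-mean approximation of the principal eigenvector and checks consistency. A fuzzy AHP, as used in the study, would extend this with fuzzy comparison judgments; the matrix values here are purely illustrative.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four success criteria
# (cost, time, quality, stakeholder satisfaction) on Saaty's 1-9 scale.
A = np.array([
    [1,   2,   1/2, 1/3],
    [1/2, 1,   1/3, 1/4],
    [2,   3,   1,   1/2],
    [3,   4,   2,   1  ],
], dtype=float)

# Geometric-mean (row) approximation of the principal eigenvector.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio check using Saaty's random index for n = 4 (RI = 0.90).
lam_max = (A @ weights / weights).mean()
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.90
print(weights.round(3), round(cr, 3))
```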

  17. Provenance Establishment of Stingless Bee Honey Using Multi-element Analysis in Combination with Chemometrics Techniques.

    PubMed

    Shadan, Aidil Fahmi; Mahat, Naji A; Wan Ibrahim, Wan Aini; Ariffin, Zaiton; Ismail, Dzulkiflee

    2018-01-01

    As consumption of stingless bee honey has been gaining popularity in many countries including Malaysia, the ability to identify accurately its geographical origin proves pertinent for investigating fraudulent activities for consumer protection. Because a chemical signature can be location-specific, multi-element distribution patterns may prove useful for provenancing such products. Using an inductively coupled plasma optical emission spectrometer as well as principal component analysis (PCA) and linear discriminant analysis (LDA), the distributions of multi-elements in stingless bee honey collected at four different geographical locations (North, West, East, and South) in Johor, Malaysia, were investigated. While cross-validation using PCA demonstrated an 87.0% correct classification rate, the same was improved (96.2%) with the use of LDA, indicating that discrimination was possible for the different geographical regions. Therefore, utilization of multi-element analysis coupled with chemometrics techniques for assigning the provenance of stingless bee honeys for forensic applications is supported. © 2017 American Academy of Forensic Sciences.
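
    Purely as an illustration of this chemometrics workflow (not the authors' code or data), the sketch below cross-validates a linear discriminant analysis classifier on standardized multi-element profiles using scikit-learn; the sample matrix, labels, and element count are synthetic placeholders standing in for ICP-OES measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical element concentrations (rows: honey samples, columns: elements)
# and geographical origin labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 8))
y = np.repeat(["North", "West", "East", "South"], 10)

# Standardize, then classify; leave-one-out cross-validation gives the
# "correct classification rate" style of figure reported in such studies.
model = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(f"cross-validated classification rate: {scores.mean():.1%}")
```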

  18. Ducts Sealing Using Injected Spray Sealant, Raleigh, North Carolina (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2014-03-01

    In multifamily and attached buildings, traditional duct sealing methods are often impractical or costly and disruptive because of the difficulty in accessing leakage sites. In this project, two retrofit duct sealing techniques, manually applied sealant and injected spray sealant, were implemented in several low-rise multi-unit buildings. An analysis of the cost and performance of the two methods is presented. Each method was used in twenty housing units: approximately half of each group of units are single story and the remainder two-story. Results show that duct leakage to the outside was reduced by an average of 59% through the use of manual methods, and by 90% in the units where the injected spray sealant was used. It was found that 73% of the leakage reduction in homes that were treated with injected spray sealant was attributable to the manual sealing done at boots, returns and the air handler. The cost of manually applying sealant ranged from $275 to $511 per unit, and for the injected spray sealant the cost was $700 per unit. Modeling suggests a simple payback of 2.2 years for manual sealing and 4.7 years for the injected spray sealant system. Utility bills were collected for one year before and after the retrofits. Utility bill analysis shows heating-season energy savings of 14% with the injected spray sealant system and 16% with the hand sealing procedure, whereas in the cooling season, energy savings were 16% for both methods.
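
    The simple-payback figures quoted above follow directly from dividing retrofit cost by annual utility-bill savings. The sketch below shows the arithmetic; the annual savings values are illustrative numbers back-calculated only approximately from the reported costs and paybacks, not figures from the report.

```python
def simple_payback(retrofit_cost, annual_energy_savings):
    """Years to recover the retrofit cost from annual utility-bill savings."""
    return retrofit_cost / annual_energy_savings

# Illustrative inputs only (assumed annual savings in dollars per unit).
print(round(simple_payback(511, 230), 1))   # manual sealing, upper cost bound
print(round(simple_payback(700, 150), 1))   # injected spray sealant
```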

  19. Multiple utility constrained multi-objective programs using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Abbasian, Pooneh; Mahdavi-Amiri, Nezam; Fazlollahtabar, Hamed

    2018-03-01

    A utility function is an important tool for representing a DM's preference. We adjoin utility functions to multi-objective optimization problems. In current studies, usually one utility function is used for each objective function. Situations may arise, however, in which a goal has multiple utility functions. Here, we consider a constrained multi-objective problem with each objective having multiple utility functions. We induce the probability of the utilities for each objective function using Bayesian theory. Illustrative examples considering dependence and independence of variables are worked through to demonstrate the usefulness of the proposed model.

  20. Diverse expected gradient active learning for relative attributes.

    PubMed

    You, Xinge; Wang, Ruxin; Tao, Dacheng

    2014-07-01

    The use of relative attributes for semantic understanding of images and videos is a promising way to improve communication between humans and machines. However, it is extremely labor- and time-consuming to define multiple attributes for each instance in large amount of data. One option is to incorporate active learning, so that the informative samples can be actively discovered and then labeled. However, most existing active-learning methods select samples one at a time (serial mode), and may therefore lose efficiency when learning multiple attributes. In this paper, we propose a batch-mode active-learning method, called diverse expected gradient active learning. This method integrates an informativeness analysis and a diversity analysis to form a diverse batch of queries. Specifically, the informativeness analysis employs the expected pairwise gradient length as a measure of informativeness, while the diversity analysis forces a constraint on the proposed diverse gradient angle. Since simultaneous optimization of these two parts is intractable, we utilize a two-step procedure to obtain the diverse batch of queries. A heuristic method is also introduced to suppress imbalanced multiclass distributions. Empirical evaluations of three different databases demonstrate the effectiveness and efficiency of the proposed approach.

  1. Integrating sensory evaluation in adaptive conjoint analysis to elaborate the conflicting influence of intrinsic and extrinsic attributes on food choice.

    PubMed

    Hoppert, Karin; Mai, Robert; Zahn, Susann; Hoffmann, Stefan; Rohm, Harald

    2012-12-01

    Sensory properties and packaging information are factors which considerably contribute to food choice. We present a new methodology in which sensory preference testing was integrated in adaptive conjoint analysis. By simultaneous variation of intrinsic and extrinsic attributes on identical levels, this procedure allows assessing the importance of attribute/level combinations for product selection. In a set-up with nine pair-wise comparisons and four subsequent calibration assessments, 101 young consumers evaluated vanilla yoghurt that was varied in fat content (four levels), sugar content (two levels) and flavour intensity (two levels); the same attribute/level combinations were also presented as extrinsic information. The results indicate that the evaluation of a particular attribute may largely diverge in intrinsic and in extrinsic processing. We noticed from our utility values that, for example, the acceptance of yoghurt increases with an increasing level of the actual fat content, whereas acceptance diminishes when a high fat content is labelled on the product. This article further implies that neglecting these diverging relationships may lead to an over- or underestimation of the importance of an attribute for food choice. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    NASA Astrophysics Data System (ADS)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. It describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. Recognizing the gap in spatially-explicit accuracy assessment techniques for complex spatial models, it proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver-Operator Characteristic curve, the impurity entropy and Gini functions, and the Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially-explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. The dissertation emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to the science are presented along with valuable directions for future research.

  3. Follow on Research for Multi-Utility Technology Test Bed Aircraft at NASA Dryden Flight Research Center (FY13 Progress Report)

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi

    2013-01-01

    Modern aircraft employ a significant fraction of their weight in composite materials to reduce weight and improve performance. Aircraft aeroservoelastic models are typically characterized by significant levels of model parameter uncertainty due to the composite manufacturing process. Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi Utility Technology Test-bed (MUTT) aircraft is the flight demonstration of active flutter suppression; therefore, this study identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the MUTT aircraft. The ground-vibration-test-validated structural dynamic finite element model of the MUTT aircraft is created in this study. The structural dynamic finite element model of the MUTT aircraft is improved using the in-house Multi-disciplinary Design, Analysis, and Optimization tool. In this study, two different weight configurations of the MUTT aircraft have been improved simultaneously in a single model tuning procedure.

  4. Multi criteria evaluation for universal soil loss equation based on geographic information system

    NASA Astrophysics Data System (ADS)

    Purwaamijaya, I. M.

    2018-05-01

    The purpose of this research was to produce (1) a conceptual and functional model design and implementation for the universal soil loss equation (USLE), (2) a standard operational procedure for multi-criteria evaluation of the USLE using a geographic information system, (3) an overlay of land cover, slope, soil, and rainfall layers to obtain the USLE using multi-criteria evaluation, (4) a thematic map of the USLE for the watershed, and (5) an attribute table of the USLE for the watershed. Descriptive and formal correlation methods are used for this research. The study location was the Cikapundung Watershed, Bandung, West Java, Indonesia. The research was conducted from January 2016 to May 2016. A spatial analysis is used to superimpose the land cover, slope, soil, and rainfall layers to compute the USLE. Multi-criteria evaluation of the USLE using a geographic information system could be used for conservation programs.
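
    The GIS overlay described amounts to a cell-by-cell multiplication of factor rasters, since the USLE estimates soil loss as A = R * K * LS * C * P. The sketch below shows this with small synthetic numpy arrays standing in for the rainfall, soil, slope, and land-cover layers; the factor values are illustrative, not the study's data.

```python
import numpy as np

def usle(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P (per-cell soil loss)."""
    return R * K * LS * C * P

# Hypothetical 3x3 raster layers derived from rainfall, soil, slope, and land-cover maps.
R  = np.full((3, 3), 1200.0)             # rainfall erosivity factor
K  = np.full((3, 3), 0.25)               # soil erodibility factor
LS = np.array([[0.5, 1.0, 2.0]] * 3)     # slope length-steepness factor
C  = np.full((3, 3), 0.2)                # cover management factor
P  = np.full((3, 3), 1.0)                # support practice factor

A = usle(R, K, LS, C, P)                 # estimated soil loss per cell
print(A)
```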

  5. Where's the beef? Retail channel choice and beef preferences in Argentina.

    PubMed

    Colella, Florencia; Ortega, David L

    2017-11-01

    Argentinean beef is recognized and demanded internationally. Locally, consumers are often unable to afford certified beef products, and may rely on external cues to determine beef quality. Uncovering demand for beef attributes and marketing them accordingly may require an understanding of consumers' product purchasing strategies, which involves retailer choice. We develop a framework utilizing latent class analysis to identify consumer groups with different retailer preferences, and separately estimate their demand for beef product attributes. This framework accounts for the interrelationship between consumers' choice of retail outlets and beef product preferences. Our analysis of data from the city of Buenos Aires identifies two groups of consumers, a convenience-oriented (67%) and a service-oriented (33%) group. We find significant differences in demand for beef attributes across these groups, and find that the service-oriented group, while not willing to pay for credence attributes, relies on a service-providing retailer, namely a butcher, as a source of product quality assurance. Copyright © 2017. Published by Elsevier Ltd.

  6. Strategic rehabilitation planning of piped water networks using multi-criteria decision analysis.

    PubMed

    Scholten, Lisa; Scheidegger, Andreas; Reichert, Peter; Maurer, Max; Lienert, Judit

    2014-02-01

    To overcome the difficulties of strategic asset management of water distribution networks, a pipe failure and a rehabilitation model are combined to predict the long-term performance of rehabilitation strategies. Bayesian parameter estimation is performed to calibrate the failure and replacement model based on a prior distribution inferred from three large water utilities in Switzerland. Multi-criteria decision analysis (MCDA) and scenario planning build the framework for evaluating 18 strategic rehabilitation alternatives under future uncertainty. Outcomes for three fundamental objectives (low costs, high reliability, and high intergenerational equity) are assessed. Exploitation of stochastic dominance concepts helps to identify twelve non-dominated alternatives and local sensitivity analysis of stakeholder preferences is used to rank them under four scenarios. Strategies with annual replacement of 1.5-2% of the network perform reasonably well under all scenarios. In contrast, the commonly used reactive replacement is not recommendable unless cost is the only relevant objective. Exemplified for a small Swiss water utility, this approach can readily be adapted to support strategic asset management for any utility size and based on objectives and preferences that matter to the respective decision makers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer...pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image...analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast

  8. A needs analysis method for land-use planning of illegal dumping sites: a case study in Aomori-Iwate, Japan.

    PubMed

    Ishii, Kazuei; Furuichi, Toru; Nagao, Yukari

    2013-02-01

    Land use at contaminated sites, following remediation, is often needed for regional redevelopment. However, there exist few methods of developing economically and socially feasible land-use plans based on regional needs, because of the wide variety of land-use requirements. This study proposes a new needs analysis method for the conceptual land-use planning of contaminated sites and illustrates this method with a case study of an illegal dumping site for hazardous waste. In this method, planning factors consisting of the land-use attributes and related facilities are extracted from the potential needs of the residents through a preliminary questionnaire. Using the extracted attributes of land use and the related facilities, land-use cases are designed for selection-based conjoint analysis. A second questionnaire, administered to respondents of the first who indicated an interest in further participation, is used for the conjoint analysis to determine the utility function and marginal cost of each attribute, in order to prioritize the planning factors and develop a quantitative, economically and socially feasible land-use plan. Based on the results, site-specific land-use alternatives are developed and evaluated by the utility function obtained from the conjoint analysis. In this case study of an illegal dumping site for hazardous waste, the uses preferred as part of a conceptual land-use plan following remediation of the site were (1) agricultural land and a biogas plant designed to recover energy from biomass or (2) a park with a welfare facility and an athletic field. Our needs analysis method with conjoint analysis is applicable to the development of conceptual land-use planning for similar sites following remediation, particularly when added value is considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Students' Self-Identified Long-Term Leadership Development Goals: An Analysis by Gender and Race

    ERIC Educational Resources Information Center

    Rosch, David M.; Boyd, Barry L.; Duran, Kristina M.

    2014-01-01

    Leadership development goal statements of 92 undergraduate students enrolled in a multi-year self-directed leadership development program were analyzed using content and thematic analyses to investigate patterns of similarities and differences across gender and race. This qualitative analysis utilized a theoretical framework that approached…

  10. Drivers of liking for soy-based Indian-style extruded snack foods determined by U.S. and Indian consumers.

    PubMed

    Neely, Erika A; Lee, Youngsoo; Lee, Soo-Yeun

    2010-08-01

    Although many researchers have studied potential ways to deliver soy in novel forms, little is known about specific sensory attributes associated with soy snacks, or how those attributes drive liking for consumers. The first objective of this study was to use sensory descriptive analysis to characterize 9 extruded soy snacks with varying soy levels and soy grits contents. A total of 12 trained panelists used a descriptive analysis method to evaluate the snacks and found 14 attributes to be significantly different across the samples. Furthermore, it is not known how preferences of Indian snack consumers living in the United States and India may vary for sensory attributes of soy snacks. The 2nd objective was to correlate descriptive profiling data and previously collected consumer data to construct preference maps illustrating consumers' attitudes toward the snacks. Results indicate that consumers generally accept samples characterized by attributes such as crunchy, cumin, curry, salty, and umami, but dislike samples with wheat, rough, or porous attributes. Indian consumers differed from the U.S. consumers in that their preferences were more varied, and they tended to be more tolerant of wheat and porous attributes. Therefore, different strategies should be utilized when developing products for these groups to cater to their specific inclinations.

  11. A Model for Communications Satellite System Architecture Assessment

    DTIC Science & Technology

    2011-09-01

    This is shown in Equation 4. The total system cost includes all development, acquisition, fielding, operations, maintenance and upgrades, and system...protection. A mathematical model was implemented to enable the analysis of communications satellite system architectures based on multiple system attributes. Utilization of the model in

  12. School adjustment of children in residential care: a multi-source analysis.

    PubMed

    Martín, Eduardo; Muñoz de Bustillo, María del Carmen

    2009-11-01

    School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of perceptive attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.

  13. An Australian discrete choice experiment to value EQ-5D health states.

    PubMed

    Viney, Rosalie; Norman, Richard; Brazier, John; Cronin, Paula; King, Madeleine T; Ratcliffe, Julie; Street, Deborah

    2014-06-01

    Conventionally, generic quality-of-life health states, defined within multi-attribute utility instruments, have been valued using a Standard Gamble or a Time Trade-Off. Both are grounded in expected utility theory but impose strong assumptions about the form of the utility function. Preference elicitation tasks for both are complicated, limiting the number of health states that each respondent can value and, therefore, that can be valued overall. The usual approach has been to value a set of the possible health states and impute values for the remainder. Discrete Choice Experiments (DCEs) offer an attractive alternative, allowing investigation of more flexible specifications of the utility function and greater coverage of the response surface. We designed a DCE to obtain values for EQ-5D health states and implemented it in an Australia-representative online panel (n = 1,031). A range of specifications investigating non-linear preferences with respect to time and interactions between EQ-5D levels were estimated using a random-effects probit model. The results provide empirical support for a flexible utility function, including at least some two-factor interactions. We then constructed a preference index such that full health and death were valued at 1 and 0, respectively, to provide a DCE-based algorithm for Australian cost-utility analyses. Copyright © 2013 John Wiley & Sons, Ltd.
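
    As a hedged illustration of how a DCE-based utility algorithm can be anchored so that full health equals 1 and death equals 0, the sketch below uses one common approach from DCEs that include survival duration as an attribute: dimension-level decrements are rescaled by the marginal utility of time. The coefficients are invented for illustration and this is not necessarily the specification estimated in the study.

```python
# Hypothetical coefficients: a marginal utility of survival time and
# disutility decrements for moving EQ-5D dimensions off their best level.
beta_time = 0.35
decrements = {
    "mobility: some problems": -0.07,
    "pain/discomfort: moderate": -0.10,
    "anxiety/depression: moderate": -0.06,
}

def anchored_index(state_decrements, beta_time):
    """Rescale summed decrements by the time coefficient so that
    full health maps to 1.0 and a state as bad as dead maps to 0.0."""
    return 1.0 + sum(state_decrements) / beta_time

state = [decrements["mobility: some problems"], decrements["pain/discomfort: moderate"]]
print(round(anchored_index(state, beta_time), 3))   # about 0.514
```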

  14. Adaptation of Decoy Fusion Strategy for Existing Multi-Stage Search Workflows

    NASA Astrophysics Data System (ADS)

    Ivanov, Mark V.; Levitsky, Lev I.; Gorshkov, Mikhail V.

    2016-09-01

    A number of proteomic database search engines implement multi-stage strategies aiming at increasing the sensitivity of proteome analysis. These approaches often employ a subset of the original database for the secondary stage of analysis. However, if the target-decoy approach (TDA) is used for false discovery rate (FDR) estimation, the multi-stage strategies may violate the underlying assumption of TDA that false matches are distributed uniformly across the target and decoy databases. This violation occurs if the numbers of target and decoy proteins selected for the second search are not equal. Here, we propose a method of decoy database generation based on the previously reported decoy fusion strategy. This method allows unbiased TDA-based FDR estimation in multi-stage searches and can be easily integrated into existing workflows utilizing popular search engines and post-search algorithms.
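
    To ground the FDR discussion, the sketch below shows the basic target-decoy estimate (decoy hits divided by target hits above a score threshold), which relies on the uniformity assumption that multi-stage searches can violate when the target and decoy subsets are unbalanced; the scores are invented and this is not the decoy fusion implementation itself.

```python
def tda_fdr(psm_scores, threshold):
    """Estimate FDR at a score threshold as (#decoy hits) / (#target hits).

    psm_scores: list of (score, is_decoy) tuples for peptide-spectrum matches.
    Assumes false target matches and decoy matches are equally likely above
    any threshold, which is the assumption multi-stage searches can violate.
    """
    targets = sum(1 for score, is_decoy in psm_scores if score >= threshold and not is_decoy)
    decoys = sum(1 for score, is_decoy in psm_scores if score >= threshold and is_decoy)
    return decoys / max(targets, 1)

# Hypothetical scores for illustration only.
psms = [(52.1, False), (48.3, False), (47.9, True), (45.0, False), (41.2, True)]
print(tda_fdr(psms, threshold=45.0))
```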

  15. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual's communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE (Retrospective Analysis of Communications Events), a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  16. Impaired Clearance And Enhanced Pulmonary Inflammatory/Fibrotic Response To Carbon Nanotubes In Myeloperoxidase-Deficient Mice

    DTIC Science & Technology

    2012-03-30

    utilized SWCNT, it is highly likely that multi-walled carbon nanotubes, fullerenes, graphene and other carbonaceous particles may also undergo MPO...screening and analysis system to distinguish between the organic tissue and the inorganic SWCNT (under bright field imaging settings). Optically...Cytotoxicity of carbon nanomaterials: single-wall nanotube, multi-wall nanotube, and fullerene. Environ Sci Technol 39: 1378–1383. 7. Kisin ER, Murray

  17. Systems Biology Approaches for Host–Fungal Interactions: An Expanding Multi-Omics Frontier

    PubMed Central

    Culibrk, Luka; Croft, Carys A.

    2016-01-01

    Abstract Opportunistic fungal infections are an increasing threat for global health, and for immunocompromised patients in particular. These infections are characterized by interaction between fungal pathogen and host cells. The exact mechanisms and the attendant variability in host and fungal pathogen interaction remain to be fully elucidated. The field of systems biology aims to characterize a biological system, and utilize this knowledge to predict the system's response to stimuli such as fungal exposures. A multi-omics approach, for example, combining data from genomics, proteomics, metabolomics, would allow a more comprehensive and pan-optic “two systems” biology of both the host and the fungal pathogen. In this review and literature analysis, we present highly specialized and nascent methods for analysis of multiple -omes of biological systems, in addition to emerging single-molecule visualization techniques that may assist in determining biological relevance of multi-omics data. We provide an overview of computational methods for modeling of gene regulatory networks, including some that have been applied towards the study of an interacting host and pathogen. In sum, comprehensive characterizations of host–fungal pathogen systems are now possible, and utilization of these cutting-edge multi-omics strategies may yield advances in better understanding of both host biology and fungal pathogens at a systems scale. PMID:26885725

  18. TH-CD-202-07: A Methodology for Generating Numerical Phantoms for Radiation Therapy Using Geometric Attribute Distribution Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolly, S; Chen, H; Mutic, S

    Purpose: A persistent challenge for the quality assessment of radiation therapy treatments (e.g. contouring accuracy) is the absence of the known, ground truth for patient data. Moreover, assessment results are often patient-dependent. Computer simulation studies utilizing numerical phantoms can be performed for quality assessment with a known ground truth. However, previously reported numerical phantoms do not include the statistical properties of inter-patient variations, as their models are based on only one patient. In addition, these models do not incorporate tumor data. In this study, a methodology was developed for generating numerical phantoms which encapsulate the statistical variations of patients within radiation therapy, including tumors. Methods: Based on previous work in contouring assessment, geometric attribute distribution (GAD) models were employed to model both the deterministic and stochastic properties of individual organs via principal component analysis. Using pre-existing radiation therapy contour data, the GAD models are trained to model the shape and centroid distributions of each organ. Then, organs with different shapes and positions can be generated by assigning statistically sound weights to the GAD model parameters. Organ contour data from 20 retrospective prostate patient cases were manually extracted and utilized to train the GAD models. As a demonstration, computer-simulated CT images of generated numerical phantoms were calculated and assessed subjectively and objectively for realism. Results: A cohort of numerical phantoms of the male human pelvis was generated. CT images were deemed realistic both subjectively and objectively in terms of image noise power spectrum. Conclusion: A methodology has been developed to generate realistic numerical anthropomorphic phantoms using pre-existing radiation therapy data. The GAD models guarantee that generated organs span the statistical distribution of observed radiation therapy patients, according to the training dataset. The methodology enables radiation therapy treatment assessment with multi-modality imaging and a known ground truth, and without patient-dependent bias.
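
    As a rough illustration of the principal-component shape modeling the record describes, the sketch below learns modes of contour variation from a training set and samples new, statistically plausible shapes. The array sizes, random training data, and sampling range are assumptions for illustration, not the authors' GAD implementation.

      import numpy as np

      rng = np.random.default_rng(0)
      # 20 hypothetical training shapes, each described by 50 contour points (x, y) -> 100 features
      shapes = rng.normal(size=(20, 100))

      mean_shape = shapes.mean(axis=0)
      centered = shapes - mean_shape
      # Principal components (modes) of shape variation via SVD
      U, s, Vt = np.linalg.svd(centered, full_matrices=False)
      eigvals = (s ** 2) / (len(shapes) - 1)      # variance explained by each mode
      n_modes = 5

      # Generate a new shape by sampling mode weights within +/- 2 standard deviations
      weights = rng.uniform(-2, 2, size=n_modes) * np.sqrt(eigvals[:n_modes])
      new_shape = mean_shape + weights @ Vt[:n_modes]
      print(new_shape.shape)  # (100,) -> 50 synthetic contour points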

  19. New paradigms for Salmonella source attribution based on microbial subtyping.

    PubMed

    Mughini-Gras, Lapo; Franz, Eelco; van Pelt, Wilfrid

    2018-05-01

    Microbial subtyping is the most common approach for Salmonella source attribution. Typically, attributions are computed using frequency-matching models like the Dutch and Danish models based on phenotyping data (serotyping, phage-typing, and antimicrobial resistance profiling). Herewith, we critically review three major paradigms facing Salmonella source attribution today: (i) the use of genotyping data, particularly Multi-Locus Variable Number of Tandem Repeats Analysis (MLVA), which is replacing traditional Salmonella phenotyping beyond serotyping; (ii) the integration of case-control data into source attribution to improve risk factor identification/characterization; (iii) the investigation of non-food sources, as attributions tend to focus on foods of animal origin only. Population genetics models or simplified MLVA schemes may provide feasible options for source attribution, although there is a strong need to explore novel modelling options as we move towards whole-genome sequencing as the standard. Classical case-control studies are enhanced by incorporating source attribution results, as individuals acquiring salmonellosis from different sources have different associated risk factors. Thus, the more such analyses are performed the better Salmonella epidemiology will be understood. Reparametrizing current models allows for inclusion of sources like reptiles, the study of which improves our understanding of Salmonella epidemiology beyond food to tackle the pathogen in a more holistic way. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Merging information from multi-model flood projections in a hierarchical Bayesian framework

    NASA Astrophysics Data System (ADS)

    Le Vine, Nataliya

    2016-04-01

    Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
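
    As a simplified stand-in for the combination problem described above (not the paper's hierarchical Bayesian model), the sketch below pools several models' flood estimates using their individual variances plus a method-of-moments estimate of the shared between-model discrepancy. All numbers are hypothetical.

      import numpy as np

      estimates = np.array([850.0, 920.0, 780.0, 1010.0])   # e.g. 100-year flood estimates, m^3/s
      variances = np.array([40.0, 55.0, 35.0, 60.0]) ** 2   # each model's estimation variance

      w = 1.0 / variances
      pooled = np.sum(w * estimates) / np.sum(w)
      # Method-of-moments estimate of the shared between-model discrepancy variance
      q = np.sum(w * (estimates - pooled) ** 2)
      tau2 = max(0.0, (q - (len(estimates) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

      w_star = 1.0 / (variances + tau2)
      combined = np.sum(w_star * estimates) / np.sum(w_star)
      combined_se = np.sqrt(1.0 / np.sum(w_star))
      print(round(combined, 1), round(combined_se, 1))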

  1. Advanced 3D Geological Modelling Using Multi Geophysical Data in the Yamagawa Geothermal Field, Japan

    NASA Astrophysics Data System (ADS)

    Mochinaga, H.; Aoki, N.; Mouri, T.

    2017-12-01

    We propose a robust workflow of 3D geological modelling based on integrated analysis while honouring seismic, gravity, and wellbore data for exploration and development at flash steam geothermal power plants. We design the workflow using temperature logs at less than 10 well locations for practical use at an early stage of geothermal exploration and development. In the workflow, a geostatistical technique, multi-attribute analysis, and an artificial neural network are employed to integrate the multiple geophysical data. The geological modelling is verified using 3D seismic data acquired in the Yamagawa Demonstration Area (approximately 36 km2), located in the city of Ibusuki in Kagoshima, Japan in 2015. Temperature-depth profiles are typically characterized by heat transfer of conduction, outflow, and up-flow, which have low-frequency trends. On the other hand, feed and injection zones with high permeability would cause high-frequency perturbations on temperature-depth profiles. Each trend is supposed to be caused by different geological properties and subsurface structures. In this study, we estimate high frequency (> 2 cycles/km) and low frequency (< 1 cycle/km) models separately by means of different types of attribute volumes. These attributes are mathematically generated from P-impedance and density volumes derived from seismic inversion, an ant-tracking seismic volume, and a geostatistical temperature model prior to applying the artificial neural network to the geothermal modelling. As a result, the band-limited stepwise approach predicts a more precise geothermal model than using full-band temperature profiles all at once. In addition, lithofacies interpretation confirms the reliability of the predicted geothermal model. The integrated interpretation is significantly consistent with geological reports from previous studies. Isotherm geobodies illustrate specific features of the geothermal reservoir and cap rock, the shallow aquifer, and its hydrothermal circulation in 3D visualization. The advanced workflow of 3D geological modelling is suitable for optimization of well locations for production and reinjection in geothermal fields.

  2. On the utilization of novel spectral laser scanning for three-dimensional classification of vegetation elements.

    PubMed

    Li, Zhan; Schaefer, Michael; Strahler, Alan; Schaaf, Crystal; Jupp, David

    2018-04-06

    The Dual-Wavelength Echidna Lidar (DWEL), a full waveform terrestrial laser scanner (TLS), has been used to scan a variety of forested and agricultural environments. From these scanning campaigns, we summarize the benefits and challenges given by DWEL's novel coaxial dual-wavelength scanning technology, particularly for the three-dimensional (3D) classification of vegetation elements. Simultaneous scanning at both 1064 nm and 1548 nm by DWEL instruments provides a new spectral dimension to TLS data that joins the 3D spatial dimension of lidar as an information source. Our point cloud classification algorithm explores the utilization of both spectral and spatial attributes of individual points from DWEL scans and highlights the strengths and weaknesses of each attribute domain. The spectral and spatial attributes for vegetation element classification each perform better in different parts of vegetation (canopy interior, fine branches, coarse trunks, etc.) and under different vegetation conditions (dead or live, leaf-on or leaf-off, water content, etc.). These environmental characteristics of vegetation, convolved with the lidar instrument specifications and lidar data quality, result in the actual capabilities of spectral and spatial attributes to classify vegetation elements in 3D space. The spectral and spatial information domains thus complement each other in the classification process. The joint use of both not only enhances the classification accuracy but also reduces its variance across the multiple vegetation types we have examined, highlighting the value of the DWEL as a new source of 3D spectral information. Wider deployment of the DWEL instruments is in practice currently held back by challenges in instrument development and the demands of data processing required by coaxial dual- or multi-wavelength scanning. But the simultaneous 3D acquisition of both spectral and spatial features, offered by new multispectral scanning instruments such as the DWEL, opens doors to study biophysical and biochemical properties of forested and agricultural ecosystems at more detailed scales.

  3. Principle component analysis and linear discriminant analysis of multi-spectral autofluorescence imaging data for differentiating basal cell carcinoma and healthy skin

    NASA Astrophysics Data System (ADS)

    Chernomyrdin, Nikita V.; Zaytsev, Kirill I.; Lesnichaya, Anastasiya D.; Kudrin, Konstantin G.; Cherkasova, Olga P.; Kurlov, Vladimir N.; Shikunova, Irina A.; Perchik, Alexei V.; Yurchenko, Stanislav O.; Reshetov, Igor V.

    2016-09-01

    In the present paper, the ability to differentiate basal cell carcinoma (BCC) and healthy skin by combining multi-spectral autofluorescence imaging, principal component analysis (PCA), and linear discriminant analysis (LDA) has been demonstrated. For this purpose, the experimental setup, which includes excitation and detection branches, has been assembled. The excitation branch utilizes a mercury arc lamp equipped with a 365-nm narrow-linewidth excitation filter, a beam homogenizer, and a mechanical chopper. The detection branch employs a set of bandpass filters with the central wavelength of spectral transparency of λ = 400, 450, 500, and 550 nm, and a digital camera. The setup has been used to study three samples of freshly excised BCC. PCA and LDA have been implemented to analyze the multi-spectral fluorescence imaging data. Observed results of this pilot study highlight the advantages of the proposed imaging technique for skin cancer diagnosis.
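
    A minimal sketch of the PCA-then-LDA classification chain applied to per-pixel intensities from a few spectral bands. The simulated band intensities below are placeholders standing in for the 400-550 nm autofluorescence channels, not the study's data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)
      # Hypothetical per-pixel intensities at four detection bands for two classes
      healthy = rng.normal(loc=[1.0, 0.8, 0.6, 0.5], scale=0.1, size=(200, 4))
      lesion = rng.normal(loc=[0.8, 0.7, 0.7, 0.6], scale=0.1, size=(200, 4))
      X = np.vstack([healthy, lesion])
      y = np.array([0] * 200 + [1] * 200)

      clf = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
      clf.fit(X, y)
      print(clf.score(X, y))  # resubstitution accuracy on the toy data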

  4. Explaining the Substantial Inter-Domain and Over-Time Correlations in Student Achievement: The Importance of Stable Student Attributes

    ERIC Educational Resources Information Center

    Marks, Gary N.

    2016-01-01

    Multi-domain and longitudinal studies of student achievement routinely find moderate to strong correlations across achievement domains and even stronger within-domain correlations over time. The purpose of this study is to examine the sources of these patterns analysing student achievement in 5 domains across Years 3, 5 and 7. The analysis is of…

  5. Responding to Vaccine Safety Signals during Pandemic Influenza: A Modeling Study

    PubMed Central

    Maro, Judith C.; Fryback, Dennis G.; Lieu, Tracy A.; Lee, Grace M.; Martin, David B.

    2014-01-01

    Background Managing emerging vaccine safety signals during an influenza pandemic is challenging. Federal regulators must balance vaccine risks against benefits while maintaining public confidence in the public health system. Methods We developed a multi-criteria decision analysis model to explore regulatory decision-making in the context of emerging vaccine safety signals during a pandemic. We simulated vaccine safety surveillance system capabilities and used an age-structured compartmental model to develop potential pandemic scenarios. We used an expert-derived multi-attribute utility function to evaluate potential regulatory responses by combining four outcome measures into a single measure of interest: 1) expected vaccination benefit from averted influenza; 2) expected vaccination risk from vaccine-associated febrile seizures; 3) expected vaccination risk from vaccine-associated Guillain-Barre Syndrome; and 4) expected change in vaccine-seeking behavior in future influenza seasons. Results Over multiple scenarios, risk communication, with or without suspension of vaccination of high-risk persons, were the consistently preferred regulatory responses over no action or general suspension when safety signals were detected during a pandemic influenza. On average, the expert panel valued near-term vaccine-related outcomes relative to long-term projected outcomes by 3∶1. However, when decision-makers had minimal ability to influence near-term outcomes, the response was selected primarily by projected impacts on future vaccine-seeking behavior. Conclusions The selected regulatory response depends on how quickly a vaccine safety signal is identified relative to the peak of the pandemic and the initiation of vaccination. Our analysis suggested two areas for future investment: efforts to improve the size and timeliness of the surveillance system and behavioral research to understand changes in vaccine-seeking behavior. PMID:25536228
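
    The core of the evaluation step is a weighted multi-attribute utility: each candidate regulatory response is scored on the four outcome measures and the weighted sum is compared across responses. The sketch below illustrates that arithmetic with hypothetical weights, scores, and option names, not the expert-derived values used in the study.

      # Hypothetical attribute weights (must sum to 1) and 0-1 scores per option
      weights = {"averted_influenza": 0.4, "febrile_seizures": 0.2,
                 "gbs_risk": 0.2, "future_vaccine_seeking": 0.2}

      options = {
          "no_action":          {"averted_influenza": 0.90, "febrile_seizures": 0.3, "gbs_risk": 0.3, "future_vaccine_seeking": 0.4},
          "risk_communication": {"averted_influenza": 0.85, "febrile_seizures": 0.5, "gbs_risk": 0.5, "future_vaccine_seeking": 0.7},
          "general_suspension": {"averted_influenza": 0.20, "febrile_seizures": 0.9, "gbs_risk": 0.9, "future_vaccine_seeking": 0.5},
      }

      def utility(scores):
          return sum(weights[a] * scores[a] for a in weights)

      best = max(options, key=lambda o: utility(options[o]))
      print({o: round(utility(s), 3) for o, s in options.items()}, "->", best)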

  6. Valuing SF-6D Health States Using a Discrete Choice Experiment.

    PubMed

    Norman, Richard; Viney, Rosalie; Brazier, John; Burgess, Leonie; Cronin, Paula; King, Madeleine; Ratcliffe, Julie; Street, Deborah

    2014-08-01

    SF-6D utility weights are conventionally produced using a standard gamble (SG). SG-derived weights consistently demonstrate a floor effect not observed with other elicitation techniques. Recent advances in discrete choice methods have allowed estimation of utility weights. The objective was to produce Australian utility weights for the SF-6D and to explore the application of discrete choice experiment (DCE) methods in this context. We hypothesized that weights derived using this method would reflect the largely monotonic construction of the SF-6D. We designed an online DCE and administered it to an Australia-representative online panel (n = 1017). A range of specifications investigating nonlinear preferences with respect to additional life expectancy were estimated using a random-effects probit model. The preferred model was then used to estimate a preference index such that full health and death were valued at 1 and 0, respectively, to provide an algorithm for Australian cost-utility analyses. Physical functioning, pain, mental health, and vitality were the largest drivers of utility weights. Combining levels to remove illogical orderings did not lead to a poorer model fit. Relative to international SG-derived weights, the range of utility weights was larger with 5% of health states valued below zero. DCEs can be used to investigate preferences for health profiles and to estimate utility weights for multi-attribute utility instruments. Australian cost-utility analyses can now use domestic SF-6D weights. The comparability of DCE results to those using other elicitation methods for estimating utility weights for quality-adjusted life-year calculations should be further investigated. © The Author(s) 2013.

  7. Prediction of SOFC Performance with or without Experiments: A Study on Minimum Requirements for Experimental Data

    DOE PAGES

    Yang, Tao; Sezer, Hayri; Celik, Ismail B.; ...

    2015-06-02

    In the present paper, a physics-based procedure combining experiments and multi-physics numerical simulations is developed for overall analysis of SOFCs operational diagnostics and performance predictions. In this procedure, essential information for the fuel cell is extracted first by utilizing empirical polarization analysis in conjunction with experiments and refined by multi-physics numerical simulations via simultaneous analysis and calibration of polarization curve and impedance behavior. The performance at different utilization cases and operating currents is also predicted to confirm the accuracy of the proposed model. It is demonstrated that, with the present electrochemical model, three air/fuel flow conditions are needed to produce a set of complete data for better understanding of the processes occurring within SOFCs. After calibration against button cell experiments, the methodology can be used to assess performance of planar cell without further calibration. The proposed methodology would accelerate the calibration process and improve the efficiency of design and diagnostics.

  8. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  9. Online Tools for Uncovering Data Quality (DQ) Issues in Satellite-Based Global Precipitation Products

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Heo, Gil

    2015-01-01

    Data quality (DQ) has many attributes or facets (e.g., errors, biases, systematic differences, uncertainties, benchmarks, false trends, false alarm ratio). The sources of DQ issues can be complicated (measurements, environmental conditions, surface types, algorithms, etc.) and difficult to identify, especially for multi-sensor and multi-satellite products with bias correction (TMPA, IMERG, etc.). Open questions include how to obtain DQ information quickly and easily, especially quantified information within a region of interest, beyond existing parameters (random error), the literature, or do-it-yourself analysis, and how to apply that knowledge in research and applications. Here, we focus on online systems for integration of products and parameters, visualization and analysis, as well as investigation and extraction of DQ information.

  10. Observability and Estimation of Distributed Space Systems via Local Information-Exchange Networks

    NASA Technical Reports Server (NTRS)

    Rahmani, Amirreza; Mesbahi, Mehran; Fathpour, Nanaz; Hadaegh, Fred Y.

    2008-01-01

    In this work, we develop an approach to formation estimation by explicitly characterizing the formation's system-theoretic attributes in terms of the underlying inter-spacecraft information-exchange network. In particular, we approach the formation observer/estimator design by relaxing the accessibility to the global state information by a centralized observer/estimator and, in turn, providing an analysis and synthesis framework for formation observers/estimators that rely on local measurements. The novelty of our approach hinges upon the explicit examination of the underlying distributed spacecraft network in the realm of guidance, navigation, and control algorithmic analysis and design. The overarching goal of our general research program, some of whose results are reported in this paper, is the development of distributed spacecraft estimation algorithms that are scalable, modular, and robust to variations in the topology and link characteristics of the formation information exchange network. In this work, we consider the observability of a spacecraft formation from a single observation node and utilize the agreement protocol as a mechanism for observing formation states from local measurements. Specifically, we show how the symmetry structure of the network, characterized in terms of its automorphism group, directly relates to the observability of the corresponding multi-agent system. The ramification of this notion of observability over networks is then explored in the context of distributed formation estimation.

  11. Assessment of active methods for removal of LEO debris

    NASA Astrophysics Data System (ADS)

    Hakima, Houman; Emami, M. Reza

    2018-03-01

    This paper investigates the applicability of five active methods for removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission level criteria are then utilized to assess the performance of each redirection method in light of the results obtained from a Monte Carlo simulation. The simulation provides an insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on the models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: the Analytic Hierarchy Process and a utility-based approach. A third assessment technique, namely the potential-loss analysis, is utilized to highlight the effect of risks in each removal method.

  12. Bidding Behavior in a Multi-attribute First-price Auction

    DTIC Science & Technology

    2010-01-01

    of applying key features of the multi-unit auction to proxy buyer/seller marginal valuations of the attributes of a job. Two experiments were...compensation package show promise in ascertaining buyer/seller marginal valuations of a job. This research effort was supported by a grant from the...auctions observed in the goods market, as measured by maximizing consumer and producer surplus, are likely to have promising applications to labor markets

  13. An Updated Version of the U.S. Air Force Multi-Attribute Task Battery (AF-MATB)

    DTIC Science & Technology

    2014-08-01

    assessing human performance in a controlled multitask environment. The most recent release of AF-MATB contains numerous improvements and additions...Strategic Behavior, MATB, Multitasking, Task Battery, Simulator, Multi-Attribute Task Battery, Automation...performance and multitasking strategy. As a result, a specific Information Throughput (IT) Mode was designed to customize the task to fit the Human

  14. A study of usability principles and interface design for mobile e-books.

    PubMed

    Wang, Chao-Ming; Huang, Ching-Hua

    2015-01-01

    This study examined usability principles and interface designs in order to understand the relationship between the intentions of mobile e-book interface designs and users' perceptions. First, this study summarised 4 usability principles and 16 interface attributes for usability testing and a questionnaire survey, drawing on the usability principles proposed by Nielsen (1993), Norman (2002), and Yeh (2010). Second, this study used interviews with senior users of multi-touch prototype devices to explore their perceptions and operating behaviours. The results of this study are as follows: (1) users' behaviour of operating an interactive interface is related to user prior experience; (2) users' rating of the visibility principle is related to users' subjective perception but not related to user prior experience; however, users' ratings of the ease, efficiency, and enjoyment principles are related to user prior experience; (3) the interview survey reveals that the key attributes affecting users' behaviour of operating an interface include aesthetics, achievement, and friendliness. This study conducts experiments to explore the effects of users' prior multi-touch experience on users' behaviour of operating a mobile e-book interface and users' rating of usability principles. Both qualitative and quantitative data analyses were performed. By applying protocol analysis, key attributes affecting users' behaviour of operation were determined.

  15. A New Heterogeneous Multidimensional Unfolding Procedure

    ERIC Educational Resources Information Center

    Park, Joonwook; Rajagopal, Priyali; DeSarbo, Wayne S.

    2012-01-01

    A variety of joint space multidimensional scaling (MDS) methods have been utilized for the spatial analysis of two- or three-way dominance data involving subjects' preferences, choices, considerations, intentions, etc. so as to provide a parsimonious spatial depiction of the underlying relevant dimensions, attributes, stimuli, and/or subjects'…

  16. Utilization of physicochemical variables developed from changes in sensory attributes and consumer acceptability to predict the shelf life of fresh-cut mango fruit.

    PubMed

    Salinas-Hernández, Rosa María; González-Aguilar, Gustavo A; Tiznado-Hernández, Martín Ernesto

    2015-01-01

    Sensory evaluation is the ideal tool for shelf-life determination. With the objective of developing an easy shelf-life indicator, color (L*, a*, b*, chroma and hue angle), total soluble solids (TSS), firmness (F), pH, acidity, and the sensory attributes of appearance, brightness, browning, odor, flavor, texture, color, acidity and sweetness were evaluated in fresh cut mangoes (FCM) stored at 5, 10, 15 and 20 °C. Overall acceptability was evaluated by consumers. Correlation analysis between sensory attributes and physicochemical variables was carried out. Physicochemical cut-off points based on sensory attributes and consumer acceptability were obtained by regression analysis and utilized to estimate FCM shelf-life by kinetic models fitted to each variable. The validation of the model was done by comparing the shelf life estimated by kinetic models and consumers. Large correlations were recorded for appearance, brightness, and color with L*; for appearance and color with chroma and hue angle; for sweetness and flavor with TSS; and for F with texture. The shelf life estimated from consumer acceptability using a 9-point hedonic scale was in the range of 10-12, 2.3-2.6, 1.3-1.5 and 1.0-1.1 days at 5, 10, 15 and 20 °C, respectively. Large correlation coefficients were recorded between the shelf life estimated by consumer acceptability scores and physicochemical variables. Kinetic models based on physicochemical variables showed a tendency to overestimate the shelf life as compared with the models based on the sensory attributes. It was concluded that physicochemical variables can be used as a tool to estimate the FCM shelf life.

  17. Multi-ingredients determination and fingerprint analysis of leaves from Ilex latifolia using ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai

    2013-10-01

    An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredients determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method possesses the advantages of speed, efficiency, and accuracy, and allows the multi-ingredients determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredients determination was performed based on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed based on the base peak chromatograms, obtained by negative-ion electrospray ionization QTOF-MS. The method validation results demonstrated that the developed method possesses desirable specificity, linearity, precision and accuracy. The method was utilized to analyze 22 I. latifolia samples from different origins. The quality assessment was achieved by using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy integrating multi-ingredients determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for the differentiation of origin, the determination of authenticity, and the overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Method Development for Clinical Comprehensive Evaluation of Pediatric Drugs Based on Multi-Criteria Decision Analysis: Application to Inhaled Corticosteroids for Children with Asthma.

    PubMed

    Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling

    2018-04-01

    Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weight based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remains unchanged, suggesting this model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed. The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
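
    Step (3) of the model, deriving criterion weights with the AHP, is conventionally done by taking the principal eigenvector of a pairwise comparison matrix; steps (5)-(7) then reduce to a weighted sum of standardized utility scores. The sketch below illustrates both computations with a hypothetical three-criterion matrix and scores, not the CCES-P weights or data.

      import numpy as np

      # Pairwise comparisons for three hypothetical criteria (e.g. efficacy, safety, cost)
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      weights = principal / principal.sum()      # normalized criterion weights

      # Standardized utility scores (0-100) for one drug on each criterion
      scores = np.array([82.0, 75.0, 60.0])
      print(weights.round(3), round(float(weights @ scores), 2))  # comprehensive utility score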

  19. Development of an index to define overall disease severity in IBD.

    PubMed

    Siegel, Corey A; Whitman, Cynthia B; Spiegel, Brennan M R; Feagan, Brian; Sands, Bruce; Loftus, Edward V; Panaccione, Remo; D'Haens, Geert; Bernstein, Charles N; Gearry, Richard; Ng, Siew C; Mantzaris, Gerassimos J; Sartor, Balfour; Silverberg, Mark S; Riddell, Robert; Koutroubakis, Ioannis E; O'Morain, Colm; Lakatos, Peter L; McGovern, Dermot P B; Halfvarson, Jonas; Reinisch, Walter; Rogler, Gerhard; Kruis, Wolfgang; Tysk, Curt; Schreiber, Stefan; Danese, Silvio; Sandborn, William; Griffiths, Anne; Moum, Bjorn; Gasche, Christoph; Pallone, Francesco; Travis, Simon; Panes, Julian; Colombel, Jean-Frederic; Hanauer, Stephen; Peyrin-Biroulet, Laurent

    2018-02-01

    Disease activity for Crohn's disease (CD) and UC is typically defined based on symptoms at a moment in time, and ignores the long-term burden of disease. The aims of this study were to select the attributes determining overall disease severity, to rank the importance of and to score these individual attributes for both CD and UC. Using a modified Delphi panel, 14 members of the International Organization for the Study of Inflammatory Bowel Diseases (IOIBD) selected the most important attributes related to IBD. Eighteen IOIBD members then completed a statistical exercise (conjoint analysis) to create a relative ranking of these attributes. Adjusted utilities were developed by creating proportions for each level within an attribute. For CD, 15.8% of overall disease severity was attributed to the presence of mucosal lesions, 10.9% to history of a fistula, 9.7% to history of abscess and 7.4% to history of intestinal resection. For UC, 18.1% of overall disease severity was attributed to mucosal lesions, followed by 14.0% for impact on daily activities, 11.2% C reactive protein and 10.1% for prior experience with biologics. Overall disease severity indices were created on a 100-point scale by applying each attribute's average importance to the adjusted utilities. Based on specialist opinion, overall CD severity was associated more with intestinal damage, in contrast to overall UC disease severity, which was more dependent on symptoms and impact on daily life. Once validated, disease severity indices may provide a useful tool for consistent assessment of overall disease severity in patients with IBD. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
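
    Mechanically, such an index multiplies each attribute's relative importance by the adjusted utility of the patient's level on that attribute and rescales to 100 points. The sketch below illustrates the arithmetic; the four named CD importances are taken from the abstract, while the residual weight and the patient's level utilities are hypothetical.

      # Relative importances (sum to 1); "other_attributes" pools the remaining weight
      importances = {"mucosal_lesions": 0.158, "fistula_history": 0.109,
                     "abscess_history": 0.097, "resection_history": 0.074,
                     "other_attributes": 0.562}

      # Adjusted utility of this patient's level on each attribute (0 = best level, 1 = worst)
      patient_levels = {"mucosal_lesions": 1.0, "fistula_history": 0.0,
                        "abscess_history": 0.0, "resection_history": 1.0,
                        "other_attributes": 0.35}

      severity = 100 * sum(importances[a] * patient_levels[a] for a in importances)
      print(round(severity, 1))  # overall severity on a 0-100 scale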

  20. Use of Multi-Intake Temporal Dominance of Sensations (TDS) to Evaluate the Influence of Cheese on Wine Perception.

    PubMed

    Galmarini, Mara V; Loiseau, Anne-Laure; Visalli, Michel; Schlich, Pascal

    2016-10-01

    Though the gastronomic sector recommends certain wine-cheese associations, there is little sensory evidence on how cheese influences the perception of wine. It was the aim of this study to dynamically characterize 4 wines as they would be perceived when consumed with and without cheese. The tasting protocol was based on multi-intake temporal dominance of sensations (TDS) coupled with hedonic rating. In the 1st session, 31 French wine and cheese consumers evaluated the wines (Pacherenc, Sancerre, Bourgogne, and Madiran) over 3 consecutive sips. In the following sessions, they performed the same task, but eating small portions of cheese (Epoisses, Comté, Roquefort, Crottin de Chavignol) between sips. All cheeses were tasted with all wines over 4 sessions. TDS data were mainly analyzed in terms of each attribute's duration of dominance by analysis of variance, multivariate analysis of variance, and canonical variate analysis. Results showed that cheese consumption had an impact (P < 0.1) on dominance duration of attributes and on preference for most wines. For example, in Madiran, all cheeses reduced dominance duration (P < 0.01) of astringency and sourness and increased duration of red fruit aroma. Although the number of consumers was too small to draw extended general conclusions on wine preference, significant changes were observed before and after cheese intake. © 2016 Institute of Food Technologists®.
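
    A minimal sketch of the core TDS summary used above, the dominance duration of each attribute, computed from time-stamped dominance events for one hypothetical tasting; the timestamps and attribute names are illustrative only.

      import pandas as pd

      events = pd.DataFrame({
          "panelist": [1, 1, 1, 1],
          "attribute": ["astringent", "sour", "red_fruit", "stop"],
          "onset_s": [0.0, 8.5, 14.0, 30.0],   # time each attribute became dominant; "stop" ends the record
      })
      events["duration_s"] = events["onset_s"].shift(-1) - events["onset_s"]
      durations = events.dropna().groupby("attribute")["duration_s"].sum()
      print(durations)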

  1. A multi-tissue type genome-scale metabolic network for analysis of whole-body systems physiology

    PubMed Central

    2011-01-01

    Background Genome-scale metabolic reconstructions provide a biologically meaningful mechanistic basis for the genotype-phenotype relationship. The global human metabolic network, termed Recon 1, has recently been reconstructed, allowing the systems analysis of human metabolic physiology and pathology. Utilizing high-throughput data, Recon 1 has recently been tailored to different cells and tissues, including the liver, kidney, brain, and alveolar macrophage. These models have shown utility in the study of systems medicine. However, no integrated analysis between human tissues has been done. Results To describe tissue-specific functions, Recon 1 was tailored to describe metabolism in three human cells: adipocytes, hepatocytes, and myocytes. These cell-specific networks were manually curated and validated based on known cellular metabolic functions. To study intercellular interactions, a novel multi-tissue type modeling approach was developed to integrate the metabolic functions for the three cell types, and subsequently used to simulate known integrated metabolic cycles. In addition, the multi-tissue model was used to study diabetes: a pathology with systemic properties. High-throughput data was integrated with the network to determine differential metabolic activity between obese and type II diabetic obese gastric bypass patients in a whole-body context. Conclusion The multi-tissue type modeling approach presented provides a platform to study integrated metabolic states. As more cell and tissue-specific models are released, it is critical to develop a framework in which to study their interdependencies. PMID:22041191
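
    Analyses built on reconstructions such as Recon 1 typically rely on flux balance analysis: maximize an objective flux subject to steady-state mass balance and flux bounds. The toy three-reaction network below is a hypothetical illustration of that linear program, not part of the multi-tissue model itself.

      import numpy as np
      from scipy.optimize import linprog

      # Metabolites: A, B.  Reactions: uptake (-> A), conversion (A -> B), export (B ->)
      S = np.array([[ 1, -1,  0],   # mass balance for A
                    [ 0,  1, -1]])  # mass balance for B
      bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake limited to 10 flux units
      c = [0, 0, -1]                              # maximize export flux (minimize its negative)

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print(res.x)  # optimal flux distribution, e.g. [10, 10, 10]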

  2. Multi-Interval Discretization of Continuous-Valued Attributes for Classification Learning

    NASA Technical Reports Server (NTRS)

    Fayyad, U.; Irani, K.

    1993-01-01

    Since most real-world applications of classification learning involve continuous-valued attributes, properly addressing the discretization process is an important problem. This paper addresses the use of the entropy minimization heuristic for discretizing the range of a continuous-valued attribute into multiple intervals.
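
    A minimal sketch of the entropy-minimization heuristic: sort the attribute values, evaluate candidate cut points, and keep the boundary that minimizes the class entropy of the induced partition (the full method recurses and adds a minimum-description-length stopping rule, omitted here). The toy data are hypothetical.

      import numpy as np
      from collections import Counter

      def entropy(labels):
          counts = np.array(list(Counter(labels).values()), dtype=float)
          p = counts / counts.sum()
          return float(-(p * np.log2(p)).sum())

      def best_cut(values, labels):
          order = np.argsort(values)
          v, y = np.asarray(values)[order], np.asarray(labels)[order]
          best = (None, np.inf)
          for i in range(1, len(v)):
              if v[i] == v[i - 1]:
                  continue
              cut = (v[i] + v[i - 1]) / 2
              left, right = y[:i], y[i:]
              weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
              if weighted < best[1]:
                  best = (cut, weighted)
          return best

      values = [1.0, 1.2, 1.4, 2.9, 3.1, 3.3]
      labels = ["a", "a", "a", "b", "b", "b"]
      print(best_cut(values, labels))  # cut near 2.15 with weighted entropy 0.0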

  3. Principal Component Analysis for Enhancement of Infrared Spectra Monitoring

    NASA Astrophysics Data System (ADS)

    Haney, Ricky Lance

    The issue of air quality within the aircraft cabin is receiving increasing attention from both pilot and flight attendant unions. This is due to exposure events caused by poor air quality that in some cases may have contained toxic oil components due to bleed air that flows from outside the aircraft and then through the engines into the aircraft cabin. Significant short and long-term medical issues for aircraft crew have been attributed to exposure. The need for air quality monitoring is especially evident in the fact that currently within an aircraft there are no sensors to monitor the air quality and potentially harmful gas levels (detect-to-warn sensors), much less systems to monitor and purify the air (detect-to-treat sensors) within the aircraft cabin. The specific purpose of this research is to utilize a mathematical technique called principal component analysis (PCA) in conjunction with principal component regression (PCR) and proportionality constant calculations (PCC) to simplify complex, multi-component infrared (IR) spectra data sets into a reduced data set used for determination of the concentrations of the individual components. Use of PCA can significantly simplify data analysis as well as improve the ability to determine concentrations of individual target species in gas mixtures where significant band overlap occurs in the IR spectrum region. Application of this analytical numerical technique to IR spectrum analysis is important in improving performance of commercial sensors that airlines and aircraft manufacturers could potentially use in an aircraft cabin environment for multi-gas component monitoring. The approach of this research is two-fold, consisting of a PCA application to compare simulation and experimental results with the corresponding PCR and PCC to determine quantitatively the component concentrations within a mixture. The experimental data sets consist of both two and three component systems that could potentially be present as air contaminants in an aircraft cabin. In addition, experimental data sets are analyzed for a hydrogen peroxide (H2O2) aqueous solution mixture to determine H2O2 concentrations at various levels that could be produced during use of a vapor phase hydrogen peroxide (VPHP) decontamination system. After the PCA application to two and three component systems, the analysis technique is further expanded to include the monitoring of potential bleed air contaminants from engine oil combustion. Simulation data sets created from database spectra were utilized to predict gas components and concentrations in unknown engine oil samples at high temperatures as well as time-evolved gases from the heating of engine oils.
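
    A minimal sketch of the PCA/PCR idea applied to overlapping spectra: compress calibration spectra to a few principal components, regress known concentrations on the scores, then predict concentrations for an unknown mixture. The simulated two-component bands below are stand-ins, not the dissertation's experimental data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(2)
      wavenumbers = np.linspace(0, 1, 200)
      band1 = np.exp(-((wavenumbers - 0.4) / 0.05) ** 2)   # pure-component spectrum 1
      band2 = np.exp(-((wavenumbers - 0.5) / 0.05) ** 2)   # overlapping pure-component spectrum 2

      conc = rng.uniform(0, 1, size=(50, 2))               # known calibration concentrations
      spectra = conc @ np.vstack([band1, band2]) + rng.normal(scale=0.01, size=(50, 200))

      pcr = make_pipeline(PCA(n_components=2), LinearRegression())
      pcr.fit(spectra, conc)

      unknown = 0.3 * band1 + 0.7 * band2
      print(pcr.predict(unknown.reshape(1, -1)).round(2))  # approximately [0.30, 0.70]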

  4. Prediction accuracies for growth and wood attributes of interior spruce in space using genotyping-by-sequencing.

    PubMed

    Gamal El-Dien, Omnia; Ratcliffe, Blaise; Klápště, Jaroslav; Chen, Charles; Porth, Ilga; El-Kassaby, Yousry A

    2015-05-09

    Genomic selection (GS) in forestry can substantially reduce the length of the breeding cycle and increase gain per unit time through early selection and greater selection intensity, particularly for traits of low heritability and late expression. Affordable next-generation sequencing technologies made it possible to genotype large numbers of trees at a reasonable cost. Genotyping-by-sequencing was used to genotype 1,126 Interior spruce trees representing 25 open-pollinated families planted over three sites in British Columbia, Canada. Four imputation algorithms were compared (mean value (MI), singular value decomposition (SVD), expectation maximization (EM), and a newly derived, family-based k-nearest neighbor (kNN-Fam)). Trees were phenotyped for several yield and wood attributes. Single- and multi-site GS prediction models were developed using the Ridge Regression Best Linear Unbiased Predictor (RR-BLUP) and the Generalized Ridge Regression (GRR) to test different assumptions about trait architecture. Finally, using PCA, multi-trait GS prediction models were developed. The EM and kNN-Fam imputation methods were superior for 30 and 60% missing data, respectively. The RR-BLUP GS prediction model produced better accuracies than the GRR, indicating that the genetic architecture for these traits is complex. GS prediction accuracies for multi-site models were high and better than those of single sites, while cross-site predictions produced the lowest accuracies, reflecting type-b genetic correlations, and were deemed unreliable. The incorporation of genomic information in quantitative genetics analyses produced more realistic heritability estimates as half-sib pedigree tended to inflate the additive genetic variance and subsequently both heritability and gain estimates. Principal component scores as representatives of multi-trait GS prediction models produced surprising results where negatively correlated traits could be concurrently selected for using PCA2 and PCA3. The application of GS to open-pollinated family testing, the simplest form of tree improvement evaluation methods, was proven to be effective. Prediction accuracies obtained for all traits greatly support the integration of GS in tree breeding. While the within-site GS prediction accuracies were high, the results clearly indicate that single-site GS models' ability to predict other sites is unreliable, supporting the utilization of a multi-site approach. Principal component scores provided an opportunity for the concurrent selection of traits with different phenotypic optima.
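
    A minimal sketch of ridge-regression genomic prediction in the spirit of RR-BLUP: shrink thousands of marker effects jointly and assess predictive ability by cross-validation. The simulated genotypes, effect sizes, and ridge penalty below are assumptions for illustration, not the study's data or tuning.

      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      n_trees, n_markers = 300, 2000
      X = rng.binomial(2, 0.3, size=(n_trees, n_markers)).astype(float)  # SNP genotypes coded 0/1/2
      true_effects = rng.normal(scale=0.05, size=n_markers)
      y = X @ true_effects + rng.normal(scale=1.0, size=n_trees)          # simulated phenotype, e.g. height

      model = Ridge(alpha=100.0)
      acc = cross_val_score(model, X, y, cv=5, scoring="r2")
      print(acc.mean().round(2))  # cross-validated predictive ability on the simulated data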

  5. Deriving health utilities from the MacNew Heart Disease Quality of Life Questionnaire.

    PubMed

    Chen, Gang; McKie, John; Khan, Munir A; Richardson, Jeff R

    2015-10-01

    Quality of life is included in the economic evaluation of health services by measuring the preference for health states, i.e. health state utilities. However, most intervention studies include a disease-specific, not a utility, instrument. Consequently, there has been increasing use of statistical mapping algorithms which permit utilities to be estimated from a disease-specific instrument. The present paper provides such algorithms between the MacNew Heart Disease Quality of Life Questionnaire (MacNew) instrument and six multi-attribute utility (MAU) instruments, the Euroqol (EQ-5D), the Short Form 6D (SF-6D), the Health Utilities Index (HUI) 3, the Quality of Wellbeing (QWB), the 15D (15 Dimension) and the Assessment of Quality of Life (AQoL-8D). Heart disease patients and members of the healthy public were recruited from six countries. Non-parametric rank tests were used to compare subgroup utilities and MacNew scores. Mapping algorithms were estimated using three separate statistical techniques. Mapping algorithms achieved a high degree of precision. Based on the mean absolute error and the intra class correlation the preferred mapping is MacNew into SF-6D or 15D. Using the R squared statistic the preferred mapping is MacNew into AQoL-8D. The algorithms reported in this paper enable MacNew data to be mapped into utilities predicted from any of six instruments. This permits studies which have included the MacNew to be used in cost utility analyses which, in turn, allows the comparison of services with interventions across the health system. © The European Society of Cardiology 2014.
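
    A minimal sketch of what a mapping (cross-walk) algorithm does operationally: fit a regression of utilities on disease-specific scores in a dataset containing both instruments, then apply it to patients who completed only the disease-specific measure. The simulated scores and coefficients below are hypothetical, not the published MacNew algorithms.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(4)
      # Hypothetical MacNew domain scores (e.g. physical, emotional, social) on a 1-7 scale
      macnew = rng.uniform(1, 7, size=(400, 3))
      utility = 0.2 + 0.05 * macnew.mean(axis=1) + rng.normal(scale=0.05, size=400)
      utility = np.clip(utility, -0.1, 1.0)        # utilities roughly on the dead (0) to full health (1) scale

      mapper = LinearRegression().fit(macnew, utility)
      new_patient = np.array([[5.2, 4.8, 6.0]])
      print(mapper.predict(new_patient).round(3))  # predicted utility for a patient with only MacNew data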

  6. A model for plant lighting system selection.

    PubMed

    Ciolkosz, D E; Albright, L D; Sager, J C; Langhans, R W

    2002-01-01

    A decision model is presented that compares lighting systems for a plant growth scenario and chooses the most appropriate system from a given set of possible choices. The model utilizes a Multiple Attribute Utility Theory approach, and incorporates expert input and performance simulations to calculate a utility value for each lighting system being considered. The system with the highest utility is deemed the most appropriate system. The model was applied to a greenhouse scenario, and analyses were conducted to test the model's output for validity. Parameter variation indicates that the model performed as expected. Analysis of model output indicates that differences in utility among the candidate lighting systems were sufficiently large to give confidence that the model's order of selection was valid.

  7. Comparison of two multi-criteria decision techniques for eliciting treatment preferences in people with neurological disorders.

    PubMed

    Ijzerman, Maarten J; van Til, Janine A; Snoek, Govert J

    2008-12-01

    To present and compare two multi-criteria decision techniques (analytic hierarchy process [AHP] and conjoint analysis [CA]) for eliciting preferences in patients with cervical spinal cord injury (SCI) who are eligible for surgical augmentation of hand function, either with or without implantation of a neuroprosthesis. The methods were compared in respect to attribute weights, overall preference, and practical experiences. Two previously designed and administered multi-criteria decision surveys in patients with SCI were compared and further analysed. Attributes and their weights in the AHP experiment were determined by an expert panel, followed by determination of the weights in the patient group. Attributes for the CA were selected and validated using an expert panel, piloted in six patients with SCI and subsequently administered to the same group of patients as participated in the AHP experiment. Both experiments showed the importance of non-outcome-related factors such as inpatient stay and number of surgical procedures. In particular, patients were less concerned with clinical outcomes in actual decision making. Overall preference in both the AHP and CA was in favor of tendon reconstruction (0.6 vs 0.4 for neuroprosthetic implantation). Both methods were easy to apply, but AHP was less easily explained and understood. Both the AHP and CA methods produced similar outcomes, which may have been caused by the obvious preferences of patients. CA may be preferred because of the holistic approach of considering all treatment attributes simultaneously and, hence, its power in simulating real market decisions. On the other hand, the AHP method is preferred as a hands-on, easy-to-implement task with immediate feedback to the respondent. This flexibility allows AHP to be used in shared decision making. However, the way the technique is composed results in many inconsistencies. Patients preferred CA but complained about the number of choice tasks.

  8. Analysis of Rapid Multi-Focal Zone ARFI Imaging

    PubMed Central

    Rosenzweig, Stephen; Palmeri, Mark; Nightingale, Kathryn

    2015-01-01

    Acoustic radiation force impulse (ARFI) imaging has shown promise for visualizing structure and pathology within multiple organs; however, because the contrast depends on the push beam excitation width, image quality suffers outside of the region of excitation. Multi-focal zone ARFI imaging has previously been used to extend the region of excitation (ROE), but the increased acquisition duration and acoustic exposure have limited its utility. Supersonic shear wave imaging has previously demonstrated that through technological improvements in ultrasound scanners and power supplies, it is possible to rapidly push at multiple locations prior to tracking displacements, facilitating extended depth of field shear wave sources. Similarly, ARFI imaging can utilize these same radiation force excitations to achieve tight pushing beams with a large depth of field. Finite element method simulations and experimental data are presented demonstrating that single- and rapid multi-focal zone ARFI have comparable image quality (less than 20% loss in contrast), but the multi-focal zone approach has an extended axial region of excitation. Additionally, as compared to single push sequences, the rapid multi-focal zone acquisitions improve the contrast to noise ratio by up to 40% in an example 4 mm diameter lesion. PMID:25643078

  9. The New Southern FIA Data Compilation System

    Treesearch

    V. Clark Baldwin; Larry Royer

    2001-01-01

    In general, the major national Forest Inventory and Analysis annual inventory emphasis has been on data-base design and not on data processing and calculation of various new attributes. Two key programming techniques required for efficient data processing are indexing and modularization. The Southern Research Station Compilation System utilizes modular and indexing...

  10. Assessment-Based Antecedent Interventions Used in Natural Settings To Reduce Challenging Behavior: An Analysis of the Literature.

    ERIC Educational Resources Information Center

    Kern, Lee; Choutka, Claire Maher; Sokol, Natalie G.

    2002-01-01

    This article reviews research describing assessment-based antecedent interventions implemented in natural settings. Descriptive information is provided along a number of dimensions pertaining to participant characteristics (n=42), assessment utilized, and intervention attributes. Results indicate the most common interventions targeted aggression,…

  11. Land usage attributed to corn ethanol production in the United States: sensitivity to technological advances in corn grain yield, ethanol conversion, and co-product utilization.

    PubMed

    Mumm, Rita H; Goldsmith, Peter D; Rausch, Kent D; Stein, Hans H

    2014-01-01

    Although the system for producing yellow corn grain is well established in the US, its role among other biofeedstock alternatives to petroleum-based energy sources has to be balanced with its predominant purpose for food and feed as well as economics, land use, and environmental stewardship. We model land usage attributed to corn ethanol production in the US to evaluate the effects of anticipated technological change in corn grain production, ethanol processing, and livestock feeding through a multi-disciplinary approach. Seven scenarios are evaluated: four considering the impact of technological advances on corn grain production, two focused on improved efficiencies in ethanol processing, and one reflecting greater use of ethanol co-products (that is, distillers dried grains with solubles) in diets for dairy cattle, pigs, and poultry. For each scenario, land area attributed to corn ethanol production is estimated for three time horizons: 2011 (current), the time period at which the 15 billion gallon cap for corn ethanol as per the Renewable Fuel Standard is achieved, and 2026 (15 years out). Although 40.5% of corn grain was channeled to ethanol processing in 2011, only 25% of US corn acreage was attributable to ethanol when accounting for feed co-product utilization. By 2026, land area attributed to corn ethanol production is reduced to 11% to 19% depending on the corn grain yield level associated with the four corn production scenarios, considering oil replacement associated with the soybean meal substituted in livestock diets with distillers dried grains with solubles. Efficiencies in ethanol processing, although producing more ethanol per bushel of processed corn, result in less co-products and therefore less offset of corn acreage. Shifting the use of distillers dried grains with solubles in feed to dairy cattle, pigs, and poultry substantially reduces land area attributed to corn ethanol production. However, because distillers dried grains with solubles substitutes at a higher rate for soybean meal, oil replacement requirements intensify and positively feedback to elevate estimates of land usage. Accounting for anticipated technological changes in the corn ethanol system is important for understanding the associated land base ascribed, and may aid in calibrating parameters for land use models in biofuel life-cycle analyses.

  12. Land usage attributed to corn ethanol production in the United States: sensitivity to technological advances in corn grain yield, ethanol conversion, and co-product utilization

    PubMed Central

    2014-01-01

    Background Although the system for producing yellow corn grain is well established in the US, its role among other biofeedstock alternatives to petroleum-based energy sources has to be balanced with its predominant purpose for food and feed as well as economics, land use, and environmental stewardship. We model land usage attributed to corn ethanol production in the US to evaluate the effects of anticipated technological change in corn grain production, ethanol processing, and livestock feeding through a multi-disciplinary approach. Seven scenarios are evaluated: four considering the impact of technological advances on corn grain production, two focused on improved efficiencies in ethanol processing, and one reflecting greater use of ethanol co-products (that is, distillers dried grains with solubles) in diets for dairy cattle, pigs, and poultry. For each scenario, land area attributed to corn ethanol production is estimated for three time horizons: 2011 (current), the time period at which the 15 billion gallon cap for corn ethanol as per the Renewable Fuel Standard is achieved, and 2026 (15 years out). Results Although 40.5% of corn grain was channeled to ethanol processing in 2011, only 25% of US corn acreage was attributable to ethanol when accounting for feed co-product utilization. By 2026, land area attributed to corn ethanol production is reduced to 11% to 19% depending on the corn grain yield level associated with the four corn production scenarios, considering oil replacement associated with the soybean meal substituted in livestock diets with distillers dried grains with solubles. Efficiencies in ethanol processing, although producing more ethanol per bushel of processed corn, result in less co-products and therefore less offset of corn acreage. Shifting the use of distillers dried grains with solubles in feed to dairy cattle, pigs, and poultry substantially reduces land area attributed to corn ethanol production. However, because distillers dried grains with solubles substitutes at a higher rate for soybean meal, oil replacement requirements intensify and positively feedback to elevate estimates of land usage. Conclusions Accounting for anticipated technological changes in the corn ethanol system is important for understanding the associated land base ascribed, and may aid in calibrating parameters for land use models in biofuel life-cycle analyses. PMID:24725504

  13. On Multifunctional Collaborative Methods in Engineering Science

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    2001-01-01

    Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines, in which each method's strengths are utilized.

  14. A Decision Analysis Perspective on Multiple Response Robust Optimization

    DTIC Science & Technology

    2012-03-01

    the utility function in question is monotonically increasing and is twice differentiable. If γ(y) = 0, the utility function describes risk-neutral ... twice differentiable, the risk aversion function with respect to a single attribute, y_i, i = 1, ..., n, is given in Equation 2.9, γ_{U}(y_i) = −U''(y_i)/U'(y_i) ... U_V(V(y_1, y_2)) and, following the chain rule of differentiation, Matheson and Abbas [31] show that the risk aversion with respect to a single
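
    The snippet above truncates the risk aversion expression; as a hedged reconstruction, the standard Arrow-Pratt form it appears to reference (labelled Equation 2.9 in the report) is:

```latex
% Arrow-Pratt risk aversion for a single attribute y_i (a reconstruction, not a quote
% from the report): gamma equals zero exactly when the utility function is risk neutral.
\gamma_{U}(y_i) \;=\; -\,\frac{U''(y_i)}{U'(y_i)}, \qquad i = 1, \dots, n .
```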

  15. Cynicism about organizational change: an attribution process perspective.

    PubMed

    Wanous, John P; Reichers, Arnon E; Austin, James T

    2004-06-01

    The underlying attribution process for cynicism about organizational change is examined with six samples from four different organizations. The samples include hourly (n=777) and salaried employees (n= 155) from a manufacturing plant, faculty (n=293) and staff (n=302) from a large university, managers from a utility company (n=97), and young managers (n=65) from various organizations who were attending an evening MBA program. This form of cynicism is defined as the combination of Pessimism (about future change efforts) and a Dispositional attribution (why past efforts to change failed). Three analyses support this definition. First, an exploratory factor analysis (from the largest sample) produced two factors, one composed of Pessimism and the Dispositional attribution items and the second of the Situational attribution items. Second, the average correlation (across several samples) between Pessimism and Dispositional attribution is much higher (.59) than the average correlation between Pessimism and Situational attribution (.17). Third, scores on two different trait-based measures of cynicism correlate highest with the Dispositional attribution component of cynicism. A practical implication is that organizational leaders may minimize cynicism by managing both employees' pessimism about organizational change and employees' attributions about it. Specific suggestions for how this might be done are offered.

  16. Dynamic systems and inferential information processing in human communication.

    PubMed

    Grammer, Karl; Fink, Bernhard; Renninger, LeeAnn

    2002-12-01

    Research in human communication on an ethological basis is almost obsolete. The reasons for this are manifold and lie partially in methodological problems connected to the observation and description of behavior, as well as the nature of human behavior itself. In this chapter, we present a new, non-intrusive, technical approach to the analysis of human non-verbal behavior, which could help to solve the problem of categorization that plagues the traditional approaches. We utilize evolutionary theory to propose a new theory-driven methodological approach to the 'multi-unit multi-channel modulation' problem of human nonverbal communication. Within this concept, communication is seen as context-dependent (the meaning of a signal is adapted to the situation), as a multi-channel and a multi-unit process (a string of many events interrelated in 'communicative' space and time), and as related to the function it serves. Such an approach can be utilized to successfully bridge the gap between evolutionary psychological research, which focuses on social cognition adaptations, and human ethology, which describes every day behavior in an objective, systematic way.

  17. District Heating Systems Performance Analyses. Heat Energy Tariff

    NASA Astrophysics Data System (ADS)

    Ziemele, Jelena; Vigants, Girts; Vitolins, Valdis; Blumberga, Dagnija; Veidenbergs, Ivars

    2014-12-01

    The paper addresses an important element of the European energy sector: the evaluation of district heating (DH) system operations from the standpoint of increasing energy efficiency and increasing the use of renewable energy resources. This has been done by developing a new methodology for the evaluation of the heat tariff. The paper presents an algorithm of this methodology, which includes not only a data base and calculation equation systems, but also an integrated multi-criteria analysis module using MADM/MCDM (Multi-Attribute Decision Making / Multi-Criteria Decision Making) based on TOPSIS (Technique for Order Performance by Similarity to Ideal Solution). The results of the multi-criteria analysis are used to set the tariff benchmarks. The evaluation methodology has been tested for Latvian heat tariffs, and the obtained results show that only half of heating companies reach a benchmark value equal to 0.5 for the efficiency closeness to the ideal solution indicator. This means that the proposed evaluation methodology would not only allow companies to determine how they perform with regard to the proposed benchmark, but also to identify their need to restructure so that they may reach the level of a low-carbon business.
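
    For readers unfamiliar with the "closeness to the ideal solution" indicator referenced above, a minimal TOPSIS sketch follows; the criteria, weights, and company scores are made up and do not reproduce the paper's tariff data.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Minimal TOPSIS sketch: returns relative closeness to the ideal solution
    (0..1, higher is better). benefit[j] is True if criterion j is to be maximized."""
    X = np.asarray(decision_matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the criterion weights
    R = X / np.linalg.norm(X, axis=0)
    V = R * np.asarray(weights, dtype=float)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)    # distance to ideal solution
    d_minus = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal solution
    return d_minus / (d_plus + d_minus)

# Example: 3 heating companies scored on cost (minimize), efficiency and renewables share (maximize)
scores = topsis([[55, 0.82, 0.30], [61, 0.90, 0.55], [48, 0.75, 0.10]],
                weights=[0.4, 0.4, 0.2], benefit=[False, True, True])
print(scores)   # in the paper's terms, companies with closeness >= 0.5 would meet the benchmark
```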

  18. The Interactive Attribution of School Success in Multi-Ethnic Schools

    ERIC Educational Resources Information Center

    de Haan, Mariette; Wissink, Inge

    2013-01-01

    The study shows how explanations for school success are expressed and dialogically constructed during teacher-parent conferences at school. Attribution theory is used to conceptualize the various explanations for school success that were expressed. However, instead of only looking at attributions as beliefs which individuals or groups "have", the…

  19. Decision Making In Assignment Problem With Multiple Attributes Under Intuitionistic Fuzzy Environment

    NASA Astrophysics Data System (ADS)

    Mukherjee, Sathi; Basu, Kajla

    2010-10-01

    In this paper we develop a methodology to solve the multiple attribute assignment problems where the attributes are considered to be Intuitionistic Fuzzy Sets (IFS). We apply the concept of similarity measures of IFS to solve the Intuitionistic Fuzzy Multi-Attribute Assignment Problem (IFMAAP). The weights of the attributes are determined from expert opinion. An illustrative example is solved to verify the developed approach and to demonstrate its practicality.
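
    A minimal sketch of the approach's two ingredients, an IFS similarity measure and an assignment step, is given below; it assumes a simple distance-based similarity measure (the paper's exact measure may differ) and SciPy's Hungarian solver, and the ratings, ideal, and weights are illustrative.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ifs_similarity(a, b):
    """Simple distance-based similarity between two IFS pairs (mu, nu); an assumption,
    not necessarily the measure used in the paper."""
    mu_a, nu_a = a
    mu_b, nu_b = b
    return 1.0 - 0.5 * (abs(mu_a - mu_b) + abs(nu_a - nu_b))

def assign(ratings, ideal, weights):
    """ratings[i][j][k]: IFS rating of candidate i on task j for attribute k.
    Score each (candidate, task) pair by weighted similarity to an ideal rating,
    then maximize total similarity with the Hungarian algorithm."""
    n_people, n_jobs = len(ratings), len(ratings[0])
    score = np.zeros((n_people, n_jobs))
    for i in range(n_people):
        for j in range(n_jobs):
            score[i, j] = sum(w * ifs_similarity(ratings[i][j][k], ideal)
                              for k, w in enumerate(weights))
    rows, cols = linear_sum_assignment(-score)   # negate to maximize total similarity
    return list(zip(rows, cols)), score[rows, cols].sum()

ratings = [  # 2 candidates x 2 tasks x 2 attributes, each rating an (mu, nu) pair
    [[(0.8, 0.1), (0.6, 0.3)], [(0.5, 0.4), (0.7, 0.2)]],
    [[(0.6, 0.2), (0.9, 0.1)], [(0.4, 0.5), (0.6, 0.3)]],
]
print(assign(ratings, ideal=(1.0, 0.0), weights=[0.6, 0.4]))
```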

  20. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis is stated in the same methodology as that of the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analysis is used in a general and specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
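
    A compact illustration of the GUM-style budget structure described above is sketched below; the measurand, the uncertainty contributors, and all numeric limits are assumed for illustration only and are not taken from the report.

```python
import math

# Minimal GUM-style uncertainty budget sketch (illustrative values, not from the report).
# Type A: standard deviation of the mean from repeated readings; Type B: assumed rectangular limits.
def type_a(readings):
    n = len(readings)
    mean = sum(readings) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
    return s / math.sqrt(n)

def type_b_rectangular(half_width):
    return half_width / math.sqrt(3.0)

budget = [
    type_a([10.012, 10.015, 10.011, 10.014, 10.013]),  # repeatability, mm (hypothetical readings)
    type_b_rectangular(0.002),                          # instrument scale error, mm (assumed limits)
    type_b_rectangular(0.001),                          # thermal effects, mm (assumed limits)
]
u_c = math.sqrt(sum(u ** 2 for u in budget))   # combined standard uncertainty (root-sum-square)
U = 2.0 * u_c                                  # expanded uncertainty, coverage factor k = 2 (~95 %)
print(f"u_c = {u_c:.4f} mm, U (k=2) = {U:.4f} mm")
```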

  1. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    PubMed

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.

  2. Optimizing conservation strategies for Mexican freetailed bats: a population viability and ecosystem services approach

    USGS Publications Warehouse

    Wiederholt, Ruscena; Lopez-Hoffman, Laura; Svancara, Colleen; McCracken, Gary; Thogmartin, Wayne E.; Diffendorfer, James E.; Mattson, Brady; Bagstad, Kenneth J.; Cryan, Paul; Russell, Amy; Semmens, Darius J.; Rodrigo A. Medellín,

    2015-01-01

    Conservation planning can be challenging due to the need to balance biological concerns about population viability with social concerns about the benefits biodiversity provide to society, often while operating under a limited budget. Methods and tools that help prioritize conservation actions are critical for the management of at-risk species. Here, we use a multi-attribute utility function to assess the optimal maternity roosts to conserve for maintaining the population viability and the ecosystem services of a single species, the Mexican free-tailed bat (Tadarida brasiliensis mexicana). Mexican free-tailed bats provide ecosystem services such as insect pest-suppression in agricultural areas and recreational viewing opportunities, and may be threatened by climate change and development of wind energy. We evaluated each roost based on five attributes: the maternity roost’s contribution to population viability, the pest suppression ecosystem services to the surrounding area provided by the bats residing in the roost, the ecotourism value of the roost, the risks posed to each roost structure, and the risks posed to the population of bats residing in each roost. We compared several scenarios that prioritized these attributes differently, hypothesizing that the set of roosts with the highest rankings would vary according to the conservation scenario. Our results indicate that placing higher values on different roost attributes (e.g. population importance over ecosystem service value) altered the roost rankings. We determined that the values placed on various conservation objectives are an important determinant of habitat planning.
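
    The roost scoring described above amounts to a weighted additive utility over five attributes; the sketch below shows that general pattern with hypothetical roost values and scenario weights, not the study's data or its exact utility function.

```python
# Sketch of an additive multi-attribute utility scoring of roosts under a weighting scenario.
# Attribute names, weights, and roost values are illustrative, not the study's data.
import numpy as np

attributes = ["population", "pest_suppression", "ecotourism", "roost_risk", "population_risk"]

def roost_utility(values, weights, minimize=("roost_risk", "population_risk")):
    """values: dict attribute -> array over roosts (already scaled to 0..1).
    Risk attributes are inverted so that higher utility is always better."""
    u = np.zeros_like(next(iter(values.values())), dtype=float)
    for attr, w in weights.items():
        v = np.asarray(values[attr], dtype=float)
        u += w * (1.0 - v if attr in minimize else v)
    return u

values = {a: np.random.rand(4) for a in attributes}          # 4 hypothetical roosts
scenario = {"population": 0.4, "pest_suppression": 0.2,
            "ecotourism": 0.1, "roost_risk": 0.15, "population_risk": 0.15}
print(np.argsort(-roost_utility(values, scenario)))          # roost ranking under this scenario
```

    Re-running the ranking under a different weight vector (for example, prioritizing ecosystem services over population importance) is how the scenario comparisons described in the abstract would change the preferred set of roosts.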

  3. Lidar-based fracture characterization: An outcrop-scale study of the Woodford Shale, McAlister Shale Pit, Oklahoma

    NASA Astrophysics Data System (ADS)

    Hanzel, Jason

    The use of lidar (light detection and ranging), a remote sensing tool based on principles of laser optometry, in mapping complex, multi-scale fracture networks had not been rigorously tested prior to this study despite its foreseeable utility in interpreting rock fabric with imprints of complex tectonic evolution. This thesis demonstrates lidar-based characterization of the Woodford Shale where intense fracturing could be due to both tectonism and mineralogy. The study area is the McAlister Shale Pit in south-central Oklahoma where both the upper and middle sections of the Woodford Shale are exposed and can be lidar-mapped. Lidar results are validated using hand-measured strikes and dips of fracture planes, thin sections, and mineral chemistry of selected samples using X-ray diffraction (XRD). Complexity of the fracture patterns as well as inaccessibility of multiple locations within the shale pit makes hand-measurement prone to errors and biases; lidar provides an opportunity for less biased and more efficient field mapping. Fracture mapping with lidar is a multi-step process. The lidar data are converted from point clouds into a mesh through triangulation. User-defined parameters such as size and orientation of the individual triangular elements are then used to group similar elements into surfaces. The strike and dip attributes of the simulated surfaces are visualized in an equal area lower hemisphere projection stereonet. Three fracture sets were identified in the upper and middle sections with common orientation but substantially different spatial density. Measured surface attributes and spatial density relations from lidar were validated using their hand-measured counterparts. Thin section analysis suggests that high fracture density in the upper Woodford measured by both the lidar and the hand-measured data could be due to high quartz. A significant finding of this study is the reciprocal relation between lidar intensity and gamma-ray (GR), which is generally used to infer outcrop mineralogy. XRD analysis of representative samples along the common profiles shows that both GR and lidar intensity were influenced by the same minerals in essentially opposite ways. Results strongly suggest that lidar can not only remotely map the geomorphology, but also capture relative mineralogical variations to a first order of approximation.

  4. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
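
    As a generic illustration of the kind of multivariate data analysis the article applies (the authors' specific model, data, and software are not reproduced), the sketch below fits a partial least squares regression relating synthetic process variables to a synthetic quality attribute.

```python
# Illustrative MVDA sketch (not the authors' model): relate process variables and
# raw-material levels (X) to a product quality attribute such as a glycan fraction (y)
# with partial least squares, a common choice for collinear bioprocess data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(24, 8))                                   # 24 runs x 8 variables (glucose, amino acids, DO, ...)
y = X @ rng.normal(size=8) + rng.normal(scale=0.5, size=24)    # synthetic quality response

pls = PLSRegression(n_components=3)
pls.fit(X, y)
print(pls.score(X, y))                  # R^2 of the fitted latent-variable model
print(np.abs(pls.coef_).ravel())        # coefficient magnitudes hint at influential variables
```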

  5. Multiple-attribute group decision making with different formats of preference information on attributes.

    PubMed

    Xu, Zeshui

    2007-12-01

    Interval utility values, interval fuzzy preference relations, and interval multiplicative preference relations are three common uncertain-preference formats used by decision-makers to provide their preference information in the process of decision making under fuzziness. This paper is devoted to investigating multiple-attribute group-decision-making problems where the attribute values are not precisely known but the value ranges can be obtained, and the decision-makers provide their preference information over attributes by three different uncertain-preference formats, i.e., 1) interval utility values; 2) interval fuzzy preference relations; and 3) interval multiplicative preference relations. We first utilize some functions to normalize the uncertain decision matrix and then transform it into an expected decision matrix. We establish a goal-programming model to integrate the expected decision matrix and all three different uncertain-preference formats, from which the attribute weights and the overall attribute values of alternatives can be obtained. Then, we use the derived overall attribute values to get the ranking of the given alternatives and to select the best one(s). The model can not only reflect both the subjective considerations of all decision-makers and the objective information but also avoid losing and distorting the given objective and subjective decision information in the process of information integration. Furthermore, we establish some models to solve the multiple-attribute group-decision-making problems with three different preference formats: 1) utility values; 2) fuzzy preference relations; and 3) multiplicative preference relations. Finally, we illustrate the applicability and effectiveness of the developed models with two practical examples.

  6. Hybrid Parallelization of Adaptive MHD-Kinetic Module in Multi-Scale Fluid-Kinetic Simulation Suite

    DOE PAGES

    Borovikov, Sergey; Heerikhuisen, Jacob; Pogorelov, Nikolai

    2013-04-01

    The Multi-Scale Fluid-Kinetic Simulation Suite has a computational tool set for solving partially ionized flows. In this paper we focus on recent developments of the kinetic module which solves the Boltzmann equation using the Monte-Carlo method. The module has been recently redesigned to utilize intra-node hybrid parallelization. We describe in detail the redesign process, implementation issues, and modifications made to the code. Finally, we conduct a performance analysis.

  7. Massive land system changes impact water quality of the Jhelum River in Kashmir Himalaya.

    PubMed

    Rather, Mohmmad Irshad; Rashid, Irfan; Shahi, Nuzhat; Murtaza, Khalid Omar; Hassan, Khalida; Yousuf, Abdul Rehman; Romshoo, Shakil Ahmad; Shah, Irfan Yousuf

    2016-03-01

    The pristine aquatic ecosystems in the Himalayas are facing an ever increasing threat from various anthropogenic pressures, which necessitates better understanding of the spatial and temporal variability of pollutants, their sources, and possible remedies. This study demonstrates a multi-disciplinary approach utilizing multivariate statistical techniques and data from remote sensing, lab, and field-based observations for assessing the impact of massive land system changes on water quality of the river Jhelum. Land system changes over a period of 38 years have been quantified using multi-spectral satellite data to delineate the extent of different anthropogenically driven land use types that are the main non-point sources of pollution. Fifteen water quality parameters, at 12 sampling sites distributed uniformly along the length of the Jhelum, have been assessed to identify the possible sources of pollution. Our analysis indicated that 18% of the forested area has degraded into sparse forest or scrublands from 1972 to 2010, and the areas under croplands have decreased by 24% as people shifted from irrigation-intensive agriculture to orchard farming, whereas settlements showed a 397% increase during the observation period. One-way ANOVA revealed that all the water quality parameters had significant spatio-temporal differences (p < 0.01). Cluster analysis (CA) helped us to classify all the sampling sites into three groups. Factor analysis revealed that 91.84% of the total variance was mainly explained by five factors. Drastic changes in water quality of the Jhelum over the past three decades are manifested by increases in nitrate-nitrogen, TDS, and electrical conductivity. The especially high levels of nitrogen (858 ± 405 μg L⁻¹) and phosphorus (273 ± 18 μg L⁻¹) in the Jhelum could be attributed to the reckless application of fertilizers and pesticides, and to unplanned urbanization in the area.

  8. Impact of scale on morphological spatial pattern of forest

    Treesearch

    Katarzyna Ostapowicz; Peter Vogt; Kurt H. Riitters; Jacek Kozak; Christine Estreguil

    2008-01-01

    Assessing and monitoring landscape pattern structure from multi-scale land-cover maps can utilize morphological spatial pattern analysis (MSPA), only if various influences of scale are known and taken into account. This paper lays part of the foundation for applying MSPA analysis in landscape monitoring by quantifying scale effects on six classes of spatial patterns...

  9. Health Seeking in Men: A Concept Analysis.

    PubMed

    Hooper, Gwendolyn L; Quallich, Susanne A

    2016-01-01

    This article describes the analysis of the concept of health seeking in men. Men have shorter life expectancies and utilize health services less often than women, leading to poor health outcomes, but a gendered basis for health seeking remains poorly defined. Walker and Avant’s framework was used to guide this concept analysis. Literature published in English from 1990-2015 was reviewed. Thematic analysis identified attributes, antecedents, and consequences of the concept. Based on the analysis, a contemporary definition for health seeking in men was constructed, rooted in the concept of health. The definition is based on the concept analysis and the defining attributes that were identified. This analysis provides a definition specifically for health seeking in American men, making it more specific and gender-based than the parent concept of “health.” This concept analysis provides conceptual clarity that can guide development of a conceptual framework that may be uniquely relevant to providers in urology. Further exploration will uncover specific cultural, social, sexual, and geographic perspectives.

  10. GPU accelerated dynamic functional connectivity analysis for functional MRI data.

    PubMed

    Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu

    2015-07-01

    Recent advances in multi-core processors and graphics card based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally-intensive problems in various computational science fields including bioinformatics, in which big data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time-courses obtained by sliding-windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. Multicore implementation using OpenMP on 8-core processor provides up to 7.7× speed-up. GPU implementation using CUDA yielded substantial accelerations ranging from 18.5× to 157× speed-up once thread-based and block-based approaches were combined in the analysis. Proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerated the DFC analyses significantly. Developed algorithms make the DFC analyses more practical for multi-subject studies with more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
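
    A plain NumPy sketch of the sliding-window dynamic functional connectivity computation that the paper accelerates is shown below; the window length, step, and data are illustrative, and no GPU or OpenMP code is included.

```python
# Sliding-window dynamic functional connectivity (DFC) sketch with synthetic data.
import numpy as np

def dynamic_fc(timecourses, window=30, step=1):
    """timecourses: array (T, R) of fMRI time points x regions/networks.
    Returns an array (n_windows, R, R) of windowed correlation matrices."""
    T, R = timecourses.shape
    mats = []
    for start in range(0, T - window + 1, step):
        segment = timecourses[start:start + window]
        mats.append(np.corrcoef(segment, rowvar=False))   # R x R correlation for this window
    return np.stack(mats)

fmri = np.random.randn(200, 50)    # 200 time points, 50 regions (synthetic)
dfc = dynamic_fc(fmri)
print(dfc.shape)                   # (171, 50, 50)
```

    Because each window is independent, this loop is what the paper's thread-based and block-based parallelizations distribute across CPU cores and GPU threads.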

  11. An optimized solution of multi-criteria evaluation analysis of landslide susceptibility using fuzzy sets and Kalman filter

    NASA Astrophysics Data System (ADS)

    Gorsevski, Pece V.; Jankowski, Piotr

    2010-08-01

    The Kalman recursive algorithm has been very widely used for integrating navigation sensor data to achieve optimal system performances. This paper explores the use of the Kalman filter to extend the aggregation of spatial multi-criteria evaluation (MCE) and to find optimal solutions with respect to a decision strategy space where a possible decision rule falls. The approach was tested in a case study in the Clearwater National Forest in central Idaho, using existing landslide datasets from roaded and roadless areas and terrain attributes. In this approach, fuzzy membership functions were used to standardize terrain attributes and develop criteria, while the aggregation of the criteria was achieved by the use of a Kalman filter. The approach presented here offers advantages over the classical MCE theory because the final solution includes both the aggregated solution and the areas of uncertainty expressed in terms of standard deviation. A comparison of this methodology with similar approaches suggested that this approach is promising for predicting landslide susceptibility and further application as a spatial decision support system.
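
    The sketch below illustrates the general idea of fuzzy standardization followed by a Kalman-style fusion of criteria, returning both an aggregated score and a standard deviation; the membership functions, noise variances, and attribute values are assumptions for illustration and not the paper's formulation.

```python
# Hedged sketch: standardize terrain attributes with fuzzy membership functions, then
# treat each standardized criterion as a noisy observation of susceptibility and fuse
# them with a scalar Kalman update. All parameter values are illustrative.
import numpy as np

def sigmoid_membership(x, center, width):
    return 1.0 / (1.0 + np.exp(-(x - center) / width))

def kalman_aggregate(criteria, obs_var=0.05, prior=(0.5, 1.0)):
    """criteria: iterable of membership values in [0, 1] for one map cell.
    Returns (mean, std) of the fused susceptibility estimate."""
    x, p = prior                        # prior mean and variance
    for z in criteria:
        k = p / (p + obs_var)           # Kalman gain
        x = x + k * (z - x)             # update mean with this criterion
        p = (1.0 - k) * p               # update variance
    return x, np.sqrt(p)

slope, wetness = 32.0, 8.5              # hypothetical cell attributes
c = [sigmoid_membership(slope, 25, 5), sigmoid_membership(wetness, 10, 2)]
print(kalman_aggregate(c))              # aggregated score plus its standard deviation
```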

  12. DCL System Research Using Advanced Approaches for Land-based or Ship-based Real-Time Recognition and Localization of Marine Mammals

    DTIC Science & Technology

    2012-09-30

    recognition. Algorithm design and statistical analysis and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research...short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource...The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA

  13. Radial sets: interactive visual analysis of large overlapping sets.

    PubMed

    Alsallakh, Bilal; Aigner, Wolfgang; Miksch, Silvia; Hauser, Helwig

    2013-12-01

    In many applications, data tables contain multi-valued attributes that often store the memberships of the table entities in multiple sets, such as which languages a person masters, which skills an applicant documents, or which features a product comes with. With a growing number of entities, the resulting element-set membership matrix becomes very rich in information about how these sets overlap. Many analysis tasks targeted at set-typed data are concerned with these overlaps as salient features of such data. This paper presents Radial Sets, a novel visual technique to analyze set memberships for a large number of elements. Our technique uses frequency-based representations to enable quickly finding and analyzing different kinds of overlaps between the sets, and relating these overlaps to other attributes of the table entities. Furthermore, it enables various interactions to select elements of interest, find out whether they are over-represented in specific sets or overlaps, and whether they exhibit a different distribution for a specific attribute compared to the rest of the elements. These interactions allow formulating highly-expressive visual queries on the elements in terms of their set memberships and attribute values. As we demonstrate via two usage scenarios, Radial Sets enable revealing and analyzing a multitude of overlapping patterns between large sets, beyond the limits of state-of-the-art techniques.
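
    For context, the element-set membership structure and overlap frequencies that Radial Sets visualizes can be tabulated as in the small sketch below; the data are made up and no visualization is produced.

```python
# Count, for each combination of sets, how many elements belong to at least that combination.
import itertools
from collections import Counter

memberships = {                      # element -> sets it belongs to (hypothetical data)
    "alice": {"python", "sql"},
    "bob": {"python", "java", "sql"},
    "carol": {"java"},
    "dave": {"python"},
}
overlap = Counter()
for sets in memberships.values():
    for k in range(1, len(sets) + 1):
        for combo in itertools.combinations(sorted(sets), k):
            overlap[combo] += 1
print(overlap.most_common())         # number of elements whose memberships include each set combination
```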

  14. Multi-State Vibronic Interactions in Fluorinated Benzene Radical Cations.

    NASA Astrophysics Data System (ADS)

    Faraji, S.; Köppel, H.

    2009-06-01

    Conical intersections of potential energy surfaces have emerged as paradigms for signalling strong nonadiabatic coupling effects. An important class of systems where some of these effects have been analyzed in the literature are the benzene and benzenoid cations, where the electronic structure, spectroscopy, and dynamics have received great attention. In the present work a brief overview is given over our theoretical treatments of multi-mode and multi-state vibronic interactions in the benzene radical cation and some of its fluorinated derivatives. The fluorobenzene derivatives are of systematic interest for at least two different reasons. (1) The reduction of symmetry by incomplete fluorination leads to a disappearance of the Jahn-Teller effect present in the parent cation. (2) A specific, more chemical effect of fluorination consists in the energetic increase of the lowest σ-type electronic states of the radical cations. The multi-mode multi-state vibronic interactions between the five lowest electronic states of the fluorobenzene radical cations are investigated theoretically, based on ab initio electronic structure data, and employing the well-established linear vibronic coupling model, augmented by quadratic coupling terms for the totally symmetric vibrational modes. Low-energy conical intersections and strong vibronic couplings are found to prevail within the set of $\tilde{X}$-$\tilde{A}$ and $\tilde{B}$-$\tilde{C}$-$\tilde{D}$ cationic states, while the interactions between these two sets of states are found to be weaker and depend on the particular isomer. This is attributed to the different location of the minima of the various conical intersections occurring in these systems. Wave-packet dynamical simulations for these coupled potential energy surfaces, utilizing the powerful multi-configuration time-dependent Hartree method, are performed. Ultrafast internal conversion processes and the analysis of the MATI and photo-electron spectra shed new light on the spectroscopy and fluorescence dynamics of these species. W. Domcke, D. R. Yarkony, and H. Köppel, Advanced Series in Physical Chemistry, World Scientific, Singapore (2004). M. H. Beck, A. Jäckle, G. A. Worth, and H.-D. Meyer, Phys. Rep. 324, 1 (2000). S. Faraji, H. Köppel, (Part I); S. Faraji, H. Köppel, H.-D. Meyer, (Part II) J. Chem. Phys. 129, 074310 (2008).

  15. Investigation of Priority Needs in Terms of Museum Service Accessibility for Visually Impaired Visitors

    ERIC Educational Resources Information Center

    Handa, Kozue; Dairoku, Hitoshi; Toriyama, Yoshiko

    2010-01-01

    This study investigates the priority needs of museum service accessibility for visually impaired visitors. For this purpose, conjoint analysis was utilized. Four conjoint attributes of museum services were selected: A--facilities for wayfinding; B--exhibitions and collections including objects for touching, hearing, smelling, etc.; C--information…

  16. Application of Person-Centered Approaches to Critical Quantitative Research: Exploring Inequities in College Financing Strategies

    ERIC Educational Resources Information Center

    Malcom-Piqueux, Lindsey

    2014-01-01

    This chapter discusses the utility of person-centered approaches to critical quantitative researchers. These techniques, which identify groups of individuals who share similar attributes, experiences, or outcomes, are contrasted with more commonly used variable-centered approaches. An illustrative example of a latent class analysis of the college…

  17. A Stochastic Multi-Attribute Assessment of Energy Options for Fairbanks, Alaska

    NASA Astrophysics Data System (ADS)

    Read, L.; Madani, K.; Mokhtari, S.; Hanks, C. L.; Sheets, B.

    2012-12-01

    Many competing projects have been proposed to address Interior Alaska's high cost of energy—both for electricity production and for heating. Public and private stakeholders are considering the costs associated with these competing projects which vary in fuel source, subsidy requirements, proximity, and other factors. As a result, the current projects under consideration involve a complex cost structure of potential subsidies and reliance on present and future market prices, introducing a significant amount of uncertainty associated with each selection. Multi-criteria multi-decision making (MCMDM) problems of this nature can benefit from game theory and systems engineering methods, which account for behavior and preferences of stakeholders in the analysis to produce feasible and relevant solutions. This work uses a stochastic MCMDM framework to evaluate the trade-offs of each proposed project based on a complete cost analysis, environmental impact, and long-term sustainability. Uncertainty in the model is quantified via a Monte Carlo analysis, which helps characterize the sensitivity and risk associated with each project. Based on performance measures and criteria outlined by the stakeholders, a decision matrix will inform policy on selecting a project that is both efficient and preferred by the constituents.

  18. Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought

    NASA Astrophysics Data System (ADS)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.

    2017-10-01

    Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
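
    The factual-versus-counterfactual comparison described above is usually summarized with the fraction of attributable risk; the sketch below shows that calculation with illustrative exceedance counts, not the study's results.

```python
# Fraction of attributable risk (FAR) as used in event attribution: compare the
# probability of exceeding the observed drought threshold in factual vs. counterfactual
# simulations. The counts below are illustrative only.
def far(p_factual, p_counterfactual):
    """FAR = 1 - P0/P1, where P1 is the factual and P0 the counterfactual probability."""
    return 1.0 - p_counterfactual / p_factual

p1 = 18 / 400    # e.g., 18 of 400 factual model summers drier than the 2015 threshold
p0 = 9 / 400     # e.g., 9 of 400 counterfactual (pre-industrial forcing) summers
print(far(p1, p0))   # 0.5 -> half of the event's occurrence probability attributable to forcing
```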

  19. Research on efficiency evaluation model of integrated energy system based on hybrid multi-attribute decision-making.

    PubMed

    Li, Yan

    2017-05-25

    Efficiency evaluation of an integrated energy system involves many influencing factors whose attribute values are heterogeneous and non-deterministic; they often cannot be given as specific numerical values or accurate probability distributions, which biases the final evaluation result. According to the characteristics of the integrated energy system, a hybrid multi-attribute decision-making model is constructed that takes the decision maker's risk preference into account. In evaluating the efficiency of an integrated energy system, some evaluation indexes take linguistic values, or the experts' assessments are inconsistent; this leads to ambiguity in the decision information, usually in the form of uncertain linguistic values and numerical interval values. An interval-valued multiple-attribute decision-making method and a fuzzy linguistic multiple-attribute decision-making model are therefore proposed. Finally, a mathematical model for efficiency evaluation of the integrated energy system is constructed.

  20. Some induced intuitionistic fuzzy aggregation operators applied to multi-attribute group decision making

    NASA Astrophysics Data System (ADS)

    Su, Zhi-xin; Xia, Guo-ping; Chen, Ming-yuan

    2011-11-01

    In this paper, we define various induced intuitionistic fuzzy aggregation operators, including induced intuitionistic fuzzy ordered weighted averaging (OWA) operator, induced intuitionistic fuzzy hybrid averaging (I-IFHA) operator, induced interval-valued intuitionistic fuzzy OWA operator, and induced interval-valued intuitionistic fuzzy hybrid averaging (I-IIFHA) operator. We also establish various properties of these operators. And then, an approach based on I-IFHA operator and intuitionistic fuzzy weighted averaging (WA) operator is developed to solve multi-attribute group decision-making (MAGDM) problems. In such problems, attribute weights and the decision makers' (DMs') weights are real numbers and attribute values provided by the DMs are intuitionistic fuzzy numbers (IFNs), and an approach based on I-IIFHA operator and interval-valued intuitionistic fuzzy WA operator is developed to solve MAGDM problems where the attribute values provided by the DMs are interval-valued IFNs. Furthermore, induced intuitionistic fuzzy hybrid geometric operator and induced interval-valued intuitionistic fuzzy hybrid geometric operator are proposed. Finally, a numerical example is presented to illustrate the developed approaches.
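
    As background for the induced operators defined above, the sketch below implements the basic intuitionistic fuzzy weighted averaging (IFWA) aggregation of IFNs; the input IFNs and weights are illustrative.

```python
# IFWA aggregation of intuitionistic fuzzy numbers (mu, nu) with normalized weights.
import numpy as np

def ifwa(ifns, weights):
    """ifns: list of (membership mu, non-membership nu); weights sum to 1."""
    mu = np.array([a[0] for a in ifns])
    nu = np.array([a[1] for a in ifns])
    w = np.asarray(weights, dtype=float)
    agg_mu = 1.0 - np.prod((1.0 - mu) ** w)   # aggregated membership
    agg_nu = np.prod(nu ** w)                  # aggregated non-membership
    return agg_mu, agg_nu

print(ifwa([(0.7, 0.2), (0.5, 0.4), (0.8, 0.1)], [0.5, 0.3, 0.2]))
```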

  1. Non-ad-hoc decision rule for the Dempster-Shafer method of evidential reasoning

    NASA Astrophysics Data System (ADS)

    Cheaito, Ali; Lecours, Michael; Bosse, Eloi

    1998-03-01

    This paper is concerned with the fusion of identity information through the use of statistical analysis rooted in the Dempster-Shafer theory of evidence to provide automatic identification aboard a platform. An identity information process for a baseline Multi-Source Data Fusion (MSDF) system is defined. The MSDF system is applied to information sources which include a number of radars, IFF systems, an ESM system, and a remote track source. We use a comprehensive Platform Data Base (PDB) containing all the possible identity values that the potential target may take, and we use fuzzy logic strategies which enable the fusion of subjective attribute information from sensors and the PDB, so that target identity can be derived more quickly, more precisely, and with statistically quantifiable measures of confidence. The conventional Dempster-Shafer approach lacks a formal basis upon which decisions can be made in the face of ambiguity. We define a non-ad hoc decision rule based on the expected utility interval for pruning the 'unessential' propositions which would otherwise overload real-time data fusion systems. An example has been selected to demonstrate the implementation of our modified Dempster-Shafer method of evidential reasoning.
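
    A minimal version of the evidence-combination step underlying such a system is sketched below (Dempster's rule for two basic mass assignments); the frame of discernment and the mass values are hypothetical and far simpler than the paper's platform database, and the expected-utility-interval decision rule is not reproduced.

```python
# Dempster's rule of combination for two mass assignments over a small frame of discernment.
from itertools import product

def combine(m1, m2):
    """m1, m2: dict mapping frozenset hypotheses to mass. Returns the normalized combination."""
    conflict = 0.0
    fused = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

frame = frozenset({"friend", "hostile", "neutral"})
m_esm = {frozenset({"hostile"}): 0.6, frame: 0.4}              # hypothetical ESM evidence
m_iff = {frozenset({"friend", "neutral"}): 0.7, frame: 0.3}    # hypothetical IFF evidence
print(combine(m_esm, m_iff))
```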

  2. Mirage events & driver haptic steering alerts in a motion-base driving simulator: A method for selecting an optimal HMI.

    PubMed

    Talamonti, Walter; Tijerina, Louis; Blommer, Mike; Swaminathan, Radhakrishnan; Curry, Reates; Ellis, R Darin

    2017-11-01

    This paper describes a new method, a 'mirage scenario,' to support formative evaluation of driver alerting or warning displays for manual and automated driving. This method provides driving contexts (e.g., various Times-To-Collision (TTCs) to a lead vehicle) briefly presented and then removed. In the present study, during each mirage event, a haptic steering display was evaluated. This haptic display indicated a steering response may be initiated to drive around an obstacle ahead. A motion-base simulator was used in a 32-participant study to present vehicle motion cues similar to the actual application. Surprise was neither present nor of concern, as it would be for a summative evaluation of a forward collision warning system. Furthermore, no collision avoidance maneuvers were performed, thereby reducing the risk of simulator sickness. This paper illustrates the mirage scenario procedures, the rating methods and definitions used with the mirage scenario, and analysis of the ratings obtained, together with a multi-attribute utility theory (MAUT) approach to evaluate and select among alternative designs for future summative evaluation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. BioVLAB-mCpG-SNP-EXPRESS: A system for multi-level and multi-perspective analysis and exploration of DNA methylation, sequence variation (SNPs), and gene expression from multi-omics data.

    PubMed

    Chae, Heejoon; Lee, Sangseon; Seo, Seokjun; Jung, Daekyoung; Chang, Hyeonsook; Nephew, Kenneth P; Kim, Sun

    2016-12-01

    Measuring gene expression, DNA sequence variation, and DNA methylation status is routinely done using high throughput sequencing technologies. To analyze such multi-omics data and explore relationships, reliable bioinformatics systems are much needed. Existing systems are either for exploring curated data or for processing omics data in the form of a library such as R. Thus, scientists have much difficulty in investigating relationships among gene expression, DNA sequence variation, and DNA methylation using multi-omics data. In this study, we report a system called BioVLAB-mCpG-SNP-EXPRESS for the integrated analysis of DNA methylation, sequence variation (SNPs), and gene expression for distinguishing cellular phenotypes at the pairwise and multiple phenotype levels. The system can be deployed on either the Amazon cloud or a publicly available high-performance computing node, and the data analysis and exploration of the analysis result can be conveniently done using a web-based interface. In order to alleviate analysis complexity, all processes are fully automated, and a graphical workflow system is integrated to represent real-time analysis progress. The BioVLAB-mCpG-SNP-EXPRESS system works in three stages. First, it processes and analyzes multi-omics data as input in the form of raw data, i.e., FastQ files. Second, various integrated analyses such as methylation vs. gene expression and mutation vs. methylation are performed. Finally, the analysis result can be explored in a number of ways through a web interface for multi-level, multi-perspective exploration. Multi-level interpretation can be done at the gene, gene set, pathway, or network level, and multi-perspective exploration can proceed from the gene expression, DNA methylation, sequence variation, or relationship perspective. The utility of the system is demonstrated by analysis of a data set of 30 phenotypically distinct breast cancer cell lines. BioVLAB-mCpG-SNP-EXPRESS is available at http://biohealth.snu.ac.kr/software/biovlab_mcpg_snp_express/. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Analysis of information quality attribute for SME towards adoption of research result

    NASA Astrophysics Data System (ADS)

    Febriani, E.; Dewobroto, W. S.; Anggraini, R. D.

    2017-12-01

    Small and medium enterprises (SMEs) play a significant role in fostering the Indonesian economy. However, research intended to support the development of SME businesses has not yet been fully adopted or utilized. Information attributes may be used as a benchmark to identify what SMEs want from a research result and to develop an information-quality strategy for both SMEs and researchers. Given the importance of the information quality attributes required by SMEs, this research therefore aims to analyze the information quality required by SMEs and to classify it into dimensions of information quality. The research started by distributing an online questionnaire to SMEs. The questionnaire results showed that the content dimension is the aspect most required by SMEs, followed by the time and form dimensions, respectively. The information quality attribute SMEs require most from research is that the result can be applied to their business.

  5. Population Attributable and Preventable Fractions: Cancer Risk Factor Surveillance, and Cancer Policy Projection

    PubMed Central

    Shield, Kevin D.; Parkin, D. Maxwell; Whiteman, David C.; Rehm, Jürgen; Viallon, Vivian; Micallef, Claire Marant; Vineis, Paolo; Rushton, Lesley; Bray, Freddie; Soerjomataram, Isabelle

    2016-01-01

    The proportions of new cancer cases and deaths that are caused by exposure to risk factors and that could be prevented are key statistics for public health policy and planning. This paper summarizes the methodologies for estimating, challenges in the analysis of, and utility of, population attributable and preventable fractions for cancers caused by major risk factors such as tobacco smoking, dietary factors, high body fat, physical inactivity, alcohol consumption, infectious agents, occupational exposure, air pollution, sun exposure, and insufficient breastfeeding. For population attributable and preventable fractions, evidence of a causal relationship between a risk factor and cancer, outcome (such as incidence and mortality), exposure distribution, relative risk, theoretical-minimum-risk, and counterfactual scenarios need to be clearly defined and congruent. Despite limitations of the methodology and the data used for estimations, the population attributable and preventable fractions are a useful tool for public health policy and planning. PMID:27547696
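
    One of the standard estimators summarized in reviews of this kind is Levin's formula for the population attributable fraction; a small sketch with illustrative numbers follows, leaving aside the more elaborate counterfactual and latency adjustments the paper discusses.

```python
# Levin's formula for the population attributable fraction (PAF); numbers are illustrative.
def paf(prevalence, relative_risk):
    """PAF = p(RR - 1) / (p(RR - 1) + 1), with p the exposure prevalence."""
    return prevalence * (relative_risk - 1.0) / (prevalence * (relative_risk - 1.0) + 1.0)

print(paf(0.20, 2.5))   # e.g., 20 % exposed, RR = 2.5 -> about 23 % of cases attributable
```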

  6. A Multi-Scale, Multi-Physics Optimization Framework for Additively Manufactured Structural Components

    NASA Astrophysics Data System (ADS)

    El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel

    This paper proposes an optimization framework enabling the integration of multi-scale / multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.

  7. Situation exploration in a persistent surveillance system with multidimensional data

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.

    2013-03-01

    There is an emerging need for fusing hard and soft sensor data in an efficient surveillance system to provide accurate estimation of situation awareness. These mostly abstract, multi-dimensional and multi-sensor data pose a great challenge to the user in performing analysis of multi-threaded events efficiently and cohesively. To address this concern, an interactive Visual Analytics (VA) application is developed for rapid assessment and evaluation of different hypotheses based on context-sensitive ontologies spawned from taxonomies describing human/human and human/vehicle/object interactions. A methodology is described here for generating relevant ontologies in a Persistent Surveillance System (PSS), and it is demonstrated how they can be utilized in the context of PSS to track and identify group activities pertaining to potential threats. The proposed VA system allows for visual analysis of raw data as well as metadata that have spatiotemporal representation and content-based implications. Additionally, a technique for rapid search of tagged information contingent on ranking and confidence is explained for analysis of multi-dimensional data. Lastly, the issue of uncertainty associated with processing and interpretation of heterogeneous data is also addressed.

  8. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap using the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).

  9. Investigating the multi-causal and complex nature of the accident causal influence of construction project features.

    PubMed

    Manu, Patrick A; Ankrah, Nii A; Proverbs, David G; Suresh, Subashini

    2012-09-01

    Construction project features (CPFs) are organisational, physical and operational attributes that characterise construction projects. Although previous studies have examined the accident causal influence of CPFs, the multi-causal attribute of this causal phenomenon still remains elusive and thus requires further investigation. Aiming to shed light on this facet of the accident causal phenomenon of CPFs, this study examines relevant literature and crystallises the attained insight into the multi-causal attribute in a graphical model, which is subsequently operationalised by a derived mathematical risk expression that offers a systematic approach for evaluating the potential of CPFs to cause harm and consequently their health and safety (H&S) risk implications. The graphical model and the risk expression put forth by the study thus advance current understanding of the accident causal phenomenon of CPFs and they present an opportunity for project participants to manage the H&S risk associated with CPFs from the early stages of project procurement. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Multifunctional spaces in slum settlements and their relation to activity pattern case study of Kampung Sangkrah, Surakarta

    NASA Astrophysics Data System (ADS)

    Shobirin, Abyzhar; Ramadhanty, Almira Husna; Hardiana, Ana

    2018-02-01

    Surakarta is a rapidly urbanizing city, which limits the availability of land within its urban area. This entangled problem has resulted in the development of slum settlements that spread across the city. One of the slum concentration areas is located in the downstream Pepe riverbank area that falls within the Kampung Sangkrah administrative boundaries. Slum settlements are characterized as densely populated areas lacking, or entirely without, open space. This condition forces slum inhabitants to use their available spaces effectively, even multi-functionally. This research aims to observe how slum inhabitants multi-functionally use the spaces around their houses and to determine the typology of multifunctional space and the factors that influence it. To understand this phenomenon, the research used an activity-pattern perspective. The scope of observation covers in-house (internal) space utilization and neighborhood-level (external) space utilization. The data were collected primarily through site observations and interviews, with sampling used for data collection on in-house activities and space utilization. The analysis was conducted qualitatively using a descriptive method. The research concluded that there are three types of multifunctional space utilization within slum settlements, and that the utilization of spaces, whether internal or external, varies depending on the inhabitants' economic activities.

  11. Action tagging in a multi-user indoor environment for behavioural analysis purposes.

    PubMed

    Guerra, Claudio; Bianchi, Valentina; De Munari, Ilaria; Ciampolini, Paolo

    2015-01-01

    The EU population is getting older, and ICT-based solutions are expected to provide support for the challenges implied by this demographic change. At the University of Parma, an AAL (Ambient Assisted Living) system named CARDEA has been developed. In this paper, a new feature of the system is introduced, in which environmental and personal (i.e., wearable) sensors coexist, providing an accurate picture of the user's activity and needs. Environmental devices may greatly help in performing activity recognition and behavioral analysis tasks. However, in a multi-user environment, this implies the need to attribute environmental sensor outcomes to a specific user, i.e., identifying the user when he performs a task detected by an environmental device. We implemented such an "action tagging" feature, based on information fusion, within the CARDEA environment, as an inexpensive, alternative solution to the problematic issue of indoor locationing.

  12. Integral Vision: A Multi-Perspective Approach to the Recognition of Graduate Attributes

    ERIC Educational Resources Information Center

    Haigh, Martin; Clifford, Valerie A.

    2011-01-01

    The increasing focus of universities on employability is stimulating debates about the purpose of higher education. In this article, we consider what attributes society will demand from graduates in the future. We use Wilber's integral theory to tease out some of the issues in the current conceptualisation of graduate attributes and argue that we…

  13. Attribute Utility Motivated k-anonymization of Datasets to Support the Heterogeneous Needs of Biomedical Researchers

    PubMed Central

    Ye, Huimin; Chen, Elizabeth S.

    2011-01-01

    In order to support the increasing need to share electronic health data for research purposes, various methods have been proposed for privacy preservation including k-anonymity. Many k-anonymity models provide the same level of anonymization regardless of practical need, which may decrease the utility of the dataset for a particular research study. In this study, we explore extensions to the k-anonymity algorithm that aim to satisfy the heterogeneous needs of different researchers while preserving privacy as well as utility of the dataset. The proposed algorithm, Attribute Utility Motivated k-anonymization (AUM), involves analyzing the characteristics of attributes and utilizing them to minimize information loss during the anonymization process. Through comparison with two existing algorithms, Mondrian and Incognito, preliminary results indicate that AUM may preserve more information from original datasets thus providing higher quality results with lower distortion. PMID:22195223
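
    As a point of reference for the privacy model this abstract builds on, the sketch below checks plain k-anonymity over a set of quasi-identifier columns; AUM's attribute-utility-driven generalisation is not reproduced here, and the column names and records are invented for illustration.

    ```python
    # Minimal k-anonymity check on quasi-identifier columns (illustrative only; the AUM
    # algorithm additionally chooses generalisations that minimise information loss).
    import pandas as pd

    def is_k_anonymous(df, quasi_identifiers, k):
        """True if every combination of quasi-identifier values occurs at least k times."""
        group_sizes = df.groupby(quasi_identifiers).size()
        return bool((group_sizes >= k).all())

    records = pd.DataFrame({
        "age_band":   ["30-39", "30-39", "30-39", "40-49", "40-49"],
        "zip_prefix": ["021**", "021**", "021**", "027**", "027**"],
        "diagnosis":  ["flu", "asthma", "flu", "flu", "copd"],  # sensitive attribute
    })
    print(is_k_anonymous(records, ["age_band", "zip_prefix"], k=2))  # True
    ```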

  14. Attribution of the 2015 record high sea surface temperatures over the central equatorial Pacific and tropical Indian Ocean

    NASA Astrophysics Data System (ADS)

    Park, In-Hong; Min, Seung-Ki; Yeh, Sang-Wook; Weller, Evan; Kim, Seon Tae

    2017-04-01

    This study assessed the anthropogenic contribution to the 2015 record-breaking high sea surface temperatures (SSTs) observed in the central equatorial Pacific and tropical Indian Ocean. Considering a close link between extreme warm events in these regions, we conducted a joint attribution analysis using a fraction of attributable risk approach. Probability of occurrence of such extreme anomalies and long-term trends for the two oceanic regions were compared between CMIP5 multi-model simulations with and without anthropogenic forcing. Results show that the excessive warming in both regions is well beyond the range of natural variability and robustly attributable to human activities, mainly the increase in greenhouse gases. We further explored associated mechanisms including the Bjerknes feedback and background anthropogenic warming. It is concluded that background warming was the main contributor to the 2015 extreme SST event over the central equatorial Pacific Ocean under developing El Niño conditions, which in turn induced the extreme SST event over the tropical Indian Ocean through the atmospheric bridge effect.
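
    The fraction of attributable risk approach mentioned above compares the probability of exceeding an observed extreme in simulations without and with anthropogenic forcing, FAR = 1 - P_nat/P_all. A minimal sketch with synthetic ensembles (all numbers invented) illustrates the calculation.

    ```python
    # Fraction of attributable risk (FAR) sketch: exceedance probabilities of an SST-anomaly
    # threshold in "natural-only" vs. "all-forcing" ensembles (synthetic values, not CMIP5 data).
    import numpy as np

    rng = np.random.default_rng(0)
    nat = rng.normal(0.0, 0.4, size=5000)        # hypothetical natural-only anomalies (degC)
    all_forc = rng.normal(0.6, 0.4, size=5000)   # hypothetical all-forcing anomalies (degC)

    threshold = 1.0                              # observed 2015-like extreme anomaly
    p_nat = (nat >= threshold).mean()
    p_all = (all_forc >= threshold).mean()
    far = 1.0 - p_nat / p_all
    print(f"P_nat={p_nat:.3f}  P_all={p_all:.3f}  FAR={far:.2f}")
    ```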

  15. Introducing anisotropic Minkowski functionals and quantitative anisotropy measures for local structure analysis in biomedical imaging

    NASA Astrophysics Data System (ADS)

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2013-03-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10⁻⁴). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications.
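
    Fractional anisotropy, borrowed from diffusion tensor imaging as described above, is commonly computed from the eigenvalues of a local tensor. The sketch below uses the standard DTI-style formula; how the tensor is built from anisotropic Minkowski Functionals is not shown, and the eigenvalues are illustrative.

    ```python
    # Standard DTI-style fractional anisotropy of three eigenvalues (illustrative values).
    import numpy as np

    def fractional_anisotropy(eigvals):
        lam = np.asarray(eigvals, dtype=float)
        num = np.sqrt(((lam - lam.mean()) ** 2).sum())
        den = np.sqrt((lam ** 2).sum())
        return np.sqrt(1.5) * num / den

    print(fractional_anisotropy([1.0, 1.0, 1.0]))   # 0.0  -> isotropic structure
    print(fractional_anisotropy([1.0, 0.1, 0.1]))   # ~0.89 -> strongly anisotropic structure
    ```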

  16. Land cover's refined classification based on multi source of remote sensing information fusion: a case study of national geographic conditions census in China

    NASA Astrophysics Data System (ADS)

    Cheng, Tao; Zhang, Jialong; Zheng, Xinyan; Yuan, Rujin

    2018-03-01

    The First National Geographic Conditions Census, a project developed by the Chinese government, defined the data acquisition content and indexes and built a corresponding classification system based mainly on the natural properties of surface materials. However, no unified standard for a land cover classification system has been established, and the census products often need conversion to meet actual user needs. This paper therefore proposes a refined classification method based on the fusion of multi-source remote sensing information. Taking the third-level classes of forest land and grassland as an example, it combines thematic data from the Vegetation Map of China (1:1,000,000) and develops the refined classification with a raster spatial analysis model. A study area was selected and the refined classification was produced using the proposed method. The results show that land cover within the study area falls principally into 20 classes, ranging from subtropical broad-leaved forest (31131) to the grass-forb community type of low-coverage grassland (41192). Moreover, over the past 30 years the climatic factors, developmental rhythm characteristics and vegetation ecological-geographical characteristics of the study area have not changed fundamentally; only some of the original vegetation types have changed in spatial distribution or land cover type. The research shows that refined classification of the third-level classes of forest land and grassland allows the results to carry both the original natural attributes and plant community ecology characteristics, which can meet the needs of some industry applications and has practical significance for promoting the products of The First National Geographic Conditions Census.

  17. Introducing Anisotropic Minkowski Functionals and Quantitative Anisotropy Measures for Local Structure Analysis in Biomedical Imaging

    PubMed Central

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2017-01-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10⁻⁴). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications. PMID:29170580

  18. A Quality by Design approach to investigate tablet dissolution shift upon accelerated stability by multivariate methods.

    PubMed

    Huang, Jun; Goolcharran, Chimanlall; Ghosh, Krishnendu

    2011-05-01

    This paper presents the use of experimental design, optimization and multivariate techniques to investigate the root cause of tablet dissolution shift (slow-down) upon stability and to develop control strategies for a drug product during formulation and process development. The effectiveness and usefulness of these methodologies were demonstrated through two application examples. In both applications, dissolution slow-down was observed during a 4-week accelerated stability test under a 51°C/75% RH storage condition. In Application I, an experimental design was carried out to evaluate the interactions and effects of the design factors on the critical quality attribute (CQA) of dissolution upon stability. The design space was studied by design of experiment (DOE) and multivariate analysis to ensure the desired dissolution profile and minimal dissolution shift upon stability. Multivariate techniques, such as multi-way principal component analysis (MPCA) of the entire dissolution profiles upon stability, were performed to reveal batch relationships and to evaluate the impact of design factors on dissolution. In Application II, an experiment was conducted to study the impact of varying tablet breaking force on dissolution upon stability utilizing MPCA. It was demonstrated that the use of multivariate methods, among the Quality by Design (QbD) principles and tools defined in the ICH Q8 guidance, provides an effective means to achieve a greater understanding of tablet dissolution upon stability. Copyright © 2010 Elsevier B.V. All rights reserved.
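
    Multi-way PCA of entire dissolution profiles is typically performed by unfolding the batch x time-point x stability-week array so that each batch becomes one row before ordinary PCA is applied. A minimal sketch with invented dimensions and data illustrates the idea; it is not the authors' exact procedure.

    ```python
    # MPCA sketch by batch-wise unfolding of a three-way dissolution array (synthetic data).
    import numpy as np
    from sklearn.decomposition import PCA

    n_batches, n_timepoints, n_weeks = 12, 6, 4          # hypothetical dimensions
    rng = np.random.default_rng(1)
    dissolution = rng.uniform(40, 100, size=(n_batches, n_timepoints, n_weeks))

    unfolded = dissolution.reshape(n_batches, -1)        # (batches) x (timepoints * weeks)
    scores = PCA(n_components=2).fit_transform(unfolded)
    print(scores.shape)   # (12, 2): batch relationships can be inspected in score plots
    ```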

  19. Exploring the impact of different multi-level measures of physician communities in patient-centric care networks on healthcare outcomes: A multi-level regression approach.

    PubMed

    Uddin, Shahadat

    2016-02-04

    A patient-centric care network can be defined as a network among a group of healthcare professionals who provide treatments to common patients. Various multi-level attributes of the members of this network have a substantial influence on its perceived level of performance. In order to assess the impact of different multi-level attributes of patient-centric care networks on healthcare outcomes, this study first captured patient-centric care networks for 85 hospitals using a health insurance claim dataset. From these networks, this study then constructed physician collaboration networks based on the concept of patient-sharing networks among physicians. A multi-level regression model was then developed to explore the impact of different attributes that are organised at two levels on hospitalisation cost and hospital length of stay. For the Level-1 model, the average visit per physician significantly predicted both hospitalisation cost and hospital length of stay. The number of different physicians significantly predicted only the hospitalisation cost, which was significantly moderated by age, gender and Comorbidity score of patients. All Level-1 findings showed significant variance across physician collaboration networks having different community structure and density. These findings could be utilised as a reflective measure by healthcare decision makers. Moreover, healthcare managers could consider them in developing effective healthcare environments.
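
    A two-level regression of the kind described above can be sketched as a random-intercept model with admissions nested in physician collaboration networks. The column names, data and package choice (statsmodels) below are assumptions for illustration only, not the study's model.

    ```python
    # Random-intercept (two-level) regression sketch: Level-1 predictors of hospitalisation
    # cost, with observations grouped by physician collaboration network (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({
        "network_id":   rng.integers(0, 20, n),     # grouping factor (Level 2)
        "avg_visits":   rng.poisson(4, n),          # Level-1 predictor
        "n_physicians": rng.integers(1, 10, n),     # Level-1 predictor
    })
    df["cost"] = (2000 + 150 * df["avg_visits"] + 80 * df["n_physicians"]
                  + rng.normal(0, 300, n))

    model = smf.mixedlm("cost ~ avg_visits + n_physicians", data=df, groups=df["network_id"])
    print(model.fit().summary())
    ```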

  20. Analysis of Intelligence and Academic Scores as a Predictor of Promotion Rate for U.S. Army Noncommissioned Officers.

    DTIC Science & Technology

    1987-07-06

    levels of intelligence tests and academic background as values to predict promotion. The model, however, demonstrated only limited utility as a predictive...attributes in the form of promotion points or a minimum threshold scale would be one approach. Unfortunately, this may artificially force NCOs of less

  1. Architecture-led Requirements and Safety Analysis of an Aircraft Survivability Situational Awareness System

    DTIC Science & Technology

    2015-05-01

    quality attributes. Prioritization of the utility tree leaves driven by mission goals helps the user ensure that critical requirements are well-specified...Methods: State of the Art and Future Directions", ACM Computing Surveys. 1996. 10 Laitenberger, Oliver, "A Survey of Software Inspection Technologies, Handbook on Software Engineering and Knowledge Engineering". 2002.

  2. Estimating mapped-plot forest attributes with ratios of means

    Treesearch

    S.J. Zarnoch; W.A. Bechtold

    2000-01-01

    The mapped-plot design utilized by the U.S. Department of Agriculture (USDA) Forest Inventory and Analysis and the National Forest Health Monitoring Programs is described. Data from 2458 forested mapped plots systematically spread across 25 States reveal that 35 percent straddle multiple conditions. The ratio-of-means estimator is developed as a method to obtain...
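
    The ratio-of-means estimator referred to above takes the form R̂ = Σy_i / Σx_i, e.g. the summed attribute over plots divided by the summed forested proportions of those plots. A tiny sketch with invented plot values shows the calculation.

    ```python
    # Ratio-of-means sketch for mapped plots (illustrative data, not FIA measurements).
    import numpy as np

    volume = np.array([12.0, 0.0, 30.5, 8.2])       # attribute y_i measured on each plot
    forest_prop = np.array([0.6, 0.0, 1.0, 0.4])    # proportion x_i of each plot in forest

    ratio_of_means = volume.sum() / forest_prop.sum()   # R_hat = sum(y_i) / sum(x_i)
    print(ratio_of_means)
    ```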

  3. Facies Modeling Using 3D Pre-Stack Simultaneous Seismic Inversion and Multi-Attribute Probability Neural Network Transform in the Wattenberg Field, Colorado

    NASA Astrophysics Data System (ADS)

    Harryandi, Sheila

    The Niobrara/Codell unconventional tight reservoir play at Wattenberg Field, Colorado has potentially two billion barrels of oil equivalent requiring hundreds of wells to access this resource. The Reservoir Characterization Project (RCP), in conjunction with Anadarko Petroleum Corporation (APC), began reservoir characterization research to determine how to increase reservoir recovery while maximizing operational efficiency. Past research results indicate that targeting the highest rock quality within the reservoir section for hydraulic fracturing is optimal for improving horizontal well stimulation through multi-stage hydraulic fracturing. The reservoir is highly heterogeneous, consisting of alternating chalks and marls. Modeling the facies within the reservoir is very important to be able to capture the heterogeneity at the well-bore scale; this heterogeneity is then upscaled from the borehole scale to the seismic scale to distribute the heterogeneity in the inter-well space. I performed facies clustering analysis to create several facies defining the reservoir interval in the RCP Wattenberg Field study area. Each facies can be expressed in terms of a range of rock property values from wells obtained by cluster analysis. I used the facies classification from the wells to guide the pre-stack seismic inversion and multi-attribute transform. The seismic data extended the facies information and rock quality information from the wells. By obtaining this information from the 3D facies model, I generated a facies volume capturing the reservoir heterogeneity throughout a ten square mile study-area within the field area. Recommendations are made based on the facies modeling, which include the location for future hydraulic fracturing/re-fracturing treatments to improve recovery from the reservoir, and potential deeper intervals for future exploration drilling targets.

  4. Economic and environmental optimization of a multi-site utility network for an industrial complex.

    PubMed

    Kim, Sang Hun; Yoon, Sung-Geun; Chae, Song Hwa; Park, Sunwon

    2010-01-01

    Most chemical companies consume a lot of steam, water and electrical resources in the production process. Given recent record fuel costs, utility networks must be optimized to reduce the overall cost of production. Environmental concerns must also be considered when preparing modifications to satisfy the requirements for industrial utilities, since wastes discharged from the utility networks are restricted by environmental regulations. Construction of Eco-Industrial Parks (EIPs) has drawn attention as a promising approach for retrofitting existing industrial parks to improve energy efficiency. The optimization of the utility network within an industrial complex is one of the most important undertakings to minimize energy consumption and waste loads in the EIP. In this work, a systematic approach to optimize the utility network of an industrial complex is presented. An important issue in the optimization of a utility network is the desire of the companies to achieve high profits while complying with the environmental regulations. Therefore, the proposed optimization was performed with consideration of both economic and environmental factors. The proposed approach consists of unit modeling using thermodynamic principles, mass and energy balances, development of a multi-period Mixed Integer Linear Programming (MILP) model for the integration of utility systems in an industrial complex, and an economic/environmental analysis of the results. This approach is applied to the Yeosu Industrial Complex, considering seasonal utility demands. The results show that both the total utility cost and waste load are reduced by optimizing the utility network of an industrial complex. 2009 Elsevier Ltd. All rights reserved.
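
    A multi-period MILP of the type described can be sketched with an open-source solver; the toy model below (hypothetical demands, costs and a single boiler, using the PuLP library) only illustrates the structure of such a formulation, not the Yeosu Industrial Complex model.

    ```python
    # Toy multi-period MILP: operate a boiler to meet steam demand at minimum fuel cost
    # plus an emission penalty. All units and numbers are invented.
    import pulp

    periods = ["winter", "summer"]
    demand = {"winter": 120.0, "summer": 70.0}           # steam demand per period
    capacity, fuel_cost, emission_factor = 150.0, 2.0, 0.5

    prob = pulp.LpProblem("utility_network", pulp.LpMinimize)
    on = pulp.LpVariable.dicts("boiler_on", periods, cat="Binary")
    steam = pulp.LpVariable.dicts("steam", periods, lowBound=0)

    prob += pulp.lpSum(fuel_cost * steam[t] + 10.0 * emission_factor * steam[t] for t in periods)
    for t in periods:
        prob += steam[t] >= demand[t]                    # meet demand in every period
        prob += steam[t] <= capacity * on[t]             # produce only when the boiler is on

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print({t: steam[t].value() for t in periods})
    ```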

  5. Automated Student Aid Processing: The Challenge and Opportunity.

    ERIC Educational Resources Information Center

    St. John, Edward P.

    1985-01-01

    To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)

  6. Multi Attribute Decision Analysis in Public Health - Analyzing Effectiveness of Alternate Modes of Dispensing

    DTIC Science & Technology

    2007-09-01

    curve (Smith, 2007). This curve shows the relative performance of an option based on the selected factors (Chan & Mauborgne, 2007). Value curves ...that they and their families are safe; anything less will result in staffing shortages and absenteeism. d. POD Staff Training POD volunteers would...can expect high rates of absenteeism. Local law enforcement in LAC has therefore not guaranteed one-on-one protection for the 3,750 postal carriers

  7. Identification of Low-Latency Obfuscated Traffic Using Multi-Attribute Analysis

    DTIC Science & Technology

    2017-03-01

    the distribution of common Tor packet sizes. Herrmann et al. also contend that the remaining variations in observed packet sizes are caused by OS...specific fragmentation and that Tor’s variation in packet size provides an additional level of protection as the false positive rate (FPR) using packet...three pre-filter variations, the observed FPR for non-Tor ranged from 94.4 percent to 7.2 percent, and the observed FNR for Tor ranged from 61.3

  8. Application of multi attribute failure mode analysis of milk production using analytical hierarchy process method

    NASA Astrophysics Data System (ADS)

    Rucitra, A. L.

    2018-03-01

    Pusat Koperasi Induk Susu (PKIS) Sekar Tanjung, East Java is one of the modern dairy industries producing Ultra High Temperature (UHT) milk. A problem that often occurs in the production process at PKIS Sekar Tanjung is a mismatch between the production process and the predetermined standard. The purpose of applying the Analytical Hierarchy Process (AHP) was to identify the most likely cause of failure in the milk production process. The Multi Attribute Failure Mode Analysis (MAFMA) method was used to eliminate or reduce the possibility of failure in view of its causes. This method integrates the severity, occurrence, detection, and expected cost criteria, obtained from an in-depth interview with the head of the production department as an expert. The AHP approach was used to formulate the priority ranking of the causes of failure in the milk production process. At level 1, severity has the highest weight, 0.41 or 41%, compared with the other criteria. At level 2, identifying failure in the UHT milk production process, the most likely cause was an average mixing temperature of more than 70 °C, higher than the standard temperature (≤70 °C). This failure cause carries a weight of 0.47, or 47%, across all criteria. Therefore, this study suggested that the company control the mixing temperature to minimise or eliminate failure in this process.
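
    In AHP the criteria weights are usually obtained from the principal eigenvector of a pairwise comparison matrix. The sketch below uses an invented comparison matrix chosen so that severity receives the largest weight, in the spirit of the result reported above; it is not the study's actual judgement data.

    ```python
    # AHP sketch: criteria weights from the principal eigenvector of a pairwise comparison
    # matrix (matrix values are hypothetical).
    import numpy as np

    # criterion order: severity, occurrence, detection, expected cost
    A = np.array([
        [1.0, 2.0, 3.0, 2.0],
        [1/2, 1.0, 2.0, 1.0],
        [1/3, 1/2, 1.0, 1.0],
        [1/2, 1.0, 1.0, 1.0],
    ])
    eigvals, eigvecs = np.linalg.eig(A)
    w = np.abs(eigvecs[:, eigvals.real.argmax()].real)
    weights = w / w.sum()
    print(dict(zip(["severity", "occurrence", "detection", "expected_cost"], weights.round(2))))
    ```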

  9. Medical decision-making and the patient: understanding preference patterns for growth hormone therapy using conjoint analysis.

    PubMed

    Singh, J; Cuttler, L; Shin, M; Silvers, J B; Neuhauser, D

    1998-08-01

    This study examines two questions that relate to patients' role in medical decision making: (1) Do patients utilize multiple attributes in evaluating different treatment options?, and (2) Do patient treatment preferences evidence heterogeneity and disparate patterns? Although research has examined these questions by using either individual- or aggregate-level approaches, the authors demonstrate an intermediate level approach (ie, relating to patient subgroups). The authors utilize growth augmentation therapy (GAT) as a context for analyzing these questions because GAT reflects a class of nonemergency treatments that (1) are based on genetic technology, (2) aim to improve the quality (rather than quantity) of life, and (3) offer useful insights for the patient's role in medical decision making. Using conjoint analysis, a methodology especially suited for the study of patient-consumer preferences but largely unexplored in the medical field, data were obtained from 154 parents for their decision to pursue GAT for their child. In all, six attributes were utilized to study GAT, including risk of long-term side effects (1:10,000 or 1:100,000), certainty of effect (50% or 100% of cases), amount of effect (1-2 inches or 4-5 inches in adult height), out-of-pocket cost ($100, $2,000, or $10,000/year) and child's attitude (likes or not likes therapy). An experimental design using conjoint analysis procedures revealed five preference patterns that reflect clear disparities in the importance that parents attach to the different attributes of growth therapy. These preference patterns are (1) child-focused (23%), (2) risk-conscious (36%), (3) balanced (23%), (4) cost-conscious (14%), and (5) ease-of-use (4%) oriented. Additional tests provided evidence for the validity of these preference patterns. Finally, this preference heterogeneity related systematically to parental characteristics (eg, demographic, psychologic). The study results offer additional insights into medical decision making with the consumer as the focal point and extend previous work that has tended to emphasize either an individual- or aggregate-based analysis. Implications for researchers and health care delivery in general and growth hormone management in particular are provided.
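
    Conjoint part-worth utilities of the kind estimated here can be recovered, in the simplest rating-based variant, by dummy-coded least squares over the attribute levels. The sketch below uses simulated ratings and simplified two-level codings of the study's attributes; it is illustrative, not the authors' estimation procedure.

    ```python
    # Rating-based conjoint sketch: recover part-worths by dummy-coded least squares
    # (profiles and ratings are simulated, not study data).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    profiles = pd.DataFrame({
        "effect_4_5_inch": rng.integers(0, 2, 60),   # 1 = 4-5 inch gain, 0 = 1-2 inch
        "certainty_100":   rng.integers(0, 2, 60),   # 1 = effect in 100% of cases
        "low_risk":        rng.integers(0, 2, 60),   # 1 = 1:100,000 side-effect risk
        "cost_10k":        rng.integers(0, 2, 60),   # 1 = $10,000/year out of pocket
    })
    rating = (3 + 1.5 * profiles["effect_4_5_inch"] + 1.0 * profiles["certainty_100"]
              + 0.8 * profiles["low_risk"] - 2.0 * profiles["cost_10k"]
              + rng.normal(0, 0.5, 60))

    X = np.column_stack([np.ones(len(profiles)), profiles.values])
    part_worths, *_ = np.linalg.lstsq(X, rating.values, rcond=None)
    print(dict(zip(["intercept", *profiles.columns], part_worths.round(2))))
    ```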

  10. Design optimization of a prescribed vibration system using conjoint value analysis

    NASA Astrophysics Data System (ADS)

    Malinga, Bongani; Buckner, Gregory D.

    2016-12-01

    This article details a novel design optimization strategy for a prescribed vibration system (PVS) used to mechanically filter solids from fluids in oil and gas drilling operations. A dynamic model of the PVS is developed, and the effects of disturbance torques are detailed. This model is used to predict the effects of design parameters on system performance and efficiency, as quantified by system attributes. Conjoint value analysis, a statistical technique commonly used in marketing science, is utilized to incorporate designer preferences. This approach effectively quantifies and optimizes preference-based trade-offs in the design process. The effects of designer preferences on system performance and efficiency are simulated. This novel optimization strategy yields improvements in all system attributes across all simulated vibration profiles, and is applicable to other industrial electromechanical systems.

  11. Whole-Genome Sequencing in Microbial Forensic Analysis of Gamma-Irradiated Microbial Materials.

    PubMed

    Broomall, Stacey M; Ait Ichou, Mohamed; Krepps, Michael D; Johnsky, Lauren A; Karavis, Mark A; Hubbard, Kyle S; Insalaco, Joseph M; Betters, Janet L; Redmond, Brady W; Rivers, Bryan A; Liem, Alvin T; Hill, Jessica M; Fochler, Edward T; Roth, Pierce A; Rosenzweig, C Nicole; Skowronski, Evan W; Gibbons, Henry S

    2016-01-15

    Effective microbial forensic analysis of materials used in a potential biological attack requires robust methods of morphological and genetic characterization of the attack materials in order to enable the attribution of the materials to potential sources and to exclude other potential sources. The genetic homogeneity and potential intersample variability of many of the category A to C bioterrorism agents offer a particular challenge to the generation of attributive signatures, potentially requiring whole-genome or proteomic approaches to be utilized. Currently, irradiation of mail is standard practice at several government facilities judged to be at particularly high risk. Thus, initial forensic signatures would need to be recovered from inactivated (nonviable) material. In the study described in this report, we determined the effects of high-dose gamma irradiation on forensic markers of bacterial biothreat agent surrogate organisms with a particular emphasis on the suitability of genomic DNA (gDNA) recovered from such sources as a template for whole-genome analysis. While irradiation of spores and vegetative cells affected the retention of Gram and spore stains and sheared gDNA into small fragments, we found that irradiated material could be utilized to generate accurate whole-genome sequence data on the Illumina and Roche 454 sequencing platforms. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  12. Which product characteristics are preferred by Chinese consumers when choosing pork? A conjoint analysis on perceived quality of selected pork attributes.

    PubMed

    Ma, Xiu Q; Verkuil, Julia M; Reinbach, Helene C; Meinert, Lene

    2017-05-01

    Due to the economic growth achieved by China over the past 20 years, Chinese consumers have changed their purchasing behavior regarding meat. Instead of buying locally produced pork, they are increasingly willing to purchase imported pork. A conjoint analysis investigated how intrinsic pork attributes (fat content and processing) and extrinsic pork attributes (origin, price, and packaging) relate to the perceived quality of pork and the choices made by Chinese consumers. A questionnaire distributed among a sample of Chinese consumers (n = 81) revealed that processing (fresh/frozen) is the most important determinant of pork choice (36%), followed by fat content (27%), origin (18%), price (12%), and packaging (6.6%). Estimates of utility showed that Chinese consumers value fresh pork highly (0.147), followed by lean pork (0.111) and pork imported from countries other than China (0.073). The findings indicate that Chinese consumers value both intrinsic and extrinsic attributes, and these results may help the meat industry succeed in China's competitive meat market by developing new products tailored to the needs of the consumer.

  13. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data

    PubMed Central

    Tian, Ting; McLachlan, Geoffrey J.; Dieters, Mark J.; Basford, Kaye E.

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations for these data arrays, and developed a novel approach based on hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned the real value from the nearest neighbour to the record with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared based on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher estimation accuracy than those using non-Bayesian analysis, but they were more time-consuming. However, the novel approach of multiple agglomerative hierarchical clustering demonstrated the best overall performance. PMID:26689369
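
    The nearest-neighbour idea behind the clustering-based approach can be sketched as follows: repeat the imputation several times, each time measuring similarity on a randomly selected subset of attributes and copying the missing cell from the closest fully observed record. The data and details below are invented and simplified relative to the authors' method.

    ```python
    # Simplified nearest-neighbour multiple-imputation sketch on random attribute subsets
    # (synthetic data; the published method additionally uses agglomerative clustering).
    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(50, 6))             # flattened MET-style data matrix
    X[7, 2] = np.nan                         # one missing observation

    def impute_once(X, row, col, n_attrs=3):
        observed = [c for c in range(X.shape[1]) if c != col]
        subset = rng.choice(observed, size=n_attrs, replace=False)
        complete_rows = np.where(~np.isnan(X[:, col]))[0]
        dists = np.linalg.norm(X[complete_rows][:, subset] - X[row, subset], axis=1)
        return X[complete_rows[dists.argmin()], col]

    imputations = [impute_once(X, 7, 2) for _ in range(5)]   # five imputed values to pool
    print(np.round(imputations, 3))
    ```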

  14. Application of Multiple Imputation for Missing Values in Three-Way Three-Mode Multi-Environment Trial Data.

    PubMed

    Tian, Ting; McLachlan, Geoffrey J; Dieters, Mark J; Basford, Kaye E

    2015-01-01

    It is a common occurrence in plant breeding programs to observe missing values in three-way three-mode multi-environment trial (MET) data. We proposed modifications of models for estimating missing observations for these data arrays, and developed a novel approach based on hierarchical clustering. Multiple imputation (MI) was used in four ways: multiple agglomerative hierarchical clustering, a normal distribution model, a normal regression model, and predictive mean matching. The latter three models used both Bayesian and non-Bayesian analysis, while the first approach used a clustering procedure with randomly selected attributes and assigned the real value from the nearest neighbour to the record with missing observations. Different proportions of data entries in six complete datasets were randomly selected to be missing, and the MI methods were compared based on the efficiency and accuracy of estimating those values. The results indicated that the models using Bayesian analysis had slightly higher estimation accuracy than those using non-Bayesian analysis, but they were more time-consuming. However, the novel approach of multiple agglomerative hierarchical clustering demonstrated the best overall performance.

  15. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    PubMed

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
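
    Generative topographic mapping is not available in common scientific Python packages, so the sketch below substitutes PCA purely to illustrate the general idea of projecting 14-analyte records to two dimensions and flagging outlying records; it is a stand-in, not the authors' GTM implementation.

    ```python
    # Stand-in 2-D projection sketch (PCA instead of GTM) for spotting anomalous lab panels.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    panels = rng.normal(size=(1000, 14))                  # synthetic 14-analyte patient records
    panels[0] = panels[0] + 8.0                           # one grossly anomalous record

    coords = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(panels))
    dist = np.linalg.norm(coords - coords.mean(axis=0), axis=1)
    print("most anomalous record index:", dist.argmax())  # flags record 0 for review
    ```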

  16. Estimated Costs of Sporadic Gastrointestinal Illness Associated with Surface Water Recreation: A Combined Analysis of Data from NEEAR and CHEERS Studies

    PubMed Central

    DeFlorio-Barker, Stephanie; Wade, Timothy J.; Jones, Rachael M.; Friedman, Lee S.; Wing, Coady; Dorevitch, Samuel

    2016-01-01

    Background: The burden of illness can be described by addressing both incidence and illness severity attributable to water recreation. Monetized as cost, attributable disease burden estimates can be useful for environmental management decisions. Objectives: We characterize the disease burden attributable to water recreation using data from two cohort studies using a cost of illness (COI) approach and estimate the largest drivers of the disease burden of water recreation. Methods: Data from the NEEAR study, which evaluated swimming and wading in marine and freshwater beaches in six U.S. states, and CHEERS, which evaluated illness after incidental-contact recreation (boating, canoeing, fishing, kayaking, and rowing) on waterways in the Chicago area, were used to estimate the cost per case of gastrointestinal illness and costs attributable to water recreation. Data on health care and medication utilization and missed days of work or leisure were collected and combined with cost data to construct measures of COI. Results: Depending on different assumptions, the cost of gastrointestinal symptoms attributable to water recreation are estimated to be $1,220 for incidental-contact recreation (range $338–$1,681) and $1,676 for swimming/wading (range $425–2,743) per 1,000 recreators. Lost productivity is a major driver of the estimated COI, accounting for up to 90% of total costs. Conclusions: Our estimates suggest gastrointestinal illness attributed to surface water recreation at urban waterways, lakes, and coastal marine beaches is responsible for costs that should be accounted for when considering the monetary impact of efforts to improve water quality. The COI provides more information than the frequency of illness, as it takes into account disease incidence, health care utilization, and lost productivity. Use of monetized disease severity information should be included in future studies of water quality and health. Citation: DeFlorio-Barker S, Wade TJ, Jones RM, Friedman LS, Wing C, Dorevitch S. 2017. Estimated costs of sporadic gastrointestinal illness associated with surface water recreation: a combined analysis of data from NEEAR and CHEERS Studies. Environ Health Perspect 125:215–222; http://dx.doi.org/10.1289/EHP130 PMID:27459727

  17. Estimated Costs of Sporadic Gastrointestinal Illness Associated with Surface Water Recreation: A Combined Analysis of Data from NEEAR and CHEERS Studies.

    PubMed

    DeFlorio-Barker, Stephanie; Wade, Timothy J; Jones, Rachael M; Friedman, Lee S; Wing, Coady; Dorevitch, Samuel

    2017-02-01

    The burden of illness can be described by addressing both incidence and illness severity attributable to water recreation. Monetized as cost, attributable disease burden estimates can be useful for environmental management decisions. We characterize the disease burden attributable to water recreation using data from two cohort studies using a cost of illness (COI) approach and estimate the largest drivers of the disease burden of water recreation. Data from the NEEAR study, which evaluated swimming and wading in marine and freshwater beaches in six U.S. states, and CHEERS, which evaluated illness after incidental-contact recreation (boating, canoeing, fishing, kayaking, and rowing) on waterways in the Chicago area, were used to estimate the cost per case of gastrointestinal illness and costs attributable to water recreation. Data on health care and medication utilization and missed days of work or leisure were collected and combined with cost data to construct measures of COI. Depending on different assumptions, the cost of gastrointestinal symptoms attributable to water recreation are estimated to be $1,220 for incidental-contact recreation (range $338-$1,681) and $1,676 for swimming/wading (range $425-2,743) per 1,000 recreators. Lost productivity is a major driver of the estimated COI, accounting for up to 90% of total costs. Our estimates suggest gastrointestinal illness attributed to surface water recreation at urban waterways, lakes, and coastal marine beaches is responsible for costs that should be accounted for when considering the monetary impact of efforts to improve water quality. The COI provides more information than the frequency of illness, as it takes into account disease incidence, health care utilization, and lost productivity. Use of monetized disease severity information should be included in future studies of water quality and health. Citation: DeFlorio-Barker S, Wade TJ, Jones RM, Friedman LS, Wing C, Dorevitch S. 2017. Estimated costs of sporadic gastrointestinal illness associated with surface water recreation: a combined analysis of data from NEEAR and CHEERS Studies. Environ Health Perspect 125:215-222; http://dx.doi.org/10.1289/EHP130.

  18. LAB ANALYSIS OF EMERGENCY WATER SAMPLES CONTAINING UNKNOWN CONTAMINANTS: CONSIDERATIONS FROM THE USEPA RESPONSE PROTOCOL TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  19. Geographic Information System (GIS) Applications at a Multi-Site Community College.

    ERIC Educational Resources Information Center

    Pottle, Laura

    This report presents the Front Range Community College (FRCC) (Colorado) Office of Institutional Research's recent expansion of its data analysis and reporting capabilities to include a geographic information system (GIS). Utilizing ArcView GIS software, the college is better able to visualize institutional and environmental data. They have…

  20. Willingness to pay for improved respiratory and cardiovascular health: a multiple-format, stated-preference approach.

    PubMed

    Johnson, F R; Banzhaf, M R; Desvousges, W H

    2000-06-01

    This study uses stated-preference (SP) analysis to measure willingness to pay (WTP) to reduce acute episodes of respiratory and cardiovascular ill health. The SP survey employs a modified version of the health state descriptions used in the Quality of Well Being (QWB) Index. The four health state attributes are symptom, episode duration, activity restrictions and cost. Preferences are elicited using two different SP formats: graded-pair and discrete-choice. The different formats cause subjects to focus on different evaluation strategies. Combining two elicitation formats yields more valid and robust estimates than using only one approach. Estimates of indirect utility function parameters are obtained using advanced panel econometrics for each format separately and jointly. Socio-economic differences in health preferences are modelled by allowing the marginal utility of money relative to health attributes to vary across respondents. Because the joint model captures the combined preference information provided by both elicitation formats, these model estimates are used to calculate WTP. The results demonstrate the feasibility of estimating meaningful WTP values for policy-relevant respiratory and cardiac symptoms, even from subjects who never have personally experienced these conditions. Furthermore, because WTP estimates are for individual components of health improvements, estimates can be aggregated in various ways depending upon policy needs. Thus, using generic health attributes facilitates transferring WTP estimates for benefit-cost analysis of a variety of potential health interventions. Copyright 2000 John Wiley & Sons, Ltd.
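
    With an indirect utility function that is linear in health attributes and cost, the marginal willingness to pay for a one-unit attribute improvement is the ratio -β_attribute/β_cost. The coefficients in the sketch below are invented solely to show the arithmetic.

    ```python
    # Marginal WTP sketch from a linear indirect utility function (coefficients invented).
    betas = {
        "symptom_relief": 0.90,         # utility gain from avoiding one symptom-day
        "activity_restriction": -0.40,  # disutility of one day of restricted activity
        "cost": -0.002,                 # marginal (dis)utility of one dollar
    }

    wtp_symptom_day = -betas["symptom_relief"] / betas["cost"]
    print(f"WTP to avoid one symptom-day: ${wtp_symptom_day:,.0f}")   # $450 with these numbers
    ```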

  1. A systematic literature review of health state utility values in head and neck cancer.

    PubMed

    Meregaglia, Michela; Cairns, John

    2017-09-02

    Health state utility values (HSUVs) are essential parameters in model-based economic evaluations. This study systematically identifies HSUVs in head and neck cancer and provides guidance for selecting them from a growing body of health-related quality of life studies. We systematically reviewed the published literature by searching PubMed, EMBASE and The Cochrane Library using a pre-defined combination of keywords. The Tufts Cost-Effectiveness Analysis Registry and the School of Health and Related Research Health Utilities Database (ScHARRHUD) specifically containing health utilities were also queried, in addition to the Health Economics Research Centre database of mapping studies. Studies were considered for inclusion if reporting original HSUVs assessed using established techniques. The characteristics of each study including country, design, sample size, cancer subsite addressed and demographics of responders were summarized narratively using a data extraction form. Quality scoring and critical appraisal of the included studies were performed based on published recommendations. Of a total 1048 records identified by the search, 28 studies qualified for data extraction and 346 unique HSUVs were retrieved from them. HSUVs were estimated using direct methods (e.g. standard gamble; n = 10 studies), multi-attribute utility instruments (MAUIs; n = 13) and mapping techniques (n = 3); two studies adopted both direct and indirect approaches. Within the MAUIs, the EuroQol 5-dimension questionnaire (EQ-5D) was the most frequently used (n = 11), followed by the Health Utility Index Mark 3 (HUI3; n = 2), the 15D (n = 2) and the Short Form-Six Dimension (SF-6D; n = 1). Different methods and types of responders (i.e. patients, healthy subjects, clinical experts) influenced the magnitude of HSUVs for comparable health states. Only one mapping study developed an original algorithm using head and neck cancer data. The identified studies were considered of intermediate quality. This review provides a dataset of HSUVs systematically retrieved from published studies in head and neck cancer. There is currently a lack of research for some disease phases including recurrent and metastatic cancer, and treatment-related complications. In selecting HSUVs for cost-effectiveness modeling purposes, preference should be given to EQ-5D utility values; however, mapping to EQ-5D is a potentially valuable technique that should be further developed in this cancer population.

  2. Consumer attitudes and preferences for fresh market tomatoes.

    PubMed

    Oltman, A E; Jervis, S M; Drake, M A

    2014-10-01

    This study established attractive attributes and consumer desires for fresh tomatoes. Three focus groups (n = 28 participants) were conducted to explore how consumers perceived tomatoes, including how they purchased and consumed them. Subsequently, an Adaptive Choice Based Conjoint (ACBC) survey was conducted to understand consumer preferences toward traditional tomatoes. The ACBC survey with Kano questions (n = 1037 consumers in Raleigh, NC) explored the importance of color, firmness, size, skin, texture, interior, seed presence, flavor, and health benefits. The most important tomato attribute was color, then juice when sliced, followed by size, followed by seed presence, which was at parity with firmness. An attractive tomato was red, firm, medium/small sized, crisp, meaty, juicy, flavorful, and with few seeds. Deviations from these features resulted in a tomato that was rejected by consumers. Segmentations of consumers were determined by patterns in utility scores. External attributes were the main drivers of tomato liking, but different groups of tomato consumers exist with distinct preferences for juiciness, firmness, flavor, and health benefits. Conjoint analysis is a research technique that collects a large amount of data from consumers in a format designed to be reflective of a real life market setting and can be combined with qualitative insight from focus groups to gain information on consumer consumption and purchase behaviors. This study established that the most important fresh tomato attributes were color, amount of juice when sliced, and size. Distinct consumer clusters were differentiated by preference for color/appearance, juiciness and firm texture. Tomato growers can utilize the results to target attributes that drive consumer choice for fresh tomatoes. © 2014 Institute of Food Technologists®

  3. Social Utility versus Social Desirability of Students' Attributional Self-Presentation Strategies

    ERIC Educational Resources Information Center

    Matteucci, Maria Cristina

    2014-01-01

    Research on impression management has shown that students can manage their social images by providing attributional self-presentation strategies (ASPSs). Based on the distinction between social desirability judgments and social utility judgments, two studies were conducted to examine the students' understanding of the impact of ASPSs both on…

  4. Multi-layer holographic bifurcative neural network system for real-time adaptive EOS data analysis

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Huang, K. S.; Diep, J.

    1993-01-01

    Optical data processing techniques have the inherent advantages of high data throughput, low weight and low power requirements. These features are particularly desirable for onboard spacecraft in-situ real-time data analysis and data compression applications. The proposed multi-layer optical holographic neural net pattern recognition technique will utilize nonlinear photorefractive devices for real-time adaptive learning to classify input data content and recognize unexpected features. Information can be stored either in analog or digital form in a nonlinear photorefractive device. The recording can be accomplished on time scales ranging from milliseconds to microseconds. When a system consisting of these devices is organized in a multi-layer structure, a feedforward neural net with bifurcating data classification capability is formed. The interdisciplinary research will involve collaboration with top digital computer architecture experts at the University of Southern California.

  5. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    NASA Astrophysics Data System (ADS)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

    In view of the fact that traditional change detection algorithms depend mainly on the spectral information of image objects and fail to effectively mine and fuse the complementary strengths of multiple image features, this article borrows ideas from object-oriented analysis and proposes a multi-feature fusion change detection algorithm for remote sensing images. First, image objects are obtained by multi-scale segmentation; then a colour histogram and a linear-gradient (edge) histogram are calculated for each object. The earth mover's distance (EMD) statistical operator is used to measure the colour-feature distance and the edge-line-feature distance between corresponding objects from different dates, and the two distances are combined with adaptive weighting to construct the object heterogeneity. Finally, change detection results are derived by analysing the resulting heterogeneity (curvature histogram) values. The experimental results show that the method can fully fuse the colour and edge-line features, thus improving the accuracy of change detection.
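
    The colour-feature distance described above can be illustrated with the earth mover's (Wasserstein) distance between an object's colour histograms at two dates; the histograms below are invented, and the 1-D scipy implementation is used as a stand-in for a full multi-dimensional EMD.

    ```python
    # EMD sketch: Wasserstein distance between two colour histograms of the same image
    # object at two dates (histogram values are invented).
    import numpy as np
    from scipy.stats import wasserstein_distance

    bin_centers = np.arange(0, 256, 16)    # 16 grey/colour bins
    hist_t1 = np.array([5, 9, 20, 40, 60, 45, 25, 12, 6, 3, 2, 1, 1, 0, 0, 0], dtype=float)
    hist_t2 = np.array([1, 2, 6, 15, 30, 50, 55, 35, 20, 10, 5, 3, 2, 1, 0, 0], dtype=float)

    emd = wasserstein_distance(bin_centers, bin_centers, hist_t1, hist_t2)
    print(f"colour-feature distance (EMD): {emd:.1f}")
    ```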

  6. Applications of multi-frequency single beam sonar fisheries analysis methods for seep quantification and characterization

    NASA Astrophysics Data System (ADS)

    Price, V.; Weber, T.; Jerram, K.; Doucet, M.

    2016-12-01

    The analysis of multi-frequency, narrow-band single-beam acoustic data for fisheries applications has long been established, with methodology focusing on characterizing targets in the water column by utilizing complex algorithms and false-color time series data to create and compare frequency response curves for dissimilar biological groups. These methods were built on concepts developed for multi-frequency analysis of satellite imagery in terrestrial applications and have been applied to a broad range of data types. Single-beam systems operating at multiple frequencies are also used for the detection and identification of seeps in water column data. Here we incorporate the same analysis and visualization techniques used for fisheries applications to attempt to characterize and quantify seeps by creating and comparing frequency response curves and applying false coloration to shallow and deep multi-channel seep data. From this information, we can establish methods to differentiate bubble size in the echogram and to differentiate seep composition. These techniques are also useful in differentiating plume content from biological noise (volume reverberation) created by euphausiid layers and fish with gas-filled swim bladders. Combining the multiple frequencies using false coloring and other image analysis techniques, after applying established normalization and beam pattern correction algorithms, is a novel approach to quantitatively describing seeps. Further, this information could be paired with geological models, backscatter, and bathymetry data to assess seep distribution.

  7. Alignment of Assessment Objectives with Instructional Objectives Using Revised Bloom's Taxonomy--The Case for Food Science and Technology Education

    ERIC Educational Resources Information Center

    Jideani, V. A.; Jideani, I. A.

    2012-01-01

    Nine food science and technology (FST) subjects were assessed for alignment between the learning outcomes and assessment using revised Bloom's taxonomy (RBT) of cognitive knowledge. Conjoint analysis was used to estimate the utilities of the levels of cognitive, knowledge, and the attribute importance (cognitive process and knowledge dimension)…

  8. An Analysis of Attitudes toward the Vietnam War Displayed in Children's and Young Adult Literature Published after 1982.

    ERIC Educational Resources Information Center

    Verhoek, Nancy A.

    A study involved the creation of a 20-variable checklist for children's and young adult war literature, to be utilized as a data-recording instrument for 24 examples of literature. The checklist components were based upon a combination of cognitive and affective attributes assisting in the formulation of attitudes toward war displayed by…

  9. Interactive multi-mode blade impact analysis

    NASA Technical Reports Server (NTRS)

    Alexander, A.; Cornell, R. W.

    1978-01-01

    The theoretical methodology used in developing an analysis for the response of turbine engine fan blades subjected to soft-body (bird) impacts is reported, and the computer program developed using this methodology as its basis is described. This computer program is an outgrowth of two programs that were previously developed for the purpose of studying problems of a similar nature (a 3-mode beam impact analysis and a multi-mode beam impact analysis). The present program utilizes an improved missile model that is interactively coupled with blade motion which is more consistent with actual observations. It takes into account local deformation at the impact area, blade camber effects, and the spreading of the impacted missile mass on the blade surface. In addition, it accommodates plate-type mode shapes. The analysis capability in this computer program represents a significant improvement in the development of the methodology for evaluating potential fan blade materials and designs with regard to foreign object impact resistance.

  10. Content analysis of resident evaluations of faculty anesthesiologists: supervision encompasses some attributes of the professionalism core competency.

    PubMed

    Dexter, Franklin; Szeluga, Debra; Hindman, Bradley J

    2017-05-01

    Anesthesiology departments need an instrument with which to assess practicing anesthesiologists' professionalism. The purpose of this retrospective analysis of the content of a cohort of resident evaluations of faculty anesthesiologists was to investigate the relationship between a clinical supervision scale and the multiple attributes of professionalism. From July 1, 2013 to the present, our department has utilized the de Oliveira Filho unidimensional nine-item supervision scale to assess the quality of clinical supervision of residents provided by our anesthesiologists. The "cohort" we examined included all 13,664 resident evaluations of all faculty anesthesiologists from July 1, 2013 through December 31, 2015, including 1,387 accompanying comments. Words and phrases associated with the core competency of professionalism were obtained from previous studies, and the supervision scale was analyzed for the presence of these words and phrases. The supervision scale assesses some attributes of anesthesiologists' professionalism as well as patient care and procedural skills and interpersonal and communication skills. The comments that residents provided with the below-average supervision scores included attributes of professionalism, although numerous words and phrases related to professionalism were not present in any of the residents' comments. The de Oliveira Filho clinical supervision scale includes some attributes of anesthesiologists' professionalism. The core competency of professionalism, however, is multidimensional, and the supervision scale and/or residents' comments did not address many of the other established attributes of professionalism.

  11. [Comparative Study of Patient Identifications for Conventional and Portable Chest Radiographs Utilizing ROC Analysis].

    PubMed

    Kawashima, Hiroki; Hayashi, Norio; Ohno, Naoki; Matsuura, Yukihiro; Sanada, Shigeru

    2015-08-01

    To evaluate the patient identification ability of radiographers, previous and current chest radiographs were assessed in an observer study utilizing receiver operating characteristic (ROC) analysis. This study included portable and conventional chest radiograph pairs from 43 same-patient and 43 different-patient cases. The dataset used in this study was divided into the following three groups: (1) a pair of portable radiographs, (2) a pair of conventional radiographs, and (3) a combination of each type of radiograph. Seven observers participated in this ROC study, which aimed to identify same or different patients using these datasets. ROC analysis was conducted to calculate the average area under the ROC curve obtained by each observer (AUCave), and a statistical test was performed using the multi-reader multi-case method. Comparable results were obtained with pairs of portable (AUCave: 0.949) and conventional radiographs (AUCave: 0.951). In comparisons within the same modality, there were no significant differences. In contrast, the ability to identify patients by comparing a portable and a conventional radiograph (AUCave: 0.873) was lower than with the matching datasets (p=0.002 and p=0.004, respectively). In conclusion, the use of different imaging modalities reduces radiographers' ability to identify their patients.
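
    The AUC values reported above summarise how well observers' same-patient judgements separate true matches from non-matches. A minimal sketch with simulated confidence ratings (not the study data) shows the computation.

    ```python
    # ROC AUC sketch from simulated observer confidence ratings for 86 radiograph pairs.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    same_patient = rng.integers(0, 2, 86)                        # ground truth per pair
    confidence = np.where(same_patient == 1,
                          rng.normal(0.8, 0.15, 86),             # higher ratings for true matches
                          rng.normal(0.4, 0.15, 86)).clip(0, 1)

    print(f"AUC = {roc_auc_score(same_patient, confidence):.3f}")
    ```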

  12. Using Multicriteria Analysis in Issues Concerning Adaptation of Historic Facilities for the Needs of Public Utility Buildings with a Function of a Theatre

    NASA Astrophysics Data System (ADS)

    Obracaj, Piotr; Fabianowski, Dariusz

    2017-10-01

    Adapting historic facilities for public utility buildings requires solving many complex, often conflicting expectations of future users. This mainly concerns the function, which includes construction, technology and aesthetic issues; the list of issues is completed by the proper protection of historic values, which differ in each case. The procedure leading to the expected solution is a multicriteria one, usually difficult to define accurately and requiring considerable designer experience. An innovative approach was used for the analysis, namely the modified EA FAHP (Extent Analysis Fuzzy Analytic Hierarchy Process) method of Chang for the multicriteria assessment of complex functional and spatial issues. The selection of the optimal spatial form of an adapted historic building intended for a multi-functional public utility facility was analysed. The assumed functional flexibility covered education, conferences, and chamber performances, such as drama and concerts, in different stage-audience layouts.

  13. Deep multi-scale convolutional neural network for hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Zhang, Feng-zhe; Yang, Xia

    2018-04-01

    In this paper, we propose a multi-scale convolutional neural network for the hyperspectral image classification task. Firstly, compared with conventional convolution, we utilize multi-scale convolutions, which possess larger receptive fields, to extract spectral features of the hyperspectral image. We design a deep neural network with a multi-scale convolution layer that contains 3 different convolution kernel sizes. Secondly, to avoid overfitting of the deep neural network, dropout is utilized, which randomly deactivates neurons and yields a modest improvement in classification accuracy. In addition, techniques from deep learning such as the ReLU activation are also utilized. We conduct experiments on the University of Pavia and Salinas datasets and obtain better classification accuracy compared with other methods.
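
    As a rough illustration of the idea described above, the sketch below builds a multi-scale convolution block with three kernel sizes followed by dropout. It assumes PyTorch; the kernel sizes, channel counts, and the random input standing in for pixel spectra are illustrative choices, not the paper's exact architecture.

    import torch
    import torch.nn as nn

    class MultiScaleBlock(nn.Module):
        def __init__(self, in_ch, out_ch):
            super().__init__()
            # Three parallel convolutions with different kernel sizes (3, 5, 7)
            # give three receptive-field scales over the spectral dimension.
            self.branches = nn.ModuleList([
                nn.Conv1d(in_ch, out_ch, kernel_size=k, padding=k // 2)
                for k in (3, 5, 7)
            ])
            self.act = nn.ReLU()
            self.drop = nn.Dropout(p=0.5)  # randomly zeroes activations to curb overfitting

        def forward(self, x):
            # Concatenate the three scale-specific feature maps along the channel axis.
            feats = [self.act(branch(x)) for branch in self.branches]
            return self.drop(torch.cat(feats, dim=1))

    # Example: a batch of 16 pixels, each a 1-channel spectrum of 200 bands.
    block = MultiScaleBlock(in_ch=1, out_ch=8)
    out = block(torch.randn(16, 1, 200))   # -> shape (16, 24, 200)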

  14. Security of statistical data bases: invasion of privacy through attribute correlational modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palley, M.A.

    This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical data base. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical data base represents real world statistical phenomena. As such, ACM assumes correlational behavior among the database attributes. ACM proceeds to compromise confidential information through creation of a regression model, where the confidential attribute is treated as the dependent variable. The typical statistical data base may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic data base, created through legitimate queries of the actual data base, and through proportional random variation of responses to these queries. The synthetic data base is constructed to resemble the actual data base as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic data base, and utilizes the derived model to estimate confidential information in the actual database.
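
    A minimal sketch of the regression step described above, assuming scikit-learn and NumPy; the synthetic rows, attribute counts, and coefficients are invented for illustration and do not reproduce the study's query-based construction of the synthetic data base.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Pretend these rows were assembled from legitimate aggregate queries and then
    # proportionally perturbed to resemble the real data base statistically.
    synthetic_X = rng.normal(size=(500, 3))                                  # non-confidential attributes
    true_coef = np.array([2.0, -1.0, 0.5])
    synthetic_y = synthetic_X @ true_coef + rng.normal(scale=0.1, size=500)  # confidential attribute

    # Fit the regression on the synthetic data base ...
    model = LinearRegression().fit(synthetic_X, synthetic_y)

    # ... then use it to estimate the confidential attribute of a target record
    # whose non-confidential attributes are known.
    target_record = np.array([[1.2, 0.3, -0.7]])
    print("estimated confidential value:", model.predict(target_record)[0])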

  15. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    NASA Astrophysics Data System (ADS)

    Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.

    2015-07-01

    In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10^6 points on a desktop computer, allowing each fluorescent particle in a dataset to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient dataset. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98.0 and 98.1 % of the data points, respectively. The best performing methods were applied to the BEACHON-RoMBAS ambient dataset, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution caused by poor centroid definition and from the failure to assign particles to a cluster as a result of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle, improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
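
    The following sketch shows the best-performing combination reported above (Ward linkage with z-score normalisation) on synthetic two-population data, assuming SciPy and NumPy; the five columns standing in for WIBS-style size, asymmetry and fluorescence channels are illustrative.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    rng = np.random.default_rng(1)
    # Two synthetic particle populations, e.g. "fungal-spore-like" and "bacteria-like".
    pop_a = rng.normal(loc=[3.0, 5.0, 800, 200, 100], scale=[0.3, 0.5, 50, 20, 10], size=(300, 5))
    pop_b = rng.normal(loc=[1.0, 2.0, 150, 400, 300], scale=[0.2, 0.3, 20, 40, 30], size=(300, 5))
    data = np.vstack([pop_a, pop_b])

    # z-score normalisation so no single parameter dominates the Euclidean distance.
    normed = zscore(data, axis=0)

    # Ward linkage over every particle (no subsampling), then cut the tree into 2 clusters.
    Z = linkage(normed, method="ward")
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(np.bincount(labels)[1:])   # cluster sizes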

  16. Analysis of responses to the implementation of nuclear installations safety culture using AHP-TOPSIS

    NASA Astrophysics Data System (ADS)

    Situmorang, J.; Kuntoro, I.; Santoso, S.; Subekti, M.; Sunaryo, G. R.

    2018-02-01

    An analysis of responses to the implementation of nuclear installations safety culture has been done using AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution). Safety culture is considered the collective commitment of the decision-making level, the management level, and the individual level; each level thus provides a subjective perspective that serves as an alternative in the analysis. Furthermore, safety culture is described by five characteristics which, in their more detailed form, consist of 37 attributes, and can therefore be expressed as a multi-attribute state. These characteristics and/or attributes serve as criteria whose values are difficult to determine, and they of course determine and strongly influence the implementation of the corresponding safety culture. The pattern and magnitude of this influence are determined using TOPSIS, which is based on a decision matrix composed of alternatives and criteria. The weight of each criterion is determined by the AHP technique. The data were collected through questionnaires at the 2015 workshop on safety and health. A reliability test of the data gives a Cronbach's alpha of 95.5%, which according to the criteria is considered reliable. A validity test using bivariate correlation analysis between attributes shows that the Pearson correlations for all attributes are significant at the 0.01 level. Confirmatory factor analysis gives a Kaiser-Meyer-Olkin measure of sampling adequacy (KMO) of 0.719, greater than the acceptance criterion of 0.5, with a significance level of 0.000, much smaller than 0.05, indicating that further analysis could be performed. As a result of the analysis, it is found that responses from the decision-making level (second echelon) dominate the best preference order and represent the best solution for strengthening nuclear installation safety culture, except for the first characteristic, "safety is a clearly recognized value". The preference order is obtained sequentially according to the levels of policy maker, management, and individual staff.
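
    A compact sketch of the TOPSIS step with externally supplied (e.g. AHP-derived) criterion weights, assuming NumPy; the decision matrix, the weights, and the treatment of all criteria as benefit-type are illustrative assumptions rather than the study's actual data.

    import numpy as np

    # Rows: alternatives (e.g. decision-making, management, individual level);
    # columns: criteria (e.g. a few safety-culture attributes), all benefit-type here.
    X = np.array([[7.0, 8.0, 6.0],
                  [6.0, 7.0, 7.5],
                  [5.5, 6.0, 8.0]])
    w = np.array([0.5, 0.3, 0.2])            # AHP weights, assumed already computed

    # 1) vector-normalise and weight the decision matrix
    V = w * X / np.linalg.norm(X, axis=0)
    # 2) ideal and anti-ideal solutions (all criteria treated as benefits)
    ideal, anti = V.max(axis=0), V.min(axis=0)
    # 3) closeness coefficient to the ideal solution
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)
    print(np.argsort(-closeness))            # best-to-worst preference order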

  17. Using qualitative methods for attribute development for discrete choice experiments: issues and recommendations.

    PubMed

    Coast, Joanna; Al-Janabi, Hareth; Sutton, Eileen J; Horrocks, Susan A; Vosper, A Jane; Swancutt, Dawn R; Flynn, Terry N

    2012-06-01

    Attribute generation for discrete choice experiments (DCEs) is often poorly reported, and it is unclear whether this element of research is conducted rigorously. This paper explores issues associated with developing attributes for DCEs and contrasts different qualitative approaches. The paper draws on eight studies: four developed attributes for measures, and four developed attributes for more ad hoc policy questions. Issues that have become apparent through these studies include the following: the theoretical framework of random utility theory and the need for attributes that are neither too close to the latent construct nor too intrinsic to people's personality; the need to think about attribute development as a two-stage process involving conceptual development followed by refinement of language to convey the intended meaning; and the difficulty in resolving tensions inherent in the reductiveness of condensing complex and nuanced qualitative findings into precise terms. The comparison of alternative qualitative approaches suggests that the nature of data collection will depend both on the characteristics of the question (its sensitivity, for example) and on the availability of existing qualitative information. An iterative, constant comparative approach to analysis is recommended. Finally, the paper provides a series of recommendations for improving the reporting of this element of DCE studies. Copyright © 2011 John Wiley & Sons, Ltd.

  18. An empirical analysis of Moscovitch's reconceptualised model of social anxiety: How is it different from fear of negative evaluation?

    PubMed

    Kizilcik, Isilay N; Gregory, Bree; Baillie, Andrew J; Crome, Erica

    2016-01-01

    Cognitive-behavioural models propose that excessive fear of negative evaluation is central to social anxiety. Moscovitch (2009) instead proposes that perceived deficiencies in three self-attributes (fear of showing signs of anxiety, deficits in physical appearance, and deficits in social competence) are at the core of social anxiety. However, these attributes are likely to overlap with fear of negative evaluation. Responses to an online survey of 286 participants with a range of social anxiety severity were analysed using hierarchical multiple regression to identify the overall unique predictive value of Moscovitch's model. Altogether, Moscovitch's model provided improvements in the prediction of safety behaviours, types of fears and cognitions; however, only the fear of showing anxiety subscale provided unique information. This research supports further investigation into the utility of this revised model, particularly the utility of explicitly assessing and addressing fears of showing anxiety. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Data Visualization of Item-Total Correlation by Median Smoothing

    ERIC Educational Resources Information Center

    Yu, Chong Ho; Douglas, Samantha; Lee, Anna; An, Min

    2016-01-01

    This paper aims to illustrate how data visualization could be utilized to identify errors prior to modeling, using an example with multi-dimensional item response theory (MIRT). MIRT combines item response theory and factor analysis to identify a psychometric model that investigates two or more latent traits. While it may seem convenient to…

  20. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    NASA Technical Reports Server (NTRS)

    Abdi, Frank

    1996-01-01

    A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) Utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composites affordable; (2) Computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) Implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) Creation of the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed workstation types of computers; and (5) Market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.

  1. Multi-Scale and Object-Oriented Analysis for Mountain Terrain Segmentation and Geomorphological Assessment

    NASA Astrophysics Data System (ADS)

    Marston, B. K.; Bishop, M. P.; Shroder, J. F.

    2009-12-01

    Digital terrain analysis of mountain topography is widely utilized for mapping landforms, assessing the role of surface processes in landscape evolution, and estimating the spatial variation of erosion. Numerous geomorphometry techniques exist to characterize terrain surface parameters, although their utility for characterizing the spatial hierarchical structure of the topography and permitting an assessment of the erosion/tectonic impact on the landscape is very limited due to scale and data integration issues. To address this problem, we apply scale-dependent geomorphometric and object-oriented analyses to characterize the hierarchical spatial structure of mountain topography. Specifically, we utilized a high resolution digital elevation model to characterize complex topography in the Shimshal Valley in the Western Himalaya of Pakistan. To accomplish this, we generate terrain objects (geomorphological features and landforms) including valley floors and walls, drainage basins, drainage networks, ridge networks, slope facets, and elemental forms based upon curvature. Object-oriented analysis was used to characterize object properties accounting for object size, shape, and morphometry. The spatial overlay and integration of terrain objects at various scales defines the nature of the hierarchical organization. Our results indicate that variations in the spatial complexity of the terrain hierarchical organization are related to the spatio-temporal influence of surface processes and landscape evolution dynamics. Terrain segmentation and the integration of multi-scale terrain information permit further assessment of process domains and erosion, tectonic impact potential, and natural hazard potential. We demonstrate this with landform mapping and geomorphological assessment examples.

  2. Application of Analytic Hierarchy Process (AHP) in the analysis of the fuel efficiency in the automobile industry with the utilization of Natural Fiber Polymer Composites (NFPC)

    NASA Astrophysics Data System (ADS)

    Jayamani, E.; Perera, D. S.; Soon, K. H.; Bakri, M. K. B.

    2017-04-01

    A systematic method of material analysis aiming at fuel efficiency improvement through the utilization of natural fiber reinforced polymer matrix composites in the automobile industry is proposed. A multi-factor decision criterion based on the Analytic Hierarchy Process (AHP) was used and executed through MATLAB to achieve improved fuel efficiency through the weight reduction of vehicular components by effective comparison between two engine hood designs. The reduction was simulated by utilizing natural fiber polymer composites with thermoplastic polypropylene (PP) as the matrix polymer and benchmarked against a synthetic-based composite component. Results showed that PP with 35% flax fiber loading achieved a 0.4% improvement in fuel efficiency, the highest among the 27 candidate fibers.
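
    For context, a minimal sketch of deriving AHP criterion weights from a Saaty-style pairwise comparison matrix via the principal eigenvector, assuming NumPy; the 3x3 comparison values and the criterion names in the comments are invented for illustration.

    import numpy as np

    # Reciprocal comparison matrix for, say, weight saving vs. cost vs. stiffness.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    idx = np.argmax(eigvals.real)
    weights = eigvecs[:, idx].real
    weights = weights / weights.sum()          # normalised priority vector

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.58 for n = 3).
    lam_max = eigvals.real[idx]
    ci = (lam_max - 3) / (3 - 1)
    print("weights:", weights, "CR:", ci / 0.58)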

  3. Consumer preferences for hearing aid attributes: a comparison of rating and conjoint analysis methods.

    PubMed

    Bridges, John F P; Lataille, Angela T; Buttorff, Christine; White, Sharon; Niparko, John K

    2012-03-01

    Low utilization of hearing aids has drawn increased attention to the study of consumer preferences using both simple ratings (e.g., Likert scales) and conjoint analyses, but these two approaches often produce inconsistent results. The study aims to directly compare Likert scales and conjoint analysis in identifying important attributes associated with hearing aids among those with hearing loss. Seven attributes of hearing aids were identified through qualitative research: performance in quiet settings, comfort, feedback, frequency of battery replacement, purchase price, water and sweat resistance, and performance in noisy settings. The preferences of 75 outpatients with hearing loss were measured both with a 5-point Likert scale and with 8 paired-comparison conjoint tasks (the latter being analyzed using OLS [ordinary least squares] and logistic regression). Results were compared by examining implied willingness-to-pay and Pearson's rho. A total of 56 respondents (75%) provided complete responses. Two thirds of respondents were male, most had sensorineural hearing loss, and most were older than 50; 44% of respondents had never used a hearing aid. Both methods identified improved performance in noisy settings as the most valued attribute. Respondents were twice as likely to buy a hearing aid with better functionality in noisy environments (p < .001), and willingness to pay for this attribute ranged from US$2674 based on the Likert ratings to US$9000 in the conjoint analysis. The authors find a high level of concordance between the methods, a result that is in stark contrast with previous research. The authors conclude that their result stems from constraining the levels on the Likert scale.
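
    A toy sketch of recovering willingness-to-pay from paired-comparison choices with a logistic regression, assuming scikit-learn and NumPy; the simulated design (a price difference plus one performance-in-noise difference) and the coefficients are assumptions for illustration only, not the study's data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 400
    # Attribute differences between option A and option B in each choice pair.
    d_price = rng.uniform(-3000, 3000, n)        # US$ difference
    d_noise = rng.integers(-1, 2, n)             # worse/same/better performance in noise
    utility_diff = -0.001 * d_price + 2.0 * d_noise
    choose_A = (utility_diff + rng.logistic(size=n) > 0).astype(int)

    X = np.column_stack([d_price, d_noise])
    m = LogisticRegression(C=1e6).fit(X, choose_A)   # large C to mimic unpenalised logit
    beta_price, beta_noise = m.coef_[0]
    # WTP = attribute coefficient divided by the (negative) price coefficient.
    print("implied WTP for better noise performance: $", -beta_noise / beta_price)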

  4. Psychological Attributes in Foreign Language Reading: An Explorative Study of Japanese College Students

    ERIC Educational Resources Information Center

    Mikami, Hitoshi; Leung, Chi Yui; Yoshikawa, Lisa

    2016-01-01

    This study explores the internal structure of psychological attributes (i.e., motivation, belief and emotion) related to foreign language reading (FLR) (hereafter FLR attributes) and checks the utility of existing FLR attribute measurements for the specific learner group (i.e., Japanese university students studying English as their foreign…

  5. Salty or Sweet: Exploring the Challenges of Groundwater Salinization Within a Sustainability Framework

    NASA Astrophysics Data System (ADS)

    Basu, N. B.; Van Meter, K. J.; Tate, E.

    2012-12-01

    In semi-arid to arid landscapes under intensive irrigation, groundwater salinization can be a persistent and critical problem, leading to reduced agricultural productivity, limited access to fresh drinking water, and ultimately desertification. It is estimated that in India alone, problems of salinity are now affecting over 6 million hectares of agricultural land. In villages of the Mewat district of Haryana in Northern India, subsistence-level farming is the primary source of income, and farming families live under serious threat from increasing salinity levels, both in terms of crop production and adequate supplies of drinking water. The Institute for Rural Research and Development (IRRAD), a non-governmental organization (NGO) working in Mewat, has taken an innovative approach in this area to problems of groundwater salinization, using check dams and rainwater harvesting ponds to recharge aquifers in the freshwater zones of upstream hill areas, and to create freshwater pockets within the saline groundwater zones of down-gradient areas. Initial, pilot-scale efforts have led to apparent success in raising groundwater levels in freshwater zones and changing the dynamics of encroaching groundwater salinity, but the expansion of such efforts to larger-scale restoration is constrained by the availability of adequate resources. Under such resource constraints, which are typical of international development work, it becomes critical to utilize a decision-analysis framework to quantify both the immediate and long-term effectiveness and sustainability of interventions by NGOs such as IRRAD. In the present study, we have developed such a framework, linking the climate-hydrological dynamics of monsoon driven systems with village-scale socio-economic attributes to evaluate the sustainability of current restoration efforts and to prioritize future areas for intervention. We utilize a multi-dimensional metric that takes into account both physical factors related to water availability as well as socio-economic factors related to the capacity to deal with water stress. This metric allows us to evaluate and compare water-driven sustainability at the village, block, and district levels in Northern India based on a combination of readily available census and water resource data. Further, we utilize a pressure-response framework that considers monsoonal dynamics and effectively evaluates the effects of intervention efforts over time. Our results indicate that in arid to semi-arid regions, where problems of groundwater salinity are paramount, scaling factors corresponding to salinity levels as well as the relative size of the saline zone must be incorporated into indicators of water access and availability to accurately reflect overall sustainability. More importantly, the results point towards the value of incorporating dynamic, multi-dimensional sustainability metrics into decision-analysis frameworks used to aid in resource prioritization and the evaluation of intervention efforts.

  6. The multiscale classification system and grid encoding mode of ecological land in China

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Liu, Aixia; Lin, Yifan

    2017-10-01

    Ecological land provides goods and services that directly or indirectly benefit the eco-environment and human welfare. In recent years, research on ecological land has become important in the field of land change and ecosystem management. In this study, a multi-scale classification scheme of ecological land was developed for land management based on a combination of the land-use classification and the ecological function zoning in China, comprising eco-zone, eco-region, eco-district, land ecosystem, and ecological land-use type. The geographical spatial units become increasingly homogeneous from the macro to the micro scale. The "ecological land-use type" is the smallest unit and is important for maintaining the key ecological processes in the land ecosystem. Ecological land-use types were categorized into main-functional and multi-functional types according to their ecological function attributes and production function attributes. A main-functional type was defined as a land-use type mainly providing ecological goods and functions, such as rivers, lakes, swampland, shoaly land, and glacier and snow, while a multi-functional type provides not only ecological goods and functions but also productive goods and functions, such as arable land, forestry land, and grassland. Furthermore, a six-level grid encoding mode was proposed for the modern management of ecological land and data updating under cadastral encoding. The six-level irregular grid encoding from macro to micro scale includes eco-zone, eco-region, eco-district, cadastral area, land ecosystem, land ownership type, ecological land-use type, and parcel. In addition, methodologies for ecosystem management were discussed for the integrated management of natural resources in China.

  7. Psychological Literacy Weakly Differentiates Students by Discipline and Year of Enrolment

    PubMed Central

    Heritage, Brody; Roberts, Lynne D.; Gasson, Natalie

    2016-01-01

    Psychological literacy, a construct developed to reflect the types of skills graduates of a psychology degree should possess and be capable of demonstrating, has recently been scrutinized in terms of its measurement adequacy. The recent development of a multi-item measure encompassing the facets of psychological literacy has provided the potential for improved validity in measuring the construct. We investigated the known-groups validity of this multi-item measure of psychological literacy to examine whether psychological literacy could predict (a) students’ course of enrolment and (b) students’ year of enrolment. Five hundred and fifteen undergraduate psychology students, 87 psychology/human resource management students, and 83 speech pathology students provided data. In the first year cohort, the reflective processes (RPs) factor significantly predicted psychology and psychology/human resource management course enrolment, although no facets significantly differentiated between psychology and speech pathology enrolment. Within the second year cohort, generic graduate attributes (GGAs) and RPs differentiated psychology and speech pathology course enrolment. GGAs differentiated first-year and second-year psychology students, with second-year students more likely to have higher scores on this factor. Due to weak support for known-groups validity, further measurement refinements are recommended to improve the construct’s utility. PMID:26909058

  8. Detecting spatial defects in colored patterns using self-oscillating gels

    NASA Astrophysics Data System (ADS)

    Fang, Yan; Yashin, Victor V.; Dickerson, Samuel J.; Balazs, Anna C.

    2018-06-01

    With the growing demand for wearable computers, there is a need for material systems that can perform computational tasks without relying on external electrical power. Using theory and simulation, we design a material system that "computes" by integrating the inherent behavior of self-oscillating gels undergoing the Belousov-Zhabotinsky (BZ) reaction and piezoelectric (PZ) plates. These "BZ-PZ" units are connected electrically to form a coupled oscillator network, which displays specific modes of synchronization. We exploit this attribute in employing multiple BZ-PZ networks to perform pattern matching on complex multi-dimensional data, such as colored images. By decomposing a colored image into sets of binary vectors, we use each BZ-PZ network, or "channel," to store distinct information about the color and the shape of the image and perform the pattern matching operation. Our simulation results indicate that the multi-channel BZ-PZ device can detect subtle differences between the input and stored patterns, such as the color variation of one pixel or a small change in the shape of an object. To demonstrate a practical application, we utilize our system to process a colored Quick Response code and show its potential in cryptography and steganography.

  9. Psychological Literacy Weakly Differentiates Students by Discipline and Year of Enrolment.

    PubMed

    Heritage, Brody; Roberts, Lynne D; Gasson, Natalie

    2016-01-01

    Psychological literacy, a construct developed to reflect the types of skills graduates of a psychology degree should possess and be capable of demonstrating, has recently been scrutinized in terms of its measurement adequacy. The recent development of a multi-item measure encompassing the facets of psychological literacy has provided the potential for improved validity in measuring the construct. We investigated the known-groups validity of this multi-item measure of psychological literacy to examine whether psychological literacy could predict (a) students' course of enrolment and (b) students' year of enrolment. Five hundred and fifteen undergraduate psychology students, 87 psychology/human resource management students, and 83 speech pathology students provided data. In the first year cohort, the reflective processes (RPs) factor significantly predicted psychology and psychology/human resource management course enrolment, although no facets significantly differentiated between psychology and speech pathology enrolment. Within the second year cohort, generic graduate attributes (GGAs) and RPs differentiated psychology and speech pathology course enrolment. GGAs differentiated first-year and second-year psychology students, with second-year students more likely to have higher scores on this factor. Due to weak support for known-groups validity, further measurement refinements are recommended to improve the construct's utility.

  10. Radiometric stability of the Multi-angle Imaging SpectroRadiometer (MISR) following 15 years on-orbit

    NASA Astrophysics Data System (ADS)

    Bruegge, Carol J.; Val, Sebastian; Diner, David J.; Jovanovic, Veljko; Gray, Ellyn; Di Girolamo, Larry; Zhao, Guangyu

    2014-09-01

    The Multi-angle Imaging SpectroRadiometer (MISR) has operated successfully on the EOS/Terra spacecraft since 1999. It consists of nine cameras pointing from nadir to a 70.5° view angle, with four spectral channels per camera. Specifications call for a radiometric uncertainty of 3% absolute and 1% relative to the other cameras. To accomplish this, MISR utilizes an on-board calibrator (OBC) to measure camera response changes. Once every two months, the two Spectralon panels are deployed to direct solar light into the cameras. Six photodiode sets measure the illumination levels, which are compared to MISR raw digital numbers to determine the radiometric gain coefficients used in Level 1 data processing. Although panel stability is not required, there has been little detectable change in panel reflectance, attributed to careful preflight handling techniques. The cameras themselves have degraded in radiometric response by 10% since launch, but calibration updates using the detector-based scheme have compensated for these drifts and allowed the radiance products to meet accuracy requirements. Validation using Sahara desert observations shows that there has been a drift of ~1% in the reported nadir-view radiance over a decade, common to all spectral bands.

  11. Two-dimensional molybdenum disulphide nanosheet-covered metal nanoparticle array as a floating gate in multi-functional flash memories

    NASA Astrophysics Data System (ADS)

    Han, Su-Ting; Zhou, Ye; Chen, Bo; Zhou, Li; Yan, Yan; Zhang, Hua; Roy, V. A. L.

    2015-10-01

    Semiconducting two-dimensional materials appear to be excellent candidates for non-volatile memory applications. However, the limited controllability of charge trapping behaviors and the lack of multi-bit storage studies in two-dimensional based memory devices require further improvement for realistic applications. Here, we report a flash memory consisting of metal NPs-molybdenum disulphide (MoS2) as a floating gate by introducing a metal nanoparticle (NP) (Ag, Au, Pt) monolayer underneath the MoS2 nanosheets. Controlled charge trapping and long data retention have been achieved in a metal (Ag, Au, Pt) NPs-MoS2 floating gate flash memory. This controlled charge trapping is hypothesized to be attributed to band bending and a built-in electric field ξbi between the interface of the metal NPs and MoS2. The metal NPs-MoS2 floating gate flash memories were further proven to be multi-bit memory storage devices possessing a 3-bit storage capability and a good retention capability up to 10^4 s. We anticipate that these findings would provide scientific insight for the development of novel memory devices utilizing an atomically thin two-dimensional lattice structure.

  12. Integrating multi-criteria evaluation techniques with geographic information systems for landfill site selection: a case study using ordered weighted average.

    PubMed

    Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P

    2012-02-01

    This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process. Copyright © 2011 Elsevier Ltd. All rights reserved.
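
    The OWA combination itself is simple to state in code; the sketch below, assuming NumPy, applies three illustrative weight vectors (pessimistic, neutral, optimistic) to a single cell's standardized criterion scores, which are invented values rather than the study's fuzzy memberships.

    import numpy as np

    # Standardized criterion scores for one candidate cell (0 = unsuitable, 1 = suitable).
    scores = np.array([0.9, 0.6, 0.3, 0.8])

    owa_weights = {
        "pessimistic": np.array([0.0, 0.0, 0.1, 0.9]),   # emphasis on the worst criteria
        "neutral":     np.array([0.25, 0.25, 0.25, 0.25]),
        "optimistic":  np.array([0.9, 0.1, 0.0, 0.0]),    # emphasis on the best criteria
    }

    ordered = np.sort(scores)[::-1]          # OWA weights apply to the *ranked* scores
    for name, w in owa_weights.items():
        print(name, float(ordered @ w))      # suitability under each risk attitude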

  13. Deep learning for classification of islanding and grid disturbance based on multi-resolution singular spectrum entropy

    NASA Astrophysics Data System (ADS)

    Li, Tie; He, Xiaoyang; Tang, Junci; Zeng, Hui; Zhou, Chunying; Zhang, Nan; Liu, Hui; Lu, Zhuoxin; Kong, Xiangrui; Yan, Zheng

    2018-02-01

    Because the identification of islanding is easily confounded by grid disturbances, an islanding detection device may make misjudgments and thereby take photovoltaic systems out of service. The detection device must therefore be able to distinguish islanding from grid disturbances. In this paper, the concept of deep learning is introduced into the classification of islanding and grid disturbance for the first time. A novel deep learning framework is proposed to detect and classify islanding or grid disturbance. The framework is a hybrid of wavelet transformation, multi-resolution singular spectrum entropy, and a deep learning architecture. As a signal processing step after the wavelet transformation, multi-resolution singular spectrum entropy combines multi-resolution analysis and spectrum analysis with entropy as the output, from which the intrinsic features that differ between islanding and grid disturbance can be extracted. With the features extracted, deep learning is utilized to classify islanding and grid disturbance. Simulation results indicate that the method achieves its goal with high accuracy, so that photovoltaic systems mistakenly withdrawing from power grids can be avoided.
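
    A hedged sketch of the multi-resolution singular spectrum entropy feature, assuming PyWavelets and NumPy; the wavelet, decomposition level, embedding length, and the sinusoidal stand-in for a measured waveform are all illustrative assumptions rather than the paper's settings.

    import numpy as np
    import pywt

    def singular_spectrum_entropy(x, embed=20):
        # Build a trajectory (Hankel-like) matrix, take its singular values,
        # and compute the Shannon entropy of the normalised singular spectrum.
        H = np.lib.stride_tricks.sliding_window_view(x, embed)
        s = np.linalg.svd(H, compute_uv=False)
        p = s / s.sum()
        return float(-np.sum(p * np.log(p + 1e-12)))

    signal = np.sin(2 * np.pi * 50 * np.linspace(0, 0.2, 2000))   # stand-in voltage waveform
    coeffs = pywt.wavedec(signal, "db4", level=4)                  # multi-resolution decomposition
    features = [singular_spectrum_entropy(c) for c in coeffs]      # one entropy per frequency band
    print(features)   # feature vector that would be fed to the deep classifier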

  14. Multi-scale Modeling of the Impact Response of a Strain Rate Sensitive High-Manganese Austenitic Steel

    NASA Astrophysics Data System (ADS)

    Önal, Orkun; Ozmenci, Cemre; Canadinc, Demircan

    2014-09-01

    A multi-scale modeling approach was applied to predict the impact response of a strain rate sensitive high-manganese austenitic steel. The roles of texture, geometry and strain rate sensitivity were successfully taken into account all at once by coupling crystal plasticity and finite element (FE) analysis. Specifically, crystal plasticity was utilized to obtain the multi-axial flow rule at different strain rates based on the experimental deformation response under uniaxial tensile loading. The equivalent stress - equivalent strain response was then incorporated into the FE model for the sake of a more representative hardening rule under impact loading. The current results demonstrate that reliable predictions can be obtained by proper coupling of crystal plasticity and FE analysis even if the experimental flow rule of the material is acquired under uniaxial loading and at moderate strain rates that are significantly slower than those attained during impact loading. Furthermore, the current findings also demonstrate the need for an experiment-based multi-scale modeling approach for the sake of reliable predictions of the impact response.

  15. A decision support for an integrated multi-scale analysis of irrigation: DSIRR.

    PubMed

    Bazzani, Guido M

    2005-12-01

    The paper presents a decision support system, 'Decision Support for IRRigated Agriculture' (DSIRR), designed to conduct an economic-environmental assessment of agricultural activity with a focus on irrigation. The program describes the effect at catchment scale of choices taken at micro scale by independent actors, the farmers, by simulating their decision process. The decision support (DS) is intended as a support tool for the participatory water policies requested by the Water Framework Directive, and it aims at analyzing alternatives in production and technology under different market, policy, and climate conditions. The tool uses data and models, provides a graphical user interface, and can incorporate the decision makers' own insights. Heterogeneity in preferences is admitted, since it is assumed that irrigators try to optimize personal multi-attribute utility functions subject to a set of constraints. Consideration of agronomic and engineering aspects allows an accurate description of irrigation. Mathematical programming techniques are applied to find solutions. The program has been applied in the river Po basin (northern Italy) to analyze the impact of a pricing policy in a context of irrigation technology innovation. Water demand functions and the elasticity to water price have been estimated. Results demonstrate how different areas and systems react to the same policy in quite different ways. While pricing in the annual cropping system seems effective at saving the resource, at the cost of impeding Water Agencies' cost recovery, the same policy has the opposite effect in the perennial fruit system, which shows an inelastic response to water price. The multidimensional assessment conducted clarified the trade-offs among conflicting economic-social-environmental objectives, thus generating valuable information to design a more tailored mix of measures.
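
    As a toy analogue of the constrained farm-level optimisation described above, the sketch below maximises a weighted margin-minus-water objective over two crop areas subject to land and water constraints, assuming SciPy; all crops, coefficients, weights, and constraint values are invented and far simpler than the DSIRR model.

    import numpy as np
    from scipy.optimize import linprog

    # Two crops: gross margin (EUR/ha) and water use (m3/ha).
    margin = np.array([1200.0, 800.0])
    water = np.array([4500.0, 1500.0])

    # Weighted objective: favour income, penalise water use
    # (linprog minimises, so the objective is negated).
    w_income, w_env = 1.0, 0.1
    c = -(w_income * margin - w_env * water)

    A_ub = [[1.0, 1.0],          # total land  <= 10 ha
            list(water)]         # water quota <= 30,000 m3
    b_ub = [10.0, 30000.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("optimal crop areas (ha):", res.x)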

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrington, J; Price, M; Brindle, J

    Purpose: To evaluate the equivalence of spine SBRT treatment plans created in Eclipse for the TrueBeam STx (Varian Medical Systems, Palo Alto, CA) compared to plans using CyberKnife and MultiPlan (Accuray, Sunnyvale, CA). Methods: CT data and contours for 23 spine SBRT patients previously treated using CyberKnife (CK) were exported from the MultiPlan treatment planning system into Eclipse, where they were planned using static IMRT 6MV coplanar beams. Plans were created according to the original prescription dose and fractionation schedule while limiting spinal dose according to the RTOG 0631 protocol and maintaining target coverage comparable to the original CK plans. Plans were evaluated using the new conformity index (nCI), homogeneity index (HI), dose-volume histogram data, number of MU, and estimated treatment time. To ensure all Eclipse plans were deliverable, standard clinical IMRT QA was performed. The plan results were matched with their corresponding CK plans for paired statistical analysis. Results: Plans generated in Eclipse demonstrated statistically significant (p<0.01) improvements compared to the corresponding CK plans in median values of maximum spinal cord dose (17.39 vs. 18.12 Gy), RTOG spinal cord constraint dose (14.50 vs. 16.93 Gy), nCI (1.28 vs. 1.54), HI (1.13 vs. 1.27), MU (3918 vs. 36416), and estimated treatment time (8 vs. 48 min). All Eclipse-generated plans passed our clinically used protocols for IMRT QA. Conclusion: CK spine SBRT cases replanned utilizing Eclipse for LINAC delivery demonstrated dosimetric advantages. We propose that the improvements in the plan quality metrics reviewed in this study may be attributed to dynamic MLCs that facilitate treatment of complicated geometries, as well as to the posterior beams, ideal for centrally located and/or posterior targets, afforded by gantry-based RT delivery.

  17. Decision Making and Priority Setting: The Evolving Path Towards Universal Health Coverage.

    PubMed

    Paolucci, Francesco; Redekop, Ken; Fouda, Ayman; Fiorentini, Gianluca

    2017-12-01

    Health technology assessment (HTA) is widely viewed as an essential component in good universal health coverage (UHC) decision-making in any country. Various HTA tools and metrics have been developed and refined over the years, including systematic literature reviews (Cochrane), economic modelling, and cost-effectiveness ratios and acceptability curves. However, while the cost-effectiveness ratio is faithfully reported in most full economic evaluations, it is viewed by many as an insufficient basis for reimbursement decisions. Emotional debates about the reimbursement of cancer drugs, orphan drugs, and end-of-life treatments have revealed fundamental disagreements about what should and should not be considered in reimbursement decisions. Part of this disagreement seems related to the equity-efficiency tradeoff, which reflects fundamental differences in priorities. All in all, it is clear that countries aiming to improve UHC policies will have to go beyond the capacity building needed to utilize the available HTA toolbox. Multi-criteria decision analysis (MCDA) offers a more comprehensive tool for reimbursement decisions where different weights of different factors/attributes can give policymakers important insights to consider. Sooner or later, every country will have to develop their own way to carefully combine the results of those tools with their own priorities. In the end, all policymaking is based on a mix of facts and values.

  18. Measurement of the edge plasma rotation on J-TEXT tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Z. F.; Luo, J.; Wang, Z. J.

    2013-07-15

    A multi-channel high resolution spectrometer was developed for the measurement of the edge plasma rotation on the J-TEXT tokamak. With the design of two opposite viewing directions, the poloidal and toroidal rotations can be measured simultaneously, and the velocity accuracy is up to 1 km/s. The photon flux was enhanced by utilizing combined optical fiber; with this design, the time resolution reaches 3 ms. Assistant software, “Spectra Assist”, was developed to implement the spectrometer control and data analysis automatically. A multi-channel monochromatic analyzer is designed to obtain the location of the chosen ions simultaneously through inversion analysis. Some preliminary experimental results about the influence of plasma density, different magnetohydrodynamic behaviors, and the application of a biased electrode are presented.

  19. Determining urban land uses through building-associated element attributes derived from lidar and aerial photographs

    NASA Astrophysics Data System (ADS)

    Meng, Xuelian

    Urban land-use research is a key component in analyzing the interactions between human activities and environmental change. Researchers have conducted many experiments to classify urban or built-up land, forest, water, agriculture, and other land-use and land-cover types. Separating residential land uses from other land uses within urban areas, however, has proven to be surprisingly troublesome. Although high-resolution images have recently become more available for land-use classification, an increase in spatial resolution does not guarantee improved classification accuracy by traditional classifiers due to the increase of class complexity. This research presents an approach to detect and separate residential land uses on a building scale directly from remotely sensed imagery to enhance urban land-use analysis. Specifically, the proposed methodology applies a multi-directional ground filter to generate a bare ground surface from lidar data, then utilizes a morphology-based building detection algorithm to identify buildings from lidar and aerial photographs, and finally separates residential buildings using a supervised C4.5 decision tree analysis based on the seven selected building land-use indicators. Successful execution of this study produces three independent methods, each corresponding to the steps of the methodology: lidar ground filtering, building detection, and building-based object-oriented land-use classification. Furthermore, this research provides a prototype as one of the few early explorations of building-based land-use analysis and successful separation of more than 85% of residential buildings based on an experiment on an 8.25-km2 study site located in Austin, Texas.
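
    A small sketch of the supervised decision-tree step, approximated here with scikit-learn's entropy criterion rather than a true C4.5 implementation; the seven synthetic building indicators and the labelling rule are assumptions for illustration, not the study's variables.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(3)
    n = 1000
    # Illustrative indicators: footprint area, height, elongation, distance to road,
    # neighbour density, roof complexity, parcel size (all standardised here).
    X = rng.normal(size=(n, 7))
    # Synthetic rule: small, low, regular buildings in dense neighbourhoods -> residential (1).
    y = ((X[:, 0] < 0.2) & (X[:, 1] < 0.5) & (X[:, 4] > -0.3)).astype(int)

    tree = DecisionTreeClassifier(criterion="entropy", max_depth=5).fit(X, y)
    print("training accuracy:", tree.score(X, y))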

  20. Detecting Hotspot Information Using Multi-Attribute Based Topic Model

    PubMed Central

    Wang, Jing; Li, Li; Tan, Feng; Zhu, Ying; Feng, Weisi

    2015-01-01

    Microblogging as a kind of social network has become more and more important in our daily lives. Enormous amounts of information are produced and shared on a daily basis. Detecting hot topics in the mountains of information can help people get to the essential information more quickly. However, due to short and sparse features, a large number of meaningless tweets, and other characteristics of microblogs, traditional topic detection methods are often ineffective in detecting hot topics. In this paper, we propose a new topic model named multi-attribute latent Dirichlet allocation (MA-LDA), in which the time and hashtag attributes of microblogs are incorporated into the LDA model. By introducing the time attribute, the MA-LDA model can decide whether or not a word should appear in hot topics. Meanwhile, compared with the traditional LDA model, applying the hashtag attribute in MA-LDA gives the core words an artificially high ranking in the results, meaning the expressiveness of the outcomes can be improved. Empirical evaluations on real data sets demonstrate that our method is able to detect hot topics more accurately and efficiently compared with several baselines. Our method provides strong evidence of the importance of the temporal factor in extracting hot topics. PMID:26496635
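
    For orientation, the sketch below fits a plain LDA baseline on a few short texts, assuming scikit-learn; MA-LDA's additional conditioning on time and hashtag attributes is not reproduced here, and the toy tweets are invented.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    tweets = [
        "traffic jam downtown again #commute",
        "new phone launch event tonight #tech",
        "huge traffic delays on the bridge #commute",
        "reviewing the new phone camera #tech",
    ]
    counts = CountVectorizer(stop_words="english").fit_transform(tweets)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
    print(lda.transform(counts))   # per-tweet topic mixtures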

  1. Risk Decision Making Model for Reservoir Floodwater Resources Utilization

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2017-12-01

    Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. In order to utilize floodwater resources safely and efficiently, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safe discharge are estimated. Based on the principle of minimum risk and maximum benefit of FRU, a multi-objective risk decision making model for FRU is constructed. Probability theory and mathematical statistics are used to calculate the risk rates; the C-D production function method and emergy analysis are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution for FRU of the Shilianghe reservoir is found using the risk decision making model, and the validity and applicability of the model are verified.

  2. A novel hybrid MCDM model for performance evaluation of research and technology organizations based on BSC approach.

    PubMed

    Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi

    2016-10-01

    The Balanced Scorecard (BSC) is a strategic evaluation tool using both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the Balanced Scorecard (BSC) and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods, including Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for ranking the alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Exacerbations of COPD: quantifying the patient's perspective using discrete choice modelling.

    PubMed

    Haughney, J; Partridge, M R; Vogelmeier, C; Larsson, T; Kessler, R; Ståhl, E; Brice, R; Löfdahl, C-G

    2005-10-01

    Patient-centred care is the current vogue in chronic obstructive pulmonary disease (COPD), but it is only recently that robust techniques have become available to determine patients' values and preferences. In this international cross-sectional study, patients' concerns and expectations regarding COPD exacerbations were explored using discrete choice modelling. A fractional factorial design was used to develop scenarios comprising a combination of levels for nine different attributes. In face-to-face interviews, patients were presented with paired scenarios and asked to choose the least preferable. Multinomial logit (with hierarchical Bayes) methods were used to estimate utilities. A total of 125 patients (82 males; mean age 66 yrs; mean 4.6 exacerbations per year) were recruited. The attributes of exacerbations considered most important were impact on everyday life (20%), need for medical care (16%), number of future attacks (12%) and breathlessness (11%). The next most important attributes were speed of recovery, productive cough and social impact (all 9%), followed by sleep disturbance and impact on mood (both 7%). Importantly, analysis of utility shifts showed that patients most feared being hospitalised, housebound or bedridden. These issues were more important than symptom improvement. Strategies for the clinical management of chronic obstructive pulmonary disease should clearly address patients' concerns and focus on preventing and treating exacerbations to avoid these feared outcomes.

  4. Personalized Recommendation of Learning Material Using Sequential Pattern Mining and Attribute Based Collaborative Filtering

    ERIC Educational Resources Information Center

    Salehi, Mojtaba; Nakhai Kamalabadi, Isa; Ghaznavi Ghoushchi, Mohammad Bagher

    2014-01-01

    Material recommender system is a significant part of e-learning systems for personalization and recommendation of appropriate materials to learners. However, in the existing recommendation algorithms, dynamic interests and multi-preference of learners and multidimensional-attribute of materials are not fully considered simultaneously. Moreover,…

  5. Creating a Test Validated Structural Dynamic Finite Element Model of the Multi-Utility Technology Test Bed Aircraft

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truong, Samson S.

    2014-01-01

    Small modeling errors in a finite element model will eventually induce errors in the structural flexibility and mass, which propagate into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi Utility Technology Test Bed (X-56A) aircraft is the flight demonstration of active flutter suppression; therefore, this study identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the X-56A. A ground-vibration-test-validated structural dynamic finite element model of the X-56A is created in this study. The structural dynamic finite element model of the X-56A is improved using a model tuning tool. Two different weight configurations of the X-56A have been improved in a single optimization run.

  6. Utilization of multi-band OFDM modulation to increase traffic rate of phosphor-LED wireless VLC.

    PubMed

    Yeh, Chien-Hung; Chen, Hsing-Yu; Chow, Chi-Wai; Liu, Yen-Liang

    2015-01-26

    To increase the traffic rate in phosphor-LED visible light communication (VLC), multi-band orthogonal frequency division multiplexed (OFDM) modulation is first proposed and demonstrated. In the measurement, we do not utilize an optical blue filter to increase the modulation bandwidth of the phosphor-LED in the VLC system. In the proposed scheme, different bands of OFDM signals are applied to different LED chips in an LED lamp; this avoids the power fading and nonlinearity issues that arise when the same OFDM signal is applied to all the LED chips in the lamp. Here, the maximum increases in traffic rate are 41.1%, 17.8% and 17.8% under received illuminations of 200, 500 and 1000 lux, respectively, when the proposed three-band OFDM modulation is used in the VLC system. In addition, analysis and verification by experiments are also performed.

  7. Consumer preference of fertilizer in West Java using multi-dimensional scaling approach

    NASA Astrophysics Data System (ADS)

    Utami, Hesty Nurul; Sadeli, Agriani Hermita; Perdana, Tomy; Renaldy, Eddy; Mahra Arari, H.; Ajeng Sesy N., P.; Fernianda Rahayu, H.; Ginanjar, Tetep; Sanjaya, Sonny

    2018-02-01

    Various fertilizer products are available in the market for farmers to use in their farming activities. Fertilizers supplement soil nutrients and build up soil fertility in order to supply plant nutrients and increase plant productivity. Fertilizers consist of nitrogen, phosphorus, potassium, micronutrients and other complex nutrients, and are commonly used in agricultural activities to improve the quantity and quality of the harvest. Recently, market demand for fertilizer has increased dramatically; consequently, fertilizer companies need to develop strategies informed by consumer preferences on several issues. Consumer preference depends on individual consumer needs, is measured by the utility derived from the alternatives offered by the market, and drives the final purchase decision. West Java is one of the main producing provinces of agricultural products and is therefore one of the potential consumers of fertilizers for farming activities. This research is a case study in nine districts of West Java province, i.e., Bandung, West Bandung, Bogor, Depok, Garut, Indramayu, Majalengka, Cirebon and Cianjur. The purpose of this research is to describe the attributes underlying consumer preference for fertilizers. The multi-dimensional scaling method is used as a quantitative method to help visualize the level of similarity of individual cases in a dataset and to map this information with respect to the research goal. The attributes in this research are availability, nutrient content, price, form of fertilizer, decomposition speed, ease of use, label, packaging type, color, design and size of packaging, hardening process, and promotion. Two fertilizer brands tend to be similar with respect to product availability, price, speed of decomposition, and hardening process.
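
    A minimal sketch of the multi-dimensional scaling step, assuming scikit-learn and NumPy; the brand-by-attribute rating matrix is invented, and only four of the attributes listed above are used for brevity.

    import numpy as np
    from sklearn.manifold import MDS

    # Rows: fertilizer brands; columns: mean ratings on availability, price,
    # decomposition speed, and hardening process (illustrative values).
    ratings = np.array([[4.1, 3.8, 4.0, 3.9],
                        [4.0, 3.7, 4.1, 3.8],
                        [2.5, 4.2, 2.8, 3.0],
                        [3.2, 3.0, 3.5, 4.4]])

    coords = MDS(n_components=2, random_state=0).fit_transform(ratings)
    print(coords)   # brands plotted close together are perceived as similar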

  8. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

    High fidelity simulation of nuclear reactors entails large scale applications characterized with high dimensionality and tremendous complexity where various physics models are integrated in the form of coupled models (e.g. neutronic with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient and scalable algorithms for performing efficient Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends these efforts towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved via identifying the important/influential degrees of freedom (DoF) via the subspace analysis, such that the required analysis can be recast by considering the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. Then the reduced subspace is used to solve realistic, large scale forward (UQ) and inverse problems (DA and TAA). Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty for single physics models is extended for large scale multi-physics coupled problems with feedback effect. Moreover, a non-linear surrogate based UQ approach is developed, used and compared to performance of the KL approach and brute force Monte Carlo (MC) approach. On the other hand, an efficient Data Assimilation (DA) algorithm is developed to assess information about model's parameters: nuclear data cross-sections and thermal-hydraulics parameters. Two improvements are introduced in order to perform DA on the high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT -- COBRA-TF - ORIGEN). Second, approximating the complex and high dimensional solution space with a lower dimensional subspace makes the sampling process necessary for DA possible for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from sources of uncertainty; and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. 
In this dissertation, a subspace-based, gradient-free and nonlinear algorithm for inverse uncertainty quantification, namely the Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated with the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models). Ultimately, the algorithms proposed here were applied to perform UQ and DA for assembly-level (CASL Progression Problem Number 6) and core-wide problems representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), modeled and simulated using VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were encoded and implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
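
    The reduced-order-modeling idea at the core of the dissertation can be caricatured in a few lines: collect response snapshots from sampled runs, extract a low-dimensional Karhunen-Loeve/SVD basis, and perform the forward uncertainty propagation in that subspace. The sketch below uses random matrices as stand-ins for coupled-physics model output; the dimensions, the 99% energy cutoff, and the data are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical snapshots: each column is a model response vector for one sampled
        # input realization (a stand-in for an expensive coupled-physics run).
        n_dof, n_samples = 5000, 40
        snapshots = rng.normal(size=(n_dof, n_samples))

        # Karhunen-Loeve / SVD of the mean-centred snapshots gives the active subspace.
        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.99) + 1)   # retain 99% of snapshot variance
        basis = U[:, :r]

        # Forward propagation in the reduced space: project, work with r numbers instead
        # of n_dof, then reconstruct when a full-field answer is needed.
        new_run = rng.normal(size=(n_dof, 1))
        reduced = basis.T @ (new_run - mean)
        reconstructed = mean + basis @ reduced
        print(r, np.linalg.norm(new_run - reconstructed) / np.linalg.norm(new_run))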

  9. DARHT Multi-intelligence Seismic and Acoustic Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.

    The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected external to the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods. Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.
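
    The second-stage event-detection idea (flagging short-duration events where a fitted autoregressive model fails to predict the signal) can be sketched as below. The synthetic channel, AR order, and threshold are assumptions; the actual framework also uses frequency-domain screening, exogenous inputs, and multi-sensor fusion, none of which is reproduced here.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic stand-in for one seismic/acoustic channel with a short injected transient.
        n = 2000
        x = rng.normal(scale=0.1, size=n)
        x[1200:1220] += 1.5

        # Fit an order-p autoregressive model by least squares (no exogenous input here).
        p = 8
        X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
        y = x[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Large residuals mark samples the AR model cannot explain: candidate events.
        resid = y - X @ coef
        sigma = np.median(np.abs(resid)) / 0.6745     # robust noise estimate
        events = np.where(np.abs(resid) > 5 * sigma)[0] + p
        print(events)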

  10. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline S. J.; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W.

    2017-11-01

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Thus, determining natural hazard susceptible areas and incorporating them in the initial planning process may reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from a weighted linear combination method using equal weights and a factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitations in the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigations need to be undertaken from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
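
    For readers unfamiliar with the analytic hierarchy process step, the sketch below shows the standard way criterion weights are derived from one expert's pairwise comparison matrix (the principal eigenvector) and checked for consistency. The three hazard factors, the judgments, and the 0.1 consistency threshold are illustrative conventions, not the study's actual expert data.

        import numpy as np

        # Hypothetical pairwise comparisons among three hazard factors
        # (e.g. inundation, landslide, erosion) on Saaty's 1-9 scale.
        A = np.array([
            [1.0, 3.0, 5.0],
            [1 / 3, 1.0, 2.0],
            [1 / 5, 1 / 2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        weights = w / w.sum()                       # criterion weights for the overlay

        # Consistency ratio (CR < 0.1 is the usual acceptance rule); random index for n=3 is 0.58.
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        cr = ci / 0.58
        print(weights, cr)

    In a weighted-linear-combination overlay, these weights would then multiply each grid cell's normalized factor scores and be summed into a susceptibility index.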

  11. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    PubMed

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-11-01

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Thus, determining natural hazard susceptible areas and incorporating them in the initial planning process may reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from a weighted linear combination method using equal weights and a factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitations in the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigations need to be undertaken from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.

  12. Choice Experiments to Quantify Preferences for Health and Healthcare: State of the Practice.

    PubMed

    Mühlbacher, Axel; Johnson, F Reed

    2016-06-01

    Stated-preference methods increasingly are used to quantify preferences in health economics, health technology assessment, benefit-risk analysis and health services research. The objective of stated-preference studies is to acquire information about trade-off preferences among treatment outcomes, prioritization of clinical decision criteria, likely uptake or adherence to healthcare products and acceptability of healthcare services or policies. A widely accepted approach to eliciting preferences is discrete-choice experiments. Patient, physician, insurant or general-public respondents choose among constructed, experimentally controlled alternatives described by decision-relevant features or attributes. Attributes can represent complete health states, sets of treatment outcomes or characteristics of a healthcare system. The observed pattern of choice reveals how different respondents or groups of respondents implicitly weigh, value and assess different characteristics of treatments, products or services. An important advantage of choice experiments is their foundation in microeconomic utility theory. This conceptual framework provides tests of internal validity, guidance for statistical analysis of latent preference structures, and testable behavioural hypotheses. Choice experiments require expertise in survey-research methods, random-utility theory, experimental design and advanced statistical analysis. This paper should be understood as an introduction to setting up a basic experiment rather than an exhaustive critique of the latest findings and procedures. Where appropriate, we have identified topics of active research where a broad consensus has not yet been established.
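
    A minimal sketch of the random-utility logic behind such a choice experiment is given below: each constructed alternative's systematic utility is a weighted sum of its attribute levels, and conditional-logit choice probabilities follow from those utilities. The attributes, part-worth coefficients, and profiles are invented for illustration and do not come from any particular study.

        import numpy as np

        # Hypothetical part-worth coefficients from a fitted conditional logit:
        # effectiveness (per 10% gain), side-effect risk (per 1%), out-of-pocket cost (per $100).
        beta = np.array([0.8, -0.5, -0.3])

        # Two constructed treatment profiles shown in one choice task.
        alt_a = np.array([3.0, 2.0, 1.0])   # attribute levels of alternative A
        alt_b = np.array([2.0, 0.5, 2.0])   # attribute levels of alternative B

        v = np.array([beta @ alt_a, beta @ alt_b])   # deterministic utilities
        p = np.exp(v) / np.exp(v).sum()              # conditional-logit choice probabilities
        print(p)                                     # predicted share choosing each alternative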

  13. Fingerprint Analysis: Moving Toward Multiattribute Determination via Individual Markers.

    PubMed

    Brunelle, Erica; Huynh, Crystal; Alin, Eden; Eldridge, Morgan; Le, Anh Minh; Halámková, Lenka; Halámek, Jan

    2018-01-02

    Forensic science will be forever revolutionized if law enforcement can identify personal attributes of a person of interest solely from a fingerprint. For the past 2 years, the goal of our group has been to establish a way to identify originator attributes, specifically biological sex, from a single analyte. To date, an enzymatic assay and two chemical assays have been developed for the analysis of multiple analytes. In this manuscript, two additional assays have been developed. This time, however, the assays utilize only one amino acid each. The enzymatic assay targets alanine and employs alanine transaminase (ALT), pyruvate oxidase (POx), and horseradish peroxidase (HRP). The other, a chemical assay, is known as the Sakaguchi test and targets arginine. It is important to note that alanine has a significantly higher concentration than arginine in the fingerprint content of both males and females. Both assays proved to be capable of accurately differentiating between male and female fingerprints, regardless of their respective average concentration. The ability to target a single analyte will transform forensic science as each originator attribute can be correlated to a different analyte. This would then lead to the possibility of identifying multiple attributes from a single fingerprint sample. Ultimately, this would allow for a profile of a person of interest to be established without the need for time-consuming lab processes.

  14. Estimating QALY gains in applied studies: a review of cost-utility analyses published in 2010.

    PubMed

    Wisløff, Torbjørn; Hagen, Gunhild; Hamidi, Vida; Movik, Espen; Klemp, Marianne; Olsen, Jan Abel

    2014-04-01

    Reimbursement agencies in several countries now require health outcomes to be measured in terms of quality-adjusted life-years (QALYs), leading to an immense increase in publications reporting QALY gains. However, there is a growing concern that the various 'multi-attribute utility' (MAU) instruments designed to measure the Q in the QALY yield disparate values, implying that results from different instruments are incommensurable. By reviewing cost-utility analyses published in 2010, we aim to contribute to improved knowledge on how QALYs are currently calculated in applied analyses; how transparently QALY measurement is presented; and how large the expected incremental QALY gains are. We searched Embase, MEDLINE and NHS EED for all cost-utility analyses published in 2010. All analyses that had estimated QALYs gained from health interventions were included. Of the 370 studies included in this review, 48% were pharmacoeconomic evaluations. Active comparators were used in 71% of studies. The median incremental QALY gain was 0.06, which translates to 3 weeks in best imaginable health. The EQ-5D-3L is the dominant instrument used. However, reporting of how QALY gains are estimated is generally inadequate. In 55% of the studies there was no reference to which MAU instrument or direct valuation method QALY data came from. The methods used for estimating expected QALY gains are not transparently reported in published papers. Given the wide variation in utility scores that different methodologies may assign to an identical health state, it is important for journal editors to require a more transparent way of reporting the estimation of incremental QALY gains.
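
    As a reminder of the arithmetic behind an incremental QALY gain, the sketch below weights time spent in each health state by a utility score (of the kind an MAU instrument such as the EQ-5D-3L would supply) and takes the difference between treatment arms. The utilities and durations are invented and no discounting is applied.

        # Health-state utilities and years spent in each state, per treatment arm.
        standard_care = [(0.70, 2.0), (0.60, 1.0)]   # (utility, years)
        new_treatment = [(0.78, 2.0), (0.62, 1.0)]

        def qalys(path):
            # Sum of utility-weighted life years (no discounting in this sketch).
            return sum(u * t for u, t in path)

        incremental_gain = qalys(new_treatment) - qalys(standard_care)
        print(round(incremental_gain, 2))   # 0.18 QALYs, roughly nine weeks in full health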

  15. Electric utilities, fiscal illusion and the provision of local public services

    NASA Astrophysics Data System (ADS)

    Dowell, Paula Elizabeth Kay

    2000-10-01

    Restructuring activity in the electric utility industry is threatening a once stable and significant source of revenue for local governments. Potentially declining revenues from electric utilities leave local policymakers with the unpopular decision of raising taxes or reducing the level of public services provided. This has led to pressure on state governments to introduce legislation aimed at mitigating potential revenue loss for local governments due to restructuring activity. However, before imposing such legislation, a better understanding of the potential distortionary effects of internal subsidization by electric utilities is needed. Two models of the demand for local public services--a structural model using the Stone-Geary utility framework and a reduced form model--are developed in an attempt to model the behavioral responses of local public expenditures to revenue contributions from electric utilities. Empirical analysis of both models is conducted using a panel data set for 242 municipalities in Tennessee from 1988 to 1998. Aggregate spending and expenditures on four specific service functions are examined. The results provide evidence of a positive flypaper effect. Furthermore, the source of the flypaper effect is attributed to fiscal illusion caused by price distortions. The stimulative effect of electric utility revenue contributions on the level of local public services indicates that a $1.00 change in electric utility subsidies results in a change in local expenditures ranging from $0.22 to $1.32 for the structural model and $1.97 to $2.51 for the reduced form model. The amount of the marginal effect directly attributed to price illusion is estimated to range from $0.04 to $0.85. In addition, the elasticities of electric utility revenue contributions are estimated to range from 0.05 to 0.90. The results raise a number of interesting issues regarding municipal ownership of utilities and legislation regarding tax treatment of utilities after restructuring. The fact that the current study suggests that electric utility subsidies give rise to fiscal illusion raises new questions regarding the justification of safeguarding the exclusive franchise of municipally-owned utilities and revenues from electric utilities in the era of restructuring.

  16. Femtosecond Laser Ablation Multicollector ICPMS Analysis of Uranium Isotopes in NIST Glass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duffin, Andrew M.; Springer, Kellen WE; Ward, Jesse D.

    We have utilized femtosecond laser ablation coupled to multi-collector inductively coupled plasma mass spectrometry to measure the uranium isotopic content of NIST 61x (x=0,2,4,6) glasses. The uranium content of these glasses is a linear two-component mixture of isotopically natural uranium and the isotopically depleted spike used in preparing the glasses. Laser ablation results match extremely well, generally within a few ppm, with solution analysis following sample dissolution and chemical separation. In addition to isotopic data, sample utilization efficiency measurements indicate that over 1% of ablated uranium atoms reach a mass spectrometer detector, making this technique extremely efficient. Laser sampling also allows for spatial analysis, and our data indicate that rare uranium concentration inhomogeneities exist in NIST 616 glass.

  17. Multi-Agent Many-Objective Robust Decision Making: Supporting Cooperative Regional Water Portfolio Planning in the Eastern United States

    NASA Astrophysics Data System (ADS)

    Herman, J. D.; Zeff, H. B.; Reed, P. M.; Characklis, G. W.

    2013-12-01

    In the Eastern United States, water infrastructure and institutional frameworks have evolved in a historically water-rich environment. However, large regional droughts over the past decade combined with continuing population growth have marked a transition to a state of water scarcity, for which current planning paradigms are ill-suited. Significant opportunities exist to improve the efficiency of water infrastructure via regional coordination, namely, regional 'portfolios' of water-related assets such as reservoirs, conveyance, conservation measures, and transfer agreements. Regional coordination offers the potential to improve reliability, cost, and environmental impact in the expected future state of the world, and, with informed planning, to improve robustness to future uncertainty. In support of this challenge, this study advances a multi-agent many-objective robust decision making (multi-agent MORDM) framework that blends novel computational search and uncertainty analysis tools to discover flexible, robust regional portfolios. Our multi-agent MORDM framework is demonstrated for four water utilities in the Research Triangle region of North Carolina, USA. The utilities supply nearly two million customers and have the ability to interact with one another via transfer agreements and shared infrastructure. We show that strategies for this region which are Pareto-optimal in the expected future state of the world remain vulnerable to performance degradation under alternative scenarios of deeply uncertain hydrologic and economic factors. We then apply the Patient Rule Induction Method (PRIM) to identify which of these uncertain factors drives the individual and collective vulnerabilities for the four cooperating utilities. Our results indicate that clear multi-agent tradeoffs emerge for attaining robustness across the utilities. Furthermore, the key factor identified for improving the robustness of the region's water supply is cooperative demand reduction. This type of approach is critically important given the risks and challenges posed by rising supply development costs, limits on new infrastructure, growing water demands and the underlying uncertainties associated with climate change. The proposed framework serves as a planning template for other historically water-rich regions which must now confront the reality of impending water scarcity.

  18. Implications of Diet for the Extinction of Saber-Toothed Cats and American Lions

    PubMed Central

    DeSantis, Larisa R. G.; Schubert, Blaine W.; Scott, Jessica R.; Ungar, Peter S.

    2012-01-01

    The saber-toothed cat, Smilodon fatalis, and American lion, Panthera atrox, were among the largest terrestrial carnivores that lived during the Pleistocene, going extinct along with other megafauna ∼12,000 years ago. Previous work suggests that times were difficult at La Brea (California) during the late Pleistocene, as nearly all carnivores have greater incidences of tooth breakage (used to infer greater carcass utilization) compared to today. As Dental Microwear Texture Analysis (DMTA) can differentiate between levels of bone consumption in extant carnivores, we use DMTA to clarify the dietary niches of extinct carnivorans from La Brea. Specifically, we test the hypothesis that times were tough at La Brea with carnivorous taxa utilizing more of the carcasses. Our results show no evidence of bone crushing by P. atrox, with DMTA attributes most similar to the extant cheetah, Acinonyx jubatus, which actively avoids bone. In contrast, S. fatalis has DMTA attributes most similar to the African lion Panthera leo, implying that S. fatalis did not avoid bone to the extent previously suggested by SEM microwear data. DMTA characters most indicative of bone consumption (i.e., complexity and textural fill volume) suggest that carcass utilization by the extinct carnivorans was not necessarily more complete during the Pleistocene at La Brea; thus, times may not have been “tougher” than the present. Additionally, minor to no significant differences in DMTA attributes from older (∼30–35 Ka) to younger (∼11.5 Ka) deposits offer little evidence that declining prey resources were a primary cause of extinction for these large cats. PMID:23300674

  19. Implications of diet for the extinction of saber-toothed cats and American lions.

    PubMed

    Desantis, Larisa R G; Schubert, Blaine W; Scott, Jessica R; Ungar, Peter S

    2012-01-01

    The saber-toothed cat, Smilodon fatalis, and American lion, Panthera atrox, were among the largest terrestrial carnivores that lived during the Pleistocene, going extinct along with other megafauna ∼12,000 years ago. Previous work suggests that times were difficult at La Brea (California) during the late Pleistocene, as nearly all carnivores have greater incidences of tooth breakage (used to infer greater carcass utilization) compared to today. As Dental Microwear Texture Analysis (DMTA) can differentiate between levels of bone consumption in extant carnivores, we use DMTA to clarify the dietary niches of extinct carnivorans from La Brea. Specifically, we test the hypothesis that times were tough at La Brea with carnivorous taxa utilizing more of the carcasses. Our results show no evidence of bone crushing by P. atrox, with DMTA attributes most similar to the extant cheetah, Acinonyx jubatus, which actively avoids bone. In contrast, S. fatalis has DMTA attributes most similar to the African lion Panthera leo, implying that S. fatalis did not avoid bone to the extent previously suggested by SEM microwear data. DMTA characters most indicative of bone consumption (i.e., complexity and textural fill volume) suggest that carcass utilization by the extinct carnivorans was not necessarily more complete during the Pleistocene at La Brea; thus, times may not have been "tougher" than the present. Additionally, minor to no significant differences in DMTA attributes from older (∼30-35 Ka) to younger (∼11.5 Ka) deposits offer little evidence that declining prey resources were a primary cause of extinction for these large cats.

  20. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    The emergence of spectral pixel-level color filters has enabled the development of hyper-spectral Full Motion Video (FMV) sensors operating in visible (EO) and infrared (IR) wavelengths. This new class of hyper-spectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time while simultaneously providing an operator the benefit of enhanced-discrimination color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g., detection or classification. The second is presentation of the video to an operator, which can offer the best display of the content depending on the task performed, e.g., spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel, or they can utilize each other's results. Spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has been explored only scarcely. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several of its concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.

  1. Quantifying cadherin mechanotransduction machinery assembly/disassembly dynamics using fluorescence covariance analysis.

    PubMed

    Vedula, Pavan; Cruz, Lissette A; Gutierrez, Natasha; Davis, Justin; Ayee, Brian; Abramczyk, Rachel; Rodriguez, Alexis J

    2016-06-30

    Quantifying multi-molecular complex assembly in specific cytoplasmic compartments is crucial to understanding how cells use assembly/disassembly of these complexes to control function. Currently, biophysical methods like Fluorescence Resonance Energy Transfer and Fluorescence Correlation Spectroscopy provide quantitative measurements of direct protein-protein interactions, while traditional biochemical approaches such as sub-cellular fractionation and immunoprecipitation remain the main approaches used to study multi-protein complex assembly/disassembly dynamics. In this article, we validate and quantify multi-protein adherens junction complex assembly in situ using light microscopy and Fluorescence Covariance Analysis. Utilizing specific fluorescently-labeled protein pairs, we quantified various stages of assembly of the adherens junction complex, the multiprotein complex regulating epithelial tissue structure and function following de novo cell-cell contact. We demonstrate: minimal cadherin-catenin complex assembly in the perinuclear cytoplasm and subsequent localization to the cell-cell contact zone, assembly of adherens junction complexes, acto-myosin tension-mediated anchoring, and adherens junction maturation following de novo cell-cell contact. Finally, applying Fluorescence Covariance Analysis in live cells expressing fluorescently tagged adherens junction complex proteins, we also quantified adherens junction complex assembly dynamics during epithelial monolayer formation.

  2. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline.

    PubMed

    Zhang, Jie; Li, Qingyang; Caselli, Richard J; Thompson, Paul M; Ye, Jieping; Wang, Yalin

    2017-06-01

    Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may help determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied to many computer vision and biomedical informatics research problems. It aims to improve generalization performance by exploiting the features shared among different tasks. However, most existing algorithms are formulated as supervised learning schemes, whose drawback is either an insufficient number of features or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms.

  3. Examining the Impact of Critical Feedback on Learner Engagement in Secondary Mathematics Classrooms: A Multi-Level Analysis

    ERIC Educational Resources Information Center

    Kearney, W. Sean; Webb, Michael; Goldhorn, Jeff; Peters, Michelle L.

    2013-01-01

    This article presents a quantitative study utilizing HLM to analyze classroom walkthrough data completed by principals within 87 secondary mathematics classrooms across 9 public schools in Texas. This research is based on the theoretical framework of learner engagement as established by Argyris & Schon (1996), and refined by Marks (2000). It…

  4. Decision Support for Personalized Cloud Service Selection through Multi-Attribute Trustworthiness Evaluation

    PubMed Central

    Ding, Shuai; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S.

    2014-01-01

    Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of reach for most customers, as they require considerable expertise. Additionally, since cloud service evaluation is often a costly and time-consuming process, it is not practical to measure the trustworthy attributes of all candidates for each customer. Many existing models also cannot easily deal with cloud services that have very few historical records. In this paper, we propose a novel service selection approach in which missing value prediction and multi-attribute trustworthiness evaluation are jointly taken into account. By collecting only limited historical records, the current approach is able to support personalized trustworthy service selection. The experimental results also show that our approach performs much better than other competing ones with respect to customer preference and expectation in trustworthiness assessment. PMID:24972237

  5. Decision support for personalized cloud service selection through multi-attribute trustworthiness evaluation.

    PubMed

    Ding, Shuai; Xia, Cheng-Yi; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S

    2014-01-01

    Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of reach for most customers, as they require considerable expertise. Additionally, since cloud service evaluation is often a costly and time-consuming process, it is not practical to measure the trustworthy attributes of all candidates for each customer. Many existing models also cannot easily deal with cloud services that have very few historical records. In this paper, we propose a novel service selection approach in which missing value prediction and multi-attribute trustworthiness evaluation are jointly taken into account. By collecting only limited historical records, the current approach is able to support personalized trustworthy service selection. The experimental results also show that our approach performs much better than other competing ones with respect to customer preference and expectation in trustworthiness assessment.

  6. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful.
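
    The additive aggregation typically used in such multiattribute utility analyses can be sketched in a few lines: each candidate protection strategy is scored on each attribute, the scores are weighted and summed, and the strategies are ranked by overall utility. The attributes, weights, and scores below are illustrative placeholders, not those of the simulated accident exercise.

        # Illustrative attributes and weights (weights sum to 1).
        weights = {"averted_dose": 0.40, "cost": 0.25, "social_disruption": 0.20, "feasibility": 0.15}

        # Single-attribute utilities on a 0-1 scale for each candidate protection strategy.
        strategies = {
            "sheltering":       {"averted_dose": 0.5, "cost": 0.9, "social_disruption": 0.8, "feasibility": 0.9},
            "evacuation":       {"averted_dose": 0.9, "cost": 0.3, "social_disruption": 0.2, "feasibility": 0.5},
            "food_restriction": {"averted_dose": 0.4, "cost": 0.7, "social_disruption": 0.7, "feasibility": 0.8},
        }

        def overall_utility(scores):
            # Additive MAUT aggregation: sum of weight * single-attribute utility.
            return sum(weights[a] * scores[a] for a in weights)

        ranked = sorted(strategies, key=lambda s: overall_utility(strategies[s]), reverse=True)
        print([(s, round(overall_utility(strategies[s]), 3)) for s in ranked])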

  7. Occupational asthma in a national disability survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanc, P.

    1987-10-01

    The contribution of workplace exposures to the prevalence of asthma in adults has been minimized in the epidemiology of this illness. Analysis of the 1978 Social Security Disability Survey provides a population-based assessment as a novel approach utilizing self-attributed, occupationally related asthma as a measure of disease. Of 6063 respondents, 468 (7.7 percent) identified asthma as a personal medical condition; 72 (1.2 percent; 15.4 percent of all those with asthma) attributed it to workplace exposures. These subjects were older and included more men and cigarette smokers than groups of both asthmatic and nonasthmatic subjects. The relative risk for occupationally attributed asthma was elevated among industrial and agricultural workers as compared with white collar and service occupations. Analysis of disability benefit status did not indicate that this introduced major reporting bias in this survey. This study suggests that occupational factors may have a greater role in adult asthma than previously thought.

  8. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process.

    PubMed

    Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-31

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance measure (tensile strength) and a physical property (void content), were respectively calculated. Thereafter, the paper presents an integrated methodology combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensively optimized intervals of the winding parameters. The verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products.
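
    The local single-parameter step described above amounts to perturbing one winding parameter at a time around a nominal point and recording the relative change in the response. The sketch below does exactly that against a made-up response surrogate; the functional form, nominal values, and 5% step are placeholders, not the article's process model.

        import numpy as np

        # Placeholder response surrogate: tensile strength as a function of
        # winding tension, heating temperature, and winding pressure (illustrative only).
        def tensile_strength(tension, temperature, pressure):
            return 120 + 8 * np.log(tension) - 0.002 * (temperature - 400) ** 2 + 5 * np.sqrt(pressure)

        nominal = {"tension": 60.0, "temperature": 400.0, "pressure": 0.3}

        def local_sensitivity(param, rel_step=0.05):
            # Relative change in response per relative change in one parameter (central difference).
            lo, hi = dict(nominal), dict(nominal)
            lo[param] *= 1 - rel_step
            hi[param] *= 1 + rel_step
            base = tensile_strength(**nominal)
            return (tensile_strength(**hi) - tensile_strength(**lo)) / (2 * rel_step * base)

        # A near-zero value flags a locally insensitive parameter at this nominal point.
        for p in nominal:
            print(p, round(local_sensitivity(p), 3))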

  9. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process

    PubMed Central

    Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-01

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance measure (tensile strength) and a physical property (void content), were respectively calculated. Thereafter, the paper presents an integrated methodology combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensively optimized intervals of the winding parameters. The verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products. PMID:29385048

  10. Multi-country health surveys: are the analyses misleading?

    PubMed

    Masood, Mohd; Reidpath, Daniel D

    2014-05-01

    The aim of this paper was to review the types of approaches currently utilized in the analysis of multi-country survey data, focusing specifically on design and modeling issues in analyses of significant multi-country surveys published in 2010. A systematic search strategy was used to identify 10 multi-country surveys and the articles published from them in 2010. The surveys were selected to reflect diverse topics and foci, and to provide insight into analytic approaches across research themes. The search identified 159 articles appropriate for full-text review and data extraction. The analyses adopted in the multi-country surveys can be broadly classified as univariate/bivariate analyses and multivariate/multivariable analyses. Multivariate/multivariable analyses may be further divided into design- and model-based analyses. Of the 159 articles reviewed, 129 used model-based analyses and 30 used design-based analyses. Similar patterns could be seen in all the individual surveys. While there is general agreement among survey statisticians that complex surveys are most appropriately analyzed using design-based analyses, most researchers continued to use the more common model-based approaches. Recent developments in design-based multi-level analysis may be one approach to including all the survey design characteristics. This is a relatively new area, however, and statistical as well as applied analytic research remains to be done. An important limitation of this study relates to the selection of the surveys used and the choice of year for the analysis, i.e., 2010 only. There is, however, no strong reason to believe that analytic strategies have changed radically in the past few years, and 2010 provides a credible snapshot of current practice.

  11. Costs of childhood asthma due to traffic-related pollution in two California communities.

    PubMed

    Brandt, Sylvia J; Perez, Laura; Künzli, Nino; Lurmann, Fred; McConnell, Rob

    2012-08-01

    Recent research suggests the burden of childhood asthma that is attributable to air pollution has been underestimated in traditional risk assessments, and there are no estimates of these associated costs. We aimed to estimate the yearly childhood asthma-related costs attributable to air pollution for Riverside and Long Beach, CA, USA, including: 1) the indirect and direct costs of healthcare utilisation due to asthma exacerbations linked with traffic-related pollution (TRP); and 2) the costs of health care for asthma cases attributable to local TRP exposure. We calculated costs using estimates from peer-reviewed literature and the authors' analysis of surveys (Medical Expenditure Panel Survey, California Health Interview Survey, National Household Travel Survey, and Health Care Utilization Project). A lower-bound estimate of the asthma burden attributable to air pollution was US$18 million yearly. Asthma cases attributable to TRP exposure accounted for almost half of this cost. The cost of bronchitic episodes was a major proportion of both the annual cost of asthma cases attributable to TRP and of pollution-linked exacerbations. Traditional risk assessment methods underestimate both the burden of disease and cost of asthma associated with air pollution, and these costs are borne disproportionately by communities with higher than average TRP.

  12. Habitat Utilization Assessment - Building in Behaviors

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Blume, Jennifer

    2004-01-01

    Habitability, and the associated architectural and design attributes of an environment, is a powerful performance shaping factor. By identifying how inhabitants use an area, we can draw conclusions about what design or architectural attributes cause what behaviors and systematically design in desired human performance. We are analyzing how a crew uses a long duration habitat and work environment during a four-day underwater mission and identifying certain architectural and design attributes that are related to, and potential enablers of, certain crew behaviors. By identifying how inhabitants use the habitat, we can draw conclusions about what habitability attributes cause what behaviors and systematically design in desired human performance (applicable to NASA's Bioastronautics Human Behavior and Performance Critical Path Roadmap question 6.12). This assessment replicates a methodology reported in a chapter titled "Sociokinetic Analysis as a Tool for Optimization of Environmental Design" by C. Adams.' That study collected video imagery of certain areas of a closed habitat during a 91 day test and from that data calculated time spent in different volumes during the mission, and characterized the behaviors occurring in certain habitat volumes thus concluding various rules for design of such habitats. This study assesses the utilization of the Aquarius Habitat, an underwater station, which will support six Aquanauts for a fourteen-day mission during which the crew will perform specific scientific and engineering studies. Video is recorded for long uninterrupted periods of time during the mission and from that data the time spent in each area is calculated. In addition, qualitative and descriptive analysis of the types of behaviors in each area is performed with the purpose of identifying any behaviors that are not typical of a certain area. If a participant uses an area in a way different from expected, a subsequent analysis of the features of that area may result in conclusions of performance shaping factors. With the addition of this study, we can make comparisons between the two different habitats and begin drawing correlation judgments about design features and behavior. Ideally, this methodology should be repeated in additional Aquarius missions and other analog environments because the real information will come from comparisons between habitats.

  13. Attributions for School Achievement of Anglo and Native American Community College Students.

    ERIC Educational Resources Information Center

    Powers, Stephen; Rossman, Mark H.

    Attributions for school success and failure were examined among 211 community college students (112 Native Americans and 99 Anglos) enrolled in remedial reading classes at a large, urban multi-campus community college system in the Southwest. The Multidimensional-Multiattributional Causality Scale (MMCS) was administered to the students in their…

  14. Podium: Ranking Data Using Mixed-Initiative Visual Analytics.

    PubMed

    Wall, Emily; Das, Subhajit; Chawla, Ravish; Kalidindi, Bharath; Brown, Eli T; Endert, Alex

    2018-01-01

    People often rank and order data points as a vital part of making decisions. Multi-attribute ranking systems are a common tool used to make these data-driven decisions. Such systems often take the form of a table-based visualization in which users assign weights to the attributes representing the quantifiable importance of each attribute to a decision, which the system then uses to compute a ranking of the data. However, these systems assume that users are able to quantify their conceptual understanding of how important particular attributes are to a decision. This is not always easy or even possible for users to do. Rather, people often have a more holistic understanding of the data. They form opinions that data point A is better than data point B but do not necessarily know which attributes are important. To address these challenges, we present a visual analytic application to help people rank multi-variate data points. We developed a prototype system, Podium, that allows users to drag rows in the table to rank order data points based on their perception of the relative value of the data. Podium then infers a weighting model using Ranking SVM that satisfies the user's data preferences as closely as possible. Whereas past systems help users understand the relationships between data points based on changes to attribute weights, our approach helps users to understand the attributes that might inform their understanding of the data. We present two usage scenarios to describe some of the potential uses of our proposed technique: (1) understanding which attributes contribute to a user's subjective preferences for data, and (2) deconstructing attributes of importance for existing rankings. Our proposed approach makes powerful machine learning techniques more usable to those who may not have expertise in these areas.
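
    The core inference step can be sketched as follows: the user's drag-and-drop ordering is converted into pairwise preference constraints, and a linear model fitted on attribute-difference vectors yields weights that expose which attributes drive that ordering. The data are invented, and the use of scikit-learn's LinearSVC on pairwise differences is a common stand-in for a Ranking SVM rather than the paper's exact implementation.

        import numpy as np
        from itertools import combinations
        from sklearn.svm import LinearSVC

        # Hypothetical items (rows) described by normalized attributes (columns).
        X = np.array([
            [0.9, 0.2, 0.5],
            [0.6, 0.8, 0.4],
            [0.3, 0.5, 0.9],
            [0.1, 0.1, 0.2],
        ])
        user_rank = [1, 0, 2, 3]   # item indices as dragged by the user, best first

        # Pairwise transform: for every (better, worse) pair, the difference vector is a
        # positive example and its negation a negative one, so a linear SVM learns a
        # preference direction over the attributes.
        diffs, labels = [], []
        for i, j in combinations(range(len(user_rank)), 2):
            better, worse = user_rank[i], user_rank[j]
            diffs.append(X[better] - X[worse]); labels.append(1)
            diffs.append(X[worse] - X[better]); labels.append(-1)

        svm = LinearSVC(C=1.0, fit_intercept=False, max_iter=10000).fit(np.array(diffs), labels)
        weights = svm.coef_.ravel()
        print(weights / np.abs(weights).sum())   # inferred relative attribute importance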

  15. Life cycle thinking and assessment tools on environmentally-benign electronics: Convergent optimization of materials use, end-of-life strategy and environmental policies

    NASA Astrophysics Data System (ADS)

    Zhou, Xiaoying

    The purpose of this study is to integrate quantitative environmental performance assessment tools and the theory of multi-objective optimization within the boundary of electronic product systems to support the selection among design alternatives in terms of environmental impact, technical criteria, and economic feasibility. To meet the requirements that result from emerging environmental legislation targeting electronics products, the research addresses an analytical methodological approach to facilitate environmentally conscious design and end-of-life management with a life cycle viewpoint. A synthesis of diverse assessment tools is applied to a set of case studies: lead-free solder materials selection, cellular phone design, and desktop display technology assessment. In the first part of this work, an in-depth industrial survey of the status and concerns of the U.S. electronics industry regarding the elimination of lead (Pb) in solders is described. The results show that the trade-offs among environmental consequences, technology challenges, business risks, legislative compliance and stakeholders' preferences must be explicitly, simultaneously, and systematically addressed in the decision-making process used to guide multi-faceted planning of environmental solutions. In the second part of this work, the convergent optimization of the technical cycle, economic cycle and environmental cycle is addressed in a coherent and systematic way through the environmentally conscious design of cellular phones. The technical understanding of product structure, components analysis, and materials flow facilitates the development of "Design for Disassembly" guidelines. A bottom-up disassembly analysis on a "bill of materials" based structure at a micro-operational level is utilized to select optimal end-of-life strategies on the basis of economic feasibility. A macro-operational level life cycle model is used to investigate the environmental consequences, linking environmental impact with cellular phone production activities and focusing on the upstream manufacturing and end-of-life life cycle stages. In the last part of this work, the quantitative elicitation of weighting factors facilitates the comparison of trade-offs in the context of a multi-attribute problem. An integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), is proposed to assess alternatives at the design phase of a product system and is validated with the assessment of desktop display technologies and lead-free solder alternatives.

  16. Community detection in sequence similarity networks based on attribute clustering

    DOE PAGES

    Chowdhary, Janamejaya; Loeffler, Frank E.; Smith, Jeremy C.

    2017-07-24

    Networks are powerful tools for the presentation and analysis of interactions in multi-component systems. A commonly studied mesoscopic feature of networks is their community structure, which arises from grouping together similar nodes into one community and dissimilar nodes into separate communities. In this paper, the community structure of protein sequence similarity networks is determined with a new method: Attribute Clustering Dependent Communities (ACDC). Sequence similarity has hitherto typically been quantified by the alignment score or its expectation value. However, pair alignments with the same score or expectation value cannot thus be differentiated. To overcome this deficiency, the method constructs, for pair alignments, an extended alignment metric, the link attribute vector, which includes the score and other alignment characteristics. Rescaling components of the attribute vectors qualitatively identifies a systematic variation of sequence similarity within protein superfamilies. The problem of community detection is then mapped to clustering the link attribute vectors, selection of an optimal subset of links, and community structure refinement based on the partition density of the network. ACDC-predicted communities are found to be in good agreement with gold standard sequence databases for which the "ground truth" community structures (or families) are known. ACDC is therefore a community detection method for sequence similarity networks based entirely on pair similarity information. A serial implementation of ACDC is available from https://cmb.ornl.gov/resources/developments
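
    The central step, representing each pairwise alignment by a small attribute vector, rescaling its components, and clustering the link vectors, can be caricatured as below. The attribute values and the use of k-means are illustrative stand-ins; ACDC's actual attribute set, rescaling, link selection, and partition-density refinement are not reproduced here.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        # Hypothetical link attribute vectors for sequence pairs:
        # [alignment score, percent identity, aligned-length fraction].
        links = np.array([
            [250.0, 0.62, 0.95],
            [248.0, 0.35, 0.60],   # nearly the same score as above, very different character
            [251.0, 0.60, 0.92],
            [ 90.0, 0.28, 0.40],
            [ 95.0, 0.30, 0.45],
        ])

        # Rescale so no single component (e.g. the raw score) dominates the distance.
        Z = StandardScaler().fit_transform(links)

        # Cluster the link vectors; links in one cluster are candidate intra-community edges.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
        print(labels)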

  17. Trends in Extreme Rainfall Frequency in the Contiguous United States: Attribution to Climate Change and Climate Variability Modes

    NASA Astrophysics Data System (ADS)

    Armal, S.; Devineni, N.; Khanbilvardi, R.

    2017-12-01

    This study presents a systematic analysis for identifying and attributing trends in the annual frequency of extreme rainfall events across the contiguous United States to climate change and climate variability modes. A Bayesian multilevel model is developed for 1,244 stations simultaneously to test the null hypothesis of no trend and verify two alternate hypotheses: Trend can be attributed to changes in global surface temperature anomalies, or to a combination of cyclical climate modes with varying quasi-periodicities and global surface temperature anomalies. The Bayesian multilevel model provides the opportunity to pool information across stations and reduce the parameter estimation uncertainty, hence identifying the trends better. The choice of the best alternate hypotheses is made based on Watanabe-Akaike Information Criterion, a Bayesian pointwise predictive accuracy measure. Statistically significant time trends are observed in 742 of the 1,244 stations. Trends in 409 of these stations can be attributed to changes in global surface temperature anomalies. These stations are predominantly found in the Southeast and Northeast climate regions. The trends in 274 of these stations can be attributed to the El Nino Southern Oscillations, North Atlantic Oscillation, Pacific Decadal Oscillation and Atlantic Multi-Decadal Oscillation along with changes in global surface temperature anomalies. These stations are mainly found in the Northwest, West and Southwest climate regions.

  18. Community detection in sequence similarity networks based on attribute clustering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhary, Janamejaya; Loeffler, Frank E.; Smith, Jeremy C.

    Networks are powerful tools for the presentation and analysis of interactions in multi-component systems. A commonly studied mesoscopic feature of networks is their community structure, which arises from grouping together similar nodes into one community and dissimilar nodes into separate communities. In this paper, the community structure of protein sequence similarity networks is determined with a new method: Attribute Clustering Dependent Communities (ACDC). Sequence similarity has hitherto typically been quantified by the alignment score or its expectation value. However, pair alignments with the same score or expectation value cannot thus be differentiated. To overcome this deficiency, the method constructs, for pair alignments, an extended alignment metric, the link attribute vector, which includes the score and other alignment characteristics. Rescaling components of the attribute vectors qualitatively identifies a systematic variation of sequence similarity within protein superfamilies. The problem of community detection is then mapped to clustering the link attribute vectors, selection of an optimal subset of links, and community structure refinement based on the partition density of the network. ACDC-predicted communities are found to be in good agreement with gold standard sequence databases for which the "ground truth" community structures (or families) are known. ACDC is therefore a community detection method for sequence similarity networks based entirely on pair similarity information. A serial implementation of ACDC is available from https://cmb.ornl.gov/resources/developments

  19. A Case Study on Attribute Recognition of Heated Metal Mark Image Using Deep Convolutional Neural Networks.

    PubMed

    Mao, Keming; Lu, Duo; E, Dazhi; Tan, Zhenhua

    2018-06-07

    Heated metal marks are an important trace for identifying the cause of a fire. However, traditional methods mainly rely on knowledge of physics and chemistry for qualitative analysis, which makes this still a challenging problem. This paper presents a case study on attribute recognition of heated metal mark images using computer vision and machine learning technologies. The proposed work is composed of three parts. First, the material is generated: according to national standards, actual needs and feasibility, seven attributes are selected for research; data generation and organization are conducted, and a small benchmark dataset is constructed. A recognition model is then implemented: feature representation and classifier construction methods are introduced based on deep convolutional neural networks. Finally, an experimental evaluation is carried out. Multi-aspect testing is performed with various model structures, data augmentations, training modes, optimization methods and batch sizes. The influence of parameters, recognition efficiency and execution time are also analyzed. The results show that with a fine-tuned model, the recognition rates for the attributes metal type, heating mode, heating temperature, heating duration, cooling mode, placing duration and relative humidity are 0.925, 0.908, 0.835, 0.917, 0.928, 0.805 and 0.92, respectively. The proposed method recognizes the attributes of heated metal marks effectively and can be used in practical applications.
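
    A minimal sketch of the fine-tuning setup implied above for a single attribute (metal type), assuming a torchvision ResNet-18 backbone and a synthetic mini-batch; the paper's actual architecture, dataset format, and seven-attribute heads are not reproduced.

    ```python
    import torch
    import torch.nn as nn
    from torchvision import models

    num_metal_types = 3                                            # hypothetical class count
    model = models.resnet18(weights=None)                          # swap in ResNet18_Weights.DEFAULT to fine-tune from pretrained weights
    model.fc = nn.Linear(model.fc.in_features, num_metal_types)    # replace head for the attribute

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    images = torch.randn(8, 3, 224, 224)                           # stand-in for mark images
    labels = torch.randint(0, num_metal_types, (8,))

    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)                        # one training step
    loss.backward()
    optimizer.step()
    print(float(loss))
    ```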

  20. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has made it over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes the natural and social attributes of over-exploitation areas, and expounds their evaluation methods, including single-factor evaluation, multi-factor system analysis and numerical methods. The different methods are also compared and analyzed. Taking Northern Weifang as an example, the paper then illustrates the practicality of these appraisal methods.

  1. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, the ms-alone, a python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis identified only two different metabolic patterns - a cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to the cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved the dependence on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
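
    A minimal sketch of whole-spectrum normalization followed by PCA, loosely mirroring the workflow above. The spectra are synthetic; the ms-alone and multiMS-toolbox file formats and the peak-registration steps are omitted.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    spectra = rng.gamma(2.0, 1.0, size=(12, 5000))     # 12 spectra x 5000 m/z bins

    tic = spectra.sum(axis=1, keepdims=True)           # total signal per spectrum
    normalized = spectra / tic                         # whole-spectrum normalization

    pca = PCA(n_components=2)
    scores = pca.fit_transform(normalized)
    print(pca.explained_variance_ratio_)               # variance captured by PC1 and PC2
    print(scores[:3])                                  # coordinates used to separate metabolic patterns
    ```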

  2. Impact of Emissions and Long-Range Transport on Multi-Decadal Aerosol Trends: Implications for Air Quality and Climate

    NASA Technical Reports Server (NTRS)

    Chin, Mian

    2012-01-01

    We present a global model analysis of the impact of long-range transport and anthropogenic emissions on the aerosol trends in the major pollution regions in the northern hemisphere and in the Arctic in the past three decades. We will use the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model to analyze the multi-spatial and temporal scale data, including observations from Terra, Aqua, and CALIPSO satellites and from the long-term surface monitoring stations. We will analyze the source attribution (SA) and source-receptor (SR) relationships in North America, Europe, East Asia, South Asia, and the Arctic at the surface and free troposphere and establish the quantitative linkages between emissions from different source regions. We will discuss the implications for regional air quality and climate change.

  3. Cost-utility analysis of the National truth campaign to prevent youth smoking.

    PubMed

    Holtgrave, David R; Wunderink, Katherine A; Vallone, Donna M; Healton, Cheryl G

    2009-05-01

    In 2005, the American Journal of Public Health published an article that indicated that 22% of the overall decline in youth smoking that occurred between 1999 and 2002 was directly attributable to the truth social marketing campaign launched in 2000. A remaining key question about the truth campaign is whether the economic investment in the program can be justified by the public health outcomes; that question is examined here. Standard methods of cost and cost-utility analysis were employed in accordance with the U.S. Panel on Cost-Effectiveness in Health and Medicine; a societal perspective was employed. During 2000-2002, expenditures totaled just over $324 million to develop, deliver, evaluate, and litigate the truth campaign. The base-case cost-utility analysis result indicates that the campaign was cost saving; it is estimated that the campaign recouped its costs and that just under $1.9 billion in medical costs was averted for society. Sensitivity analysis indicated that the basic determination of cost effectiveness for this campaign is robust to substantial variation in input parameters. This study suggests that the truth campaign not only markedly improved the public's health but did so in an economically efficient manner.

  4. A combined approach for the attribution of handwriting: the case of Antonio Stradivari's manuscripts

    NASA Astrophysics Data System (ADS)

    Fichera, Giusj Valentina; Dondi, Piercarlo; Licchelli, Maurizio; Lombardi, Luca; Ridolfi, Stefano; Malagodi, Marco

    2016-11-01

    Numerous artefacts from Antonio Stradivari's workshop are currently preserved in the "Museo del Violino" (Museum of the Violin) in Cremona, Italy. A large number of them are paper models containing instructions and technical notes by the great violin maker. After his death, the collection passed through several owners, and new annotations, sometimes imitating Stradivari's handwriting, were added to the original ones, creating problems of authenticity. The attribution of these relics is a complex task and, until now, only a small part of them has been examined by palaeographers. This paper introduces a multi-analytical approach able to facilitate the study of handwriting in manuscripts through the combined use of image processing and X-ray fluorescence spectroscopy: the former provides a fast and automatic screening of documents; the latter allows the chemical composition of the inks to be analysed. For our tests, 17 paper relics, dated between 1684 and 1729, were chosen. Palaeographic analysis was used as reference. The results obtained showed the validity of the combined approach proposed herein: the two techniques proved to be complementary and useful to clarify the attribution of different pieces of handwriting.

  5. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

    A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment that is able to address uncertainty and deal with different levels of precision. The method is based on qualitative reasoning, an artificial intelligence technique, for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach that has previously been applied to the wind farm location problem; this approach is an outranking method based on Condorcet's original method. The results obtained by both approaches are analysed and their performance in selecting the wind farm location is compared across the aggregation procedures. Although the results show that both methods lead to similar rankings of the alternatives, the study highlights their respective advantages and drawbacks.

  6. Health Care Utilization and Expenditures Attributable to Cigar Smoking Among US Adults, 2000-2015.

    PubMed

    Wang, Yingning; Sung, Hai-Yen; Yao, Tingting; Lightwood, James; Max, Wendy

    Cigar use in the United States is a growing public health concern because of its increasing popularity. We estimated health care utilization and expenditures attributable to cigar smoking among US adults aged ≥35. We analyzed data on 84 178 adults using the 2000, 2005, 2010, and 2015 National Health Interview Surveys. We estimated zero-inflated Poisson (ZIP) regression models on hospital nights, emergency department (ED) visits, physician visits, and home-care visits as a function of tobacco use status (current sole cigar smokers, who smoke cigars only; current poly cigar smokers, who smoke cigars and smoke cigarettes or use smokeless tobacco; former sole cigar smokers, who used to smoke cigars only; former poly cigar smokers, who used to smoke cigars and smoke cigarettes or use smokeless tobacco; other tobacco users, who ever smoked cigarettes and used smokeless tobacco but not cigars; and never tobacco users, who never smoked cigars, smoked cigarettes, or used smokeless tobacco) and other covariates. We calculated health care utilization attributable to current and former sole cigar smoking based on the estimated ZIP models, and then we calculated total health care expenditures attributable to cigar smoking. Current and former sole cigar smoking was associated with excess annual utilization of 72 137 hospital nights, 32 748 ED visits, and 420 118 home-care visits. Annual health care expenditures attributable to sole cigar smoking were $284 million ($625 per sole cigar smoker), and total annual health care expenditures attributable to sole and poly cigar smoking were $1.75 billion. Comprehensive tobacco control policies and interventions are needed to reduce cigar smoking and the associated health care burden.
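
    A minimal sketch of a zero-inflated Poisson regression of a utilization count on cigar-smoking status, assuming synthetic data and statsmodels; the survey weights and the full covariate set from the study are omitted.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(2)
    n = 2000
    current_sole_cigar = rng.integers(0, 2, n)         # 1 = current sole cigar smoker
    age = rng.integers(35, 85, n)

    lam = np.exp(-1.0 + 0.4 * current_sole_cigar + 0.01 * (age - 35))
    visits = np.where(rng.random(n) < 0.5, 0, rng.poisson(lam))   # excess zeros

    X = sm.add_constant(np.column_stack([current_sole_cigar, age]))
    model = ZeroInflatedPoisson(visits, X, exog_infl=np.ones((n, 1)))
    result = model.fit(disp=False)
    print(result.params)    # the smoking-status coefficient drives the attributable utilization
    ```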

  7. Buildings Change Detection Based on Shape Matching for Multi-Resolution Remote Sensing Imagery

    NASA Astrophysics Data System (ADS)

    Abdessetar, M.; Zhong, Y.

    2017-09-01

    Buildings change detection has the ability to quantify the temporal effect on urban areas for urban evolution studies or damage assessment in disaster cases. In this context, change analysis might involve the utilization of the available satellite images with different resolutions for quick responses. In this paper, to avoid traditional methods that rely on image resampling and suffer from salt-and-pepper effects, building change detection based on shape matching is proposed for multi-resolution remote sensing images. Since an object's shape can be extracted from remote sensing imagery and the shapes of corresponding objects in multi-scale images are similar, it is practical to detect building changes in multi-scale imagery using shape analysis. Therefore, the proposed methodology can deal with different pixel sizes to identify new and demolished buildings in urban areas using the geometric properties of the objects of interest. After rectifying the desired multi-date and multi-resolution images by image-to-image registration with an optimal RMS value, object-based image classification is performed to extract building shapes from the images. Next, Centroid-Coincident Matching is conducted on the extracted building shapes, based on the Euclidean distance between shape centroids (from shape T0 to shape T1 and vice versa), in order to define corresponding building objects. Then, new and demolished buildings are identified from the obtained distances that are greater than the RMS value (no match at the same location).
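
    A minimal sketch of centroid-coincident matching with a Euclidean-distance tolerance, as described above; the centroid coordinates and the RMS threshold are hypothetical stand-ins for the classification and registration outputs.

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    centroids_t0 = np.array([[10.0, 12.0], [55.0, 40.0], [80.0, 90.0]])   # buildings at T0
    centroids_t1 = np.array([[10.4, 11.8], [120.0, 60.0]])                # buildings at T1
    rms = 2.5                                                             # registration RMS used as tolerance

    d = cdist(centroids_t0, centroids_t1)          # pairwise distances T0 -> T1
    demolished = np.where(d.min(axis=1) > rms)[0]  # T0 buildings with no T1 match
    new = np.where(d.min(axis=0) > rms)[0]         # T1 buildings with no T0 match
    print("demolished:", demolished, "new:", new)
    ```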

  8. Optimized scheme in coal-fired boiler combustion based on information entropy and modified K-prototypes algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Hui; Zhu, Hongxia; Cui, Yanfeng; Si, Fengqi; Xue, Rui; Xi, Han; Zhang, Jiayu

    2018-06-01

    An integrated combustion optimization scheme is proposed that jointly considers coal-fired boiler combustion efficiency and outlet NOx emissions. Continuous attribute discretization and reduction techniques are handled as optimization preparation by the E-Cluster and C_RED methods, in which the segmentation numbers need not be provided in advance and can adapt continuously to the data characteristics. In order to obtain multi-objective results with a clustering method for mixed data, a modified K-prototypes algorithm is then proposed. This algorithm can be divided into two stages: a K-prototypes algorithm with self-adaptation of the number of clusters, and clustering for multi-objective optimization. Field tests were carried out at a 660 MW coal-fired boiler to provide real data as a case study for controllable attribute discretization and reduction in the boiler system and for obtaining optimization parameters under the [max ηb, min yNOx] multi-objective rule.
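
    A minimal sketch of clustering mixed numeric and categorical boiler operating data with k-prototypes, the algorithm family the study modifies; the kmodes package and the synthetic features are assumptions, not the paper's implementation.

    ```python
    import numpy as np
    from kmodes.kprototypes import KPrototypes

    rng = np.random.default_rng(3)
    n = 200
    load = rng.normal(600, 30, n)                    # unit load, MW (continuous)
    o2 = rng.normal(3.5, 0.5, n)                     # flue-gas O2, % (continuous)
    mill_pattern = rng.choice(["A", "B", "C"], n)    # discretized/categorical attribute

    X = np.empty((n, 3), dtype=object)
    X[:, 0], X[:, 1], X[:, 2] = load, o2, mill_pattern

    kp = KPrototypes(n_clusters=4, init="Cao", random_state=0)
    labels = kp.fit_predict(X, categorical=[2])
    print(np.bincount(labels))   # candidate operating modes to screen for [max eta_b, min NOx]
    ```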

  9. Developing Access Control Model of Web OLAP over Trusted and Collaborative Data Warehouses

    NASA Astrophysics Data System (ADS)

    Fugkeaw, Somchart; Mitrpanont, Jarernsri L.; Manpanpanich, Piyawit; Juntapremjitt, Sekpon

    This paper proposes the design and development of a Role-based Access Control (RBAC) model for Single Sign-On (SSO) Web-OLAP queries spanning multiple data warehouses (DWs). The model is based on PKI Authentication and Privilege Management Infrastructure (PMI); it presents a binding model of RBAC authorization based on dimension privileges specified in attribute certificates (ACs) and user identification. In particular, the attribute mapping between DW user authentication and dimensional access privileges is illustrated. In our approach, we apply a multi-agent system to automate flexible and effective management of user authentication, role delegation and system accountability. Finally, the paper presents the prototype system A-COLD (Access Control of web-OLAP over multiple DWs), which incorporates the OLAP features and authentication and authorization enforcement in a multi-user, multi-data-warehouse environment.

  10. Valuing Treatments for Parkinson Disease Incorporating Process Utility: Performance of Best-Worst Scaling, Time Trade-Off, and Visual Analogue Scales.

    PubMed

    Weernink, Marieke G M; Groothuis-Oudshoorn, Catharina G M; IJzerman, Maarten J; van Til, Janine A

    2016-01-01

    The objective of this study was to compare treatment profiles including both health outcomes and process characteristics in Parkinson disease using best-worst scaling (BWS), time trade-off (TTO), and visual analogue scales (VAS). From a model comprising seven attributes with three levels each, six unique profiles were selected representing process-related factors and health outcomes in Parkinson disease. A Web-based survey (N = 613) was conducted in a general population to estimate process-related utilities using profile-based BWS (case 2), multiprofile-based BWS (case 3), TTO, and VAS. The rank order of the six profiles was compared, convergent validity among methods was assessed, and individual analysis focused on the differentiation between pairs of profiles with the methods used. The aggregated health-state utilities for the six treatment profiles were highly comparable for all methods and no rank reversals were identified. On the individual level, the convergent validity between all methods was strong; however, respondents differentiated less between the utilities of closely related treatment profiles with a VAS or TTO than with BWS. For TTO and VAS, this resulted in nonsignificant differences in mean utilities for closely related treatment profiles. This study suggests that all methods are equally able to measure process-related utility when the aim is to estimate the overall value of treatments. On an individual level, such as in shared decision making, BWS allows for better prioritization of treatment alternatives, especially if they are closely related. The decision-making problem and the need for explicit trade-off between attributes should determine the choice of a method. Copyright © 2016. Published by Elsevier Inc.

  11. Direct costs of unintended pregnancy in the Russian federation.

    PubMed

    Lowin, Julia; Jarrett, James; Dimova, Maria; Ignateva, Victoria; Omelyanovsky, Vitaly; Filonenko, Anna

    2015-02-01

    In 2010, almost every third pregnancy in Russia was terminated, indicating that unintended pregnancy (UP) is a public health problem. The aim of this study was to estimate the direct cost of UP to the healthcare system in Russia and the proportion attributable to using unreliable contraception. A cost model was built, adopting a generic payer perspective with a 1-year time horizon. The analysis cohort was defined as women of childbearing age between 18 and 44 years actively seeking to avoid pregnancy. Model inputs were derived from published sources or government statistics with a 2012 cost base. To estimate the number of UPs attributable to unreliable methods, the model combined annual typical-use failure rates and age-adjusted utilization for each contraceptive method. Published survey data were used to adjust the total cost of UP by the number of UPs that were mistimed rather than unwanted. Scenario analysis considered alternate allocation of methods to the reliable and unreliable categories and an estimate of the burden of UP in the target subgroup of women aged 18-29 years. The model estimated 1,646,799 UPs in the analysis cohort (women aged 18-44 years) with an associated annual cost of US$783 million. The model estimated 1,019,371 UPs in the target group of 18-29 years, of which 88 % were attributable to unreliable contraception. The total cost of UPs in the target group was estimated at approximately US$498 million, of which US$441 million could be considered attributable to the use of unreliable methods. The cost of UP attributable to use of unreliable contraception in Russia is substantial. Policies encouraging use of reliable contraceptive methods could reduce the burden of UP.
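
    A minimal sketch of the cost-model arithmetic described above: method-specific utilization is combined with annual typical-use failure rates to count unintended pregnancies and cost them. The cohort size, method shares, failure rates, and unit cost are illustrative placeholders, not the study's inputs.

    ```python
    cohort = 1_000_000          # women actively avoiding pregnancy (placeholder)
    methods = {
        # method: (share of cohort, annual typical-use failure rate)
        "none_or_traditional": (0.30, 0.24),
        "condom":              (0.40, 0.13),
        "pill":                (0.20, 0.07),
        "iud":                 (0.10, 0.01),
    }
    cost_per_up = 475.0         # average direct cost per unintended pregnancy, USD (placeholder)

    ups = {m: cohort * share * fail for m, (share, fail) in methods.items()}
    total_ups = sum(ups.values())
    print(ups)
    print(f"total UPs: {total_ups:,.0f}; direct cost: ${total_ups * cost_per_up / 1e6:,.1f} million")
    ```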

  12. SU-F-BRF-10: Deformable MRI to CT Validation Employing Same Day Planning MRI for Surrogate Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padgett, K; Stoyanova, R; Johnson, P

    Purpose: To compare rigid and deformable registrations of the prostate in the multi-modality setting (diagnostic-MRI to planning-CT) by utilizing a planning-MRI as a surrogate. The surrogate allows for the direct quantitative analysis which can be difficult in the multi-modality domain where intensity mapping differs. Methods: For ten subjects, T2 fast-spin-echo images were acquired at two different time points, the first several weeks prior to planning (diagnostic-MRI) and the second on the same day in which the planning CT was collected (planning-MRI). Significant effort in patient positioning and bowel/bladder preparation was undertaken to minimize distortion of the prostate in all datasets. The diagnostic-MRI was deformed to the planning-CT utilizing a commercially available deformable registration algorithm synthesized from local registrations. The deformed MRI was then rigidly aligned to the planning MRI which was used as the surrogate for the planning-CT. Agreement between the two MRI datasets was scored using intensity based metrics including Pearson correlation and normalized mutual information, NMI. A local analysis was performed by looking only within the prostate, proximal seminal vesicles, penile bulb and combined areas. A similar method was used to assess a rigid registration between the diagnostic-MRI and planning-CT. Results: Utilizing the NMI, the deformable registrations were superior to the rigid registrations in 9 of 10 cases demonstrating a 15.94% improvement (p-value < 0.001) within the combined area. The Pearson correlation showed similar results with the deformable registration superior in the same number of cases and demonstrating a 6.97% improvement (p-value < 0.011). Conclusion: Validating deformable multi-modality registrations using spatial intensity based metrics is difficult due to the inherent differences in intensity mapping. This population provides an ideal testing ground for MRI to CT deformable registrations by obviating the need for multi-modality comparisons which are inherently more challenging. Deformable registrations generated in this work significantly outperformed rigid alignments. Research reported in this abstract was supported by the NIH National Cancer Institute R21CA153826 “MRI-Guided Radiotherapy and Biomarkers for Prostate Cancer” and Bankhead-Coley Cancer Research Program 10BT-03 “MRI-Guided Radiotherapy and Biomarkers for Prostate Cancer”.
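
    A minimal sketch of an intensity-based agreement score (normalized mutual information over binned intensities) within a region of interest, the kind of metric used above to compare the deformed MRI with the planning-MRI surrogate; the arrays and mask are synthetic stand-ins.

    ```python
    import numpy as np
    from sklearn.metrics import normalized_mutual_info_score

    rng = np.random.default_rng(4)
    deformed = rng.normal(100, 20, size=(64, 64))              # deformed diagnostic MRI (synthetic)
    planning = deformed + rng.normal(0, 5, size=(64, 64))      # planning-MRI surrogate (synthetic)
    roi = np.zeros((64, 64), dtype=bool)
    roi[16:48, 16:48] = True                                   # e.g. a prostate mask

    def nmi(a, b, bins=32):
        # bin intensities so the label-based NMI implementation applies
        qa = np.digitize(a, np.histogram_bin_edges(a, bins))
        qb = np.digitize(b, np.histogram_bin_edges(b, bins))
        return normalized_mutual_info_score(qa, qb)

    print(nmi(deformed[roi], planning[roi]))
    ```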

  13. An informatics approach to assess pediatric pharmacotherapy: design and implementation of a hospital drug utilization system.

    PubMed

    Zuppa, Athena; Vijayakumar, Sundararajan; Jayaraman, Bhuvana; Patel, Dimple; Narayan, Mahesh; Vijayakumar, Kalpana; Mondick, John T; Barrett, Jeffrey S

    2007-09-01

    Drug utilization in the inpatient setting can provide a mechanism to assess drug prescribing trends, efficiency, and cost-effectiveness of hospital formularies and examine subpopulations for which prescribing habits may be different. Such data can be used to correlate trends with time-dependent or seasonal changes in clinical event rates or the introduction of new pharmaceuticals. It is now possible to provide a robust, dynamic analysis of drug utilization in a large pediatric inpatient setting through the creation of a Web-based hospital drug utilization system that retrieves source data from our accounting database. The production implementation provides a dynamic and historical account of drug utilization at the authors' institution. The existing application can easily be extended to accommodate a multi-institution environment. The creation of a national or even global drug utilization network would facilitate the examination of geographical and/or socioeconomic influences in drug utilization and prescribing practices in general.

  14. Optical Imaging and Control of Neurons

    NASA Astrophysics Data System (ADS)

    Song, Yoon-Kyu

    Although remarkable progress has been made in our understanding of the function, organization, and development of the brain by various approaches of modern science and technology, how the brain performs its marvelous function remains unsolved or incompletely understood. This is mainly attributed to the insufficient capability of currently available research tools and conceptual frameworks to deal with enormous complexity of the brain. Hence, in the last couple of decades, a significant effort has been made to crack the complexity of brain by utilizing research tools from diverse scientific areas. The research tools include the optical neurotechnology which incorporates the exquisite characteristics of optics, such as multi-parallel access and non-invasiveness, in sensing and stimulating the excitable membrane of a neuron, the basic functional unit of the brain. This chapter is aimed to serve as a short introduction to the optical neurotechnology for those who wish to use optical techniques as one of their brain research tools.

  15. Intelligent data management for real-time spacecraft monitoring

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.; Gasser, Les; Abramson, Bruce

    1992-01-01

    Real-time AI systems have begun to address the challenge of restructuring problem solving to meet real-time constraints by making key trade-offs that pursue less-than-optimal strategies with minimal impact on system goals. Several approaches for adapting to dynamic changes in system operating conditions are known. However, simultaneously adapting system decision criteria in a principled way has been difficult. Towards this end, a general technique for dynamically making such trade-offs using a combination of decision theory and domain knowledge has been developed. The paper discusses multi-attribute utility theory (MAUT), a decision-theoretic approach for making one-time decisions, describes dynamic trade-off evaluation as a knowledge-based extension of MAUT suitable for highly dynamic real-time environments, and provides an example of dynamic trade-off evaluation applied to a specific data management trade-off in a real-world spacecraft monitoring application.
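
    A minimal sketch of an additive MAUT evaluation of competing data-management actions, the kind of weighted trade-off that dynamic trade-off evaluation extends; the attributes, weights, and single-attribute utilities are illustrative, not taken from the spacecraft monitoring system.

    ```python
    attributes = ["timeliness", "completeness", "cpu_cost"]
    weights = {"timeliness": 0.5, "completeness": 0.3, "cpu_cost": 0.2}

    # single-attribute utilities in [0, 1] for two candidate actions
    actions = {
        "transmit_summary": {"timeliness": 0.9, "completeness": 0.4, "cpu_cost": 0.8},
        "transmit_full":    {"timeliness": 0.3, "completeness": 1.0, "cpu_cost": 0.4},
    }

    def total_utility(scores):
        return sum(weights[a] * scores[a] for a in attributes)

    ranked = sorted(actions, key=lambda name: total_utility(actions[name]), reverse=True)
    print({name: round(total_utility(s), 3) for name, s in actions.items()}, "->", ranked[0])
    ```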

  16. Gyroharmonic converter as a multi-megawatt RF driver for NLC: Beam source considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C.; Hirshfield, J.L.

    1995-06-01

    A multi-megawatt 14.28 GHz gyroharmonic converter under construction at Yale University depends critically on the parameters of an electron beam prepared using a cyclotron autoresonance accelerator (CARA). This paper extends prior analysis of CARA to find an approximate constant-of-the-motion, and to give limits to the beam energy from CARA that can be utilized in a harmonic converter. It is also shown that particles are strongly phase trapped during acceleration in CARA, and thus are insensitive to deviations from exact autoresonance. This fact greatly simplifies construction of the up-tapered guide magnetic field in the device, and augurs well for production of high-quality multi-megawatt beams using CARA. © 1995 American Institute of Physics.

  17. Suboptimal choice in rats: incentive salience attribution promotes maladaptive decision-making

    PubMed Central

    Chow, Jonathan J; Smith, Aaron P; Wilson, A George; Zentall, Thomas R; Beckmann, Joshua S

    2016-01-01

    Stimuli that are more predictive of subsequent reward also function as better conditioned reinforcers. Moreover, stimuli attributed with incentive salience function as more robust conditioned reinforcers. Some theories have suggested that conditioned reinforcement plays an important role in promoting suboptimal choice behavior, like gambling. The present experiments examined how different stimuli, those attributed with incentive salience versus those without, can function in tandem with stimulus-reward predictive utility to promote maladaptive decision-making in rats. One group of rats had lights associated with goal-tracking as the reward-predictive stimuli and another had levers associated with sign-tracking as the reward-predictive stimuli. All rats were first trained on a choice procedure in which the expected value across both alternatives was equivalent but differed in their stimulus-reward predictive utility. Next, the expected value across both alternatives was systematically changed so that the alternative with greater stimulus-reward predictive utility was suboptimal in regard to primary reinforcement. The results demonstrate that in order to obtain suboptimal choice behavior, incentive salience alongside strong stimulus-reward predictive utility may be necessary; thus, maladaptive decision-making can be driven more by the value attributed to stimuli imbued with incentive salience that reliably predict a reward rather than the reward itself. PMID:27993692

  18. Suboptimal choice in rats: Incentive salience attribution promotes maladaptive decision-making.

    PubMed

    Chow, Jonathan J; Smith, Aaron P; Wilson, A George; Zentall, Thomas R; Beckmann, Joshua S

    2017-03-01

    Stimuli that are more predictive of subsequent reward also function as better conditioned reinforcers. Moreover, stimuli attributed with incentive salience function as more robust conditioned reinforcers. Some theories have suggested that conditioned reinforcement plays an important role in promoting suboptimal choice behavior, like gambling. The present experiments examined how different stimuli, those attributed with incentive salience versus those without, can function in tandem with stimulus-reward predictive utility to promote maladaptive decision-making in rats. One group of rats had lights associated with goal-tracking as the reward-predictive stimuli and another had levers associated with sign-tracking as the reward-predictive stimuli. All rats were first trained on a choice procedure in which the expected value across both alternatives was equivalent but differed in their stimulus-reward predictive utility. Next, the expected value across both alternatives was systematically changed so that the alternative with greater stimulus-reward predictive utility was suboptimal in regard to primary reinforcement. The results demonstrate that in order to obtain suboptimal choice behavior, incentive salience alongside strong stimulus-reward predictive utility may be necessary; thus, maladaptive decision-making can be driven more by the value attributed to stimuli imbued with incentive salience that reliably predict a reward rather than the reward itself. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Analysis of antibiotic multi-resistant bacteria and resistance genes in the effluent of an intensive shrimp farm (Long An, Vietnam).

    PubMed

    Pham, Thi Thu Hang; Rossi, Pierre; Dinh, Hoang Dang Khoa; Pham, Ngoc Tu Anh; Tran, Phuong Anh; Ho, To Thi Khai Mui; Dinh, Quoc Tuc; De Alencastro, Luiz Felippe

    2018-05-15

    In Vietnam, intensive shrimp farms heavily rely on a wide variety of antibiotics (ABs) to treat animals or prevent disease outbreaks. The potential for the emergence of multi-resistant bacteria is high, with the concomitant contamination of adjacent natural aquatic habitats used for irrigation and drinking water, which in turn burdens the human health system. In the present study, quantification of AB multi-resistant bacteria was carried out in water and sediment samples from effluent channels connecting a shrimp farming area to the Vam Co River (Long An Province, Vietnam). Bacterial strains, e.g. Klebsiella pneumoniae and Aeromonas hydrophila, showing multi-resistance traits were isolated. Molecular biology analysis showed that these strains possessed from four to seven different AB resistance genes (ARGs) (e.g. sul1, sul2, qnrA, ermB, tetA, aac(6)lb, dfrA1, dfr12, dfrA5), conferring multidrug resistance capacity. Sequencing of plasmids present within these multi-resistant strains led to the identification of a total of forty-one resistance genes, targeting nine AB groups. qPCR analysis of the sul2 gene revealed the presence of high copy numbers in the effluent channel connecting to the Vam Co River. The results of the present study clearly indicated that multi-resistant bacteria present in intensive shrimp cultures may disseminate in the natural environment. This study offers a first insight into the impact of plasmid-borne ARGs and the related pathogenic bacteria that could emerge due to inappropriate antibiotic utilization in South Vietnam. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Biotechnology of non-Saccharomyces yeasts--the ascomycetes.

    PubMed

    Johnson, Eric A

    2013-01-01

    Saccharomyces cerevisiae and several other yeast species are among the most important groups of biotechnological organisms. S. cerevisiae and closely related ascomycetous yeasts are the major producers of biotechnology products worldwide, exceeding other groups of industrial microorganisms in productivity and economic revenues. Traditional industrial attributes of the S. cerevisiae group include their primary roles in food fermentations such as beers, cider, wines, sake, distilled spirits, bakery products, cheese, sausages, and other fermented foods. Other long-standing industrial processes involving S. cerevisiae yeasts are production of fuel ethanol, single-cell protein (SCP), feeds and fodder, industrial enzymes, and small molecular weight metabolites. More recently, non-Saccharomyces yeasts (non-conventional yeasts) have been utilized as industrial organisms for a variety of biotechnological roles. Non-Saccharomyces yeasts are increasingly being used as hosts for expression of proteins, biocatalysts and multi-enzyme pathways for the synthesis of fine chemicals and small molecular weight compounds of medicinal and nutritional importance. Non-Saccharomyces yeasts also have important roles in agriculture as agents of biocontrol, bioremediation, and as indicators of environmental quality. Several of these products and processes have reached commercial utility, while others are in advanced development. The objective of this mini-review is to describe processes currently used by industry and those in developmental stages and close to commercialization primarily from non-Saccharomyces yeasts with an emphasis on new opportunities. The utility of S. cerevisiae in heterologous production of selected products is also described.

  1. NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2017-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  2. NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  3. NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  4. NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2017-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  5. NDARC NASA Design and Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne R.

    2009-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter; tandem helicopter; coaxial helicopter; and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  6. NDARC - NASA Design and Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2015-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  7. NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.

  8. Cost analysis of neonatal and pediatric parenteral nutrition in Europe: a multi-country study.

    PubMed

    Walter, E; Liu, F X; Maton, P; Storme, T; Perrinet, M; von Delft, O; Puntis, J; Hartigan, D; Dragosits, A; Sondhi, S

    2012-05-01

    Parenteral nutrition (PN) is critical in neonatal and pediatric care for patients unable to tolerate enteral feeding. This study assessed the total costs of compounding PN therapy for neonates, infants and children. Face-to-face and telephone interviews were conducted in 12 hospitals across four European countries (Belgium, France, Germany and UK) to collect information on resources utilized to compound PN, including nutrients, staff time, equipment cost and supplies. A bottom-up cost model was constructed to assess total costs of PN therapy by assigning monetary values to the resource utilization using published list prices and interview data. A total of 49,922 PN bags per year were used to treat 4295 neonatal and pediatric patients among these hospitals. The daily total costs of one compounded PN bag for neonates in the 12 hospitals across the four countries equalled euro 55.16 (Belgium euro 53.26, France euro 46.23, Germany euro 64.05, UK £37.43/euro 42.86). Overall, nutrients accounted for 25% of total costs, supplies 18%, wages 54% and equipment 3%. Average costs per bag for infants <2 years were euro 84.52 (euro 74.65 in Belgium, euro 83.84 in France, euro 92.70 in Germany and £52.63/euro 60.26 in the UK), and for children 2-18 years euro 118.02 (euro 93.85 in Belgium, euro 121.35 in France, euro 124.54 in Germany and £69.49/euro 79.56 in the UK), of which 63% is attributable to nutrients and 28% to wages. The data indicated that PN costs differ among countries and a major proportion was due to staff time (£1 = euro 1.144959).
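
    A minimal sketch of the bottom-up cost summation used above, where nutrients, supplies, wages, and equipment are valued per compounded bag; the unit costs are placeholders chosen to mirror the reported neonatal shares, not the country-specific study inputs.

    ```python
    components = {          # euro per neonatal PN bag (placeholders)
        "nutrients": 13.80,
        "supplies":   9.90,
        "wages":     29.80,
        "equipment":  1.66,
    }
    total = sum(components.values())
    shares = {k: round(100 * v / total) for k, v in components.items()}
    print(f"total per bag: euro {total:.2f}", shares)   # shares roughly mirror the reported 25/18/54/3 split
    ```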

  9. Methods used to parameterize the spatially-explicit components of a state-and-transition simulation model

    USGS Publications Warehouse

    Sleeter, Rachel; Acevedo, William; Soulard, Christopher E.; Sleeter, Benjamin M.

    2015-01-01

    Spatially-explicit state-and-transition simulation models of land use and land cover (LULC) increase our ability to assess regional landscape characteristics and associated carbon dynamics across multiple scenarios. By characterizing appropriate spatial attributes such as forest age and land-use distribution, a state-and-transition model can more effectively simulate the pattern and spread of LULC changes. This manuscript describes the methods and input parameters of the Land Use and Carbon Scenario Simulator (LUCAS), a customized state-and-transition simulation model utilized to assess the relative impacts of LULC on carbon stocks for the conterminous U.S. The methods and input parameters are spatially explicit and describe initial conditions (strata, state classes and forest age), spatial multipliers, and carbon stock density. Initial conditions were derived from harmonization of multi-temporal data characterizing changes in land use as well as land cover. Harmonization combines numerous national-level datasets through a cell-based data fusion process to generate maps of primary LULC categories. Forest age was parameterized using data from the North American Carbon Program and spatially-explicit maps showing the locations of past disturbances (i.e. wildfire and harvest). Spatial multipliers were developed to spatially constrain the location of future LULC transitions. Based on distance-decay theory, maps were generated to guide the placement of changes related to forest harvest, agricultural intensification/extensification, and urbanization. We analyze the spatially-explicit input parameters with a sensitivity analysis, by showing how LUCAS responds to variations in the model input. This manuscript uses Mediterranean California as a regional subset to highlight local to regional aspects of land change, which demonstrates the utility of LUCAS at many scales and applications.

  10. Eliciting older people's preferences for exercise programs: a best-worst scaling choice experiment.

    PubMed

    Franco, Marcia R; Howard, Kirsten; Sherrington, Catherine; Ferreira, Paulo H; Rose, John; Gomes, Juliana L; Ferreira, Manuela L

    2015-01-01

    What relative value do older people with a previous fall or mobility-related disability attach to different attributes of exercise? Prospective, best-worst scaling study. Two hundred and twenty community-dwelling people, aged 60 years or older, who presented with a previous fall or mobility-related disability. Online or face-to-face questionnaire. Utility values for different exercise attributes and levels. The utility levels were calculated by asking participants to select the attribute that they considered to be the best (ie, they were most likely to want to participate in programs with this attribute) and worst (ie, least likely to want to participate). The attributes included were: exercise type; time spent on exercise per day; frequency; transport type; travel time; out-of-pocket costs; reduction in the chance of falling; and improvement in the ability to undertake tasks inside and outside of home. The attributes of exercise programs with the highest utility values were: home-based exercise and no need to use transport, followed by an improvement of 60% in the ability to do daily tasks at home, no costs, and decreasing the chances of falling to 0%. The attributes with the lowest utility were travel time of 30 minutes or more and out-of-pocket costs of AUD50 per session. The type of exercise, travel time and costs are more highly valued by older people than the health benefits. These findings suggest that physical activity engagement strategies need to go beyond education about health benefits and focus on improving accessibility to exercise programs. Exercise that can be undertaken at or close to home without any cost is most likely to be taken up by older people with past falls and/or mobility-related disability. Copyright © 2014 Australian Physiotherapy Association. Published by Elsevier B.V. All rights reserved.

  11. Potentiality Prediction of Electric Power Replacement Based on Power Market Development Strategy

    NASA Astrophysics Data System (ADS)

    Miao, Bo; Yang, Shuo; Liu, Qiang; Lin, Jingyi; Zhao, Le; Liu, Chang; Li, Bin

    2017-05-01

    The application of electric power replacement plays an important role in promoting energy conservation and emission reduction in our country. To exploit the potential for regional electric power replacement, regional GDP (gross domestic product) and energy consumption are taken as potentiality evaluation indicators. Principal component factors are extracted with PCA (principal component analysis) and used to assess the overall electric power replacement potential across the country's regions; a single region is then taken as the research object, and its electric power replacement potential is defined and quantified. An analytical model for multi-scenario electric power replacement potential is developed, and energy consumption is forecast with the grey prediction model. Together, these elements are used to predict the amount of electric power replacement potential under multiple scenarios.
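
    As a point of reference, the grey prediction model most often used for short series of this kind is the GM(1,1) form; the sketch below assumes that variant and an invented consumption series rather than the paper's regional data.

    # Illustrative GM(1,1) grey prediction sketch for a short consumption series.
    import numpy as np

    x0 = np.array([102.0, 108.0, 117.0, 125.0, 136.0])   # observed series (invented)
    x1 = np.cumsum(x0)                                    # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])                         # background values

    # Least-squares estimate of the development coefficient a and grey input b:
    # x0[k] = -a * z1[k] + b,  k = 1..n-1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

    def predict(k):
        """Predicted original-series value at index k (k = 0 is the first point)."""
        if k == 0:
            return x0[0]
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
        return x1_hat - x1_prev

    print("one-step-ahead forecast:", round(predict(len(x0)), 1))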

  12. Delineation of geochemical anomalies based on stream sediment data utilizing fractal modeling and staged factor analysis

    NASA Astrophysics Data System (ADS)

    Afzal, Peyman; Mirzaei, Misagh; Yousefi, Mahyar; Adib, Ahmad; Khalajmasoumi, Masoumeh; Zarifi, Afshar Zia; Foster, Patrick; Yasrebi, Amir Bijan

    2016-07-01

    Recognition of significant geochemical signatures and separation of geochemical anomalies from background are critical issues in interpretation of stream sediment data to define exploration targets. In this paper, we used staged factor analysis in conjunction with the concentration-number (C-N) fractal model to generate exploration targets for prospecting Cr and Fe mineralization in Balvard area, SE Iran. The results show coexistence of derived multi-element geochemical signatures of the deposit-type sought and ultramafic-mafic rocks in the NE and northern parts of the study area indicating significant chromite and iron ore prospects. In this regard, application of staged factor analysis and fractal modeling resulted in recognition of significant multi-element signatures that have a high spatial association with host lithological units of the deposit-type sought, and therefore, the generated targets are reliable for further prospecting of the deposit in the study area.
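
    The concentration-number (C-N) fractal model works by plotting, on log-log axes, each concentration against the number of samples at or above it; straight-line segments and their breakpoints separate background from anomalous populations. The sketch below fits only a single segment to synthetic Cr values, as a hedged illustration of the mechanics rather than the paper's multi-segment analysis.

    # Illustrative single-segment concentration-number (C-N) fractal fit.
    import numpy as np

    cr_ppm = np.sort(np.random.default_rng(1).lognormal(mean=5.0, sigma=0.6, size=500))
    cum_number = np.arange(len(cr_ppm), 0, -1)           # N(>= concentration)

    log_c, log_n = np.log10(cr_ppm), np.log10(cum_number)
    slope, intercept = np.polyfit(log_c, log_n, 1)       # power-law fit on log-log axes
    print(f"fractal dimension estimate (negative slope): {-slope:.2f}")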

  13. Experimental analysis of multi-attribute decision-making based on Atanassov intuitionistic fuzzy sets: a discussion of anchor dependency and accuracy functions

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Yu

    2012-06-01

    This article presents a useful method for relating anchor dependency and accuracy functions to multiple attribute decision-making (MADM) problems in the context of Atanassov intuitionistic fuzzy sets (A-IFSs). Considering anchored judgement with displaced ideals and solution precision with minimal hesitation, several auxiliary optimisation models have been proposed to obtain the optimal weights of the attributes and to acquire the corresponding TOPSIS (the technique for order preference by similarity to the ideal solution) index for alternative rankings. Aside from the TOPSIS index, as a decision-maker's personal characteristics and own perception of self may also influence the direction in the axiom of choice, the evaluation of alternatives is conducted based on distances of each alternative from the positive and negative ideal alternatives, respectively. This article originates from Li's [Li, D.-F. (2005), 'Multiattribute Decision Making Models and Methods Using Intuitionistic Fuzzy Sets', Journal of Computer and System Sciences, 70, 73-85] work, which is a seminal study of intuitionistic fuzzy decision analysis using deduced auxiliary programming models, and deems it a benchmark method for comparative studies on anchor dependency and accuracy functions. The feasibility and effectiveness of the proposed methods are illustrated by a numerical example. Finally, a comparative analysis is illustrated with computational experiments on averaging accuracy functions, TOPSIS indices, separation measures from positive and negative ideal alternatives, consistency rates of ranking orders, contradiction rates of the top alternative and average Spearman correlation coefficients.
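
    For orientation only, the sketch below shows the basic crisp TOPSIS ranking idea (normalise, weight, measure distances to the positive and negative ideal solutions, rank by relative closeness). It is not the article's intuitionistic fuzzy formulation, and the decision matrix and weights are invented.

    # Minimal crisp TOPSIS sketch on an invented decision matrix (benefit attributes).
    import numpy as np

    X = np.array([[7.0, 9.0, 9.0],       # rows: alternatives, cols: attributes
                  [8.0, 7.0, 8.0],
                  [9.0, 6.0, 7.0]])
    w = np.array([0.5, 0.3, 0.2])         # attribute weights (assumed)

    R = X / np.linalg.norm(X, axis=0)     # vector normalisation
    V = R * w                             # weighted normalised matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)

    d_plus = np.linalg.norm(V - ideal, axis=1)    # distance to positive ideal
    d_minus = np.linalg.norm(V - anti, axis=1)    # distance to negative ideal
    closeness = d_minus / (d_plus + d_minus)      # TOPSIS index

    print("ranking (best first):", np.argsort(-closeness) + 1)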

  14. The STAMPEDE trial: paradigm-changing data through innovative trial design.

    PubMed

    Carthon, Bradley C; Antonarakis, Emmanuel S

    2016-09-01

    Despite the numerous regulatory approvals for prostate cancer, metastatic prostate cancer remains a huge burden for men worldwide. In an exciting development, James et al. recently published data from the Systemic Therapy in Advanced or Metastatic Prostate Cancer: Evaluation of Drug Efficacy: a multi-stage multi-arm randomised controlled trial (STAMPEDE). This is an innovative multi-arm multi-stage (MAMS) trial that has utilized one control arm and several comparator arms in order to provide evidence for the inclusion of therapies beyond standard androgen deprivation alone. The patient population included: (I) men with high-risk, non-metastatic, node-negative disease; (II) men with distant-metastatic or node-positive disease; and (III) men with prostate cancer previously treated by prostatectomy or definitive radiotherapy, presenting with relapse. Men were to continue androgen deprivation for at least 2 years. The current data published by this group support earlier results and provide additional evidence that docetaxel utilized in an up-front fashion provides a survival benefit in men with hormone-sensitive metastatic prostate cancer. Moreover, the initial results from STAMPEDE show how therapies without a demonstrated survival benefit can be efficiently excluded from further study once the likelihood of a benefit is ruled out by a predetermined analysis. In this piece, we will review the STAMPEDE data, contrast it with existing results, and provide our perspectives on how this will affect future trial conduct in the field of prostate cancer.

  15. A cloud model based multi-attribute decision making approach for selection and evaluation of groundwater management schemes

    NASA Astrophysics Data System (ADS)

    Lu, Hongwei; Ren, Lixia; Chen, Yizhong; Tian, Peipei; Liu, Jia

    2017-12-01

    Due to the uncertainty (i.e., fuzziness, stochasticity and imprecision) that exists simultaneously in the groundwater remediation process, the accuracy of ranking results obtained by traditional methods has been limited. This paper proposes a cloud model based multi-attribute decision making framework (CM-MADM) with Monte Carlo for the selection of contaminated-groundwater remediation strategies. The cloud model is used to handle imprecise numerical quantities, which can describe the fuzziness and stochasticity of the information fully and precisely. In the proposed approach, the contaminant concentrations are aggregated via the backward cloud generator and the weights of attributes are calculated by employing the weight cloud module. A case study on remedial alternative selection for a contaminated site suffering from a 1,1,1-trichloroethylene leakage problem in Shanghai, China is conducted to illustrate the efficiency and applicability of the developed approach. In total, an attribute system consisting of ten attributes, including daily total pumping rate, total cost and cloud model based health risk, was used to evaluate each alternative through the developed method under uncertainty. Results indicated that A14 was the most preferred alternative for the 5-year remediation period, A5 for the 10-year, A4 for the 15-year and A6 for the 20-year.
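
    The backward cloud generator referred to above estimates a cloud model's three numerical characteristics (expectation Ex, entropy En, hyper-entropy He) from sample data; a common certainty-degree-free form is sketched below on invented concentration samples, not the Shanghai case-study data.

    # Illustrative backward cloud generator: estimate (Ex, En, He) from samples.
    import numpy as np

    samples = np.array([0.82, 0.91, 0.78, 0.88, 0.95, 0.84, 0.90, 0.79])  # invented

    ex = samples.mean()                                        # expectation
    en = np.sqrt(np.pi / 2.0) * np.abs(samples - ex).mean()    # entropy
    s2 = samples.var(ddof=1)                                   # sample variance
    he = np.sqrt(max(s2 - en ** 2, 0.0))                       # hyper-entropy

    print(f"Ex={ex:.3f}, En={en:.3f}, He={he:.3f}")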

  16. A Multi-Modal Active Learning Experience for Teaching Social Categorization

    ERIC Educational Resources Information Center

    Schwarzmueller, April

    2011-01-01

    This article details a multi-modal active learning experience to help students understand elements of social categorization. Each student in a group dynamics course observed two groups in conflict and identified examples of in-group bias, double-standard thinking, out-group homogeneity bias, law of small numbers, group attribution error, ultimate…

  17. A Theory of Competence in Anesthesiology: Faculty Perspectives on Resident Performance

    ERIC Educational Resources Information Center

    Street, John P.

    2009-01-01

    This study was conducted to develop a theory of resident competence in anesthesiology and was guided by this research question: from the perspective of anesthesiology faculty members, "What are the attributes and indicators of clinical competence in residents?" The author used a grounded theory approach for this multi-case, multi-site…

  18. FracPaQ: a MATLAB™ toolbox for the quantification of fracture patterns

    NASA Astrophysics Data System (ADS)

    Healy, David; Rizzo, Roberto; Farrell, Natalie; Watkins, Hannah; Cornwell, David; Gomez-Rivas, Enrique; Timms, Nick

    2017-04-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying crack and fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The methods presented are inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. New features in this release include multi-scale analyses based on a wavelet method to look for scale transitions, support for multi-colour traces in the input file processed as separate fracture sets, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern expressed as a 2nd rank crack tensor.

  19. Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan

    2015-05-01

    With continuing improvements in living standards and changes in dietary structure, consumer demand for better meat quality is rising steadily. Colour, pH value, and cooking loss are important quality attributes when evaluating meat, and simultaneous nondestructive detection of multiple meat quality parameters is highly desirable in the production and processing of meat and meat products. The objectives of this research were to compare the effectiveness of two bands for rapid, nondestructive and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected from a dual-band visible/near-infrared spectroscopy system which covered 350-1100 nm and 1000-2600 nm. Colour, pH value and cooking loss were then determined by standard methods as reference values. Standard normal variate transformation (SNVT) was employed to eliminate spectral noise. A spectrum connection method was put forward for effective integration of the dual-band spectra to make full use of all available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra, respectively. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction results, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054 and 0.8789 for L*, a*, b*, pH value and cooking loss, respectively. This is mainly because the dual-band spectrum provides more comprehensive information reflecting the quality attributes. Data fusion of the dual-band spectra could significantly improve prediction performance for pork quality parameters. The research also indicated that multi-band spectral information fusion has the potential to comprehensively evaluate other quality and safety attributes of pork.
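
    A hedged sketch of the PLSR modelling step with scikit-learn is given below; it uses synthetic spectra in place of the measured, SNVT-preprocessed reflectance data, and the response is an invented pH-like quantity.

    # Illustrative PLSR calibration/validation sketch on synthetic spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.random((60, 300))                 # 60 samples x 300 wavelengths (synthetic)
    y = X[:, 50] * 3.0 + X[:, 200] * 2.0 + rng.normal(0, 0.05, 60)  # pH-like proxy

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    pls = PLSRegression(n_components=5).fit(X_train, y_train)

    r = np.corrcoef(y_test, pls.predict(X_test).ravel())[0, 1]
    print(f"correlation coefficient of validation (Rv): {r:.3f}")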

  20. Physician Service Attribution Methods for Examining Provision of Low-Value Care

    PubMed Central

    Chang, Eva; Buist, Diana SM; Handley, Matthew; Pardee, Roy; Gundersen, Gabrielle; Reid, Robert J.

    2016-01-01

    Objectives: There has been significant research on provider attribution for quality and cost. Low-value care is an area of heightened focus, with little of the focus being on measurement; a key methodological decision is how to attribute delivered services and procedures. We illustrate the difference in relative and absolute physician- and panel-attributed services and procedures using overuse in cervical cancer screening. Study Design: A retrospective, cross-sectional study in an integrated health care system. Methods: We used 2013 physician-level data from Group Health Cooperative to calculate two utilization attributions: (1) panel attribution with the procedure assigned to the physician’s predetermined panel, regardless of who performed the procedure; and (2) physician attribution with the procedure assigned to the performing physician. We calculated the percentage of low-value cervical cancer screening tests and ranked physicians within the clinic using the two utilization attribution methods. Results: The percentage of low-value cervical cancer screening varied substantially between physician and panel attributions. Across the whole delivery system, median panel- and physician-attributed percentages were 15 percent and 10 percent, respectively. Among sampled clinics, panel-attributed percentages ranged between 10 percent and 17 percent, and physician-attributed percentages ranged between 9 percent and 13 percent. Within a clinic, median panel-attributed screening percentage was 17 percent (range 0 percent–27 percent) and physician-attributed percentage was 11 percent (range 0 percent–24 percent); physician rank varied by attribution method. Conclusions: The attribution method is an important methodological decision when developing low-value care measures since measures may ultimately have an impact on national benchmarking and quality scores. Cross-organizational dialogue and transparency in low-value care measurement will become increasingly important for all stakeholders. PMID:28203612

  1. Physician Service Attribution Methods for Examining Provision of Low-Value Care.

    PubMed

    Chang, Eva; Buist, Diana Sm; Handley, Matthew; Pardee, Roy; Gundersen, Gabrielle; Reid, Robert J

    2016-01-01

    There has been significant research on provider attribution for quality and cost. Low-value care is an area of heightened focus, with little of the focus being on measurement; a key methodological decision is how to attribute delivered services and procedures. We illustrate the difference in relative and absolute physician- and panel-attributed services and procedures using overuse in cervical cancer screening. A retrospective, cross-sectional study in an integrated health care system. We used 2013 physician-level data from Group Health Cooperative to calculate two utilization attributions: (1) panel attribution with the procedure assigned to the physician's predetermined panel, regardless of who performed the procedure; and (2) physician attribution with the procedure assigned to the performing physician. We calculated the percentage of low-value cervical cancer screening tests and ranked physicians within the clinic using the two utilization attribution methods. The percentage of low-value cervical cancer screening varied substantially between physician and panel attributions. Across the whole delivery system, median panel- and physician-attributed percentages were 15 percent and 10 percent, respectively. Among sampled clinics, panel-attributed percentages ranged between 10 percent and 17 percent, and physician-attributed percentages ranged between 9 percent and 13 percent. Within a clinic, median panel-attributed screening percentage was 17 percent (range 0 percent-27 percent) and physician-attributed percentage was 11 percent (range 0 percent-24 percent); physician rank varied by attribution method. The attribution method is an important methodological decision when developing low-value care measures since measures may ultimately have an impact on national benchmarking and quality scores. Cross-organizational dialogue and transparency in low-value care measurement will become increasingly important for all stakeholders.
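
    The two attribution rules can be contrasted directly on service-level data: the same set of tests is grouped once by the patient's assigned (panel) physician and once by the performing physician. The tiny table below is invented for demonstration only.

    # Illustrative pandas sketch: panel attribution vs performing-physician attribution.
    import pandas as pd

    tests = pd.DataFrame({
        "panel_physician":      ["A", "A", "B", "B", "C"],
        "performing_physician": ["A", "C", "B", "A", "C"],
        "low_value":            [1, 0, 1, 1, 0],   # 1 = low-value screening test
    })

    panel_pct = tests.groupby("panel_physician")["low_value"].mean() * 100
    performer_pct = tests.groupby("performing_physician")["low_value"].mean() * 100

    print("panel-attributed % low-value:\n", panel_pct.round(1), sep="")
    print("physician-attributed % low-value:\n", performer_pct.round(1), sep="")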

  2. Supervised Multi-Authority Scheme with Blind Signature for IoT with Attribute Based Encryption

    NASA Astrophysics Data System (ADS)

    Nissenbaum, O. V.; Ponomarov, K. Y.; Zaharov, A. A.

    2018-04-01

    This article proposes a three-side cryptographic scheme for verifying device attributes with a Supervisor and a Certification Authority (CA) for attribute-based encryption. Two options are suggested: using a message authentication code and using a digital signature. The first version is suitable for networks with one CA, and the second one for networks with several CAs, including dynamic systems. Also, augmenting the scheme with a blind signature is proposed to preserve the confidentiality of the device attributes from the CA. The introduction gives a definition and a brief historical overview of attribute-based encryption (ABE) and addresses the use of ABE in the Internet of Things.

  3. ADVANCED UTILITY SIMULATION MODEL, MULTI-PERIOD MULTI-STATE MODULE DESIGN DOCUMENTATION (VERSION 1.0)

    EPA Science Inventory

    The report is one of 11 in a series describing the initial development of the Advanced Utility Simulation Model (AUSM) by the Universities Research Group on Energy (URGE) and its continued development by the Science Applications International Corporation (SAIC) research team. The...

  4. A dielectrophoresis-impedance method for protein detection and analysis

    NASA Astrophysics Data System (ADS)

    Mohamad, Ahmad Sabry; Hamzah, Roszymah; Hoettges, Kai F.; Hughes, Michael Pycraft

    2017-01-01

    Dielectrophoresis (DEP) has increasingly been used for the assessment of the electrical properties of molecular scale objects including proteins, DNA, nanotubes and nanowires. However, whilst techniques have been developed for the electrical characterisation of frequency-dependent DEP response, biomolecular study is usually limited to observation using fluorescent markers, limiting its applicability as a characterisation tool. In this paper we present a label-free, impedance-based method of characterisation applied to the determination of the electrical properties of colloidal protein molecules, specifically Bovine Serum Albumin (BSA). By monitoring the impedance between electrodes as proteins collect, it is shown to be possible to observe multi-dispersion behaviour. A DEP dispersion exhibited at 400 kHz is attributable to the orientational dispersion of the molecule, whilst a second, higher-frequency dispersion is attributed to a Maxwell-Wagner type dispersion; changes in behaviour with medium conductivity suggest that this is strongly influenced by the electrical double layer surrounding the molecule.

  5. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    PubMed

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  6. Team Formation in Partially Observable Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Agogino, Adrian K.; Tumer, Kagan

    2004-01-01

    Sets of multi-agent teams often need to maximize a global utility rating the performance of the entire system in settings where a team cannot fully observe other teams' agents. Such limited observability hinders team members, who pursue their team utilities, from taking actions that also help maximize the global utility. In this article, we show how team utilities can be used in partially observable systems. Furthermore, we show how team sizes can be manipulated to provide the best compromise between having easy-to-learn team utilities and having them aligned with the global utility. The results show that optimally sized teams in a partially observable environment outperform one team in a fully observable environment by up to 30%.

  7. Neural Representations of Belief Concepts: A Representational Similarity Approach to Social Semantics

    PubMed Central

    Leshinskaya, Anna; Contreras, Juan Manuel; Caramazza, Alfonso; Mitchell, Jason P.

    2017-01-01

    Abstract The present experiment identified neural regions that represent a class of concepts that are independent of perceptual or sensory attributes. During functional magnetic resonance imaging scanning, participants viewed names of social groups (e.g. Atheists, Evangelicals, and Economists) and performed a one-back similarity judgment according to 1 of 2 dimensions of belief attributes: political orientation (Liberal to Conservative) or spiritualism (Spiritualist to Materialist). By generalizing across a wide variety of social groups that possess these beliefs, these attribute concepts did not coincide with any specific sensory quality, allowing us to target conceptual, rather than perceptual, representations. Multi-voxel pattern searchlight analysis was used to identify regions in which activation patterns distinguished the 2 ends of both dimensions: Conservative from Liberal social groups when participants focused on the political orientation dimension, and spiritual from Materialist groups when participants focused on the spiritualism dimension. A cluster in right precuneus exhibited such a pattern, indicating that it carries information about belief-attribute concepts and forms part of semantic memory—perhaps a component particularly concerned with psychological traits. This region did not overlap with the theory of mind network, which engaged nearby, but distinct, parts of precuneus. These findings have implications for the neural organization of conceptual knowledge, especially the understanding of social groups. PMID:28108495

  8. Medical Expenditures Attributable to Cerebral Palsy and Intellectual Disability among Medicaid-Enrolled Children

    ERIC Educational Resources Information Center

    Kancherla, Vijaya; Amendah, Djesika D.; Grosse, Scott D.; Yeargin-Allsopp, Marshalyn; Van Naarden Braun, Kim

    2012-01-01

    This study estimated medical expenditures attributable to cerebral palsy (CP) among children enrolled in Medicaid, stratified by the presence of co-occurring intellectual disability (ID), relative to children without CP or ID. The MarketScan® Medicaid Multi-State database was used to identify children with CP for 2003-2005 by using the…

  9. A Multi-Faceted Analysis of a New Therapeutic Model of Linking Appraisals to Affective Experiences.

    ERIC Educational Resources Information Center

    McCarthy, Christopher; And Others

    I. Roseman, M. Spindel, and P. Jose (1990) had previously demonstrated that specific appraisals of events led to discrete emotional responses, but this model has not been widely tested by other research teams using alternative research methods. The present study utilized four qualitative research methods, taught by Patti Lather at the 1994…

  10. Personnel Resource Allocation Strategies in a Time of Fiscal Stress: A Gap Analysis of Five Southern California Elementary Schools

    ERIC Educational Resources Information Center

    Araya, Saba Q.

    2013-01-01

    As pressure increases to ensure that limited resources are utilized as effectively as possible, funding adequacy remains a priority for all California public schools. The research was conducted through a multi-methods approach of principal interviews, site level resource allocation data, and overall student achievement on state assessments. The…

  11. Achievement Goal Questionnaire: Psychometric Properties and Gender Invariance in a Sample of Chinese University Students

    ERIC Educational Resources Information Center

    Xiao, Jing; Bai, Yu; He, Yini; McWhinnie, Chad M.; Ling, Yu; Smith, Hannah; Huebner, E. Scott

    2016-01-01

    The aim of this study was to test the gender invariance of the Chinese version of the Achievement Goal Questionnaire (AGQ-C) utilizing a sample of 1,115 Chinese university students. Multi-group confirmatory factor analysis supported the configural, metric, and scalar invariance of the AGQ-C across genders. Analyses also revealed that the latent…

  12. Sociocultural definitions of risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rayner, S.

    1990-10-01

    Public constituencies frequently are criticized by technical experts as being irrational in response to low-probability risks. This presentation argued that most people are concerned with a variety of risk attributes other than probability and that it is rather irrational to exclude these from the definition and analysis of technological risk. Risk communication, which is at the heart of the right-to-know concept, is described as the creation of shared meaning rather than the mere transmission of information. A case study of utilities, public utility commissions, and public interest groups illustrates how the diversity of institutional cultures in modern society leads to problems for the creation of shared meanings in establishing trust, distributing liability, and obtaining consent to risk. This holistic approach to risk analysis is most appropriate under conditions of high uncertainty and/or decision stakes. 1 fig., 5 tabs.

  13. The receiver operational characteristic for binary classification with multiple indices and its application to the neuroimaging study of Alzheimer's disease.

    PubMed

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2013-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
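
    The logical combination rules described above are straightforward to compute once each index has been binarised at a threshold; the sketch below reports sensitivity and specificity for "AND", "OR" and "at least 2 of 3" rules on synthetic indices and labels, not the AD study data.

    # Illustrative combination of binarised indices with logical rules.
    import numpy as np

    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, 200)                       # 1 = patient, 0 = control
    # Three noisy indices, each thresholded into a binary "positive" call.
    calls = np.stack([(labels + rng.normal(0, 0.8, 200)) > 0.5 for _ in range(3)])

    def sens_spec(positive, labels):
        sens = positive[labels == 1].mean()
        spec = (~positive[labels == 0]).mean()
        return sens, spec

    rules = {
        "AND (all 3)":   calls.all(axis=0),
        "OR (any of 3)": calls.any(axis=0),
        "at least 2":    calls.sum(axis=0) >= 2,
    }
    for name, positive in rules.items():
        s, p = sens_spec(positive, labels)
        print(f"{name}: sensitivity={s:.2f}, specificity={p:.2f}")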

  14. The Receiver Operational Characteristic for Binary Classification with Multiple Indices and Its Application to the Neuroimaging Study of Alzheimer’s Disease

    PubMed Central

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2014-01-01

    Given a single index, the receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performances in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, and clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes all available information. For a long time, a number of algorithmic/analytic approaches combining multiple indices have been widely used to simultaneously incorporate multiple sources. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without the “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis. PMID:23702553

  15. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    NASA Astrophysics Data System (ADS)

    Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.

    2015-11-01

    In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 10^6 points on a desktop computer, allowing for each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4) where the optical size, asymmetry factor and fluorescent measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen-Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP) where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution due to poor centroid definition and failure to assign particles to a cluster as a result of the subsampling and comparative attribution method employed by WASP. The methods used here allow for the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
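
    The best-performing recipe reported above (z-score normalisation followed by Ward-linkage agglomerative clustering) can be reproduced in outline with SciPy; the synthetic "particles" below stand in for WIBS optical size, asymmetry factor and fluorescence measurements.

    # Illustrative z-score + Ward-linkage hierarchical clustering sketch.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    rng = np.random.default_rng(0)
    # Two synthetic particle populations with different means (e.g. spores vs bacteria).
    X = np.vstack([rng.normal([2.0, 0.6, 50.0], 0.3, (100, 3)),
                   rng.normal([0.8, 0.9, 10.0], 0.3, (100, 3))])

    Z = linkage(zscore(X, axis=0), method="ward")    # Ward linkage on z-scored data
    clusters = fcluster(Z, t=2, criterion="maxclust")
    print("cluster sizes:", np.bincount(clusters)[1:])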

  16. Real-Time Analysis of a Sensor's Data for Automated Decision Making in an IoT-Based Smart Home.

    PubMed

    Khan, Nida Saddaf; Ghani, Sayeed; Haider, Sajjad

    2018-05-25

    IoT devices frequently generate large volumes of streaming data, and in order to take advantage of these data, their temporal patterns must be learned and identified. Streaming data analysis has become popular after being successfully used in many applications including forecasting electricity load, stock market prices, weather conditions, etc. Artificial Neural Networks (ANNs) have been successfully utilized to understand the interesting patterns/behaviors embedded in the data and to forecast future values. One such pattern is modelled and learned in the present study to identify the occurrence of a specific pattern in a Water Management System (WMS). This prediction aids an automated decision support system in switching OFF a hydraulic suction pump at the appropriate time. Three types of ANN, namely Multi-Input Multi-Output (MIMO), Multi-Input Single-Output (MISO), and Recurrent Neural Network (RNN), have been compared for multi-step-ahead forecasting on a sensor's streaming data. Experiments have shown that the RNN has the best performance among the three models and, based on its predictions, a system can be implemented to make the best decision with 86% accuracy.
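
    A hedged sketch of multi-step-ahead forecasting in the MISO style is shown below: a small feed-forward network maps a window of lagged values to the next value, and forecasts are produced recursively by feeding each prediction back in. The series is synthetic and the network is a stand-in for the paper's MIMO/MISO/RNN models.

    # Illustrative recursive multi-step-ahead forecast with a small MISO network.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 20 * np.pi, 1000)) + rng.normal(0, 0.05, 1000)

    lag = 10
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]

    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                         random_state=0).fit(X, y)

    # Recursive forecast: feed each prediction back in as the newest lag value.
    window = list(series[-lag:])
    forecast = []
    for _ in range(5):                          # 5 steps ahead
        nxt = model.predict(np.array(window[-lag:]).reshape(1, -1))[0]
        forecast.append(nxt)
        window.append(nxt)
    print("5-step-ahead forecast:", np.round(forecast, 3))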

  17. Multi-energy Coordinated Evaluation for Energy Internet

    NASA Astrophysics Data System (ADS)

    Jia, Dongqiang; Sun, Jian; Wang, Cunping; Hong, Xiao; Ma, Xiufan; Xiong, Wenting; Shen, Yaqi

    2017-05-01

    This paper reviews the current research status of multi-energy coordinated evaluation for the energy Internet. Taking the coordinated optimization effects of wind, solar and other energy sources into consideration, 17 evaluation indexes, such as the substitution coefficient of cooling, heat and power, the ratio of wind and solar energy, and the energy storage ratio, were designed from five aspects: acceptance of renewable energy, benefits of energy complementarity and substitution, peak-valley difference, degree of equipment utilization, and user needs. The article also attaches importance to the economic and social benefits of coordinating multiple energy sources. Ultimately, a comprehensive multi-energy coordination evaluation index system for the regional energy Internet was put forward covering four aspects (safe operation, coordination and optimization, economic benefits, and social benefits), and a comprehensive evaluation model was established. The model uses an optimal combination weighting method based on moment estimation together with the TOPSIS evaluation method, so both the subjective and objective weights of the indexes are considered and coordinated multi-energy evaluation is realized. Finally, the completeness of the index system and the validity of the evaluation method are verified through a case analysis.

  18. Pharmacists’ Opinions of the Value of CAPE Outcomes in Hiring Decisions

    PubMed Central

    Marsh, Wallace A.; Castleberry, Ashley N.; Kelley, Katherine A.; Boyce, Eric G.

    2017-01-01

    Objective. The Hiring Intent Reasoning Examination (HIRE) was designed to explore the utility of the CAPE 2013 outcomes attributes from the perspective of practicing pharmacists, examine how each attribute influences hiring decisions, and identify which of the attributes are perceived as most and least valuable by practicing pharmacists. Methods. An electronic questionnaire was developed and distributed to licensed pharmacists in four states to collect their opinions about 15 CAPE subdomains plus five additional business-related attributes. Respondents identified which attributes were necessary to be a good pharmacist, would impact hiring decisions, were most important to them, and were in short supply in the applicant pool. Data were analyzed using statistical analysis software to determine the relative importance of each to practicing pharmacists and various subsets of pharmacists. Results. The CAPE subdomains were considered necessary for most jobs by 51% or more of the 3723 respondents (range, 51% to 99%). The necessity for business-related attributes ranged from 21% to 92%. The percentage who would not hire an applicant who did not possess the attribute ranged from 2% to 71.5%; the percentage who considered the attribute most valuable ranged from 0.3% to 35%; and the percentage who felt the attribute was in short supply ranged from 5% to 36%. Opinions varied depending upon gender, practice setting and whether the pharmacist was an employee or employer. Conclusion. The results of this study can be used by faculty and administrators to inform curricular design and emphasis on CAPE domains and business-related education in pharmacy programs. PMID:29367774

  19. Geographic applications of ERTS-1 data to landscape change

    NASA Technical Reports Server (NTRS)

    Rehder, J. B.

    1973-01-01

    The analysis of landscape change requires large area coverage on a periodic basis in order to analyze aggregate changes over an extended period of time. To date, only the ERTS program can provide this capability. Three avenues of experimentation and analysis are being used in the investigation: (1) a multi-scale sampling procedure utilizing aircraft imagery for ground truth and control; (2) a densitometric and computer analytical experiment for the analysis of gray tone signatures, comparisons and ultimately for landscape change detection and monitoring; and (3) an ERTS image enhancement procedure for the detection and analysis of photomorphic regions.

  20. Multivariate analysis of fatty acid and biochemical constitutes of seaweeds to characterize their potential as bioresource for biofuel and fine chemicals.

    PubMed

    Verma, Priyanka; Kumar, Manoj; Mishra, Girish; Sahoo, Dinabandhu

    2017-02-01

    In the present study, thirty seaweeds from Indian coasts were analyzed for their biochemical components, including pigments, fatty acids and ash content. Multivariate analysis of the biochemical components and fatty acids was performed using principal component analysis (PCA) and agglomerative hierarchical clustering (AHC) to reveal chemotaxonomic relationships among the seaweeds. The overall analysis suggests that these seaweeds have multi-functional properties and can be utilized as a promising bioresource of proteins, lipids, pigments and carbohydrates for the food/feed and biofuel industries. Copyright © 2016. Published by Elsevier Ltd.
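
    The PCA-plus-AHC workflow described above has a common, compact form; the sketch below runs it on a made-up composition table (30 samples by 6 constituents), not the study's measurements.

    # Illustrative PCA ordination followed by agglomerative clustering.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import AgglomerativeClustering

    rng = np.random.default_rng(0)
    X = rng.random((30, 6))     # 30 "seaweeds" x 6 constituents (synthetic)

    X_std = StandardScaler().fit_transform(X)
    scores = PCA(n_components=2).fit_transform(X_std)                   # ordination
    groups = AgglomerativeClustering(n_clusters=3).fit_predict(X_std)   # AHC

    print("PC1/PC2 of first sample:", np.round(scores[0], 2))
    print("cluster sizes:", np.bincount(groups))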

  1. Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems

    DTIC Science & Technology

    2008-09-01

    divisions of the enterprise. Examples of the current I2 are: • a nightly feed of elearning information is captured through an automated and...standardized process throughout the enterprise and • the LMS has been integrated with SkillSoft, a third party elearning software system, (http...Command (JITC) is responsible to test all programs that utilize standard interfaces to specific global nets or systems. Many times programs that

  2. Automated quantitative muscle biopsy analysis system

    NASA Technical Reports Server (NTRS)

    Castleman, Kenneth R. (Inventor)

    1980-01-01

    An automated system to aid the diagnosis of neuromuscular diseases by producing fiber size histograms utilizing histochemically stained muscle biopsy tissue. Televised images of the microscopic fibers are processed electronically by a multi-microprocessor computer, which isolates, measures, and classifies the fibers and displays the fiber size distribution. The architecture of the multi-microprocessor computer, which is iterated to any required degree of complexity, features a series of individual microprocessors P_n, each receiving data from a shared memory M_(n-1) and outputting processed data to a separate shared memory M_(n+1) under control of a program stored in dedicated memory M_n.

  3. Service user involvement in mental health care: an evolutionary concept analysis.

    PubMed

    Millar, Samantha L; Chambers, Mary; Giles, Melanie

    2016-04-01

    Service user involvement is an evolving concept in the mental health-care literature. This study sought to explore and analyse the concept of service user involvement as used within the field of mental health care. An evolutionary concept analysis was conducted using a literature-based sample extracted from an electronic database search. One hundred and thirty-four papers met the inclusion criteria and were analysed to discover key attributes, antecedents and consequences of service user involvement and to produce a definition of the concept. Five key attributes of service user involvement within the context of mental health care were identified: a person-centred approach, informed decision making, advocacy, obtaining service user views and feedback and working in partnership. Clarity of the attributes and definition of the concept of service user involvement aims to promote understanding of the concept among key stakeholders including mental health professionals, service users and community and voluntary organizations. The findings of the research have utility in the areas of theory and policy development, research on service user involvement in mental health care and service user involvement in mental health practice. Directions for further research regarding the concept are identified. © 2015 John Wiley & Sons Ltd.

  4. Environmental and individual attributes associated with child maltreatment resulting in hospitalization or death.

    PubMed

    Thurston, Holly; Freisthler, Bridget; Bell, Janice; Tancredi, Daniel; Romano, Patrick S; Miyamoto, Sheridan; Joseph, Jill G

    2017-05-01

    Maltreatment continues to be a leading cause of death for young children. Researchers are beginning to uncover which neighborhood attributes may be associated with maltreatment outcomes. However, few studies have been able to explore these influences while controlling for individual family attributes, and none have been able to parse out the most severe outcomes-injuries resulting in hospitalization or death. This study utilizes a retrospective, case-control design on a dataset containing both individual and environmental level attributes of children who have been hospitalized or died due to maltreatment to explore the relative influence of attributes inside and outside the household walls. Binary conditional logistic regression was used to model the outcome as a function of the individual and environmental level predictors. Separate analyses also separated the outcome by manner of maltreatment: abuse or neglect. Finally, a sub-analysis included protective predictors representing access to supportive resources. Findings indicate that neighborhood attributes were similar for both cases and controls, except in the neglect only model, wherein impoverishment was associated with higher odds of serious maltreatment. Dense housing increased risk in all models except the neglect only model. In a sub-analysis, distance to Family Resource Centers was inversely related to serious maltreatment. In all models, variables representing more extreme intervention and/or removal of the victim and/or perpetrator from the home (foster care or criminal court involvement) were negatively associated with the risk of becoming a case. Medi-Cal insurance eligibility of a child was also negatively associated with becoming a case. Government interventions may be playing a critical role in child protection. More research is needed to ascertain how these interventions assert their influence. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Utilization of a Multi-Disciplinary Approach to Building Effective Command Centers: Process and Products

    DTIC Science & Technology

    2005-06-01

    cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,

  6. Multi-factor Analysis of Pre-control Fracture Simulations about Projectile Material

    NASA Astrophysics Data System (ADS)

    Wan, Ren-Yi; Zhou, Wei

    2016-05-01

    The study of pre-controlled fracture of projectile material helps to improve the effective fragmentation of the projectile metal and the material utilization rate. Fragment muzzle velocity and lethality can be affected by different explosive charges and methods of initiation. Finite element software can simulate the explosive rupture of a projectile with a pre-groove in the shell surface and analyze how typical node velocities change with time, providing a reference for the design and optimization of pre-controlled fragmentation.

  7. Engineering a multi-biofunctional composite using poly(ethylenimine) decorated graphene oxide for bone tissue regeneration

    NASA Astrophysics Data System (ADS)

    Kumar, Sachin; Raj, Shammy; Sarkar, Kishor; Chatterjee, Kaushik

    2016-03-01

    Toward preparing strong multi-biofunctional materials, poly(ethylenimine) (PEI) conjugated graphene oxide (GO_PEI) was synthesized using poly(acrylic acid) (PAA) as a spacer and incorporated in poly(ε-caprolactone) (PCL) at different fractions. GO_PEI significantly promoted the proliferation and formation of focal adhesions in human mesenchymal stem cells (hMSCs) on PCL. GO_PEI was highly potent in inducing stem cell osteogenesis leading to near doubling of alkaline phosphatase expression and mineralization over neat PCL with 5% filler content and was ~50% better than GO. Remarkably, 5% GO_PEI was as potent as soluble osteoinductive factors. Increased adsorption of osteogenic factors due to the amine and oxygen containing functional groups on GO_PEI augment stem cell differentiation. GO_PEI was also highly efficient in imparting bactericidal activity with 85% reduction in counts of E. coli colonies compared to neat PCL at 5% filler content and was more than twice as efficient as GO. This may be attributed to the synergistic effect of the sharp edges of the particles along with the presence of the different chemical moieties. Thus, GO_PEI based polymer composites can be utilized to prepare bioactive resorbable biomaterials as an alternative to using labile biomolecules for fabricating orthopedic devices for fracture fixation and tissue engineering.

  8. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are interfaced. This capability rapidly provides the high-fidelity results needed in the early design phase. Moreover, the capability is applicable to the general field of engineering science and mechanics. Hence, it provides a collaborative capability that accounts for interactions among engineering analysis methods.

  9. Estimating the Cost-Effectiveness of Implementation: Is Sufficient Evidence Available?

    PubMed

    Whyte, Sophie; Dixon, Simon; Faria, Rita; Walker, Simon; Palmer, Stephen; Sculpher, Mark; Radford, Stefanie

    2016-01-01

    Timely implementation of recommended interventions can provide health benefits to patients and cost savings to the health service provider. Effective approaches to increase the implementation of guidance are needed. Since investment in activities that improve implementation competes for funding against other health-generating interventions, it should be assessed in terms of its costs and benefits. In 2010, the National Institute for Health and Care Excellence released a clinical guideline recommending natriuretic peptide (NP) testing in patients with suspected heart failure. However, its implementation in practice was variable across the National Health Service in England. This study demonstrates the use of multi-period analysis together with diffusion curves to estimate the value of investing in implementation activities to increase uptake of NP testing. Diffusion curves were estimated based on historic data to produce predictions of future utilization. The value of an implementation activity (given its expected costs and effectiveness) was estimated. Both a static population and a multi-period analysis were undertaken. The value of implementation interventions encouraging the utilization of NP testing is shown to decrease over time as natural diffusion occurs. Sensitivity analyses indicated that the value of the implementation activity depends on its efficacy and on the population size. Value of implementation can help inform policy decisions about how to invest in implementation activities even in situations in which data are sparse. Multi-period analysis is essential to accurately quantify the time profile of the value of implementation given the natural diffusion of the intervention and the incidence of the disease. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
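
    A hedged sketch of the multi-period logic is given below: baseline uptake is projected with a logistic diffusion curve, and the value of an implementation activity is the discounted extra net benefit from lifting uptake to a target level, minus the activity's cost. Every number (population, net benefit per patient, costs, uptake parameters) is invented for illustration.

    # Illustrative multi-period value-of-implementation calculation.
    import numpy as np

    years = np.arange(1, 6)
    baseline_uptake = 1.0 / (1.0 + np.exp(-0.8 * (years - 3)))   # natural diffusion
    target_uptake = 0.95                                          # with the activity

    incident_patients = 10_000          # eligible patients per year (assumed)
    net_benefit_per_patient = 50.0      # monetised benefit of testing (assumed)
    implementation_cost = 200_000.0     # one-off cost of the activity (assumed)
    discount = 1.0 / (1.035 ** (years - 1))

    extra_uptake = np.clip(target_uptake - baseline_uptake, 0.0, None)
    value = (extra_uptake * incident_patients * net_benefit_per_patient * discount).sum()

    print(f"net value of implementation: {value - implementation_cost:,.0f}")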

  10. Improving wood properties for wood utilization through multi-omics integration in lignin biosynthesis

    DOE PAGES

    Wang, Jack P.; Matthews, Megan L.; Williams, Cranos M.; ...

    2018-04-20

    A multi-omics quantitative integrative analysis of lignin biosynthesis can advance the strategic engineering of wood for timber, pulp, and biofuels. Lignin is polymerized from three monomers (monolignols) produced by a grid-like pathway. The pathway in wood formation of Populus trichocarpa has at least 21 genes, encoding enzymes that mediate 37 reactions on 24 metabolites, leading to lignin and affecting wood properties. We perturb these 21 pathway genes and integrate transcriptomic, proteomic, fluxomic and phenomic data from 221 lines selected from ~2000 transgenics (6-month-old). The integrative analysis estimates how changing expression of pathway gene or gene combination affects protein abundance, metabolic-flux, metabolite concentrations, and 25 wood traits, including lignin, tree-growth, density, strength, and saccharification. The analysis then predicts improvements in any of these 25 traits individually or in combinations, through engineering expression of specific monolignol genes. The analysis may lead to greater understanding of other pathways for improved growth and adaptation.

  11. Improving wood properties for wood utilization through multi-omics integration in lignin biosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jack P.; Matthews, Megan L.; Williams, Cranos M.

    A multi-omics quantitative integrative analysis of lignin biosynthesis can advance the strategic engineering of wood for timber, pulp, and biofuels. Lignin is polymerized from three monomers (monolignols) produced by a grid-like pathway. The pathway in wood formation of Populus trichocarpa has at least 21 genes, encoding enzymes that mediate 37 reactions on 24 metabolites, leading to lignin and affecting wood properties. We perturb these 21 pathway genes and integrate transcriptomic, proteomic, fluxomic and phenomic data from 221 lines selected from ~2000 transgenics (6-month-old). The integrative analysis estimates how changing expression of pathway gene or gene combination affects protein abundance, metabolic-flux, metabolite concentrations, and 25 wood traits, including lignin, tree-growth, density, strength, and saccharification. The analysis then predicts improvements in any of these 25 traits individually or in combinations, through engineering expression of specific monolignol genes. The analysis may lead to greater understanding of other pathways for improved growth and adaptation.

  12. Development Roadmap of an Evolvable and Extensible Multi-Mission Telecom Planning and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming; Tung, Ramona H.; Lee, Charles H.

    2003-01-01

    In this paper, we describe the development roadmap and discuss the various challenges of an evolvable and extensible multi-mission telecom planning and analysis framework. Our long-term goal is to develop a set of powerful, flexible telecommunications analysis tools that can be easily adapted to different missions while maintaining the common Deep Space Communication requirements. The ability to re-use the DSN ground models and the common software utilities in our adaptations has contributed significantly to our development efforts, measured in terms of consistency, accuracy, and minimal effort redundancy, which can translate into shorter development time and major cost savings for the individual missions. In our roadmap, we will address the design principles, technical achievements, and the associated challenges for the following telecom analysis tools: (i) Telecom Forecaster Predictor - TFP, (ii) Unified Telecom Predictor - UTP, (iii) Generalized Telecom Predictor - GTP, (iv) Generic TFP, (v) Web-based TFP, (vi) Application Program Interface - API, and (vii) Mars Relay Network Planning Tool - MRNPT.
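
    As a rough illustration of the kind of computation such telecom planning and analysis tools automate, the sketch below evaluates a single downlink budget and reports the Eb/N0 margin. The gains, losses, noise temperature, and range are placeholder values, not DSN or mission parameters, and the real tools model far more detail (geometry over time, pointing, weather, and coding).

```python
# Toy single-point link budget; all parameter values are assumed placeholders.
import math

def free_space_loss_db(distance_m, freq_hz):
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

k_dbw = -228.6                      # Boltzmann constant in dBW/K/Hz
eirp_dbw = 60.0                     # spacecraft EIRP (assumed)
rx_gain_db = 74.0                   # ground antenna gain (assumed)
system_temp_k = 25.0                # receive system noise temperature (assumed)
data_rate_bps = 100_000.0           # downlink data rate (assumed)
required_ebn0_db = 2.5              # decoding threshold incl. coding (assumed)
distance_m = 2.0e11                 # a notional Mars-class range (assumed)
freq_hz = 8.42e9                    # X-band carrier frequency

pr_dbw = eirp_dbw + rx_gain_db - free_space_loss_db(distance_m, freq_hz)
ebn0_db = (pr_dbw - k_dbw
           - 10 * math.log10(system_temp_k)
           - 10 * math.log10(data_rate_bps))
print(f"Eb/N0 = {ebn0_db:.1f} dB, margin = {ebn0_db - required_ebn0_db:.1f} dB")
```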

  13. Improving wood properties for wood utilization through multi-omics integration in lignin biosynthesis.

    PubMed

    Wang, Jack P; Matthews, Megan L; Williams, Cranos M; Shi, Rui; Yang, Chenmin; Tunlaya-Anukit, Sermsawat; Chen, Hsi-Chuan; Li, Quanzi; Liu, Jie; Lin, Chien-Yuan; Naik, Punith; Sun, Ying-Hsuan; Loziuk, Philip L; Yeh, Ting-Feng; Kim, Hoon; Gjersing, Erica; Shollenberger, Todd; Shuford, Christopher M; Song, Jina; Miller, Zachary; Huang, Yung-Yun; Edmunds, Charles W; Liu, Baoguang; Sun, Yi; Lin, Ying-Chung Jimmy; Li, Wei; Chen, Hao; Peszlen, Ilona; Ducoste, Joel J; Ralph, John; Chang, Hou-Min; Muddiman, David C; Davis, Mark F; Smith, Chris; Isik, Fikret; Sederoff, Ronald; Chiang, Vincent L

    2018-04-20

    A multi-omics quantitative integrative analysis of lignin biosynthesis can advance the strategic engineering of wood for timber, pulp, and biofuels. Lignin is polymerized from three monomers (monolignols) produced by a grid-like pathway. The pathway in wood formation of Populus trichocarpa has at least 21 genes, encoding enzymes that mediate 37 reactions on 24 metabolites, leading to lignin and affecting wood properties. We perturb these 21 pathway genes and integrate transcriptomic, proteomic, fluxomic and phenomic data from 221 lines selected from ~2000 transgenics (6-month-old). The integrative analysis estimates how changing expression of pathway gene or gene combination affects protein abundance, metabolic-flux, metabolite concentrations, and 25 wood traits, including lignin, tree-growth, density, strength, and saccharification. The analysis then predicts improvements in any of these 25 traits individually or in combinations, through engineering expression of specific monolignol genes. The analysis may lead to greater understanding of other pathways for improved growth and adaptation.
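
    A cartoon of the prediction step described above, not the study's statistical model: fit a linear map from monolignol-gene expression to wood traits on synthetic "transgenic line" data, then score the predicted trait profile of a hypothetical expression change. The dimensions echo the abstract (221 lines, 21 genes, 25 traits); the data and effects are random.

```python
# Synthetic illustration of gene-expression-to-trait prediction; not the
# study's model, and all data below are randomly generated.
import numpy as np

rng = np.random.default_rng(7)
n_lines, n_genes, n_traits = 221, 21, 25
expression = rng.lognormal(sigma=0.4, size=(n_lines, n_genes))   # relative expression
true_effects = rng.normal(scale=0.2, size=(n_genes, n_traits))
traits = expression @ true_effects + rng.normal(scale=0.1, size=(n_lines, n_traits))

# Least-squares estimate of per-gene effects on each trait
effects_hat, *_ = np.linalg.lstsq(expression, traits, rcond=None)

# Predict the trait profile of a hypothetical line with gene 0 knocked down 50%
baseline_line = expression.mean(axis=0)
new_line = baseline_line.copy()
new_line[0] *= 0.5
delta = (new_line - baseline_line) @ effects_hat
print("predicted change in trait 0:", round(float(delta[0]), 3))
```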

  14. Alkyd paints in art: characterization using integrated mass spectrometry.

    PubMed

    La Nasa, Jacopo; Degano, Ilaria; Modugno, Francesca; Colombini, Maria Perla

    2013-10-03

    Alkyd resins have been commonly used as binders in artist paints since the 1940s. The characterization of alkyds in samples from artworks can help to solve attribution and dating issues, investigate decay processes, and contribute to the planning of conservation strategies. Being able to assess the components of industrially formulated paint materials and to differentiate between trademarks and producers is extremely interesting and requires multi-analytical approaches. In this paper we describe the characterization of commercial alkyd paint materials using a multi-analytical approach based on the integration of three different mass spectrometric techniques: gas chromatography-mass spectrometry (GC/MS), high performance liquid chromatography coupled with electrospray ionization mass spectrometry with a tandem quadrupole-time of flight mass spectrometer (HPLC-ESI-Q-ToF), and flow injection analysis (FIA) in the ESI-Q-ToF mass spectrometer. GC/MS was successful in determining the fatty acid and aromatic fractions of the resins after hydrolysis; HPLC-ESI-Q-ToF analysis enabled us to identify the triglyceride (TAG) and diglyceride (DAG) profiles of each resin, and FIA analysis was used as a rapid method to evaluate the presence of possible additives such as synthetic polymers. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Multi-sensory landscape assessment: the contribution of acoustic perception to landscape evaluation.

    PubMed

    Gan, Yonghong; Luo, Tao; Breitung, Werner; Kang, Jian; Zhang, Tianhai

    2014-12-01

    In this paper, the contribution of visual and acoustic preference to multi-sensory landscape evaluation was quantitatively compared. The real landscapes were treated as dual-sensory ambiance and separated into visual landscape and soundscape. Both were evaluated by 63 respondents in laboratory conditions. The analysis of the relationship between respondents' visual and acoustic preferences, as well as their respective contributions to landscape preference, showed that (1) some common attributes are universally identified in assessing visual, aural and audio-visual preference, such as naturalness or degree of human disturbance; (2) with acoustic and visual preferences as variables, a multi-variate linear regression model can satisfactorily predict landscape preference (R² = 0.740), while the coefficients of determination for single-predictor linear regression models were 0.345 and 0.720 for visual and acoustic preference as predicting factors, respectively; (3) acoustic preference played a much more important role in landscape evaluation than visual preference in this study (the former is about 4.5 times the latter), which strongly suggests a rethinking of the role of soundscape in environment perception research and landscape planning practice.
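
    A minimal sketch of the reported analysis style on synthetic data: regress overall landscape preference on visual and acoustic preference and compare the coefficient of determination of the two-predictor model with the single-predictor fits. The coefficients and noise level below are invented, so the printed values will not reproduce the study's 0.740, 0.345, and 0.720.

```python
# Synthetic illustration of comparing single- and two-predictor regressions.
import numpy as np

rng = np.random.default_rng(0)
n = 63                                   # number of respondents in the study
visual = rng.normal(size=n)              # visual preference scores (synthetic)
acoustic = rng.normal(size=n)            # acoustic preference scores (synthetic)
landscape = 0.2 * visual + 0.9 * acoustic + rng.normal(scale=0.3, size=n)

def r_squared(X, y):
    """Coefficient of determination of an ordinary least-squares fit."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - np.sum(resid**2) / np.sum((y - y.mean())**2)

print("R^2 visual only:  ", round(r_squared(visual[:, None], landscape), 3))
print("R^2 acoustic only:", round(r_squared(acoustic[:, None], landscape), 3))
print("R^2 both:         ", round(r_squared(np.column_stack([visual, acoustic]), landscape), 3))
```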

  16. Study of EHD flow generator's efficiencies utilizing pin to single ring and multi-concentric rings electrodes

    NASA Astrophysics Data System (ADS)

    Sumariyah; Kusminart; Hermanto, A.; Nuswantoro, P.

    2016-11-01

    EHD flow, or ionic wind yielded by a corona discharge, is a stream of ionized gas. EHD flow is generated by a strong electric field and its direction follows the electric field lines. In this study, the efficiencies of EHD flow generators utilizing pin to multi-concentric rings electrodes (P-MRE) and pin to single ring electrodes (P-SRE) were measured, and the efficiencies of the two generator types were compared. EHD flow was generated by applying a 0-10 kV DC high voltage to the pin electrode (positive polarity) and to the ring or multi-concentric rings electrode (negative polarity). The efficiency was calculated as the ratio of the mechanical power of the flow to the electrical power consumed. The maximum efficiency of the EHD flow generator utilizing pin to multi-concentric rings electrodes was 0.54%, and the maximum efficiency of the generator utilizing a pin to single ring electrode was 0.23%; the efficiency with P-MRE was thus 2.34 times that with P-SRE.
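
    A back-of-the-envelope sketch of the efficiency figure used in the study: efficiency is the mechanical power of the induced air stream divided by the electrical input power. The density, outlet area, velocity, voltage, and current below are invented for illustration, not the measured data.

```python
# Illustrative efficiency calculation; all input values are assumed.
rho = 1.2            # air density, kg/m^3
area = 3.0e-4        # outlet cross-section, m^2 (assumed)
velocity = 1.8       # mean EHD flow velocity, m/s (assumed)
voltage = 8.0e3      # applied DC voltage, V (assumed)
current = 40e-6      # corona current, A (assumed)

p_mech = 0.5 * rho * area * velocity ** 3     # kinetic power of the air stream
p_elec = voltage * current                    # electrical power consumed
eta = p_mech / p_elec
print(f"mechanical power {p_mech * 1e3:.2f} mW, electrical power {p_elec * 1e3:.0f} mW, "
      f"efficiency {eta * 100:.2f} %")
```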

  17. Optimization of Multiple Related Negotiation through Multi-Negotiation Network

    NASA Astrophysics Data System (ADS)

    Ren, Fenghui; Zhang, Minjie; Miao, Chunyan; Shen, Zhiqi

    In this paper, a Multi-Negotiation Network (MNN) and a Multi-Negotiation Influence Diagram (MNID) are proposed to optimally handle Multiple Related Negotiations (MRN) in a multi-agent system. Most popular, state-of-the-art approaches perform MRN sequentially. However, a sequential procedure may not optimally execute MRN in terms of maximizing the global outcome, and may even lead to unnecessary losses in some situations. The motivation of this research is to use an MNN to handle MRN concurrently so as to maximize the expected utility of MRN. Firstly, both the joint success rate and the joint utility, considering all related negotiations, are dynamically calculated based on an MNN. Secondly, by employing an MNID, an agent's possible decision on each related negotiation is reflected by the value of expected utility. Lastly, by comparing the expected utilities of all possible policies for conducting MRN, an optimal policy is generated to optimize the global outcome of MRN. The experimental results indicate that the proposed approach can improve the global outcome of MRN in a successful end scenario, and avoid unnecessary losses in an unsuccessful end scenario.
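
    A tiny sketch of the decision criterion described above, not the authors' MNN/MNID algorithm: enumerate joint policies over the related negotiations, score each by expected utility (joint success probability times joint utility), and select the best. The negotiations, actions, success rates, and utilities are made up.

```python
# Hypothetical example of choosing a policy over related negotiations by
# expected utility; data and structure are invented for illustration.
from itertools import product

# For each negotiation: action -> (success probability, utility if it succeeds)
negotiations = {
    "supplier": {"concede": (0.9, 40.0), "hold": (0.6, 70.0)},
    "shipper":  {"concede": (0.8, 30.0), "hold": (0.5, 60.0)},
}

def expected_utility(policy):
    # All related negotiations must succeed for the global goal; utilities add.
    p_joint, u_joint = 1.0, 0.0
    for neg, action in policy.items():
        p, u = negotiations[neg][action]
        p_joint *= p
        u_joint += u
    return p_joint * u_joint

policies = [dict(zip(negotiations, combo))
            for combo in product(*[negotiations[n] for n in negotiations])]
best = max(policies, key=expected_utility)
print("best policy:", best, "expected utility:", round(expected_utility(best), 1))
```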

  18. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline

    PubMed Central

    Zhang, Jie; Li, Qingyang; Caselli, Richard J.; Thompson, Paul M.; Ye, Jieping; Wang, Yalin

    2017-01-01

    Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied to many computer vision and biomedical informatics studies. It aims to improve the generalization performance by exploiting the shared features among different tasks. However, most existing algorithms are formulated as supervised learning schemes, whose drawback is either an insufficient number of features or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms. PMID:28943731
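
    A very small, generic sketch of unsupervised dictionary learning in the spirit of the alternating scheme described above (sparse coding via a few ISTA steps, then a least-squares dictionary update); it is a plain numpy illustration on synthetic data, not the MMDL algorithm or its multi-source/multi-target structure.

```python
# Generic dictionary learning sketch on synthetic data; not MMDL.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 200))          # 50 features x 200 samples (synthetic)
k, lam, n_outer, n_ista = 10, 0.1, 30, 20

D = rng.normal(size=(50, k))
D /= np.linalg.norm(D, axis=0)

for _ in range(n_outer):
    # Sparse coding: ISTA iterations for min 0.5||X - D Z||^2 + lam ||Z||_1
    Z = np.zeros((k, X.shape[1]))
    step = 1.0 / np.linalg.norm(D.T @ D, 2)
    for _ in range(n_ista):
        grad = D.T @ (D @ Z - X)
        Z = Z - step * grad
        Z = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)   # soft threshold
    # Dictionary update: least squares followed by column renormalization
    D = X @ np.linalg.pinv(Z)
    D /= np.linalg.norm(D, axis=0) + 1e-12

print("relative reconstruction error:",
      round(np.linalg.norm(X - D @ Z) / np.linalg.norm(X), 3))
```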

  19. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    PubMed

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA system through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.
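
    A hedged sketch of the multi-attribute scoring idea that complements the SWOT review: weight the evaluation criteria, score each country's SEA system per criterion, and rank by weighted sum. The criteria, weights, country labels, and scores below are hypothetical placeholders, not the study's evaluation data.

```python
# Hypothetical weighted-sum multi-attribute scoring of SEA systems.
criteria_weights = {"legal framework": 0.30, "institutional capacity": 0.25,
                    "procedural guidance": 0.25, "integration with decisions": 0.20}

scores = {   # 0-10 performance scores per criterion (invented)
    "Country A": {"legal framework": 7, "institutional capacity": 5,
                  "procedural guidance": 6, "integration with decisions": 3},
    "Country B": {"legal framework": 4, "institutional capacity": 6,
                  "procedural guidance": 5, "integration with decisions": 5},
}

def weighted_score(country_scores):
    return sum(criteria_weights[c] * s for c, s in country_scores.items())

for country, s in sorted(scores.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{country}: {weighted_score(s):.2f}")
```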

  20. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Reed, Patrick; Trindade, Bernardo; Herman, Jonathan; Zeff, Harrison; Characklis, Gregory

    2016-04-01

    Emerging water scarcity concerns in the southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risk of sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.
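
    A simplified sketch of the robustness bookkeeping behind a MORDM-style analysis, not the study's regional simulation model: sample deeply uncertain factors over broad ranges, test a candidate portfolio in each sampled state of the world, and report the fraction of states in which all performance criteria are satisfied. The factor ranges and criteria below are invented.

```python
# Toy robustness calculation over sampled deeply uncertain factors.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

# Deeply uncertain factors sampled over broad ranges (all values assumed)
demand_growth = rng.uniform(0.00, 0.04, n_samples)       # annual demand growth rate
drought_severity = rng.uniform(0.5, 1.5, n_samples)      # inflow scaling factor
transfer_available = rng.uniform(0.0, 0.3, n_samples)    # cooperative transfer capacity

supply = drought_severity + transfer_available            # normalized supply proxy
demand = (1 + demand_growth) ** 10                        # normalized demand after 10 years
reliability_ok = supply >= demand                         # volumetric criterion
cost_ok = transfer_available <= 0.25                      # crude financial criterion

robustness = np.mean(reliability_ok & cost_ok)
print(f"portfolio satisfies both criteria in {robustness:.1%} of sampled futures")
```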
